Journal Articles
8 articles found
1. Proposal for Energy Consumption Reduction between Connected Objects in a Network Running on MQTT Protocol
Authors: Saidou Haman Djorwe Temoa (+1 more author), eric michel deussom djomadji, Kolyang · Journal of Computer and Communications, 2024, Issue 10, pp. 177-188 (12 pages)
The "Internet of Things" (IoT) refers to a set of intelligent "objects" that can communicate with each other directly or through a network. The IoT embodies the idea that everything can be connected anywhere and at any time. The concept applies to sectors such as e-health, e-government, automotive, geographic information systems, remote sensing, home networking, e-commerce and climate change mitigation. Unlike the Internet, the IoT has its own constraints, notably those linked to heterogeneity: connected objects are interconnected through different protocols, technologies and algorithms, and IoT devices can communicate with each other using various protocols and dedicated M2M (Machine to Machine) communication technologies. The aim of this work is to find solutions for optimising energy consumption during data exchanges between connected objects, under certain constraints, first using the Message Queuing Telemetry Transport (MQTT) protocol alone and then combining MQTT with the Constrained Application Protocol (CoAP) in order to quantify the energy saved. MQTT is one of the most widely used protocols for connected objects; although it consumes relatively little energy, saturation inevitably arises when the number of users becomes very large. In this article, we propose to optimise energy consumption by combining MQTT with CoAP, which makes a standby mode possible, whereas with MQTT alone the broker must run continuously. This solution has not yet been implemented and is presented here as a proposal. We use Joulemeter, an application developed by Microsoft to measure and estimate the energy consumption of computers and applications. Taking the "Service Broker for network connections" of the Windows 10 operating system on our own computer as an example, we use Joulemeter to compare energy consumption without standby mode (as with MQTT, where the broker is always on) and with standby mode (as made possible by the MQTT-CoAP combination). The aim of our work is to reduce energy consumption, and thereby address the saturation problem of MQTT, by coupling it with the CoAP protocol and measuring the difference with Joulemeter.
Keywords: Internet of Things, Heterogeneity, Message Queuing Telemetry Transport, Constrained Application Protocol, Application-Layer Semantic Gateway, Gateway
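As an illustration only (the abstract notes the MQTT-CoAP combination has not yet been implemented), the Python sketch below shows the kind of publish-then-sleep pattern a constrained node could follow over MQTT; the broker address, topic and sleep period are hypothetical, and the MQTT/CoAP gateway and Joulemeter measurements described in the paper are not reproduced here.

```python
# Minimal sketch (not the paper's implementation): publish one reading over MQTT,
# then let the node idle instead of holding a persistent session.
# Broker host, topic and sleep period below are illustrative assumptions.
import time
import paho.mqtt.publish as publish

BROKER = "broker.example.org"         # hypothetical broker address
TOPIC = "sensors/node1/temperature"   # hypothetical topic

def publish_and_sleep(reading: float, sleep_s: int = 60) -> None:
    # One short-lived connection per reading; the client disconnects
    # as soon as the message is delivered (QoS 1 = at least once).
    publish.single(TOPIC, payload=str(reading), qos=1, hostname=BROKER)
    # Between readings the device's radio and CPU can enter standby,
    # which is the energy-saving behaviour the paper aims for.
    time.sleep(sleep_s)

if __name__ == "__main__":
    publish_and_sleep(23.5)
```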
2. Modelling of a WDM Network Using Graph Theory and Dijkstra Algorithm for Traffic Redirection
Authors: eric michel deussom djomadji, Ebude Carine Awasume, Eloundou Boris Donald · Journal of Computer and Communications, 2024, Issue 7, pp. 78-93 (16 pages)
Optical transport networks are now the basic infrastructure of modern communication systems, including the SDH and WDM backbone networks of local network operators, as in the case of Cameroon. Given the colossal investments required to deploy these networks, particularly the cost of equipment (optical fibres, transponders and multiplexers), optimising bandwidth and allocating resources dynamically is essential to control operating costs and ensure continuity of service. Automatic switching technology for optical networks brings intelligence to the control plane to facilitate bandwidth utilisation, traffic redirection and automatic configuration of end-to-end services. This paper considers a local operator's WDM network without automatic switching technology, develops a network modelling software platform called Graphic Networks and, using graph theory, integrates one feature of automatic switching: the automatic rerouting of traffic when an incident occurs in the network. The incidents considered here are link (route) failures and node failures.
Keywords: Graph Theory, Backbone Network, WDM, Dijkstra Algorithm
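A minimal sketch of the rerouting idea described in the abstract above, using the networkx library; the topology, link weights and failed link are invented for illustration and do not represent the operator's actual WDM backbone.

```python
# Sketch of Dijkstra-based traffic rerouting after a link failure.
# The graph below is a toy topology, not the operator's WDM network.
import networkx as nx

G = nx.Graph()
# add_edge(u, v, weight=...) models a fibre link with an assumed cost (e.g. length in km)
G.add_edge("Douala", "Yaounde", weight=230)
G.add_edge("Douala", "Edea", weight=60)
G.add_edge("Edea", "Yaounde", weight=190)
G.add_edge("Yaounde", "Bertoua", weight=350)

# Nominal path computed with Dijkstra's algorithm (networkx default for weighted graphs)
primary = nx.shortest_path(G, "Douala", "Bertoua", weight="weight")
print("primary path:", primary)

# Simulate a link failure, then recompute the route on the surviving topology
G.remove_edge("Douala", "Yaounde")
backup = nx.shortest_path(G, "Douala", "Bertoua", weight="weight")
print("backup path: ", backup)
```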
3. Dynamic Resource Allocation in LTE Radio Access Network Using Machine Learning Techniques
Authors: eric michel deussom djomadji, Ivan Basile Kabiena (+2 more authors), Valery Nkemeni, Ayrton Garcia Belinga À Njere, Michael Ekonde Sone · Journal of Computer and Communications, 2023, Issue 6, pp. 73-93 (21 pages)
Current LTE networks are experiencing significant growth in the number of users worldwide. The use of data services for online browsing, e-learning and online meetings, and initiatives such as smart cities, means that subscribers stay connected for long periods, saturating a number of signalling resources. One such resource is the Radio Resource Connected (RRC) parameter, which is allocated to eNodeBs to limit the number of users connected simultaneously to the network. Because this parameter is allocated statically (manual configuration), some eNodeBs are saturated with RRC resources (overused) while others have unused RRC resources, depending on traffic at different times of the day and on geographical position; since these resources are limited, they end up under-utilised at the eNodeB level. The objective of this paper is to design an efficient machine learning model that takes as input key performance indicators (KPIs) such as traffic data, RRC and simultaneous users for each eNodeB, per hour and per day, and accurately predicts the number of RRC resources to allocate dynamically, in order to avoid traffic and financial losses for the mobile network operator. To reach this target, three machine learning algorithms were studied, namely linear regression, convolutional neural networks and long short-term memory (LSTM); three models were trained and evaluated. The model trained with the LSTM algorithm gave the best performance, with 97% accuracy, and was therefore implemented in the proposed solution for RRC resource allocation. An interconnection architecture is also proposed to embed the solution in the operation and maintenance network of a mobile operator. In this way, the proposed solution can contribute to developing and expanding the concept of Self Organizing Network (SON) used in 4G and 5G networks.
Keywords: RRC Resources, 4G Network, Linear Regression, Convolutional Neural Networks, Long Short-Term Memory, Precision
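As a rough illustration of the LSTM approach retained in the paper (the actual feature set, window length and hyper-parameters are not given here and are assumed), a model of the following shape could be trained on hourly per-eNodeB KPI sequences to predict the next hour's RRC demand.

```python
# Sketch only: an LSTM regressor for hourly RRC-demand prediction.
# Window length (24 h), feature count and layer sizes are assumptions.
import numpy as np
import tensorflow as tf

WINDOW, N_FEATURES = 24, 3   # e.g. [traffic volume, RRC connected users, simultaneous users]

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_FEATURES)),
    tf.keras.layers.LSTM(64),        # summarises the last 24 hours of KPIs
    tf.keras.layers.Dense(1),        # predicted RRC resources for the next hour
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])

# Dummy data standing in for per-eNodeB hourly KPIs (replace with real OMC exports)
X = np.random.rand(1000, WINDOW, N_FEATURES).astype("float32")
y = np.random.rand(1000, 1).astype("float32")
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
```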
4. New ECG Signal Compression Model Based on Set Theory Applied to Images
Authors: Ivan Basile Kabiena, eric michel deussom djomadji, Emmanuel Tonye · Journal of Computer and Communications, 2023, Issue 8, pp. 29-43 (15 pages)
Cardiovascular diseases are among the main causes of death worldwide. They require practitioners to use optimal diagnostic methods, such as telemedicine, in order to quickly detect anomalies for the daily care and monitoring of patients. The electrocardiogram (ECG) is an examination that can detect abnormal functioning of the heart and generates a large amount of digital data that can be stored or transmitted for further analysis. For storage or transmission purposes, one of the challenges is to reduce the space occupied by the ECG signal; it is therefore important to offer ever more efficient algorithms capable of achieving high compression ratios while offering good reconstruction quality in a relatively short time. We propose in this paper a new ECG compression scheme based on splitting the signal into subsets and processing it in 2D, using the discrete wavelet transform (DWT) and SPIHT coding, which have proved their worth in the field of signal processing and compression; they are exploited for decorrelation and coding of the signal. The results obtained are significant and open many perspectives.
Keywords: Compression, ECG, DWT, Sub-Set, 2D
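A minimal sketch of the 2D preprocessing step suggested by the abstract: the 1D ECG record is folded into a 2D matrix of fixed-length segments and a 2D wavelet decomposition is applied. The segment length, wavelet family and decomposition level are assumptions, and the SPIHT coding stage used in the paper is not shown.

```python
# Sketch only: fold a 1D ECG signal into a 2D array and apply a 2D DWT.
# Segment length, wavelet family and decomposition level are assumptions;
# the SPIHT coding stage used in the paper is omitted here.
import numpy as np
import pywt

def ecg_to_2d(signal: np.ndarray, seg_len: int = 256) -> np.ndarray:
    # Truncate to a whole number of segments, then stack them as rows
    n_segs = len(signal) // seg_len
    return signal[: n_segs * seg_len].reshape(n_segs, seg_len)

ecg = np.random.randn(10_000)                 # placeholder for a real ECG record
image = ecg_to_2d(ecg)                        # segments stacked row by row
coeffs = pywt.wavedec2(image, wavelet="bior4.4", level=3)  # 2D multilevel DWT
# `coeffs` (approximation + detail sub-bands) is what an embedded coder
# such as SPIHT would then quantise and encode bit-plane by bit-plane.
```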
5. Design of a Neural Network Based Stable State Observer for MIMO Systems
Authors: Jean Gutenbert Kenfack Wamba, eric michel deussom djomadji (+2 more authors), Jean Claude Lionel Ng’anyogo, Arsene Roger Bienvenu Fouba, Alain Tiedeu · Journal of Computer and Communications, 2023, Issue 11, pp. 87-110 (24 pages)
MIMO (Multiple Input Multiple Output) is a key technology underpinning fourth-generation (4G) networks, allowing them to increase throughput. However, the dynamics of a MIMO system are not fully under control because many uncertainties can destabilise it, so an observer that monitors the dynamics of such a system is highly relevant. This work presents a neuro-adaptive observer based on a radial basis function neural network for generic non-linear MIMO systems. Unlike most neuro-adaptive observers, the proposed observer uses a neural network that is non-linear in its parameters; it can therefore be applied to systems with high degrees of non-linearity without any a priori knowledge of the system dynamics. Indeed, besides being very good non-linear approximators, neural networks have an adaptive behaviour that makes them powerful tools for observing the state without prior knowledge of the system dynamics. The learning rule of the neural network is based on a modified backpropagation algorithm: a term has been added to guarantee the robustness of the observer, and the proposed approach is not limited by strong assumptions. The stability of the neuro-adaptive observer is demonstrated by the direct Lyapunov method. Simulation results are presented in the context of MIMO signal transmission applied to LTE, to demonstrate the performance of our observer.
Keywords: Stability, Neural Network, MIMO, LTE Network, Lyapunov Function
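A generic, heavily simplified sketch of a neuro-adaptive observer of the kind described above: a Luenberger-like output correction plus an RBF network with a leakage term for robustness. The plant structure, gains and dimensions are invented for illustration, and this is a textbook form rather than the authors' exact learning rule.

```python
# Generic sketch of an RBF neuro-adaptive observer (not the paper's exact scheme),
# for a second-order system x1_dot = x2, x2_dot = f(x) with only y = x1 measured.
# The unknown nonlinearity f is approximated online by W . phi(x_hat); the
# "-sigma*W" leakage term plays the role of the robustness modification.
import numpy as np

n_rbf = 11
centers = np.linspace(-2.0, 2.0, n_rbf)     # RBF centres (assumed grid)
k1, k2 = 6.0, 9.0                           # observer gains (assumed)
Gamma, sigma, dt = 20.0, 0.05, 1e-3         # adaptation gain, leakage, step size

def phi(x_hat):
    # Gaussian radial basis functions evaluated on the first estimated state
    return np.exp(-(x_hat[0] - centers) ** 2)

def observer_step(x_hat, W, y):
    e = y - x_hat[0]                                      # measured output error
    x1_dot = x_hat[1] + k1 * e
    x2_dot = W @ phi(x_hat) + k2 * e                      # NN replaces the unknown f(x)
    W_dot = Gamma * phi(x_hat) * e - sigma * Gamma * W    # adaptive law with leakage
    return x_hat + dt * np.array([x1_dot, x2_dot]), W + dt * W_dot

x_hat, W = np.zeros(2), np.zeros(n_rbf)
for y_meas in np.sin(0.01 * np.arange(2000)):             # toy measurement stream
    x_hat, W = observer_step(x_hat, W, y_meas)
```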
6. Machine Learning-Based Approach for Identification of SIM Box Bypass Fraud in a Telecom Network Based on CDR Analysis: Case of a Fixed and Mobile Operator in Cameroon
Authors: eric michel deussom djomadji, Kabiena Ivan Basile (+2 more authors), Tchapga Tchito Christian, Ferry Vaneck Kouam Djoko, Michael Ekonde Sone · Journal of Computer and Communications, 2023, Issue 2, pp. 142-157 (16 pages)
In the telecommunications sector, companies suffer serious damage due to fraud, especially in Africa. One of the main types of fraud is SIM box bypass fraud, which uses SIM cards to divert incoming international calls away from mobile operators, creating massive revenue losses. To address these shortcomings, which apply to almost all network operators, we developed intelligent algorithms that exploit huge amounts of data from mobile operators and detect fraud by analysing the CDRs of voice calls. In this paper we used three classification techniques, Random Forest, Support Vector Machine (SVM) and XGBoost, to detect this type of fraud, and compared the performance of these algorithms using data collected from an operator's network in Cameroon. The best-performing algorithm was Random Forest, with 92% accuracy, and it was then used to detect existing fraudulent numbers on the telecommunications operator's network.
Keywords: CDR, Fraud Detection, Machine Learning, Voice Calls
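A minimal scikit-learn sketch of the Random Forest classification step that the abstract reports as best-performing; the CSV file name, feature columns and label column are placeholders, not the operator's actual dataset.

```python
# Sketch only: train and evaluate a Random Forest on labelled CDR features.
# "cdr_features.csv", the column names and the label column are assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, classification_report

df = pd.read_csv("cdr_features.csv")          # one row per calling number (hypothetical file)
features = ["calls_per_day", "distinct_callees", "avg_call_duration",
            "night_call_ratio", "incoming_outgoing_ratio"]   # assumed engineered features
X, y = df[features], df["is_simbox"]          # label: 1 = known SIM box number

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)

y_pred = clf.predict(X_test)
print("accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))
```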
7. Okumura Hata Propagation Model Optimization in 400 MHz Band Based on Differential Evolution Algorithm: Application to the City of Bertoua
Authors: eric michel deussom djomadji, Ivan Basile Kabiena (+2 more authors), Joel Thibaut Mandengue, Felix Watching, Emmanuel Tonye · Journal of Computer and Communications, 2023, Issue 5, pp. 52-69 (18 pages)
Propagation models are the foundation of radio planning in mobile networks. They are widely used during feasibility studies and initial network deployment, or during network extensions, particularly in new cities. They can be used to calculate the power of the signal received by a mobile terminal, evaluate the coverage radius and calculate the number of cells required to cover a given area. This paper takes the standard K-factors model as a starting point and uses the differential evolution algorithm to set up a propagation model adapted to the physical environment of the Cameroonian city of Bertoua. Drive tests were performed on the LTE TDD network in Bertoua, and differential evolution was used as the optimisation algorithm to derive a propagation model that fits the environment of the considered town. The root mean square error (RMSE) between the measured drive-test data and the predictions of the implemented model is used to validate the results. A comparison between the RMSE obtained by the new model and those obtained by the Okumura Hata and free space models shows that the new model is better and more representative of our local environment than the Okumura Hata model currently used. The implementation shows that differential evolution performs well on this kind of optimisation problem; the newly obtained model can be used for radio planning in the city of Bertoua in Cameroon.
Keywords: Radio Measurements, Root Mean Square Error, Differential Evolution Algorithm
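A sketch of the tuning step using SciPy's differential evolution: a simplified two-parameter K-factor path-loss form PL(d) = k1 + k2·log10(d) is fitted to drive-test samples by minimising the RMSE. The data points, bounds and model form are assumptions rather than the paper's exact setup.

```python
# Sketch only: fit K-factors of a simplified path-loss model to drive-test data
# with differential evolution, minimising the RMSE used in the abstract for validation.
# The model form PL = k1 + k2*log10(d_km), the samples and the bounds are assumptions.
import numpy as np
from scipy.optimize import differential_evolution

# Placeholder drive-test samples: distance from the site (km) and measured path loss (dB)
d_km = np.array([0.2, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0])
pl_measured = np.array([98.0, 110.0, 121.0, 127.0, 131.0, 137.0, 142.0])

def rmse(k):
    k1, k2 = k
    pl_predicted = k1 + k2 * np.log10(d_km)
    return np.sqrt(np.mean((pl_measured - pl_predicted) ** 2))

bounds = [(80.0, 160.0),   # k1: intercept term (dB), assumed range
          (20.0, 50.0)]    # k2: distance slope (dB/decade), assumed range
result = differential_evolution(rmse, bounds, seed=1)
print("optimised K-factors:", result.x, "RMSE (dB):", result.fun)
```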
8. COST 231-Hata Propagation Model Optimization in 1800 MHz Band Based on Magnetic Optimization Algorithm: Application to the City of Limbé
Authors: eric michel deussom djomadji, Kabiena Ivan Basile (+1 more author), Fobasso Segnou Thierry, Tonye Emanuel · Journal of Computer and Communications, 2023, Issue 2, pp. 57-74 (18 pages)
Network planning is essential for the construction and development of wireless networks, and it is not possible without an appropriate propagation model, which is in fact its foundation. Initially used mainly for mobile radio networks, propagation model optimisation is becoming essential for efficient network deployment in rural, suburban and urban environments, especially with the emergence of concepts such as digital terrestrial television, smart cities and the Internet of Things (IoT), widely deployed for use cases such as smart grid and smart metering of electricity, gas and water. In this paper we use an optimisation algorithm inspired by the principles of magnetic field theory, the Magnetic Optimization Algorithm (MOA), to tune the COST231-Hata propagation model. The dataset used comes from drive tests carried out in the town of Limbe in Cameroon. We take the standard K-factor model as a starting point and then use MOA to set up a propagation model adapted to the physical environment of the town. Limbe is used as an implementation case, but the proposed method can be applied anywhere. The root mean square error (RMSE) between the real radio measurements and the prediction data obtained after applying MOA is used to validate the results. A comparative study between the RMSE obtained by the new model and those obtained by linear-regression optimisation, the standard COST231-Hata model and the free space model shows that the new model obtained with MOA for the city of Limbe is better and more representative of this local environment than the standard COST231-Hata model. The new model can be used for radio planning in the city of Limbé in Cameroon.
Keywords: Radio Measurements, Root Mean Square Error, Magnetic Optimization Algorithm
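For reference, a small numpy sketch of the standard COST231-Hata prediction and the RMSE computation used for validation; the measurement values, antenna heights and sample distances are illustrative, and the MOA tuning itself is not reproduced here.

```python
# Sketch only: standard COST231-Hata prediction (medium-city correction, c_m = 0)
# and the RMSE against measured path loss used to validate a tuned model.
# Heights, distances and measured values below are illustrative assumptions;
# the Magnetic Optimization Algorithm tuning step is not shown.
import numpy as np

def cost231_hata(d_km, f_mhz=1800.0, h_bs=30.0, h_ms=1.5, c_m=0.0):
    # Mobile antenna correction factor for small/medium cities
    a_hm = (1.1 * np.log10(f_mhz) - 0.7) * h_ms - (1.56 * np.log10(f_mhz) - 0.8)
    return (46.3 + 33.9 * np.log10(f_mhz) - 13.82 * np.log10(h_bs) - a_hm
            + (44.9 - 6.55 * np.log10(h_bs)) * np.log10(d_km) + c_m)

def rmse(measured, predicted):
    return np.sqrt(np.mean((measured - predicted) ** 2))

# Placeholder drive-test points (distance in km, measured path loss in dB)
d_km = np.array([0.5, 1.0, 2.0, 3.0, 5.0])
pl_measured = np.array([120.0, 131.0, 140.0, 146.0, 153.0])

pl_model = cost231_hata(d_km)
print("COST231-Hata RMSE (dB):", rmse(pl_measured, pl_model))
```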