Journal Articles
3 articles found
Dynamic Resource Allocation in LTE Radio Access Network Using Machine Learning Techniques
1
Authors: Eric Michel Deussom Djomadji, Ivan Basile Kabiena, Valery Nkemeni, Ayrton Garcia Belinga À Njere, Michael Ekonde Sone. 《Journal of Computer and Communications》, 2023, Issue 6, pp. 73-93 (21 pages)
Current LTE networks are experiencing significant growth in the number of users worldwide. The use of data services for online browsing, e-learning and online meetings, and initiatives such as smart cities, means that subscribers stay connected for long periods, saturating a number of signalling resources. One such resource is the Radio Resource Connected (RRC) parameter, which is allocated to eNodeBs to limit the number of users connected simultaneously to the network. Because this parameter is allocated statically (manual configuration), some eNodeBs are saturated with RRC resources (overused) while others have unused RRC resources, depending on traffic at different times of day and on geographical position. Since these resources are limited, static allocation therefore leads to their underutilization, i.e. non-optimal use of resources at the eNodeB level. The objective of this paper is to design an efficient machine learning model that takes as input key performance indicators (KPIs) such as traffic data, RRC usage and simultaneous users for each eNodeB, per hour and per day, and accurately predicts the number of RRC resources to allocate dynamically, in order to avoid traffic and financial losses for the mobile network operator. To reach this target, three machine learning algorithms were studied, namely linear regression, convolutional neural networks and long short-term memory (LSTM); three models were trained and evaluated. The model trained with the LSTM algorithm gave the best performance, with 97% accuracy, and was therefore implemented in the proposed solution for RRC resource allocation.
An interconnection architecture is also proposed to embed the solution into the operation and maintenance network of a mobile network operator. In this way, the proposed solution can contribute to developing and expanding the concept of Self-Organizing Network (SON) used in 4G and 5G networks.
Keywords: RRC Resources, 4G Network, Linear Regression, Convolutional Neural Networks, Long Short-Term Memory, Precision
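The paper's LSTM model is not reproduced here, but the prediction task it describes can be illustrated with its linear-regression baseline. The sketch below assumes entirely synthetic KPI data (traffic volume, simultaneous users, hour of day) and an illustrative linear relation for the needed RRC count; none of the feature names or coefficients come from the paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Synthetic KPIs per eNodeB-hour: traffic (GB), simultaneous users, hour of
# day. Names and the relation below are illustrative, not the paper's data.
n = 500
traffic = rng.uniform(0, 50, n)
users = rng.integers(10, 400, n)
hour = rng.integers(0, 24, n)
X = np.column_stack([traffic, users, hour])

# Assume the needed RRC connections track users and traffic, plus noise.
y = 0.8 * users + 2.0 * traffic + rng.normal(0, 5, n)

# Fit the baseline and predict RRC needs per eNodeB-hour.
model = LinearRegression().fit(X, y)
predicted_rrc = model.predict(X)
```

In the paper's setup this regression is one of three candidates; the LSTM replaces it because RRC demand is a time series, where the previous hours' values carry predictive signal that a memoryless regression cannot use.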
Machine Learning-Based Alarms Classification and Correlation in an SDH/WDM Optical Network to Improve Network Maintenance
2
Authors: Deussom Djomadji Eric Michel, Takembo Ntahkie Clovis, Tchapga Tchito Christian, Arabo Mamadou, Michael Ekonde Sone. 《Journal of Computer and Communications》, 2023, Issue 2, pp. 122-141 (20 pages)
The evolution of telecommunications has enabled the development of broadband services based mainly on fiber optic backbone networks. The operation and maintenance of these optical networks relies on supervision platforms that generate alarms, which can be archived as log files. Analyzing the alarms in these log files is a laborious task that requires a degree of expertise from engineers. Identifying failures and their root causes can be time-consuming and can impact quality of service, network availability and the service level agreements signed between the operator and its customers. It is therefore important to study the different ways of classifying alarms and to use machine learning algorithms for alarm correlation, in order to determine the root causes of problems faster. We conducted a case study on an operator in Cameroon that operates an optical backbone based on SDH and WDM technologies, with data collected from 2016-03-28 to 2022-09-01 comprising 7201 rows and 18 columns. In this paper, we classify alarms according to different criteria and use two unsupervised learning algorithms, K-Means and DBSCAN, to establish correlations between alarms, identify the root causes of problems and reduce troubleshooting time. To achieve this objective, log files were first exploited to obtain the root causes of the alarms; K-Means and DBSCAN were then applied to evaluate their performance and their ability to identify the root causes of alarms in an optical network.
Keywords: Optical Network, Alarms, Log Files, Root Cause Analysis, Machine Learning
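The correlation step the abstract describes (grouping alarms so that alarms in one cluster point to a shared root cause) can be sketched with the same two scikit-learn algorithms. The alarm features below (severity code, minute of day, node id) and the three synthetic alarm bursts are illustrative assumptions, not the operator's log schema.

```python
import numpy as np
from sklearn.cluster import KMeans, DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

def burst(severity, minute, node, size=50):
    """One burst of correlated alarms around a single event (synthetic)."""
    return np.column_stack([
        np.full(size, severity),
        rng.normal(minute, 5, size),  # alarms arrive close in time
        np.full(size, node),
    ])

# Three bursts, e.g. three distinct fiber/equipment events.
X = StandardScaler().fit_transform(
    np.vstack([burst(1, 100, 3), burst(2, 600, 7), burst(3, 1200, 5)]))

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
db = DBSCAN(eps=0.5, min_samples=5).fit(X)

# Alarms sharing a cluster label are candidates for a common root cause;
# DBSCAN's label -1 marks isolated alarms treated as noise.
```

The practical difference the paper exploits: K-Means needs the number of clusters up front, while DBSCAN discovers dense alarm bursts on its own and flags stray alarms as noise, which suits logs where the number of underlying faults is unknown.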
Machine Learning-Based Approach for Identification of SIM Box Bypass Fraud in a Telecom Network Based on CDR Analysis: Case of a Fixed and Mobile Operator in Cameroon
3
Authors: Eric Michel Deussom Djomadji, Kabiena Ivan Basile, Tchapga Tchito Christian, Ferry Vaneck Kouam Djoko, Michael Ekonde Sone. 《Journal of Computer and Communications》, 2023, Issue 2, pp. 142-157 (16 pages)
In the telecommunications sector, companies suffer serious losses due to fraud, especially in Africa. One of the main types is SIM box bypass fraud, which consists of using SIM cards to divert incoming international calls away from mobile operators, creating massive revenue losses. To address this problem, which affects almost all network operators, we developed intelligent algorithms that exploit large amounts of mobile-operator data and detect fraud by analyzing call detail records (CDRs) from voice calls. In this paper we used three classification techniques, Random Forest, Support Vector Machine (SVM) and XGBoost, to detect this type of fraud, and compared the performance of these algorithms on data collected from an operator's network in Cameroon. Random Forest gave the best performance, with 92% accuracy, and was then used to detect fraudulent numbers present on the telecommunications operator's network.
Keywords: CDR, Fraud Detection, Machine Learning, Voice Calls
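The classification setup the abstract describes can be sketched with the winning algorithm, Random Forest. The per-number CDR features below (outgoing-call count, share of distinct callees, mean call duration) and the synthetic fraud pattern are assumptions for illustration only; the operator's real CDR schema and labels are not public.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)

# Illustrative per-number features: SIM boxes typically place many short
# calls to many distinct numbers. Labels and rates are synthetic.
n = 1000
fraud = rng.integers(0, 2, n)                       # 1 = SIM box number
calls = rng.poisson(20 + 180 * fraud)               # heavy outbound dialing
distinct = np.clip(0.3 + 0.6 * fraud + rng.normal(0, 0.1, n), 0, 1)
duration = rng.normal(120 - 60 * fraud, 20)         # shorter calls
X = np.column_stack([calls, distinct, duration])

Xtr, Xte, ytr, yte = train_test_split(X, fraud, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
accuracy = clf.score(Xte, yte)  # held-out accuracy, as in the paper's setup
```

Once trained, `clf.predict` can be run over features extracted from fresh CDRs to flag candidate SIM box numbers for manual verification, which mirrors how the paper applies its model to the live network.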