Journal Articles
17 articles found
A Hybrid Machine Learning Approach for Improvised QoE in Video Services over 5G Wireless Networks
Authors: K. B. Ajeyprasaath, P. Vetrivelan. Computers, Materials & Continua (SCIE, EI), 2024, No. 3, pp. 3195-3213.
Video streaming applications have grown considerably in recent years and have become one of the most significant contributors to global internet traffic. According to recent studies, the telecommunications industry loses millions of dollars due to poor video Quality of Experience (QoE) for users. The Mean Opinion Score (MOS) is among the standard proposals for quantifying the quality of video streaming over internet service providers (ISPs). However, QoE assessment by MOS is subjective and laborious, and it varies from user to user. A fully automated data analytics framework is required to reduce the inter-operator variability inherent in QoE assessment. This work addresses this concern by proposing a novel hybrid XGBStackQoE analytical model using a two-level layering technique. Level one combines multiple Machine Learning (ML) models via a layer-one Hybrid XGBStackQoE model; the individual ML models at level one are trained on the entire training data set. The level-two Hybrid XGBStackQoE model is then fitted using the outputs (meta-features) of the layer-one ML models. The proposed model outperformed conventional models, with an accuracy improvement of 4 to 5 percent over current traditional models. The proposed framework could significantly improve video QoE accuracy.
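The two-level stacking idea described in this abstract, base models whose predictions become meta-features for a second-level combiner, can be sketched in outline. This is a minimal illustration, not the paper's implementation; the toy base models and meta-weights are invented for the example.

```python
# Minimal sketch of two-level stacking (illustrative only; the toy
# base models and meta-weights below are not from the paper).

def level_one_predict(models, x):
    """Level one: collect each base model's prediction as a meta-feature."""
    return [m(x) for m in models]

def level_two_predict(meta_weights, meta_features):
    """Level two: a simple linear combiner fitted over the meta-features."""
    return sum(w * f for w, f in zip(meta_weights, meta_features))

# Toy stand-ins for the individual ML models trained on the full data set.
base_models = [lambda x: 0.8 * x, lambda x: x + 1.0, lambda x: 2.0]
meta_features = level_one_predict(base_models, 2.0)   # [1.6, 3.0, 2.0]
qoe_estimate = level_two_predict([0.5, 0.3, 0.2], meta_features)  # about 2.1
```

In a real stacking setup the level-two model would itself be learned (e.g., by fitting on held-out meta-features) rather than using fixed weights as here.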
Keywords: Hybrid XGBStackQoE-model; machine learning; MOS; performance metrics; QoE; 5G; video services
A Collaborative Machine Learning Scheme for Traffic Allocation and Load Balancing for URLLC Service in 5G and Beyond
Authors: Andreas G. Papidas, George C. Polyzos. Journal of Computer and Communications, 2023, No. 11, pp. 197-207.
Key challenges for 5G and Beyond networks relate to the requirements for exceptionally low latency, high reliability, and extremely high data rates. The Ultra-Reliable Low Latency Communication (URLLC) use case is the trickiest to support, and current research is focused on physical or MAC layer solutions, while proposals focused on the network layer using Machine Learning (ML) and Artificial Intelligence (AI) algorithms running on base stations and User Equipment (UE) or Internet of Things (IoT) devices are in early stages. In this paper, we describe the operating rationale of the most recent relevant ML algorithms and techniques, and we propose and validate ML algorithms running on both cells (base stations/gNBs) and UEs or IoT devices to handle URLLC service control. One ML algorithm runs on base stations to evaluate latency demands and offload traffic in case of need, while another lightweight algorithm runs on UEs and IoT devices to rank cells with the best URLLC service in real time, indicating the best cell for a UE or IoT device to camp on. We show that the interplay of these algorithms leads to good service control and eventually optimal load allocation, under slow load mobility.
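The lightweight UE-side cell ranking could, in outline, look like the sketch below. The field names and the criterion of sorting by estimated latency then reliability are assumptions for illustration, not the paper's actual algorithm.

```python
# Illustrative sketch of UE-side cell ranking for URLLC (field names
# and the ranking criterion are assumed, not taken from the paper).

def rank_cells(cells):
    """Order candidate cells: lowest estimated latency first,
    breaking ties by highest reliability."""
    return sorted(cells, key=lambda c: (c["latency_ms"], -c["reliability"]))

cells = [
    {"id": "gNB-A", "latency_ms": 4.0, "reliability": 0.999},
    {"id": "gNB-B", "latency_ms": 1.5, "reliability": 0.9999},
    {"id": "gNB-C", "latency_ms": 1.5, "reliability": 0.99999},
]
best = rank_cells(cells)[0]["id"]   # "gNB-C": same latency as B, more reliable
```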
Keywords: 5G and B5G networks; Ultra-Reliable Low Latency Communications (URLLC); machine learning (ML) for 5G; temporal difference methods (TDM); Monte Carlo methods; policy gradient methods
Detection Collision Flows in SDN Based 5G Using Machine Learning Algorithms
Authors: Aqsa Aqdus, Rashid Amin, Sadia Ramzan, Sultan S. Alshamrani, Abdullah Alshehri, El-Sayed M. El-kenawy. Computers, Materials & Continua (SCIE, EI), 2023, No. 1, pp. 1413-1435.
The rapid advancement of wireless communication is forming a hyper-connected 5G network in which billions of linked devices generate massive amounts of data. In software-defined networking (SDN), the traffic control and data forwarding functions are decoupled, which makes the network programmable. Each switch in SDN keeps track of forwarding information in a flow table, and must search the flow table for the rules that match incoming packets in order to handle them. Due to the vast quantity of data in data centres, the capacity of the flow table restricts the data plane's forwarding capabilities, so the SDN must handle traffic from across the whole network. The flow table depends on Ternary Content Addressable Memory (TCAM) for storage and quick rule lookup; TCAM is restricted in capacity owing to its elevated cost and energy consumption. Whenever the flow table is abused and overflows, normal rules cannot be installed quickly. Here we consider low-rate flow table overflow, which installs collision flow rules and consumes excessive flow table capacity by delivering packets that do not match the flow table at a low rate. This study introduces machine learning techniques for detecting and categorizing low-rate collision flows in SDN flow tables, using a Feed Forward Neural Network (FFNN), K-Means, and a Decision Tree (DT). We generate two network topologies, Fat Tree and Simple Tree, with the Mininet simulator, coupled to the OpenDayLight (ODL) controller. The efficiency and efficacy of the suggested algorithms are assessed using several indicators such as query success rate, propagation delay, overall dropped packets, energy consumption, bandwidth usage, latency rate, and throughput. The findings showed that the suggested technique for tackling the flow table congestion problem minimizes the number of flows while retaining the statistical consistency of the 5G network. Applying the proposed flow method, the evaluation tool examines every flow against a set of criteria, checking whether a packet may move from point A to point B without violating certain rules. The FFNN with DT and K-Means algorithms obtain accuracies of 96.29% and 97.51%, respectively, in the identification of collision flows, when compared with existing methods from the literature.
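The evaluation step described, checking each flow against rule criteria, can be pictured with a toy flow-table matcher. The rule format (wildcard `None` fields, an allow/deny action) is a simplification invented for this sketch, not the OpenFlow rule format.

```python
# Toy flow-table matcher (rule format is a simplification for illustration).

def matches(rule, pkt):
    """A rule field of None is a wildcard; otherwise it must equal the packet's."""
    return all(v is None or pkt.get(k) == v for k, v in rule["match"].items())

def allowed(flow_table, pkt):
    """Return True if the first matching rule allows the packet; default deny."""
    for rule in flow_table:
        if matches(rule, pkt):
            return rule["action"] == "allow"
    return False

table = [
    {"match": {"src": "A", "dst": "B"}, "action": "allow"},
    {"match": {"src": None, "dst": "C"}, "action": "deny"},
]
ok = allowed(table, {"src": "A", "dst": "B"})   # True
```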
Keywords: 5G networks; software-defined networking (SDN); OpenFlow; load balancing; machine learning (ML); feed forward neural network (FFNN); K-means; decision tree (DT)
A Model Training Method for DDoS Detection Using CTGAN under 5GC Traffic
Authors: Yea-Sul Kim, Ye-Eun Kim, Hwankuk Kim. Computer Systems Science & Engineering (SCIE, EI), 2023, No. 10, pp. 1125-1147.
With the commercialization of 5th-generation mobile communications (5G) networks, a large-scale internet of things (IoT) environment is being built. Security is becoming increasingly crucial in 5G network environments due to the growing risk of various distributed denial of service (DDoS) attacks across vast numbers of IoT devices. Recently, research on automated intrusion detection using machine learning (ML) for 5G environments has been actively conducted. However, 5G traffic data is scarce due to privacy protection concerns and suffers from imbalance problems, with significantly less attack data. An ML model trained on such data will likely suffer from generalization errors because it has not seen enough different features of the attack data. Therefore, this paper studies a training method that mitigates the generalization error problem of ML models classifying IoT DDoS attacks under conditions of insufficient and imbalanced 5G traffic. We built a 5G testbed to construct a 5G dataset for training, addressing the problem of insufficient data. To address the imbalance problem, the synthetic minority oversampling technique (SMOTE) and the generative adversarial network (GAN)-based conditional tabular GAN (CTGAN) were used for data augmentation. The performance of the trained ML models was compared and analyzed with respect to the generalization error problem. The experimental results showed that CTGAN decreased the accuracy and F1-score compared to the baseline. Still, regarding the generalization error, the difference between the validation and test results was reduced by a factor of at least 1.7 and up to 22.88, indicating an improvement. This result suggests that training ML models with CTGAN-augmented attack data mitigates the generalization error problem in the 5G environment.
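The SMOTE part of the augmentation pipeline rests on a simple idea: synthesize a minority-class sample by interpolating between a real sample and one of its minority-class neighbours. A bare-bones sketch of that interpolation step (not the library implementation, which also selects among the k nearest neighbours):

```python
# Core SMOTE interpolation step (sketch; real SMOTE also picks the
# neighbour at random among the k nearest minority samples).
import random

def smote_sample(x, neighbor, gap=None):
    """Synthesize a point on the segment between x and a minority neighbour."""
    g = random.random() if gap is None else gap
    return [xi + g * (ni - xi) for xi, ni in zip(x, neighbor)]

# With the gap fixed at 0.5 the synthetic point is the midpoint.
new_point = smote_sample([0.0, 0.0], [2.0, 4.0], gap=0.5)   # [1.0, 2.0]
```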
Keywords: 5G core traffic; machine learning; SMOTE; GAN-CTGAN; IoT DDoS detection; tabular form; cyber security; B5G mobile network security
Detecting IoT Botnet in 5G Core Network Using Machine Learning
Authors: Ye-Eun Kim, Min-Gyu Kim, Hwankuk Kim. Computers, Materials & Continua (SCIE, EI), 2022, No. 9, pp. 4467-4488.
As Internet of Things (IoT) devices with security issues are connected to 5G mobile networks, the importance of IoT botnet detection research in mobile network environments is increasing. However, existing research has focused on AI-based IoT botnet detection in wired network environments, and existing work on ML-based IoT botnet detection in mobile network environments has only covered up to 4G. Therefore, this paper studies ML-based IoT botnet traffic detection in the 5G core network. Binary and multiclass classification were performed to compare simple normal/malicious detection with normal/three-type IoT botnet malware detection. In both classification settings, IoT botnet detection using only 5GC GTP-U packets showed an accuracy decrease of at least 22.99% compared to detection in a wired network environment. In addition, a feature importance experiment confirmed the importance of studying features for IoT botnet detection that take 5GC network characteristics into account. Since this paper analyzed IoT botnet traffic passing through the 5GC network using ML and presented detection results, we think it will be a meaningful reference for research linking AI-based security to the 5GC network.
Keywords: IoT botnet; 5G; B5G; malware; machine learning
Machine Learning for Network Slicing Resource Management:A Comprehensive Survey
Authors: HAN Bin, Hans D. Schotten. ZTE Communications, 2019, No. 4, pp. 27-32.
The emerging technology of multi-tenancy network slicing is considered an essential feature of 5G cellular networks. It provides network slices as a new type of public cloud service, thereby increasing service flexibility and enhancing network resource efficiency. Meanwhile, it raises new challenges for network resource management. A number of methods have been proposed in recent years, in which machine learning and artificial intelligence techniques are widely deployed. In this article, we provide a survey of existing approaches to network slicing resource management, with a highlight on the roles played by machine learning in them.
Keywords: 5G; machine learning; multi-tenancy; network slicing; resource management
The Way to Apply Machine Learning to IoT Driven Wireless Network from Channel Perspective (Cited by 5)
Authors: Wei Li, Jianhua Zhang, Xiaochuan Ma, Yuxiang Zhang, Hua Huang, Yongmei Cheng. China Communications (SCIE, CSCD), 2019, No. 1, pp. 148-164.
The Internet of Things (IoT) is one of the targeted application scenarios of fifth generation (5G) wireless communication. IoT brings a large amount of data transported over the network. Given those data, machine learning (ML) algorithms can naturally be utilized to make the network operate efficiently and reliably. However, how to fully apply ML to IoT-driven wireless networks is still an open question. The fundamental reason is that wireless communication pursues high capacity and quality in the face of challenges from the varying and fading wireless channel. In this paper, we therefore explore feasible combinations of ML and IoT-driven wireless networks from the wireless channel perspective. First, a three-level structure of wireless channel fading features is defined in order to classify the versatile propagation environments; the three layers are the scenario, meter, and wavelength levels. Based on this structure, different tasks such as service prediction and pushing, self-organized networking, and self-adapting large-scale fading modeling can be abstracted into problems like regression, classification, and clustering. We then introduce the corresponding ML methods for the different levels from the channel perspective, which makes interdisciplinary research in this area promising.
Keywords: 5G; Internet of Things; machine learning; wireless channel
Telemedicine and Smart Healthcare—The Role of Artificial Intelligence, 5G, Cloud Services, and Other Enabling Technologies
Authors: Taofik Ahmed Suleiman, Abdulkareem Adinoyi. International Journal of Communications, Network and System Sciences, 2023, No. 3, pp. 31-51.
This paper discusses telemedicine and the employment of advanced mobile technologies in smart healthcare delivery. It covers the technological advances in connected smart healthcare, including the roles of artificial intelligence, machine learning, 5G and IoT platforms, and other enabling technologies. It also presents the challenges and potential risks that could arise from delivering connected smart healthcare services. Healthcare delivery is witnessing revolutions engineered by the developments in mobile connectivity and the plethora of platforms, applications, sensors, devices, and equipment that go along with it. Human society is evolving fast in response to these technological developments, which are also pushing the connectivity-providing sector to create and adopt new waves of network technologies. Consequently, new communications technologies have been introduced into the healthcare system, and many novel applications have been developed to make it easier to share data in various forms and volumes within health-related services. These applications have also made it possible for telemedicine to be effectively adopted. This paper provides an overview of some of the recent developments within the space of mobile connectivity and telemedicine.
Keywords: telemedicine; smart healthcare; 5G; artificial intelligence; machine learning; Internet of Medical Things
Artificial Intelligence-Empowered Resource Management for Future Wireless Communications: A Survey (Cited by 12)
Authors: Mengting Lin, Youping Zhao. China Communications (SCIE, CSCD), 2020, No. 3, pp. 58-77.
How to explore and exploit the full potential of artificial intelligence (AI) technologies in future wireless communications, such as beyond 5G (B5G) and 6G, is an extremely hot interdisciplinary research topic around the world. On the one hand, AI empowers intelligent resource management for wireless communications through powerful learning and automatic adaptation capabilities. On the other hand, embracing AI in wireless communication resource management calls for new network architectures and system models, as well as standardized interfaces, protocols, and data formats, to facilitate the large-scale deployment of AI in future B5G/6G networks. This paper reviews state-of-the-art AI-empowered resource management from the framework perspective down to the methodology perspective, considering not only radio resources (e.g., spectrum) but also other types of resources such as computing and caching. We also discuss the challenges and opportunities for AI-based resource management on the way to wide deployment of AI in future wireless communication networks.
Keywords: 5G; beyond 5G (B5G); 6G; artificial intelligence (AI); machine learning (ML); network slicing; resource management
Enabling Energy Efficiency in 5G Network (Cited by 4)
Authors: LIU Zhuang, GAO Yin, LI Dapeng, CHEN Jiajun, HAN Jiren. ZTE Communications, 2021, No. 1, pp. 20-29.
The mobile Internet and Internet of Things are considered the main driving forces of 5G, as they require an ultra-dense deployment of small base stations to meet increasing traffic demands. 5G new radio (NR) access is designed to enable denser network deployments, which raises significant concerns about network energy consumption. Energy consumption is a major part of network operational expense (OPEX), and base stations are the main energy-consuming equipment in the radio access network (RAN). To achieve RAN energy efficiency (EE), switching off cells is a strategy for reducing network energy consumption during off-peak conditions. This paper introduces the NR cell switching on/off schemes defined in 3GPP to achieve energy efficiency in the 5G RAN, including the intra-system energy saving (ES) scheme and the inter-system ES scheme. Additionally, NR architectural features including the central unit/distributed unit (CU/DU) split and dual connectivity (DC) are considered in NR energy saving. How to apply artificial intelligence (AI) in 5G networks is a new topic in 3GPP, and we also propose a machine learning (ML) based scheme that saves energy by switching off a cell selected on the basis of load prediction. According to experimental results in a real wireless environment, the ML-based ES scheme reduces power consumption more than the conventional ES scheme without load prediction.
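The switch-off decision driven by load prediction can be sketched as below. The threshold rule and the data layout are assumptions made for illustration; they are not the 3GPP scheme or the paper's algorithm.

```python
# Sketch: switch off the cell with the lowest predicted off-peak load,
# but only if that load is small enough to be absorbed by neighbours
# (the threshold rule and data layout are illustrative assumptions).

def select_cell_to_switch_off(predicted_load, threshold=0.2):
    """Return the least-loaded cell if its predicted load is below
    the threshold, otherwise None (no cell is switched off)."""
    cell, load = min(predicted_load.items(), key=lambda kv: kv[1])
    return cell if load < threshold else None

prediction = {"cell-1": 0.55, "cell-2": 0.08, "cell-3": 0.31}
victim = select_cell_to_switch_off(prediction)   # "cell-2"
```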
Keywords: cell switch-off; energy efficiency; energy saving; 5G; machine learning
Network-Aided Intelligent Traffic Steering in 5G Mobile Networks (Cited by 1)
Authors: Dae-Young Kim, Seokhoon Kim. Computers, Materials & Continua (SCIE, EI), 2020, No. 10, pp. 243-261.
Recently, the fifth generation (5G) of mobile networks has been deployed, providing a wide range of mobile services. The 5G mobile network supports improved mobile broadband, ultra-low latency, and densely deployed massive devices. It allows multiple radio access technologies and interworks them for services. 5G mobile systems employ traffic steering techniques to use multiple radio access technologies efficiently. However, conventional traffic steering techniques do not handle dynamic network conditions efficiently. In this paper, we propose a network-aided traffic steering technique for the 5G mobile network architecture. 5G mobile systems monitor network conditions and learn from network data. Through a machine learning algorithm such as a feed-forward neural network, the system recognizes dynamic network conditions and then performs traffic steering. The proposed scheme distributes traffic across multiple radio access technologies according to the ratio of measured throughput, and can therefore be expected to improve traffic steering efficiency. The performance of the proposed traffic steering scheme is evaluated using extensive computer simulations.
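Steering "according to the ratio of measured throughput" amounts to normalizing per-RAT throughput into traffic shares. A minimal sketch, with RAT names invented for the example:

```python
# Sketch: split traffic across radio access technologies in proportion
# to their measured throughput (RAT names are illustrative).

def steering_weights(throughput_mbps):
    """Normalize measured per-RAT throughput into traffic shares."""
    total = sum(throughput_mbps.values())
    return {rat: t / total for rat, t in throughput_mbps.items()}

shares = steering_weights({"nr": 80.0, "wifi": 20.0})   # {"nr": 0.8, "wifi": 0.2}
```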
Keywords: mobile network; 5G; traffic steering; machine learning; MEC
A Case Study on Intelligent Operation System for Wireless Networks
Authors: LIU Jianwei, YUAN Yifei, HAN Jing. ZTE Communications, 2019, No. 4, pp. 19-26.
The emerging fifth generation (5G) network has the potential to satisfy the rapidly growing traffic demand and to promote the transformation of smartphone-centric networks into an Internet of Things (IoT) ecosystem. Due to the introduction of new communication technologies and the increased density of 5G cells, the complexity of operation and operational expenditure (OPEX) will become very challenging in 5G. Self-organizing networks (SON) have been researched extensively since 2G to cope with similar challenges, however through predefined policies rather than intelligent analysis. The requirement for better quality of experience and the complexity of the 5G network call for an approach different from SON. Several recent studies have investigated the combination of machine learning (ML) technology with SON. In this paper, we focus on the intelligent operation of wireless networks through ML algorithms. A comprehensive and flexible framework is proposed to achieve an intelligent operation system. Two use cases are also studied in which ML algorithms automate anomaly detection and fault diagnosis of key performance indicators (KPIs) in wireless networks. The effectiveness of the proposed ML algorithms is demonstrated by experiments on real data, encouraging further research on intelligent wireless network operation.
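KPI anomaly detection of the kind described is often bootstrapped with a simple statistical baseline; the z-score rule below is such a baseline, offered as an assumed illustration rather than the paper's ML algorithm.

```python
# Baseline KPI anomaly check: flag values more than k standard
# deviations from the historical mean (illustrative baseline only,
# not the paper's ML model).

def is_anomalous(history, value, k=3.0):
    """True if value deviates from the history mean by more than k sigmas."""
    mean = sum(history) / len(history)
    std = (sum((x - mean) ** 2 for x in history) / len(history)) ** 0.5
    return abs(value - mean) > k * std

kpi_history = [100.0, 102.0, 98.0, 101.0, 99.0]
alarm = is_anomalous(kpi_history, 130.0)   # True: far outside the normal range
```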
Keywords: 5G; self-organizing network; machine learning; anomaly detection; fault diagnosis
Addressing the CQI feedback delay in 5G/6G networks via machine learning and evolutionary computing
Authors: Andson Balieiro, Kelvin Dias, Paulo Guarda. Intelligent and Converged Networks (EI), 2022, No. 3, pp. 271-281.
5G networks apply adaptive modulation and coding according to the channel condition reported by the user in order to maintain mobile communication quality. However, the delay incurred by the feedback may render the channel quality indicator (CQI) obsolete. This paper addresses this issue by proposing two approaches, one based on machine learning and another on evolutionary computing, which consider the user context and signal-to-interference-plus-noise ratio (SINR), in addition to the delay length, to estimate an updated SINR to be mapped into a CQI value. Our proposals are designed to run at the user equipment (UE) side, neither requiring any change in the signalling between the base station (gNB) and UE nor overloading the gNB. They are evaluated in terms of mean squared error using 5G network simulation data, and the results show their high accuracy and feasibility for deployment in 5G/6G systems.
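The final mapping from an estimated SINR to a CQI index is typically a threshold lookup. The threshold values below are invented for illustration; real tables depend on the vendor and the BLER target.

```python
# Sketch: map an (updated) SINR estimate to a CQI index via thresholds.
# The threshold values are illustrative, not a standardized table.
import bisect

CQI_THRESHOLDS_DB = [-6.0, -4.0, -2.0, 0.0, 2.0, 4.0, 6.0, 8.0,
                     10.0, 12.0, 14.0, 16.0, 18.0, 20.0, 22.0]

def sinr_to_cqi(sinr_db):
    """CQI 0..15: the number of thresholds the SINR meets or exceeds."""
    return bisect.bisect_right(CQI_THRESHOLDS_DB, sinr_db)

cqi = sinr_to_cqi(5.0)   # 6: exceeds the thresholds up to 4.0 dB
```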
Keywords: channel quality indicator (CQI) feedback delay; 5G/6G networks; machine learning; evolutionary computing
Research on Intelligent Optimization Technology for Future Mobile Communication Networks (Cited by 3)
Authors: GU Xinyu. 信息通信技术与政策 (Information and Communications Technology and Policy), 2018, No. 11, pp. 20-25.
Facing increasingly complex mobile communication networks, intelligence is the development direction of future network self-optimization technology. By adopting machine learning algorithms in adaptive optimization schemes, the network gains intelligence and can coordinate various optimization objectives according to changes in environment and state, achieving optimal parameter configuration. Based on an analysis of commonly used machine learning algorithms, combined with a review of the data characteristics of future networks, this paper proposes a preliminary framework and procedure for intelligent network optimization, and classifies the machine learning algorithms suitable for each network optimization function.
Keywords: network self-organization; 5G; machine learning; adaptive optimization; intelligent optimization
The Interdisciplinary Research of Big Data and Wireless Channel: A Cluster-Nuclei Based Channel Model (Cited by 19)
Authors: Jianhua Zhang. China Communications (SCIE, CSCD), 2016, No. S2, pp. 14-26.
Recently, the internet has stimulated explosive progress in knowledge discovery from big-volume data resources, mining valuable and hidden rules by computation. Simultaneously, wireless channel measurement data exhibit big-volume features, considering the massive antennas, huge bandwidth, and versatile application scenarios. This article first presents a comprehensive survey of channel measurement and modeling research for mobile communication, especially for the 5th generation (5G) and beyond. In light of progress in big data research, a cluster-nuclei based model is then proposed, which takes advantage of both stochastic and deterministic models. The novel model has low complexity, with a limited number of cluster nuclei, while each cluster nucleus has a physical mapping to a real propagation object. Combining the principles by which channel properties vary with antenna size, frequency, mobility, and scenario, as mined from the channel data, the proposed model can be extended to versatile applications to support future mobile research.
Keywords: channel model; big data; 5G; massive MIMO; machine learning; cluster
User Association in Ultra-Dense Small Cell Dynamic Vehicular Networks: A Reinforcement Learning Approach
Authors: Shipra Kapoor, David Grace, Tim Clarke. Journal of Communications and Information Networks (CSCD), 2019, No. 1, pp. 1-12.
Network densification is envisioned as one of the key enabling technologies in next generation and beyond wireless networks to satisfy the demand for high coverage and capacity whilst delivering ultra-reliable low latency communication services, especially to users on the move. One of the fundamental tasks in wireless networks is user association. In ultra-dense vehicular networks, due to the dense deployment and small coverage of the eNodeBs, more than one eNodeB may simultaneously satisfy the conventional maximum radio signal strength user association criterion. In addition, the spatial-temporal vehicle distribution in dynamic environments contributes significantly to a rapidly changing radio environment that substantially impacts user association, and therefore network performance and user experience. This paper addresses the problem of user association in dynamic environments by proposing an intelligent user association approach, variable-reward quality-aware Q-learning (VR-QAQL), which is able to strike a balance between the number of handovers per transmission and system performance whilst guaranteeing network quality of service. The VR-QAQL technique integrates control-theoretic concepts and the reinforcement learning approach in an LTE uplink, using the framework of an urban vehicular environment. The algorithm is assessed using large-scale simulation of a highway scenario at different vehicle speeds in an urban setting. The results demonstrate that the proposed VR-QAQL algorithm outperforms all the other investigated approaches across all mobility levels.
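At the core of any Q-learning-based association scheme such as VR-QAQL sits the standard temporal-difference update. The sketch below shows only that generic update; the variable-reward shaping itself is paper-specific and not reproduced here, and the state/action labels are invented.

```python
# Generic tabular Q-learning update (the VR-QAQL reward shaping is
# paper-specific; this shows only the standard TD step it builds on).

def q_update(q, state, action, reward, next_q_max, alpha=0.1, gamma=0.9):
    """Q(s,a) <- Q(s,a) + alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    old = q.get((state, action), 0.0)
    q[(state, action)] = old + alpha * (reward + gamma * next_q_max - old)
    return q[(state, action)]

q_table = {}
# A UE in state "edge" associates with cell "eNB-2" and observes reward 1.0.
new_q = q_update(q_table, "edge", "eNB-2", reward=1.0, next_q_max=0.0)
```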
Keywords: 5G; access protocols; adaptive algorithms; control design; dynamic range; handover; machine learning algorithms; multiagent systems; radio access networks; uplink; user-centered design
QSPCA: A two-stage efficient power control approach in D2D communication for 5G networks
Authors: Saurabh Chandra, Prateek, Rohit Sharma, Rajeev Arya, Korhan Cengiz. Intelligent and Converged Networks, 2021, No. 4, pp. 295-305.
The existing literature on device-to-device (D2D) architecture suffers from a dearth of analysis under imperfect channel conditions, and rigorous analyses of policy improvement and evaluation of network performance are needed. Accordingly, a two-stage transmit power control approach (named QSPCA) is proposed: first, a reinforcement Q-learning based power control technique, and second, a supervised learning based support vector machine (SVM) model. This model replaces the unified communication model of the conventional D2D setup with a distributed one, thereby requiring fewer resources, such as D2D throughput, transmit power, and signal-to-interference-plus-noise ratio, compared to existing algorithms. Results confirm that the QSPCA technique improves throughput over existing models by at least 15.31% compared to SVM and 19.5% compared to Q-learning techniques. The customizability of the QSPCA technique opens up multiple avenues and industrial communication technologies in 5G networks, such as factory automation.
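The Q-learning stage of a power control scheme like QSPCA chooses among power adjustment actions. The simplest deterministic version of such an action, stepping power up when the SINR is below target and down otherwise within device limits, looks like this; the step size and power limits are assumed values, not taken from the paper.

```python
# Sketch of a single power control action: step transmit power toward
# the SINR target within device limits (step and limits are assumed).

def power_step(p_dbm, sinr_db, target_db, step=1.0, p_min=-40.0, p_max=23.0):
    """Raise power if below the SINR target, lower it otherwise; clamp to limits."""
    p = p_dbm + step if sinr_db < target_db else p_dbm - step
    return max(p_min, min(p_max, p))

p = power_step(10.0, sinr_db=3.0, target_db=6.0)   # 11.0: below target, step up
```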
Keywords: device-to-device (D2D); interference; Internet of Things (IoT); machine learning; power control; Q-learning; support vector machine (SVM); 5G