Video streaming applications have grown considerably in recent years and have become one of the most significant contributors to global internet traffic. According to recent studies, the telecommunications industry loses millions of dollars due to poor video Quality of Experience (QoE) for users. Among the proposals for standardizing the quality of video streaming over internet service providers (ISPs) is the Mean Opinion Score (MOS). However, determining QoE accurately via MOS is subjective and laborious, and it varies from user to user. A fully automated data analytics framework is required to reduce the inter-operator variability inherent in QoE assessment. This work addresses this concern by proposing a novel hybrid XGBStackQoE analytical model using a two-level layering technique. Level one combines multiple Machine Learning (ML) models, each trained on the entire training data set. The level-two Hybrid XGBStackQoE model is fitted using the outputs (meta-features) of the level-one ML models. The proposed model outperformed conventional models, with an accuracy improvement of 4 to 5 percent, and the proposed framework could significantly improve video QoE assessment accuracy.
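The two-level stacking idea in this abstract can be illustrated with a toy sketch. Everything below is a simplifying assumption for illustration only: the level-one learners are single-threshold classifiers and the level-two meta-model is an accuracy-weighted vote, standing in for the paper's actual ML models and XGBoost meta-learner.

```python
# Toy two-level stacking sketch (NOT the authors' XGBStackQoE model):
# level-one models are trained on the full training set, and their
# predictions become meta-features for the level-two meta-model.

def train_threshold_model(xs, ys):
    """Level-one learner: pick the threshold that best separates labels."""
    best_t, best_acc = 0.0, -1.0
    for t in sorted(set(xs)):
        acc = sum((x >= t) == y for x, y in zip(xs, ys)) / len(ys)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return lambda x: int(x >= best_t)

def train_stack(features_a, features_b, labels):
    """Level one: two base models on the full training set.
    Level two: a meta-model fitted on their outputs (meta-features)."""
    m1 = train_threshold_model(features_a, labels)
    m2 = train_threshold_model(features_b, labels)
    # Meta-features: base-model predictions on the training data.
    meta = [(m1(a), m2(b)) for a, b in zip(features_a, features_b)]
    # Level-two meta-model: vote weighted by each base model's training
    # agreement with the labels (a stand-in for the XGBoost meta-learner).
    w1 = sum(p == y for (p, _), y in zip(meta, labels))
    w2 = sum(p == y for (_, p), y in zip(meta, labels))
    return lambda a, b: int(w1 * m1(a) + w2 * m2(b) >= (w1 + w2) / 2)

# Hypothetical QoE features: bitrate score and stall score per session.
fa = [0.9, 0.8, 0.2, 0.1]
fb = [0.7, 0.9, 0.3, 0.2]
y  = [1, 1, 0, 0]
model = train_stack(fa, fb, y)
print(model(0.85, 0.8))  # high-quality session -> 1
```

The point of the sketch is only the data flow: base models see raw features, the meta-model sees only base-model outputs.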
Key challenges for 5G and Beyond networks relate to the requirements for exceptionally low latency, high reliability, and extremely high data rates. The Ultra-Reliable Low Latency Communication (URLLC) use case is the trickiest to support: current research focuses on physical- or MAC-layer solutions, while proposals targeting the network layer with Machine Learning (ML) and Artificial Intelligence (AI) algorithms running on base stations and User Equipment (UE) or Internet of Things (IoT) devices are at an early stage. In this paper, we describe the operating rationale of the most relevant recent ML algorithms and techniques, and we propose and validate ML algorithms running on both cells (base stations/gNBs) and UEs or IoT devices to handle URLLC service control. One ML algorithm runs on base stations to evaluate latency demands and offload traffic when needed, while another lightweight algorithm runs on UEs and IoT devices to rank cells by URLLC service quality in real time and indicate the best cell for a UE or IoT device to camp on. We show that the interplay of these algorithms leads to good service control and, eventually, optimal load allocation under slow load mobility.
The rapid advancement of wireless communication is forming a hyper-connected 5G network in which billions of linked devices generate massive amounts of data. In software-defined networking (SDN), the traffic control and data forwarding functions are decoupled, making the network programmable. Each SDN switch keeps its forwarding information in a flow table and must search that table for the rules matching each incoming packet. Given the vast quantity of data in data centres, the capacity of the flow table limits the data plane's forwarding capability, yet the SDN must handle traffic from across the whole network. The flow table relies on Ternary Content Addressable Memory (TCAM) for storage and fast rule lookup, but TCAM capacity is restricted owing to its high cost and energy consumption. When the flow table is abused and overflows, normal rules can no longer be installed quickly. We consider low-rate flow-table overflow attacks, which install colliding flow rules and consume excessive flow-table capacity by delivering packets that do not match the flow table at a low rate. This study introduces machine learning techniques for detecting and categorizing low-rate collision flows in SDN, using a Feed Forward Neural Network (FFNN), K-Means, and a Decision Tree (DT). We generate two network topologies, Fat Tree and Simple Tree, with the Mininet simulator and couple them to the OpenDayLight (ODL) controller. The efficiency and efficacy of the suggested algorithms are assessed using several indicators, such as query success rate, propagation delay, overall dropped packets, energy consumption, bandwidth usage, latency, and throughput. The findings showed that the suggested technique for tackling flow-table congestion minimizes the number of flows while retaining the statistical consistency of the 5G network. The evaluation tool examines every flow against a set of criteria, applying the proposed flow method and checking whether a packet may move from point A to point B without breaking certain rules. Compared with existing methods from the literature, the FFNN with DT and K-Means algorithms obtain accuracies of 96.29% and 97.51%, respectively, in identifying collision flows.
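The decision-tree part of the detection pipeline can be sketched as a few hand-written branching rules over flow-table statistics. The feature names (`pkt_rate`, `table_occupancy`, `match_count`) and thresholds below are illustrative assumptions, not the trained DT from the paper, which is learned from Mininet/OpenDayLight traffic.

```python
# Hand-written decision-tree sketch for flagging low-rate collision flows
# in an SDN flow table. Thresholds and features are assumed for the demo.

def classify_flow(pkt_rate, table_occupancy, match_count):
    """Return 'collision' for suspicious low-rate entries, else 'normal'."""
    if match_count == 0:                 # rule installed but never matched
        if pkt_rate < 5.0:               # trickle traffic keeps it alive
            if table_occupancy > 0.8:    # table close to overflowing
                return "collision"
        return "suspect"
    return "normal"

flows = [
    (1.2, 0.9, 0),    # low rate, near-full table, unused rule
    (120.0, 0.9, 40)  # busy legitimate flow
]
print([classify_flow(*f) for f in flows])  # ['collision', 'normal']
```

A trained DT produces exactly this kind of nested threshold structure; the difference is that the splits are learned from labelled traffic rather than chosen by hand.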
With the commercialization of 5th-generation mobile communications (5G) networks, a large-scale Internet of Things (IoT) environment is being built. Security is becoming increasingly crucial in 5G network environments due to the growing risk of distributed denial of service (DDoS) attacks across vast numbers of IoT devices. Recently, research on automated intrusion detection using machine learning (ML) for 5G environments has been actively conducted. However, 5G traffic data is scarce due to privacy protection, and it is imbalanced, with significantly fewer attack samples. An ML model trained on such data is likely to suffer from generalization error because it sees too few distinct features of the attack data. This paper therefore studies a training method that mitigates the generalization error of ML models classifying IoT DDoS attacks under insufficient and imbalanced 5G traffic. We built a 5G testbed to construct a 5G training dataset, addressing the data-scarcity problem. To address the imbalance problem, we used two data-augmentation techniques: the synthetic minority oversampling technique (SMOTE) and the conditional tabular GAN (CTGAN), a generative adversarial network (GAN)-based method. The performance of the trained ML models was compared and analyzed with respect to the generalization error problem. The experimental results showed that CTGAN decreased accuracy and F1-score compared with the baseline; still, regarding generalization error, the gap between validation and test results shrank by a factor of at least 1.7 and up to 22.88, indicating an improvement. This result suggests that training ML models with CTGAN-augmented attack data mitigates the generalization error problem in the 5G environment.
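The core of SMOTE, one of the two augmentation techniques above, is interpolation between a minority-class sample and one of its minority-class neighbours. The following pure-Python sketch is a simplified stand-in for the library implementations the paper presumably uses; the data and parameters are illustrative.

```python
# Minimal SMOTE-style oversampling: synthesize minority samples by
# interpolating between a minority point and a random one of its k
# nearest minority-class neighbours. Simplified illustration only.
import random

def smote_like(minority, n_new, k=2, seed=0):
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        # k nearest minority neighbours of a (squared Euclidean distance)
        neigh = sorted((p for p in minority if p is not a),
                       key=lambda p: sum((x - y) ** 2 for x, y in zip(a, p)))[:k]
        b = rng.choice(neigh)
        lam = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(x + lam * (y - x) for x, y in zip(a, b)))
    return synthetic

attack = [(0.1, 0.2), (0.15, 0.25), (0.9, 0.8)]  # scarce attack samples
new = smote_like(attack, n_new=4)
print(len(new))  # 4 synthetic attack samples on segments between neighbours
```

Each synthetic point lies on the line segment between two real minority samples, which is why SMOTE adds plausible variation without inventing features outside the observed attack region.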
As Internet of Things (IoT) devices with security issues are connected to 5G mobile networks, IoT botnet detection research in mobile network environments is growing in importance. However, existing research has focused on AI-based IoT botnet detection in wired network environments, and existing ML-based work on IoT botnet detection in mobile networks has only covered up to 4G. This paper therefore studies ML-based IoT botnet traffic detection in the 5G core network (5GC). Binary and multiclass classification were performed to compare simple normal/malicious detection with normal/three-type IoT botnet malware detection. In both classification settings, IoT botnet detection using only the 5GC's GTP-U packets lost at least 22.99% of accuracy compared with detection in a wired network environment. In addition, a feature importance experiment confirmed the need for feature studies that consider 5GC network characteristics in IoT botnet detection. Since this paper analyzed IoT botnet traffic passing through the 5GC network using ML and presented detection results, we believe it will be a meaningful reference for research linking AI-based security to the 5GC network.
The emerging technology of multi-tenancy network slicing is considered an essential feature of 5G cellular networks. It provides network slices as a new type of public cloud service, thereby increasing service flexibility and enhancing network resource efficiency. Meanwhile, it raises new challenges in network resource management. A variety of methods have been proposed over recent years, in which machine learning and artificial intelligence techniques are widely deployed. In this article, we provide a survey of existing approaches to network slicing resource management, highlighting the roles played by machine learning in them.
The Internet of Things (IoT) is one of the targeted application scenarios of fifth-generation (5G) wireless communication. IoT brings a large amount of data onto the network, and machine learning (ML) algorithms can naturally exploit those data to make the network efficient and reliable. However, how to fully apply ML to IoT-driven wireless networks remains an open question. The fundamental reason is that wireless communication pursues high capacity and quality in the face of varying and fading wireless channels. In this paper, we therefore explore feasible combinations of ML and IoT-driven wireless networks from the wireless-channel perspective. First, a three-level structure of wireless channel fading features is defined to classify the versatile propagation environments; it comprises the scenario, meter, and wavelength levels. Based on this structure, tasks such as service prediction and pushing, self-organizing networking, and self-adapting large-scale fading modeling can be abstracted into problems such as regression, classification, and clustering. We then introduce the ML methods corresponding to each level from the channel perspective, making interdisciplinary research between the two fields promising.
This paper discusses telemedicine and the employment of advanced mobile technologies in smart healthcare delivery. It covers the technological advances in connected smart healthcare, including the roles of artificial intelligence, machine learning, 5G and IoT platforms, and other enabling technologies. It also presents the challenges and potential risks that could arise from delivering connected smart healthcare services. Healthcare delivery is witnessing revolutions engineered by the developments in mobile connectivity and the plethora of platforms, applications, sensors, devices, and equipment that go along with it. Human society is evolving fast in response to these technological developments, which are also pushing the connectivity-providing sector to create and adopt new waves of network technologies. Consequently, new communications technologies have been introduced into the healthcare system, and many novel applications have been developed to ease the sharing of data in various forms and volumes within health-related services. These applications have also made it possible for telemedicine to be effectively adopted. This paper provides an overview of some of the recent developments within the space of mobile connectivity and telemedicine.
How to explore and exploit the full potential of artificial intelligence (AI) technologies in future wireless communications, such as beyond-5G (B5G) and 6G, is an extremely hot interdisciplinary research topic around the world. On the one hand, AI empowers intelligent resource management for wireless communications through powerful learning and automatic adaptation capabilities. On the other hand, embracing AI in wireless communication resource management calls for new network architectures and system models, as well as standardized interfaces, protocols, and data formats, to facilitate the large-scale deployment of AI in future B5G/6G networks. This paper reviews state-of-the-art AI-empowered resource management from the framework perspective down to the methodology perspective, considering not only radio resources (e.g., spectrum) but also other types of resources such as computing and caching. We also discuss the challenges and opportunities for AI-based resource management toward the wide deployment of AI in future wireless communication networks.
The mobile Internet and the Internet of Things are considered the main driving forces of 5G, as they require an ultra-dense deployment of small base stations to meet increasing traffic demands. 5G new radio (NR) access is designed to enable denser network deployments, while raising a significant concern about network energy consumption. Energy consumption is a major part of network operational expense (OPEX), and base stations are the main energy-consuming equipment in the radio access network (RAN). To achieve RAN energy efficiency (EE), switching off cells is a strategy for reducing network energy consumption during off-peak conditions. This paper introduces the NR cell switch-on/off schemes defined in 3GPP to achieve energy efficiency in the 5G RAN, including the intra-system energy saving (ES) scheme and the inter-system ES scheme. Additionally, NR architectural features including the central unit/distributed unit (CU/DU) split and dual connectivity (DC) are also considered in NR energy saving. How to apply artificial intelligence (AI) to 5G networks is a new topic in 3GPP, and we also propose a machine learning (ML) based scheme that saves energy by switching off cells selected on the basis of load prediction. According to experimental results in a real wireless environment, the ML-based ES scheme reduces power consumption more than the conventional ES scheme without load prediction.
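The load-prediction-driven switch-off idea can be sketched in a few lines. The moving-average predictor, the thresholds, and the spare-capacity check below are all illustrative assumptions standing in for the paper's ML load model and 3GPP ES procedures.

```python
# Hedged sketch of cell switch-off driven by load prediction: a toy
# moving-average forecaster stands in for the ML model, and a cell is
# switched off only if its predicted load is low AND the remaining
# cells have enough spare capacity to absorb its traffic.

def predict_load(history, window=3):
    """Forecast next-interval load as the mean of the last `window` samples."""
    recent = history[-window:]
    return sum(recent) / len(recent)

def cells_to_switch_off(loads, off_threshold=0.2, capacity=1.0):
    """Select cells for switch-off, lowest predicted load first."""
    preds = {cell: predict_load(h) for cell, h in loads.items()}
    off = []
    for cell, p in sorted(preds.items(), key=lambda kv: kv[1]):
        others = [preds[c] for c in preds if c != cell and c not in off]
        spare = sum(capacity - q for q in others)
        if p < off_threshold and spare >= p:
            off.append(cell)
    return off

loads = {"cellA": [0.1, 0.15, 0.05], "cellB": [0.7, 0.8, 0.75],
         "cellC": [0.4, 0.5, 0.45]}
print(cells_to_switch_off(loads))  # ['cellA'] in this off-peak interval
```

The difference the abstract reports is precisely between this kind of predictive decision and a conventional scheme that reacts only to the current load sample.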
Recently, the fifth generation (5G) of mobile networks has been deployed and a wide range of mobile services has been provided. The 5G mobile network supports improved mobile broadband, ultra-low latency, and densely deployed massive devices. It allows multiple radio access technologies and interworks them to deliver services. 5G mobile systems employ traffic steering techniques to use multiple radio access technologies efficiently. However, conventional traffic steering techniques do not handle dynamic network conditions efficiently. In this paper, we propose a network-aided traffic steering technique for the 5G mobile network architecture. 5G mobile systems monitor network conditions and learn from network data. Through a machine learning algorithm such as a feed-forward neural network, the system recognizes dynamic network conditions and then performs traffic steering. The proposed scheme distributes traffic over multiple radio access technologies according to the ratio of measured throughput, and can thus be expected to improve traffic steering efficiency. The performance of the proposed traffic steering scheme is evaluated using extensive computer simulations.
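The throughput-ratio steering rule mentioned above reduces to a simple proportional split. The two-RAT setup and the numbers below are illustrative assumptions, not the paper's simulation configuration.

```python
# Minimal sketch of ratio-based traffic steering: split traffic across
# radio access technologies in proportion to their measured throughput.

def steering_ratios(measured_tput):
    """Map measured per-RAT throughput (Mbit/s) to traffic split fractions."""
    total = sum(measured_tput.values())
    if total == 0:
        # no measurements yet: split evenly across RATs
        return {rat: 1.0 / len(measured_tput) for rat in measured_tput}
    return {rat: t / total for rat, t in measured_tput.items()}

ratios = steering_ratios({"NR": 300.0, "WiFi": 100.0})
print(ratios)  # {'NR': 0.75, 'WiFi': 0.25}
```

In the proposed scheme the inputs to this split would come from the network-side ML model's view of current conditions rather than raw instantaneous measurements.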
The emerging fifth generation (5G) network has the potential to satisfy the rapidly growing traffic demand and to promote the transformation of smartphone-centric networks into an Internet of Things (IoT) ecosystem. Due to the introduction of new communication technologies and the increased density of 5G cells, the complexity of operation and the operational expenditure (OPEX) will become very challenging in 5G. Self-organizing networks (SON) have been researched extensively since 2G to cope with a similar challenge, however via predefined policies rather than intelligent analysis. The demand for better quality of experience and the complexity of the 5G network call for an approach different from SON. Several recent studies have investigated combining machine learning (ML) technology with SON. In this paper, we focus on the intelligent operation of wireless networks through ML algorithms. A comprehensive and flexible framework is proposed to achieve an intelligent operation system. Two use cases are also studied in which ML algorithms automate anomaly detection and fault diagnosis of key performance indicators (KPIs) in wireless networks. The effectiveness of the proposed ML algorithms is demonstrated by experiments on real data, encouraging further research into intelligent wireless network operation.
5G networks apply adaptive modulation and coding according to the channel condition reported by the user in order to maintain mobile communication quality. However, the delay incurred by the feedback may make the channel quality indicator (CQI) obsolete. This paper addresses this issue by proposing two approaches, one based on machine learning and the other on evolutionary computing, which consider the user context and the signal-to-interference-plus-noise ratio (SINR), besides the delay length, to estimate the updated SINR to be mapped into a CQI value. Our proposals are designed to run on the user equipment (UE) side, neither requiring any change in the signalling between the base station (gNB) and the UE nor overloading the gNB. They are evaluated in terms of mean squared error on 5G network simulation data, and the results show their high accuracy and feasibility for deployment in 5G/6G systems.
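The CQI-staleness problem can be made concrete with a toy baseline: extrapolate SINR over the feedback delay from recent samples, then map the result to a CQI index. The linear extrapolation and the uniform SINR-to-CQI mapping below are simplifying assumptions; the paper's UE-side estimators use ML and evolutionary computing, and real CQI tables are defined by 3GPP.

```python
# Toy sketch of compensating an outdated CQI report: project the SINR
# trend over the feedback delay, then map the estimate to a CQI index.

def extrapolate_sinr(sinr_history, delay_slots):
    """Linear trend from the last two SINR samples, projected over the delay."""
    if len(sinr_history) < 2:
        return sinr_history[-1]
    slope = sinr_history[-1] - sinr_history[-2]  # dB per slot
    return sinr_history[-1] + slope * delay_slots

def sinr_to_cqi(sinr_db):
    """Crude uniform mapping of SINR in [-6, 24] dB to CQI 0..15 (assumed)."""
    cqi = round((sinr_db + 6.0) / 30.0 * 15.0)
    return max(0, min(15, cqi))

history = [10.0, 9.0, 8.0]      # SINR decaying 1 dB per slot
updated = extrapolate_sinr(history, delay_slots=4)
print(updated, sinr_to_cqi(updated))  # 4.0 5
```

A learned estimator replaces the naive linear trend with a model that also accounts for user context, which is where the accuracy gains reported in the abstract come from.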
Recently, the internet has stimulated explosive progress in knowledge discovery from big-volume data resources, digging out valuable hidden rules by computation. Simultaneously, wireless channel measurement data exhibits big-volume features, considering the massive antennas, huge bandwidths, and versatile application scenarios involved. This article first presents a comprehensive survey of channel measurement and modeling research for mobile communication, especially for the 5th generation (5G) and beyond. In light of progress in big data research, a cluster-nuclei based model is then proposed that takes advantage of both stochastic and deterministic models. The novel model has low complexity, with a limited number of cluster nuclei, while each cluster nucleus maps physically to a real propagation object. Combining the principles governing how channel properties vary with antenna size, frequency, mobility, and scenario, mined from the channel data, the proposed model can be extended to versatile applications to support future mobile research.
Network densification is envisioned as one of the key enabling technologies of next-generation and beyond wireless networks to satisfy the demand for high coverage and capacity whilst delivering ultra-reliable low latency communication services, especially to users on the move. One of the fundamental tasks in wireless networks is user association. In ultra-dense vehicular networks, due to the dense deployment and small coverage of the eNodeBs, more than one eNodeB may simultaneously satisfy the conventional maximum radio signal strength association criterion. In addition, the spatial-temporal vehicle distribution in dynamic environments contributes significantly to a rapidly changing radio environment that substantially impacts user association, and therefore network performance and user experience. This paper addresses the problem of user association in dynamic environments by proposing an intelligent user association approach, variable-reward quality-aware Q-learning (VR-QAQL), which can strike a balance between the number of handovers per transmission and system performance whilst guaranteeing network quality of service. The VR-QAQL technique integrates control-theoretic concepts and the reinforcement learning approach in an LTE uplink, using the framework of an urban vehicular environment. The algorithm is assessed using large-scale simulation of a highway scenario at different vehicle speeds in an urban setting. The results demonstrate that the proposed VR-QAQL algorithm outperforms all the other investigated approaches across all mobility levels.
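The tabular Q-learning machinery behind a VR-QAQL-style association policy can be sketched as follows. The state/action encoding, the reward shape (link quality minus a handover penalty), and all parameters are illustrative assumptions, not the paper's actual variable-reward design.

```python
# Bare-bones tabular Q-learning update for user association:
# state = serving cell, action = cell to associate with,
# reward trades off link quality against handover cost (assumed shape).
from collections import defaultdict

ALPHA, GAMMA = 0.5, 0.9
Q = defaultdict(float)  # Q[(state, action)], default 0.0

def reward(quality, handover):
    return quality - (0.5 if handover else 0.0)  # penalize handovers

def update(state, action, r, next_state, actions):
    best_next = max(Q[(next_state, a)] for a in actions)
    Q[(state, action)] += ALPHA * (r + GAMMA * best_next - Q[(state, action)])

cells = ["gNB1", "gNB2"]
# One learning step: UE on gNB1 tries gNB2 (a handover) with good quality.
r = reward(quality=1.0, handover=True)
update("gNB1", "gNB2", r, "gNB2", cells)
print(round(Q[("gNB1", "gNB2")], 2))  # 0.25 after the first update
```

The handover penalty in the reward is what lets such an agent trade handover count against link quality, which is the balance the abstract attributes to VR-QAQL.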
The existing literature on device-to-device (D2D) architecture suffers from a dearth of analysis under imperfect channel conditions, and rigorous analyses of policy improvement and network performance evaluation are needed. Accordingly, a two-stage transmit power control approach (named QSPCA) is proposed: first, a reinforcement Q-learning based power control technique, and second, a supervised learning based support vector machine (SVM) model. This model replaces the unified communication model of the conventional D2D setup with a distributed one, thereby requiring fewer resources, such as D2D throughput, transmit power, and signal-to-interference-plus-noise ratio, than existing algorithms. Results confirm that the QSPCA technique outperforms existing models by at least 15.31% and 19.5% in throughput compared with the SVM and Q-learning techniques, respectively. The customizability of the QSPCA technique opens up multiple avenues and industrial communication technologies in 5G networks, such as factory automation.
Funding: Taif University Researchers Supporting Project number (TURSP-2020/215), Taif University, Taif, Saudi Arabia.
Funding: This work was supported by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2021-0-00796, Research on Foundational Technologies for 6G Autonomous Security-by-Design to Guarantee Constant Quality of Security).
Funding: This work was supported by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2021-0-00796, Research on Foundational Technologies for 6G Autonomous Security-by-Design to Guarantee Constant Quality of Security).
Abstract: The emerging technology of multi-tenancy network slicing is considered an essential feature of 5G cellular networks. It provides network slices as a new type of public cloud service, thereby increasing service flexibility and enhancing network resource efficiency. Meanwhile, it raises new challenges for network resource management. A number of methods have been proposed over recent years, in which machine learning and artificial intelligence techniques are widely deployed. In this article, we provide a survey of existing approaches to network slicing resource management, with a highlight on the roles played by machine learning in them.
Funding: Supported by the National Science and Technology Major Program of the Ministry of Science and Technology (No. 2018ZX03001031), the Key Program of the Beijing Municipal Natural Science Foundation (No. L172030), the Beijing Municipal Science and Technology Commission Project (No. Z181100003218007), and the National Key Technology Research and Development Program of the Ministry of Science and Technology of China (No. 2012BAF14B01).
Abstract: The Internet of Things (IoT) is one of the targeted application scenarios of fifth-generation (5G) wireless communication. IoT brings a large amount of data onto the network, and machine learning (ML) algorithms can naturally be utilized on that data to make the network operate efficiently and reliably. However, how to fully apply ML to IoT-driven wireless networks is still an open question. The fundamental reason is that wireless communication pursues high capacity and quality while facing challenges from the varying and fading wireless channel. In this paper, we therefore explore feasible combinations of ML and IoT-driven wireless networks from the wireless channel perspective. First, a three-level structure of wireless channel fading features is defined in order to classify the versatile propagation environments; the three levels are the scenario, meter, and wavelength levels. Based on this structure, tasks such as service prediction and pushing, self-organizing networking, and self-adapting large-scale fading modeling can be abstracted into problems like regression, classification, and clustering. We then introduce ML methods corresponding to the different levels from the channel perspective, which makes this interdisciplinary research promising.
Abstract: This paper discusses telemedicine and the employment of advanced mobile technologies in smart healthcare delivery. It covers the technological advances in connected smart healthcare, including the roles of artificial intelligence, machine learning, 5G and IoT platforms, and other enabling technologies. It also presents the challenges and potential risks that could arise from delivering connected smart healthcare services. Healthcare delivery is witnessing revolutions engineered by the developments in mobile connectivity and the plethora of platforms, applications, sensors, devices, and equipment that go along with it. Human society is evolving fast in response to these technological developments, which are also pushing the connectivity-providing sector to create and adopt new waves of network technologies. Consequently, new communications technologies have been introduced into the healthcare system, and many novel applications have been developed to make it easier to share data in various forms and volumes within health-related services. These applications have also made it possible for telemedicine to be adopted effectively. This paper provides an overview of some of the recent developments within the space of mobile connectivity and telemedicine.
Abstract: How to explore and exploit the full potential of artificial intelligence (AI) technologies in future wireless communications, such as beyond 5G (B5G) and 6G, is an extremely hot interdisciplinary research topic around the world. On the one hand, AI empowers intelligent resource management for wireless communications through powerful learning and automatic adaptation capabilities. On the other hand, embracing AI in wireless communication resource management calls for new network architectures and system models, as well as standardized interfaces, protocols, and data formats, to facilitate the large-scale deployment of AI in future B5G/6G networks. This paper reviews state-of-the-art AI-empowered resource management from the framework perspective down to the methodology perspective, considering not only radio resources (e.g., spectrum) but also other types of resources such as computing and caching. We also discuss the challenges and opportunities for deploying AI-based resource management widely in future wireless communication networks.
Abstract: The mobile Internet and the Internet of Things are considered the main driving forces of 5G, as they require an ultra-dense deployment of small base stations to meet increasing traffic demands. 5G New Radio (NR) access is designed to enable denser network deployments, while raising a significant concern about network energy consumption. Energy consumption is a major part of network operational expense (OPEX), and base stations are the main energy-consuming equipment in the radio access network (RAN). To achieve RAN energy efficiency (EE), switching off cells is a strategy for reducing network energy consumption during off-peak conditions. This paper introduces the NR cell switching on/off schemes in 3GPP for achieving energy efficiency in the 5G RAN, including the intra-system energy saving (ES) scheme and the inter-system ES scheme. Additionally, NR architectural features including the central unit/distributed unit (CU/DU) split and dual connectivity (DC) are also considered in NR energy saving. How to apply artificial intelligence (AI) to 5G networks is a new topic in 3GPP, and we also propose a machine learning (ML)-based scheme that saves energy by switching off cells selected via load prediction. According to experimental results in a real wireless environment, the ML-based ES scheme reduces power consumption more than the conventional ES scheme without load prediction.
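The prediction-driven switch-off decision described in the abstract above can be sketched abstractly: given a per-cell load forecast, a cell is a switch-off candidate only if its predicted load is low and its neighbors have spare capacity to absorb that load. The thresholds, cell names, and capacity model below are illustrative assumptions, not 3GPP-specified values.

```python
def cells_to_switch_off(predicted_load, neighbor_spare, off_threshold=0.2):
    """Select cells for energy saving: a cell is switched off only if its
    predicted load is low AND neighboring cells can absorb that load."""
    off = []
    for cell, load in predicted_load.items():
        spare = neighbor_spare.get(cell, 0.0)
        if load < off_threshold and spare >= load:
            off.append(cell)
    return off

# Normalized load forecasts for the next off-peak interval
predicted = {"cellA": 0.05, "cellB": 0.60, "cellC": 0.15}
# Spare capacity available in each cell's neighborhood
spare = {"cellA": 0.30, "cellB": 0.10, "cellC": 0.10}
print(cells_to_switch_off(predicted, spare))  # ['cellA']
```

Note that cellC stays on despite its low predicted load, because its neighbors lack the spare capacity to absorb it; this absorption check is what distinguishes prediction-aware ES from a naive load threshold.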
Funding: This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2020-2015-0-00403), supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation). This work was also supported by the Soonchunhyang University Research Fund.
Abstract: Recently, the fifth generation (5G) of mobile networks has been deployed and a wide range of mobile services has been provided. The 5G mobile network supports improved mobile broadband, ultra-low latency, and densely deployed massive devices. It allows multiple radio access technologies and interworks them to deliver services. 5G mobile systems employ traffic steering techniques to use multiple radio access technologies efficiently. However, conventional traffic steering techniques do not consider dynamic network conditions efficiently. In this paper, we propose a network-aided traffic steering technique for the 5G mobile network architecture. 5G mobile systems monitor network conditions and learn from network data. Through a machine learning algorithm such as a feed-forward neural network, the system recognizes dynamic network conditions and then performs traffic steering. The proposed scheme controls traffic for multiple radio access technologies according to the ratio of measured throughput, so it can be expected to improve traffic steering efficiency. The performance of the proposed traffic steering scheme is evaluated using extensive computer simulations.
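The throughput-ratio steering rule mentioned in the abstract above reduces to splitting offered traffic across radio access technologies in proportion to their measured throughput. A minimal sketch of that rule (the RAT names and numbers are illustrative, not from the paper):

```python
def steer_traffic(total_mbps, measured_throughput):
    """Split offered traffic across RATs in proportion to the throughput
    measured on each of them."""
    total_tp = sum(measured_throughput.values())
    return {rat: total_mbps * tp / total_tp
            for rat, tp in measured_throughput.items()}

# 100 Mbps of offered traffic; 5G-NR currently measures 3x the WLAN throughput
shares = steer_traffic(100.0, {"5G-NR": 300.0, "WLAN": 100.0})
print(shares)  # {'5G-NR': 75.0, 'WLAN': 25.0}
```

In the proposed scheme the measured-throughput inputs would come from the network-side monitoring and the neural network's view of current conditions, so the split adapts as conditions change.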
Funding: Sponsored by the Shanghai Sailing Program under Grant No. 18YF1423300.
Abstract: The emerging fifth generation (5G) network has the potential to satisfy the rapidly growing traffic demand and promote the transformation of smartphone-centric networks into an Internet of Things (IoT) ecosystem. Due to the introduction of new communication technologies and the increased density of 5G cells, the complexity of operation and operational expenditure (OPEX) will become very challenging in 5G. Self-organizing networks (SON) have been researched extensively since 2G to cope with a similar challenge, but via predefined policies rather than intelligent analysis. The demand for a better quality of experience and the complexity of 5G networks call for an approach different from SON. In several recent studies, the combination of machine learning (ML) technology with SON has been investigated. In this paper, we focus on the intelligent operation of wireless networks through ML algorithms. A comprehensive and flexible framework is proposed to achieve an intelligent operation system. Two use cases are also studied in which ML algorithms automate the anomaly detection and fault diagnosis of key performance indicators (KPIs) in wireless networks. The effectiveness of the proposed ML algorithms is demonstrated by experiments on real data, encouraging further research into intelligent wireless network operation.
Funding: Supported by Motorola Mobility, the National Council for Scientific and Technological Development (No. 433142/2018-9), a Research Productivity Fellowship (No. 312831/2020-0), and the Pernambuco Research Foundation (FACEPE).
Abstract: 5G networks apply adaptive modulation and coding according to the channel condition reported by the user in order to maintain mobile communication quality. However, the delay incurred by the feedback may render the channel quality indicator (CQI) obsolete. This paper addresses this issue by proposing two approaches, one based on machine learning and another on evolutionary computing, which consider the user context and signal-to-interference-plus-noise ratio (SINR), in addition to the delay length, to estimate the updated SINR to be mapped into a CQI value. Our proposals are designed to run on the user equipment (UE) side, neither requiring any change in the signalling between the base station (gNB) and the UE nor overloading the gNB. They are evaluated in terms of mean squared error using 5G network simulation data, and the results show their high accuracy and feasibility for deployment in 5G/6G systems.
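The estimation task in the abstract above, mapping a stale SINR report plus the feedback delay to an updated SINR, is a regression problem. A minimal least-squares sketch on synthetic data illustrates the setup; the linear 0.8 dB/ms decay model and all numbers are purely illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
stale_sinr = rng.uniform(0.0, 30.0, n)   # dB, as reported by the UE
delay = rng.uniform(0.0, 10.0, n)        # ms of feedback delay
# Synthetic ground truth: SINR degrades ~0.8 dB per ms of delay, plus noise
true_sinr = stale_sinr - 0.8 * delay + rng.normal(0.0, 0.5, n)

# Ordinary least squares fit: updated ~ b0 + b1*stale + b2*delay
A = np.column_stack([np.ones(n), stale_sinr, delay])
coef, *_ = np.linalg.lstsq(A, true_sinr, rcond=None)
print(np.round(coef, 2))  # recovers roughly [0, 1, -0.8]
```

The paper's ML and evolutionary-computing estimators play the role of this regression with richer, nonlinear models and additional user-context inputs; the estimated SINR is then mapped to a CQI value as usual.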
Funding: Supported in part by the National Natural Science Foundation of China (61322110, 6141101115) and the Doctoral Fund of the Ministry of Education (201300051100013).
Abstract: Recently, the internet has stimulated explosive progress in knowledge discovery from big-volume data resources, mining valuable hidden rules through computation. Simultaneously, wireless channel measurement data exhibits big-data characteristics, considering the massive antennas, huge bandwidth, and versatile application scenarios. This article first presents a comprehensive survey of channel measurement and modeling research for mobile communication, especially for the 5th generation (5G) and beyond. In view of progress in big data research, a cluster-nuclei based model is then proposed, which takes advantage of both stochastic and deterministic models. The novel model has low complexity, owing to the limited number of cluster nuclei, while each cluster nucleus maps physically to a real propagation object. By combining the principles of channel property variation with the antenna size, frequency, mobility, and scenario information mined from the channel data, the proposed model can be extended to versatile applications in support of future mobile research.
Abstract: Network densification is envisioned as one of the key enabling technologies of next-generation and beyond wireless networks, satisfying the demand for high coverage and capacity while delivering ultra-reliable low-latency communication services, especially to users on the move. One of the fundamental tasks in wireless networks is user association. In ultra-dense vehicular networks, due to the dense deployment and small coverage of the eNodeBs, more than one eNodeB may simultaneously satisfy the conventional maximum-radio-signal-strength user association criterion. In addition, the spatio-temporal vehicle distribution in dynamic environments contributes significantly to a rapidly changing radio environment, which substantially impacts user association and, therefore, network performance and user experience. This paper addresses the problem of user association in dynamic environments by proposing an intelligent user association approach, variable-reward quality-aware Q-learning (VR-QAQL), which is able to strike a balance between the number of handovers per transmission and system performance while guaranteeing network quality of service. The VR-QAQL technique integrates control-theoretic concepts and the reinforcement learning approach in an LTE uplink, using the framework of an urban vehicular environment. The algorithm is assessed using large-scale simulation of a highway scenario at different vehicle speeds in an urban setting. The results demonstrate that the proposed VR-QAQL algorithm outperforms all the other investigated approaches across all mobility levels.
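At the core of a scheme like the one in the abstract above is the tabular Q-learning update, with the reward shaped to trade link quality against handover cost. The sketch below shows that generic update with a variable reward; the reward weighting, state/action names, and constants are illustrative assumptions, not the paper's exact VR-QAQL formulation.

```python
def q_update(Q, state, action, reward, next_state, alpha=0.1, gamma=0.9):
    """Standard tabular Q-learning: Q(s,a) += alpha * (TD target - Q(s,a))."""
    best_next = max(Q[next_state].values())
    td_target = reward + gamma * best_next
    Q[state][action] += alpha * (td_target - Q[state][action])

def variable_reward(sinr_db, handover, quality_ok, w_ho=0.5):
    """Reward link quality, penalize a handover, veto QoS violations."""
    if not quality_ok:
        return -1.0
    return sinr_db / 30.0 - (w_ho if handover else 0.0)

# Two road segments (states), two candidate eNodeBs (actions)
Q = {"seg1": {"eNB1": 0.0, "eNB2": 0.0},
     "seg2": {"eNB1": 0.0, "eNB2": 0.0}}
r = variable_reward(sinr_db=15.0, handover=False, quality_ok=True)
q_update(Q, "seg1", "eNB2", r, "seg2")
print(round(Q["seg1"]["eNB2"], 4))  # 0.05
```

The handover penalty `w_ho` is what lets such an agent balance handovers per transmission against throughput: raising it makes the learned policy stickier to the serving eNodeB at a given mobility level.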
Abstract: The existing literature on device-to-device (D2D) architecture suffers from a dearth of analysis under imperfect channel conditions, and rigorous analyses of policy improvement and of network performance evaluation are needed. Accordingly, a two-stage transmit power control approach (named QSPCA) is proposed: first, a reinforcement Q-learning based power control technique, and second, a supervised learning based support vector machine (SVM) model. This model replaces the unified communication model of the conventional D2D setup with a distributed one, thereby requiring fewer resources, such as D2D throughput, transmit power, and signal-to-interference-plus-noise ratio, as compared to existing algorithms. Results confirm that the QSPCA technique improves throughput by at least 15.31% over the SVM technique and 19.5% over the Q-learning technique. The customizability of the QSPCA technique opens up multiple avenues and industrial communication technologies in 5G networks, such as factory automation.