As 5th Generation (5G) and Beyond 5G (B5G) networks become increasingly prevalent, ensuring not only network security but also the security and reliability of the applications, the so-called network applications, becomes of paramount importance. This paper introduces a novel integrated model architecture, combining a network application validation framework with an AI-driven reactive system to enhance security in real time. The proposed model leverages machine learning (ML) and artificial intelligence (AI) to dynamically monitor and respond to security threats, effectively mitigating potential risks before they impact the network infrastructure. This dual approach not only validates the functionality and performance of network applications before their real deployment but also enhances the network's ability to adapt and respond to threats as they arise. The implementation of this model, in the form of an architecture deployed in two distinct sites, demonstrates its practical viability and effectiveness. By integrating application validation with proactive threat detection and response, the proposed model addresses critical security challenges unique to 5G infrastructures. This paper details the model, the architecture's design, the implementation, and the evaluation of this solution, illustrating its potential to significantly improve network security management in 5G environments. Our findings highlight the architecture's capability to ensure both the operational integrity of network applications and the security of the underlying infrastructure, presenting a significant advancement in network security.
In Beyond the Fifth Generation (B5G) heterogeneous edge networks, numerous users are multiplexed on a channel or served on the same frequency resource block, in which case the transmitter applies coding and the receiver uses interference cancellation. Unfortunately, uncoordinated radio resource allocation can reduce system throughput and lead to user inequity. For this reason, in this paper, channel allocation and power allocation problems are formulated to maximize the system sum rate and the minimum user achievable rate. Since the constructed model is non-convex and the decision variables are high-dimensional, a distributed Deep Reinforcement Learning (DRL) framework called distributed Proximal Policy Optimization (PPO) is proposed to allocate resources. Specifically, several simulated agents are trained in a heterogeneous environment to find robust behaviors that perform well in channel assignment and power allocation. However, agents that slow down in the collection stage hinder the learning of other agents. Therefore, a preemption strategy is further proposed to optimize the distributed PPO, forming DP-PPO and successfully mitigating the straggler problem. The experimental results show that the proposed DP-PPO mechanism outperforms other DRL methods.
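As an illustration of the preemption idea described above (not the authors' DP-PPO implementation), the following Python sketch simulates several rollout-collection workers and discards those that exceed a fixed deadline, so a policy update is never blocked by the slowest worker; the worker count, deadline, and rollout contents are all assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def collect_rollout(n_steps=128):
    """Simulate one worker's experience collection.

    Returns a (fake) rollout and the simulated wall-clock time it took;
    a heavy-tailed delay models occasional straggler workers.
    """
    delay = rng.exponential(scale=1.0) + (5.0 if rng.random() < 0.1 else 0.0)
    rollout = {"obs": rng.normal(size=(n_steps, 8)),          # channel/power observations (toy)
               "actions": rng.integers(0, 4, size=n_steps),   # chosen channel index (toy)
               "rewards": rng.normal(size=n_steps)}           # e.g., sum-rate based reward
    return rollout, delay

def preemptive_collection(n_workers=8, deadline=2.5):
    """Keep only rollouts from workers that finish before the deadline.

    This mimics a preemption strategy: the learner does not wait for
    stragglers, so the policy update is not blocked by the slowest worker.
    """
    kept, dropped = [], 0
    for _ in range(n_workers):
        rollout, delay = collect_rollout()
        if delay <= deadline:
            kept.append(rollout)
        else:
            dropped += 1          # straggler preempted; its data is discarded this round
    return kept, dropped

kept, dropped = preemptive_collection()
print(f"usable rollouts: {len(kept)}, preempted stragglers: {dropped}")
# A real DP-PPO learner would now run the clipped PPO update on `kept` only.
```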
Recent developments in the aerospace industry have led to a dramatic reduction in the manufacturing and launch costs of low Earth orbit satellites. This trend enables the paradigm shift toward satellite-terrestrial integrated networks with global coverage. In particular, the integration of 5G communication systems and satellites has the potential to restructure next-generation mobile networks. By leveraging network function virtualization and network slicing, satellite 5G core networks will facilitate the coordination and management of network functions in satellite-terrestrial integrated networks. We are the first to deploy a 5G core network on a real-world satellite to investigate its feasibility. We conducted experiments to validate the satellite 5G core network functions; the validated procedures include registration and session setup. The results show that the satellite 5G core network can function normally and generate correct signaling.
The current resource allocation in 5G vehicular networks for mobile cloud communication faces several challenges, such as low user utilization, unbalanced resource allocation, and extended adaptive allocation time. We propose an adaptive allocation algorithm for mobile cloud communication resources in 5G vehicular networks to address these issues. This study analyzes the components of the 5G vehicular network architecture to determine the performance of different components. It is ascertained that the communication modes in 5G vehicular networks for mobile cloud communication include in-band and out-of-band modes. Furthermore, this study analyzes the single-hop and multi-hop modes in mobile cloud communication and calculates the resource transmission rate and bandwidth in different communication modes. The study also determines the scenario of one-way and two-way vehicle lane cloud communication network connectivity, calculates the probability of vehicle network connectivity under different mobile cloud communication radii, and determines the amount of cloud communication resources required by vehicles in different lane scenarios. Based on the communication status of users in 5G vehicular networks, this study calculates the bandwidth and transmission rate of the allocated channels using Shannon's formula. It determines the adaptive allocation of cloud communication resources, introduces an objective function to obtain the optimal solution after allocation, and completes the adaptive allocation process. The experimental results demonstrate that, with the application of the proposed method, the maximum utilization of user communication resources reaches approximately 99%. The balance coefficient curve approaches 1, and the allocation time remains under 2 s. This indicates that the proposed method has higher adaptive allocation efficiency.
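For the Shannon-formula step mentioned above, a minimal worked example is shown below; the 10 MHz bandwidth and 15 dB SNR are illustrative values, not figures from the study.

```python
import math

def shannon_rate(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon channel capacity C = B * log2(1 + SNR), in bit/s."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Hypothetical vehicular link: a 10 MHz channel at an SNR of 15 dB.
bandwidth = 10e6                      # Hz
snr_db = 15.0
snr = 10 ** (snr_db / 10.0)           # convert dB to linear scale
rate = shannon_rate(bandwidth, snr)
print(f"achievable rate = {rate / 1e6:.1f} Mbit/s")   # roughly 50 Mbit/s here
```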
With the arrival of 4G and 5G, telecommunications networks have expanded considerably. This expansion has enabled the integration of many services with adequate throughput, allowing operators to respond to the growing demand of users. This rapid evolution has forced operators to adapt their methods to the emerging technologies. The complexity becomes greater when these networks include several different access technologies, as in heterogeneous 4G networks. The new dimensioning challenges concern the applications, the considerable increase in demand for services, compatibility with existing networks, the management of users' inter-cell mobility, and the provision of a better quality of service. Thus, the solution proposed to meet these new requirements is the dimensioning of the EPC (Evolved Packet Core) core network to support the 5G access network. In the case of Orange Guinea, this involves setting up an architecture for interconnecting the core networks of Sonfonia and Camayenne. The objectives of our work are twofold: (1) to propose solutions and recommendations for EPC core network dimensioning and the deployment to be adopted; (2) to provide an architectural interconnection between the EPC core network and an existing core network. In our work, the communication traffic model that we use calculates the traffic generated by each technology linked to the core network.
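As a hedged illustration of the kind of busy-hour traffic calculation used when dimensioning a core network, the sketch below applies the standard Erlang B formula; the subscriber count, per-user traffic, and channel count are assumed values, and this is not the authors' traffic model.

```python
def erlang_b(traffic_erlangs: float, channels: int) -> float:
    """Erlang B blocking probability, computed with the stable recursion
    B(0) = 1, B(n) = A*B(n-1) / (n + A*B(n-1))."""
    b = 1.0
    for n in range(1, channels + 1):
        b = traffic_erlangs * b / (n + traffic_erlangs * b)
    return b

# Hypothetical dimensioning check: 10,000 subscribers offering 25 mErlang each
# at the busy hour, served by 280 traffic channels.
offered = 10_000 * 0.025            # 250 Erlangs of offered traffic
blocking = erlang_b(offered, 280)
print(f"blocking probability = {blocking:.3%}")
```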
Recently, the combination of video services and 5G networks has been gaining attention in the wireless communication realm. With the brisk advancement in 5G network usage and the massive popularity of three-dimensional video streaming, the quality of experience (QoE) of video in 5G systems has been receiving considerable attention from both customers and service providers. Therefore, effectively categorizing QoE-aware video streaming is imperative for achieving greater client satisfaction. This work makes the following contributions: First, a simulation platform based on NS-3 is introduced to analyze and improve the performance of video services. The simulation is formulated to offer real-time measurements, avoiding the high expenses associated with real-world equipment. Second, a framework for QoE-aware video streaming categorization in 5G networks is introduced based on machine learning (ML) by incorporating the hyperparameter tuning (HPT) principle. It implements an enhanced hyperparameter tuning (EHPT) ensemble and a decision tree (DT) classifier for video streaming categorization. The performance of the ML approach is assessed using precision, accuracy, recall, and computation time metrics to demonstrate the superiority of these classifiers for video streaming categorization. This paper demonstrates that our ML classifiers achieve a QoE prediction accuracy of 92.59% for the EHPT ensemble and 87.037% for the DT classifier.
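A minimal sketch of hyperparameter-tuned QoE classification in the spirit described above is given below; it uses scikit-learn's grid search over a decision tree and a gradient-boosting ensemble on synthetic features, and is not the paper's EHPT implementation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for QoE features (e.g., throughput, delay, jitter, stall count).
X, y = make_classification(n_samples=2000, n_features=8, n_informative=5,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Hyperparameter tuning for both a decision tree and an ensemble classifier.
searches = {
    "DT": GridSearchCV(DecisionTreeClassifier(random_state=0),
                       {"max_depth": [4, 8, 12], "min_samples_leaf": [1, 5, 10]}, cv=5),
    "ensemble": GridSearchCV(GradientBoostingClassifier(random_state=0),
                             {"n_estimators": [100, 200], "learning_rate": [0.05, 0.1]}, cv=5),
}
for name, search in searches.items():
    search.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, search.predict(X_te))
    print(f"{name}: best params {search.best_params_}, test accuracy {acc:.3f}")
```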
The ever-increasing needs of Internet of Things networks (IoTn) present considerable issues in computing complexity, security, trust, and authentication, among others. This becomes increasingly challenging as technology advances and its use expands. As a consequence, boosting the capacity of these networks has garnered widespread attention, and 5G, the next phase of cellular networks, is expected to be a game-changer, bringing with it faster data transmission rates, more capacity, improved service quality, and reduced latency. However, 5G networks continue to confront difficulties in establishing pervasive and dependable connections amongst high-speed IoT devices. Thus, to address the shortcomings in current recommendations, we present a unified architecture based on software-defined networks (SDNs) that provides complete secrecy for 5G-enabled devices. Through SDN, the architecture streamlines network administration while optimizing network communications. A mutual authentication protocol using elliptic curve cryptography is introduced for mutual authentication between certificate authorities and cluster heads in IoT network deployments. In addition, a dimensionality-reduction intrusion detection mechanism is introduced to decrease computational cost and identify possible network breaches. To demonstrate the method's potential, the security of the first module is reviewed, and the second module is evaluated and compared with modern models.
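To make the elliptic-curve step concrete, the sketch below shows a generic ECDH key agreement with HKDF key derivation using the `cryptography` package; it only illustrates the shared-key step that such a mutual authentication protocol would build on (certificate and signature verification are omitted), and the roles and labels are assumptions, not the paper's protocol.

```python
# A minimal elliptic-curve key-agreement sketch using the `cryptography` package.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def make_keypair():
    """Generate an ECC key pair on the NIST P-256 curve."""
    priv = ec.generate_private_key(ec.SECP256R1())
    return priv, priv.public_key()

def derive_session_key(own_priv, peer_pub) -> bytes:
    """ECDH shared secret, stretched into a 256-bit session key with HKDF."""
    shared = own_priv.exchange(ec.ECDH(), peer_pub)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"iot-cluster-session").derive(shared)

# Certificate authority / cluster head roles are placeholders here.
ca_priv, ca_pub = make_keypair()
ch_priv, ch_pub = make_keypair()

# Both sides derive the same session key from each other's public keys.
k_ca = derive_session_key(ca_priv, ch_pub)
k_ch = derive_session_key(ch_priv, ca_pub)
assert k_ca == k_ch
print("shared session key established:", k_ca.hex()[:16], "...")
```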
With the capabilities of self-learning, associative memory, high-speed search for optimal solutions, strong nonlinear fitting, and mapping of arbitrarily complex nonlinear relations, neural networks have made incredible advances and achieved broad application over the last half-century. As one of the most prominent methods of artificial intelligence, neural networks are evolving toward high computational speed and low power consumption. Due to the inherent limitations of electronic devices, it may be difficult for electronically implemented neural networks to further improve these two aspects of performance. Optical neural networks can combine optoelectronic techniques with neural network models to provide ways to break this bottleneck. This paper outlines optical neural networks of feedforward, recurrent, and spiking models to give a clearer picture of the history, frontiers, and future of optical neural networks. The framework covers neural networks in optical communication with serial and parallel setups. The graphene-based laser structure for fiber-optic communication is discussed. A comparison of different modulation schemes for photonic neural networks is made in the context of genetic algorithms and particle swarm optimization. In addition, the performance comparison of conventional photonic neural networks in the time domain, with and without stretching noise, is also elaborated. The challenges and future trends of optical neural networks at growing scales, and applications of in situ training and nonlinear computing, are then discussed.
Nowadays, the widespread application of 5G has promoted rapid development in different areas, particularly in the Internet of Things (IoT), where 5G provides the advantages of higher data transfer rate, lower latency, and widespread connections. Wireless sensor networks (WSNs), which comprise various sensors, are crucial components of IoT. The main functions of WSN include providing users with real-time monitoring information, deploying regional information collection, and synchronizing with the Internet. Security in WSNs is becoming increasingly essential because of the across-the-board nature of wireless technology in many fields. Recently, Yu et al. proposed a user authentication protocol for WSN. However, their design is vulnerable to sensor capture and temporary information disclosure attacks. Thus, in this study, an improved protocol called PSAP-WSN is proposed. The security of PSAP-WSN is demonstrated by employing the ROR model, BAN logic, and the ProVerif tool for the analysis. The experimental evaluation shows that our design is more efficient and suitable for WSN environments.
5G use cases, such as enhanced mobile broadband (eMBB), massive machine-type communications (mMTC), and ultra-reliable low-latency communication (URLLC), need a network architecture capable of sustaining stringent latency and bandwidth requirements; thus, it should be extremely flexible and dynamic. Slicing enables service providers to develop various network slice architectures. As users travel from one coverage region to another, the call must be routed to a slice that meets the same or different expectations. This research aims to develop and evaluate an algorithm for making handover decisions in 5G sliced networks. Rules of thumb that indicate the accuracy of training-data classification schemes in machine learning should be considered for the validation and selection of appropriate machine learning strategies. Therefore, this study discusses the design and implementation of a network model with a self-optimizing Fuzzy Q-learning decision-making algorithm for slice handover. The algorithm's performance is assessed by means of connection-level metrics considering the Quality of Service (QoS), specifically the probability that a new call is blocked and the probability that a handoff call is dropped. Within the network model, the call admission control (AC) method is modeled by leveraging a supervised learning algorithm as prior knowledge of additional capacity. Moreover, to mitigate high complexity, fuzzy logic together with Fuzzy Q-learning is used to discretize the state and the corresponding action spaces. The results generated from our proposal surpass those of traditional methods that do not use supervised learning and Fuzzy Q-learning.
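The sketch below shows plain tabular Q-learning on a hand-discretized slice-handover state space as a simplified stand-in for the Fuzzy Q-learning described above; the state encoding, reward shape, and toy environment are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discretized state: (load level of current slice, load level of target slice), 3 bins each.
# Action: 0 = stay on the current slice, 1 = hand over to the target slice.
n_states, n_actions = 3 * 3, 2
Q = np.zeros((n_states, n_actions))
alpha, gamma, epsilon = 0.1, 0.9, 0.1   # learning rate, discount, exploration rate

def simulate_step(state, action):
    """Toy environment: reward penalizes blocking/dropping on a loaded slice."""
    cur_load, tgt_load = divmod(state, 3)
    loaded = tgt_load if action == 1 else cur_load
    reward = 1.0 - 0.5 * loaded - (0.5 if rng.random() < 0.1 * loaded else 0.0)
    return rng.integers(n_states), reward          # next state drawn at random (toy)

state = rng.integers(n_states)
for _ in range(5000):
    action = rng.integers(n_actions) if rng.random() < epsilon else int(np.argmax(Q[state]))
    next_state, reward = simulate_step(state, action)
    # Standard Q-learning update; a fuzzy variant would blend updates across
    # neighbouring fuzzy states instead of a single discrete cell.
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

print("learned handover action per state:", np.argmax(Q, axis=1))
```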
Orbital Angular Momentum (OAM) is an intrinsic property of electromagnetic waves. Extensive research in recent decades has aimed at exploiting the OAM wave property in different areas of radio and optics. One promising area of particular interest is enhancing the efficiency of the available communications spectrum. However, adopting OAM-based solutions does not come for free, as these solutions suffer from wave divergence, especially when the OAM order is high. This limits the practical communication distance, especially in the radio regime. In this paper, we propose a cooperative OAM relaying system consisting of a source, relay, and destination. Relays help the source transmit packets to the destination by providing an alternative connection between source and destination. This cooperative solution aims, on the one hand, to increase the communication range through best-path selection; on the other hand, through the parallel transmission orders allowed by OAM-carrying waves, the system can raise its total transmission throughput. Simulation results show that combining a cooperative relay with OAM improves the system throughput compared to using each element separately. In addition, the proposed cooperative OAM relaying outperforms the cooperative relaying non-orthogonal multiple access scheme, which is a key spectrally efficient technique used in 5G technology.
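As an illustration of best-path selection, the sketch below picks the relay whose weaker hop has the highest SNR (max-min selection) and reports the resulting spectral efficiency; the per-hop SNR model and relay count are assumed, and this is not the paper's exact selection rule.

```python
import numpy as np

rng = np.random.default_rng(2)

def select_best_relay(snr_sr: np.ndarray, snr_rd: np.ndarray) -> int:
    """Max-min relay selection: a two-hop decode-and-forward link is limited by
    its weaker hop, so pick the relay whose weaker hop is strongest."""
    bottleneck = np.minimum(snr_sr, snr_rd)
    return int(np.argmax(bottleneck))

# Hypothetical per-relay link SNRs (linear scale) for 5 candidate relays.
snr_source_relay = rng.exponential(scale=10.0, size=5)
snr_relay_dest = rng.exponential(scale=10.0, size=5)

best = select_best_relay(snr_source_relay, snr_relay_dest)
end_to_end_snr = min(snr_source_relay[best], snr_relay_dest[best])
rate = np.log2(1.0 + end_to_end_snr)        # spectral efficiency of the chosen path
print(f"selected relay {best}, end-to-end spectral efficiency = {rate:.2f} bit/s/Hz")
```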
The Internet of Things (IoT) has become an essential component of the 5th Generation (5G) network and beyond, accelerating the transition to a digital society. The increasing signaling traffic generated by billions of IoT devices has placed significant strain on the 5G Core network (5GC) control plane. To address this issue, the 3rd Generation Partnership Project (3GPP) first proposed a Service-Based Architecture (SBA), intending to create a flexible, scalable, and agile cloud-native 5GC. However, considering the coupling of protocol states and functions, there are still many challenges in fully utilizing the benefits of cloud computing and orchestrating the 5GC in a cloud-native manner. In this paper, we propose a Message-Level StateLess Design (ML-SLD) to provide a cloud-native 5GC from an architectural standpoint. Firstly, we propose an innovative mechanism for servitization of the N2 interface to maintain the connection between the Radio Access Network (RAN) and the 5GC, avoiding interruptions and dropouts of large-scale user data. Furthermore, we propose an On-demand Message Forwarding (OMF) algorithm to reduce the impact of cloud fluctuations on the performance of the cloud-native 5GC. Finally, we create a prototype based on the OpenAirInterface (OAI) 5G core network projects, with all Network Functions (NFs) packaged in Docker containers and deployed in a Kubernetes-based cloud environment. Several experiments have been conducted with the UERANSIM and Chaosblade simulation tools. The findings demonstrate the viability and efficiency of our proposed methods.
This paper investigates the Quality of Experience (QoE)-oriented channel access anti-jamming problem in 5th Generation Mobile Communication (5G) ultra-dense networks. Firstly, considering that the 5G base station adopts beamforming technology, an anti-jamming model under Space Division Multiple Access (SDMA) conditions is proposed. Secondly, the confrontational relationship between users and the jammer is formulated as a Stackelberg game. In addition, to achieve global optimization, we design a local cooperation mechanism for users and formulate the cooperation and competition among users as a local altruistic game. By proving that the local altruistic game is an Exact Potential Game (EPG), we further prove the existence of a pure strategy Nash Equilibrium (NE) among users and a Stackelberg Equilibrium (SE) between the users and the jammer. Thirdly, to obtain the equilibrium solutions of the proposed games, we propose an anti-jamming channel selection algorithm and improve its convergence speed through heterogeneous learning parameters. The simulation results validate the convergence and effectiveness of the proposed algorithm. Compared with the throughput optimization scheme, our proposed scheme obtains a greater network satisfaction rate. Finally, we also analyze user fairness changes during the algorithm convergence process and draw some interesting conclusions.
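A toy version of potential-game channel selection is sketched below: each user best-responds to the others' channel choices under a congestion-plus-jamming utility, which converges to a pure Nash equilibrium because such congestion games are exact potential games; the utility weights and jammed channel are assumptions, and this is not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(3)
n_users, n_channels = 6, 4
jammed = 0                       # assume the jammer sits on channel 0 this round
choice = rng.integers(n_channels, size=n_users)

def utility(user, channel, choice):
    """Toy utility: penalize sharing a channel with other users (interference)
    and occupying the jammed channel. Congestion games like this are exact
    potential games, so asynchronous best response converges to a pure NE."""
    others = np.sum(choice == channel) - (choice[user] == channel)
    return -1.0 * others - (3.0 if channel == jammed else 0.0)

changed = True
while changed:                   # asynchronous best-response dynamics
    changed = False
    for u in range(n_users):
        utils = [utility(u, c, choice) for c in range(n_channels)]
        best = int(np.argmax(utils))
        if best != choice[u]:
            choice[u] = best
            changed = True

print("pure-NE channel assignment:", choice.tolist())
```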
Fault diagnosis of 5G networks faces the challenges of heavy reliance on human experience and insufficient fault samples and relevant monitoring data. Digital twin technology can realize the interaction between virtual space and physical space through the fusion of model and data, providing a new paradigm for fault diagnosis. In this paper, we first propose a network digital twin model and apply it to 5G network diagnosis. We then use an improved Average Wasserstein GAN with Gradient Penalty (AWGAN-GP) method to discover and predict failures in the twin network. Finally, we use the XGBoost algorithm to locate faults in the physical network in real time. Extensive simulation results show that the proposed approach can significantly increase fault prediction and diagnosis accuracy in the case of a small number of labeled failure samples in 5G networks.
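For the fault-localization step, the sketch below trains an XGBoost classifier on synthetic KPI-like features labelled with hypothetical fault locations; the feature set, class count, and hyperparameters are assumptions, not those used in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# Synthetic stand-in for network KPIs (e.g., per-cell throughput, RTT, packet loss)
# labelled with one of four hypothetical fault locations.
X, y = make_classification(n_samples=3000, n_features=12, n_informative=8,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Gradient-boosted trees for the fault-localization step.
clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1,
                    objective="multi:softprob", eval_metric="mlogloss")
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(f"fault-localization accuracy on held-out data: {accuracy_score(y_te, pred):.3f}")
```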
Massive multiple-input multiple-output (MIMO) systems that use the millimeter-wave (mm-wave) band have a higher frequency and more antennas, which leads to significant path loss, high power consumption, and severe interference. Due to these issues, the spectrum efficiency is significantly reduced, making spectral efficiency improvement an important research topic for 5G communication. Together with communication in the terahertz (THz) bands, mmWave communication is currently a component of the 5G standards and is seen as a solution to the commercial bandwidth shortage. The quantity of continuous, mostly untapped bandwidth in the 30–300 GHz band has presented a rare opportunity to boost the capacity of wireless networks. The wireless communications and consumer electronics industries have recently paid a lot of attention to wireless data transfer and media streaming in the mmWave frequency range. Simple massive MIMO beamforming technology cannot successfully prevent interference between multiple networks in current spectrum-sharing schemes, particularly the complex interference dispersed in indoor communication systems such as homes, workplaces, and stadiums. To effectively improve spectrum utilization and reduce co-channel interference, this paper proposes a novel algorithm. The main idea is to utilize the spectrum in software-defined mmWave massive MIMO networks through coordinated and unified management. Then, the optimal interference threshold is determined through the beam alignment method. Finally, a greedy optimization algorithm is used to allocate optimal spectral resources to the users. Simulation results show that the proposed algorithm improves spectral efficiency and reduces interference.
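A simplified version of threshold-constrained greedy allocation is sketched below: each resource block is given to the feasible user (interference below an assumed threshold) with the highest achievable rate; the gain and interference models are illustrative, and this is not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)
n_users, n_rbs = 5, 8                     # users and resource blocks (RBs)
gain = rng.exponential(scale=1.0, size=(n_users, n_rbs))           # channel gains
interference = rng.exponential(scale=0.2, size=(n_users, n_rbs))   # estimated co-channel interference
threshold = 0.4                           # interference threshold (assumed given, e.g. from beam alignment)

assignment = {}                           # RB index -> user index
for rb in range(n_rbs):
    # Greedy step: among users whose interference on this RB is below the
    # threshold, give the RB to the one with the highest achievable rate.
    feasible = [u for u in range(n_users) if interference[u, rb] <= threshold]
    if not feasible:
        continue                          # leave the RB unassigned this round
    rates = {u: np.log2(1.0 + gain[u, rb] / (interference[u, rb] + 1e-3)) for u in feasible}
    assignment[rb] = max(rates, key=rates.get)

print("greedy RB assignment (RB -> user):", assignment)
```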
Nowadays, high mobility scenarios have become increasingly common. The widespread adoption of High-speed Rail (HSR) in China exemplifies this trend, while more promising use cases, such as vehicle-to-everything, continue to emerge. However, the Internet access provided in high mobility environments still struggles to achieve seamless connectivity. The next generation of wireless cellular technology, 5G, further poses more requirements on the end-to-end evolution to fully utilize its ultra-high bandwidth, while existing network diagnostic tools focus on above-IP layers or below-IP layers only. We therefore propose HiMoDiag, which enables flexible online analysis of network performance in a cross-layer manner, i.e., from the top (application layer) to the bottom (physical layer). We believe HiMoDiag could greatly simplify the process of pinpointing the deficiencies of Internet access delivery on HSR, lead to more timely optimization, and ultimately help to improve network performance.
Fifth generation (5G) networks will support the rapid emergence of Internet of Things (IoT) devices operating in a heterogeneous network (HetNet) system. These 5G-enabled IoT devices will result in a surge in data traffic for Mobile Network Operators (MNOs) to handle. At the same time, MNOs are preparing for a paradigm shift to decouple the control and forwarding planes in a Software-Defined Networking (SDN) architecture. Artificial Intelligence-powered Self-Organising Networks (AI-SON) can fit into the SDN architecture by providing prediction and recommender systems to minimise costs in supporting the MNO's infrastructure. This paper presents a review of AI-SON frameworks in 5G and SDN. The review considers the dynamic deployment and functions of the AI-SON frameworks, especially for SDN support and applications. Each module in the frameworks is discussed to ascertain its relevance in the context of AI-SON and SDN integration. After examining each framework, the identified gaps are summarised as open issues for future work.
The rapid advancement of wireless communication is forming a hyper-connected 5G network in which billions of linked devices generate massive amounts of data. The traffic control and data forwarding functions are decoupled in software-defined networking (SDN), which allows the network to be programmable. Each switch in SDN keeps track of forwarding information in a flow table. The SDN switches must search the flow table for the flow rules that match the packets in order to handle incoming packets. Due to the vast quantity of data in data centres, the capacity of the flow table restricts the data plane's forwarding capabilities, so the SDN must handle traffic from across the whole network. The flow table depends on Ternary Content Addressable Memory (TCAM) for storage and quick search of rules; it is restricted in capacity owing to its elevated cost and energy consumption. Whenever the flow table is abused and overflowing, the usual rules cannot be executed quickly. In this case, we consider low-rate flow table overflow, which causes collision flow rules to be installed and consumes excessive flow table capacity by delivering packets that do not fit the flow table at a low rate. This study introduces machine learning techniques for detecting and categorizing low-rate collision flows in SDN flow tables, using a Feed Forward Neural Network (FFNN), K-Means, and a Decision Tree (DT). We generate two network topologies, Fat Tree and Simple Tree, with the Mininet simulator, coupled to the OpenDaylight (ODL) controller. The efficiency and efficacy of the suggested algorithms are assessed using several assessment indicators such as query success rate, propagation delay, overall dropped packets, energy consumption, bandwidth usage, latency, and throughput. The findings showed that the suggested technique to tackle the flow table congestion problem minimizes the number of flows while retaining the statistical consistency of the 5G network. By applying the proposed flow method and checking whether a packet may move from point A to point B without breaking certain rules, the evaluation tool examines every flow against a set of criteria. The FFNN with DT and K-Means algorithms obtain accuracies of 96.29% and 97.51%, respectively, in the identification of collision flows, according to the experimental outcomes when compared with existing methods from the literature.
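The sketch below mirrors the classifier line-up described above with scikit-learn's MLP (as the FFNN), a decision tree, and K-means on synthetic per-flow statistics; the features, labels, and hyperparameters are assumptions, and the reported numbers will not match the paper's results.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score

# Synthetic stand-in for per-flow statistics (e.g., packet rate, rule match count,
# inter-arrival time), labelled as normal (0) or collision/overflow-inducing (1).
X, y = make_classification(n_samples=4000, n_features=10, n_informative=6,
                           weights=[0.8, 0.2], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Supervised detectors: a feed-forward neural network and a decision tree.
for name, clf in [("FFNN", MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)),
                  ("DT", DecisionTreeClassifier(max_depth=8, random_state=0))]:
    clf.fit(X_tr, y_tr)
    print(f"{name} accuracy: {accuracy_score(y_te, clf.predict(X_te)):.3f}")

# Unsupervised view: K-means groups flows into two clusters; the smaller cluster
# is a candidate set of anomalous (collision) flows to inspect further.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X_tr)
print("cluster sizes:", np.bincount(labels))
```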
The fifth generation (5G) of mobile communications is facing big challenges due to the proliferation of diversified terminals and unprecedented services such as the Internet of Things (IoT), high-definition video, and virtual/augmented reality (VR/AR). To accommodate massive connections and astonishing mobile traffic, an efficient 5G transport network is required. Optical transport networks have been demonstrated to play an important role in carrying 5G radio signals. This paper focuses on the future challenges, recent studies, and potential solutions for flexible 5G optical transport networks with large capacity, low latency, and high efficiency. In addition, we discuss the technology development trends of 5G transport networks in terms of the optical device, optical transport system, optical switching, and optical networking. Finally, we conclude the paper with a discussion of the improvement in network intelligence enabled by these technologies for deterministic content delivery over 5G optical transport networks.