Future 6G communications are envisioned to enable a large catalogue of pioneering applications. These will range from networked Cyber-Physical Systems to edge computing devices, establishing real-time feedback control loops critical for managing Industry 5.0 deployments, digital agriculture systems, and essential infrastructures. The provision of extensive machine-type communications through 6G will render many of these innovative systems autonomous and unsupervised. While full automation will enhance industrial efficiency significantly, it concurrently introduces new cyber risks and vulnerabilities. In particular, unattended systems are highly susceptible to trust issues: malicious nodes and false information can be easily introduced into control loops. Additionally, Denial-of-Service attacks can be executed by inundating the network with valueless noise. Current anomaly detection schemes require the entire transformation of the control software to integrate new steps and can only mitigate anomalies that conform to predefined mathematical models. Solutions based on exhaustive data collection to detect anomalies are precise but extremely slow. Standard models, with their limited understanding of mobile networks, can achieve precision rates no higher than 75%. Therefore, more general and transversal protection mechanisms are needed to detect malicious behaviors transparently. This paper introduces a probabilistic trust model and control algorithm designed to address this gap. The model determines the probability of any node being trustworthy. Communication channels are pruned for those nodes whose probability is below a given threshold. The trust control algorithm comprises three primary phases, which feed the model with three different probabilities that are weighted and combined. Initially, anomalous nodes are identified using Gaussian mixture models and clustering technologies. Next, traffic patterns are studied using digital Bessel functions and the functional scalar product. Finally, the information coherence and content are analyzed. The noise content and abnormal information sequences are detected using a Volterra filter and a bank of Finite Impulse Response filters. An experimental validation based on simulation tools and environments was carried out. Results show the proposed solution can successfully detect up to 92% of malicious data injection attacks.
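The first phase of the trust algorithm scores nodes with Gaussian mixture models. As a minimal illustration only (not the authors' implementation; the one-dimensional traffic statistic, the two-component choice, and any threshold are assumptions), a mixture fitted by plain expectation-maximization can flag observations with low likelihood under the learned model:

```python
import numpy as np

def fit_gmm_1d(x, k=2, iters=200, seed=0):
    """Fit a k-component 1-D Gaussian mixture with plain EM."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, size=k, replace=False).astype(float)
    var = np.full(k, x.var() + 1e-6)
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample
        d = x[:, None] - mu[None, :]
        dens = pi * np.exp(-0.5 * d**2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means and variances
        n = r.sum(axis=0)
        pi = n / x.size
        mu = (r * x[:, None]).sum(axis=0) / n
        d = x[:, None] - mu[None, :]
        var = (r * d**2).sum(axis=0) / n + 1e-9
    return pi, mu, var

def log_likelihood(x, pi, mu, var):
    """Per-sample log density under the fitted mixture."""
    d = x[:, None] - mu[None, :]
    dens = pi * np.exp(-0.5 * d**2 / var) / np.sqrt(2 * np.pi * var)
    return np.log(dens.sum(axis=1))
```

A node would then be marked untrustworthy when its log-likelihood falls below a chosen percentile of the scores seen during training.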
In order to support the future digital society, the sixth generation (6G) network faces the challenge of working efficiently and flexibly in a wider range of scenarios. The traditional way of system design is to sequentially obtain the electromagnetic wave propagation model of typical scenarios first and then do the network design by offline simulation, which leads to a 6G network lacking adaptation to dynamic environments. Recently, with the aid of sensing enhancement, more environment information can be obtained. Based on this, from the radio wave propagation perspective, we propose a predictive 6G network with environment sensing enhancement, the electromagnetic wave propagation characteristics prediction enabled network (EWave Net), to further release the potential of 6G. To this end, a prediction plane is created to sense, predict, and utilize the physical environment information in EWave Net, enabling timely prediction of electromagnetic wave propagation characteristics. A two-level closed feedback workflow is also designed to enhance the sensing and prediction ability of EWave Net. Several promising application cases of EWave Net are analyzed, and the open issues to achieve this goal are finally addressed.
In recent years, the need for fast, efficient, and reliable wireless networks has increased dramatically. Numerous 5G networks have already been tested, while a few are in the early stages of deployment. In non-cooperative communication scenarios, the recognition of digital signal modulations helps identify communication targets and ensures effective management over them. The recent advancements in both Machine Learning (ML) and Deep Learning (DL) models demand the development of effective modulation recognition models with self-learning capability. Against this background, the current research article designs a Deep Learning enabled Intelligent Modulation Recognition of Communication Signal (DLIMR-CS) technique for next-generation networks. The aim of the proposed DLIMR-CS technique is to classify different kinds of digitally-modulated signals. In addition, the fractal feature extraction process is applied with the help of the Sevcik Fractal Dimension (SFD) approach. Then, the extracted features are fed into the Deep Variational Autoencoder (DVAE) model for the classification of the modulated signals. In order to improve the classification performance of the DVAE model, the Tunicate Swarm Algorithm (TSA) is used to fine-tune the hyperparameters involved in the DVAE model. A wide range of simulations was conducted to establish the enhanced performance of the proposed DLIMR-CS model. The experimental outcomes confirmed the superior recognition rate of the DLIMR-CS model over recent state-of-the-art methods under different evaluation parameters.
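The Sevcik Fractal Dimension used for feature extraction has a simple closed form: after normalizing the waveform into the unit square, D ≈ 1 + ln(L)/ln(2(N−1)), where L is the length of the normalized curve and N the number of samples. A sketch, assuming uniformly sampled signals:

```python
import numpy as np

def sevcik_fd(x):
    """Sevcik fractal dimension of a uniformly sampled waveform."""
    x = np.asarray(x, dtype=float)
    n = x.size
    span = x.max() - x.min()
    if span == 0.0:
        return 1.0  # a flat signal is a straight line
    xs = (x - x.min()) / span          # amplitude normalized to [0, 1]
    t = np.linspace(0.0, 1.0, n)       # time normalized to [0, 1]
    L = np.sum(np.hypot(np.diff(t), np.diff(xs)))  # curve length
    return 1.0 + np.log(L) / np.log(2.0 * (n - 1))
```

A smooth ramp yields a value near 1, while white noise pushes the dimension toward 2, which is what makes the SFD a useful discriminative feature.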
With the evolution of sixth generation (6G) mobile communication technology, ample attention has gone to integrated terrestrial-satellite networks. This paper notes that four typical application scenarios of integrated terrestrial-satellite networks are integrated into an ultra dense satellite-enabled 6G network architecture. Then the subchannel and power allocation schemes for the downlink of the ultra dense satellite-enabled 6G heterogeneous networks are introduced. Satellite mobile edge computing (SMEC) with edge caching in three-layer heterogeneous networks serves to reduce the link traffic of networks. Furthermore, a scheme for interference management is presented, involving quality-of-service (QoS) and co-tier/cross-tier interference constraints. The simulation results show that the proposed schemes can significantly increase the total capacity of ultra dense satellite-enabled 6G heterogeneous networks.
Non-orthogonal multiple access (NOMA), multiple-input multiple-output (MIMO), and mobile edge computing (MEC) are prominent technologies to meet high data rate demands in sixth generation (6G) communication networks. In this paper, we aim to minimize the transmission delay in MIMO-MEC in order to improve the spectral efficiency, energy efficiency, and data rate of MEC offloading. The Dinkelbach transform and generalized singular value decomposition (GSVD) method are used to solve the delay minimization problem. Analytical results are provided to evaluate the performance of the proposed Hybrid-NOMA-MIMO-MEC system. Simulation results reveal that the H-NOMA-MIMO-MEC system can achieve better delay performance and lower energy consumption compared to OMA.
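The Dinkelbach transform turns a fractional objective f(x)/g(x) into a sequence of parametric problems max f(x) − λ·g(x), updating λ to the best ratio found until the parametric optimum reaches zero. A generic sketch over a finite candidate set (not the paper's GSVD-based formulation):

```python
import numpy as np

def dinkelbach(candidates, f, g, tol=1e-9, max_iter=100):
    """Maximize f(x)/g(x) over a finite candidate set, assuming g(x) > 0."""
    lam = 0.0
    best = candidates[0]
    for _ in range(max_iter):
        # inner step: maximize the parametric objective f(x) - lam * g(x)
        vals = [f(x) - lam * g(x) for x in candidates]
        best = candidates[int(np.argmax(vals))]
        if f(best) - lam * g(best) < tol:
            break  # F(lam) ~ 0  =>  lam is the optimal ratio
        lam = f(best) / g(best)
    return best, lam
```

A natural toy instance is energy-efficiency maximization, rate/(power + circuit power); the channel gain, noise, and circuit power values one plugs in are purely illustrative.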
The Sixth Generation (6G) wireless communication network is expected to provide global coverage, enhanced spectral efficiency, AI (Artificial Intelligence)-native intelligence, and more. To meet these requirements, the computational concept of decision-making in cognition intelligence, its implementation framework adapting to foreseen innovations in networks and services, and its empirical evaluations are key techniques to guarantee the generation-agnostic intelligence evolution of wireless communication networks. In this paper, we propose an Intelligent Decision Making (IDM) framework, acting as the network brain, based on the Reinforcement Learning modelling philosophy to empower the 6G network with autonomous intelligence evolution capability. Besides, usage scenarios and simulation demonstrate the generality and efficiency of IDM. We hope that some of the ideas of IDM will assist the research of 6G networks in a new or different light.
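The IDM framework builds on Reinforcement Learning. As a toy sketch of the underlying mechanism only (the two-state congestion environment and its rewards are invented for illustration and are not part of IDM), tabular Q-learning learns a per-state control action from rewards:

```python
import random

def q_learning(n_states, n_actions, step, steps=5000,
               alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning with epsilon-greedy exploration."""
    rng = random.Random(seed)
    Q = [[0.0] * n_actions for _ in range(n_states)]
    s = 0
    for _ in range(steps):
        if rng.random() < eps:
            a = rng.randrange(n_actions)            # explore
        else:
            a = max(range(n_actions), key=lambda x: Q[s][x])  # exploit
        s2, r = step(s, a)
        # one-step temporal-difference update
        Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
        s = s2
    return Q

# Hypothetical network-control environment: state 1 = congested.
# Action 1 clears congestion; action 0 does nothing.
def toy_step(s, a):
    if s == 0:
        return (0, 1.0) if a == 0 else (1, 0.0)
    return (0, 1.0) if a == 1 else (1, 0.0)
```

After training, the greedy policy keeps the network idle when uncongested and intervenes when congested, which is the shape of autonomous decision loops the abstract describes.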
In this paper, we develop a 6G wireless powered Internet of Things (IoT) system assisted by unmanned aerial vehicles (UAVs) to intelligently supply energy and collect data at the same time. In our dual-UAV scheme, UAV-E, with a constant power supply, transmits energy to charge the IoT devices on the ground, whereas UAV-B serves the IoT devices by data collection as a base station. In this framework, the system's energy efficiency is maximized, which we define as the ratio of the sum rate of IoT devices to the energy consumption of the two UAVs during a fixed working duration. With the constraints of duration, transmit power, energy, and mobility, a difficult non-convex problem arises from jointly optimizing the trajectory, time duration allocation, and uplink transmit power. To tackle this non-convex fractional optimization problem, we decompose it into three subproblems and solve each of them iteratively using the descent method in conjunction with sequential convex approximation (SCA) approaches and the Dinkelbach algorithm. The simulation findings indicate that the suggested cooperative design has the potential to greatly increase the energy efficiency of the 6G intelligent UAV-assisted wireless powered IoT system when compared to previous benchmark systems.
Sixth-generation (6G) wireless communication networks are anticipated to integrate aerial, terrestrial, and maritime communication into a robust system that accomplishes trustworthy, quick, and low-latency needs, enabling maximum throughput and low delay for several applications. Besides, the evolution of 6G leads to the design of unmanned aerial vehicles (UAVs) that provide inexpensive and effective solutions in various application areas such as healthcare and environment monitoring. In a UAV network, effective data collection under restricted energy capacity poses a major obstacle to achieving high quality network communication. It can be addressed by the use of clustering techniques for UAVs in 6G networks. In this aspect, this study develops a novel metaheuristic based energy efficient data gathering scheme for clustered unmanned aerial vehicles (MEEDG-CUAV). The proposed MEEDG-CUAV technique partitions the UAV network into various clusters and assigns a cluster head (CH) to each, in order to reduce the overall energy utilization. Besides, the quantum chaotic butterfly optimization algorithm (QCBOA) with a fitness function is derived to choose CHs and construct clusters. The experimental validation of the MEEDG-CUAV technique was carried out on a benchmark dataset, and the experimental results highlighted its better performance over other state-of-the-art techniques in terms of different measures.
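QCBOA selects cluster heads by optimizing a fitness function. As an illustrative stand-in only (the weights, the residual-energy and mean-distance terms, and the exhaustive argmin used here instead of the metaheuristic are all assumptions, not the paper's formulation), an energy- and distance-aware CH fitness might look like:

```python
import math

def ch_fitness(i, nodes, energies, w_e=0.5, w_d=0.5):
    """Lower is better: combine normalized energy deficit and mean
    distance from candidate i to all other nodes."""
    e_max = max(energies)
    dists = [math.dist(nodes[i], n) for j, n in enumerate(nodes) if j != i]
    d_mean = sum(dists) / len(dists)
    d_max = max(math.dist(a, b) for a in nodes for b in nodes)
    return w_e * (1.0 - energies[i] / e_max) + w_d * (d_mean / d_max)

def select_ch(nodes, energies):
    """Pick the node minimizing the fitness (a metaheuristic such as
    QCBOA would search this landscape instead of enumerating it)."""
    return min(range(len(nodes)), key=lambda i: ch_fitness(i, nodes, energies))
```

A centrally located node with high residual energy wins, which matches the energy-utilization goal the abstract states.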
The Internet of Things (IoT) is the fourth technological revolution in the global information industry after computers, the Internet, and mobile communication networks. It combines radio-frequency identification devices, infrared sensors, global positioning systems, and various other technologies. Information sensing equipment is connected via the Internet, thus forming a vast network. When these physical devices are connected to the Internet, the user terminal can be extended and expanded to exchange information, communicate with anything, and carry out identification, positioning, tracking, monitoring, and triggering of corresponding events on each device in the network. In real life, the IoT has a wide range of applications, covering many fields, such as smart homes, smart logistics, fine agriculture and animal husbandry, national defense, and the military. One of the most significant factors in wireless channels is interference, which degrades system performance. Although the existing QR decomposition-based signal detection method is an emerging topic because of its low complexity, it does not solve the problem of poor detection performance. Therefore, this study proposes a maximum-likelihood-based QR decomposition algorithm. The main idea is to estimate the initial detection layer using the maximum likelihood principle; the other layers are then detected using reliable decisions, and the optimal candidate is selected from the feedback by deploying candidate points in unreliable scenarios. Simulation results show that the proposed algorithm effectively reduces the interference and propagation error compared with the algorithms reported in the literature.
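For context, the baseline this paper improves on is QR-decomposition detection with layer-by-layer slicing (successive interference cancellation). The sketch below shows that plain baseline only, not the proposed maximum-likelihood-assisted variant; the channel, constellation, and dimensions are illustrative:

```python
import numpy as np

def qr_detect(H, y, constellation):
    """Detect s in y = H s + n layer by layer via QR decomposition."""
    Q, R = np.linalg.qr(H)
    z = Q.conj().T @ y           # rotate: z = R s + Q^H n
    n = H.shape[1]
    s_hat = np.zeros(n, dtype=complex)
    for i in range(n - 1, -1, -1):
        # cancel already-detected layers, then slice to nearest symbol
        resid = z[i] - R[i, i + 1:] @ s_hat[i + 1:]
        est = resid / R[i, i]
        s_hat[i] = min(constellation, key=lambda c: abs(c - est))
    return s_hat
```

The weakness the abstract targets is visible here: a wrong decision in an early layer propagates to every later cancellation step, which is why the paper reinforces the first detected layer with a maximum-likelihood estimate.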
5G networks apply adaptive modulation and coding according to the channel condition reported by the user in order to maintain mobile communication quality. However, the delay incurred by the feedback may make the channel quality indicator (CQI) obsolete. This paper addresses this issue by proposing two approaches, one based on machine learning and another on evolutionary computing, which consider the user context and signal-to-interference-plus-noise ratio (SINR), besides the delay length, to estimate the updated SINR to be mapped into a CQI value. Our proposals are designed to run at the user equipment (UE) side, neither requiring any change in the signalling between the base station (gNB) and UE nor overloading the gNB. They are evaluated in terms of mean squared error using 5G network simulation data, and the results show their high accuracy and feasibility for employment in 5G/6G systems.
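The estimation step can be pictured as a regression from (stale SINR, delay) to updated SINR, followed by a threshold lookup into a CQI index. The sketch below uses ordinary least squares in place of the paper's ML/evolutionary models, and the CQI thresholds are illustrative values, not a 3GPP table:

```python
import numpy as np

# Hypothetical SINR thresholds (dB) separating CQI indices 0..15
THRESH = np.linspace(-6.0, 20.0, 15)

def sinr_to_cqi(sinr_db):
    """Map a SINR estimate to a CQI index via threshold lookup."""
    return int(np.searchsorted(THRESH, sinr_db))

def fit_predictor(stale_sinr, delay, updated_sinr):
    """Least-squares fit: updated ~ w0*stale + w1*delay + w2."""
    A = np.column_stack([stale_sinr, delay, np.ones_like(stale_sinr)])
    w, *_ = np.linalg.lstsq(A, updated_sinr, rcond=None)
    return w

def predict(w, stale_sinr, delay):
    return w[0] * stale_sinr + w[1] * delay + w[2]
```

Since this runs entirely on locally observable quantities, it matches the UE-side constraint the abstract emphasizes: no extra signalling toward the gNB is needed.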
Unmanned aerial vehicle (UAV)-enabled edge computing is emerging as a potential enabler for the Artificial Intelligence of Things (AIoT) in the forthcoming sixth-generation (6G) communication networks. With the use of flexible UAVs, massive sensing data is gathered and processed promptly regardless of geographical location. Deep neural networks (DNNs) are becoming a driving force for extracting valuable information from sensing data. However, the lightweight servers installed on UAVs are not able to meet the extremely high requirements of inference tasks due to the limited battery capacities of UAVs. In this work, we investigate a DNN model placement problem for AIoT applications, where trained DNN models are selected and placed on UAVs to execute inference tasks locally. It is impractical to obtain future DNN model request profiles and system operation states in UAV-enabled edge computing, so the Lyapunov optimization technique is leveraged for the proposed DNN model placement problem. Based on the observed system state, an advanced online placement (AOP) algorithm is developed to solve the transformed problem in each time slot, which can reduce DNN model transmission delay and disk I/O energy cost simultaneously while keeping the input data queues stable. Finally, extensive simulations are provided to depict the effectiveness of the AOP algorithm. The numerical results demonstrate that the AOP algorithm can reduce the model placement cost by 18.14% and the input data queue backlog by 29.89% on average compared with benchmark algorithms.
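Lyapunov optimization reduces such long-term problems to a per-slot drift-plus-penalty rule: pick the action minimizing V·cost(a) − Q·service(a), trading immediate cost against queue backlog. A toy sketch (the action set, costs, and deterministic arrival rate below are invented for illustration, not the AOP algorithm itself):

```python
def drift_plus_penalty(Q, actions, cost, service, V):
    """Per-slot greedy rule: minimize V*cost(a) - Q*service(a).
    Larger V favors low cost; a large backlog Q forces service."""
    return min(actions, key=lambda a: V * cost(a) - Q * service(a))
```

Simulating this rule with a steady arrival rate shows the characteristic behavior: the controller idles while the queue is short and serves once backlog builds, keeping the queue bounded, which is exactly the stability property the AOP algorithm relies on.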
6G IoT networks aim to provide significantly higher data rates and extremely lower latency. However, due to increasingly scarce spectrum bands and the ever-growing massive number of IoT devices (IoDs) deployed, 6G IoT networks face two critical challenges, i.e., energy limitation and severe signal attenuation. Simultaneous wireless information and power transfer (SWIPT) and cooperative relaying provide effective ways to address these two challenges. In this paper, we investigate the energy self-sustainability (ESS) of 6G IoT networks and propose an OFDM based bidirectional multi-relay SWIPT strategy for 6G IoT networks. In the proposed strategy, the transmission process is equally divided into two phases. Specifically, in phase 1, two source nodes transmit their signals to relay nodes, which then use different subcarrier sets to decode information and harvest energy, respectively. In phase 2, the relay nodes forward signals to the corresponding destination nodes with the harvested energy. We maximize the weighted sum transmission rate by optimizing subcarrier and power allocation. Our proposed strategy achieves a larger weighted sum transmission rate compared with the benchmark scheme.
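The power-allocation half of such subcarrier-and-power problems is often solved by classic water-filling once the subcarrier sets are fixed. A generic sketch (the gains and power budget used below are illustrative; the paper's joint bidirectional optimization is more involved):

```python
import numpy as np

def water_filling(gains, p_total):
    """Allocate p_total across subcarriers with gains g_i to maximize
    sum log(1 + g_i * p_i): p_i = max(mu - 1/g_i, 0) at water level mu."""
    g = np.asarray(gains, dtype=float)
    active = np.ones(g.size, dtype=bool)
    while True:
        # water level over the currently active subcarriers
        mu = (p_total + (1.0 / g[active]).sum()) / active.sum()
        p = mu - 1.0 / g
        if (p[active] >= -1e-12).all():
            p[~active] = 0.0
            return np.maximum(p, 0.0)
        active &= p > 0   # drop subcarriers that would get negative power
```

Strong subcarriers receive more power and very weak ones are switched off entirely, which is why pairing water-filling with a good subcarrier assignment raises the weighted sum rate.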
Funding (probabilistic trust model paper): Comunidad de Madrid, within the framework of the Multiannual Agreement with Universidad Politécnica de Madrid to encourage research by young doctors (PRINCE project).
Funding (EWave Net paper): National Natural Science Foundation of China (Nos. 92167202, 61925102, U21B2014, 62101069); National Key R&D Program of China (No. 2020YFB1805002).
Funding (ultra dense satellite-enabled 6G networks paper): National Key R&D Program of China (2020YFB1806103); National Natural Science Foundation of China (Grants 62225103 and U22B2003); Beijing Natural Science Foundation (L212004); China University Industry-University-Research Collaborative Innovation Fund (2021FNA05001).
Funding (Hybrid-NOMA-MIMO-MEC paper): Republic of Turkey Ministry of National Education.
Funding (IDM paper): National Key Research and Development Project 2018YFE0205503; Beijing University of Posts and Telecommunications - China Mobile Research Institute Joint Innovation Center.
Funding (dual-UAV wireless powered IoT paper): Natural Science Foundation of Beijing Municipality under Grant L192034.
Funding (MEEDG-CUAV paper): The authors extend their appreciation to the Deanship of Scientific Research at King Khalid University for funding this work under Grant Number RGP 1/279/42 (www.kku.edu.sa).
Funding (maximum-likelihood QR decomposition paper): Fujitsu-Waseda Digital Annealer (FWDA) Research Project and Fujitsu Co-Creation Research Laboratory at Waseda University (joint research between Waseda University and Fujitsu Lab); partly supported by the School of Fundamental Science and Engineering, Faculty of Science and Engineering, Waseda University, Japan; the Institute for Information & Communications Technology Planning & Evaluation (IITP) Grant funded by the Korean government (MSIT) (No. 2019-0-01343, Training Key Talents in Industrial Convergence Security); and Research Cluster Project R20143, Zayed University Research Office.
Funding: Supported by Motorola Mobility, the National Council for Scientific and Technological Development (No. 433142/2018-9), Research Productivity Fellowship (No. 312831/2020-0), and the Pernambuco Research Foundation (FACEPE).
Abstract: 5G networks apply adaptive modulation and coding according to the channel condition reported by the user in order to maintain mobile communication quality. However, the delay incurred by the feedback may render the channel quality indicator (CQI) obsolete. This paper addresses this issue by proposing two approaches, one based on machine learning and another on evolutionary computing, which consider the user context and signal-to-interference-plus-noise ratio (SINR), in addition to the delay length, to estimate the updated SINR to be mapped into a CQI value. Our proposals are designed to run on the user equipment (UE) side, neither requiring any change in the signalling between the base station (gNB) and the UE nor overloading the gNB. They are evaluated in terms of mean squared error using 5G network simulation data, and the results show their high accuracy and feasibility for deployment in 5G/6G systems.
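The pipeline described above can be illustrated with a simple least-squares sketch: a linear predictor of the updated SINR from the delayed SINR plus context features, followed by a threshold mapping from SINR to a CQI index. The feature set, the linear model, and the threshold values are illustrative assumptions, not the paper's ML/evolutionary models or the 3GPP CQI table.

```python
import numpy as np

# Illustrative SINR-to-CQI thresholds in dB (15 steps -> CQI 1..15);
# the real mapping depends on the 3GPP CQI table and the BLER target.
CQI_THRESHOLDS_DB = np.arange(-6.0, 24.0, 2.0)

def sinr_to_cqi(sinr_db):
    """Map an SINR estimate (dB) to a CQI index (0 = out of range)."""
    return int(np.searchsorted(CQI_THRESHOLDS_DB, sinr_db))

def fit_sinr_predictor(X, y):
    """Least-squares fit of the current SINR from delayed measurements.

    Assumed feature columns: delayed SINR (dB), feedback delay (ms),
    UE speed (m/s). A bias column is appended before the fit.
    """
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return w

def predict_sinr(w, x):
    """Predict the updated SINR for one feature vector."""
    return float(np.append(x, 1.0) @ w)
```

Because both the fit and the prediction use only local measurements, a model of this shape runs entirely on the UE side, consistent with the paper's no-extra-signalling constraint.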
Funding: Supported by the National Science Foundation of China (Grant No. 62202118), the Top-Technology Talent Project from the Guizhou Education Department (Qianjiao Ji [2022] 073), the Natural Science Foundation of Hebei Province (Grants No. F2022203045 and F2022203026), and the Central Government Guided Local Science and Technology Development Fund Project (Grant No. 226Z0701G).
Abstract: Unmanned aerial vehicle (UAV)-enabled edge computing is emerging as a potential enabler for the Artificial Intelligence of Things (AIoT) in the forthcoming sixth-generation (6G) communication networks. With the use of flexible UAVs, massive sensing data can be gathered and processed promptly regardless of geographical location. Deep neural networks (DNNs) are becoming a driving force for extracting valuable information from sensing data. However, the lightweight servers installed on UAVs cannot meet the extremely high requirements of inference tasks because of the limited battery capacities of UAVs. In this work, we investigate a DNN model placement problem for AIoT applications, where trained DNN models are selected and placed on UAVs to execute inference tasks locally. Because it is impractical to obtain future DNN model request profiles and system operation states in UAV-enabled edge computing, the Lyapunov optimization technique is leveraged for the proposed DNN model placement problem. Based on the observed system state, an advanced online placement (AOP) algorithm is developed to solve the transformed problem in each time slot, which can reduce DNN model transmission delay and disk I/O energy cost simultaneously while keeping the input data queues stable. Finally, extensive simulations are provided to illustrate the effectiveness of the AOP algorithm. The numerical results show that the AOP algorithm reduces the model placement cost by 18.14% and the input data queue backlog by 29.89% on average compared with benchmark algorithms.
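The per-slot decision rule of a Lyapunov drift-plus-penalty controller, the general technique the abstract invokes, can be sketched generically as follows. The scalar queue, the action set of (cost, service-rate) pairs, and the parameter names are illustrative placeholders, not the paper's placement variables.

```python
def drift_plus_penalty_step(Q, V, actions, arrival):
    """One slot of a Lyapunov drift-plus-penalty controller (sketch).

    Q       -- current queue backlog
    V       -- trade-off weight: larger V favors low cost over low backlog
    actions -- list of (cost, service_rate) pairs available this slot
    arrival -- workload arriving this slot

    Picks the action minimizing V*cost - Q*service, then updates the
    queue with the standard max(Q + arrival - service, 0) dynamics.
    """
    best = min(actions, key=lambda a: V * a[0] - Q * a[1])
    cost, service = best
    Q_next = max(Q + arrival - service, 0.0)
    return best, Q_next
```

The key property, which the AOP algorithm exploits per the abstract, is that this rule needs only the currently observed backlog and arrivals, never future request profiles.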
Funding: This work was supported by the China National Science Foundation under Grant No. 61871348, by the University Key Laboratory of Advanced Wireless Communications of Guangdong Province, by a project funded by the China Postdoctoral Science Foundation under Grant 2019T120531, by the Science and Technology Development Fund, Macao, China, under Grant 0162/2019/A3, and by the Fundamental Research Funds for the Provincial Universities of Zhejiang under Grant RFA2019001.
Abstract: 6G IoT networks aim to provide significantly higher data rates and extremely lower latency. However, owing to increasingly scarce spectrum bands and the ever-growing massive number of IoT devices (IoDs) deployed, 6G IoT networks face two critical challenges, i.e., energy limitation and severe signal attenuation. Simultaneous wireless information and power transfer (SWIPT) and cooperative relaying provide effective ways to address these two challenges. In this paper, we investigate the energy self-sustainability (ESS) of 6G IoT networks and propose an OFDM-based bidirectional multi-relay SWIPT strategy for 6G IoT networks. In the proposed strategy, the transmission process is equally divided into two phases. Specifically, in phase 1, two source nodes transmit their signals to relay nodes, which then use different subcarrier sets to decode information and harvest energy, respectively. In phase 2, the relay nodes forward the signals to the corresponding destination nodes using the harvested energy. We maximize the weighted sum transmission rate by optimizing subcarrier and power allocation. The proposed strategy achieves a larger weighted sum transmission rate compared with the benchmark scheme.
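The power-allocation step of such a subcarrier-optimized scheme can be illustrated with classical water-filling over the information-decoding subcarriers, paired with a linear energy-harvesting model for the remaining subcarriers. Both the water-filling form and the efficiency factor `eta` are textbook assumptions for the sketch, not the paper's joint subcarrier-and-power optimization.

```python
import numpy as np

def water_fill(gains, P):
    """Water-filling power allocation over subcarriers.

    gains: per-subcarrier channel-gain-to-noise ratios (|h|^2 / N0).
    Returns per-subcarrier powers summing to the total budget P.
    """
    order = np.argsort(gains)[::-1]
    g = gains[order]
    p = np.zeros_like(g)
    for k in range(len(g), 0, -1):
        # candidate water level if the k strongest subcarriers are active
        mu = (P + np.sum(1.0 / g[:k])) / k
        if mu - 1.0 / g[k - 1] > 0:
            p[:k] = mu - 1.0 / g[:k]
            break
    out = np.zeros_like(gains)
    out[order] = p
    return out

def harvested_energy(eh_gains, tx_power, eta=0.5, slot=1.0):
    """Linear EH model: energy scavenged on the energy-harvesting
    subcarriers during one phase (eta is an assumed RF-to-DC efficiency)."""
    return eta * tx_power * float(np.sum(eh_gains)) * slot
```

In a two-phase strategy of the kind described, the energy returned by `harvested_energy` in phase 1 would bound the relay transmit power available for forwarding in phase 2.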