Funding: This work is supported by the National Key R&D Program of China under Grant No. 2018YFB0505000; the National Natural Science Foundation of China under Grants No. U1836115, No. 61922045, and No. 61672295; the Natural Science Foundation of Jiangsu Province under Grant No. BK20181408; the State Key Laboratory of Cryptology Foundation; the Guangxi Key Laboratory of Cryptography and Information Security under Grant No. GCIS201715; the CICAEET fund; and the PAPD fund.
Abstract: With the rapid spread of smart sensors, data collection is becoming increasingly important in Mobile Edge Networks (MENs). The collected data can serve many applications once analyzed by cloud computing. Data collection schemes have been widely studied, but most of this research considers only the amount of data collected and overlooks the privacy leakage of the collected data. In this paper, we propose an energy-efficient and anonymous data collection scheme for MENs that balances energy consumption against data privacy, hiding the private information of sensors during data communication. In addition, the scheme takes the residual energy of nodes into account, particularly when selecting the relay node. The security analysis shows that no private information of the source node or relay node is leaked to attackers, and the simulation results demonstrate that the proposed scheme outperforms other schemes in terms of network lifetime and energy consumption. The simulation section concludes with a qualitative comparison of the proposed scheme against several conventional protocols, in which the proposed scheme again outperforms the existing protocols on the above indicators.
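The idea of weighing residual energy when choosing a relay node can be illustrated with a minimal sketch. The scoring rule and parameter names below are purely hypothetical and do not reproduce the paper's actual selection criterion; they only show the general shape of an energy-aware relay choice.

```python
# Hypothetical sketch of energy-aware relay selection: among candidate
# neighbours, pick the one maximising a weighted score of residual
# energy versus forwarding distance. The weighting alpha and the score
# itself are illustrative, not the scheme from the paper.

def select_relay(candidates, alpha=0.7):
    """candidates: list of (node_id, residual_energy, distance_to_sink)."""
    def score(node):
        _, energy, dist = node
        # Favour nodes with more remaining energy and a shorter path to the sink.
        return alpha * energy - (1 - alpha) * dist
    return max(candidates, key=score)[0]
```

A real scheme would normalise energy and distance to comparable scales and would also account for the anonymity constraints described in the abstract.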
Funding: This work was supported by the National Natural Science Foundation of China (61871058, WYF, http://www.nsfc.gov.cn/).
Abstract: Emerging mobile edge networks with content caching capability allow end users to receive information directly from adjacent edge servers instead of a centralized data warehouse, which significantly reduces network transmission delay and improves system throughput. Because duplicate content transmissions between the edge network and the remote cloud are reduced, an appropriate caching strategy can also greatly improve the energy efficiency of mobile edge networks. This paper focuses on improving network energy efficiency and proposes an intelligent caching strategy, built on a cached-content distribution model for mobile edge networks, based on a promising deep reinforcement learning algorithm. A deep neural network (DNN) and the Q-learning algorithm are combined into a deep reinforcement learning framework, the deep-Q neural network (DQN), in which the DNN approximates the action-state value function of the Q-learning solution. The parameter iteration strategy of the proposed DQN algorithm is improved through stochastic gradient descent, so the algorithm converges quickly to the optimal solution and the network performance of the content caching policy is optimized. Simulation results show that, with sufficient training steps, the proposed intelligent DQN-based content caching strategy significantly improves the energy efficiency of mobile edge networks.
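The core DQN update described above, in which a network approximates the action-state value function and is fitted by stochastic gradient descent toward a bootstrapped Q-learning target, can be sketched as follows. A linear model stands in for the DNN so the example stays self-contained; all parameter values are illustrative and none of this reproduces the paper's actual architecture.

```python
import numpy as np

# Minimal sketch of one Q-learning step with a function approximator.
# Q(s, a) is modelled as w[a] @ phi(s), where phi is a state feature
# vector; the SGD step minimises the squared temporal-difference error.

def dqn_step(w, phi_s, a, r, phi_next, gamma=0.9, lr=0.1):
    """w: (n_actions, n_features) weights; returns updated w and the TD error."""
    target = r + gamma * np.max(w @ phi_next)   # bootstrapped Q-learning target
    td_error = target - w[a] @ phi_s            # temporal-difference error
    w[a] += lr * td_error * phi_s               # SGD step toward the target
    return w, td_error
```

In the full DQN of the abstract, the linear map would be replaced by a DNN and the gradient step applied to its weights, typically with experience replay and a separate target network for stability.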
Funding: This work was supported by the Innovation Fund Project of Jiangxi Normal University (YJS2022065) and the Domestic Visiting Program of Jiangxi Normal University.
Abstract: Mobile Edge Computing (MEC) is a technology designed for the on-demand provisioning of computing and storage services, strategically positioned close to users. In the MEC environment, frequently accessed content can be deployed and cached on edge servers to improve the efficiency of content delivery and, ultimately, the quality of the user experience. However, because edge devices and nodes are typically placed at the network's periphery, they face a range of fault-tolerance challenges, including network instability, device failures, and resource constraints. Given the dynamic nature of MEC, making high-quality content caching decisions for real-time, latency-sensitive mobile applications by effectively exploiting mobility information remains a significant challenge. In response, this paper introduces FT-MAACC, a mobility-aware caching solution grounded in multi-agent deep reinforcement learning and equipped with fault-tolerance mechanisms. The approach integrates content adaptivity algorithms to evaluate the priority of highly user-adaptive cached content, relies on collaborative caching strategies based on multi-agent deep reinforcement learning models, and establishes a fault-tolerance model to ensure the system's reliability, availability, and persistence. Empirical results demonstrate that FT-MAACC outperforms its peer methods in cache hit rate and transmission latency.
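The cache hit rate used to evaluate FT-MAACC can be made concrete with a simple baseline: measuring the hit rate of a plain least-recently-used (LRU) cache over a request trace, the kind of classical policy a learned caching strategy would be compared against. The trace and capacity below are made up for illustration.

```python
from collections import OrderedDict

# Illustrative hit-rate measurement for a plain LRU cache baseline.
# A learned policy such as the one in the abstract would aim to beat
# this rate on the same request trace.

def lru_hit_rate(requests, capacity):
    cache = OrderedDict()
    hits = 0
    for item in requests:
        if item in cache:
            hits += 1
            cache.move_to_end(item)        # refresh recency on a hit
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict least-recently-used entry
            cache[item] = True
    return hits / len(requests)
```

Hit rate is the fraction of requests served from the cache; transmission latency, the abstract's other metric, falls as the hit rate rises because fewer requests travel to the remote cloud.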