The rapid development of emerging technologies, such as edge intelligence and digital twins, has added momentum towards the development of the Industrial Internet of Things (IIoT). However, the massive amount of data generated by the IIoT, coupled with heterogeneous computation capacity across IIoT devices and users’ data privacy concerns, has posed challenges towards achieving industrial edge intelligence (IEI). To achieve IEI, in this paper we propose a semi-federated learning framework in which the portion of the data with higher privacy is kept locally, while a portion of the less private data can be uploaded to the edge server. In addition, we leverage digital twins to overcome the computation capacity heterogeneity of IIoT devices through the mapping of physical entities. We formulate a synchronization latency minimization problem that jointly optimizes edge association and the proportion of uploaded non-private data. As the joint problem is NP-hard and combinatorial, and taking into account the reality of large-scale device training, we develop a multi-agent hybrid-action deep reinforcement learning (DRL) algorithm to find the optimal solution. Simulation results show that the proposed DRL algorithm reduces latency and achieves better convergence for semi-federated learning than benchmark algorithms.
Funding: supported in part by the National Natural Science Foundation of China under Grant 62001168, and in part by the Foundation and Application Research Grant of Guangzhou under Grant 202102020515.
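A minimal sketch of the data-partitioning step behind the semi-federated setting described above; the sample format, the is_private flag, and the upload_fraction parameter are illustrative assumptions rather than the paper's implementation.

```python
import random

def partition_device_data(samples, upload_fraction):
    """Split one device's dataset: samples flagged as private never leave the
    device, while a tunable fraction of the non-private samples is selected
    for upload to the edge server."""
    private = [s for s in samples if s["is_private"]]
    non_private = [s for s in samples if not s["is_private"]]
    k = int(len(non_private) * upload_fraction)
    chosen = set(random.sample(range(len(non_private)), k))
    uploaded = [s for i, s in enumerate(non_private) if i in chosen]
    kept = private + [s for i, s in enumerate(non_private) if i not in chosen]
    return kept, uploaded

# Example: keep every private sample locally and upload 30% of the rest.
device_data = [{"x": i, "is_private": i % 4 == 0} for i in range(20)]
local_set, upload_set = partition_device_data(device_data, upload_fraction=0.3)
print(len(local_set), len(upload_set))
```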
The integration of digital twin (DT) and 6G edge intelligence provides accurate forecasting for distributed resource control in smart parks. However, the adverse impact of model poisoning attacks on DT model training cannot be ignored. To address this issue, we first construct models of DT model training and model poisoning attacks. An optimization problem is formulated to minimize the weighted sum of the DT loss function and the DT model training delay. The problem is then transformed and solved by the proposed Multi-timescAle endogenouS securiTy-aware DQN-based rEsouRce management algorithm (MASTER), based on DT-assisted state information evaluation and attack detection. MASTER adopts multi-timescale deep Q-learning (DQN) networks to jointly schedule local training epochs and devices. It actively adjusts resource management strategies based on the estimated attack probability to achieve endogenous security awareness. Simulation results demonstrate that MASTER achieves excellent performance in DT model training accuracy and delay.
Funding: supported by the Science and Technology Project of State Grid Corporation of China under Grant Number 52094021N010 (5400-202199534A-05-ZN).
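A toy two-timescale scheduling loop in the spirit of MASTER, using tabular epsilon-greedy Q-learning in place of the paper's DQN networks; the state encoding, action sets, timescale ratio K, and reward shape are all placeholder assumptions.

```python
import random
from collections import defaultdict

class QAgent:
    """Minimal tabular epsilon-greedy Q-learning agent, standing in for the
    DQN networks that MASTER uses; states and actions here are toy values."""
    def __init__(self, actions, eps=0.1, alpha=0.1, gamma=0.9):
        self.q = defaultdict(float)
        self.actions, self.eps, self.alpha, self.gamma = actions, eps, alpha, gamma

    def act(self, state):
        if random.random() < self.eps:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[(state, a)])

    def update(self, s, a, r, s2):
        best_next = max(self.q[(s2, a2)] for a2 in self.actions)
        self.q[(s, a)] += self.alpha * (r + self.gamma * best_next - self.q[(s, a)])

# Fast agent re-selects local training epochs every slot; slow agent re-selects
# the participating device every K slots, mirroring the two timescales.
fast = QAgent(actions=[1, 2, 4])      # number of local epochs
slow = QAgent(actions=[0, 1, 2])      # index of the scheduled device
K, state, device = 5, "s0", 0
for t in range(100):
    if t % K == 0:
        device = slow.act(state)
    epochs = fast.act(state)
    # Placeholder reward: penalize training delay and the estimated attack risk.
    attack_prob = 0.1 * device
    reward = -0.05 * epochs * (device + 1) - attack_prob
    fast.update(state, epochs, reward, state)
    if t % K == K - 1:
        slow.update(state, device, reward, state)
```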
The enormous volume of heterogeneous data from various smart device-based applications has increasingly given rise to a deeply interlaced cyber-physical system. In order to deliver smart cloud services that require low latency with strong computational processing capabilities, the Edge Intelligence System (EIS) concept is now being employed, which takes advantage of Artificial Intelligence (AI) and Edge Computing Technology (ECT). Thus, EIS presents a potential approach to enforcing future Intelligent Transportation Systems (ITS), particularly in the context of Vehicular Networks (VNets). However, the current EIS framework faces several issues and is conceivably vulnerable to multiple adversarial attacks because the central aggregator server handles the entire system orchestration. Hence, this paper introduces the concept of distributed edge intelligence, combining the advantages of Federated Learning (FL), Differential Privacy (DP), and blockchain to address the issues raised earlier. By performing decentralized data management and storing transactions in immutable distributed ledger networks, the blockchain-assisted FL method improves user privacy and boosts traffic prediction accuracy. Additionally, DP is utilized to defend users’ private data from various threats and to bolster the confidentiality of data-sharing transactions. Our model has been deployed with two strategies: first, DP-based FL to strengthen user privacy by masking the intermediate data during model uploading; second, blockchain-based FL to effectively construct secure and decentralized traffic management in vehicular networks. The simulation results demonstrate that our framework yields several benefits for VNets privacy protection by forming a distributed EIS with privacy budgets (ε) of 4.03, 1.18, and 0.522, achieving model accuracies of 95.8%, 93.78%, and 89.31%, respectively.
Funding: supported by the Republic of Korea’s MSIT (Ministry of Science and ICT) under the ICT Convergence Industry Innovation Technology Development Project (2022-0-00614) supervised by the IITP, and partially supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (No. 2021R1I1A3046590).
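A hedged sketch of the DP masking step applied to a local model update before upload; the clipping norm, the Gaussian calibration, and the (ε, δ) values are generic choices, not the exact mechanism used in the paper.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, epsilon=1.0, delta=1e-5):
    """Clip a local model update and add Gaussian noise before it is uploaded,
    a standard differential-privacy building block for federated learning."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + np.random.normal(0.0, sigma, size=update.shape)

# Smaller epsilon (tighter privacy budget) means more noise and, typically,
# lower model accuracy, matching the trend reported in the abstract.
noisy_update = privatize_update(np.random.randn(10), epsilon=0.522)
```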
With the booming of artificial intelligence (AI), the Internet of Things (IoT), and high-speed communication technology, integrating these technologies to further innovate the smart grid (SG) is the future development direction of the power grid. Driven by this trend, billions of devices in the SG are connected to the Internet and generate a large amount of data at the network edge. To reduce the pressure on cloud computing and overcome the defects of centralized learning, the emergence of edge computing (EC) shifts computing tasks from the network center to the network edge. In further exploring the relationship between EC and AI, edge intelligence (EI) has become one of the research hotspots. The advantages of EI in flexibly utilizing EC resources and improving AI model learning efficiency give its application in the SG a good prospect. However, since only a few existing studies have applied EI to the SG, this paper focuses on the application potential of EI in the SG. First, the concepts, characteristics, frameworks, and key technologies of EI are investigated. Then, a comprehensive review of AI and EC applications in the SG is presented. Furthermore, application potentials for EI in the SG are explored, and four application scenarios of EI for the SG are proposed. Finally, challenges and future directions for EI in the SG are discussed. This application survey of EI in the SG is carried out before EI enters the large-scale commercial stage, to provide references and guidelines for developing future EI frameworks in the SG paradigm.
Funding: supported by the Department of the Navy, Office of Naval Research Global, under N62909-19-1-2037.
With the vigorous development of artificial intelligence (AI), intelligent applications based on deep neural networks (DNNs) have changed people’s lifestyles and production efficiency. However, the large amount of computation and data generated from the network edge has become the major bottleneck, and the traditional cloud-based computing mode is unable to meet the requirements of real-time processing tasks. To solve these problems, by embedding AI model training and inference capabilities into the network edge, edge intelligence (EI) has become a cutting-edge direction in the field of AI. Furthermore, collaborative DNN inference among the cloud, edge, and end devices provides a promising way to boost EI. Nevertheless, EI-oriented collaborative DNN inference is still in its early stage, lacking systematic classification and discussion of existing research efforts. Motivated by this, we comprehensively investigate recent studies on EI-oriented collaborative DNN inference. In this paper, we first review the background and motivation of EI. Then, we classify four typical collaborative DNN inference paradigms for EI and analyse their characteristics and key technologies. Finally, we summarize the current challenges of collaborative DNN inference, discuss future development trends, and provide future research directions.
Funding: National Natural Science Foundation of China (Nos. 61931011, 62072303 and 61872310); the Key-area Research and Development Program of Guangdong Province, China (No. 2021B0101400003); Hong Kong Research Grants Council (RGC) Research Impact Fund, China (No. R5060-19); General Research Fund (Nos. 152221/19E, 152203/20E and 152244/21E); Shenzhen Science and Technology Innovation Commission, China (No. JCYJ20200109142008673).
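A minimal sketch of device-edge partitioned inference, the simplest of the collaborative paradigms mentioned above: the device runs the first few layers and ships the intermediate activation to the edge server. The toy layers and the fixed split point are illustrative assumptions.

```python
import numpy as np

# Toy "layers": random ReLU transforms standing in for a real DNN.
layers = [(lambda x, W=np.random.randn(16, 16): np.maximum(W @ x, 0.0)) for _ in range(6)]

def device_forward(x, split):
    """Run the first `split` layers on the end device."""
    for layer in layers[:split]:
        x = layer(x)
    return x  # intermediate activation transmitted to the edge

def edge_forward(x, split):
    """Run the remaining layers on the edge server."""
    for layer in layers[split:]:
        x = layer(x)
    return x

activation = device_forward(np.random.randn(16), split=2)
output = edge_forward(activation, split=2)
print(output.shape)
```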
The popularization of intelligent healthcare devices and big data analytics significantly boosts the development of Smart Healthcare Networks (SHNs). To enhance the precision of diagnosis, different participants in SHNs share health data that contain sensitive information. The data exchange process therefore raises privacy concerns, especially when the integration of health data from multiple sources (a linkage attack) results in further leakage. The linkage attack is a dominant type of attack in the privacy domain, which can leverage various data sources for private data mining. Furthermore, adversaries launch poisoning attacks to falsify health data, which leads to misdiagnosis or even physical harm. To protect private health data, we propose a personalized differential privacy model based on the trust levels among users. Trust is evaluated by a defined community density, while the corresponding privacy protection level is mapped to controllable randomized noise constrained by differential privacy. To avoid linkage attacks under personalized differential privacy, we design a noise correlation decoupling mechanism using a Markov stochastic process. In addition, we build the community model on a blockchain, which can mitigate the risk of poisoning attacks during differentially private data transmission over SHNs. Extensive experiments and analysis on real-world datasets have validated the proposed model, which achieves better performance than existing research in terms of both privacy protection and effectiveness.
Funding: supported by the National Key Research and Development Program of China (No. 2021YFF0900400).
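A small sketch of the personalized-noise idea: a per-user trust score is mapped to a privacy budget, which calibrates Laplace noise added to a shared health statistic. The trust-to-epsilon mapping, its range, and the sensitivity value are assumptions for illustration; the paper's mapping via community density and its noise decoupling mechanism are more involved.

```python
import numpy as np

def trust_to_epsilon(trust, eps_min=0.5, eps_max=4.0):
    """Map a trust score in [0, 1] to a per-user privacy budget:
    higher trust tolerates a larger epsilon, hence less noise."""
    return eps_min + trust * (eps_max - eps_min)

def personalized_laplace(value, trust, sensitivity=1.0):
    """Release one numeric health statistic with user-specific Laplace noise."""
    eps = trust_to_epsilon(trust)
    return value + np.random.laplace(0.0, sensitivity / eps)

print(personalized_laplace(37.2, trust=0.8))   # trusted peer, light noise
print(personalized_laplace(37.2, trust=0.1))   # untrusted peer, heavy noise
```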
To facilitate emerging applications and demands of edge intelligence (EI)-empowered 6G networks, model-driven semantic communications have been proposed to reduce transmission volume by deploying artificial intelligence (AI) models that provide the abilities of semantic extraction and recovery. Nevertheless, it is not feasible to preload all AI models on resource-constrained terminals, so in-time model transmission becomes a crucial problem. This paper proposes an intellicise model transmission architecture to guarantee the reliable transmission of models for semantic communication. The mathematical relationship between model size and performance is formulated by employing a recognition error function supported by experimental data. We consider the characteristics of wireless channels and derive the closed-form expression of the model transmission outage probability (MTOP) over the Rayleigh channel. Besides, we define the effective model accuracy (EMA) to evaluate the model transmission performance in terms of both communication and intelligence. We then propose a joint model selection and resource allocation (JMSRA) algorithm to maximize the average EMA of all users. Simulation results demonstrate that the average EMA of the JMSRA algorithm outperforms baseline algorithms by about 22%.
Funding: supported in part by the National Key R&D Program of China No. 2020YFB1806905, the National Natural Science Foundation of China No. 62201079, the Beijing Natural Science Foundation No. L232051, and the Major Key Project of Peng Cheng Laboratory (PCL), Department of Broadband Communication.
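For orientation, the textbook outage result over a Rayleigh fading channel is the usual starting point for a closed-form expression such as the MTOP above; the paper's exact MTOP will additionally depend on model size and the transmission deadline, so this is only the underlying building block:

```latex
P_{\mathrm{out}} = \Pr\{\gamma < \gamma_{\mathrm{th}}\}
                 = 1 - \exp\!\left(-\frac{\gamma_{\mathrm{th}}}{\bar{\gamma}}\right)
```

where \(\gamma\) is the instantaneous SNR (exponentially distributed under Rayleigh fading with mean \(\bar{\gamma}\)), and \(\gamma_{\mathrm{th}}\) is the SNR threshold below which the model cannot be delivered reliably within the deadline.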
With the proportion of intelligent services in the industrial internet of things (IIoT) rising rapidly, their data dependency and decomposability increase the difficulty of scheduling computing resources. In this paper, we propose an intelligent service computing framework. In the framework, we take the long-term rewards of its important participants, the edge service providers, as the optimization goal, which is related to service delay and computing cost. Considering the different update frequencies of data deployment and service offloading, double-timescale reinforcement learning is utilized in the framework. In the small-timescale strategy, the frequent concurrency of services and the difference in service time lead to a fuzzy relationship between reward and action. To solve this fuzzy reward problem, a reward mapping-based reinforcement learning (RMRL) algorithm is proposed, which enables the agent to learn the relationship between reward and action more clearly. The large-timescale strategy adopts an improved Monte Carlo tree search (MCTS) algorithm to improve the learning speed. The simulation results show that the strategy is superior to popular reinforcement learning algorithms such as double DQN (DDQN) and dueling DQN (dueling-DQN) in learning speed, and the reward is also increased by 14%.
Funding: supported by the National Natural Science Foundation of China (No. 62171051).
[Objective] Real-time monitoring of cow ruminant behavior is of paramount importance for promptly obtaining relevant information about cow health and predicting cow diseases. Currently, various strategies have been proposed for monitoring cow ruminant behavior, including video surveillance, sound recognition, and sensor monitoring methods. However, the use of edge devices gives rise to the issue of inadequate real-time performance. To reduce the volume of data transmission and the cloud computing workload while achieving real-time monitoring of dairy cow rumination behavior, a real-time monitoring method was proposed for cow ruminant behavior based on edge computing. [Methods] Autonomously designed edge devices were utilized to collect and process six-axis acceleration signals from cows in real time. Based on these six-axis data, two distinct strategies, federated edge intelligence and split edge intelligence, were investigated for the real-time recognition of cow ruminant behavior. For the federated edge intelligence strategy, the CA-MobileNet v3 network was proposed by enhancing the MobileNet v3 network with a collaborative attention mechanism, and a federated edge intelligence model was designed utilizing the CA-MobileNet v3 network and the FedAvg federated aggregation algorithm. For the split edge intelligence strategy, a split edge intelligence model named MobileNet-LSTM was designed by integrating the MobileNet v3 network with a fusion collaborative attention mechanism and the Bi-LSTM network. [Results and Discussions] Through comparative experiments with MobileNet v3 and MobileNet-LSTM, the federated edge intelligence model based on CA-MobileNet v3 achieved an average Precision, Recall, F1-Score, Specificity, and Accuracy of 97.1%, 97.9%, 97.5%, 98.3%, and 98.2%, respectively, yielding the best recognition performance. [Conclusions] This work provides a real-time and effective method for monitoring cow ruminant behavior, and the proposed federated edge intelligence model can be applied in practical settings.
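For reference, a minimal sketch of the FedAvg aggregation step named above: the server averages client parameters weighted by local dataset size. The client count and tensor shapes are placeholders.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """FedAvg aggregation: a per-layer weighted average of client parameters,
    weighted by each client's number of local samples."""
    total = float(sum(client_sizes))
    num_layers = len(client_weights[0])
    return [
        sum(w[k] * (n / total) for w, n in zip(client_weights, client_sizes))
        for k in range(num_layers)
    ]

# Three hypothetical edge clients, each holding two parameter tensors.
clients = [[np.full((2, 2), v), np.full(2, v)] for v in (1.0, 2.0, 3.0)]
global_model = fedavg(clients, client_sizes=[100, 200, 700])
print(global_model[0])   # -> 2.6 everywhere: dominated by the largest client
```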
Edge intelligence is anticipated to underlay the pathway to connected intelligence for 6G networks, but the organic confluence of edge computing and artificial intelligence still needs to be treated carefully. To this end, this article discusses the concept of edge intelligence from the semantic cognitive perspective. Two instructive theoretical models for edge semantic cognitive intelligence (ESCI) are first established. Afterwards, the ESCI framework orchestrating deep learning with semantic communication is discussed. Two representative applications are presented to shed light on the prospect of ESCI in 6G networks. Some open problems are finally listed to elicit future research directions for ESCI.
Funding: supported in part by the National Science Foundation of China under Grant 62101253, the Natural Science Foundation of Jiangsu Province under Grant BK20210283, the Jiangsu Provincial Innovation and Entrepreneurship Doctor Program under Grant JSSCBS20210158, the Open Research Foundation of the National Mobile Communications Research Laboratory under Grant 2022D08, and the Research Foundation of Nanjing for Returned Chinese Scholars.
The burgeoning advances in machine learning and wireless technologies are forging a new paradigm for future networks, which are expected to possess higher degrees of intelligence via inference from vast datasets and to be able to respond to local events in a timely manner. Due to the sheer volume of data generated by end-user devices, as well as increasing concerns about sharing private information, a new branch of machine learning models, namely federated learning, has emerged from the intersection of artificial intelligence and edge computing. In contrast to conventional machine learning methods, federated learning brings the model directly to the device for training, where only the resultant parameters are sent to the edge servers. The local copies of the model on the devices bring great advantages in eliminating network latency and preserving data privacy. Nevertheless, to make federated learning possible, one needs to tackle new challenges that require a fundamental departure from standard methods designed for distributed optimization. In this paper, we aim to deliver a comprehensive introduction to federated learning. Specifically, we first survey the basis of federated learning, including its learning structure and the features that distinguish it from conventional machine learning models. We then enumerate several critical issues associated with the deployment of federated learning in a wireless network, and show why and how technologies should be jointly integrated to facilitate full implementation from different perspectives, ranging from algorithmic design and on-device training to communication resource management. Finally, we conclude by shedding light on some potential applications and future trends.
The growing demand for the Internet of Everything has degraded network routing efficiency. Traditional routing policies rely on manual configuration, which has limitations and adversely affects network performance. In this paper, we propose an Internet of Things (IoT) Intelligent Edge Network Routing (ENIR) architecture. ENIR uses deep reinforcement learning (DRL) to simulate human learning of empirical knowledge and an intelligent routing closed-loop control mechanism for real-time interaction with the network environment. According to network demand and environmental conditions, the method can dynamically adjust network resources and perform intelligent routing optimization. It uses blockchain technology to share network knowledge and globally optimize network routing. The intelligent routing method uses the deep deterministic policy gradient (DDPG) algorithm. Our simulation results show that ENIR provides significantly better link utilization and transmission delay performance than various routing methods (e.g., open shortest path first, routing based on Q-learning, and a DRL-based control framework for traffic engineering).
Funding: supported by the Leading-edge Technology Program of the Jiangsu Natural Science Foundation (No. BK20202001).
As a combination of edge computing and artificial intelligence, edge intelligence has become a promising technique that provides its users with a series of fast, precise, and customized services. In edge intelligence, when learning agents are deployed on the edge side, the data aggregation from the end side to the designated edge devices is an important research topic. Considering the varying importance of end devices, this paper studies the weighted data aggregation problem in a single-hop end-to-edge communication network. Firstly, to make sure all end devices with various weights are fairly treated in data aggregation, a distributed end-to-edge cooperative scheme is proposed. Then, to handle the massive contention on the wireless channel caused by end devices, a multi-armed bandit (MAB) algorithm is designed to help the end devices find their most appropriate update rates. Different from traditional data aggregation works, incorporating the MAB gives our algorithm higher efficiency in data aggregation. With a theoretical analysis, we show that the efficiency of our algorithm is asymptotically optimal. Comparative experiments with previous works are also conducted to show the strength of our algorithm.
Funding: supported by the National Natural Science Foundation of China (NSFC) (62102232, 62122042, 61971269) and the Natural Science Foundation of Shandong Province (ZR2021QF064).
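A small sketch of how an end device could pick its update rate with a standard MAB rule (UCB1); the candidate rates and the reward signal are hypothetical placeholders, and the paper's bandit formulation and weighting scheme may differ.

```python
import math, random

def ucb1_select(counts, values, t):
    """Pick the arm (candidate update rate) with the highest UCB1 index."""
    for a in range(len(counts)):          # play each arm once first
        if counts[a] == 0:
            return a
    return max(range(len(counts)),
               key=lambda a: values[a] + math.sqrt(2 * math.log(t) / counts[a]))

# Arms are hypothetical update rates an end device may choose from.
rates = [0.1, 0.2, 0.5, 1.0]
counts, values = [0] * len(rates), [0.0] * len(rates)
for t in range(1, 201):
    a = ucb1_select(counts, values, t)
    # Placeholder reward: fraction of this device's updates that reached the edge.
    reward = random.random() * rates[a] / (0.3 + rates[a])
    counts[a] += 1
    values[a] += (reward - values[a]) / counts[a]   # running mean
print(max(range(len(rates)), key=lambda a: values[a]))
```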
Recent breakthroughs in artificial intelligence (AI) give rise to a plethora of intelligent applications and services based on machine learning algorithms such as deep neural networks (DNNs). With the proliferation of the Internet of Things (IoT) and mobile edge computing, these applications are being pushed to the network edge, thus enabling a new paradigm termed edge intelligence. This provokes the demand for decentralized implementation of learning algorithms over edge networks to distill intelligence from distributed data, and also calls for new communication-efficient designs in air interfaces to improve privacy by avoiding raw data exchange. This paper provides a comprehensive overview of edge intelligence, particularly focusing on two paradigms named edge learning and edge inference, as well as the corresponding communication-efficient solutions for their implementation in wireless systems. Several insightful theoretical results and design guidelines are also provided.
The efficient integration of satellite and terrestrial networks has become an important component of 6G wireless architectures to provide highly reliable and secure connectivity over a wide geographical area. As satellite and cellular networks have been developed separately over the years, the integrated network should synergize the communication, storage, and computation capabilities of both sides towards an intelligent system, rather than merely considering coexistence. This has motivated us to develop double-edge intelligent integrated satellite and terrestrial networks (DILIGENT). Leveraging the rapid development of multi-access edge computing (MEC) technology and artificial intelligence (AI), the framework is endowed with systematic learning and adaptive network management of satellite and cellular networks. In this article, we provide a brief review of the state-of-the-art contributions from the perspectives of academic research and standardization. Then we present the overall design of the proposed DILIGENT architecture, and its advantages are discussed and summarized. Strategies for task offloading, content caching, and distribution are presented. Numerical results show that the proposed network architecture outperforms existing integrated networks.
Funding: supported in part by the National Science Foundation of China (NSFC) under Grant 61631005, Grant 61771065, and Grant 61901048, in part by the Zhijiang Laboratory Open Project Fund 2020LCOAB01, and in part by the Beijing Municipal Science and Technology Commission Research under Project Z181100003218015.
The healthcare industry is rapidly adapting to new computing environments and technologies. With academics increasingly committed to developing and enhancing healthcare solutions that combine the Internet of Things (IoT) and edge computing, there is a greater need than ever to adequately monitor the data being acquired, shared, processed, and stored. The growth of cloud, IoT, and edge computing models presents severe data privacy concerns, especially in the healthcare sector. However, rigorous research to develop appropriate data privacy solutions in the healthcare sector is still lacking. This paper discusses the current state of privacy-preservation solutions in IoT and edge healthcare applications. It identifies the common strategies often used to include privacy by the intelligent edges and technologies in healthcare systems. Furthermore, the study addresses the technical complexity, efficacy, and sustainability limits of these methods. The study also highlights the privacy issues and current research directions that have driven the IoT and edge healthcare solutions, with which more insightful future applications are encouraged.
In recent years, several efforts have been made to develop power transmission line abnormal target detection models based on edge devices. Typically, updates to these models rely on the participation of the cloud, which means that network resource shortages can lead to update failures, followed by unsatisfactory recognition and detection performance in practical use. To address this problem, this article proposes an edge visual incremental perception framework, based on deep semi-supervised learning, for monitoring power transmission lines. After generation of the initial model using a small amount of labeled data, models trained using this framework can update themselves based on unlabeled data. A teacher-student joint training strategy, a data augmentation strategy, and a model updating strategy are also designed and adopted to improve the performance of the models trained with this framework. The proposed framework is then examined with various transmission line datasets with 1%, 2%, 5%, and 10% labeled data. General performance enhancement is thus confirmed against traditional supervised learning strategies. With the 10% labeled-data training set, the recognition accuracy of the model is improved to exceed 80%, meeting the practical needs of power system operation and thus clearly validating the effectiveness of the framework.
Funding: supported by the National Key R&D Program of China (2020YFB0905900).
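A minimal sketch of the pseudo-labeling step at the heart of teacher-student semi-supervised training; the confidence threshold, the toy teacher, and the data shapes are illustrative assumptions rather than the framework's actual components.

```python
import numpy as np

def pseudo_label(teacher_predict, unlabeled, threshold=0.9):
    """Teacher-student self-training step: keep only unlabeled samples whose
    teacher confidence exceeds the threshold and use the argmax as the label."""
    probs = teacher_predict(unlabeled)                  # shape (N, num_classes)
    conf, labels = probs.max(axis=1), probs.argmax(axis=1)
    keep = conf >= threshold
    return unlabeled[keep], labels[keep]

# Placeholder teacher: random softmax scores over 3 classes.
def toy_teacher(x):
    logits = np.random.randn(len(x), 3)
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

x_unlabeled = np.random.randn(100, 8)
x_selected, y_pseudo = pseudo_label(toy_teacher, x_unlabeled, threshold=0.8)
```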
Federated learning (FL) is a promising decentralized machine learning approach that enables multiple distributed clients to train a model jointly while keeping their data private. However, in real-world scenarios, the supervised training data stored in local clients inevitably suffer from imperfect annotations, resulting in subjective, inconsistent, and biased labels. These noisy labels can harm the collaborative aggregation process of FL by inducing inconsistent decision boundaries. Unfortunately, few attempts have been made towards noise-tolerant federated learning, with most of them relying on the strategy of transmitting overhead messages to assist noisy label detection and correction, which increases the communication burden as well as privacy risks. In this paper, we propose a simple yet effective method for noise-tolerant FL based on the well-established co-training framework. Our method leverages the inherent discrepancy in the learning ability of the local and global models in FL, which can be regarded as two complementary views. By iteratively exchanging samples with their highly confident predictions, the two models “teach each other” to suppress the influence of noisy labels. The proposed scheme enjoys the benefit of being overhead cost-free and can serve as a robust and efficient baseline for noise-tolerant federated learning. Experimental results demonstrate that our method outperforms existing approaches, highlighting its superiority.
Funding: supported by the National Natural Science Foundation of China (Nos. 92270116 and 62071155).
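A hedged sketch of the sample-exchange idea described above: each model nominates samples it predicts confidently and whose prediction agrees with the (possibly noisy) label, and the peer model trains on the other side's nominations. The threshold, shapes, and agreement rule are assumptions, not the paper's exact selection criterion.

```python
import numpy as np

def confident_exchange(local_probs, global_probs, labels, threshold=0.95):
    """Co-training style filtering: each model nominates samples it predicts
    with high confidence that also agree with the observed label; the peer
    model then trains on the samples nominated by the other side."""
    def nominate(probs):
        conf = probs.max(axis=1)
        pred = probs.argmax(axis=1)
        return np.where((conf >= threshold) & (pred == labels))[0]

    for_global = nominate(local_probs)   # samples the local model trusts
    for_local = nominate(global_probs)   # samples the global model trusts
    return for_local, for_global

n, c = 50, 4
labels = np.random.randint(0, c, size=n)
local_p = np.random.dirichlet(np.ones(c), size=n)
global_p = np.random.dirichlet(np.ones(c), size=n)
idx_local, idx_global = confident_exchange(local_p, global_p, labels, threshold=0.5)
```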
Sixth generation (6G)-enabled edge intelligence opens up a new era of the Internet of Everything and makes it possible to interconnect people, devices, and the cloud anytime, anywhere. More and more next-generation wireless network smart service applications are changing our way of life and improving our quality of life. As the hottest new form of next-generation Internet application, the Metaverse is striving to connect billions of users and create a shared world where virtual and reality merge. However, limited by resources, computing power, and sensory devices, the Metaverse is still far from realizing its full vision of immersion, materialization, and interoperability. To this end, this survey aims to realize this vision through the organic integration of 6G-enabled edge artificial intelligence (AI) and the Metaverse. Specifically, we first introduce three new types of edge-Metaverse architectures that use 6G-enabled edge AI to solve resource and computing constraints in the Metaverse. Then we summarize the technical challenges that these architectures face in the Metaverse and the existing solutions. Furthermore, we explore how the edge-Metaverse architecture helps the Metaverse to interact with and share digital data. Finally, we discuss future research directions to realize the true vision of the Metaverse with 6G-enabled edge AI.
Funding: supported by the Provincial Natural Science Foundation of China (LH2020F044); the 2019 “Chunhui” Plan Cooperative Scientific Research Project of the Ministry of Education of China (HLJ2019015); the Fundamental Research Funds for Heilongjiang University, China (2020-KYYWF-1014); NSFC (62102099); and the National Key R&D Program of China (2018YFE0205503).
In this paper the idea of Intelligent Scissors is adopted for contour tracking in dynamic image sequences. Tracking the contour of a human can therefore be converted to tracking seed points in images by making use of the properties of the optimal path (Intelligent Edge). The main advantage of the approach is that it can correctly handle occlusions, which occur frequently when a human is moving. A Non-Uniform Rational B-Spline (NURBS) is used to represent parametrically the contour that one wants to track. In order to track the contour robustly in images, similarity and compatibility measurements of the edge are computed as the weighting functions of the optimal estimator. To dramatically reduce the computational load, an efficient method for extracting the region of interest is proposed. Experiments show that the approach works robustly for sequences with frequent occlusions.
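Intelligent Scissors reduces contour extraction to a least-cost path between seed points over a pixel graph. A minimal sketch of that core step is given below, using Dijkstra's algorithm on a toy 4-connected cost map; the cost definition (gradient-based in the original Intelligent Scissors work) and the grid are simplified assumptions.

```python
import heapq

def intelligent_scissors_path(cost, seed, target):
    """Dijkstra over a 4-connected pixel grid: the core of Intelligent Scissors,
    where cost[r][c] is the local edge cost (low along strong image edges)."""
    rows, cols = len(cost), len(cost[0])
    dist, prev = {seed: 0.0}, {}
    heap = [(0.0, seed)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == target:
            break
        if d > dist.get((r, c), float("inf")):
            continue
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], target
    while node != seed:         # walk back from the target to the seed point
        path.append(node)
        node = prev[node]
    path.append(seed)
    return path[::-1]

grid = [[1, 9, 9], [1, 1, 9], [9, 1, 1]]   # toy edge-cost map
print(intelligent_scissors_path(grid, (0, 0), (2, 2)))
```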