Funding: This research was supported by the National Key Research and Development Program of China (No. 2020YFF0305301) and the National Natural Science Foundation of China (61762029, U1811264).
Abstract: With the continuous development of network technology, the number of streaming media videos is growing rapidly. More and more users watch videos over the Internet, which leads to an increasingly heavy server load and rising transmission costs across ISP domains. A feasible scheme to reduce cross-ISP transmission cost and alleviate the server load is to cache popular videos at a large number of terminal users. Therefore, in this paper, to utilize the idle resources of terminal peers, peers with good performance are selected from the fixed peers as super peers and aggregated into a super peer set (SPS). In addition, based on the supply and demand relation of streaming videos among ISP domains, a mathematical model is formulated to optimize the service utility of the ISPs. A collaborative cache strategy is then proposed based on this utility optimization. Simulation results show that the proposed strategy can effectively improve user playback fluency and hit rate while ensuring optimal service utility.
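The super-peer selection step described above can be sketched as a simple ranking of fixed peers by a composite performance score. The scoring weights and peer attributes below are illustrative assumptions, not values from the paper:

```python
def select_super_peers(peers, k):
    """Rank fixed peers by a composite performance score and take the
    top k as the super peer set (SPS)."""
    def score(p):
        # Weighted mix of upload bandwidth, uptime and free storage;
        # the weights are illustrative, not taken from the paper.
        return 0.5 * p["upload_mbps"] + 0.3 * p["uptime_h"] + 0.2 * p["free_gb"]
    return sorted(peers, key=score, reverse=True)[:k]

peers = [
    {"id": "p1", "upload_mbps": 20, "uptime_h": 100, "free_gb": 50},
    {"id": "p2", "upload_mbps": 5,  "uptime_h": 10,  "free_gb": 5},
    {"id": "p3", "upload_mbps": 15, "uptime_h": 200, "free_gb": 30},
]
print([p["id"] for p in select_super_peers(peers, 2)])  # → ['p3', 'p1']
```

In practice, the score would be recomputed periodically so that peers joining or degrading are moved in or out of the SPS.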
Funding: Supported by the National Natural Science Foundation of China (NSFC) under Grant Number T2350710232.
Abstract: Real-time health data monitoring is pivotal for improving the safety, intelligence, and efficiency of road services within the Internet of Health Things (IoHT) framework. Yet delays in data retrieval can markedly hinder the efficacy of big-data awareness detection systems. To combat this, we advocate a collaborative caching approach involving edge devices and cloud networks, devised to shorten the data retrieval path and thereby reduce network strain. Crafting an effective cache processing scheme poses its own challenges, given the transient nature of monitoring data, the need for swift data transmission, and the intertwined resource allocation tactics. This paper presents a mobile healthcare solution that harnesses this collaborative caching approach to enable fine-grained health monitoring via edge devices, while capitalizing on cloud computing for complex health data analytics, especially in pinpointing health anomalies. Given dynamic location shifts and possible connection disruptions, we architect a hierarchical detection system, particularly for crises. The system caches data efficiently and incorporates a detection utility that accounts for data freshness and potential response lag. Furthermore, we introduce the Cache-Assisted Real-Time Detection (CARD) model, crafted to optimize this utility. To address the NP-hard complexity of the CARD model, we employ a greedy algorithm. Simulations reveal that our collaborative caching technique markedly elevates the Cache Hit Ratio (CHR) and data freshness, outperforming contemporary benchmark algorithms. In sum, this paper tackles real-time health data monitoring in the IoHT landscape with a joint edge-cloud caching strategy paired with a hierarchical detection system; the method yields enhanced cache efficiency and data freshness, and the numerical results underscore the feasibility and relevance of the model for future real-time health data monitoring systems.
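A greedy heuristic for an NP-hard cache placement problem of this kind can be sketched as a utility-per-size ranking under a cache capacity budget. The item names, sizes, and flat utility values below are illustrative assumptions; the paper's actual CARD utility combines data freshness and response lag:

```python
def greedy_cache(items, capacity):
    """Greedy placement: take items in descending utility-per-size
    order until the cache capacity budget is exhausted."""
    chosen, used = [], 0
    for item in sorted(items, key=lambda i: i["utility"] / i["size"], reverse=True):
        if used + item["size"] <= capacity:
            chosen.append(item["name"])
            used += item["size"]
    return chosen

items = [
    {"name": "ecg_window", "utility": 10, "size": 2},  # ratio 5.0
    {"name": "gps_trace",  "utility": 9,  "size": 3},  # ratio 3.0
    {"name": "heart_rate", "utility": 4,  "size": 1},  # ratio 4.0
]
print(greedy_cache(items, capacity=4))  # → ['ecg_window', 'heart_rate']
```

A density-ordered greedy like this gives a constant-factor approximation for knapsack-style objectives while running in O(n log n), which is why it is a common stand-in for exact NP-hard placement.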
Funding: Supported by the National Natural Science Foundation of China under Grants 61901078, 61871062, and U20A20157; in part by the China University Industry-University-Research Collaborative Innovation Fund (Future Network Innovation Research and Application Project) under Grant 2021FNA04008; in part by the China Postdoctoral Science Foundation under Grant 2022MD713692; in part by the Chongqing Postdoctoral Science Special Foundation under Grant 2021XM2018; in part by the Natural Science Foundation of Chongqing under Grant cstc2020jcyj-zdxmX0024; in part by the University Innovation Research Group of Chongqing under Grant CXQT20017; in part by the Science and Technology Research Program of Chongqing Municipal Education Commission under Grant KJQN202000626; and in part by the Youth Innovation Group Support Program of the ICE Discipline of CQUPT under Grant SCIE-QN-2022-04.
Abstract: In this paper, we explore a distributed collaborative caching and computing model to support the distribution of adaptive-bit-rate video streaming. The aim is to reduce the average initial buffer delay and improve the quality of user experience. Considering the difference between global and local video popularity and the time-varying characteristics of popularity, a two-stage caching scheme is proposed to push popular videos closer to users and minimize the average initial buffer delay. Based on both long-term and short-term content popularity, the proposed caching solution is decoupled into a proactive cache stage and a cache update stage. In the proactive cache stage, we develop a proactive cache placement algorithm that can be executed during off-peak periods. In the cache update stage, we propose a reactive cache update algorithm that revises the existing cache policy to minimize the buffer delay. Simulation results verify that the proposed caching algorithms reduce the initial buffer delay efficiently.
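The two stages can be sketched as an off-peak proactive placement driven by long-term popularity, plus a reactive update that evicts the least short-term-popular cached video on a miss. The popularity values are illustrative; the paper's algorithms additionally optimize the initial buffer delay rather than popularity alone:

```python
def proactive_place(long_term_pop, capacity):
    """Off-peak proactive stage: cache the videos with the highest
    long-term popularity."""
    ranked = sorted(long_term_pop, key=long_term_pop.get, reverse=True)
    return set(ranked[:capacity])

def reactive_update(cache, video, short_term_pop):
    """Cache update stage: on a miss, evict the cached video with the
    lowest short-term popularity if the requested one is more popular."""
    if video in cache:
        return cache  # hit: no change
    victim = min(cache, key=lambda v: short_term_pop.get(v, 0))
    if short_term_pop.get(video, 0) > short_term_pop.get(victim, 0):
        cache = (cache - {victim}) | {video}
    return cache

cache = proactive_place({"v1": 9, "v2": 5, "v3": 1}, capacity=2)
print(cache)  # → {'v1', 'v2'}
print(reactive_update(cache, "v3", {"v1": 8, "v2": 2, "v3": 7}))  # → {'v1', 'v3'}
```

Decoupling the two stages this way keeps the expensive global placement off the critical path, while the per-request update only compares the incoming video against the current minimum.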
Funding: This work is supported by the National Natural Science Foundation of China (62072465) and the Key-Area Research and Development Program of Guangdong Province (2019B010107001).
Abstract: Due to the explosion of network data traffic and IoT devices, edge servers are overloaded and slow to respond to the massive volume of online requests. Many studies have shown that edge caching can solve this problem effectively. This paper proposes a distributed edge collaborative caching mechanism for Internet online request service scenarios. It addresses the large average access delay caused by the unbalanced load of edge servers, meets users' differentiated service demands, and improves user experience. In particular, the edge cache node selection algorithm is optimized, and a novel edge cache replacement strategy that considers differentiated user requests is proposed. This mechanism shortens the response time for large numbers of user requests. Experimental results show that, compared with a state-of-the-art online edge caching algorithm, the proposed edge collaborative caching strategy reduces the average response delay by 9%, increases user utility by 4.5 times in differentiated service scenarios, and significantly reduces the time complexity of the edge caching algorithm.
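A replacement rule that accounts for differentiated user requests can be sketched by weighting each cached item's request frequency with the priority of its service class and evicting the lowest-scoring item. The class names and weights below are illustrative assumptions, not the paper's parameters:

```python
def evict_victim(cache_meta, class_weight):
    """Score each cached item as request frequency times the priority
    weight of its service class; return the item to evict."""
    def score(key):
        meta = cache_meta[key]
        return meta["freq"] * class_weight[meta["class"]]
    return min(cache_meta, key=score)

cache_meta = {
    "news_page":  {"freq": 10, "class": "best_effort"},
    "stream_seg": {"freq": 3,  "class": "delay_sensitive"},
}
class_weight = {"best_effort": 1, "delay_sensitive": 5}
print(evict_victim(cache_meta, class_weight))  # → news_page
```

Note that the frequently requested best-effort item is evicted before the rarer delay-sensitive one, which is exactly the behavior a pure LFU policy would miss.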
Funding: Supported by the Science and Technology Project of State Grid Corporation of China: Research and Application of Key Technologies in Virtual Operation of Information and Communication Resources.
Abstract: With the development of the Internet of Vehicles, the traditional centralized content caching mode transmits content through the core network, which causes large delays and cannot meet the demands of delay-sensitive services. To solve these problems, we propose an edge collaborative caching scheme on the basis of the vehicle caching network. Road side units (RSUs) and mobile edge computing (MEC) are used to collect vehicle information and to predict and cache popular content, thereby providing low-latency content delivery services. However, the storage capacity of a single RSU severely limits edge caching performance, and a single RSU cannot handle intensive content requests on its own. Through content sharing, collaborative caching can relieve the storage burden on caching servers. Therefore, we integrate RSUs and collaborative caching to build an MEC-assisted vehicle edge collaborative caching (MVECC) scheme that realizes collaborative caching among cloud, edge, and vehicles. MVECC uses deep reinforcement learning to predict what should be cached on RSUs, which enables them to cache more popular content. In addition, MVECC introduces a mobility-aware cache replacement scheme at the edge network to reduce redundant caching and improve cache efficiency, allowing RSUs to dynamically replace cached content in response to vehicle mobility. Simulation results show that the proposed MVECC scheme improves cache performance in terms of energy cost and content hit rate.
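The mobility-aware replacement idea can be sketched by scaling a cached item's popularity with the expected residence time of the vehicle requesting it, since slower vehicles remain in RSU coverage longer. This simple value function is an illustrative stand-in for the paper's deep-reinforcement-learning-based prediction, and the coverage range and speeds are assumed values:

```python
def mobility_aware_victim(contents, rsu_range_m, speeds_mps):
    """Value each cached item as popularity times the expected time its
    requesting vehicle stays in RSU coverage; evict the lowest value."""
    def value(c):
        residence_s = rsu_range_m / speeds_mps[c["requester"]]
        return c["popularity"] * residence_s
    return min(contents, key=value)["name"]

contents = [
    {"name": "map_tile", "popularity": 10, "requester": "v_slow"},
    {"name": "traffic",  "popularity": 10, "requester": "v_fast"},
]
speeds_mps = {"v_slow": 10, "v_fast": 25}
print(mobility_aware_victim(contents, 500, speeds_mps))  # → traffic
```

With equal popularity, the item serving the fast-moving vehicle is replaced first, since that vehicle will soon leave the RSU's coverage anyway.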