Fog computing is considered a solution to accommodate the booming requirements from a large variety of resource-limited Internet of Things (IoT) devices. To ensure the security of private data, in this paper, we introduce a blockchain-enabled three-layer device-fog-cloud heterogeneous network. A reputation model is proposed to update the credibility of the fog nodes (FNs), which is used to select blockchain nodes (BNs) from FNs to participate in the consensus process. Using the Rivest-Shamir-Adleman (RSA) encryption algorithm applied to the blockchain system, FNs can verify the identity of a node through its public key to avoid malicious attacks. Additionally, to reduce the computational complexity of the consensus algorithms and the network overhead, we propose a dynamic offloading and resource allocation (DORA) algorithm and a reputation-based democratic Byzantine fault tolerant (R-DBFT) algorithm to optimize the offloading decisions and decrease the number of BNs in the consensus algorithm while ensuring network security. Simulation results demonstrate that the proposed algorithms efficiently reduce the network overhead and obtain a considerable performance improvement compared to related algorithms in the previous literature.
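As a rough sketch of the public-key identity check this abstract describes, textbook RSA signing and verification can be written as follows. The primes, message, and node name are toy values for illustration only, not the paper's parameters; a real deployment would use a vetted library with proper key sizes and padding.

```python
import hashlib

# Textbook RSA with toy primes (no padding, tiny key) -- illustration only.
p, q = 61, 53
n = p * q                     # public modulus
phi = (p - 1) * (q - 1)
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent (Python 3.8+ modular inverse)

def sign(message: bytes) -> int:
    """A node signs a digest of its identity message with its private key."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(digest, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Any FN checks the claim using only the signer's public key (n, e)."""
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == digest

msg = b"FN-42 requests to join consensus"
sig = sign(msg)
print(verify(msg, sig))               # True: identity accepted
print(verify(msg, (sig + 1) % n))     # False: altered signature rejected
```

Because x -> x^e mod n is a permutation of the residues, any signature other than the genuine one deterministically fails verification, which is the property the consensus layer relies on.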
In a network environment composed of different types of computing centers that can be divided into different layers (cloud, edge layer, and others), the interconnection between them offers the possibility of peer-to-peer task offloading. For many resource-constrained devices, the computation of many types of tasks is not feasible because they do not have enough available memory and processing capacity. In this scenario, it is worth considering transferring these tasks to resource-rich platforms, such as Edge Data Centers or remote cloud servers. For different reasons, it is more appropriate to offload various tasks to specific destinations depending on the properties and state of the environment and the nature of the tasks. At the same time, establishing an optimal offloading policy, which ensures that all tasks are executed within the required latency and avoids excessive workload on specific computing centers, is not easy. This study presents two alternatives to solve the offloading decision problem by introducing two well-known algorithms, Graph Neural Networks (GNN) and Deep Q-Network (DQN). It applies the alternatives on a well-known Edge Computing simulator called PureEdgeSim and compares them with the two default methods, Trade-Off and Round Robin. Experiments showed that the variants offer a slight improvement in task success rate and workload distribution. In terms of energy efficiency, they provided similar results. Finally, the success rates of different computing centers are tested, and the lack of capacity of remote cloud servers to respond to applications in real time is demonstrated. These novel ways of finding an offloading strategy in a local networking environment are unique as they emulate the state
and structure of the environment innovatively, considering the quality of its connections and constant updates. The offloading score defined in this research is a crucial feature for determining the quality of an offloading path in the GNN training process and has not previously been proposed. Simultaneously, the suitability of Reinforcement Learning (RL) techniques is demonstrated due to the dynamism of the network environment, considering all the key factors that affect the decision to offload a given task, including the actual state of all devices.
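The RL idea behind the DQN variant can be sketched with a much simpler tabular stand-in: learn, per discretized network-load state, which destination maximizes reward. The states, actions, and reward values below are invented for illustration; a full DQN would replace the table with a neural network and bootstrap from the next state's value.

```python
import random

random.seed(0)

ACTIONS = ["local", "edge", "cloud"]   # candidate offloading destinations
STATES = (0, 1, 2)                     # discretized network load: low/med/high

# Illustrative, made-up rewards: which destination pays off under which load.
REWARD = {
    (0, "local"): 0.4, (0, "edge"): 1.0, (0, "cloud"): 0.2,
    (1, "local"): 0.6, (1, "edge"): 0.5, (1, "cloud"): 0.3,
    (2, "local"): 0.8, (2, "edge"): 0.1, (2, "cloud"): 0.4,
}

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, epsilon = 0.1, 0.1

for _ in range(5000):
    s = random.choice(STATES)
    # epsilon-greedy selection, as in DQN's exploration phase
    if random.random() < epsilon:
        a = random.choice(ACTIONS)
    else:
        a = max(ACTIONS, key=lambda x: Q[(s, x)])
    # one-step (contextual-bandit) update toward the observed reward
    Q[(s, a)] += alpha * (REWARD[(s, a)] - Q[(s, a)])

policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
print(policy)  # {0: 'edge', 1: 'local', 2: 'local'}
```

After training, the greedy policy simply reads off the best destination per load state, which is the decision the simulator would query at each task arrival.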
Under the influence of air humidity, dust, aerosols, etc., in real scenes, haze presents an uneven state, which decreases image quality and contrast. In this case, it is difficult to detect targets in the image with a universal detection network. Thus, a dual subnet based on multi-task collaborative training (DSMCT) is proposed in this paper. Firstly, in the training phase, the Gated Context Aggregation Network (GCANet) is used as the supervisory network of YOLOX to promote the extraction of clean information in foggy scenes. In the test phase, only the YOLOX branch needs to be activated to ensure the detection speed of the model. Secondly, a deformable convolution module is used to improve GCANet to enhance the model's ability to capture details of non-homogeneous fog. Finally, the Coordinate Attention mechanism is introduced into the Vision Transformer and the backbone network of YOLOX is redesigned. In this way, the network's ability to extract deep-level features can be enhanced. Experimental results on the artificial fog dataset FOG_VOC and the real fog dataset RTTS show that the mAP of DSMCT reached 86.56% and 62.39%, respectively, which was 2.27% and 4.41% higher than the current most advanced detection model. The DSMCT network has high practicality and effectiveness for target detection in real foggy scenes.
For the first time, this article introduces a LiDAR Point Clouds Dataset of Ships composed of both collected and simulated data to address the scarcity of LiDAR data in maritime applications. The collected data are acquired using specialized maritime LiDAR sensors in both inland waterways and wide-open ocean environments. The simulated data are generated by placing a ship in the LiDAR coordinate system and scanning it with a redeveloped Blensor that emulates the operation of a LiDAR sensor equipped with various laser beams. Furthermore, we also render point clouds for foggy and rainy weather conditions. To describe a realistic shipping environment, a dynamic tail wave is modeled by iterating the wave elevation of each point in a time series. Finally, networks serving small objects are migrated to ship applications by feeding them our dataset. The positive effect of simulated data is described in object detection experiments, and the negative impact of tail waves as noise is verified in single-object tracking experiments. The dataset is available at https://github.com/zqy411470859/ship_dataset.
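The abstract does not specify its wave model, but "iterating the wave elevation of each point in a time series" can be illustrated with a generic sum-of-sinusoids plane-wave surface under the deep-water dispersion relation. All amplitudes, wavenumbers, and phases below are invented stand-ins, not the dataset's actual parameters.

```python
import math

# Each component: (amplitude, kx, ky, phase) -- one sinusoidal plane wave.
# Values are illustrative assumptions, not the paper's tail-wave parameters.
COMPONENTS = ((0.15, 0.9, 0.1, 0.0), (0.05, 2.0, 1.2, 1.0))
G = 9.81  # gravitational acceleration, m/s^2

def wave_elevation(x, y, t, components=COMPONENTS):
    """Water-surface height at point (x, y) and time t as a sum of sinusoids,
    using the deep-water dispersion relation omega = sqrt(g * |k|)."""
    z = 0.0
    for amp, kx, ky, phase in components:
        omega = math.sqrt(G * math.hypot(kx, ky))
        z += amp * math.sin(kx * x + ky * y - omega * t + phase)
    return z

# Iterate the elevation of one point over a time series, as the dataset does
# to animate the dynamic tail wave frame by frame.
series = [wave_elevation(4.0, 1.5, 0.1 * i) for i in range(5)]
print([round(z, 4) for z in series])
```

Applying this per point and per frame yields the moving water surface onto which the simulated LiDAR returns are placed.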
More devices in the Artificial Intelligence of Things (AIoT) result in an increased number of tasks that require low latency and real-time responsiveness, leading to an increased demand for computational resources. Cloud computing's low-latency performance issues in AIoT scenarios have led researchers to explore fog computing as a complementary extension. However, the effective allocation of resources for task execution within fog environments, characterized by limitations and heterogeneity in computational resources, remains a formidable challenge. To tackle this challenge, in this study, we integrate fog computing and cloud computing. We begin by establishing a fog-cloud environment framework, followed by the formulation of a mathematical model for task scheduling. Lastly, we introduce an enhanced hybrid Equilibrium Optimizer (EHEO) tailored for AIoT task scheduling. The overarching objective is to decrease both the makespan and the energy consumption of the fog-cloud system while accounting for task deadlines. The proposed EHEO method undergoes a thorough evaluation against multiple benchmark algorithms, encompassing metrics like makespan, total energy consumption, success rate, and average waiting time. Comprehensive experimental results unequivocally demonstrate the superior performance of EHEO across all assessed metrics. Notably, in the most favorable conditions, EHEO significantly diminishes both the makespan and the energy consumption by approximately 50% and 35.5%, respectively, compared to the second-best performing approach, which affirms its efficacy in advancing the efficiency of AIoT task scheduling within fog-cloud networks.
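Independently of the optimizer used, the two objectives named here (makespan and energy) can be evaluated for any candidate assignment. The sketch below shows that evaluation only; node speeds, power draws, and task sizes are hypothetical, and EHEO itself (the search over assignments) is not reproduced.

```python
# Hypothetical fog/cloud nodes: (speed in MIPS, active power in W); task sizes
# in millions of instructions. All numbers are invented for illustration.
NODES = {"fog-1": (500.0, 10.0), "fog-2": (800.0, 15.0), "cloud": (4000.0, 80.0)}
TASKS = {"t1": 1000.0, "t2": 2500.0, "t3": 400.0}

def evaluate(assignment):
    """Makespan and total energy of a task-to-node assignment, assuming each
    node runs its tasks sequentially and energy = active time x power."""
    busy = {node: 0.0 for node in NODES}
    energy = 0.0
    for task, node in assignment.items():
        speed, power = NODES[node]
        runtime = TASKS[task] / speed
        busy[node] += runtime
        energy += runtime * power
    return max(busy.values()), energy

makespan, energy = evaluate({"t1": "fog-1", "t2": "cloud", "t3": "fog-2"})
print(makespan, energy)  # 2.0 77.5
```

A metaheuristic such as EHEO would call a fitness function of this shape for every candidate schedule and keep the assignments that jointly lower both values.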
Fog computing has recently developed as a new paradigm that aims to serve time-sensitive applications better than cloud computing by placing and processing tasks in close proximity to the data sources. However, the majority of the fog nodes in this environment are geographically scattered, with resources that are limited in capability compared to cloud nodes, thus making the application placement problem more complex than in cloud computing. Cost-efficient application placement in fog-cloud computing environments combines the benefits of both fog and cloud computing to optimize the placement of applications and services while minimizing costs. This is particularly relevant in scenarios where latency, resource constraints, and cost considerations are crucial factors for the deployment of applications. In this study, we propose a hybrid approach that combines a genetic algorithm (GA) with the Flamingo Search Algorithm (FSA) to place application modules while minimizing cost. We consider four cost types for application deployment: computation, communication, energy consumption, and violations. The proposed hybrid approach, called GA-FSA, is designed to place the application modules considering the deadline of the application and deploy them appropriately to fog or cloud nodes to curtail the overall cost of the system. An extensive simulation is conducted to assess the performance of the proposed approach compared to other state-of-the-art approaches. The results demonstrate that the GA-FSA approach is superior to the other approaches with respect to task guarantee ratio (TGR) and total cost.
The Advanced Metering Infrastructure (AMI), as a crucial subsystem in the smart grid, is responsible for measuring user electricity consumption and plays a vital role in communication between providers and consumers. However, with the advancement of information and communication technology, new security and privacy challenges have emerged for AMI. To address these challenges and enhance the security and privacy of user data in the smart grid, a Hierarchical Privacy Protection Model in Advanced Metering Infrastructure based on Cloud and Fog Assistance (HPPM-AMICFA) is proposed in this paper. The proposed model integrates cloud and fog computing with hierarchical threshold encryption, offering a flexible and efficient privacy protection solution that significantly enhances data security in the smart grid. The methodology involves setting user protection levels by processing missing data and utilizing fuzzy comprehensive analysis to evaluate user importance, thereby assigning appropriate protection levels. Furthermore, a hierarchical threshold encryption algorithm is developed to provide differentiated protection strategies for fog nodes based on user IDs, ensuring secure aggregation and encryption of user data. Experimental results demonstrate that HPPM-AMICFA effectively resists various attack strategies while minimizing time costs, thereby safeguarding user data in the smart grid.
In recent decades, fog computing has played a vital role in executing parallel computational tasks, specifically scientific workflow tasks. In cloud data centers, workflow applications take more time to run. Therefore, it is essential to develop effective models for Virtual Machine (VM) allocation and task scheduling in fog computing environments. Effective task scheduling, VM migration, and allocation together optimize the use of computational resources across different fog nodes. This process ensures that tasks are executed with minimal energy consumption, which reduces the chance of resource bottlenecks. In this manuscript, the proposed framework comprises two phases: (i) effective task scheduling using a fractional selectivity approach, and (ii) VM allocation by an algorithm named Fitness Sharing Chaotic Particle Swarm Optimization (FSCPSO). The proposed FSCPSO algorithm integrates the concepts of chaos theory and fitness sharing to effectively balance global exploration and local exploitation. This balance enables the exploration of a wide range of solutions, leading to minimal total cost and makespan in comparison to other traditional optimization algorithms. The FSCPSO algorithm's performance is analyzed using six evaluation measures, namely Load Balancing Level (LBL), Average Resource Utilization (ARU), total cost, makespan, energy consumption, and response time. In relation to the conventional optimization algorithms, the FSCPSO algorithm achieves a higher LBL of 39.12%, an ARU of 58.15%, a minimal total cost of 1175, and a makespan of 85.87 ms, particularly when evaluated for 50 tasks.
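The chaotic ingredient of such a PSO variant can be sketched by driving the random factors with a logistic map instead of uniform draws. This is only the chaos part: the fitness-sharing mechanism and the VM-allocation encoding of FSCPSO are omitted, and the one-dimensional objective below is a toy stand-in for a scheduling cost.

```python
import random

random.seed(1)

def logistic(x):
    """Logistic map at r = 4 (fully chaotic); replaces uniform random draws."""
    return 4.0 * x * (1.0 - x)

def cost(x):
    # Toy objective standing in for makespan/total cost of a schedule.
    return (x - 3.0) ** 2

n_particles, iters = 10, 200
pos = [random.uniform(-10.0, 10.0) for _ in range(n_particles)]
vel = [0.0] * n_particles
pbest = pos[:]
gbest = min(pos, key=cost)
chaos = 0.7      # chaotic seed; avoid the map's fixed points (0 and 0.75)

w, c1, c2 = 0.6, 1.5, 1.5
for _ in range(iters):
    for i in range(n_particles):
        chaos = logistic(chaos); r1 = chaos
        chaos = logistic(chaos); r2 = chaos
        vel[i] = (w * vel[i]
                  + c1 * r1 * (pbest[i] - pos[i])
                  + c2 * r2 * (gbest - pos[i]))
        pos[i] += vel[i]
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i]
            if cost(pos[i]) < cost(gbest):
                gbest = pos[i]

print(round(gbest, 3))  # close to the optimum at x = 3
```

The ergodicity of the chaotic sequence is what is claimed to help such variants escape local optima relative to plain uniform randomness.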
To address the large metrology delays and the inability to predict production anomalies in advance in the "Chip/FPC on Glass" (C/FOG) manufacturing process for liquid crystal display (LCD) panels, this paper proposes a neural-network-based virtual metrology method for the C/FOG process. The method uses sensors on the production tools to collect process-state data during manufacturing and builds a virtual metrology model based on multi-scale one-dimensional convolution with channel attention (MS1DC-CA). Convolution kernels at multiple scales extract state-data features over different scale ranges. For preprocessing raw data containing missing values, a particle-swarm-optimization-improved K-nearest-neighbor imputation method (PSO-KNN Imputation) is proposed to fill in missing values, preserving features while reducing the interference introduced by imputed values. Finally, comparative experiments on data collected from actual production show that, with actual defect rates mainly concentrated between 0.1% and 0.5%, the virtual metrology model achieves a fitting mean squared error of 0.3977‱, lower than other existing fitting models, and it also outperforms them on mean absolute error, symmetric mean absolute percentage error, and goodness of fit, demonstrating good predictive performance.
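The KNN-imputation half of PSO-KNN Imputation can be sketched in a few lines: each missing entry is filled with the mean of that feature over the k nearest rows, with distance computed on the features both rows share. The PSO step (tuning k and/or feature weights) is omitted here, and the tiny dataset is invented for illustration.

```python
import math

def knn_impute(rows, k=2):
    """Fill missing entries (None) using the k nearest rows that have the
    missing feature, with distance measured on the shared complete features.
    The PSO tuning of PSO-KNN (choosing k / weights) is deliberately omitted.
    """
    filled = [row[:] for row in rows]
    for i, row in enumerate(rows):
        for j, value in enumerate(row):
            if value is not None:
                continue
            donors = []   # (distance, donor's value for feature j)
            for other in rows:
                if other is row or other[j] is None:
                    continue
                shared = [(a, b) for a, b in zip(row, other)
                          if a is not None and b is not None]
                if shared:
                    dist = math.sqrt(sum((a - b) ** 2 for a, b in shared))
                    donors.append((dist, other[j]))
            donors.sort(key=lambda d: d[0])
            if donors:
                nearest = donors[:k]
                filled[i][j] = sum(v for _, v in nearest) / len(nearest)
    return filled

data = [[1.0, 2.0], [1.1, 2.2], [5.0, 9.0], [1.05, None]]
print(knn_impute(data))  # last row's gap becomes (2.0 + 2.2) / 2 = 2.1
```

A PSO wrapper would score candidate (k, weight) settings by the resulting model error and keep the best, which is the "improved" part the abstract refers to.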
Double-layer NiCr-Cr_(3)C_(2)/Ni-Zn-Al_(2)O_(3) coatings with sufficient corrosion and wear resistance were prepared on low-carbon steel substrates. The intermediate Ni-Zn-Al_(2)O_(3) layers were fabricated using the low-pressure cold spray (LPCS) method to improve the salt fog corrosion resistance of the supersonic plasma spray (SPS) NiCr-Cr_(3)C_(2) coatings. The friction and wear performance of the double-layer and single-layer NiCr-Cr_(3)C_(2) coatings was evaluated by line-contact reciprocating sliding, respectively. Combined with coating surface analysis techniques, the effect of salt fog corrosion on the tribological properties of the double-layer coatings was studied. The results showed that the double-layer coatings exhibited better wear resistance than the single-layer coatings, owing to the better corrosion resistance of the intermediate layer; the wear mass losses of the double-layer coatings were reduced by 70% compared with those of the single-layer coatings, and the wear mechanism of the coatings after salt fog corrosion is mainly corrosion wear.
BACKGROUND: The precise mechanism by which severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) impacts the central nervous system remains unclear, with manifestations spanning from mild symptoms (e.g., olfactory and gustatory deficits, hallucinations, and headache) to severe complications (e.g., stroke, seizures, encephalitis, and demyelinating lesions). The occurrence of a single-pass subdural effusion, as described below, is extremely rare. CASE SUMMARY: A 56-year-old male patient presented with left-sided limb weakness and slurred speech as the predominant clinical symptoms. Through comprehensive imaging and diagnostic assessments, he was diagnosed with cerebral infarction complicated by hemorrhagic transformation affecting the right frontal, temporal, and parietal regions. In addition, an intracranial infection with SARS-CoV-2 was identified during the rehabilitation process; subsequently, an idiopathic subdural effusion developed. Remarkably, the subdural effusion was absorbed within 6 d, with no recurrence observed during the 3-month follow-up. CONCLUSION: Subdural effusion is a potentially rare intracranial complication associated with SARS-CoV-2 infection.
In fog, visibility is reduced. This reduction in visibility is measured by the meteorological optical range (MOR), which is important for studying human perception and various sensors in foggy conditions. The Cerema PAVIN Fog & Rain platform is capable of producing calibrated fog in order to better analyze it and understand its consequences. The problem is that the droplets produced by the platform are not large enough to resemble real fog. This can have a major impact on measurements, since the interaction between electromagnetic waves and fog depends on the wavelength and the diameter of the droplets. To remedy this, Cerema is building a new platform with new equipment capable of generating fog. This study analyses different nozzles and associated usage parameters, such as the type of water used and the pressure applied. The aim is to select the best nozzle, with the associated parameters, for producing large-diameter droplets and therefore more realistic fog.
With the rapid evolution of Internet technology, fog computing has taken a major role in managing large amounts of data. The major concerns in this domain are security and privacy. Therefore, attaining a reliable level of confidentiality in the fog computing environment is a pivotal task. Among the different types of data stored in the fog, 3D point and mesh fog data have become increasingly popular in recent days, due to the growth of 3D modelling and 3D printing technologies. Hence, in this research, we propose a novel scheme for preserving the privacy of 3D point and mesh fog data. Chaotic Cat map-based data encryption is a recently trending research area due to its unique properties, like pseudo-randomness, deterministic nature, sensitivity to initial conditions, ergodicity, etc. To boost encryption efficiency significantly, in this work, we propose a novel Chaotic Cat map. The sequence generated by this map is used to transform the coordinates of the fog data. The improved range of the proposed map is depicted using bifurcation analysis. The quality of the proposed Chaotic Cat map is also analyzed using metrics such as the Lyapunov exponent and approximate entropy. We also demonstrate the performance of the proposed encryption framework against attacks such as the brute-force attack and the statistical attack. The experimental results clearly show that the proposed framework produces the best results compared to the previous works in the literature.
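The paper's novel map is not given in this abstract, but the underlying idea (chaotically permuting coordinates with an invertible map) can be shown with the classical Arnold Cat map on an integer lattice. The grid size, points, and iteration count below are arbitrary toy choices.

```python
N = 101  # lattice size; 2D coordinates live on an N x N integer grid

def cat_map(x, y, n=N):
    """Classical Arnold Cat map: a chaotic, area-preserving shuffle of (x, y)."""
    return (x + y) % n, (x + 2 * y) % n

def cat_map_inverse(x, y, n=N):
    """Exact inverse (the matrix [[1,1],[1,2]] has determinant 1)."""
    return (2 * x - y) % n, (y - x) % n

points = [(3, 7), (42, 99), (0, 55)]
encrypted = points
for _ in range(10):                        # iterate the map as a toy "cipher"
    encrypted = [cat_map(x, y) for x, y in encrypted]

decrypted = encrypted
for _ in range(10):                        # undo with the inverse map
    decrypted = [cat_map_inverse(x, y) for x, y in decrypted]

print(encrypted != points, decrypted == points)  # True True
```

A scheme along the lines described above would use a sequence from its (improved-range) chaotic map to drive such coordinate transforms over 3D points rather than this fixed 2D matrix.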
Sleep apnea syndrome (SAS) is a breathing disorder that occurs while a person is asleep. The traditional method for examining SAS is polysomnography (PSG). The standard procedure of PSG requires complete overnight observation in a laboratory. PSG typically provides accurate results, but it is expensive and time consuming. Moreover, for people with sleep apnea (SA), available beds and laboratories are limited, which may result in inaccurate diagnoses. Thus, this paper proposes an Internet of Medical Things (IoMT) framework with a machine learning concept of a fully connected neural network (FCNN) with a k-nearest neighbor (k-NN) classifier. This paper describes smart monitoring of a patient's sleeping habits and diagnosis of SA using FCNN-KNN plus average square error (ASE). For diagnosing SA, the oxygen saturation (SpO2) sensor device is popularly used for monitoring the heart rate and blood oxygen level. This diagnostic information is securely stored in the IoMT fog computing network. Doctors can carefully monitor the SA patient remotely on the basis of the sensor values, which are efficiently stored in the fog computing network. The proposed technique takes less than 0.2 s with an accuracy of 95%, which is higher than existing models.
As an essential component of intelligent transportation systems (ITS), electric vehicles (EVs) can store massive amounts of electric power in their batteries and send power back to a charging station (CS) at peak hours to balance the power supply and generate profits. However, when the system collects the corresponding power data, several severe security and privacy issues are encountered. The identity and private injection data may be maliciously intercepted by network attackers and tampered with to damage the services of ITS and smart grids. Existing approaches require high computational overhead, rendering them unsuitable for the resource-constrained Internet of Things (IoT) environment. To address the above problems, this paper proposes a blockchain-enabled secure and privacy-preserving data aggregation scheme for fog-based ITS. First, a fog computing and blockchain co-aware aggregation framework for power injection data is designed, which provides strong support for ITS to achieve secure and efficient power injection. Second, Paillier homomorphic encryption, a batch aggregation signature mechanism, and a Bloom filter are effectively integrated for efficient aggregation of power injection data with security and privacy guarantees. In addition, fine-grained homomorphic aggregation is designed for the power injection data generated by all EVs, which provides solid data support for accurate power dispatching and supply management in ITS. Experiments show that the total computational cost is significantly reduced in the proposed scheme while security and privacy guarantees are provided. The proposed scheme is more suitable for ITS with latency-sensitive applications and is also adapted to deploying devices with limited resources.
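The Paillier property this scheme relies on, that multiplying ciphertexts sums the plaintexts, so an aggregator can total the EVs' injection values without decrypting any single one, can be shown with a toy keypair. The primes below are tiny and insecure by design; this is the standard textbook construction, not the paper's full scheme (no batch signatures or Bloom filter).

```python
import math, random

random.seed(7)

# Toy Paillier keypair (tiny primes -- illustration only, not secure).
p, q = 47, 59
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)     # Carmichael function of n
g = n + 1                        # standard simple choice of generator
mu = pow(lam, -1, n)             # valid precisely because g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    L = (pow(c, lam, n2) - 1) // n
    return (L * mu) % n

# Additive homomorphism: the aggregator multiplies ciphertexts and only the
# decryptor learns the total, never the individual EV readings.
readings = [12, 30, 7]
aggregate = 1
for m in readings:
    aggregate = (aggregate * encrypt(m)) % n2
print(decrypt(aggregate))  # 49
```

This is exactly the shape of "fine-grained homomorphic aggregation": per-EV ciphertexts flow to a fog aggregator, and only the sum (here 12 + 30 + 7) is ever decrypted, provided the total stays below n. Requires Python 3.9+ for `math.lcm`.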
The massive growth of diversified smart devices and continuous data generation poses a challenge to communication architectures. To deal with this problem, communication networks consider fog computing one of the promising technologies that can improve overall communication performance. It brings on-demand services proximate to the end devices and delivers requested data in a short time. Fog computing faces several issues, such as latency, bandwidth, and link utilization, due to limited resources and the high processing demands of end devices. To this end, fog caching plays an imperative role in addressing data dissemination issues. This study provides a comprehensive discussion of fog computing, the Internet of Things (IoT), and the critical issues related to data security and dissemination in fog computing. Moreover, we examine fog-based caching schemes and how they help address the existing issues of fog computing. Besides, this paper presents a number of caching schemes with their contributions, benefits, and challenges in overcoming the problems and limitations of fog computing. We also identify machine learning-based approaches for cache security and management in fog computing, as well as several prospective future research directions in caching, fog computing, and machine learning.
Task offloading is a key strategy in Fog Computing (FC). The definition of resource-constrained devices no longer applies to sensors and Internet of Things (IoT) embedded system devices alone. Smart and mobile units can also be viewed as resource-constrained devices if the power, cloud applications, and data cloud are included in the set of required resources. In a cloud-fog-based architecture, a task instance running on an end device may need to be offloaded to a fog node to complete its execution. However, in a busy network, a second offloading decision is required when the fog node becomes overloaded. The possibility of offloading a task, for the second time, to a fog or a cloud node depends to a great extent on task importance, latency constraints, and required resources. This paper presents a dynamic service that determines which tasks can endure a second offloading. The task type, latency constraints, and amount of required resources are used to select the offloading destination node. This study proposes three heuristic offloading algorithms. Each algorithm targets a specific task type. An overloaded fog node can only issue one offloading request to execute one of these algorithms according to the task offloading priority. Offloading requests are sent to a Software Defined Networking (SDN) controller. The fog node and controller determine the number of offloaded tasks. Simulation results show that the average time required to select offloading nodes was improved by 33% when compared to the dynamic fog-to-fog offloading algorithm. The distribution of workload converges to a uniform distribution when offloading latency-sensitive non-urgent tasks. The lowest offloading priority is assigned to latency-sensitive tasks with hard deadlines. At least 70% of these tasks are offloaded to fog nodes that are one to three hops away from the overloaded node.
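The three heuristics themselves are not reproduced in this abstract; the sketch below only illustrates the general shape of a priority-driven second-offloading decision. The task classes, numeric priorities, capacity rule, and destination labels are invented for illustration, with only the ordering (hard-deadline latency-sensitive tasks offloaded last) taken from the text above.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    latency_sensitive: bool
    hard_deadline: bool
    cpu_demand: float        # normalized required resources (assumed scale)

def offload_priority(task: Task) -> int:
    """Lower value = offloaded first; latency-sensitive tasks with hard
    deadlines get the lowest second-offloading priority, as described above."""
    if task.latency_sensitive and task.hard_deadline:
        return 2
    if task.latency_sensitive:
        return 1
    return 0                 # latency-tolerant tasks move first

def pick_destination(task: Task, fog_free: float, cloud_ok: bool) -> str:
    # Invented rule: fit on a nearby fog node if it has capacity, else cloud.
    if task.cpu_demand <= fog_free:
        return "fog (1-3 hops)"
    return "cloud" if cloud_ok else "reject"

tasks = [Task("video", True, True, 0.4), Task("backup", False, False, 0.9),
         Task("sensor-agg", True, False, 0.2)]
for t in sorted(tasks, key=offload_priority):
    print(t.name, "->", pick_destination(t, fog_free=0.5, cloud_ok=True))
```

In the paper's architecture, a decision of this shape would be taken jointly by the overloaded fog node and the SDN controller rather than locally as sketched here.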
Emerging telemedicine trends, such as the Internet of Medical Things (IoMT), facilitate regular and efficient interactions between medical devices and computing devices. The importance of IoMT comes from the need to continuously monitor patients' health conditions in real time during normal daily activities, which is realized with the help of various wearable devices and sensors. One major health problem is workplace stress, which can lead to cardiovascular disease or psychiatric disorders. Therefore, real-time monitoring of employees' stress in the workplace is essential. Stress levels and the source of stress could be detected early in the fog layer so that the negative consequences can be mitigated sooner. However, overwhelming the fog layer with extensive data will increase the load on fog nodes, leading to computational challenges. This study aims to reduce fog computation by proposing machine learning (ML) models with two phases. The first phase of the ML model assesses the priority of the situation based on the stress level. In the second phase, a classifier determines the cause of stress, which was either interruptions or time pressure while completing a task. This approach reduces the computation cost for the fog node, as only high-priority records are transferred to the fog; low-priority records are forwarded to the cloud. Four ML approaches were compared in terms of accuracy and prediction speed: K-nearest neighbors (KNN), a support vector machine (SVM), a bagged tree (BT), and an artificial neural network (ANN). In our experiments, the ANN performed best in both phases, with an F1 score of 99.97% and the highest prediction speed compared with KNN, SVM, and BT.
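The two-phase routing logic can be sketched as follows. The threshold and the rule-based "classifiers" are illustrative placeholders for the trained ANN models; only the routing structure (phase 1 triage, phase 2 cause, fog for high priority, cloud for the rest) comes from the description above.

```python
def phase1_priority(stress_level: float) -> str:
    """Phase 1: triage by stress level (the 0.7 cutoff is an assumption)."""
    return "high" if stress_level >= 0.7 else "low"

def phase2_cause(interruptions: int, time_pressure: float) -> str:
    """Phase 2: rule-based stand-in for the cause-of-stress classifier."""
    return "interruptions" if interruptions > time_pressure * 10 else "time pressure"

def route(record):
    # Only high-priority records stay in the fog; the rest go to the cloud,
    # which is how the scheme keeps fog computation low.
    if phase1_priority(record["stress"]) == "high":
        cause = phase2_cause(record["interruptions"], record["time_pressure"])
        return ("fog", cause)
    return ("cloud", None)

print(route({"stress": 0.9, "interruptions": 12, "time_pressure": 0.8}))
print(route({"stress": 0.3, "interruptions": 1, "time_pressure": 0.1}))
```

Swapping the two rule functions for trained ANN predictors reproduces the pipeline evaluated in the study.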
Funding: supported in part by the National Natural Science Foundation of China (NSFC) under Grants 62371082 and 62001076; in part by the National Key R&D Program of China under Grant 2021YFB1714100; and in part by the Natural Science Foundation of Chongqing under Grants CSTB2023NSCQ-MSX0726 and cstc2020jcyjmsxmX0878.
Funding: funding from TECNALIA, Basque Research and Technology Alliance (BRTA); supported by the project "Optimization of Deep Learning algorithms for Edge IoT devices for sensorization and control in Buildings and Infrastructures (EMBED)", funded by the Gipuzkoa Provincial Council and approved under the 2023 call of the Guipuzcoan Network of Science, Technology and Innovation Program with File Number 2023-CIEN-000051-01.
Abstract: In a network environment composed of different types of computing centers that can be divided into different layers (cloud, edge layer, and others), the interconnection between them offers the possibility of peer-to-peer task offloading. For many resource-constrained devices, the computation of many types of tasks is not feasible because they lack the memory and processing capacity to support it. In this scenario, it is worth considering transferring these tasks to resource-rich platforms, such as Edge Data Centers or remote cloud servers. Depending on the properties and state of the environment and the nature of the tasks, it is often more appropriate to offload particular tasks to specific destinations. At the same time, establishing an optimal offloading policy, one which ensures that all tasks are executed within the required latency and avoids excessive workload on specific computing centers, is not easy. This study presents two alternatives for solving the offloading decision problem by introducing two well-known algorithms, Graph Neural Networks (GNN) and Deep Q-Network (DQN). It applies these alternatives on a well-known Edge Computing simulator called PureEdgeSim and compares them with the two default methods, Trade-Off and Round Robin. Experiments showed that the variants offer a slight improvement in task success rate and workload distribution. In terms of energy efficiency, they provided similar results. Finally, the success rates of the different computing centers are tested, and the inability of remote cloud servers to respond to applications in real time is demonstrated. These ways of finding an offloading strategy in a local networking environment are novel in that they emulate the state and structure of the environment, considering the quality of its connections and constant updates. The offloading score defined in this research is a crucial feature for determining the quality of an offloading path in the GNN training process and has not previously been proposed. At the same time, the suitability of Reinforcement Learning (RL) techniques is demonstrated by the dynamism of the network environment, considering all the key factors that affect the decision to offload a given task, including the actual state of all devices.
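The DQN alternative above learns an offloading policy from interaction with the environment. The sketch below illustrates the same idea in its simplest tabular Q-learning form; the paper uses a deep network and a far richer state, so the states, actions, and reward values here are purely illustrative assumptions.

```python
import random

# Minimal tabular Q-learning sketch for offloading-destination selection.
# States are coarse load levels of the local node; actions are destinations.
ACTIONS = ["local", "edge", "cloud"]
STATES = ["low_load", "high_load"]

def reward(state, action):
    # Toy reward: offloading pays off when the node is busy, not when idle.
    if state == "high_load":
        return {"local": -1.0, "edge": 1.0, "cloud": 0.5}[action]
    return {"local": 1.0, "edge": 0.2, "cloud": -0.5}[action]

def train(episodes=2000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    random.seed(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = random.choice(STATES)
        # Epsilon-greedy action selection.
        a = (random.choice(ACTIONS) if random.random() < eps
             else max(ACTIONS, key=lambda x: q[(s, x)]))
        r = reward(s, a)
        s2 = random.choice(STATES)  # toy transition: load fluctuates randomly
        q[(s, a)] += alpha * (r + gamma * max(q[(s2, b)] for b in ACTIONS)
                              - q[(s, a)])
    return q

q = train()
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
print(policy)
```

With the toy rewards above, the learned policy settles on offloading to the edge under high load and computing locally under low load.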
Funding: This work was jointly supported by the Special Fund for Transformation and Upgrade of Jiangsu Industry and Information Industry, Key Core Technologies (Equipment) Key Industrialization Projects in 2022 (No. CMHI-2022-RDG-004): "Key Technology Research for Development of Intelligent Wind Power Operation and Maintenance Mothership in Deep Sea".
Abstract: Under the influence of air humidity, dust, aerosols, etc., haze in real scenes is unevenly distributed, which degrades image quality and contrast. In such cases it is difficult for general-purpose detection networks to detect targets in the image. Thus, a dual subnet based on multi-task collaborative training (DSMCT) is proposed in this paper. Firstly, in the training phase, the Gated Context Aggregation Network (GCANet) is used as the supervisory network of YOLOX to promote the extraction of clean information in foggy scenes. In the test phase, only the YOLOX branch needs to be activated, preserving the detection speed of the model. Secondly, a deformable convolution module is used to improve GCANet and enhance the model's ability to capture details of non-homogeneous fog. Finally, the Coordinate Attention mechanism is introduced into the Vision Transformer and the backbone network of YOLOX is redesigned, enhancing the network's ability to extract deep-level information. Experimental results on the artificial fog dataset FOG_VOC and the real fog dataset RTTS show that the mAP of DSMCT reached 86.56% and 62.39%, respectively, which is 2.27% and 4.41% higher than the current most advanced detection model. The DSMCT network is highly practical and effective for target detection in real foggy scenes.
Funding: Supported by the National Natural Science Foundation of China (62173103) and the Fundamental Research Funds for the Central Universities of China (3072022JC0402, 3072022JC0403).
Abstract: For the first time, this article introduces a LiDAR Point Clouds Dataset of Ships composed of both collected and simulated data, to address the scarcity of LiDAR data in maritime applications. The collected data are acquired using specialized maritime LiDAR sensors in both inland waterways and wide-open ocean environments. The simulated data are generated by placing a ship in the LiDAR coordinate system and scanning it with a redeveloped Blensor that emulates the operation of a LiDAR sensor equipped with various laser beams. Furthermore, we also render point clouds for foggy and rainy weather conditions. To describe a realistic shipping environment, a dynamic tail wave is modeled by iterating the wave elevation of each point in a time series. Finally, networks serving small objects are migrated to ship applications by feeding them our dataset. The positive effect of simulated data is described in object detection experiments, and the negative impact of tail waves as noise is verified in single-object tracking experiments. The dataset is available at https://github.com/zqy411470859/ship_dataset.
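The dynamic tail wave is modeled by iterating each point's wave elevation over a time series. The sketch below shows the shape of such an iteration with a single sinusoid; this is an illustrative stand-in, as the paper does not specify its wave model here, and the amplitude, wavelength, and period are assumed values.

```python
import math

# Single-sinusoid wave elevation: eta(x, t) = A * sin(k*x - omega*t).
def wave_elevation(x, t, amplitude=0.5, wavelength=20.0, period=5.0):
    k = 2 * math.pi / wavelength   # wavenumber
    omega = 2 * math.pi / period   # angular frequency
    return amplitude * math.sin(k * x - omega * t)

def animate_points(points, t):
    # Offset each (x, y, z) point by the elevation at its x position and time t,
    # producing one "frame" of the moving water surface.
    return [(x, y, z + wave_elevation(x, t)) for (x, y, z) in points]

surface = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
frame0 = animate_points(surface, t=0.0)   # flat at t = 0 for these x values
frame1 = animate_points(surface, t=1.25)  # quarter period later
```

Iterating `animate_points` over a sequence of timestamps yields the time series of elevations that is baked into the simulated point clouds.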
Funding: Supported in part by the Hubei Natural Science and Research Project under Grant 2020418, in part by the 2021 Light of Taihu Science and Technology Project, and in part by the 2022 Wuxi Science and Technology Innovation and Entrepreneurship Program.
Abstract: More devices in the Intelligent Internet of Things (AIoT) result in an increased number of tasks that require low latency and real-time responsiveness, leading to an increased demand for computational resources. Cloud computing's low-latency performance issues in AIoT scenarios have led researchers to explore fog computing as a complementary extension. However, the effective allocation of resources for task execution within fog environments, characterized by limitations and heterogeneity in computational resources, remains a formidable challenge. To tackle this challenge, in this study we integrate fog computing and cloud computing. We begin by establishing a fog-cloud environment framework, followed by the formulation of a mathematical model for task scheduling. Lastly, we introduce an enhanced hybrid Equilibrium Optimizer (EHEO) tailored for AIoT task scheduling. The overarching objective is to decrease both the makespan and the energy consumption of the fog-cloud system while accounting for task deadlines. The proposed EHEO method undergoes a thorough evaluation against multiple benchmark algorithms, encompassing metrics like makespan, total energy consumption, success rate, and average waiting time. Comprehensive experimental results demonstrate the superior performance of EHEO across all assessed metrics. Notably, in the most favorable conditions, EHEO diminishes the makespan and energy consumption by approximately 50% and 35.5%, respectively, compared to the second-best performing approach, which affirms its efficacy in advancing the efficiency of AIoT task scheduling within fog-cloud networks.
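A scheduling objective of this kind scores an assignment of tasks to nodes by makespan and energy while counting deadline violations. The sketch below is a minimal such fitness function; the weights, the linear runtime/power models, and the serial-execution assumption are illustrative, not the paper's EHEO formulation.

```python
# Score one assignment of tasks to fog/cloud nodes.
def evaluate(assignment, tasks, nodes, w_time=0.5, w_energy=0.5):
    finish = {n: 0.0 for n in nodes}   # per-node finish time
    energy = 0.0
    violations = 0
    for task, node in zip(tasks, assignment):
        runtime = task["cycles"] / nodes[node]["speed"]
        finish[node] += runtime              # tasks run serially on a node
        energy += runtime * nodes[node]["power"]
        if finish[node] > task["deadline"]:
            violations += 1
    makespan = max(finish.values())
    # Weighted sum of the two objectives the optimizer minimizes.
    return w_time * makespan + w_energy * energy, makespan, violations

nodes = {"fog": {"speed": 2.0, "power": 1.0},
         "cloud": {"speed": 8.0, "power": 4.0}}
tasks = [{"cycles": 4.0, "deadline": 3.0}, {"cycles": 16.0, "deadline": 3.0}]
fitness, makespan, viol = evaluate(["fog", "cloud"], tasks, nodes)
```

A metaheuristic such as EHEO would search over `assignment` vectors, using a function of this shape as its fitness.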
Funding: Supported via funding from Prince Sattam bin Abdulaziz University, Project Number (PSAU/2024/R/1445).
Abstract: Fog computing has recently developed as a new paradigm that aims to serve time-sensitive applications better than cloud computing by placing and processing tasks in close proximity to the data sources. However, the majority of fog nodes in this environment are geographically scattered, with resources that are limited compared to cloud nodes, making the application placement problem more complex than in cloud computing. Cost-efficient application placement in fog-cloud computing environments combines the benefits of both fog and cloud computing to optimize the placement of applications and services while minimizing costs; this is particularly relevant in scenarios where latency, resource constraints, and cost are crucial factors for application deployment. In this study, we propose a hybrid approach that combines a genetic algorithm (GA) with the Flamingo Search Algorithm (FSA) to place application modules while minimizing cost. We consider four cost types for application deployment: computation, communication, energy consumption, and violations. The proposed hybrid approach, called GA-FSA, is designed to place the application modules with respect to the application deadline and deploy them appropriately to fog or cloud nodes to curtail the overall cost of the system. An extensive simulation is conducted to assess the performance of the proposed approach compared to other state-of-the-art approaches. The results demonstrate that the GA-FSA approach is superior to the other approaches with respect to task guarantee ratio (TGR) and total cost.
Funding: This research was funded by the National Natural Science Foundation of China (Grant Number 61902069), the Natural Science Foundation of Fujian Province of China (Grant Number 2021J011068), the Research Initiation Fund Program of Fujian University of Technology (GY-S24002, GY-Z21048), and the Fujian Provincial Department of Science and Technology Industrial Guidance Project (Grant Number 2022H0025).
Abstract: The Advanced Metering Infrastructure (AMI), as a crucial subsystem of the smart grid, is responsible for measuring user electricity consumption and plays a vital role in communication between providers and consumers. However, with the advancement of information and communication technology, new security and privacy challenges have emerged for AMI. To address these challenges and enhance the security and privacy of user data in the smart grid, a Hierarchical Privacy Protection Model in Advanced Metering Infrastructure based on Cloud and Fog Assistance (HPPM-AMICFA) is proposed in this paper. The proposed model integrates cloud and fog computing with hierarchical threshold encryption, offering a flexible and efficient privacy protection solution that significantly enhances data security in the smart grid. The methodology involves setting user protection levels by processing missing data and utilizing fuzzy comprehensive analysis to evaluate user importance, thereby assigning appropriate protection levels. Furthermore, a hierarchical threshold encryption algorithm is developed to provide differentiated protection strategies for fog nodes based on user IDs, ensuring secure aggregation and encryption of user data. Experimental results demonstrate that HPPM-AMICFA effectively resists various attack strategies while minimizing time costs, thereby safeguarding user data in the smart grid.
Funding: This work was supported in part by the National Science and Technology Council of Taiwan, under Contract NSTC 112-2410-H-324-001-MY2.
Abstract: In recent decades, fog computing has played a vital role in executing parallel computational tasks, specifically scientific workflow tasks. In cloud data centers, fog computing takes more time to run workflow applications. Therefore, it is essential to develop effective models for Virtual Machine (VM) allocation and task scheduling in fog computing environments. Effective task scheduling, VM migration, and allocation together optimize the use of computational resources across different fog nodes. This process ensures that tasks are executed with minimal energy consumption, which reduces the chances of resource bottlenecks. The framework proposed in this manuscript comprises two phases: (i) effective task scheduling using a fractional selectivity approach, and (ii) VM allocation by an algorithm named Fitness Sharing Chaotic Particle Swarm Optimization (FSCPSO). The proposed FSCPSO algorithm integrates the concepts of chaos theory and fitness sharing to effectively balance global exploration and local exploitation. This balance enables a wide range of solutions, leading to minimal total cost and makespan in comparison to other traditional optimization algorithms. The FSCPSO algorithm's performance is analyzed using six evaluation measures, namely Load Balancing Level (LBL), Average Resource Utilization (ARU), total cost, makespan, energy consumption, and response time. In relation to the conventional optimization algorithms, the FSCPSO algorithm achieves a higher LBL of 39.12%, an ARU of 58.15%, a minimal total cost of 1175, and a makespan of 85.87 ms, particularly when evaluated for 50 tasks.
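FSCPSO combines PSO with chaos theory and fitness sharing. The sketch below illustrates only the chaotic ingredient, seeding particle positions with a logistic map inside an otherwise standard PSO on a toy objective; the fitness-sharing term, the VM-allocation encoding, and all constants are assumptions for illustration.

```python
import random

def logistic_sequence(n, x0=0.7, r=4.0):
    # Classic chaotic logistic map x <- r*x*(1-x); values stay in [0, 1].
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

def pso_minimize(f, dim=2, particles=20, iters=100, lo=-5.0, hi=5.0, seed=1):
    random.seed(seed)
    # Chaotic initialization: map logistic values into the search box.
    chaos = logistic_sequence(particles * dim)
    pos = [[lo + (hi - lo) * chaos[i * dim + d] for d in range(dim)]
           for i in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pos[i]) < f(gbest):
                    gbest = pos[i][:]
    return gbest

sphere = lambda x: sum(v * v for v in x)
best = pso_minimize(sphere)
```

On the sphere function the swarm converges close to the origin; in the paper's setting, `f` would instead score a VM allocation by cost and makespan.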
Abstract: To address the large metrology delays and the inability to predict production anomalies in advance in the "Chip/FPC on Glass" (C/FOG) process of liquid crystal display (LCD) panel manufacturing, this paper proposes a neural-network-based virtual metrology method for C/FOG manufacturing. The method uses sensors on the production equipment to collect process-state data during production and builds a virtual metrology model based on multi-scale one-dimensional convolution with channel attention (MS1DC-CA). Convolution kernels at multiple scales extract state-data features over different scale ranges. For preprocessing raw data containing missing values, a K-nearest neighbor imputation method improved by particle swarm optimization (PSO-KNN Imputation) is proposed to fill in the missing values, preserving features while reducing the interference introduced by imputed values. Finally, comparative experiments are conducted on data collected from actual production, where defect rates are mainly concentrated between 0.1% and 0.5%. The virtual metrology model achieves a fitting mean squared error of 0.3977‱, lower than other existing fitting models, and it also outperforms them on three evaluation metrics: mean absolute error, symmetric mean absolute percentage error, and goodness of fit, demonstrating good predictive performance.
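The preprocessing step fills missing values with KNN imputation. The sketch below shows the plain, unweighted variant: each missing entry is replaced by the mean of that feature over the k nearest complete donors; the PSO part of PSO-KNN Imputation, which would tune k or feature weights, is omitted here as an assumption-free simplification.

```python
import math

def knn_impute(rows, k=2):
    # Fill each None with the mean of that column over the k nearest donors,
    # where distance is computed over features observed in both rows.
    filled = [r[:] for r in rows]
    for i, row in enumerate(rows):
        for j, v in enumerate(row):
            if v is None:
                def dist(other):
                    ds = [(a - b) ** 2 for a, b in zip(row, other)
                          if a is not None and b is not None]
                    return math.sqrt(sum(ds)) if ds else float("inf")
                donors = [r for r in rows if r[j] is not None and r is not row]
                nearest = sorted(donors, key=dist)[:k]
                filled[i][j] = sum(r[j] for r in nearest) / len(nearest)
    return filled

data = [[1.0, 10.0], [2.0, None], [3.0, 30.0], [10.0, 100.0]]
filled = knn_impute(data)
print(filled)
```

Here the missing value sits between its two nearest neighbors' values (10.0 and 30.0), so it is imputed as their mean, 20.0, rather than being dragged toward the distant outlier row.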
Funding: Fundamental Research Funds for Central Universities Project (No. 1CX05021A), Shandong Provincial Key R&D Plan Project (No. 2GHY15108), Shandong Postdoctoral Innovation Project, and Qingdao Postdoctoral Applied Research Project.
Abstract: Double-layer NiCr-Cr_(3)C_(2)/Ni-Zn-Al_(2)O_(3) coatings with sufficient corrosion and wear resistance were prepared on low carbon steel substrates. The intermediate Ni-Zn-Al_(2)O_(3) layers were fabricated using the low-pressure cold spray (LPCS) method to improve the salt fog corrosion resistance of the supersonic plasma spray (SPS) NiCr-Cr_(3)C_(2) coatings. The friction and wear performance of the double-layer and single-layer NiCr-Cr_(3)C_(2) coatings was evaluated by line-contact reciprocating sliding. Combined with coating surface analysis techniques, the effect of salt fog corrosion on the tribological properties of the double-layer coatings was studied. The results showed that the double-layer coatings exhibited better wear resistance than the single-layer coatings, owing to the better corrosion resistance of the intermediate layer; the wear mass losses of the double-layer coatings were reduced by 70% compared with the single-layer coatings, and the wear mechanism of the coatings after salt fog corrosion is mainly corrosion wear.
Abstract: BACKGROUND The precise mechanism by which severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) impacts the central nervous system remains unclear, with manifestations spanning from mild symptoms (e.g., olfactory and gustatory deficits, hallucinations, and headache) to severe complications (e.g., stroke, seizures, encephalitis, and demyelinating lesions). The occurrence of a transient subdural effusion, as described below, is extremely rare. CASE SUMMARY A 56-year-old male patient presented with left-sided limb weakness and slurred speech as the predominant clinical symptoms. Through comprehensive imaging and diagnostic assessments, he was diagnosed with cerebral infarction complicated by hemorrhagic transformation affecting the right frontal, temporal, and parietal regions. In addition, an intracranial infection with SARS-CoV-2 was identified during the rehabilitation process; consequently, an idiopathic subdural effusion developed. Remarkably, the subdural effusion was absorbed within 6 d, with no recurrence observed during the 3-month follow-up. CONCLUSION Subdural effusion is a potentially rare intracranial complication associated with SARS-CoV-2 infection.
Abstract: In fog, visibility is reduced. This reduction is measured by the meteorological optical range (MOR), which is important for studying human perception and various sensors in foggy conditions. The Cerema PAVIN Fog & Rain platform is capable of producing calibrated fog in order to better analyze it and understand its consequences. The problem is that the droplets produced by the platform are not large enough to resemble real fog. This can have a major impact on measurements, since the interaction between electromagnetic waves and fog depends on the wavelength and the diameter of the droplets. To remedy this, Cerema is building a new platform with new fog-generating equipment. This study analyzes different nozzles and their associated usage parameters, such as the type of water and the pressure used. The aim is to select the best nozzle, with the associated parameters, for producing large-diameter droplets and therefore more realistic fog.
Funding: This work was supported by Princess Nourah bint Abdulrahman University Researchers Supporting Project number (PNURSP2022R151), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: With the rapid evolution of Internet technology, fog computing has taken a major role in managing large amounts of data. The major concerns in this domain are security and privacy. Therefore, attaining a reliable level of confidentiality in the fog computing environment is a pivotal task. Among the different types of data stored in the fog, 3D point and mesh fog data have become increasingly popular in recent days due to the growth of 3D modelling and 3D printing technologies. Hence, in this research, we propose a novel scheme for preserving the privacy of 3D point and mesh fog data. Chaotic Cat map-based data encryption is a trending research area due to its unique properties, such as pseudo-randomness, deterministic nature, sensitivity to initial conditions, and ergodicity. To boost encryption efficiency significantly, in this work we propose a novel Chaotic Cat map. The sequence generated by this map is used to transform the coordinates of the fog data. The improved range of the proposed map is depicted using bifurcation analysis. The quality of the proposed Chaotic Cat map is also analyzed using metrics such as the Lyapunov exponent and approximate entropy. We also demonstrate the performance of the proposed encryption framework under attacks such as brute-force and statistical attacks. The experimental results clearly show that the proposed framework produces the best results compared to previous works in the literature.
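The paper's encryption builds on the Cat map family. The sketch below shows the classic Arnold Cat map on an integer grid, not the paper's novel variant (whose parameters are not given here): the map is a bijection, so every coordinate eventually returns to its start, which is why the generated sequence can reversibly scramble point coordinates.

```python
def cat_map(x, y, n, a=1, b=1):
    # One iteration of the generalized Cat map on the n x n grid:
    # (x, y) -> (x + a*y, b*x + (a*b + 1)*y) mod n.
    return (x + a * y) % n, (b * x + (a * b + 1) * y) % n

def period(x0, y0, n):
    # Because the map is a bijection on a finite grid, every orbit is periodic;
    # count the steps until the starting point recurs.
    x, y, steps = x0, y0, 0
    while True:
        x, y = cat_map(x, y, n)
        steps += 1
        if (x, y) == (x0, y0):
            return steps

p = period(1, 1, 5)
print(p)
```

Applying `cat_map` repeatedly to each vertex scrambles a mesh; iterating for the remainder of the orbit's period undoes the scrambling, which is the reversibility an encryption scheme exploits.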
Funding: Taif University Researchers Supporting Project Number (TURSP-2020/98), Taif University, Taif, Saudi Arabia.
Abstract: Sleep apnea syndrome (SAS) is a breathing disorder that occurs while a person is asleep. The traditional method for examining SAS is polysomnography (PSG). The standard PSG procedure requires complete overnight observation in a laboratory. PSG typically provides accurate results, but it is expensive and time consuming; moreover, for people with sleep apnea (SA), available beds and laboratories are limited, which may lead to inaccurate diagnoses. Thus, this paper proposes an Internet of Medical Things (IoMT) framework with a machine learning concept of a fully connected neural network (FCNN) with a k-nearest neighbor (k-NN) classifier. This paper describes smart monitoring of a patient's sleeping habits and the diagnosis of SA using FCNN-KNN plus average square error (ASE). For diagnosing SA, an oxygen saturation (SpO2) sensor device is popularly used for monitoring the heart rate and blood oxygen level. This diagnostic information is securely stored in the IoMT fog computing network. Doctors can carefully monitor the SA patient remotely on the basis of the sensor values, which are efficiently stored in the fog computing network. The proposed technique takes less than 0.2 s with an accuracy of 95%, which is higher than existing models.
Funding: The authors received funding for this study from the National Natural Science Foundation of China (No. 61971235), the China Postdoctoral Science Foundation (No. 2018M630590), the Jiangsu Planned Projects for Postdoctoral Research Funds (No. 2021K501C), the 333 High-level Talents Training Project of Jiangsu Province, and the 1311 Talents Plan of NJUPT.
Abstract: As an essential component of intelligent transportation systems (ITS), electric vehicles (EVs) can store massive amounts of electric power in their batteries and send power back to a charging station (CS) at peak hours to balance the power supply and generate profits. However, when the system collects the corresponding power data, several severe security and privacy issues are encountered. The identity and private injection data may be maliciously intercepted by network attackers and tampered with to damage the services of ITS and smart grids. Existing approaches require high computational overhead, rendering them unsuitable for the resource-constrained Internet of Things (IoT) environment. To address the above problems, this paper proposes a blockchain-enabled secure and privacy-preserving data aggregation scheme for fog-based ITS. First, a fog computing and blockchain co-aware aggregation framework for power injection data is designed, which provides strong support for ITS to achieve secure and efficient power injection. Second, Paillier homomorphic encryption, a batch aggregation signature mechanism, and a Bloom filter are effectively integrated for efficient aggregation of power injection data with security and privacy guarantees. In addition, fine-grained homomorphic aggregation is designed for the power injection data generated by all EVs, which provides solid data support for accurate power dispatching and supply management in ITS. Experiments show that the total computational cost is significantly reduced in the proposed scheme while providing security and privacy guarantees. The proposed scheme is more suitable for ITS with latency-sensitive applications and is also adapted to deploying devices with limited resources.
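The scheme relies on the additive homomorphism of Paillier encryption: the product of ciphertexts decrypts to the sum of plaintexts, so an aggregator can total the EVs' power readings without seeing any of them. The sketch below is the textbook Paillier construction with toy 8-bit primes, shown only to illustrate that property; real deployments use large primes, and the paper's batch signatures and Bloom filter are not reproduced here.

```python
import math
import random

# Toy key generation (primes far too small for any real security).
p, q = 211, 223
n = p * q
n2 = n * n
g = n + 1
lam = math.lcm(p - 1, q - 1)

def L(x):
    return (x - 1) // n

# Decryption constant mu = (L(g^lam mod n^2))^-1 mod n.
mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m, rng=random.Random(7)):
    # Pick a random r coprime to n; ciphertext c = g^m * r^n mod n^2.
    while True:
        r = rng.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Additive homomorphism: multiplying ciphertexts sums the plaintexts.
aggregate = (encrypt(12) * encrypt(30)) % n2
total = decrypt(aggregate)
```

An aggregating fog node multiplies the EVs' ciphertexts and forwards only the product; the grid operator decrypts the sum (here 12 + 30 = 42) without learning any individual reading.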
Funding: Provincial key platforms and major scientific research projects of universities in Guangdong Province, People's Republic of China, under Grant No. 2017GXJK116.
Abstract: The massive growth of diversified smart devices and continuous data generation poses a challenge to communication architectures. To deal with this problem, communication networks consider fog computing one of the promising technologies that can improve overall communication performance. It brings on-demand services close to the end devices and delivers the requested data in a short time. Fog computing faces several issues, such as latency, bandwidth, and link utilization, due to limited resources and the high processing demands of end devices. To this end, fog caching plays an imperative role in addressing data dissemination issues. This study provides a comprehensive discussion of fog computing, the Internet of Things (IoT), and the critical issues related to data security and dissemination in fog computing. Moreover, we survey fog-based caching schemes and how they address the existing issues of fog computing. This paper presents a number of caching schemes with their contributions, benefits, and challenges in overcoming the problems and limitations of fog computing. We also identify machine learning-based approaches for cache security and management in fog computing, as well as several prospective future research directions in caching, fog computing, and machine learning.
基金funded by the Deanship of Scientific Research,Princess Nourah bint Abdulrahman University,through the Program of Research Funding after Publication,Grant No. (PRFA–P–42–10).
Abstract: Task offloading is a key strategy in Fog Computing (FC). The definition of resource-constrained devices no longer applies to sensors and Internet of Things (IoT) embedded system devices alone. Smart and mobile units can also be viewed as resource-constrained devices if the power, cloud applications, and data cloud are included in the set of required resources. In a cloud-fog-based architecture, a task instance running on an end device may need to be offloaded to a fog node to complete its execution. However, in a busy network, a second offloading decision is required when the fog node becomes overloaded. The possibility of offloading a task for a second time to a fog or a cloud node depends to a great extent on task importance, latency constraints, and required resources. This paper presents a dynamic service that determines which tasks can endure a second offloading. The task type, latency constraints, and amount of required resources are used to select the offloading destination node. This study proposes three heuristic offloading algorithms, each targeting a specific task type. An overloaded fog node can only issue one offloading request, to execute one of these algorithms according to the task offloading priority. Offloading requests are sent to a Software Defined Networking (SDN) controller. The fog node and controller determine the number of offloaded tasks. Simulation results show that the average time required to select offloading nodes was improved by 33% compared to the dynamic fog-to-fog offloading algorithm. The distribution of workload converges to a uniform distribution when offloading latency-sensitive non-urgent tasks. The lowest offloading priority is assigned to latency-sensitive tasks with hard deadlines. At least 70% of these tasks are offloaded to fog nodes that are one to three hops away from the overloaded node.
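A second-offloading decision of this kind routes a task by its type, latency constraints, and resource needs. The sketch below captures the spirit of the three task classes described above (the three-hop preference for hard-deadline tasks mirrors the reported result), but the scoring rules themselves are illustrative assumptions, not the published algorithms.

```python
def choose_destination(task, fog_nodes, cloud):
    # Latency-sensitive tasks with hard deadlines: prefer the closest fog node
    # within three hops that still has enough free capacity.
    if task["latency_sensitive"] and task["hard_deadline"]:
        nearby = [f for f in fog_nodes
                  if f["hops"] <= 3 and f["free_cpu"] >= task["cpu"]]
        if nearby:
            return min(nearby, key=lambda f: f["hops"])["name"]
    # Latency-sensitive non-urgent tasks: spread the load over the fog by
    # picking the node with the most free capacity.
    if task["latency_sensitive"]:
        candidates = [f for f in fog_nodes if f["free_cpu"] >= task["cpu"]]
        if candidates:
            return max(candidates, key=lambda f: f["free_cpu"])["name"]
    # Latency-tolerant tasks can absorb the round trip to the cloud.
    return cloud["name"]

fogs = [{"name": "fog1", "hops": 1, "free_cpu": 2},
        {"name": "fog2", "hops": 4, "free_cpu": 8}]
cloud = {"name": "cloud"}
urgent = {"latency_sensitive": True, "hard_deadline": True, "cpu": 2}
relaxed = {"latency_sensitive": True, "hard_deadline": False, "cpu": 1}
batch = {"latency_sensitive": False, "hard_deadline": False, "cpu": 1}
dests = [choose_destination(t, fogs, cloud) for t in (urgent, relaxed, batch)]
print(dests)
```

In the paper's architecture, a rule of this shape would run per task class at the SDN controller after the overloaded fog node issues its single offloading request.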
基金funded by the Deanship of Scientific Research(DSR)at King Abdulaziz University,Jeddah,under Grant No.IFPIP:1181-611-1443.
Abstract: Emerging telemedicine trends, such as the Internet of Medical Things (IoMT), facilitate regular and efficient interactions between medical devices and computing devices. The importance of IoMT comes from the need to continuously monitor patients' health conditions in real time during normal daily activities, which is realized with the help of various wearable devices and sensors. One major health problem is workplace stress, which can lead to cardiovascular disease or psychiatric disorders. Therefore, real-time monitoring of employees' stress in the workplace is essential. Stress levels and the source of stress could be detected early in the fog layer so that negative consequences can be mitigated sooner. However, overwhelming the fog layer with extensive data will increase the load on fog nodes, leading to computational challenges. This study aims to reduce fog computation by proposing machine learning (ML) models with two phases. The first phase of the ML model assesses the priority of the situation based on the stress level. In the second phase, a classifier determines the cause of the stress, which was either interruptions or time pressure while completing a task. This approach reduces the computation cost for the fog node, as only high-priority records are transferred to the fog; low-priority records are forwarded to the cloud. Four ML approaches were compared in terms of accuracy and prediction speed: K-nearest neighbors (KNN), a support vector machine (SVM), a bagged tree (BT), and an artificial neural network (ANN). In our experiments, the ANN performed best in both phases, scoring an F1 score of 99.97% and having the highest prediction speed compared with KNN, SVM, and BT.
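The two-phase routing described above can be sketched end to end: phase 1 keeps only high-priority (high-stress) records at the fog node, and phase 2 labels the cause of stress for those records. The threshold and the rule-based classifier below stand in for the paper's trained ANN and are purely illustrative assumptions.

```python
HIGH_STRESS = 0.7  # illustrative cutoff on a normalized stress score

def phase1_priority(record):
    # Phase 1: assess priority from the stress level alone.
    return "high" if record["stress"] >= HIGH_STRESS else "low"

def phase2_cause(record):
    # Phase 2 (toy rule): frequent interruptions point to "interruption",
    # otherwise attribute the stress to time pressure.
    return "interruption" if record["interruptions"] > 3 else "time_pressure"

def route(records):
    # High-priority records are classified at the fog; the rest go to the
    # cloud, keeping the fog node's computational load down.
    fog, cloud = [], []
    for r in records:
        if phase1_priority(r) == "high":
            fog.append({**r, "cause": phase2_cause(r)})
        else:
            cloud.append(r)
    return fog, cloud

fog, cloud = route([{"stress": 0.9, "interruptions": 5},
                    {"stress": 0.2, "interruptions": 0}])
```

Only the first record reaches the fog node, already tagged with its stress cause; the low-stress record is archived in the cloud without any phase-2 work.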