Abstract: In the smart city paradigm, the deployment of Internet of Things (IoT) services and solutions requires extensive communication and computing resources to place and process IoT applications in real time, which consumes a lot of energy and increases operational costs. Usually, IoT applications are placed in the cloud to provide high-quality services and scalable resources. However, the existing cloud-based approach should account for these constraints to place and process IoT applications efficiently. In this paper, an efficient optimization approach for placing IoT applications in a multi-layer fog-cloud environment is proposed using a Mixed-Integer Linear Programming (MILP) model. This approach takes into account IoT application requirements, available resource capacities, and the geographical locations of servers, which helps optimize IoT application placement decisions under multiple objectives such as data transmission, power consumption, and cost. Simulation experiments were conducted with various IoT applications (e.g., augmented reality, infotainment, healthcare, and compute-intensive) to simulate realistic scenarios. The results showed that the proposed approach outperformed the existing cloud-based approach, reducing data transmission by 64% and the associated processing and networking power consumption costs by up to 78%. Finally, a heuristic approach was developed to validate and imitate the presented approach. It showed outcomes comparable to the proposed model, with the gap between them reaching a maximum of 5.4% of the total power consumption.
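The flavor of the placement decision this abstract describes can be sketched on a toy instance: assign each application to exactly one node, respect node capacity, and minimize a cost combining processing power and data transmission. All numbers, names, and weights below are hypothetical illustrations, not the paper's actual MILP formulation; exhaustive search stands in for a real MILP solver on this tiny instance.

```python
# Toy sketch of a fog-cloud placement problem (hypothetical data; the
# paper's actual model, weights, and constraints are not reproduced here).
from itertools import product

apps = {"ar": {"cpu": 2, "data": 5}, "health": {"cpu": 1, "data": 2}}
nodes = {
    "fog":   {"cap": 3,  "power_per_cpu": 1.0, "tx_per_unit": 0.2},
    "cloud": {"cap": 10, "power_per_cpu": 0.8, "tx_per_unit": 1.0},
}

def cost(assign):
    # Objective: processing power plus data-transmission cost, summed over apps.
    total = 0.0
    for app, node in assign.items():
        a, n = apps[app], nodes[node]
        total += a["cpu"] * n["power_per_cpu"] + a["data"] * n["tx_per_unit"]
    return total

def feasible(assign):
    # Capacity constraint: CPU demand on each node must not exceed its cap.
    used = {n: 0 for n in nodes}
    for app, node in assign.items():
        used[node] += apps[app]["cpu"]
    return all(used[n] <= nodes[n]["cap"] for n in nodes)

def solve():
    # Brute-force enumeration of assignments; a real instance would use a
    # MILP solver instead.
    best = None
    for choice in product(nodes, repeat=len(apps)):
        assign = dict(zip(apps, choice))
        if feasible(assign) and (best is None or cost(assign) < cost(best)):
            best = assign
    return best

print(solve())  # both apps fit on the fog node, which is cheapest here
```

On this instance the fog node is cheaper for both applications because its transmission cost is much lower, mirroring the abstract's result that fog placement reduces data transmission relative to a cloud-only baseline.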
Funding: This research was supported by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2021-2016-0-00313) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation).
Abstract: The modern paradigm of the Internet of Things (IoT) has led to a significant increase in demand for latency-sensitive applications in fog-based cloud computing. However, such applications cannot meet strict quality of service (QoS) requirements in a cloud-only deployment. The large-scale deployment of IoT requires more effective use of network infrastructure to ensure QoS when processing big data. Generally, cloud-centric IoT application deployment involves different modules running on terminal devices and cloud servers. Fog devices with different computing capabilities must process the data generated by end devices, so deploying latency-sensitive applications in a heterogeneous fog computing environment is a difficult task. In addition, when there is an inconsistent connection delay between the fog and the terminal device, the deployment of such applications becomes more complicated. In this article, we propose an algorithm that can effectively place application modules on network nodes while considering connection delay, processing power, and sensed data volume. We conducted simulations in iFogSim to compare the algorithm against traditional cloud computing deployment. The simulation results verify the effectiveness of the proposed algorithm in terms of end-to-end delay and network consumption; moreover, latency and execution time are insensitive to the number of sensors.
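A simple greedy heuristic conveys the kind of latency-aware module placement this abstract discusses: prefer the fog node with the smallest connection delay that still has processing capacity, and fall back to the cloud when no fog node can host a module. This is an illustrative sketch under assumed data, not the paper's actual algorithm; node names, MIPS figures, and delays are hypothetical.

```python
# Illustrative greedy latency-aware module placement (hypothetical data;
# not the paper's algorithm). Larger modules are placed first so that
# big workloads are not crowded out by small ones.
def place_modules(modules, fog_nodes, cloud="cloud"):
    placement = {}
    remaining = {n: f["mips"] for n, f in fog_nodes.items()}  # spare MIPS per node
    for name, need in sorted(modules.items(), key=lambda m: -m[1]["mips"]):
        # Candidate fog nodes with enough spare capacity, ranked by delay.
        candidates = [
            (f["delay"], n) for n, f in fog_nodes.items()
            if remaining[n] >= need["mips"]
        ]
        if candidates:
            _, node = min(candidates)        # lowest-delay feasible node
            remaining[node] -= need["mips"]
            placement[name] = node
        else:
            placement[name] = cloud          # no fog node can host the module
    return placement

fogs = {"fog-a": {"mips": 1000, "delay": 2}, "fog-b": {"mips": 500, "delay": 5}}
mods = {"filter": {"mips": 800}, "detect": {"mips": 400}, "train": {"mips": 2000}}
print(place_modules(mods, fogs))
```

In this run the oversized "train" module overflows to the cloud while the two lighter modules land on fog nodes, which is the trade-off the abstract evaluates in iFogSim in terms of end-to-end delay and network consumption.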