Funding: This work was supported by the Institute for Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2019-0-01343, Training Key Talents in Industrial Convergence Security).
Abstract: With the emergence of the COVID-19 pandemic, the World Health Organization (WHO) has urged scientists and industrialists to explore modern information and communication technology (ICT) as a means to reduce or even eliminate it. The WHO recently reported that the virus may enter the body through any organ system, such as the respiratory, immune, nervous, digestive, or cardiovascular system. Targeting the abovementioned goal, we envision an implanted nanosystem embedded in the intra-living-body network. The main function of the nanosystem is either to perform diagnosis and mitigation of infectious diseases or to implement a targeted drug delivery system (i.e., delivery of the therapeutic drug to the diseased tissue or targeted cell). Communication among the nanomachines is accomplished via molecular diffusion, and the control and interconnection of the nanosystem are accomplished through the Internet of Bio-Nano Things (IoBNT). The proposed nanosystem employs a coded relay nanomachine governed by the decode-and-forward (DF) principle to ensure reliable drug delivery to the targeted cell. Notably, both the sensitivity of the drug dose and the loss of drug molecules over long distances before delivery to the target cell site, due to the diffusion process, are taken into account. In this paper, a coded relay NM with conventional coding techniques such as Reed-Solomon (RS) and Turbo codes is selected to achieve minimum bit error rate (BER) and high signal-to-noise ratio (SNR), while the detection process is based on the maximum likelihood (ML) and minimum error probability (MEP) criteria. The performance of the proposed scheme is evaluated in terms of channel capacity and bit error rate by varying system parameters such as the relay position, the number of released molecules, and the relay and receiver sizes. Analysis results are validated through simulation and demonstrate that the proposed scheme can significantly improve the delivery performance of the desired drugs in the molecular communication system.
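The detection step described above can be illustrated with a minimal simulation. The sketch below (Python; all parameter values are illustrative and not taken from the paper) models a fully absorbing spherical receiver, computes the standard diffusion-channel hit probability, and applies a simple count-threshold detector, which is what ML detection reduces to for on-off molecule release; the coded relay and RS/Turbo coding stages are omitted.

```python
import math
import random

def hit_probability(distance, radius, D, t):
    """Probability that a molecule released at `distance` from the center of a
    fully absorbing spherical receiver of `radius` is absorbed within time t
    (standard diffusion-channel result; D is the diffusion coefficient)."""
    return (radius / distance) * math.erfc((distance - radius) / (2.0 * math.sqrt(D * t)))

random.seed(1)
d, r, D, T = 4e-6, 1e-6, 1e-9, 0.1     # meters, meters, m^2/s, seconds (illustrative)
p = hit_probability(d, r, D, T)
N = 1000                               # molecules released for bit 1, none for bit 0

tx_bits = [random.randint(0, 1) for _ in range(200)]
counts = [sum(random.random() < p for _ in range(N)) if b else 0 for b in tx_bits]

threshold = max(1, int(N * p / 2))     # ML detection reduces to a count threshold
rx_bits = [1 if c >= threshold else 0 for c in counts]
ber = sum(a != b for a, b in zip(tx_bits, rx_bits)) / len(tx_bits)
```

Moving the receiver farther away (increasing `d`) or releasing fewer molecules lowers the hit probability and raises the BER, which is the trade-off the coded relay is designed to counteract.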
Funding: This work was supported by the Institute for Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2019-0-01343, Training Key Talents in Industrial Convergence Security), and by Research Cluster Project R20143 of the Zayed University Research Office.
Abstract: Load forecasting has received crucial research attention as a means to reduce peak load and contribute to the stability of the power grid using machine learning or deep learning models. In particular, an adequate model is needed to forecast the maximum load duration based on time-of-use, the electricity pricing policy, in order to achieve goals such as peak load reduction in a power grid. However, existing single machine learning or deep learning forecasting models cannot easily avoid overfitting. Moreover, a majority of ensemble or hybrid models do not achieve optimal results for forecasting the maximum load duration based on time-of-use. To overcome these limitations, we propose a hybrid deep learning architecture to forecast the maximum load duration based on time-of-use. Experimental results indicate that this architecture achieves the highest average of recall and accuracy (83.43%) compared with benchmark models. To verify the effectiveness of the architecture, a further experiment shows that an energy storage system (ESS) scheme driven by the forecasts of the proposed model (LSTM-MATO) could provide peak load cost savings of 17,535,700 KRW per year compared with the original peak load costs without the method. Therefore, the proposed architecture can be utilized for practical applications such as peak load reduction in the grid.
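To make the forecasting target concrete, the sketch below (Python, with a hypothetical 24-hour load profile and illustrative TOU period labels, none taken from the paper) computes the maximum load duration: the longest consecutive run of hours whose load exceeds a peak threshold inside the time-of-use "peak" periods. This is the quantity an ESS discharge schedule would be built around.

```python
# Hypothetical 24-hour load profile in kW (hours 0..23) and TOU period labels
load = [310, 300, 295, 290, 292, 300, 340, 420, 520, 560, 590, 600,
        580, 595, 605, 610, 600, 570, 540, 480, 430, 390, 350, 320]
tou = (['off'] * 8 + ['mid'] * 2 + ['peak'] * 2 +
       ['mid'] * 1 + ['peak'] * 5 + ['mid'] * 4 + ['off'] * 2)

def max_load_duration(load, tou, threshold):
    """Longest consecutive run of hours with load above `threshold`
    that fall inside TOU 'peak' periods."""
    best = run = 0
    for l, t in zip(load, tou):
        run = run + 1 if (t == 'peak' and l > threshold) else 0
        best = max(best, run)
    return best

duration = max_load_duration(load, tou, threshold=550)   # 5 hours, 13:00-17:00
```

An ESS scheduler would then discharge over the forecast window; misforecasting the duration either leaves the peak unshaved or drains the battery early, which is why recall is evaluated alongside accuracy.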
Funding: This study was supported by the Institute for Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2019-0-01343, Training Key Talents in Industrial Convergence Security).
Abstract: Millimeter-wave (mmWave) massive MIMO is one of the most effective technologies for fifth-generation (5G) wireless networks. It improves both spectral and energy efficiency by utilizing the 30-300 GHz millimeter-wave band and a large number of antennas at the base station. However, increasing the number of antennas requires a large number of radio frequency (RF) chains, which results in high power consumption. In order to reduce the energy and cost of the RF chains and provide desirable quality-of-service (QoS) to subscribers, this paper proposes an energy-efficient hybrid precoding algorithm for mmWave massive MIMO networks based on the idea of RF chain selection. A sparse digital precoding problem is formulated by utilizing the analog precoding codebook. It is then jointly solved through iterative fractional programming and successive convex approximation (SCA) techniques. Simulation results show that the proposed scheme outperforms existing schemes and effectively improves system performance under different operating conditions.
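The paper solves RF chain selection with iterative fractional programming and SCA; as a much simpler stand-in, the sketch below (Python/NumPy, all parameters illustrative) greedily adds RF chains while the energy efficiency, i.e., the sum rate divided by the total power with a fixed cost per active chain, keeps improving. It conveys only the idea that fewer active chains can be more energy-efficient, not the paper's actual optimization.

```python
import numpy as np

def sum_rate(H, chains, snr=10.0):
    """Achievable sum rate (bits/s/Hz) over the selected RF chains,
    via the log-det capacity formula with equal power per chain."""
    Hs = H[:, chains]
    G = np.eye(H.shape[0]) + (snr / len(chains)) * (Hs @ Hs.conj().T)
    return float(np.log2(np.linalg.det(G).real))

def select_chains(H, p_static=1.0, p_rf=0.5):
    """Greedy RF chain selection: keep adding the best chain while
    energy efficiency = rate / (static power + per-chain power) improves."""
    chains, best_ee = [], 0.0
    remaining = list(range(H.shape[1]))
    while remaining:
        cand = max(remaining, key=lambda c: sum_rate(H, chains + [c]))
        ee = sum_rate(H, chains + [cand]) / (p_static + (len(chains) + 1) * p_rf)
        if ee <= best_ee:          # one more active chain no longer pays for itself
            break
        chains.append(cand)
        remaining.remove(cand)
        best_ee = ee
    return chains, best_ee

rng = np.random.default_rng(0)
H = rng.standard_normal((4, 8))    # 4 users, 8 available RF chains (illustrative)
chains, ee = select_chains(H)
```

Raising the per-chain power `p_rf` makes the greedy loop stop earlier, mirroring the energy/rate trade-off the paper's formulation captures.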
Funding: This study was supported by the Institute for Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2019-0-01343, Training Key Talents in Industrial Convergence Security).
Abstract: The Internet of Things (IoT) has allowed for significant advancements in applications not only in the home, business, and environment, but also in factory automation. The Industrial Internet of Things (IIoT) brings all of the benefits of the IoT to industrial contexts, allowing for a wide range of applications, from remote sensing and actuation to decentralization and autonomy. The expansion of the IoT has been beset by serious security threats and obstacles, and one of the most pressing concerns is the secure exchange of IoT data with fine-grained access control. Since most existing range query schemes for fog-enhanced IoT cannot provide both multi-dimensional queries and privacy protection, a privacy-preserving multi-dimensional secure query technique for fog-enhanced IIoT is proposed. The query matrix is decomposed using auxiliary vectors, and each auxiliary vector is then processed with BGN homomorphic encryption to create a query trapdoor. Finally, the query trapdoor can be matched against sensor data using homomorphic computation performed by an IoT device terminal. With the application of particular auxiliary vectors, the spatial complexity can be efficiently reduced, while the homomorphic encryption property ensures the security of the sensor data and safeguards the privacy of the user's query pattern. Experimental results reveal that the computation and communication costs are modest.
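The space saving from the auxiliary-vector decomposition can be illustrated without the cryptography. In the sketch below (Python/NumPy, with hypothetical domain sizes and query ranges), a two-dimensional range query matrix is the outer product of two per-dimension indicator vectors, so storing the two vectors (m + n entries) replaces storing the full m x n matrix, and matching a reading needs only the product u[i] * v[j], the operation the scheme carries out on BGN ciphertexts.

```python
import numpy as np

m, n = 16, 16                       # quantized domain sizes of the two dimensions
# Range query: dimension 1 in [3, 9] and dimension 2 in [5, 12] (hypothetical)
u = np.array([1 if 3 <= i <= 9 else 0 for i in range(m)])
v = np.array([1 if 5 <= j <= 12 else 0 for j in range(n)])

M = np.outer(u, v)                  # full query matrix, built here only for comparison

def matches(reading, u, v):
    """A sensor reading (i, j) satisfies the range query iff u[i] * v[j] == 1;
    under BGN this single multiplication is evaluated homomorphically."""
    i, j = reading
    return u[i] * v[j] == 1

in_range = matches((5, 6), u, v)    # both coordinates inside their ranges
out_range = matches((2, 6), u, v)   # first coordinate outside [3, 9]
saving = (m * n) / (m + n)          # 256 stored entries shrink to 32
```

The indicator-vector form is what keeps the trapdoor small as the number of query dimensions grows.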
Funding: This work was supported by the Institute for Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2019-0-01343, Training Key Talents in Industrial Convergence Security), and by Research Cluster Project R20143 of the Zayed University Research Office.
Abstract: The end-to-end delay in a wired network is strongly dependent on congestion at intermediate nodes. Among the many feasible approaches to avoiding congestion efficiently, congestion-aware routing protocols tend to search for an uncongested path toward the destination through rule-based approaches in reactive/incident-driven and distributed methods. However, these previous approaches have difficulty accommodating changing network environments in an autonomous and self-adaptive manner. To overcome this drawback, we present a new congestion-aware routing protocol based on a Q-learning algorithm in software-defined networks, where logically centralized network operation enables intelligent control and management of network resources. In the proposed routing protocol, either one of the uncongested neighboring nodes is randomly selected as the next hop to distribute the traffic load over multiple paths, or the Q-learning algorithm is applied to decide the next hop by modeling the state, Q-value, and reward function to set the desired path toward the destination. A new reward function that combines buffer occupancy, link reliability, and hop count is considered. Moreover, a look-ahead algorithm is employed to update the Q-value with values within two hops simultaneously. This approach leads to the decision of the optimal next hop by taking the congestion status within two hops into account. Finally, the simulation results showed an approximately 20% higher packet delivery ratio and 15% shorter end-to-end delay compared with the existing scheme, by avoiding congestion adaptively.
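A toy version of the Q-learning next-hop decision can be sketched as follows (Python; the four-node topology, buffer occupancies, link reliabilities, and reward weights are all illustrative, and the two-hop look-ahead is reflected only through the bootstrapped next-hop Q-value). Node B is congested, so learning should steer traffic from A through C.

```python
import random
random.seed(0)

# Toy topology: node -> neighbors (an SDN controller would know this globally)
topo = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}
buffer_occ = {'B': 0.9, 'C': 0.2, 'D': 0.1}                # congestion per node
reliability = {('A', 'B'): 0.95, ('A', 'C'): 0.9,
               ('B', 'D'): 0.9, ('C', 'D'): 0.95}

def reward(cur, nxt):
    # Mirrors the paper's reward shape: low buffer occupancy and reliable links
    # are good, each extra hop costs a small penalty (weights are illustrative).
    return 0.5 * (1 - buffer_occ[nxt]) + 0.4 * reliability[(cur, nxt)] - 0.1

Q = {(s, a): 0.0 for s in topo for a in topo[s]}
alpha, gamma = 0.5, 0.9

for _ in range(200):                       # learning episodes from A toward D
    s = 'A'
    while s != 'D':
        a = random.choice(topo[s])         # explore a neighbor
        nxt_best = max((Q[(a, b)] for b in topo[a]), default=0.0)
        Q[(s, a)] += alpha * (reward(s, a) + gamma * nxt_best - Q[(s, a)])
        s = a

best = max(topo['A'], key=lambda a: Q[('A', a)])   # learned next hop at A
```

Because the bootstrap term folds in the downstream Q-value, the choice at A already reflects congestion one hop further along, the same intent as the paper's two-hop look-ahead update.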
Funding: This study was supported by the Fujitsu-Waseda Digital Annealer (FWDA) Research Project and the Fujitsu Co-Creation Research Laboratory at Waseda University (joint research between Waseda University and Fujitsu Laboratories), and partly by the School of Fundamental Science and Engineering, Faculty of Science and Engineering, Waseda University, Japan. It was also supported by the Institute for Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korean government (MSIT) (No. 2019-0-01343, Training Key Talents in Industrial Convergence Security), and by Research Cluster Project R20143 of the Zayed University Research Office.
Abstract: The Internet of Things (IoT) is the fourth technological revolution in the global information industry after computers, the Internet, and mobile communication networks. It combines radio-frequency identification devices, infrared sensors, global positioning systems, and various other technologies. Information-sensing equipment is connected via the Internet, forming a vast network. When these physical devices are connected to the Internet, the user terminal can be extended and expanded to exchange information, communicate with anything, and carry out identification, positioning, tracking, monitoring, and the triggering of corresponding events on each device in the network. In real life, the IoT has a wide range of applications covering many fields, such as smart homes, smart logistics, precision agriculture and animal husbandry, national defense, and the military. One of the most significant impairments in wireless channels is interference, which degrades system performance. Although the existing QR-decomposition-based signal detection method is an emerging topic because of its low complexity, it does not solve the problem of poor detection performance. Therefore, this study proposes a maximum-likelihood-based QR decomposition algorithm. The main idea is to detect the initial layer using the maximum likelihood principle; the remaining layers are then detected using reliable decisions, and the optimal candidate is selected from the feedback by deploying candidate points in unreliable scenarios. Simulation results show that the proposed algorithm effectively reduces interference and propagation error compared with the algorithms reported in the literature.
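The layer-by-layer structure that QR decomposition gives the detector can be sketched as follows (Python/NumPy, with BPSK symbols, an illustrative 4x4 channel, and a noiseless received vector so the back-substitution recovers the symbols exactly). The paper's contribution, an ML decision on the initial layer plus candidate points for unreliable decisions, is not reproduced here; this is only the plain QR successive-detection baseline it builds on.

```python
import numpy as np

rng = np.random.default_rng(42)
Nt = Nr = 4
H = rng.standard_normal((Nr, Nt))      # illustrative channel matrix
x = rng.choice([-1.0, 1.0], Nt)        # transmitted BPSK symbols
y = H @ x                              # received vector (noise omitted for clarity)

Q, R = np.linalg.qr(H)
z = Q.T @ y                            # rotate: z = R x, with R upper triangular

x_hat = np.zeros(Nt)
for k in range(Nt - 1, -1, -1):        # detect the last layer first, then cancel
    residual = z[k] - R[k, k+1:] @ x_hat[k+1:]   # subtract already-detected layers
    x_hat[k] = 1.0 if residual / R[k, k] >= 0 else -1.0   # slice to nearest symbol
```

With noise added (`y = H @ x + n`), a wrong decision in the last layer propagates upward through the cancellation step, which is exactly the error-propagation problem the proposed ML initial decision targets.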
Abstract: As wireless sensor networks become pervasive, new requirements have continuously emerged. However, most research efforts in wireless sensor networks are focused on the energy problem, since the nodes are usually battery-powered. Among these requirements, real-time communication is one of the big research challenges in wireless sensor networks because most query messages carry time information. To meet this requirement, several real-time medium access control (MAC) protocols have recently been proposed for wireless sensor networks, because the waiting time to share the medium at each node is one of the main sources of end-to-end delay. In this paper, we first introduce the specific requirements of a wireless sensor real-time MAC protocol. Then, a collection of recent wireless sensor real-time MAC protocols are surveyed, classified, and described, emphasizing their advantages and disadvantages wherever possible. Finally, we present a discussion of the challenges facing current wireless sensor real-time MAC protocols in the literature and draw conclusions.