The Internet of Things (IoT) provides better solutions in various fields, namely healthcare, smart transportation, smart homes, etc. Recognizing Denial of Service (DoS) attacks in IoT platforms is significant in certifying the accessibility and integrity of IoT systems. Deep learning (DL) models excel at detecting complex, non-linear relationships, allowing them to discern slight deviations from normal IoT activity that may indicate a DoS attack. The uninterrupted observation and real-time detection capabilities of DL contribute to accurate and rapid detection, permitting proactive mitigation measures to be executed, hence securing the IoT network's safety and functionality. Accordingly, this study presents a pigeon-inspired optimization with DL-based attack detection and classification (PIODL-ADC) approach for the IoT environment. The PIODL-ADC approach implements a hyperparameter-tuned DL method for Distributed Denial-of-Service (DDoS) attack detection on an IoT platform. Initially, the PIODL-ADC model utilizes Z-score normalization to scale input data into a uniform format. To handle the convoluted and adaptive behaviors of IoT traffic, the PIODL-ADC model employs the pigeon-inspired optimization (PIO) method for feature selection to detect the relevant features, considerably enhancing recognition accuracy. Also, the Elman Recurrent Neural Network (ERNN) model is utilized to recognize and classify DDoS attacks. Moreover, reptile search algorithm (RSA)-based hyperparameter tuning is employed to improve the precision and robustness of the ERNN method. A series of experimental validations is made to confirm the performance of the PIODL-ADC method. The experimental outcomes exhibited that the PIODL-ADC method achieves greater performance than existing models, with a maximum accuracy of 99.81%.
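The Z-score normalization used in the preprocessing step above can be sketched as follows (a minimal stdlib illustration, not the authors' implementation; the sample packet sizes are hypothetical):

```python
import math

def z_score_normalize(column):
    """Scale a list of feature values to zero mean and unit variance."""
    n = len(column)
    mean = sum(column) / n
    variance = sum((x - mean) ** 2 for x in column) / n
    std = math.sqrt(variance)
    if std == 0:  # constant feature: leave it centred at zero
        return [0.0] * n
    return [(x - mean) / std for x in column]

# Example: packet sizes (bytes) from a hypothetical IoT traffic trace
sizes = [60, 1500, 60, 540, 60]
normalized = z_score_normalize(sizes)
```

After normalization every feature column has mean 0 and variance 1, so features on very different scales (packet size vs. flow duration, say) contribute comparably to the model.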
With the continuous expansion of the Industrial Internet of Things (IIoT), more and more organisations are placing large amounts of data in the cloud to reduce overheads. However, the channel between cloud servers and smart equipment is not trustworthy, so the issue of data authenticity needs to be addressed. The SM2 digital signature algorithm can provide an authentication mechanism for data to solve such problems. Unfortunately, it still suffers from the problem of key exposure. To address this concern, this study introduces a key-insulated scheme, SM2-KI-SIGN, based on the SM2 algorithm. This scheme boasts strong key insulation and secure key updates. Our scheme uses elliptic curve arithmetic, which is not only more efficient but also more suitable for IIoT-cloud environments. Finally, the security proof of SM2-KI-SIGN is given under the Elliptic Curve Discrete Logarithm (ECDL) assumption in the random oracle model.
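The key-insulation idea above can be illustrated with a toy sketch: the per-period signing key is refreshed from a helper secret, so exposing one period's key does not reveal the keys of other periods. This is only the general concept under a plain-PRF assumption; the actual SM2-KI-SIGN construction performs algebraic key updates over an elliptic-curve group, not an HMAC derivation:

```python
import hashlib
import hmac

def derive_period_key(helper_secret: bytes, period: int) -> bytes:
    """Derive the signing key for one time period from the helper secret.

    Illustrative only: real key-insulated schemes (including SM2-KI-SIGN)
    update keys algebraically; here a PRF stands in for the update step.
    """
    return hmac.new(helper_secret, period.to_bytes(8, "big"), hashlib.sha256).digest()

helper = b"device-helper-secret"  # hypothetical helper key held offline
sk_t1 = derive_period_key(helper, 1)
sk_t2 = derive_period_key(helper, 2)
# Keys differ per period; compromising sk_t1 reveals nothing about sk_t2
# without the helper secret.
```

The design point is that the helper secret stays on a separate, mostly offline device, so an attacker who compromises the signer in period t obtains only that period's key.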
Energy conservation is a significant task in the Internet of Things (IoT) because the IoT involves highly resource-constrained devices. Clustering is an effective technique for saving energy by reducing duplicate data. In a clustering protocol, the selection of a cluster head (CH) plays a key role in prolonging the lifetime of a network. However, most cluster-based protocols, including routing protocols for low-power and lossy networks (RPLs), have used fuzzy logic and probabilistic approaches to select the CH node. Consequently, early battery depletion occurs near the sink. To overcome this issue, a lion optimization algorithm (LOA) for selecting the CH in RPL is proposed in this study. LOA-RPL comprises three processes: cluster formation, CH selection, and route establishment. A cluster is formed using the Euclidean distance. CH selection is performed using the LOA. Route establishment is implemented using residual energy information. An extensive simulation is conducted in the ns-3 network simulator on various parameters, such as network lifetime, power consumption, packet delivery ratio (PDR), and throughput. The performance of LOA-RPL is also compared with those of RPL, fuzzy rule-based energy-efficient clustering and immune-inspired routing (FEEC-IIR), and the routing scheme for IoT that uses the shuffled frog-leaping optimization algorithm (RISA-RPL). The proposed LOA-RPL increases network lifetime by 20% and PDR by 5%-10% compared with RPL, FEEC-IIR, and RISA-RPL, and is also highly energy-efficient compared with other similar routing protocols.
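Euclidean-distance cluster formation, as used in the first phase above, amounts to assigning each node to its nearest cluster centre. A minimal sketch (the coordinates and centres are hypothetical; the paper's actual formation step may differ in detail):

```python
import math

def euclidean(a, b):
    """Straight-line distance between two 2-D points."""
    return math.sqrt((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2)

def form_clusters(nodes, centres):
    """Assign each node to the nearest cluster centre (keyed by centre index)."""
    clusters = {i: [] for i in range(len(centres))}
    for node in nodes:
        nearest = min(range(len(centres)), key=lambda i: euclidean(node, centres[i]))
        clusters[nearest].append(node)
    return clusters

# Hypothetical node and centre coordinates (metres)
nodes = [(1, 1), (2, 2), (9, 9), (8, 7)]
centres = [(0, 0), (10, 10)]
clusters = form_clusters(nodes, centres)
```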
The performance of Wireless Sensor Networks (WSNs) is an important part of the Internet of Things (IoT), where the sensor nodes of current WSN-based IoT networks are attractive targets because of their limited resources. By grouping nodes, a clustering protocol offers a useful solution for ensuring the energy saving of nodes and Hybrid Media Access Control (HMAC) over the lifetime of the network. Nevertheless, current clustering schemes suffer from issues with the cluster structure that negatively impact the performance of these protocols. In this study, we propose an Improved Energy-Proficient Algorithm (IEPA) for HMAC throughout the lifetime of the WSN-based IoT. Three consecutive phases are proposed. First, an ideal number of clusters is determined for the coverage of balanced clusters. Then, fair static clusters are formed, based on an updated calculation for fuzzy cluster heads, to reduce and balance the energy use of the sensor nodes. Cluster heads (CHs) are, ultimately, selected in optimal locations, with rotation of the cluster-head role among cluster members. Specifically, the proposed protocol reduces and balances the energy consumption of nodes by improving the cluster structure, making the IEPA suitable for systems that must operate for a long time. The assessment results demonstrate that the IEPA performs better than existing protocols.
In recent years, Internet of Things technology has developed rapidly, and smart IoT devices have been widely popularized. A large amount of data is generated every moment; we are now in the era of big data in the Internet of Things. The rapid growth of massive data has brought great challenges to storage technology that traditional storage technology cannot cope with well, and the demand for massive data storage has given birth to cloud storage technology. Load balancing plays an important role in improving the performance and resource utilization of cloud storage systems, so it is of great practical significance to study how to improve both through load-balancing technology. Based on a study of Swift's read strategy, this article proposes a reread strategy based on load balancing of storage resources to solve the problem of the unbalanced read load across storage nodes caused by Swift's random replica selection. The strategy asynchronously tracks storage I/O load to select the storage node with the smallest load for reading. The experimental results indicate that the proposed strategy can achieve a better load-balancing state in terms of storage I/O utilization and CPU utilization than Swift's random read strategy.
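The core of the smallest-load replica selection described above can be sketched in a few lines. The node names and the load metric (e.g. observed I/O queue depth) are hypothetical; Swift itself tracks replicas per object ring, which this toy omits:

```python
def choose_replica(replica_loads):
    """Pick the storage node holding a replica with the smallest current load.

    `replica_loads` maps node name -> observed I/O load; both are
    illustrative stand-ins for the tracked per-node statistics.
    """
    return min(replica_loads, key=replica_loads.get)

# Hypothetical load snapshot for the three replicas of one object
loads = {"node-a": 0.72, "node-b": 0.15, "node-c": 0.40}
best = choose_replica(loads)
```

Compared with picking a replica uniformly at random, directing reads at the least-loaded holder evens out I/O utilization across nodes, which is exactly the imbalance the reread strategy targets.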
Interconnected devices and intelligent applications have slashed human intervention in the Internet of Things (IoT), making it possible to accomplish tasks with less human interaction. However, the IoT faces many problems, including low-capacity links, energy utilization, resource enhancement, and limited resources, due to its openness, heterogeneity, and extensiveness. In an IoT network constrained by limited resources, minimal routing control overhead is required without packet loss, and such constrained environments can be improved through an optimal routing protocol. It is challenging to route packets in such a constrained environment. Thus, this work is motivated to present an efficient routing protocol for enhancing the lifetime of the IoT network. The Lightweight On-demand Ad hoc Distance-vector Routing Protocol - Next Generation (LOADng) protocol is an extended version of the Ad Hoc On-Demand Distance Vector (AODV) protocol. Unlike AODV, LOADng is a lighter version that forbids the intermediate nodes on the route from sending a route reply (RREP) for a route request (RREQ) that originated from the source. A resource-constrained IoT network demands minimal routing control overhead and faster packet delivery. So, in this paper, the parameters of the LOADng routing protocol are optimized using the black widow optimization (BWO) algorithm to reduce control overhead and delay. Furthermore, the performance of the proposed model is analyzed against the default LOADng in terms of delay, delivery ratio, and overhead. The obtained results show that the LOADng-BWO protocol outperforms the conventional LOADng protocol.
In the Internet of Things (IoT), users have complex needs, and Web Service Composition (WSC) was introduced to address them. The WSC's main objective is to search for the optimal combination of web services in response to the user's needs and Quality of Service (QoS) constraints. The challenge of this problem is the huge number of web services that achieve similar functionality with different levels of QoS. In this paper, we introduce an extension of our previous works on the Artificial Bee Colony (ABC) and Bat Algorithm (BA). A new hybrid algorithm is proposed between ABC and BA to achieve a better trade-off between local exploitation and global search. The bat agent is used to improve the solutions of exhausted bees after a threshold (limit), and an Elitist Strategy (ES) is added to BA to increase the convergence rate. The performance and convergence behavior of the proposed hybrid algorithm were tested through extensive comparative experiments with current state-of-the-art nature-inspired algorithms on 12 benchmark datasets using three evaluation criteria (average fitness values, best fitness values, and execution time) measured over 30 different runs. These datasets are created from real-world data, as well as artificially, to form WSC datasets of different scales. The results show that the proposed algorithm enhances the search performance and convergence rate in finding the near-optimal web service combination compared to its competitors. The Wilcoxon signed-rank significance test shows that the proposed algorithm's results differ significantly from those of the other algorithms on 100% of the datasets.
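The exhausted-bee replacement at the heart of the hybrid can be sketched as follows. This is a toy single step on a sphere objective, under assumed parameter names (`limit`, `loudness`); it illustrates the idea of swapping the ABC scout phase for a bat-style move toward the best solution, not the paper's full ABC-BA algorithm:

```python
import random

random.seed(7)

def sphere(x):
    """Toy objective to minimise: sum of squares."""
    return sum(v * v for v in x)

def bat_local_move(solution, best, loudness=0.5):
    """Bat-style step: random-walk drift toward the global best solution."""
    return [b + loudness * random.uniform(-1, 1) * (s - b)
            for s, b in zip(solution, best)]

limit = 5  # trial threshold after which a food source counts as exhausted
population = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(4)]
trials = [6, 0, 2, 1]  # improvement-failure counters; index 0 is exhausted
best = min(population, key=sphere)
for i, t in enumerate(trials):
    if t > limit:
        # Instead of ABC's random scout reset, refresh via a bat move
        population[i] = bat_local_move(population[i], best)
        trials[i] = 0
```

The intended effect is that abandoned solutions restart near promising regions rather than uniformly at random, trading some exploration for faster convergence.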
The Industrial Internet of Things (IIoT) has brought numerous benefits, such as improved efficiency, smart analytics, and increased automation. However, it also exposes connected devices, users, applications, and the data they generate to cyber security threats that need to be addressed. This work investigates hybrid cyber threats (HCTs), which are now operating on an entirely new level with the increasingly adopted IIoT, and focuses on emerging methods to model, detect, and defend against hybrid cyber attacks using machine learning (ML) techniques. Specifically, a novel ML-based HCT modelling and analysis framework is proposed, in which L1 regularisation and Random Forest are used to cluster features and analyse the importance and impact of each feature in both individual threats and HCTs. A grey relational analysis-based model is employed to construct the correlation between IIoT components and different threats.
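Grey relational analysis, mentioned above for correlating IIoT components with threats, scores how closely each comparison sequence tracks a reference sequence. A minimal sketch with the customary distinguishing coefficient of 0.5 (the indicator values are hypothetical and assumed pre-normalised):

```python
def grey_relational_grades(reference, sequences, rho=0.5):
    """Grey relational grade of each comparison sequence w.r.t. the reference.

    rho is the distinguishing coefficient; 0.5 is the customary default.
    """
    deltas = [[abs(r - x) for r, x in zip(reference, seq)] for seq in sequences]
    d_min = min(min(row) for row in deltas)  # global minimum deviation
    d_max = max(max(row) for row in deltas)  # global maximum deviation
    grades = []
    for row in deltas:
        coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in row]
        grades.append(sum(coeffs) / len(coeffs))  # grade = mean coefficient
    return grades

# Hypothetical normalised indicators: one IIoT component vs. two threat profiles
reference = [1.0, 0.9, 0.8]
threats = [[0.9, 0.8, 0.7], [0.2, 0.1, 0.3]]
grades = grey_relational_grades(reference, threats)
```

A grade near 1 means the threat's indicator profile moves almost in lockstep with the component's, so the first hypothetical threat above correlates far more strongly than the second.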
As the deployment of Internet of Things (IoT) base stations becomes increasingly dense, interference management in the network becomes increasingly important. In the IoT, devices commonly use random access, joining the channel in a distributed manner. In massive-device IoT scenarios, severe interference may arise between nodes, causing serious degradation of network throughput. To address interference management in random-access networks, this work considers a multi-base-station slotted Aloha network with cooperative reception and uses reinforcement learning to design an adaptive transmission algorithm that manages interference, optimizes network throughput, and improves network fairness. First, a Q-learning-based adaptive transmission algorithm is designed; simulations verify that it maintains high network throughput under different traffic loads. Second, to improve fairness, the adaptive transmission algorithm is refined using a penalty-function method; simulations verify that the fairness-optimized algorithm greatly improves network fairness while preserving network throughput.
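The Q-learning update behind such an adaptive transmission policy can be sketched for a single node deciding, each slot, whether to transmit. This is a deliberately simplified single-state sketch with assumed parameters (learning rate, discount, exploration rate, contention level), not the paper's multi-base-station algorithm:

```python
import random

random.seed(0)

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
ACTIONS = ("wait", "transmit")
q_table = {a: 0.0 for a in ACTIONS}  # one state, two actions

def choose_action():
    """Epsilon-greedy action selection."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(q_table, key=q_table.get)

def update(action, reward):
    """Standard Q-learning update; with one state the successor is the same state."""
    best_next = max(q_table.values())
    q_table[action] += ALPHA * (reward + GAMMA * best_next - q_table[action])

# Toy episode: a transmission earns reward 1 when no collision occurs
for _ in range(200):
    action = choose_action()
    collided = random.random() < 0.3  # hypothetical contention probability
    reward = 1.0 if (action == "transmit" and not collided) else 0.0
    update(action, reward)
```

With a 70% success probability, the learned Q-value for transmitting comes to dominate waiting; in the multi-node setting, each node runs such an update so the population self-organizes its channel access.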
In IoT-based monitoring, users take part in sensing tasks in the network environment by contributing data collected by smart devices. Users monitor their data in the form of fog computing (mobile crowd monitoring). Service providers are required to pay user rewards without increasing platform costs. One NP-hard formulation for maximising the coverage rate while reducing the platform costs (rewards) is the Cooperative based Method for Smart sensing Tasks (CMST). This article applies chaos theory and fuzzy parameter setting to the forest optimisation algorithm. The proposed method is implemented in MATLAB. On average, the findings show that, by mapping the mobile crowd monitoring problem to meta-heuristic algorithms, the network coverage rate improves by 31% and the monitoring cost by 11% compared with the CMST scheme. Using the improved forest optimisation algorithm can thus reduce the costs of the mobile crowd monitoring platform while achieving a better coverage rate.
A smart medical service system architecture is proposed in this paper to increase medical resource utilization and improve the efficiency of the medical diagnosis process for complex business scenarios in the Medical Internet of Things (MIoT) environment. The resource representation model theory, a multi-terminal aggregation algorithm, and a resource discovery algorithm based on the latent factor model are also studied. A smart medical service system for the IoT environment is then developed, based on an open-source project. Experimental results using real-world datasets illustrate that the proposed smart medical service system architecture can promote the intelligent and efficient management of medical resources to an extent, and assists the development towards digitization, intelligence, and precision in the field of medicine.
The Internet of Vehicles (IoV) is an evolution of the Internet of Things (IoT) to improve the capabilities of vehicular ad-hoc networks (VANETs) in intelligent transport systems. The network topology in the IoV paradigm is highly dynamic. Clustering is one of the promising solutions to maintain route stability in such a dynamic network. However, existing algorithms consume a considerable amount of time in the cluster head (CH) selection process. Thus, this study proposes a mobility-aware dynamic clustering-based routing (MADCR) protocol for the IoV to maximize the lifespan of networks and reduce the end-to-end delay of vehicles. The MADCR protocol consists of cluster formation and CH selection processes. A cluster is formed on the basis of Euclidean distance. The CH is then chosen using the mayfly optimization algorithm (MOA). The CH subsequently receives vehicle data and forwards it to the Road Side Unit (RSU). The performance of the MADCR protocol is compared with that of Ant Colony Optimization (ACO), Comprehensive Learning Particle Swarm Optimization (CLPSO), and the Clustering Algorithm for the Internet of Vehicles based on the Dragonfly Optimizer (CAVDO). The proposed MADCR protocol decreases the end-to-end delay by 5-80 ms and increases the packet delivery ratio by 5%-15%.
In order to promote the development of the Internet of Things (IoT), there has been an increase in the coverage of the customer electric information acquisition system (CEIAS). The traditional fault location method for the distribution network only considers the information reported by the Feeder Terminal Unit (FTU), and the fault tolerance rate is low when information is omitted or misreported. Therefore, this study considers the influence of distributed generations (DGs) on the distribution network, takes the CEIAS as a redundant information source, and solves the model by applying a binary particle swarm optimization algorithm (BPSO). The improved Dempster-Shafer evidence theory (D-S evidence theory) is used for evidence fusion to achieve fault section location for the distribution network. An example is provided to verify that the proposed method can achieve single or multiple fault location with higher fault tolerance.
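The evidence-fusion step rests on Dempster's rule of combination, which merges two basic probability assignments while renormalizing away conflicting mass. A minimal sketch (the FTU/CEIAS mass values below are hypothetical; the paper uses an improved variant of the rule):

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    Masses are dicts mapping frozenset hypotheses to belief mass in [0, 1].
    """
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # incompatible hypotheses
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

# Hypothetical evidence about one section: FTU reports vs. CEIAS reports
FAULT, OK = frozenset({"fault"}), frozenset({"ok"})
EITHER = FAULT | OK  # ignorance: mass on the whole frame
m_ftu = {FAULT: 0.6, OK: 0.1, EITHER: 0.3}
m_ceias = {FAULT: 0.7, OK: 0.1, EITHER: 0.2}
fused = dempster_combine(m_ftu, m_ceias)
```

Because both sources lean toward "fault", fusion concentrates belief there (about 0.86 in this toy case), which is how redundant CEIAS evidence raises fault tolerance when FTU reports are noisy.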
Fog Computing (FC) provides processing and storage resources at the edge of the Internet of Things (IoT). By doing so, FC can help reduce the latency and improve the reliability of IoT networks. The energy consumption of servers and computing resources is one of the factors that directly affect operating costs in fog environments. Energy consumption can be reduced by effective scheduling methods that offload tasks onto the best possible resources. To deal with this problem, a binary model based on the combination of the Krill Herd Algorithm (KHA) and the Artificial Hummingbird Algorithm (AHA) is introduced as Binary KHA-AHA (BAHA-KHA). KHA is used to improve AHA. Also, the local-optimum problem of BAHA-KHA for task scheduling in FC environments is addressed using the dynamic voltage and frequency scaling (DVFS) method. The Heterogeneous Earliest Finish Time (HEFT) method is used to determine the order of task-flow execution. The goal of the BAHA-KHA model is to minimize the number of resources and the communication between dependent tasks, and to reduce energy consumption. In this paper, the FC environment is considered to address the workflow scheduling issue, reducing energy consumption and minimizing makespan on fog resources. The results were tested on five different workflows (Montage, CyberShake, LIGO, SIPHT, and Epigenomics). The evaluations show that the BAHA-KHA model has the best performance in comparison with the AHA, KHA, PSO, and GA algorithms. The BAHA-KHA model reduced the makespan by about 18% and the energy consumption by about 24% in comparison with GA.
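HEFT orders tasks by their upward rank: a task's rank is its own cost plus the most expensive path (communication plus rank) through any successor, and tasks are scheduled in decreasing rank order. A sketch over a hypothetical 4-task workflow (the DAG, costs, and communication weights are illustrative, and a single average cost per task stands in for HEFT's per-processor averages):

```python
def upward_ranks(tasks, succ, cost, comm):
    """Compute HEFT upward ranks:
    rank(t) = cost(t) + max over successors s of (comm(t, s) + rank(s))."""
    rank = {}

    def rank_of(t):
        if t not in rank:
            rank[t] = cost[t] + max(
                (comm.get((t, s), 0) + rank_of(s) for s in succ.get(t, [])),
                default=0,  # exit tasks have no successors
            )
        return rank[t]

    for t in tasks:
        rank_of(t)
    return rank

# Hypothetical diamond workflow: A -> B, A -> C, B -> D, C -> D
tasks = ["A", "B", "C", "D"]
succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"]}
cost = {"A": 3, "B": 4, "C": 2, "D": 1}
comm = {("A", "B"): 1, ("A", "C"): 1, ("B", "D"): 2, ("C", "D"): 2}
ranks = upward_ranks(tasks, succ, cost, comm)
order = sorted(tasks, key=ranks.get, reverse=True)  # scheduling priority
```

Here the entry task A ranks highest (3 + max(1+7, 1+5) = 11), so it is dispatched first; the metaheuristic then decides which fog resource each ordered task lands on.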
The adoption of Internet of Things (IoT) sensing devices is growing rapidly due to their ability to provide real-time services. However, they are constrained by limited data storage and processing power, so they offload their massive data streams to edge devices and the cloud for adequate storage and processing. This further leads to the challenges of data outliers, data redundancies, and cloud resource load balancing, which affect the execution and outcome of data streams. This paper presents a review of existing analytics algorithms deployed on IoT-enabled edge-cloud infrastructure that resolve the challenges of data outliers, data redundancies, and cloud resource load balancing. The review highlights the problems solved, the results, the weaknesses of the existing algorithms, and the physical and virtual cloud storage servers for resource load balancing. In addition, it discusses the adoption of the network protocols that govern the interaction between the three layers of the IoT sensing devices, edge, and cloud architecture, and their prevailing challenges. A total of 72 algorithms covering the categories of classification, regression, clustering, deep learning, and optimization have been reviewed. The classification approach has been widely adopted to solve the problem of redundant data, while clustering and optimization approaches are more often used for outlier detection and cloud resource allocation.
COVID-19 is a contagious disease, and its several variants have put all walks of life and the economy under stress. Early diagnosis of the virus is a crucial task to prevent its spread, as it is a threat to life across the whole world. However, with the advancement of technology, the Internet of Things (IoT), and the Social IoT (SIoT), the versatile data produced by smart devices has helped greatly in overcoming this lethal disease. Data mining is a technique that can be used for extracting useful information from massive data. In this study, we used five supervised ML strategies for creating a model to analyze and forecast the presence of COVID-19 using the Kaggle dataset "COVID-19 Symptoms and Presence." RapidMiner Studio ML software was used to apply the Decision Tree (DT), Random Forest (RF), K-Nearest Neighbors (K-NN), Naive Bayes (NB), and Iterative Dichotomiser 3 (ID3) algorithms. To develop the model, the performance of each algorithm was tested using 10-fold cross-validation and compared on major accuracy measures, Cohen's kappa statistic, correctly or incorrectly classified cases, and root mean square error. The results demonstrate that DT outperforms the other methods, with an accuracy of 98.42% and a root mean square error of 0.11. In the future, the devised model will be highly recommendable and supportive for early prediction/diagnosis of disease when provided with different datasets.
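The evaluation protocol above combines k-fold cross-validation with accuracy and root-mean-square-error metrics. A minimal stdlib sketch of those pieces (the sample sizes and label vectors are hypothetical; RapidMiner of course handles this internally):

```python
import math

def k_fold_indices(n, k=10):
    """Split indices 0..n-1 into k roughly equal, contiguous folds."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def accuracy(y_true, y_pred):
    """Fraction of correctly classified cases."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean square error over 0/1 predictions."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

folds = k_fold_indices(25, k=10)  # e.g. 25 samples -> folds of size 3 and 2
acc = accuracy([1, 0, 1, 1], [1, 0, 0, 1])
err = rmse([1, 0, 1, 1], [1, 0, 0, 1])
```

Each fold serves once as the held-out test set while the other nine train the model; the reported figures are the averages across the ten rounds.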
The purpose of this research is to provide an effective blockchain framework for secure transactions. The rate of effective data transactions and the interoperability of the ledger are the two major obstacles in blockchains; to tackle this issue, Cross-Chain based Transaction (CCT) is introduced. Traditional industries have been restructured by the introduction of the Internet of Things (IoT) to become smart industries through data-driven decision-making. Still, there are a few limitations, like decentralization, security vulnerabilities, poor interoperability, and privacy concerns in the IoT. To overcome these limitations, blockchain has been employed to assure a safer transaction process, especially in asset exchanges. In recent decades, scalable local ledgers have implemented blockchains while simultaneously sustaining peer validations of transactions at local or global levels. From a single Hyperledger-based blockchain system, the CCT takes the transaction amid various chains. In addition, the most significant factor in this registration processing strategy is the signature, which ensures security. The application of a quantum cryptographic algorithm amplifies the proposed Hyperledger-based blockchain and strengthens the safety of the process. The key is determined by restricting the number of transactions that reach the global blockchain using a quantum-based hash function, accomplished by scalable local ledgers and peer validations of transactions at local and global levels without any issues. The rate of transaction processing for all peers is enhanced with the ancillary aid of the proposed solution, as it includes a load-distribution procedure. Without added overhead, the recommended solution utilizes the current transaction strategy and also aims at scalability, resource conservation, and interoperability. The experimental results of the system have been evaluated using metrics such as block weight, ledger memory, central processing unit usage, and communication overhead.
The Internet of Things (IoT) and cloud computing are gaining popularity due to their numerous advantages, including the efficient utilization of internet and computing resources. In recent years, many more IoT applications have been extensively used. For instance, healthcare applications execute computations utilizing the user's private data stored on cloud servers. However, the main obstacles faced by the extensive acceptance and usage of these emerging technologies are security and privacy. Moreover, many healthcare data management system applications have emerged, offering solutions for distinct circumstances. Still, existing systems have issues with specific security concerns, the privacy-preserving rate, information loss, etc., which significantly reduce overall system performance. A unique blockchain-based technique is proposed to improve anonymity in terms of data access and data privacy to overcome the above-mentioned issues. Initially, the registration phase is done for the device and the user. After that, the Geo-Location and IP Address values collected during registration are converted into hash values using the Adler-32 hashing algorithm, and the private and public keys are generated using the key generation centre. Then, authentication is performed through login. The user submits a request to the blockchain server, which redirects the request to the associated IoT device in order to obtain the sensed IoT data. The sensed data is anonymized on the device and stored in the cloud server using the Linear Scaling based Rider Optimization algorithm with integrated KL Anonymity (LSR-KLA) approach. After that, the Timestamp-based Public and Private Key Schnorr Signature (TSPP-SS) mechanism is used to permit the authorized user to access the data, and the blockchain server tracks the entire transaction. The experimental findings showed that the proposed LSR-KLA and TSPP-SS techniques provide better performance in terms of a higher privacy-preserving rate and lower information loss, execution time, and Central Processing Unit (CPU) usage than the existing techniques. Thus, the proposed method allows for better data privacy in the smart healthcare network.
Funding (SM2-KI-SIGN paper): This work was supported in part by the National Natural Science Foundation of China (Nos. 62072074, 62076054, 62027827, 62002047), the Sichuan Science and Technology Innovation Platform and Talent Plan (Nos. 2020JDJQ0020, 2022JDJQ0039), the Sichuan Science and Technology Support Plan (Nos. 2020YFSY0010, 2022YFQ0045, 2022YFS0220, 2023YFG0148, 2021YFG0131), the YIBIN Science and Technology Support Plan (No. 2021CG003), and the Medico-Engineering Cooperation Funds from the University of Electronic Science and Technology of China (Nos. ZYGX2021YGLH212, ZYGX2022YGRH012).
Funding (LOA-RPL paper): This research was supported by the X-mind Corps program of the National Research Foundation of Korea (NRF), funded by the Ministry of Science and ICT (No. 2019H1D8A1105622), and by the Soonchunhyang University Research Fund.
Abstract: The performance of Wireless Sensor Networks (WSNs) is an important fragment of the Internet of Things (IoT), where the sensor nodes of current WSN-based IoT networks are enticing targets because of their critical resources. By grouping nodes, a clustering protocol offers a useful solution for ensuring the energy saving of nodes and Hybrid Media Access Control (HMAC) during the operation of the network. Nevertheless, current clustering schemes suffer from issues with the clustering structure that negatively impact the performance of these protocols. In this investigation, we recommend an Improved Energy-Proficient Algorithm (IEPA) for HMAC throughout the lifetime of the WSN-based IoT. Three consecutive stages are suggested. First, an ideal number of clusters is determined for the coverage of balanced clusters. Then, fair static clusters are formed, based on an updated calculation for fuzzy cluster heads, to reduce and balance the energy use of the sensor nodes. Cluster heads (CHs) are, ultimately, selected in optimal locations, with the CH role rotating among cluster members. Specifically, the proposed protocol diminishes and balances the energy utilization of nodes by improving the clustering structure, making the IEPA suitable for systems that must operate for a long time. The assessment results demonstrate that the IEPA performs better than existing protocols.
Funding: This work is supported by the Fundamental Research Funds for the Central Universities (Grant No. HIT.NSRIF.201714), the Weihai Science and Technology Development Program (2016DXGJMS15), the Key Research and Development Program of Shandong Province (2017GGX90103), and the Weihai Scientific Research and Innovation Fund (2020).
Abstract: In recent years, Internet of Things technology has developed rapidly and smart Internet of Things devices have become widely popularized; a large amount of data is generated every moment, placing us in the era of big data for the Internet of Things. The rapid growth of massive data has brought great challenges that traditional storage technology cannot cope with well, and the demand for massive data storage has given birth to cloud storage technology. Load balancing plays an important role in improving the performance and resource utilization of cloud storage systems, so it is of great practical significance to study how to improve both through load balancing. Building on a study of Swift's read strategy, this article proposes a reread strategy based on load balancing of storage resources, to solve the problem of unbalanced read load across storage devices caused by Swift's random selection of data copies. The system asynchronously tracks storage I/O statistics to select the storage with the smallest load for reading. The experimental results indicate that the proposed strategy achieves a better load-balancing state in terms of storage I/O utilization and CPU utilization than Swift's random read strategy.
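The core of the reread strategy described above can be sketched in a few lines. The class and its interface below are assumptions for illustration, not Swift's actual code: the point is only that replacing random replica selection with a least-loaded selection spreads reads across storage nodes.

```python
# Minimal sketch of load-balanced replica reads (assumed interface,
# not OpenStack Swift's implementation): the proxy tracks an I/O load
# estimate per storage node and reads from the least-loaded replica.
class LoadAwareReader:
    def __init__(self, replicas):
        # replicas: mapping object_id -> list of storage node names
        self.replicas = replicas
        self.load = {}  # node -> outstanding I/O count

    def read(self, object_id):
        nodes = self.replicas[object_id]
        # pick the replica on the node with the smallest tracked load
        node = min(nodes, key=lambda n: self.load.get(n, 0))
        self.load[node] = self.load.get(node, 0) + 1
        return node

    def complete(self, node):
        # a finished read frees capacity on that node
        self.load[node] -= 1


reader = LoadAwareReader({"obj": ["disk-a", "disk-b", "disk-c"]})
first = reader.read("obj")
second = reader.read("obj")
assert first != second  # the second read avoids the now-busier node
reader.complete(first)
third = reader.read("obj")
assert third == first   # freed capacity makes the node eligible again
```

A random strategy would, by contrast, sometimes stack consecutive reads on the same node, which is exactly the read-load imbalance the proposed strategy targets.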
Abstract: Interconnected devices and intelligent applications have slashed human intervention in the Internet of Things (IoT), making it possible to accomplish tasks with less human interaction. However, IoT faces many problems, including lower-capacity links, energy utilization, and limited resources, owing to its openness, heterogeneity, and extensiveness. It is challenging to route packets in such a constrained environment: an IoT network with limited resources demands minimal routing control overhead without packet loss, and such environments can be improved through an optimal routing protocol. This work is thus motivated to present an efficient routing protocol for enhancing the lifetime of the IoT network. The Lightweight On-demand Ad hoc Distance-vector Routing Protocol—Next Generation (LOADng) is an extended version of the Ad Hoc On-Demand Distance Vector (AODV) protocol. Unlike AODV, LOADng is a lighter version that forbids the intermediate nodes on the route from sending a route reply (RREP) for a route request (RREQ) originating from the source. A resource-constrained IoT network demands minimal routing control overhead and faster packet delivery. So, in this paper, the parameters of the LOADng routing protocol are optimized using the black widow optimization (BWO) algorithm to reduce the control overhead and delay. Furthermore, the performance of the proposed model is analyzed against the default LOADng in terms of delay, delivery ratio, and overhead. The obtained results show that the LOADng-BWO protocol outperforms the conventional LOADng protocol.
Funding: The authors extend their appreciation to the Deputyship for Research and Innovation, Ministry of Education in Saudi Arabia, for funding this research work through project number 2022/01/22636.
Abstract: In the Internet of Things (IoT), users have complex needs, and Web Service Composition (WSC) was introduced to address them. The WSC's main objective is to search for the optimal combination of web services in response to the user's needs and the required level of Quality of Service (QoS) constraints. The challenge lies in the huge number of web services that achieve similar functionality with different levels of QoS. In this paper, we introduce an extension of our previous works on the Artificial Bee Colony (ABC) and Bat Algorithm (BA). A new hybrid algorithm is proposed, combining ABC and BA to achieve a better trade-off between local exploitation and global search. The bat agent is used to improve the solutions of exhausted bees after a threshold (limit), and an Elitist Strategy (ES) is added to BA to increase the convergence rate. The performance and convergence behavior of the proposed hybrid algorithm were tested through extensive comparative experiments with current state-of-the-art nature-inspired algorithms on 12 benchmark datasets, using three evaluation criteria (average fitness values, best fitness values, and execution time) measured over 30 different runs. These datasets are created both from real-world data and artificially, to form WSC datasets of different scales. The results show that the proposed algorithm enhances the search performance and convergence rate in finding near-optimal web service combinations compared to its competitors. The Wilcoxon signed-rank test shows that the proposed algorithm's results differ significantly from those of the other algorithms on 100% of the datasets.
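What the hybrid ABC–BA actually searches over is a QoS fitness like the one sketched below: a composition picks one candidate service per abstract task, and its QoS is aggregated (response times add along the workflow, availabilities multiply). The weights, the aggregation formula, and the candidate values here are illustrative assumptions, not the paper's benchmark definition.

```python
# Sketch of a QoS fitness for web service composition: the search
# algorithms in the abstract optimise a score of this general shape.
# Weights and candidate QoS values are hypothetical.
def fitness(composition, w_time=0.5, w_avail=0.5):
    """Score a composition; lower total time and higher availability win."""
    total_time = sum(s["time"] for s in composition)
    total_avail = 1.0
    for s in composition:
        total_avail *= s["avail"]  # all services must be up
    # map time into (0, 1] so both terms reward larger values
    return w_time / (1.0 + total_time) + w_avail * total_avail


# Two candidates for task 1, one for task 2 (toy search space).
task1 = [{"time": 2.0, "avail": 0.99}, {"time": 1.0, "avail": 0.90}]
task2 = [{"time": 3.0, "avail": 0.95}]

# Exhaustive search stands in for ABC-BA on this tiny space.
best = max(((a, b) for a in task1 for b in task2),
           key=lambda comp: fitness(comp))
assert best[0]["avail"] == 0.99  # high availability outweighs extra time
```

The difficulty the paper addresses is that real WSC spaces are far too large to enumerate like this, which is why a metaheuristic with a good exploitation/exploration balance is needed.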
Abstract: The Industrial Internet of Things (IIoT) has brought numerous benefits, such as improved efficiency, smart analytics, and increased automation. However, it also exposes connected devices, users, applications, and the data they generate to cybersecurity threats that need to be addressed. This work investigates hybrid cyber threats (HCTs), which now operate on an entirely new level in the increasingly adopted IIoT, and focuses on emerging methods to model, detect, and defend against hybrid cyber attacks using machine learning (ML) techniques. Specifically, a novel ML-based HCT modelling and analysis framework is proposed, in which L1 regularisation and Random Forest are used to cluster features and analyse the importance and impact of each feature in both individual threats and HCTs. A grey relational analysis-based model is employed to construct the correlation between IIoT components and different threats.
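The role of L1 regularisation in the framework above is feature selection: its proximal (soft-thresholding) step drives the coefficients of weakly predictive features to exactly zero. The dependency-free sketch below shows that mechanism on hypothetical per-feature slopes; it is not the paper's pipeline, which pairs L1 with a Random Forest importance analysis.

```python
# Toy illustration of why L1 regularisation selects features: the
# soft-thresholding step zeroes out small coefficients entirely.
# The slope values are hypothetical standardized-regression outputs.
def soft_threshold(w, lam):
    """L1 proximal step: shrink towards zero, clipping small weights."""
    if w > lam:
        return w - lam
    if w < -lam:
        return w + lam
    return 0.0


# Feature 0 is strongly predictive of the threat label; feature 1 is
# mostly noise (toy numbers standing in for fitted coefficients).
slopes = [0.92, 0.07]
selected = [soft_threshold(w, lam=0.1) for w in slopes]
assert selected[0] > 0 and selected[1] == 0.0  # the weak feature drops out
```

An L2 penalty, by contrast, only shrinks coefficients without zeroing them, which is why L1 is the natural choice when the goal is to identify which features matter per threat.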
Abstract: As the deployment of Internet of Things (IoT) base stations becomes increasingly dense, interference management in such networks grows ever more important. In the IoT, devices often use random access to reach the channel in a distributed manner. In massive-device IoT scenarios, severe interference can arise between nodes, causing serious degradation of network throughput. To address interference management in random-access networks, this work considers a multi-base-station slotted Aloha network with cooperative reception and uses reinforcement learning to design adaptive transmission algorithms that manage interference, optimize network throughput, and improve network fairness. First, a Q-learning-based adaptive transmission algorithm is designed; simulations verify that it maintains high network throughput under different traffic loads. Second, to improve fairness, the adaptive transmission algorithm is refined using a penalty-function method; simulations verify that the fairness-optimized algorithm substantially improves network fairness while preserving the network's throughput.
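The Q-learning transmission idea described above can be sketched in its simplest stateless form: each node learns from success/collision feedback whether transmitting in a slot pays off. The two-node setup, reward values, and bandit-style formulation below are illustrative assumptions, not the paper's multi-base-station model with cooperative reception.

```python
# Hedged sketch of Q-learning for slotted Aloha: each node keeps a
# Q-value per action (silent/transmit), updated from per-slot rewards.
# A lone transmission succeeds (+1); simultaneous ones collide (-1).
import random

random.seed(0)

ACTIONS = (0, 1)  # 0 = stay silent, 1 = transmit


def train(nodes=2, slots=5000, alpha=0.1, eps=0.1):
    q = [[0.0, 0.0] for _ in range(nodes)]  # one Q-row per node
    for _ in range(slots):
        # epsilon-greedy action selection per node
        acts = [random.choice(ACTIONS) if random.random() < eps
                else max(ACTIONS, key=lambda a: q[n][a])
                for n in range(nodes)]
        transmitters = [n for n in range(nodes) if acts[n] == 1]
        for n in range(nodes):
            if acts[n] == 1:
                reward = 1.0 if len(transmitters) == 1 else -1.0
            else:
                reward = 0.0
            # stateless Q-update towards the observed reward
            q[n][acts[n]] += alpha * (reward - q[n][acts[n]])
    return q


q = train()
# After training, the two nodes should not both prefer transmitting,
# since that policy collides every slot.
prefer_tx = sum(1 for row in q if row[1] > row[0])
assert prefer_tx <= 1
```

Note that this simple reward pushes the network towards one node dominating the channel, which is precisely the fairness problem the abstract's penalty-function refinement is designed to correct.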
Abstract: In IoT-based mobile crowd sensing, users monitor tasks in the network environment by participating in the data collection process through smart devices, and their data is gathered in the form of fog computing (mobile crowd monitoring). Service providers are required to pay user rewards without increasing platform costs. One NP-hard formulation for maximising the coverage rate while reducing the platform costs (rewards) is the Cooperative-based Method for Smart Sensing Tasks (CMST). This article applies chaos theory and fuzzy parameter setting to the forest optimisation algorithm, and the proposed method is implemented in MATLAB. On average, the findings show that by mapping the mobile crowd monitoring problem to meta-heuristic algorithms, the network coverage rate is improved by 31% and the monitoring cost by 11% compared with the CMST scheme. Using the improved forest optimisation algorithm can thus reduce the costs of the mobile crowd monitoring platform while achieving a better coverage rate.
Funding: Supported by the National Key R&D Program of China (2018YFC1314901), the Natural Science Foundation of China (61871446), and the Scientific Research Starting Foundation for New Teachers of Nanjing University of Posts and Telecommunications (NY217033).
Abstract: A smart medical service system architecture is proposed in this paper to increase medical resource utilization and improve the efficiency of the medical diagnosis process for complex business scenarios in the Medical Internet of Things (MIoT) environment. The resource representation model theory, a multi-terminal aggregation algorithm, and a resource discovery algorithm based on the latent factor model are also studied. A smart medical service system for the IoT environment is then developed, based on an open source project. Experimental results using real-world datasets illustrate that the proposed architecture can promote the intelligent and efficient management of medical resources to a certain extent, and assists the development of the medical field towards digitization, intelligence, and precision.
Funding: This work was supported by the National Natural Science Foundation of China (No. 61821001) and the Science and Technology Key Project of Guangdong Province, China (2019B010157001).
Abstract: The Internet of Vehicles (IoV) is an evolution of the Internet of Things (IoT) that improves the capabilities of vehicular ad-hoc networks (VANETs) in intelligent transport systems. The network topology in the IoV paradigm is highly dynamic, and clustering is one of the promising solutions for maintaining route stability in such a dynamic network. However, existing algorithms consume a considerable amount of time in the cluster head (CH) selection process. Thus, this study proposes a mobility-aware dynamic clustering-based routing (MADCR) protocol for IoV to maximize the lifespan of networks and reduce the end-to-end delay of vehicles. The MADCR protocol consists of cluster formation and CH selection processes. A cluster is formed on the basis of Euclidean distance, and the CH is then chosen using the mayfly optimization algorithm (MOA). The CH subsequently receives vehicle data and forwards it to the Road Side Unit (RSU). The performance of the MADCR protocol is compared with those of Ant Colony Optimization (ACO), Comprehensive Learning Particle Swarm Optimization (CLPSO), and the Clustering Algorithm for Internet of Vehicles based on Dragonfly Optimizer (CAVDO). The proposed MADCR protocol decreases the end-to-end delay by 5–80 ms and increases the packet delivery ratio by 5%–15%.
Funding: Supported by the Science and Technology Project of State Grid Shandong Electric Power Company, "Research on the Data-Driven Method for Energy Internet" (Project No. 2018A-100).
Abstract: To promote the development of the Internet of Things (IoT), there has been an increase in the coverage of the customer electric information acquisition system (CEIAS). The traditional fault location method for the distribution network considers only the information reported by the Feeder Terminal Unit (FTU), and its fault tolerance is low when information is omitted or misreported. Therefore, this study considers the influence of distributed generations (DGs) on the distribution network, takes the CEIAS as a redundant information source, and solves the location model by applying a binary particle swarm optimization algorithm (BPSO). An improved Dempster–Shafer evidence theory (D-S evidence theory) is used for evidence fusion to achieve fault section location for the distribution network. An example is provided to verify that the proposed method can locate single or multiple faults with a higher fault tolerance.
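The evidence-fusion step mentioned above rests on Dempster's combination rule, sketched below for the simplest case of two mass functions over two singleton hypotheses ("fault" / "no fault"). The frame of discernment and the mass values are toy illustrations, not the paper's improved D-S formulation, which additionally handles conflicting and incomplete reports.

```python
# Minimal sketch of Dempster's rule for fusing two evidence sources,
# e.g. FTU reports with redundant CEIAS data (toy mass functions).
def combine(m1, m2):
    """Dempster's rule for two mass functions over the same singletons."""
    hypotheses = m1.keys()
    # mass assigned to contradictory pairs is the conflict K
    conflict = sum(m1[a] * m2[b]
                   for a in hypotheses for b in hypotheses if a != b)
    k = 1.0 - conflict  # renormalise over the non-conflicting mass
    return {h: m1[h] * m2[h] / k for h in hypotheses}


ftu = {"fault": 0.8, "no_fault": 0.2}    # evidence from the FTU
ceias = {"fault": 0.7, "no_fault": 0.3}  # redundant CEIAS evidence
fused = combine(ftu, ceias)
assert fused["fault"] > 0.9              # agreeing sources reinforce belief
assert abs(sum(fused.values()) - 1.0) < 1e-9
```

When the two sources agree, fusion sharpens the belief beyond either source alone (here 0.8 and 0.7 fuse to about 0.90), which is what gives the method its tolerance to a single omitted or misreported input.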
Abstract: Fog Computing (FC) provides processing and storage resources at the edge of the Internet of Things (IoT). By doing so, FC can help reduce latency and improve the reliability of IoT networks. The energy consumption of servers and computing resources is one of the factors that directly affect operating costs in fog environments. Energy consumption can be reduced by effective scheduling methods that offload tasks onto the best possible resources. To deal with this problem, a binary model based on the combination of the Krill Herd Algorithm (KHA) and the Artificial Hummingbird Algorithm (AHA) is introduced as Binary KHA-AHA (BAHA-KHA), in which KHA is used to improve AHA. The local-optimum problem of BAHA-KHA for task scheduling in FC environments is addressed using the dynamic voltage and frequency scaling (DVFS) method, and the Heterogeneous Earliest Finish Time (HEFT) method is used to determine the order of task-flow execution. The goal of the BAHA-KHA model is to minimize the number of resources used, the communication between dependent tasks, and the energy consumption. In this paper, the FC environment is considered to address the workflow scheduling issue so as to reduce energy consumption and minimize makespan on fog resources. The results were tested on five different workflows (Montage, CyberShake, LIGO, SIPHT, and Epigenomics). The evaluations show that the BAHA-KHA model performs best in comparison with the AHA, KHA, PSO, and GA algorithms, reducing the makespan by about 18% and the energy consumption by about 24% compared with GA.
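The HEFT ordering step mentioned above works by ranking tasks with an "upward rank": a task's average compute cost plus the longest (communication-weighted) path from it to the exit task, scheduled in decreasing rank order. The DAG and cost numbers below are a toy workflow for illustration, not one of the paper's benchmark workflows.

```python
# Sketch of HEFT's upward-rank computation over a small task DAG.
# cost: average compute cost per task; comm: edge communication cost.
from functools import lru_cache

cost = {"A": 3, "B": 2, "C": 4, "D": 1}
succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
comm = {("A", "B"): 1, ("A", "C"): 2, ("B", "D"): 1, ("C", "D"): 1}


@lru_cache(maxsize=None)
def upward_rank(task):
    """cost(task) + max over children of (comm + child's upward rank)."""
    children = succ[task]
    if not children:
        return cost[task]  # exit task: rank is its own cost
    return cost[task] + max(comm[(task, c)] + upward_rank(c)
                            for c in children)


# HEFT schedules tasks in decreasing upward-rank order.
order = sorted(succ, key=upward_rank, reverse=True)
assert order[0] == "A" and order[-1] == "D"  # entry first, exit last
```

This ranking is only the ordering half of HEFT; in the scheme above, the BAHA-KHA search then decides which fog resource (and, via DVFS, which voltage/frequency level) each ordered task runs on.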
Abstract: The adoption of Internet of Things (IoT) sensing devices is growing rapidly due to their ability to provide real-time services. However, these devices are constrained by limited data storage and processing power, so they offload their massive data streams to edge devices and the cloud for adequate storage and processing. This in turn leads to the challenges of data outliers, data redundancy, and cloud resource load balancing, which affect the execution and outcome of data streams. This paper presents a review of existing analytics algorithms deployed on IoT-enabled edge-cloud infrastructure that have addressed these challenges. The review highlights the problems solved, the results, the weaknesses of the existing algorithms, and the physical and virtual cloud storage servers for resource load balancing. In addition, it discusses the adoption of the network protocols that govern the interaction in the three-layer architecture of IoT sensing devices, edge, and cloud, together with the prevailing challenges. A total of 72 algorithms covering the categories of classification, regression, clustering, deep learning, and optimization have been reviewed. The classification approach has been widely adopted to solve the problem of redundant data, while clustering and optimization approaches are more often used for outlier detection and cloud resource allocation.
Funding: Supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2021R1A5A1021944 and 2021R1A5A1021944), and by the Kyungpook National University Research Fund, 2020.
Abstract: COVID-19 is a contagious disease, and its several variants have put all walks of life, and the economy as well, under stress. Early diagnosis of the virus is a crucial task to prevent its spread, as it is a threat to life across the whole world. With the advancement of technology, however, the Internet of Things (IoT) and social IoT (SIoT), together with the versatile data produced by smart devices, have helped greatly in overcoming this lethal disease. Data mining is a technique that can be used to extract useful information from massive data. In this study, we used five supervised ML strategies to create a model that analyzes and forecasts the presence of COVID-19 using the Kaggle dataset "COVID-19 Symptoms and Presence." RapidMiner Studio ML software was used to apply the Decision Tree (DT), Random Forest (RF), K-Nearest Neighbors (K-NN), Naive Bayes (NB), and Iterative Dichotomiser 3 (ID3) algorithms. To develop the model, the performance of each algorithm was tested using 10-fold cross-validation and compared on major accuracy measures, Cohen's kappa statistic, correctly or incorrectly classified cases, and root mean square error. The results demonstrate that DT outperforms the other methods, with an accuracy of 98.42% and a root mean square error of 0.11. In the future, such a model will be highly valuable and supportive for early prediction and diagnosis of the disease given different datasets.
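The 10-fold cross-validation procedure used to evaluate the models above can be sketched dependency-free. To keep the example self-contained, a trivial majority-class baseline stands in for the actual classifiers (the paper used RapidMiner's DT, RF, K-NN, NB, and ID3); the mechanics of splitting, training on nine folds, and scoring on the held-out fold are the same.

```python
# Sketch of k-fold cross-validation with a majority-class baseline.
# The label vector is a toy stand-in for the Kaggle symptom dataset.
def k_fold_accuracy(labels, k=10):
    """Average held-out accuracy of a majority-class predictor."""
    folds = [labels[i::k] for i in range(k)]  # round-robin split
    accs = []
    for i in range(k):
        # train on the other k-1 folds: here, just find the majority class
        train = [y for j, f in enumerate(folds) if j != i for y in f]
        majority = max(set(train), key=train.count)
        # score on the held-out fold
        test = folds[i]
        accs.append(sum(y == majority for y in test) / len(test))
    return sum(accs) / k


labels = [1] * 80 + [0] * 20  # toy: 80% positive cases
acc = k_fold_accuracy(labels, k=10)
assert abs(acc - 0.8) < 1e-9  # the baseline scores the majority rate
```

The 0.8 baseline here also illustrates why accuracy alone can mislead on imbalanced data, which is why the study additionally reports Cohen's kappa and root mean square error.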
Abstract: The purpose of this research is to provide an effective blockchain framework for secure transactions. The rate of effective data transactions and the interoperability of the ledger are the two major obstacles in Blockchain; to tackle them, Cross-Chain based Transaction (CCT) is introduced. Traditional industries have been restructured by the introduction of the Internet of Things (IoT) into smart industries through data-driven decision-making, yet a few limitations remain, such as decentralization, security vulnerabilities, poor interoperability, and privacy concerns in IoT. To overcome these limitations, Blockchain has been employed to assure a safer transaction process, especially in asset exchanges. In recent decades, scalable local ledgers have implemented Blockchains while simultaneously sustaining peer validation of transactions at local or global levels. From a single Hyperledger-based blockchain system, the CCT carries transactions amid various chains. In addition, the most significant factor in this registration processing strategy is the signature, which ensures security. The application of a quantum cryptographic algorithm strengthens the proposed Hyperledger-based blockchain and the safety of the process. The key is determined by restricting the number of transactions that reach the global Blockchain using a quantum-based hash function, accomplished through scalable local ledgers and peer validation of transactions at local and global levels without any issues. The rate of transaction processing for the peers is enhanced with the ancillary aid of the proposed solution, as it includes a load distribution procedure. Without further enhancement, the recommended solution utilizes the current transaction strategy and is aimed at scalability, resource conservation, and interoperability. The experimental results of the system have been evaluated using metrics such as block weight, ledger memory, the usage of the central processing unit, and the communication overhead.
Abstract: The Internet of Things (IoT) and Cloud computing are gaining popularity due to their numerous advantages, including the efficient utilization of internet and computing resources. In recent years, many more IoT applications have come into extensive use. For instance, healthcare applications execute computations utilizing the user's private data stored on cloud servers. However, the main obstacles faced by the extensive acceptance and usage of these emerging technologies are security and privacy. Moreover, many healthcare data management system applications have emerged, offering solutions for distinct circumstances; but the existing systems still have issues with specific security matters, the privacy-preserving rate, information loss, etc., which reduce overall system performance significantly. A unique blockchain-based technique is proposed to improve anonymity in terms of data access and data privacy and thereby overcome the above-mentioned issues. Initially, a registration phase is carried out for the device and the user. After that, the Geo-Location and IP Address values collected during registration are converted into hash values using the Adler-32 hashing algorithm, and the private and public keys are generated by the key generation centre. Authentication is then performed through login. The user submits a request to the blockchain server, which redirects the request to the associated IoT device in order to obtain the sensed IoT data. The sensed data is anonymized on the device and stored in the cloud server using the Linear Scaling based Rider Optimization algorithm with integrated KL Anonymity (LSR-KLA) approach. After that, the Time-stamp-based Public and Private Key Schnorr Signature (TSPP-SS) mechanism is used to permit the authorized user to access the data, while the blockchain server tracks the entire transaction. The experimental findings showed that the proposed LSR-KLA and TSPP-SS techniques provide better performance in terms of a higher privacy-preserving rate and lower information loss, execution time, and Central Processing Unit (CPU) usage than the existing techniques. Thus, the proposed method allows for better data privacy in the smart healthcare network.
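The registration-phase digest described above can be sketched with the standard library's Adler-32 implementation. Binding the Geo-Location and IP values into a single record before hashing is an assumption about the scheme made for illustration; the exact encoding the paper uses is not specified in the abstract.

```python
# Sketch of the registration digest: device metadata is checksummed
# with Adler-32 (zlib's implementation). The record format below is a
# hypothetical illustration, not the paper's exact encoding.
import zlib


def register_device(geo_location, ip_address):
    """Digest the identifying metadata into one Adler-32 value."""
    record = f"{geo_location}|{ip_address}".encode("utf-8")
    return zlib.adler32(record) & 0xFFFFFFFF  # unsigned 32-bit checksum


h1 = register_device("51.5074,-0.1278", "192.168.1.10")
h2 = register_device("51.5074,-0.1278", "192.168.1.10")
h3 = register_device("51.5074,-0.1278", "192.168.1.11")
assert h1 == h2  # deterministic for identical metadata
assert h1 != h3  # a different IP yields a different checksum
```

Worth noting: Adler-32 is a fast checksum rather than a cryptographic hash, so in the scheme above the actual security rests on the key generation centre and the Schnorr-signature (TSPP-SS) mechanism, with Adler-32 serving only as a compact identifier for the registration values.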