Journal Articles
567 articles found
Internet of Things Enabled DDoS Attack Detection Using Pigeon Inspired Optimization Algorithm with Deep Learning Approach
1
Authors: Turki Ali Alghamdi, Saud S. Alotaibi 《Computers, Materials & Continua》 SCIE EI 2024, No. 9, pp. 4047-4064 (18 pages)
The Internet of Things (IoT) provides better solutions in various fields, namely healthcare, smart transportation, the home, etc. Recognizing Denial of Service (DoS) outbreaks in IoT platforms is significant in certifying the accessibility and integrity of IoT systems. Deep learning (DL) models excel at detecting complex, non-linear relationships, allowing them to effectually detect slight deviations from normal IoT activities that may indicate a DoS outbreak. The uninterrupted observation and real-time detection capabilities of DL contribute to accurate and rapid detection, permitting proactive mitigation actions to be executed and hence securing the IoT network's safety and functionality. Accordingly, this study presents a pigeon-inspired optimization with deep-learning-based attack detection and classification (PIODL-ADC) approach in an IoT environment. The PIODL-ADC approach implements a hyperparameter-tuned DL method for Distributed Denial-of-Service (DDoS) attack detection in an IoT platform. Initially, the PIODL-ADC model utilizes Z-score normalization to scale input data into a uniform format. To handle the complex and adaptive behaviors of IoT, the PIODL-ADC model employs the pigeon-inspired optimization (PIO) method for feature selection to identify the relevant features, considerably enhancing recognition accuracy. Also, the Elman Recurrent Neural Network (ERNN) model is utilized to recognize and classify DDoS attacks. Moreover, reptile search algorithm (RSA) based hyperparameter tuning is employed to improve the precision and robustness of the ERNN method. A series of experimental validations is conducted to verify the performance of the PIODL-ADC method. The experimental outcomes show that the PIODL-ADC method performs better than existing models, with a maximum accuracy of 99.81%.
Keywords: internet of things, denial of service, deep learning, reptile search algorithm, feature selection
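As a note on the preprocessing step named in this abstract, Z-score normalization is the standard zero-mean, unit-variance transform. The sketch below is a minimal illustration with made-up IoT traffic features, not code from the paper.

```python
import numpy as np

def z_score_normalize(X, eps=1e-12):
    """Scale each feature column to zero mean and unit variance."""
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / (sigma + eps)  # eps guards against constant columns

# Illustrative IoT traffic features: [packet_rate, mean_packet_size, flow_duration]
X = np.array([[120.0, 540.0, 3.2],
              [4300.0, 60.0, 0.4],   # burst that might indicate a DoS flood
              [150.0, 510.0, 2.9]])
print(z_score_normalize(X))
```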
An Efficient and Provably Secure SM2 Key-Insulated Signature Scheme for Industrial Internet of Things
2
Authors: Senshan Ouyang, Xiang Liu, Lei Liu, Shangchao Wang, Baichuan Shao, Yang Zhao 《Computer Modeling in Engineering & Sciences》 SCIE EI 2024, No. 1, pp. 903-915 (13 pages)
With the continuous expansion of the Industrial Internet of Things (IIoT), more and more organisations are placing large amounts of data in the cloud to reduce overheads. However, the channel between cloud servers and smart equipment is not trustworthy, so the issue of data authenticity needs to be addressed. The SM2 digital signature algorithm can provide an authentication mechanism for data to solve such problems. Unfortunately, it still suffers from the problem of key exposure. To address this concern, this study introduces a key-insulated scheme, SM2-KI-SIGN, based on the SM2 algorithm. The scheme offers strong key insulation and secure key updates. It uses elliptic curve arithmetic, which is not only more efficient but also more suitable for IIoT-cloud environments. Finally, the security proof of SM2-KI-SIGN is given under the Elliptic Curve Discrete Logarithm (ECDL) assumption in the random oracle model.
Keywords: key-insulated, SM2 algorithm, digital signature, industrial internet of things (IIoT), provable security
LOA-RPL: Novel Energy-Efficient Routing Protocol for the Internet of Things Using Lion Optimization Algorithm to Maximize Network Lifetime
3
Authors: Sankar Sennan, Somula Ramasubbareddy, Anand Nayyar, Yunyoung Nam, Mohamed Abouhawwash 《Computers, Materials & Continua》 SCIE EI 2021, No. 10, pp. 351-371 (21 pages)
Energy conservation is a significant task in the Internet of Things (IoT) because IoT involves highly resource-constrained devices. Clustering is an effective technique for saving energy by reducing duplicate data. In a clustering protocol, the selection of a cluster head (CH) plays a key role in prolonging the lifetime of a network. However, most cluster-based protocols, including routing protocols for low-power and lossy networks (RPLs), have used fuzzy logic and probabilistic approaches to select the CH node. Consequently, early battery depletion occurs near the sink. To overcome this issue, a lion optimization algorithm (LOA) for selecting the CH in RPL is proposed in this study. LOA-RPL comprises three processes: cluster formation, CH selection, and route establishment. A cluster is formed using the Euclidean distance. CH selection is performed using LOA. Route establishment is implemented using residual energy information. An extensive simulation is conducted in the network simulator ns-3 on various parameters, such as network lifetime, power consumption, packet delivery ratio (PDR), and throughput. The performance of LOA-RPL is also compared with those of RPL, fuzzy rule-based energy-efficient clustering and immune-inspired routing (FEEC-IIR), and the routing scheme for IoT that uses the shuffled frog-leaping optimization algorithm (RISA-RPL). The performance evaluation metrics used in this study are network lifetime, power consumption, PDR, and throughput. The proposed LOA-RPL increases network lifetime by 20% and PDR by 5%–10% compared with RPL, FEEC-IIR, and RISA-RPL. LOA-RPL is also highly energy-efficient compared with other similar routing protocols.
Keywords: internet of things, cluster head, clustering protocol, optimization algorithm, lion optimization algorithm, network lifetime, routing protocol, wireless sensor networks, energy consumption, low-power and lossy networks
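The cluster-formation step described above (nodes grouped by Euclidean distance to the nearest cluster head) can be sketched in a few lines. The node positions and CH list are invented, and the LOA-based CH selection itself is not reproduced.

```python
import math

def form_clusters(nodes, cluster_heads):
    """Assign each node to the nearest cluster head by Euclidean distance."""
    clusters = {ch: [] for ch in cluster_heads}
    for node, (x, y) in nodes.items():
        nearest = min(cluster_heads,
                      key=lambda ch: math.dist((x, y), nodes[ch]))
        clusters[nearest].append(node)
    return clusters

# Hypothetical node positions (metres) and CHs chosen by the optimizer
nodes = {"n1": (0, 0), "n2": (10, 2), "n3": (52, 48), "n4": (49, 51), "n5": (3, 8)}
cluster_heads = ["n1", "n3"]
print(form_clusters(nodes, cluster_heads))
```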
An Energy-Efficient Protocol for Internet of Things Based Wireless Sensor Networks
4
Authors: Mohammed Mubarak Mustafa, Ahmed Abelmonem Khalifa, Korhan Cengiz, Nikola Ivkovic 《Computers, Materials & Continua》 SCIE EI 2023, No. 5, pp. 2397-2412 (16 pages)
The performance of Wireless Sensor Networks (WSNs) is an important fragment of the Internet of Things (IoT), where the current WSN-built IoT network's sensor hubs are enticing due to their critical resources. By grouping hubs, a clustering convention offers a useful solution for ensuring energy saving of hubs and Hybrid Media Access Control (HMAC) during the course of the organization. Nevertheless, current grouping standards suffer from issues with the grouping structure that impact the performance of these conventions negatively. In this investigation, we recommend an Improved Energy-Proficient Algorithm (IEPA) for HMAC throughout the lifetime of the WSN-based IoT. Three consecutive segments are suggested. For the covering of adjusted clusters, an ideal number of clusters is determined first. Then, fair static clusters are shaped, based on an updated calculation for fuzzy cluster heads, to reduce and adapt the energy use of the sensor hubs. Cluster heads (CHs) are, ultimately, selected in optimal locations, with rotation of the cluster-head role among cluster members. Specifically, the proposed convention diminishes and balances the energy utilization of hubs by improving the grouping structure, where the IEPA is reasonable for systems that need a long lifetime. The assessment results demonstrate that the IEPA performs better than existing conventions.
Keywords: energy consumption, improved energy-proficient algorithm, internet of things, wireless sensor network
Exploration on the Load Balancing Technique for Platform of Internet of Things
5
Authors: Donglei Lu, Dongjie Zhu, Yundong Sun, Haiwen Du, Xiaofang Li, Rongning Qu, Yansong Wang, Ning Cao, Helen Min Zhou 《Computer Systems Science & Engineering》 SCIE EI 2021, No. 9, pp. 339-350 (12 pages)
In recent years, Internet of Things technology has developed rapidly, and smart Internet of Things devices have become widely popularized. A large amount of data is generated every moment, and we are now in the era of big data in the Internet of Things. The rapid growth of massive data has brought great challenges to storage technology, which traditional storage technology cannot cope with well. The demand for massive data storage has given birth to cloud storage technology. Load balancing plays an important role in improving the performance and resource utilization of cloud storage systems, so it is of great practical significance to study how load balancing can improve them. Building on a study of Swift's read strategy, this article proposes a reread strategy based on load balancing of storage resources to address the unbalanced read load caused by random replica selection in Swift. The storage system asynchronously tracks I/O activity to select the storage node with the smallest load for asynchronous reading. The experimental results indicate that the proposed strategy achieves a better load-balancing state in terms of storage I/O utilization and CPU utilization than Swift's random read strategy.
Keywords: internet of things, cloud storage, Swift, load balancing, scheduling algorithm
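A minimal sketch of the reread idea in this abstract, reading from the replica on the least-loaded storage node instead of a random one, is shown below; the load metric and node names are assumptions, not Swift's actual interfaces.

```python
import random

def pick_replica_random(replicas):
    """Swift-style random read: any replica, regardless of load."""
    return random.choice(replicas)

def pick_replica_least_loaded(replicas, load):
    """Load-balanced reread: choose the replica on the least-loaded node."""
    return min(replicas, key=lambda node: load[node])

replicas = ["storage-1", "storage-2", "storage-3"]
load = {"storage-1": 0.82, "storage-2": 0.35, "storage-3": 0.67}  # e.g. I/O utilisation
print(pick_replica_random(replicas), pick_replica_least_loaded(replicas, load))
```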
Optimized Tuning of LOADng Routing Protocol Parameters for IoT
6
Authors: Divya Sharma, Sanjay Jain, Vivek Maik 《Computer Systems Science & Engineering》 SCIE EI 2023, No. 8, pp. 1549-1561 (13 pages)
Interconnected devices and intelligent applications have slashed human intervention in the Internet of Things (IoT), making it possible to accomplish tasks with less human interaction. However, the IoT faces many problems, including lower-capacity links, energy utilization, resource enhancement and limited resources, owing to its openness, heterogeneity and extensiveness. It is challenging to route packets in such a constrained environment: minimal routing control overhead is required without packet loss, and such constrained environments can be improved through an optimal routing protocol. Thus, this work is motivated to present an efficient routing protocol for enhancing the lifetime of the IoT network. The Lightweight On-demand Ad hoc Distance-vector Routing Protocol—Next Generation (LOADng) is an extended version of the Ad Hoc On-Demand Distance Vector (AODV) protocol. Unlike AODV, LOADng is a lighter version that forbids intermediate nodes on the route from sending a route reply (RREP) for a route request (RREQ) originated by the source. A resource-constrained IoT network demands minimal routing control overhead and faster packet delivery. So, in this paper, the parameters of the LOADng routing protocol are optimized using the black widow optimization (BWO) algorithm to reduce the control overhead and delay. Furthermore, the performance of the proposed model is compared with the default LOADng in terms of delay, delivery ratio and overhead. The results show that the LOADng-BWO protocol outperforms the conventional LOADng protocol.
Keywords: internet of things, optimization algorithm, routing protocol, LOADng
Hybridizing Artificial Bee Colony with Bat Algorithm for Web Service Composition
7
Authors: Tariq Ahamed Ahanger, Fadl Dahan, Usman Tariq 《Computer Systems Science & Engineering》 SCIE EI 2023, No. 8, pp. 2429-2445 (17 pages)
In the Internet of Things (IoT), users have complex needs, and Web Service Composition (WSC) was introduced to address them. The WSC's main objective is to search for the optimal combination of web services in response to the user needs and the required level of Quality of Service (QoS) constraints. The challenge of this problem is the huge number of web services that achieve similar functionality with different levels of QoS. In this paper, we introduce an extension of our previous works on the Artificial Bee Colony (ABC) and Bat Algorithm (BA). A new hybrid algorithm between ABC and BA is proposed to achieve a better trade-off between local exploitation and global search. The bat agent is used to improve the solutions of exhausted bees after a threshold (limit), and an Elitist Strategy (ES) is added to BA to increase the convergence rate. The performance and convergence behavior of the proposed hybrid algorithm were tested through extensive comparative experiments with current state-of-the-art nature-inspired algorithms on 12 benchmark datasets, using three evaluation criteria (average fitness values, best fitness values, and execution time) measured over 30 different runs. These datasets are created from real-world datasets and artificially, to form WSC datasets of different scale sizes. The results show that the proposed algorithm enhances the search performance and convergence rate in finding the near-optimal web service combination compared to competitors. The Wilcoxon signed-rank significance test is used, and the proposed algorithm's results differ significantly from the other algorithms on 100% of the datasets.
Keywords: internet of things, artificial bee colony, bat algorithm, elitist strategy, web service composition
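Candidate compositions in a WSC search such as the ABC-BA hybrid above are typically ranked by an aggregated QoS fitness. The sketch below uses one common aggregation (additive response time and cost, multiplicative reliability) as an assumed form, not the paper's exact fitness function.

```python
def composition_fitness(services, w_time=0.4, w_cost=0.3, w_rel=0.3):
    """Aggregate QoS of one candidate composition (one service per task)."""
    total_time = sum(s["time"] for s in services)   # additive attribute
    total_cost = sum(s["cost"] for s in services)   # additive attribute
    reliability = 1.0
    for s in services:
        reliability *= s["rel"]                      # multiplicative attribute
    # Smaller time/cost and larger reliability should give a larger fitness.
    return w_rel * reliability - w_time * total_time - w_cost * total_cost

# Hypothetical candidate: one concrete service chosen for each abstract task
candidate = [{"time": 0.12, "cost": 0.05, "rel": 0.99},
             {"time": 0.30, "cost": 0.02, "rel": 0.97},
             {"time": 0.08, "cost": 0.10, "rel": 0.995}]
print(composition_fitness(candidate))
```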
A Review of Hybrid Cyber Threats Modelling and Detection Using Artificial Intelligence in IIoT
8
Authors: Yifan Liu, Shancang Li, Xinheng Wang, Li Xu 《Computer Modeling in Engineering & Sciences》 SCIE EI 2024, No. 8, pp. 1233-1261 (29 pages)
The Industrial Internet of Things (IIoT) has brought numerous benefits, such as improved efficiency, smart analytics, and increased automation. However, it also exposes connected devices, users, applications, and the data they generate to cyber security threats that need to be addressed. This work investigates hybrid cyber threats (HCTs), which now operate on an entirely new level with the increasingly adopted IIoT. It focuses on emerging methods to model, detect, and defend against hybrid cyber attacks using machine learning (ML) techniques. Specifically, a novel ML-based HCT modelling and analysis framework is proposed, in which L1 regularisation and Random Forest are used to cluster features and analyse the importance and impact of each feature in both individual threats and HCTs. A grey relation analysis-based model is employed to construct the correlation between IIoT components and different threats.
Keywords: cyber security, industrial internet of things, artificial intelligence, machine learning algorithms, hybrid cyber threats
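The feature-analysis step mentioned above (L1 regularisation plus Random Forest importance ranking) maps onto standard scikit-learn components. The sketch below runs on synthetic data and is only an outline of that idea, not the proposed framework.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for labelled IIoT traffic (benign vs. hybrid attack)
X, y = make_classification(n_samples=2000, n_features=20, n_informative=6,
                           random_state=0)

# L1-penalised model: coefficients driven to zero indicate droppable features
l1 = LogisticRegression(penalty="l1", solver="liblinear", C=0.1).fit(X, y)
kept = [i for i, c in enumerate(l1.coef_[0]) if abs(c) > 1e-6]

# Random Forest importance on the retained features
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:, kept], y)
ranking = sorted(zip(kept, rf.feature_importances_), key=lambda t: -t[1])
print("retained features:", kept)
print("top-5 by importance:", ranking[:5])
```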
Reinforcement Learning-Based Channel Access Mechanism for Slotted Aloha Networks with Multi-Base-Station Cooperative Reception
9
Authors: Huang Yuankang, Zhan Wen, Sun Xinghua 《物联网学报》 2024, No. 2, pp. 26-35 (10 pages)
As the deployment of Internet of Things (IoT) base stations becomes increasingly dense, interference management in the network becomes increasingly important. In the IoT, devices usually adopt random access and access the channel in a distributed manner. In massive-device IoT scenarios, severe interference may arise between nodes, causing network throughput to degrade sharply. To address the interference-management problem in random access networks, this paper considers a multi-base-station slotted Aloha network with cooperative reception and uses reinforcement learning tools to design an adaptive transmission algorithm that manages interference, optimizes network throughput, and improves network fairness. First, a Q-learning-based adaptive transmission algorithm is designed, and simulations verify that it maintains high network throughput under different traffic loads. Second, to improve fairness, a penalty-function method is used to refine the adaptive transmission algorithm, and simulations verify that the fairness-oriented algorithm greatly improves network fairness while preserving network throughput.
Keywords: reinforcement learning, internet of things, random access, multi-base-station network, slotted Aloha
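A toy sketch of the Q-learning ingredient in this abstract, where each device learns from collision feedback whether to transmit or stay silent in a slot, is given below. The single-state formulation, reward values, and single-receiver collision model are simplifying assumptions and do not reproduce the multi-base-station cooperative-reception setup.

```python
import random

ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1   # learning rate, discount, exploration
ACTIONS = (0, 1)                     # 0 = stay silent, 1 = transmit

def run(n_devices=8, n_slots=20000):
    Q = [[0.0, 0.0] for _ in range(n_devices)]   # one-state Q-table per device
    successes = 0
    for _ in range(n_slots):
        acts = [random.choice(ACTIONS) if random.random() < EPS
                else max(ACTIONS, key=lambda a: Q[d][a])
                for d in range(n_devices)]
        senders = [d for d, a in enumerate(acts) if a == 1]
        success = (len(senders) == 1)            # exactly one sender: no collision
        successes += success
        for d, a in enumerate(acts):
            r = (1.0 if success else -1.0) if a == 1 else 0.0
            Q[d][a] += ALPHA * (r + GAMMA * max(Q[d]) - Q[d][a])
    return successes / n_slots

print("slotted-Aloha throughput with learned policies ~", run())
```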
Simulation of the IoT ALOHA Algorithm Based on Python Anaconda
10
Authors: Yang Qunwei, Guo Liqing, Wang Ning 《软件导刊》 2024, No. 8, pp. 151-155 (5 pages)
The ALOHA algorithm is a key method by which RFID technology resolves multi-tag collisions, and it has traditionally been simulated and verified with MATLAB. However, access to MATLAB at universities has become restricted as a result of the competition between China and the United States, and Python big-data analysis tools have become an effective substitute. This paper first introduces the working principle of the ALOHA algorithm, then derives the mathematical expression of its theoretical throughput using combinatorics and limits, and finally uses Anaconda, a Python platform for big-data analysis and scientific computing, to implement collision simulation and throughput analysis for the classical ALOHA algorithm. The approach supports teaching and research on the ALOHA algorithm and works around the restrictions on MATLAB use.
Keywords: internet of things, ALOHA algorithm, data visualization, computer simulation, Python Anaconda
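The classical results behind this abstract are the throughput expressions S = G·e^(−G) for slotted ALOHA and S = G·e^(−2G) for pure ALOHA, where G is the offered load. A small NumPy sketch comparing the slotted formula against a Monte-Carlo collision count is shown below; it illustrates the general idea of such an Anaconda-based simulation, not the paper's code.

```python
import numpy as np

def slotted_aloha_sim(G, n_slots=200_000, rng=np.random.default_rng(0)):
    """Monte-Carlo throughput: a slot succeeds iff exactly one frame arrives."""
    arrivals = rng.poisson(G, size=n_slots)   # offered load G frames/slot on average
    return np.mean(arrivals == 1)

for G in (0.5, 1.0, 1.5):
    theory_slotted = G * np.exp(-G)       # classical slotted-ALOHA throughput
    theory_pure = G * np.exp(-2 * G)      # classical pure-ALOHA throughput
    print(f"G={G:.1f}  sim(slotted)={slotted_aloha_sim(G):.3f}  "
          f"theory(slotted)={theory_slotted:.3f}  theory(pure)={theory_pure:.3f}")
```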
Improving mobile mass monitoring in the IoT environment based on Fog computing using an improved forest optimization algorithm
11
Authors: Tahere Motedayen, Mahdi Yaghoobi, Maryam Kheirabadi 《Journal of Control and Decision》 EI 2024, No. 1, pp. 36-49 (14 pages)
In IoT-based mobile mass monitoring, users take part in the data collection process through smart devices and monitor their data in the form of fog computing. Service providers are required to pay user rewards without increasing platform costs. One of the NP-hard formulations for maximising the coverage rate while reducing the platform costs (rewards) is the Cooperative Based Method for Smart Sensing Tasks (CMST). This article applies chaos theory and fuzzy parameter setting to the forest optimisation algorithm. The proposed method is implemented in MATLAB. On average, the network coverage rate is improved by 31% and the monitoring cost by 11% compared with the CMST scheme and with mapping the mobile mass monitoring problem onto other meta-heuristic algorithms. Using the improved forest optimisation algorithm can therefore reduce the costs of the mobile crowd monitoring platform while achieving a better coverage rate.
Keywords: internet of things, mobile mass monitoring, forest optimization algorithm, chaos theory, fuzzy system
A Study on Service-Oriented Smart Medical Systems Combined with Key Algorithms in the IoT Environment (Cited by 5)
12
Authors: Shan Lu, Anzhi Wang, Shenqi Jing, Tao Shan, Xin Zhang, Yongan Guo, Yun Liu 《China Communications》 SCIE CSCD 2019, No. 9, pp. 235-249 (15 pages)
A smart medical service system architecture is proposed in this paper to increase medical resource utilization and improve the efficiency of the medical diagnosis process for complex business scenarios in the Medical Internet of Things (MIoT) environment. The resource representation model theory, a multi-terminal aggregation algorithm, and a resource discovery algorithm based on the latent factor model are also studied. A smart medical service system within the IoT environment is then developed based on an open-source project. Experimental results using real-world datasets illustrate that the proposed smart medical service system architecture can promote the intelligent and efficient management of medical resources to an extent, and assists the development of the field of medicine towards digitization, intelligence, and precision.
Keywords: internet of things, resource representation model, resource discovery algorithm, smart medical service system
MADCR: Mobility Aware Dynamic Clustering-Based Routing Protocol in Internet of Vehicles (Cited by 9)
13
Authors: Sankar Sennan, Somula Ramasubbareddy, Sathiyabhama Balasubramaniyam, Anand Nayyar, Chaker Abdelaziz Kerrache, Muhammad Bilal 《China Communications》 SCIE CSCD 2021, No. 7, pp. 69-85 (17 pages)
Internet of Vehicles (IoV) is an evolution of the Internet of Things (IoT) to improve the capabilities of vehicular ad-hoc networks (VANETs) in intelligent transport systems. The network topology in the IoV paradigm is highly dynamic. Clustering is one of the promising solutions to maintain route stability in such a dynamic network. However, existing algorithms consume a considerable amount of time in the cluster head (CH) selection process. Thus, this study proposes a mobility aware dynamic clustering-based routing (MADCR) protocol in IoV to maximize the lifespan of networks and reduce the end-to-end delay of vehicles. The MADCR protocol consists of cluster formation and CH selection processes. A cluster is formed on the basis of Euclidean distance. The CH is then chosen using the mayfly optimization algorithm (MOA). The CH subsequently receives vehicle data and forwards it to the Road Side Unit (RSU). The performance of the MADCR protocol is compared with that of Ant Colony Optimization (ACO), Comprehensive Learning Particle Swarm Optimization (CLPSO), and the Clustering Algorithm for Internet of Vehicles based on the Dragonfly Optimizer (CAVDO). The proposed MADCR protocol decreases the end-to-end delay by 5–80 ms and increases the packet delivery ratio by 5%–15%.
Keywords: clustering protocol, internet of things, internet of vehicles, optimization algorithm, mayfly algorithm
Fault location of distribution networks based on multi-source information (Cited by 8)
14
Authors: Wenbo Li, Jianjun Su, Xin Wang, Jiamei Li, Qian Ai 《Global Energy Interconnection》 2020, No. 1, pp. 77-85 (9 pages)
To promote the development of the Internet of Things (IoT), there has been an increase in the coverage of the customer electric information acquisition system (CEIAS). The traditional fault location method for the distribution network considers only the information reported by the Feeder Terminal Unit (FTU), and the fault tolerance rate is low when information is omitted or misreported. Therefore, this study considers the influence of distributed generations (DGs) on the distribution network, takes the CEIAS as a redundant information source, and solves the model by applying a binary particle swarm optimization (BPSO) algorithm. An improved Dempster-Shafer evidence theory (D-S evidence theory) is used for evidence fusion to achieve fault section location for the distribution network. An example is provided to verify that the proposed method can locate single or multiple faults with higher fault tolerance.
Keywords: internet of things, multi-source information, D-S evidence theory, binary particle swarm optimization algorithm, fault tolerance
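The fusion step relies on Dempster's rule of combination. A minimal two-source sketch over a toy frame {faulted, healthy} is given below; the mass values standing in for FTU and CEIAS evidence are invented, and the paper's improvements to the rule are not included.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: fuse two mass functions defined over frozenset focal elements."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc                    # mass assigned to the empty set
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

F, H = frozenset({"faulted"}), frozenset({"healthy"})
U = F | H                                          # total ignorance
m_ftu   = {F: 0.7, H: 0.2, U: 0.1}                 # evidence from FTU reports (toy values)
m_ceias = {F: 0.6, H: 0.3, U: 0.1}                 # evidence from CEIAS data (toy values)
print(dempster_combine(m_ftu, m_ceias))
```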
The Application of Hybrid Krill Herd Artificial Hummingbird Algorithm for Scientific Workflow Scheduling in Fog Computing (Cited by 1)
15
Authors: Aveen Othman Abdalrahman, Daniel Pilevarzadeh, Shafi Ghafouri, Ali Ghaffari 《Journal of Bionic Engineering》 SCIE EI CSCD 2023, No. 5, pp. 2443-2464 (22 pages)
Fog Computing (FC) provides processing and storage resources at the edge of the Internet of Things (IoT). By doing so, FC can help reduce latency and improve the reliability of IoT networks. The energy consumption of servers and computing resources is one of the factors that directly affect operating costs in fog environments. Energy consumption can be reduced by effective scheduling methods so that tasks are offloaded to the best possible resources. To deal with this problem, a binary model based on the combination of the Krill Herd Algorithm (KHA) and the Artificial Hummingbird Algorithm (AHA) is introduced as Binary KHA-AHA (BAHA-KHA). KHA is used to improve AHA. Also, the BAHA-KHA local-optimum problem for task scheduling in FC environments is addressed using the dynamic voltage and frequency scaling (DVFS) method. The Heterogeneous Earliest Finish Time (HEFT) method is used to determine the order of task flow execution. The goal of the BAHA-KHA model is to minimize the number of resources and the communication between dependent tasks, and to reduce energy consumption. In this paper, the FC environment is considered to address the workflow scheduling issue, reducing energy consumption and minimizing makespan on fog resources. The results were tested on five different workflows (Montage, CyberShake, LIGO, SIPHT, and Epigenomics). The evaluations show that the BAHA-KHA model performs best in comparison with the AHA, KHA, PSO and GA algorithms, reducing makespan by about 18% and energy consumption by about 24% in comparison with GA.
Keywords: workflow scheduling, fog computing, internet of things, hummingbird algorithm, krill algorithm
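HEFT, used above to order the task flow, ranks tasks by upward rank: a task's mean computation cost plus the largest sum of communication cost and rank over its successors. A compact sketch on a made-up four-task DAG follows; the cost numbers are illustrative, not taken from the evaluated workflows.

```python
from functools import lru_cache

# Hypothetical DAG: task -> list of (successor, communication cost)
succ = {"t1": [("t2", 4), ("t3", 6)], "t2": [("t4", 2)], "t3": [("t4", 3)], "t4": []}
w = {"t1": 10, "t2": 7, "t3": 12, "t4": 5}   # mean computation cost per task

@lru_cache(maxsize=None)
def upward_rank(task):
    """rank_u(t) = w(t) + max over successors s of (comm(t, s) + rank_u(s))."""
    tail = max((c + upward_rank(s) for s, c in succ[task]), default=0)
    return w[task] + tail

order = sorted(succ, key=upward_rank, reverse=True)   # schedule highest rank first
print({t: upward_rank(t) for t in order})
print("HEFT task order:", order)
```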
IoT data analytic algorithms on edge-cloud infrastructure: A review
16
Authors: Abel E. Edje, M. S. Abd Latiff, Weng Howe Chan 《Digital Communications and Networks》 SCIE CSCD 2023, No. 6, pp. 1486-1515 (30 pages)
The adoption of Internet of Things (IoT) sensing devices is growing rapidly due to their ability to provide real-time services. However, they are constrained by limited data storage and processing power, and so offload their massive data streams to edge devices and the cloud for adequate storage and processing. This in turn leads to the challenges of data outliers, data redundancies, and cloud resource load balancing, which affect the execution and outcome of data streams. This paper presents a review of existing analytics algorithms deployed on IoT-enabled edge-cloud infrastructure that address the challenges of data outliers, data redundancies, and cloud resource load balancing. The review highlights the problems solved, the results, the weaknesses of the existing algorithms, and the physical and virtual cloud storage servers used for resource load balancing. In addition, it discusses the network protocols that govern the interaction between the three layers of IoT sensing devices, edge, and cloud, and their prevailing challenges. A total of 72 algorithms covering the categories of classification, regression, clustering, deep learning, and optimization have been reviewed. The classification approach has been widely adopted to solve the problem of redundant data, while clustering and optimization approaches are more used for outlier detection and cloud resource allocation.
Keywords: internet of things, cloud platform, edge, analytic algorithms, processes, network communication protocols
COVID-19 Outbreak Prediction by Using Machine Learning Algorithms
17
Authors: Tahir Sher, Abdul Rehman, Dongsun Kim 《Computers, Materials & Continua》 SCIE EI 2023, No. 1, pp. 1561-1574 (14 pages)
COVID-19 is a contagious disease, and its several variants have put all walks of life, as well as the economy, under stress. Early diagnosis of the virus is a crucial task to prevent its spread, as it is a threat to life across the whole world. With the advancement of technology, the Internet of Things (IoT), and social IoT (SIoT), the versatile data produced by smart devices have helped greatly in overcoming this lethal disease. Data mining is a technique for extracting useful information from massive data. In this study, we used five supervised ML strategies to create a model to analyze and forecast the presence of COVID-19 using the Kaggle dataset "COVID-19 Symptoms and Presence". RapidMiner Studio ML software was used to apply the Decision Tree (DT), Random Forest (RF), K-Nearest Neighbors (K-NN), Naive Bayes (NB), and Iterative Dichotomiser 3 (ID3) algorithms. The performance of each model was tested using 10-fold cross-validation and compared on major accuracy measures, Cohen's kappa statistic, correctly or incorrectly classified cases, and root mean square error. The results demonstrate that DT outperforms the other methods, with an accuracy of 98.42% and a root mean square error of 0.11. In the future, the devised model will be highly recommendable and supportive for early prediction/diagnosis of disease given different data sets.
Keywords: COVID-19 prediction, COVID-19 analysis, machine learning (ML) algorithms, internet of things (IoT), social IoT (SIoT)
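The evaluation protocol described above (10-fold cross-validation of DT, RF, k-NN, and NB compared on accuracy) corresponds to a few lines of scikit-learn. The sketch below substitutes a synthetic dataset for the Kaggle "COVID-19 Symptoms and Presence" file and omits RapidMiner.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import BernoulliNB

# Synthetic symptom-style data standing in for the Kaggle dataset
X, y = make_classification(n_samples=3000, n_features=20, n_informative=8,
                           random_state=42)

models = {"DT": DecisionTreeClassifier(random_state=42),
          "RF": RandomForestClassifier(n_estimators=100, random_state=42),
          "K-NN": KNeighborsClassifier(n_neighbors=5),
          "NB": BernoulliNB()}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.4f} (+/- {scores.std():.4f})")
```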
An Interoperability Cross-Block Chain Framework for Secure Transactions in IoT
18
Authors: N. Anand Kumar, A. Grace Selvarani, P. Vivekanandan 《Computer Systems Science & Engineering》 SCIE EI 2023, No. 10, pp. 1077-1090 (14 pages)
The purpose of this research is to provide an effective blockchain framework for secure transactions. The rate of effective data transactions and the interoperability of the ledger are the two major obstacles in blockchain, and to tackle them, Cross-Chain based Transaction (CCT) is introduced. Traditional industries have been restructured by the introduction of the Internet of Things (IoT) to become smart industries through data-driven decision-making. Still, there are a few limitations, such as decentralization, security vulnerabilities, poor interoperability, and privacy concerns in IoT. To overcome these limitations, blockchain has been employed to assure a safer transaction process, especially in asset exchanges. In recent decades, scalable local ledgers have implemented blockchains while sustaining peer validation of transactions at local or global levels. From a single Hyperledger-based blockchain system, the CCT carries the transaction across various chains. In addition, the most significant factor in this registration processing strategy is the signature, which ensures security. The application of a quantum cryptographic algorithm strengthens the proposed Hyperledger-based blockchain and the safety of the process. The key is determined by restricting the number of transactions that reach the global blockchain using a quantum-based hash function, supported by scalable local ledgers and peer validation of transactions at local and global levels without any issues. The transaction processing rate across peers is enhanced by the proposed solution, as it includes a load-distribution procedure. Without further enhancement, the recommended solution utilizes the current transaction strategy and is aimed at scalability, resource conservation, and interoperability. The experimental results of the system are evaluated using metrics such as block weight, ledger memory, central processing unit usage, and communication overhead.
Keywords: internet of things (IoT), scalability, blockchain, interoperability, security, ledger size, transaction rate, cross-chain based transaction (CCT), quantum cryptographic algorithm
Novel Block Chain Technique for Data Privacy and Access Anonymity in Smart Healthcare
19
Authors: J. Priya, C. Palanisamy 《Intelligent Automation & Soft Computing》 SCIE 2023, No. 1, pp. 243-259 (17 pages)
The Internet of Things (IoT) and cloud computing are gaining popularity due to their numerous advantages, including the efficient utilization of internet and computing resources. In recent years, many more IoT applications have been extensively used. For instance, healthcare applications execute computations utilizing the user's private data stored on cloud servers. However, the main obstacles faced by the extensive acceptance and usage of these emerging technologies are security and privacy. Moreover, many healthcare data management system applications have emerged, offering solutions for distinct circumstances. Still, the existing systems have specific security issues, a limited privacy-preserving rate, information loss, etc. Hence, the overall system performance is reduced significantly. A unique blockchain-based technique is proposed to improve anonymity in terms of data access and data privacy to overcome the above-mentioned issues. Initially, the registration phase is performed for the device and the user. After that, the Geo-Location and IP Address values collected during registration are converted into hash values using the Adler-32 hashing algorithm, and the private and public keys are generated by the key generation centre. Authentication is then performed through login. The user submits a request to the blockchain server, which redirects the request to the associated IoT device in order to obtain the sensed IoT data. The sensed data are anonymized on the device and stored in the cloud server using the Linear Scaling based Rider Optimization algorithm with integrated KL Anonymity (LSR-KLA) approach. After that, the Timestamp-based Public and Private Key Schnorr Signature (TSPP-SS) mechanism is used to permit the authorized user to access the data, and the blockchain server tracks the entire transaction. The experimental findings showed that the proposed LSR-KLA and TSPP-SS technique provides better performance in terms of higher privacy-preserving rate, lower information loss, execution time, and Central Processing Unit (CPU) usage than the existing techniques. Thus, the proposed method allows for better data privacy in the smart healthcare network.
Keywords: Adler-32 hashing algorithm, linear scaling based rider optimization algorithm with integrated KL anonymity (LSR-KLA), timestamp-based public and private key Schnorr signature (TSPP-SS), blockchain, internet of things (IoT), healthcare
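The registration step above hashes the collected Geo-Location and IP address with Adler-32, which Python exposes in the standard zlib module. The sample values below are fictitious, and the key-generation and Schnorr-signature parts are not reproduced.

```python
import zlib

def adler32_hex(value: str) -> str:
    """Adler-32 checksum of a UTF-8 string, rendered as 8 hex digits."""
    return format(zlib.adler32(value.encode("utf-8")) & 0xFFFFFFFF, "08x")

# Fictitious registration data collected from the device/user
geo_location = "13.0827,80.2707"
ip_address = "192.0.2.45"

print("geo hash:", adler32_hex(geo_location))
print("ip hash: ", adler32_hex(ip_address))
```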
Design of a Quantitative Control Algorithm for an IoT-Based Irrigation System (Cited by 1)
20
Authors: Wang Xiaohua, Zheng Sisi 《农机化研究》 (PKU Core) 2024, No. 9, pp. 236-239 (4 pages)
Precision water-saving irrigation has become a key technology for the development of agricultural production. Driven by the need for precise control of soil moisture during hard-ground rice seedling raising, a quantitative irrigation control system was designed. Based on monitoring of soil water demand during seedling raising, a quantitative irrigation control algorithm was designed that gives early warning of the moisture status during seedling raising and achieves precise irrigation control.
Keywords: water-saving irrigation, control algorithm, quantitative control, internet of things
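The control idea in this abstract (compare monitored soil moisture against a stage-specific threshold, raise a warning, and compute the quantity of water to apply) can be outlined as a simple quantitative controller. The thresholds, plot area, and root-zone depth below are invented values, not the paper's parameters.

```python
def irrigation_decision(moisture_pct, lower=65.0, target=75.0,
                        area_m2=100.0, root_depth_m=0.05):
    """Return (warning, litres_to_apply) for one soil-moisture sample."""
    if moisture_pct >= lower:
        return None, 0.0
    # Volumetric deficit: (target - current) percent of water over the root-zone volume
    deficit_fraction = (target - moisture_pct) / 100.0
    litres = deficit_fraction * area_m2 * root_depth_m * 1000.0  # m^3 -> litres
    return f"moisture {moisture_pct:.1f}% below {lower:.1f}% threshold", litres

for reading in (78.0, 62.5, 58.0):
    warning, litres = irrigation_decision(reading)
    print(reading, "->", warning, f"apply {litres:.1f} L")
```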