Internet of Health Things (IoHT) is a subset of Internet of Things (IoT) technology that comprises interconnected medical devices and sensors used in medical and healthcare information systems. However, IoHT is susceptible to cybersecurity threats due to its reliance on low-power biomedical devices and its use of open wireless channels for communication. In this article, we address this shortcoming by proposing a new scheme, the certificateless anonymous authentication (CAA) scheme. The proposed scheme is based on hyperelliptic curve cryptography (HECC), an enhanced variant of elliptic curve cryptography (ECC) that employs a smaller key size of 80 bits compared to 160 bits. The proposed scheme is shown to be secure against various attacks through both formal and informal security analyses; the formal analysis uses the Real-or-Random (ROR) model. A thorough comparative study evaluates the security and efficiency of the proposed scheme against the relevant existing schemes. The results demonstrate that the proposed scheme not only ensures high security for health-related data but also improves efficiency: its computation cost is 2.88 ms and its communication cost is 1440 bits, which shows better efficiency than its counterpart schemes.
The Internet of Things (IoT) is a smart networking infrastructure of physical devices, i.e., things, embedded with sensors, actuators, software, and other technologies to connect and share data with the respective server module. Although the IoT is a cornerstone of many application domains, the authenticity of devices, i.e., of servers and ordinary devices, is the most crucial issue and must be resolved on a priority basis. Various field-proven methodologies have been presented to streamline the verification of communicating devices; however, to the best of our knowledge, location-aware authentication has not been reported, even though location is a crucial metric, especially when devices are mobile. This paper presents a lightweight and location-aware device-to-server authentication technique in which a device's membership with the nearest server is tied to its location information along with other measures. Initially, the Media Access Control (MAC) address and the Advanced Encryption Standard (AES), together with a 128-bit shared secret key λ_i, are used by the Trusted Authority (TA) to generate MaskIDs, which replace the original ID of every device, i.e., server and member, and are shared in the offline phase. Secondly, the TA shares a list of authentic devices, i.e., server S_j and members C_i, with every device in the IoT for the subsequent verification process, which must be executed before the actual communication begins. Additionally, every device must lie within the coverage area of a server, and this location information is used in the authentication process. A thorough analytical analysis was carried out to check the susceptibility of the proposed and existing authentication approaches to well-known intruder attacks, i.e., man-in-the-middle, masquerading, and device and server impersonation, especially in the IoT domain. Moreover, the proposed authentication and existing state-of-the-art approaches were simulated in a realistic IoT environment to verify their performance in terms of various evaluation metrics, i.e., processing, communication, and storage overheads. The results verify the superiority of the proposed scheme over existing state-of-the-art approaches, particularly in terms of communication, storage, and processing costs.
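To make the MaskID step concrete, the sketch below derives a pseudonym by encrypting a device's MAC address under the 128-bit shared key λ_i. The block mode, padding, and function name are illustrative assumptions; the abstract only states that the TA uses AES and the MAC address.

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def make_mask_id(mac: str, shared_key: bytes) -> str:
    """Derive a MaskID from a device MAC address under a 128-bit shared key.
    ECB mode and zero padding are illustrative choices, not the paper's spec."""
    assert len(shared_key) == 16                                     # 128-bit key lambda_i
    block = bytes.fromhex(mac.replace(":", "")).ljust(16, b"\x00")  # pad 6-byte MAC to one AES block
    encryptor = Cipher(algorithms.AES(shared_key), modes.ECB()).encryptor()
    return encryptor.update(block).hex()

# The TA would run this offline for every registered server and member device.
print(make_mask_id("aa:bb:cc:dd:ee:01", bytes(16)))
```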
Nowadays, devices are connected across all areas, from intelligent buildings and smart cities to Industry 4.0 and smart healthcare. With the exponential growth of Internet of Things usage in our world, IoT security is still the biggest challenge for its deployment. The main goal of IoT security is to ensure the accessibility of services provided by an IoT environment, protect privacy and confidentiality, and guarantee the safety of IoT users, infrastructures, data, and devices. Authentication, as the first line of defense against security threats, becomes everyone's priority: it can either grant or deny users access to resources according to their legitimacy. As a result, studying and researching authentication issues within IoT is extremely important. This article presents a comparative study of recent research in IoT security; it provides an analysis of recent authentication protocols from 2019 to 2023 that cover several areas within IoT (such as smart cities, healthcare, and industry). This survey seeks to provide a summary of IoT security research, the biggest vulnerabilities and attacks, the appropriate technologies, and the most used simulators. It illustrates that a protocol's resistance to attacks and its computational and communication costs are directly linked to the cryptographic technique used to build it. Furthermore, it discusses the gaps in recent schemes and provides some future research directions.
The rapid expansion of Internet of Things (IoT) devices across various sectors is driven by steadily increasing demands for interconnected and smart technologies. Nevertheless, the surge in the number of IoT devices has caught the attention of cyber hackers, as it provides them with expanded avenues to access valuable data. This has resulted in a myriad of security challenges, including information leakage, malware propagation, and financial loss, among others. Consequently, developing an intrusion detection system to identify both active and potential intrusion traffic in IoT networks is of paramount importance. In this paper, we propose ResNeSt-biGRU, a practical intrusion detection model that combines the strengths of ResNeSt, a variant of the Residual Neural Network, and the bidirectional Gated Recurrent Unit network (biGRU). Our ResNeSt-biGRU framework diverges from conventional intrusion detection systems (IDS) by employing a dual-layered mechanism that exploits the temporal continuity and spatial features within network data streams, a methodological innovation that enhances detection accuracy. In conjunction with this, we introduce the PreIoT dataset, a compilation of prevalent IoT network behaviors, to train and evaluate IDS models with a focus on identifying potential intrusion traffic. The effectiveness of the proposed scheme is demonstrated through testing, wherein it achieved an average accuracy of 99.90% on the N-BaIoT dataset as well as on the PreIoT dataset and 94.45% on the UNSW-NB15 dataset. The outcomes of this research reveal the potential of ResNeSt-biGRU to bolster security measures, diminish intrusion-related vulnerabilities, and preserve the overall security of IoT ecosystems.
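As a rough illustration of the recurrent half of such a model, the sketch below wires a bidirectional GRU over per-time-step flow features and classifies from the final step. Layer sizes, the readout choice, and the class count are illustrative assumptions, not the configuration used in ResNeSt-biGRU.

```python
import torch
import torch.nn as nn

class BiGRUHead(nn.Module):
    """Toy stand-in for the temporal branch: a bidirectional GRU over
    per-time-step features followed by a linear classifier."""
    def __init__(self, feat_dim=64, hidden=128, n_classes=2):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                 # x: (batch, time, feat_dim)
        out, _ = self.gru(x)
        return self.fc(out[:, -1, :])     # classify from the last time step

logits = BiGRUHead()(torch.randn(8, 20, 64))   # 8 flows, 20 steps, 64 features each
```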
The Internet of Medical Things (IoMT) is an application of the Internet of Things (IoT) in the medical field. It is a cutting-edge technique that connects medical sensors and their applications to healthcare systems, which is essential in smart healthcare. However, Personal Health Records (PHRs) are normally kept in public cloud servers controlled by IoMT service providers, so privacy and security incidents may be frequent. Fortunately, Searchable Encryption (SE), which can be used to execute queries on encrypted data, can address the issue above. Nevertheless, most existing SE schemes cannot solve the vector dominance threshold problem. In response, we present an SE scheme called Vector Dominance with Threshold Searchable Encryption (VDTSE) in this study. We use a Lagrangian polynomial technique to convert the vector dominance threshold problem into the constraint that the number of corresponding bits (excluding wildcards) in which two equal-length vectors agree is not less than a threshold t. We then solve the problem using the proposed technique as a modification of Hidden Vector Encryption (HVE). This technique makes the trapdoor size linear in the number of attributes and thus much smaller than that of other similar SE schemes. A rigorous experimental analysis of a specific privacy-preserving diabetes application demonstrates the feasibility of the proposed VDTSE scheme.
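The sketch below shows only the plaintext form of the reformulated predicate: count the non-wildcard positions where two equal-length vectors agree and compare against the threshold t. The wildcard symbol and function name are illustrative; the encrypted HVE-based evaluation described in the abstract is not reproduced here.

```python
WILD = "*"

def dominance_threshold_match(index_vec, query_vec, t):
    """Plaintext predicate: at least t non-wildcard positions must agree."""
    assert len(index_vec) == len(query_vec)
    matches = sum(1 for a, b in zip(index_vec, query_vec)
                  if a != WILD and b != WILD and a == b)
    return matches >= t

print(dominance_threshold_match("1011", "1*11", 3))   # True: 3 non-wildcard matches
```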
The automatic collection of power grid situation information, along with real-time multimedia interaction between the front and back ends during the accident handling process, has generated a massive amount of power grid data. While wireless communication offers a convenient channel for grid terminal access and data transmission, its bandwidth is limited. Additionally, the broadcast nature of wireless transmission raises concerns about unauthorized eavesdropping during data transmission. To address these challenges and achieve reliable, secure, and real-time transmission of power grid data, an intelligent security transmission strategy with sensor-transmission-computing linkage is proposed in this paper. The primary objective of this strategy is to maximize the confidentiality capacity of the system. To this end, an optimization problem is formulated with interruption probability and interception probability as constraints. To solve this optimization problem efficiently, a low-complexity algorithm rooted in deep reinforcement learning is designed to derive a suboptimal solution. Ultimately, simulation results substantiate the validity of the proposed strategy in guaranteeing communication security, stability, and timeliness, confirming that the intelligent security transmission strategy significantly contributes to safeguarding communication integrity, system stability, and timely data delivery.
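For orientation, the snippet below computes the textbook wiretap-channel secrecy capacity, a common formalization of the confidentiality capacity being maximized: the positive part of the legitimate link's capacity minus the eavesdropper link's capacity. This is an interpretive sketch only; the paper's exact objective and its interruption/interception constraints are richer than this.

```python
import math

def secrecy_capacity(snr_legit, snr_eaves):
    """Positive-part difference of the legitimate and eavesdropper link
    capacities, in bits/s/Hz (standard wiretap-channel expression)."""
    return max(0.0, math.log2(1 + snr_legit) - math.log2(1 + snr_eaves))

print(secrecy_capacity(10.0, 2.0))   # ~1.87 bits/s/Hz
```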
With the adoption of cutting-edge communication technologies such as 5G/6G systems and the extensive development of devices, crowdsensing systems in the Internet of Things (IoT) are now conducting complicated video analysis tasks such as behaviour recognition. These applications have dramatically increased the diversity of IoT systems. Specifically, behaviour recognition in videos usually requires a combined analysis of the spatial information about objects and of their dynamic actions in the temporal dimension. Behaviour recognition may even rely more on the modeling of temporal information containing short-range and long-range motions, in contrast to image-based computer vision tasks that focus on understanding spatial information. However, current solutions fail to jointly and comprehensively analyse short-range motions between adjacent frames and long-range temporal aggregations at large scales in videos. In this paper, we propose a novel behaviour recognition method based on the integration of multigranular (IMG) motion features, which can support the deployment of video analysis in multimedia IoT crowdsensing systems. In particular, we achieve reliable motion information modeling by integrating a channel attention-based short-term motion feature enhancement module (CSEM) and a cascaded long-term motion feature integration module (CLIM). We evaluate our model on several action recognition benchmarks, such as HMDB51, Something-Something and UCF101. The experimental results demonstrate that our approach outperforms the previous state-of-the-art methods, which confirms its effectiveness and efficiency.
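For readers unfamiliar with channel attention, the block below is a generic squeeze-and-excitation style module of the kind CSEM builds on: global-average-pool each channel, pass the result through a small bottleneck, and reweight the channels. The sizes and the module itself are illustrative stand-ins, not the paper's CSEM design.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention over feature maps."""
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):                       # x: (batch, channels, H, W)
        w = self.fc(x.mean(dim=(2, 3)))         # squeeze: global average pool
        return x * w[:, :, None, None]          # excite: reweight channels

out = ChannelAttention(64)(torch.randn(2, 64, 14, 14))
```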
Effective user authentication is key to ensuring equipment security, data privacy, and personalized services in Internet of Things (IoT) systems. However, conventional mode-based authentication methods (e.g., passwords and smart cards) may be vulnerable to a broad range of attacks (e.g., eavesdropping and side-channel attacks). Hence, there have been attempts to design biometric-based authentication solutions, which rely on physiological and behavioral characteristics. Behavioral characteristics need continuous monitoring and specific environmental settings, which can be challenging to implement in practice. However, we can also leverage Artificial Intelligence (AI) in the extraction and classification of physiological characteristics from IoT device data to facilitate authentication. Thus, we review the literature on the use of AI in physiological characteristics recognition published after 2015. We use the three-layer architecture of the IoT (i.e., sensing layer, feature layer, and algorithm layer) to guide the discussion of existing approaches and their limitations. We also identify a number of future research opportunities, which will hopefully guide the design of next-generation solutions.
There are numerous terminals in the satellite Internet of Things (IoT). To save cost and reduce power consumption, the system needs terminals with the characteristics of low power consumption and light control. Regular random access (RA) protocols may generate large numbers of collisions, which severely degrade the system throughput. The near-far effect and power control technologies cannot be exploited to obtain the power differences required by the capture effect, so the resulting collisions cannot be separated. In fact, an optimal design at the receiving end can also create the conditions for packet separation in the power domain, but there is little related research. In this paper, an auxiliary beamforming scheme is proposed for power-domain signal separation. It adds an auxiliary reception beam to the conventional beam and exploits the time-frequency-domain correlation of packets between the main and auxiliary beams to complete signal separation. The roll-off belt of the auxiliary beam is used to create the carrier-to-noise ratio (CNR) difference. This paper uses a genetic algorithm to optimize the auxiliary beam direction. Simulation results show that the proposed scheme outperforms slotted ALOHA (SA) in terms of system throughput without imposing an additional control burden on terminals.
The cloud platform has limited defense resources to fully protect the edge servers used to process crowdsensing data in the Internet of Things. To guarantee the network's overall security, we present a network defense resource allocation method based on multi-armed bandits that maximizes the network's overall benefit. Firstly, we propose a method for dynamically setting node defense resource thresholds to obtain the defender's (attacker's) benefit function of edge servers (nodes) and its distribution. Secondly, we design a defense resource sharing mechanism for neighboring nodes to obtain the defense capability of nodes. Subsequently, we use the decomposability and Lipschitz continuity of the defender's total expected utility to reduce the difference between the utility's discrete and continuous arms and analyze this difference theoretically. Finally, experimental results show that the method maximizes the defender's total expected utility and reduces the difference between the discrete and continuous arms of the utility.
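To make the bandit framing tangible, here is a minimal UCB1 loop that treats each edge node as an arm and learns where defense resources yield the highest observed benefit. The reward function and node count are placeholders; the paper's threshold setting, resource sharing, and discrete/continuous-arm analysis are not modeled.

```python
import math, random

def ucb_allocate(rounds, n_nodes, benefit):
    """UCB1: repeatedly protect the node with the best upper confidence bound;
    `benefit(i)` returns the observed defense benefit of node i in one round."""
    counts = [0] * n_nodes
    values = [0.0] * n_nodes
    for t in range(1, rounds + 1):
        if t <= n_nodes:
            arm = t - 1                                   # play every arm once
        else:
            arm = max(range(n_nodes),
                      key=lambda i: values[i] + math.sqrt(2 * math.log(t) / counts[i]))
        r = benefit(arm)
        counts[arm] += 1
        values[arm] += (r - values[arm]) / counts[arm]    # running mean
    return values

print(ucb_allocate(1000, 3, lambda i: random.gauss([0.2, 0.5, 0.4][i], 0.1)))
```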
Due to the overwhelming characteristics of the Internet of Things (IoT) and its adoption in approximately every aspect of our lives, the privacy of individual devices has gained prominent attention from both customers, i.e., people, and industries, as wearable devices collect sensitive information about patients (both admitted and outdoor) in smart healthcare infrastructures. In addition to privacy, outliers or noise are among the crucial issues directly correlated with IoT infrastructures, as most member devices are resource-limited and could generate or transmit false data that must be refined before processing, i.e., before transmission. Therefore, the development of privacy-preserving information fusion techniques is highly encouraged, especially those designed for smart IoT-enabled domains. In this paper, we present an effective hybrid approach that refines the raw data values captured by the respective member device before transmission while preserving its privacy through the differential privacy technique in IoT infrastructures. A sliding-window, i.e., δ_i-based, dynamic programming methodology is implemented at the device level to ensure precise and accurate detection of outliers or noisy data and to refine them prior to the respective transmission activity. Additionally, an appropriate privacy budget has been selected, which is enough to ensure the privacy of every individual module, i.e., a wearable device such as a smartwatch attached to the patient's body, while the end module, i.e., the server in this case, can still extract important information with approximately the maximum level of accuracy. Moreover, the refined data are processed by adding appropriate noise through the Laplace mechanism to make them useless or meaningless to adversary modules in the IoT. The proposed hybrid approach is trusted from the perspectives of both device privacy and the integrity of the transmitted information. Simulation and analytical results have proved that the proposed privacy-preserving information fusion technique for wearable devices is an ideal solution for resource-constrained infrastructures such as IoT and the Internet of Medical Things, where both device privacy and information integrity are important. Finally, the proposed hybrid approach is proven against well-known intruder attacks, especially those targeting the privacy of the respective device in IoT infrastructures.
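The perturbation step named in this abstract is the standard Laplace mechanism: add noise with scale sensitivity/ε to a refined reading before it leaves the device. The sensitivity and privacy budget values below are placeholders, not the ones selected in the paper.

```python
import numpy as np

def laplace_mechanism(value, sensitivity, epsilon, rng=np.random.default_rng()):
    """Classic Laplace mechanism: perturb a value with noise of scale
    sensitivity/epsilon to satisfy epsilon-differential privacy."""
    return value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

heart_rate = 72.0                       # refined (outlier-free) reading
print(laplace_mechanism(heart_rate, sensitivity=1.0, epsilon=0.5))
```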
Burial depth is a crucial factor affecting the forces on and deformation of tunnels during earthquakes. One key issue is a lack of understanding of the effect of a change in the buried depth of a single-side tunnel on the seismic response of a double-tunnel system. In this study, shaking table tests were designed and performed based on a tunnel under construction in Dalian, China. Numerical models were established using the equivalent linear method combined with ABAQUS finite element software to analyze the seismic response of the interacting system. The results showed that the amplification coefficient of the soil acceleration did not change appreciably with the burial depth of the new tunnel but decreased as the seismic amplitude increased. In addition, the existing tunnel's acceleration, earth pressure, and internal force were hardly affected by the change in burial depth; for the new tunnel, the acceleration and internal force decreased as the burial depth increased, while the earth pressure increased. This shows that the earth pressure distribution in a double-tunnel system is relatively complex and mainly concentrated on the arch spandrel and arch springing of the relative area. Overall, when the horizontal clearance between the centers of the two tunnels was more than twice the sum of the radii of the outer edges of the two tunnels, the change in the burial depth of the new tunnel had little effect on the existing one, and the tunnel structure was deemed safe. These results provide a preliminary understanding of and reference for the seismic performance of a double-tunnel system.
As an ingenious convergence between the Internet of Things and social networks, the Social Internet of Things (SIoT) can provide effective and intelligent information services and has become one of the main platforms for people to spread and share information. Nevertheless, SIoT is characterized by high openness and autonomy, and multiple kinds of information can spread rapidly, freely, and cooperatively in SIoT, which makes it challenging to accurately reveal the characteristics of the information diffusion process and to effectively control its diffusion. To this end, with the aim of exploring multi-information cooperative diffusion processes in SIoT, we first develop a dynamics model for multi-information cooperative diffusion based on system dynamics theory. Subsequently, the characteristics and laws of the dynamical evolution of multi-information cooperative diffusion are theoretically investigated, and the diffusion trend is predicted. On this basis, to further control the multi-information cooperative diffusion process efficiently, we propose two control strategies for information diffusion with control objectives, develop an optimal control system for the multi-information cooperative diffusion process, and propose the corresponding optimal control method. The optimal solution distribution of the control strategy satisfying the control system constraints and the control budget constraints is solved using optimal control theory. Finally, extensive simulation experiments based on a real dataset from Twitter validate the correctness and effectiveness of the proposed model, strategy, and method.
The recent development of the Internet of Things (IoT) has resulted in the growth of IoT-based DDoS attacks. Botnet detection in IoT systems applies advanced cybersecurity measures to identify and mitigate malevolent botnets across interconnected devices. Anomaly detection models evaluate transmission patterns, network traffic, and device behaviour to detect deviations from usual activities. Machine learning (ML) techniques detect patterns signalling botnet activity, such as sudden traffic increases, unusual command-and-control patterns, or irregular device behaviour. In addition, intrusion detection systems (IDSs) and signature-based techniques are applied to recognize known malware signatures related to botnets. Various ML and deep learning (DL) techniques have been developed to detect botnet attacks in IoT systems. To overcome security issues in an IoT environment, this article designs a gorilla troops optimizer with DL-enabled botnet attack detection and classification (GTODL-BADC) technique. The GTODL-BADC technique follows feature selection (FS) with optimal DL-based classification to accomplish security in an IoT environment. For data preprocessing, min-max data normalization is first used. The GTODL-BADC technique then uses the GTO algorithm to select features and choose optimal feature subsets. Moreover, the multi-head attention-based long short-term memory (MHA-LSTM) technique is applied for botnet detection. Finally, the tree seed algorithm (TSA) is used to select the optimal hyperparameters for the MHA-LSTM method. The GTODL-BADC technique is experimentally validated on a benchmark dataset. The simulation results highlight that the GTODL-BADC technique demonstrates promising performance in the botnet detection process.
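The preprocessing step named here is plain column-wise min-max scaling; a minimal version is sketched below, with an epsilon guard added as an implementation convenience rather than anything stated in the abstract.

```python
import numpy as np

def min_max_normalize(X):
    """Column-wise min-max scaling to [0, 1]; the epsilon guards constant columns."""
    X = np.asarray(X, dtype=float)
    lo, hi = X.min(axis=0), X.max(axis=0)
    return (X - lo) / np.maximum(hi - lo, 1e-12)

print(min_max_normalize([[10, 200], [20, 400], [15, 300]]))
```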
Purpose – This study aims to analyze the factors affecting, and the evaluation techniques for, the durability of existing railway engineering. Design/methodology/approach – China has built a railway network of over 150,000 km. Ensuring the safety of existing railway engineering is of great significance for maintaining normal railway operation. However, railway engineering is a strip structure that crosses multiple complex environments and must withstand high-frequency impact loads from trains. These factors have led to differences in the deterioration characteristics and maintenance strategies of railway engineering compared with conventional concrete structures. Therefore, it is very important to analyze the key factors that affect the durability of railway structures and to propose technologies for durability evaluation. Findings – The factors that affect the durability and reliability of railway engineering fall into three main categories: material factors, environmental factors, and load factors. Material factors include influencing factors such as raw materials and mix proportions. Environmental factors vary depending on the service environment of the railway engineering, and the durability deterioration of the concrete has different failure mechanisms. Load factors include static loads and dynamic train loads. On-site rapid detection methods for five common defects in railway engineering are also proposed in this paper; these methods can quickly evaluate the durability of existing railway engineering concrete. Originality/value – The research provides new evaluation techniques and methods for the durability of existing railway engineering.
One of the major challenges arising in the internet of military things (IoMT) is accommodating massive connectivity while providing guaranteed quality of service (QoS) in terms of ultra-high reliability. In this regard, this paper presents a class of code-domain non-orthogonal multiple access (NOMA) schemes for uplink ultra-reliable networking of massive IoMT based on tactical datalinks such as Link-16 and the Joint Tactical Information Distribution System (JTIDS). In the considered scenario, a satellite equipped with Nr antennas serves K devices, including vehicles, drones, ships, sensors, handset radios, etc. Nonorthogonal coded modulation, a special form of multiple-input multiple-output (MIMO)-NOMA, is proposed. The discussion starts by evaluating the output signal-to-interference-plus-noise ratio (SINR) of the receiver filter, leading to a closed-form expression for overloaded systems, in which the number of admitted devices is significantly large so that massive connectivity is supported. The expression allows the development of simple yet successful interference suppression based on power allocation and phase shaping techniques that maximizes the sum rate, since, as can be proved, it is equivalent to fixed-point programming. The proposed design is exemplified with nonlinear modulation schemes such as minimum shift keying (MSK) and Gaussian MSK (GMSK), two pivotal modulation formats in IoMT standards such as Link-16 and JTIDS. Numerical results show that near-capacity performance is achieved. Moreover, this performance is obtained using simple forward error correction (FEC) at a higher coding rate than existing schemes, while the transmit power is reduced by 6 dB. The proposed design finds wide applications not only in IoMT but also in deep space communications, where ultra-reliability and massive connectivity are key concerns.
The Internet of Things (IoT) provides better solutions in various fields, namely healthcare, smart transportation, the home, etc. Recognizing Denial of Service (DoS) outbreaks in IoT platforms is significant for certifying the accessibility and integrity of IoT systems. Deep learning (DL) models excel at detecting complex, non-linear relationships, allowing them to effectively separate slight deviations from normal IoT activities that may indicate a DoS outbreak. The continuous observation and real-time detection capabilities of DL contribute to accurate and rapid detection, permitting proactive mitigation actions to be executed and hence securing the IoT network's safety and functionality. Accordingly, this study presents a pigeon-inspired optimization with DL-based attack detection and classification (PIODL-ADC) approach for an IoT environment. The PIODL-ADC approach implements a hyperparameter-tuned DL method for Distributed Denial-of-Service (DDoS) attack detection on an IoT platform. Initially, the PIODL-ADC model utilizes Z-score normalization to scale input data into a uniform format. To handle the convolutional and adaptive behaviors of IoT, the PIODL-ADC model employs the pigeon-inspired optimization (PIO) method for feature selection to detect the relevant features, considerably enhancing recognition accuracy. In addition, the Elman Recurrent Neural Network (ERNN) model is utilized to recognize and classify DDoS attacks. Moreover, reptile search algorithm (RSA)-based hyperparameter tuning is employed to improve the precision and robustness of the ERNN method. A series of experimental validations was performed to confirm the effectiveness of the PIODL-ADC method. The experimental outcomes show that the PIODL-ADC method performs better than existing models, with a maximum accuracy of 99.81%.
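For reference, an Elman (vanilla) recurrent network is the classic tanh RNN; the sketch below stands in for the ERNN classification stage, with illustrative input, hidden, and class sizes rather than the RSA-tuned values from the paper.

```python
import torch
import torch.nn as nn

class ElmanDetector(nn.Module):
    """Small Elman (vanilla tanh) RNN classifier over traffic feature windows."""
    def __init__(self, n_features=32, hidden=64, n_classes=2):
        super().__init__()
        self.rnn = nn.RNN(n_features, hidden, nonlinearity="tanh", batch_first=True)
        self.fc = nn.Linear(hidden, n_classes)

    def forward(self, x):              # x: (batch, time, n_features)
        _, h = self.rnn(x)             # h: (1, batch, hidden)
        return self.fc(h.squeeze(0))

scores = ElmanDetector()(torch.randn(4, 10, 32))   # 4 traffic windows
```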
The proliferation of IoT devices requires innovative approaches to gaining insights while preserving privacy and resources amid unprecedented data generation. However, federated learning (FL) development for IoT is still in its infancy, and various areas need to be explored to understand the key challenges for deployment in real-world scenarios. This paper systematically reviews the available literature following the PRISMA guidelines. The study aims to provide a detailed overview of the increasing use of FL in IoT networks, including the architecture and challenges. A systematic review approach is used to collect, categorize, and analyze FL-IoT-based articles. A search was performed in the IEEE, Elsevier, Arxiv, ACM, and WOS databases, and 92 articles were finally examined. The inclusion criteria were publication in English and the keywords "FL" and "IoT". The methodology begins with an overview of recent advances in FL and the IoT, followed by a discussion of how these two technologies can be integrated. More specifically, we examine and evaluate the capabilities of FL by discussing communication protocols, frameworks, and architectures. We then present a comprehensive analysis of the use of FL in a number of key IoT applications, including smart healthcare, smart transportation, smart cities, smart industry, smart finance, and smart agriculture. The key findings from this analysis of FL IoT services and applications are also presented. Finally, we performed a comparative analysis of FL with IID (independent and identically distributed) and non-IID data against traditional centralized deep learning (DL) approaches. We concluded that FL has better performance, especially in terms of privacy protection and resource utilization. FL is excellent for preserving privacy because model training takes place on individual devices or edge nodes, eliminating the need for centralized data aggregation, which poses significant privacy risks. To facilitate development in this rapidly evolving field, the insights presented are intended to help practitioners and researchers navigate the complex terrain of FL and IoT.
Time synchronization (TS) is crucial for ensuring the secure and reliable functioning of the distribution power Internet of Things (IoT). Multi-clock-source time synchronization (MTS) has significant advantages in reliability and accuracy but still faces challenges such as optimizing the multi-clock-source selection and the clock source weight calculation at different timescales, and the coupling between synchronization latency jitter and pulse phase difference. In this paper, a multi-timescale MTS model is constructed, and a reinforcement learning (RL) and analytic hierarchy process (AHP)-based multi-timescale MTS algorithm is designed to optimize the weighted sum of the synchronization latency jitter standard deviation and the average pulse phase difference. Specifically, the multi-clock-source selection is optimized based on Softmax at the large timescale, and the clock source weight calculation is optimized based on lower-confidence-bound-assisted AHP at the small timescale. Simulations show that the proposed algorithm can effectively reduce the standard deviation of the time synchronization delay and the average pulse phase difference.
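The large-timescale selection rule named here, Softmax, is the standard Boltzmann exploration strategy from RL: sample a clock source with probability proportional to the exponential of its score. The scores and temperature below are illustrative; the paper's state and reward design is richer.

```python
import numpy as np

def softmax_select(scores, temperature=1.0, rng=np.random.default_rng()):
    """Boltzmann (Softmax) selection over candidate clock sources."""
    z = np.asarray(scores, dtype=float) / temperature
    z -= z.max()                              # numerical stability
    p = np.exp(z) / np.exp(z).sum()
    return rng.choice(len(scores), p=p), p

idx, probs = softmax_select([1.2, 0.4, 0.9])  # pick among three clock sources
print(idx, probs)
```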
Due to the limited computational capability and the diversity of Internet of Things devices operating in different environments, we consider few-shot learning-based automatic modulation classification (AMC) to improve its reliability. A data enhancement module (DEM) is built from a convolutional layer to supplement frequency-domain information and to provide a nonlinear mapping that is beneficial for AMC. A multimodal network is designed with multiple residual blocks, where each residual block has multiple convolutional kernels of different sizes for diverse feature extraction. Moreover, a deep supervised loss function is designed to supervise all parts of the network, including the hidden layers and the DEM. Since different models may output different results, a cooperative classifier is designed to avoid the randomness of a single model and improve reliability. Simulation results show that this few-shot learning-based AMC method can significantly improve AMC accuracy compared with existing methods.
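One simple way to realize the cooperative-classifier idea is to average the softmax outputs of several independently trained models and take the argmax; this is sketched below as an assumption about the fusion rule, since the abstract does not specify how the models' outputs are combined.

```python
import torch

def cooperative_classify(logits_list):
    """Average the class probabilities of several models, then take argmax."""
    probs = torch.stack([torch.softmax(l, dim=-1) for l in logits_list]).mean(dim=0)
    return probs.argmax(dim=-1)

preds = cooperative_classify([torch.randn(5, 11), torch.randn(5, 11)])  # 11 modulation classes
print(preds)
```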
文摘Internet of Health Things(IoHT)is a subset of Internet of Things(IoT)technology that includes interconnected medical devices and sensors used in medical and healthcare information systems.However,IoHT is susceptible to cybersecurity threats due to its reliance on low-power biomedical devices and the use of open wireless channels for communication.In this article,we intend to address this shortcoming,and as a result,we propose a new scheme called,the certificateless anonymous authentication(CAA)scheme.The proposed scheme is based on hyperelliptic curve cryptography(HECC),an enhanced variant of elliptic curve cryptography(ECC)that employs a smaller key size of 80 bits as compared to 160 bits.The proposed scheme is secure against various attacks in both formal and informal security analyses.The formal study makes use of the Real-or-Random(ROR)model.A thorough comparative study of the proposed scheme is conducted for the security and efficiency of the proposed scheme with the relevant existing schemes.The results demonstrate that the proposed scheme not only ensures high security for health-related data but also increases efficiency.The proposed scheme’s computation cost is 2.88 ms,and the communication cost is 1440 bits,which shows its better efficiency compared to its counterpart schemes.
文摘The Internet of Things(IoT)is a smart networking infrastructure of physical devices,i.e.,things,that are embedded with sensors,actuators,software,and other technologies,to connect and share data with the respective server module.Although IoTs are cornerstones in different application domains,the device’s authenticity,i.e.,of server(s)and ordinary devices,is the most crucial issue and must be resolved on a priority basis.Therefore,various field-proven methodologies were presented to streamline the verification process of the communicating devices;however,location-aware authentication has not been reported as per our knowledge,which is a crucial metric,especially in scenarios where devices are mobile.This paper presents a lightweight and location-aware device-to-server authentication technique where the device’s membership with the nearest server is subjected to its location information along with other measures.Initially,Media Access Control(MAC)address and Advance Encryption Scheme(AES)along with a secret shared key,i.e.,λ_(i) of 128 bits,have been utilized by Trusted Authority(TA)to generate MaskIDs,which are used instead of the original ID,for every device,i.e.,server and member,and are shared in the offline phase.Secondly,TA shares a list of authentic devices,i.e.,server S_(j) and members C_(i),with every device in the IoT for the onward verification process,which is required to be executed before the initialization of the actual communication process.Additionally,every device should be located such that it lies within the coverage area of a server,and this location information is used in the authentication process.A thorough analytical analysis was carried out to check the susceptibility of the proposed and existing authentication approaches against well-known intruder attacks,i.e.,man-in-the-middle,masquerading,device,and server impersonations,etc.,especially in the IoT domain.Moreover,proposed authentication and existing state-of-the-art approaches have been simulated in the real environment of IoT to verify their performance,particularly in terms of various evaluation metrics,i.e.,processing,communication,and storage overheads.These results have verified the superiority of the proposed scheme against existing state-of-the-art approaches,preferably in terms of communication,storage,and processing costs.
文摘Nowadays, devices are connected across all areas, from intelligent buildings and smart cities to Industry 4.0 andsmart healthcare. With the exponential growth of Internet of Things usage in our world, IoT security is still thebiggest challenge for its deployment. The main goal of IoT security is to ensure the accessibility of services providedby an IoT environment, protect privacy, and confidentiality, and guarantee the safety of IoT users, infrastructures,data, and devices. Authentication, as the first line of defense against security threats, becomes the priority ofeveryone. It can either grant or deny users access to resources according to their legitimacy. As a result, studyingand researching authentication issues within IoT is extremely important. As a result, studying and researchingauthentication issues within IoT is extremely important. This article presents a comparative study of recent researchin IoT security;it provides an analysis of recent authentication protocols from2019 to 2023 that cover several areaswithin IoT (such as smart cities, healthcare, and industry). This survey sought to provide an IoT security researchsummary, the biggest susceptibilities, and attacks, the appropriate technologies, and the most used simulators. Itillustrates that the resistance of protocols against attacks, and their computational and communication cost arelinked directly to the cryptography technique used to build it. Furthermore, it discusses the gaps in recent schemesand provides some future research directions.
基金the National Natural Science Foundation of China(No.61662004).
文摘The rapid expansion of Internet of Things (IoT) devices across various sectors is driven by steadily increasingdemands for interconnected and smart technologies. Nevertheless, the surge in the number of IoT device hascaught the attention of cyber hackers, as it provides them with expanded avenues to access valuable data. Thishas resulted in a myriad of security challenges, including information leakage, malware propagation, and financialloss, among others. Consequently, developing an intrusion detection system to identify both active and potentialintrusion traffic in IoT networks is of paramount importance. In this paper, we propose ResNeSt-biGRU, a practicalintrusion detection model that combines the strengths of ResNeSt, a variant of Residual Neural Network, andbidirectionalGated RecurrentUnitNetwork (biGRU).Our ResNeSt-biGRUframework diverges fromconventionalintrusion detection systems (IDS) by employing this dual-layeredmechanism that exploits the temporal continuityand spatial feature within network data streams, a methodological innovation that enhances detection accuracy.In conjunction with this, we introduce the PreIoT dataset, a compilation of prevalent IoT network behaviors, totrain and evaluate IDSmodels with a focus on identifying potential intrusion traffics. The effectiveness of proposedscheme is demonstrated through testing, wherein it achieved an average accuracy of 99.90% on theN-BaIoT datasetas well as on the PreIoT dataset and 94.45% on UNSW-NB15 dataset. The outcomes of this research reveal thepotential of ResNeSt-biGRU to bolster security measures, diminish intrusion-related vulnerabilities, and preservethe overall security of IoT ecosystems.
基金supported in part by the National Natural Science Foundation of China under Grant Nos.61872289 and 62172266in part by the Henan Key Laboratory of Network Cryptography Technology LNCT2020-A07the Guangxi Key Laboratory of Trusted Software under Grant No.KX202308.
文摘The Internet of Medical Things(IoMT)is an application of the Internet of Things(IoT)in the medical field.It is a cutting-edge technique that connects medical sensors and their applications to healthcare systems,which is essential in smart healthcare.However,Personal Health Records(PHRs)are normally kept in public cloud servers controlled by IoMT service providers,so privacy and security incidents may be frequent.Fortunately,Searchable Encryption(SE),which can be used to execute queries on encrypted data,can address the issue above.Nevertheless,most existing SE schemes cannot solve the vector dominance threshold problem.In response to this,we present a SE scheme called Vector Dominance with Threshold Searchable Encryption(VDTSE)in this study.We use a Lagrangian polynomial technique and convert the vector dominance threshold problem into a constraint that the number of two equal-length vectors’corresponding bits excluding wildcards is not less than a threshold t.Then,we solve the problem using the proposed technique modified in Hidden Vector Encryption(HVE).This technique makes the trapdoor size linear to the number of attributes and thus much smaller than that of other similar SE schemes.A rigorous experimental analysis of a specific application for privacy-preserving diabetes demonstrates the feasibility of the proposed VDTSE scheme.
文摘The automatic collection of power grid situation information, along with real-time multimedia interaction between the front and back ends during the accident handling process, has generated a massive amount of power grid data. While wireless communication offers a convenient channel for grid terminal access and data transmission, it is important to note that the bandwidth of wireless communication is limited. Additionally, the broadcast nature of wireless transmission raises concerns about the potential for unauthorized eavesdropping during data transmission. To address these challenges and achieve reliable, secure, and real-time transmission of power grid data, an intelligent security transmission strategy with sensor-transmission-computing linkage is proposed in this paper. The primary objective of this strategy is to maximize the confidentiality capacity of the system. To tackle this, an optimization problem is formulated, taking into consideration interruption probability and interception probability as constraints. To efficiently solve this optimization problem, a low-complexity algorithm rooted in deep reinforcement learning is designed, which aims to derive a suboptimal solution for the problem at hand. Ultimately, through simulation results, the validity of the proposed strategy in guaranteed communication security, stability, and timeliness is substantiated. The results confirm that the proposed intelligent security transmission strategy significantly contributes to the safeguarding of communication integrity, system stability, and timely data delivery.
基金supported by National Natural Science Foundation of China under grant No.62271125,No.62273071Sichuan Science and Technology Program(No.2022YFG0038,No.2021YFG0018)+1 种基金by Xinjiang Science and Technology Program(No.2022273061)by the Fundamental Research Funds for the Central Universities(No.ZYGX2020ZB034,No.ZYGX2021J019).
文摘With the adoption of cutting-edge communication technologies such as 5G/6G systems and the extensive development of devices,crowdsensing systems in the Internet of Things(IoT)are now conducting complicated video analysis tasks such as behaviour recognition.These applications have dramatically increased the diversity of IoT systems.Specifically,behaviour recognition in videos usually requires a combinatorial analysis of the spatial information about objects and information about their dynamic actions in the temporal dimension.Behaviour recognition may even rely more on the modeling of temporal information containing short-range and long-range motions,in contrast to computer vision tasks involving images that focus on understanding spatial information.However,current solutions fail to jointly and comprehensively analyse short-range motions between adjacent frames and long-range temporal aggregations at large scales in videos.In this paper,we propose a novel behaviour recognition method based on the integration of multigranular(IMG)motion features,which can provide support for deploying video analysis in multimedia IoT crowdsensing systems.In particular,we achieve reliable motion information modeling by integrating a channel attention-based short-term motion feature enhancement module(CSEM)and a cascaded long-term motion feature integration module(CLIM).We evaluate our model on several action recognition benchmarks,such as HMDB51,Something-Something and UCF101.The experimental results demonstrate that our approach outperforms the previous state-of-the-art methods,which confirms its effective-ness and efficiency.
基金funded in part by the National Natural Science Foundation of China under Grant No.61872038in part by the Fundamental Research Funds for the Central Universities under Grant No.FRF-GF-20-15B.
文摘Effective user authentication is key to ensuring equipment security,data privacy,and personalized services in Internet of Things(IoT)systems.However,conventional mode-based authentication methods(e.g.,passwords and smart cards)may be vulnerable to a broad range of attacks(e.g.,eavesdropping and side-channel attacks).Hence,there have been attempts to design biometric-based authentication solutions,which rely on physiological and behavioral characteristics.Behavioral characteristics need continuous monitoring and specific environmental settings,which can be challenging to implement in practice.However,we can also leverage Artificial Intelligence(AI)in the extraction and classification of physiological characteristics from IoT devices processing to facilitate authentication.Thus,we review the literature on the use of AI in physiological characteristics recognition pub-lished after 2015.We use the three-layer architecture of the IoT(i.e.,sensing layer,feature layer,and algorithm layer)to guide the discussion of existing approaches and their limitations.We also identify a number of future research opportunities,which will hopefully guide the design of next generation solutions.
基金supported by the National Science Foundation of China(No.U21A20450)Natural Science Foundation of Jiangsu Province Major Project(No.BK20192002)+1 种基金National Natural Science Foundation of China(No.61971440)National Natural Science Foundation of China(No.62271266).
文摘There are numerous terminals in the satellite Internet of Things(IoT).To save cost and reduce power consumption,the system needs terminals to catch the characteristics of low power consumption and light control.The regular random access(RA)protocols may generate large amounts of collisions,which degrade the system throughout severally.The near-far effect and power control technologies are not applicable in capture effect to obtain power difference,resulting in the collisions that cannot be separated.In fact,the optimal design at the receiving end can also realize the condition of packet power domain separation,but there are few relevant researches.In this paper,an auxiliary beamforming scheme is proposed for power domain signal separation.It adds an auxiliary reception beam based on the conventional beam,utilizing the correlation of packets in time-frequency domain between the main and auxiliary beam to complete signal separation.The roll-off belt of auxiliary beam is used to create the carrier-to-noise ratio(CNR)difference.This paper uses the genetic algorithm to optimize the auxiliary beam direction.Simulation results show that the proposed scheme outperforms slotted ALOHA(SA)in terms of system throughput per-formance and without bringing terminals additional control burden.
基金supported by the National Natural Science Foundation of China(NSFC)[grant numbers 62172377,61872205]the Shandong Provincial Natural Science Foundation[grant number ZR2019MF018]the Startup Research Foundation for Distinguished Scholars No.202112016.
文摘The cloud platform has limited defense resources to fully protect the edge servers used to process crowd sensing data in Internet of Things.To guarantee the network's overall security,we present a network defense resource allocation with multi-armed bandits to maximize the network's overall benefit.Firstly,we propose the method for dynamic setting of node defense resource thresholds to obtain the defender(attacker)benefit function of edge servers(nodes)and distribution.Secondly,we design a defense resource sharing mechanism for neighboring nodes to obtain the defense capability of nodes.Subsequently,we use the decomposability and Lipschitz conti-nuity of the defender's total expected utility to reduce the difference between the utility's discrete and continuous arms and analyze the difference theoretically.Finally,experimental results show that the method maximizes the defender's total expected utility and reduces the difference between the discrete and continuous arms of the utility.
基金Ministry of Higher Education of Malaysia under theResearch GrantLRGS/1/2019/UKM-UKM/5/2 and Princess Nourah bint Abdulrahman University for financing this researcher through Supporting Project Number(PNURSP2024R235),Princess Nourah bint Abdulrahman University,Riyadh,Saudi Arabia.
文摘Due to the overwhelming characteristics of the Internet of Things(IoT)and its adoption in approximately every aspect of our lives,the concept of individual devices’privacy has gained prominent attention from both customers,i.e.,people,and industries as wearable devices collect sensitive information about patients(both admitted and outdoor)in smart healthcare infrastructures.In addition to privacy,outliers or noise are among the crucial issues,which are directly correlated with IoT infrastructures,as most member devices are resource-limited and could generate or transmit false data that is required to be refined before processing,i.e.,transmitting.Therefore,the development of privacy-preserving information fusion techniques is highly encouraged,especially those designed for smart IoT-enabled domains.In this paper,we are going to present an effective hybrid approach that can refine raw data values captured by the respectivemember device before transmission while preserving its privacy through the utilization of the differential privacy technique in IoT infrastructures.Sliding window,i.e.,δi based dynamic programming methodology,is implemented at the device level to ensure precise and accurate detection of outliers or noisy data,and refine it prior to activation of the respective transmission activity.Additionally,an appropriate privacy budget has been selected,which is enough to ensure the privacy of every individualmodule,i.e.,a wearable device such as a smartwatch attached to the patient’s body.In contrast,the end module,i.e.,the server in this case,can extract important information with approximately the maximum level of accuracy.Moreover,refined data has been processed by adding an appropriate nose through the Laplace mechanism to make it useless or meaningless for the adversary modules in the IoT.The proposed hybrid approach is trusted from both the device’s privacy and the integrity of the transmitted information perspectives.Simulation and analytical results have proved that the proposed privacy-preserving information fusion technique for wearable devices is an ideal solution for resource-constrained infrastructures such as IoT and the Internet ofMedical Things,where both device privacy and information integrity are important.Finally,the proposed hybrid approach is proven against well-known intruder attacks,especially those related to the privacy of the respective device in IoT infrastructures.
基金Scientific Research Fund of Liaoning Provincial Education Department under Grant No.LJKZ0336。
文摘Burial depth is a crucial factor affecting the forces and deformation of tunnels during earthquakes.One key issue is a lack of understanding of the effect of a change in the buried depth of a single-side tunnel on the seismic response of a double-tunnel system.In this study,shaking table tests were designed and performed based on a tunnel under construction in Dalian,China.Numerical models were established using the equivalent linear method combined with ABAQUS finite element software to analyze the seismic response of the interacting system.The results showed that the amplification coefficient of the soil acceleration did not change evidently with the burial depth of the new tunnel but decreased as the seismic amplitude increased.In addition,the existing tunnel acceleration,earth pressure,and internal force were hardly affected by the change in the burial depth;for the new tunnel,the acceleration and internal force decreased as the burial depth increased,while the earth pressure increased.This shows that the earth pressure distribution in a double-tunnel system is relatively complex and mainly concentrated on the arch spandrel and arch springing of the relative area.Overall,when the horizontal clearance between the center of the two tunnels was more than twice the sum of the radius of the outer edges of the two tunnels,the change in the burial depth of the new tunnel had little effect on the existing one,and the tunnel structure was deemed safe.These results provide a preliminary understanding and reference for the seismic performance of a double-tunnel system.
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 62102240 and 62071283), the China Postdoctoral Science Foundation (Grant No. 2020M683421), and the Key R&D Program of Shaanxi Province (Grant No. 2020ZDLGY10-05).
Abstract: As an ingenious convergence between the Internet of Things and social networks, the Social Internet of Things (SIoT) can provide effective and intelligent information services and has become one of the main platforms for people to spread and share information. Nevertheless, SIoT is characterized by high openness and autonomy, so multiple kinds of information can spread rapidly, freely, and cooperatively within it, which makes it challenging to accurately reveal the characteristics of the information diffusion process and to effectively control its diffusion. To this end, with the aim of exploring multi-information cooperative diffusion processes in SIoT, we first develop a dynamics model for multi-information cooperative diffusion in this paper, based on system dynamics theory. Subsequently, the characteristics and laws of the dynamical evolution process of multi-information cooperative diffusion are theoretically investigated, and the diffusion trend is predicted. On this basis, to further control the multi-information cooperative diffusion process efficiently, we propose two control strategies for information diffusion with explicit control objectives, develop an optimal control system for the multi-information cooperative diffusion process, and propose the corresponding optimal control method. The optimal solution distribution of the control strategy satisfying the control-system constraints and the control budget constraints is solved using optimal control theory. Finally, extensive simulation experiments based on a real dataset from Twitter validate the correctness and effectiveness of the proposed model, strategies, and method.
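As a rough illustration of the kind of dynamics model such a study builds on (not the paper's actual equations), the sketch below integrates a toy two-information cooperative diffusion system in which exposure to one piece of information raises the adoption rate of the other. The SIR-like structure, the cooperation factor alpha, and all parameter values are assumptions for illustration.

```python
import numpy as np

def cooperative_diffusion(beta1=0.30, beta2=0.20, alpha=0.15, gamma=0.05,
                          days=60, dt=0.1, n=1.0):
    """Euler integration of a toy two-information cooperative diffusion model:
    S = susceptible share, I1/I2 = spreaders of information 1/2; spreaders of
    one item boost the adoption rate of the other through alpha."""
    s, i1, i2 = 0.98 * n, 0.01 * n, 0.01 * n
    traj = []
    for _ in range(int(days / dt)):
        new1 = (beta1 + alpha * i2) * s * i1 / n   # adoption of information 1
        new2 = (beta2 + alpha * i1) * s * i2 / n   # adoption of information 2
        s  += dt * (-new1 - new2)
        i1 += dt * (new1 - gamma * i1)             # gamma: spreaders lose interest
        i2 += dt * (new2 - gamma * i2)
        traj.append((s, i1, i2))
    return np.array(traj)

print(cooperative_diffusion()[-1])  # final (S, I1, I2) shares
```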
Abstract: The recent development of the Internet of Things (IoT) has resulted in the growth of IoT-based DDoS attacks. Botnet detection in IoT systems applies advanced cybersecurity measures to detect and mitigate malevolent botnets in interconnected devices. Anomaly detection models evaluate transmission patterns, network traffic, and device behaviour to detect deviations from usual activities. Machine learning (ML) techniques detect patterns signalling botnet activity, such as sudden traffic increases, unusual command-and-control patterns, or irregular device behaviour. In addition, intrusion detection systems (IDSs) and signature-based techniques are applied to recognize known malware signatures related to botnets. Various ML and deep learning (DL) techniques have been developed to detect botnet attacks in IoT systems. To overcome security issues in an IoT environment, this article designs a gorilla troops optimizer with DL-enabled botnet attack detection and classification (GTODL-BADC) technique. The GTODL-BADC technique follows feature selection (FS) with optimal DL-based classification to accomplish security in an IoT environment. For data preprocessing, the min-max data normalization approach is primarily used. The GTODL-BADC technique uses the GTO algorithm to select optimal feature subsets. Moreover, the multi-head attention-based long short-term memory (MHA-LSTM) technique is applied for botnet detection. Finally, the tree seed algorithm (TSA) is used to select the optimum hyperparameters for the MHA-LSTM method. The experimental validation of the GTODL-BADC technique is carried out on a benchmark dataset. The simulation results highlight that the GTODL-BADC technique demonstrates promising performance in the botnet detection process.
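Two of the building blocks named above can be sketched in a few lines: min-max normalization and a wrapper-style fitness that a metaheuristic such as GTO would score for a candidate binary feature mask. The nearest-centroid fitness, the synthetic data, and the fixed mask are illustrative assumptions, not the paper's implementation of GTO or MHA-LSTM.

```python
import numpy as np

def min_max_normalize(x):
    """Scale each feature column into [0, 1] (the preprocessing step)."""
    lo, hi = x.min(axis=0), x.max(axis=0)
    return (x - lo) / (hi - lo + 1e-12)

def mask_fitness(x, y, mask):
    """Score a candidate binary feature mask: accuracy of a nearest-centroid
    classifier restricted to the selected columns (an assumed, simple fitness)."""
    xs = x[:, mask.astype(bool)]
    centroids = np.stack([xs[y == c].mean(axis=0) for c in np.unique(y)])
    pred = np.argmin(((xs[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
    return (pred == y).mean()

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)                               # 0 = benign, 1 = botnet (toy labels)
x = rng.normal(size=(200, 10)) + 0.8 * y[:, None] * (np.arange(10) < 4)  # first 4 features informative
x = min_max_normalize(x)
mask = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 0])                # one candidate subset to evaluate
print(mask_fitness(x, y, mask))
```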
Funding: Funded by the National Key Research and Development Program of China (No. 2020YFC1909900), the National Natural Science Foundation of China (No. 51908550), and the Scientific Research Project of China Academy of Railway Sciences Group Corporation Limited (No. 2021YJ173).
Abstract: Purpose – This study aims to analyze the factors affecting, and the evaluation techniques for, the durability of existing railway engineering. Design/methodology/approach – China has built a railway network of over 150,000 km. Ensuring the safety of existing railway engineering is of great significance for maintaining normal railway operation order. However, railway engineering is a strip structure that crosses multiple complex environments and must withstand high-frequency impact loads from trains. These factors have led to differences in the deterioration characteristics and maintenance strategies of railway engineering compared to conventional concrete structures. Therefore, it is very important to analyze the key factors that affect the durability of railway structures and to propose technologies for durability evaluation. Findings – The factors that affect the durability and reliability of railway engineering fall mainly into three categories: material factors, environmental factors, and load factors. Material factors include influencing factors such as raw materials and mix proportions. Environmental factors vary with the service environment of the railway engineering, and the durability deterioration of concrete accordingly exhibits different failure mechanisms. Load factors include static loads and dynamic train loads. On-site rapid detection methods for five common defects in railway engineering are also proposed in this paper. These methods can quickly evaluate the durability of existing railway engineering concrete. Originality/value – The research provides new evaluation techniques and methods for the durability of existing railway engineering.
Funding: Supported in part by the National Natural Science Foundation of China (Grant Nos. 61601346 and 62377039), the Natural Science Basic Research Plan in Shaanxi Province of China (Grant No. 2018JQ6044), the Ministry of Industry and Information Technology of the People's Republic of China (Grant No. 2023-276-1-1), the Fundamental Research Funds for the Central Universities, Northwestern Polytechnical University (Grant No. 31020180QD089), and the Aeronautical Science Foundation of China (Grant Nos. 20200043053004 and 20200043053005).
Abstract: One of the major challenges arising in the internet of military things (IoMT) is accommodating massive connectivity while providing guaranteed quality of service (QoS) in terms of ultra-high reliability. In this regard, this paper presents a class of code-domain non-orthogonal multiple access (NOMA) schemes for uplink ultra-reliable networking of massive IoMT based on tactical datalinks such as Link-16 and the joint tactical information distribution system (JTIDS). In the considered scenario, a satellite equipped with Nr antennas serves K devices, including vehicles, drones, ships, sensors, handset radios, etc. Nonorthogonal coded modulation, a special form of multiple-input multiple-output (MIMO) NOMA, is proposed. The discussion starts by evaluating the output signal-to-interference-plus-noise ratio (SINR) of the receiver filter, leading to a closed-form expression for overloaded systems in which the number of admitted devices is significantly larger than the number of receive antennas, such that massive connectivity is rendered. The expression allows for the development of simple yet effective interference suppression based on power allocation and phase-shaping techniques that maximize the sum rate, since the problem can be proved equivalent to fixed-point programming. The proposed design is exemplified by nonlinear modulation schemes such as minimum shift keying (MSK) and Gaussian MSK (GMSK), two pivotal modulation formats in IoMT standards such as Link-16 and JTIDS. Numerical results show that near-capacity performance is offered. Moreover, this performance is obtained using simple forward error correction (FEC) codes of higher coding rate than existing schemes use, while the transmit power is reduced by 6 dB. The proposed design finds wide applications not only in IoMT but also in deep-space communications, where ultra-reliability and massive connectivity are key concerns.
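To give a concrete feel for the receiver-filter analysis (without reproducing the paper's closed-form expression or phase-shaping design), the numpy sketch below evaluates the per-user output SINR of a standard linear MMSE filter in an overloaded setting with more devices than antennas. The Rayleigh channel model, equal power allocation, and dimensions are assumptions.

```python
import numpy as np

def mmse_sinr(H, p, noise_var=1.0):
    """Per-user output SINR of a linear MMSE receive filter for
    y = H diag(sqrt(p)) x + n, with H of shape (Nr, K); valid even when
    K > Nr, i.e., the overloaded regime discussed above."""
    Nr, K = H.shape
    sinr = np.empty(K)
    for k in range(K):
        hk = np.sqrt(p[k]) * H[:, k]
        others = np.delete(np.arange(K), k)
        # interference-plus-noise covariance excluding user k
        R = (H[:, others] * p[others]) @ H[:, others].conj().T + noise_var * np.eye(Nr)
        sinr[k] = np.real(hk.conj() @ np.linalg.solve(R, hk))
    return sinr

rng = np.random.default_rng(1)
Nr, K = 4, 8                              # more devices than antennas: overloaded
H = (rng.normal(size=(Nr, K)) + 1j * rng.normal(size=(Nr, K))) / np.sqrt(2)
p = np.ones(K)                            # assumed equal power allocation
sinr = mmse_sinr(H, p)
print(np.round(10 * np.log10(sinr), 2))           # per-user SINR in dB
print(round(float(np.log2(1 + sinr).sum()), 2))   # achievable sum rate, bits/s/Hz
```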
Abstract: The Internet of Things (IoT) provides better solutions in various fields, namely healthcare, smart transportation, smart homes, etc. Recognizing Denial of Service (DoS) outbreaks in IoT platforms is significant for certifying the accessibility and integrity of IoT systems. Deep learning (DL) models excel at detecting complex, non-linear relationships, allowing them to effectively detect slight deviations from normal IoT activities that may indicate a DoS outbreak. The uninterrupted observation and real-time detection capabilities of DL enable accurate and rapid detection, permitting proactive mitigation actions to be executed and hence securing the IoT network's safety and functionality. Accordingly, this study presents a pigeon-inspired optimization with DL-based attack detection and classification (PIODL-ADC) approach in an IoT environment. The PIODL-ADC approach implements a hyperparameter-tuned DL method for Distributed Denial-of-Service (DDoS) attack detection on an IoT platform. Initially, the PIODL-ADC model utilizes Z-score normalization to scale input data into a uniform format. To handle the dynamic and adaptive behaviour of IoT traffic, the PIODL-ADC model employs the pigeon-inspired optimization (PIO) method for feature selection to identify the relevant features, considerably enhancing recognition accuracy. Also, the Elman Recurrent Neural Network (ERNN) model is utilized to recognize and classify DDoS attacks. Moreover, reptile search algorithm (RSA)-based hyperparameter tuning is employed to improve the precision and robustness of the ERNN method. A series of experimental validations is conducted to assess the performance of the PIODL-ADC method. The experimental outcomes show that the PIODL-ADC method outperforms existing models, with a maximum accuracy of 99.81%.
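To make the pipeline more concrete, the sketch below shows Z-score normalization followed by a forward pass of a basic Elman recurrent network that maps a traffic-feature sequence to attack/benign scores. The layer sizes, random weights, and two-class output are illustrative assumptions; the PIO feature selection and RSA hyperparameter-tuning stages are omitted.

```python
import numpy as np

def zscore(x):
    """Z-score normalization of the input feature matrix (preprocessing step)."""
    return (x - x.mean(axis=0)) / (x.std(axis=0) + 1e-12)

def elman_forward(seq, Wxh, Whh, Why, bh, by):
    """Forward pass of a basic Elman recurrent network: the hidden state is fed
    back at every step, and the final state is mapped to class probabilities."""
    h = np.zeros(Whh.shape[0])
    for x in seq:
        h = np.tanh(Wxh @ x + Whh @ h + bh)
    logits = Why @ h + by
    return np.exp(logits) / np.exp(logits).sum()   # softmax over {benign, attack}

rng = np.random.default_rng(4)
d, hidden, classes = 6, 8, 2                        # assumed dimensions
seq = zscore(rng.normal(size=(20, d)))              # one toy traffic-feature sequence
params = (rng.normal(scale=0.3, size=(hidden, d)),  # Wxh
          rng.normal(scale=0.3, size=(hidden, hidden)),  # Whh
          rng.normal(scale=0.3, size=(classes, hidden)),  # Why
          np.zeros(hidden), np.zeros(classes))      # bh, by
print(elman_forward(seq, *params))
```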
Abstract: The proliferation of IoT devices requires innovative approaches to gaining insights while preserving privacy and resources amid unprecedented data generation. However, federated learning (FL) development for IoT is still in its infancy and needs to be explored in various areas to understand the key challenges for deployment in real-world scenarios. This paper systematically reviews the available literature using the PRISMA guiding principle. The study aims to provide a detailed overview of the increasing use of FL in IoT networks, including the architecture and challenges. A systematic review approach is used to collect, categorize, and analyze FL-IoT-based articles. A search was performed in the IEEE, Elsevier, Arxiv, ACM, and WOS databases, and 92 articles were finally examined. The inclusion criteria were publication in English and the presence of the keywords "FL" and "IoT". The methodology begins with an overview of recent advances in FL and IoT, followed by a discussion of how these two technologies can be integrated. More specifically, we examine and evaluate the capabilities of FL by discussing communication protocols, frameworks, and architectures. We then present a comprehensive analysis of the use of FL in a number of key IoT applications, including smart healthcare, smart transportation, smart cities, smart industry, smart finance, and smart agriculture. The key findings from this analysis of FL-IoT services and applications are also presented. Finally, we performed a comparative analysis of FL under IID (independent and identically distributed) and non-IID data against traditional centralized deep learning (DL) approaches. We conclude that FL has better performance, especially in terms of privacy protection and resource utilization. FL is excellent for preserving privacy because model training takes place on individual devices or edge nodes, eliminating the need for centralized data aggregation, which poses significant privacy risks. To facilitate development in this rapidly evolving field, the insights presented are intended to help practitioners and researchers navigate the complex terrain of FL and IoT.
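A minimal sketch of why FL preserves privacy (training stays on the device, only model parameters are aggregated) is one FedAvg-style round for a linear model, shown below. The squared-loss model, learning rate, and synthetic client data are assumptions for illustration, not any specific framework reviewed in the paper.

```python
import numpy as np

def fedavg(global_w, client_data, lr=0.1, local_epochs=1):
    """One FedAvg round for a linear model w under squared loss. Each client
    trains locally on its own (X, y); only the weights travel back, and the
    server averages them weighted by client sample counts."""
    updates, sizes = [], []
    for X, y in client_data:
        w = global_w.copy()
        for _ in range(local_epochs):
            grad = 2 * X.T @ (X @ w - y) / len(y)   # local gradient step
            w -= lr * grad
        updates.append(w)
        sizes.append(len(y))
    return np.average(np.stack(updates), axis=0, weights=np.array(sizes, float))

rng = np.random.default_rng(2)
true_w = np.array([1.0, -2.0])
clients = []
for _ in range(5):                                  # 5 IoT devices; raw data never leaves them
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=50)))

w = np.zeros(2)
for _ in range(100):                                # 100 communication rounds
    w = fedavg(w, clients)
print(w)                                            # approaches true_w without centralizing data
```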
Funding: Supported by the Science and Technology Project of China Southern Power Grid Company Limited under Grant Number 036000KK52200058 (GDKJXM20202001).
Abstract: Time synchronization (TS) is crucial for ensuring the secure and reliable functioning of the distribution power Internet of Things (IoT). Multi-clock source time synchronization (MTS) has significant advantages in reliability and accuracy but still faces challenges such as optimizing the multi-clock source selection and the clock source weight calculation at different timescales, as well as the coupling between synchronization latency jitter and pulse phase difference. In this paper, a multi-timescale MTS model is constructed, and a reinforcement learning (RL) and analytic hierarchy process (AHP)-based multi-timescale MTS algorithm is designed to optimize the weighted sum of the synchronization-latency-jitter standard deviation and the average pulse phase difference. Specifically, the multi-clock source selection is optimized based on Softmax at the large timescale, and the clock source weight calculation is optimized based on a lower-confidence-bound-assisted AHP at the small timescale. Simulations show that the proposed algorithm can effectively reduce the time synchronization delay standard deviation and the average pulse phase difference.
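The Softmax-based selection mentioned for the large timescale can be sketched as follows: each clock source carries a value estimate, a source is drawn with probability proportional to exp(Q/temperature), and the chosen source's estimate is updated from the observed reward. The temperature, the jitter-based value estimates, and the incremental update rule are illustrative assumptions rather than the paper's exact RL formulation.

```python
import numpy as np

rng = np.random.default_rng(3)

def softmax_select(q, temperature=0.5):
    """Pick a clock source with probability proportional to exp(Q / temperature),
    favoring sources with better value estimates while still exploring the rest."""
    logits = np.asarray(q, dtype=float) / temperature
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return rng.choice(len(q), p=p), p

# Toy value estimates per clock source: negative of an observed weighted
# jitter / phase-difference metric (lower metric = better source).
q_values = [-0.8, -0.3, -1.2]
choice, probs = softmax_select(q_values)
print("selected source:", choice, "probabilities:", np.round(probs, 3))

# Incremental RL-style value update after observing a new reward for the choice.
alpha, reward = 0.1, -0.25
q_values[choice] += alpha * (reward - q_values[choice])
print("updated values:", np.round(q_values, 3))
```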
Funding: Supported in part by the National Key Research and Development Program of China under Grant 2021YFB2900404.
Abstract: Due to the limited computational capability and the diversity of Internet of Things devices working in different environments, we consider few-shot learning-based automatic modulation classification (AMC) to improve its reliability. A data enhancement module (DEM) is designed with a convolutional layer to supplement frequency-domain information and to provide a nonlinear mapping that is beneficial for AMC. The multimodal network is designed with multiple residual blocks, where each residual block has multiple convolutional kernels of different sizes for diverse feature extraction. Moreover, a deep supervised loss function is designed to supervise all parts of the network, including the hidden layers and the DEM. Since different models may output different results, a cooperative classifier is designed to avoid the randomness of a single model and improve reliability. Simulation results show that this few-shot learning-based AMC method can significantly improve AMC accuracy compared to existing methods.
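The cooperative-classifier idea can be illustrated in a few lines: per-model class probabilities are averaged before taking the argmax, so no single model's randomness decides the predicted modulation. The three probability vectors and the four-class example (e.g., BPSK/QPSK/8PSK/16QAM) are assumptions for illustration, not the paper's trained models.

```python
import numpy as np

def cooperative_predict(prob_list):
    """Fuse per-model class probabilities by averaging, then take the argmax,
    so the decision does not hinge on any single model's output."""
    fused = np.mean(np.stack(prob_list), axis=0)
    return int(np.argmax(fused, axis=-1)), fused

# Three models score one sample over four assumed modulation classes.
p1 = np.array([0.10, 0.60, 0.20, 0.10])
p2 = np.array([0.15, 0.35, 0.40, 0.10])
p3 = np.array([0.05, 0.55, 0.30, 0.10])
label, fused = cooperative_predict([p1, p2, p3])
print(label, np.round(fused, 3))   # fused decision and averaged probabilities
```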