With the rapid growth in the number of mobile devices and user connectivity, the demand for higher system capacity and improved quality of service continues to rise. As the demand for high-speed wireless communication grows, numerous modulation techniques in the frequency, temporal, and spatial domains, such as orthogonal frequency division multiplexing (OFDM), time division multiple access (TDMA), space division multiple access (SDMA), and multiple-input multiple-output (MIMO), are being developed. Alongside those approaches, the orbital angular momentum (OAM) of electromagnetic waves is attracting attention because of its potential to boost wireless communication capacity. As is well known, antenna electromagnetic radiation can be described by a sum of eigenfunctions with unique eigenvalues. To address the capacity demand, millimeter-wave (mmWave) communication has been proposed and is considered one of the candidate technologies for 5G wireless networks. OAM is an intrinsic feature of all electromagnetic waves, and the unique qualities of OAM beams have led to a range of new applications. Broadband OAM generators, however, have received very little attention, especially in the mmWave frequency band. Using OAM in conjunction with mmWave can reduce beam power loss, enhance received signal quality, and hence increase system capacity. The transmitter and receiver antennas must be coaxial and parallel to achieve precise mode detection. This study discusses the proposed mmWave system model integrated with OAM. The channel model is built from the channel transition characteristics. Simulation results demonstrate that the proposed system model is an effective way to boost system capacity.
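The capacity gain from OAM multiplexing rests on the fact that modes with distinct integer topological charges are mutually orthogonal. A minimal numerical sketch (not from the paper) of that orthogonality, sampling the azimuthal phase factor exp(ilφ) around a ring:

```python
import numpy as np

def oam_phase(l, n_points=360):
    """Phase profile exp(i*l*phi) of an OAM mode with topological charge l,
    sampled at uniform azimuthal angles around a ring."""
    phi = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
    return np.exp(1j * l * phi)

def mode_overlap(l1, l2, n_points=360):
    """Normalized inner product of two OAM modes on the ring; distinct
    integer charges are orthogonal, which is what enables mode multiplexing."""
    a, b = oam_phase(l1, n_points), oam_phase(l2, n_points)
    return abs(np.vdot(a, b)) / n_points

# Identical modes overlap fully; distinct modes are (numerically) orthogonal.
print(round(mode_overlap(1, 1), 6))  # 1.0
print(round(mode_overlap(1, 3), 6))  # 0.0
```

This orthogonality is also why the abstract's alignment requirement matters: off-axis reception mixes the azimuthal samples and degrades mode separation.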
Wireless data traffic has expanded in recent years at a rate reminiscent of Moore's prediction for integrated circuits, necessitating ongoing efforts to supply wireless systems with ever-larger data rates in the near future, even as 5G networks are still being deployed. Terahertz (THz) communication has been considered a viable response to communication blackout thanks to the rapid development of THz technology and sensors. THz communication operates at high frequencies, which allows for better penetration. It is a fast-expanding and evolving field, driven by growth in wireless traffic volume and data transfer speeds. In this work, a THz modulator based on a hybrid metasurface was designed and built, and the device's modulation capabilities were modelled and demonstrated experimentally. The active material is graphene, and the electrolyte is an ion-gel medium sandwiched between the graphene and the metasurface, which enhances the interaction between the THz wave and the graphene on the metasurface. Additionally, an external bias voltage was employed to actively regulate the THz waves by tuning the electrical conductivity of the graphene. The results show that, with a minimal bias voltage, the device can achieve a modulation depth of up to 73% at the resonant frequency. Furthermore, the resonance frequency remains almost constant during modulation. The proposed device therefore offers a new tool for substantial THz amplitude modulation at low voltages.
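The headline figure of merit here is amplitude modulation depth, conventionally (T_off − T_on)/T_off for the transmitted amplitude with the bias off versus on. A small sketch with hypothetical amplitudes (the abstract reports only the 73% result, not the raw values):

```python
def modulation_depth(t_unbiased, t_biased):
    """Amplitude modulation depth of a modulator: the fractional drop in
    transmitted amplitude when the bias is applied, (T_off - T_on) / T_off."""
    return (t_unbiased - t_biased) / t_unbiased

# Illustrative (hypothetical) amplitudes reproducing the reported 73% depth:
# biasing the graphene raises its conductivity and suppresses transmission.
print(f"{modulation_depth(1.0, 0.27):.0%}")  # 73%
```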
The Internet has penetrated all aspects of human society and has promoted social progress. Cyber-crimes in many forms are commonplace and are dangerous to society and national security, and cybersecurity has become a major concern for citizens and governments. Internet functions and software applications play a vital role in cybersecurity research and practice; most cyber-attacks are based on exploits in system or application software, so investigating software security problems is of utmost urgency. The demand for Wi-Fi applications is proliferating, but the security problems are growing with it, requiring an optimal solution from researchers. To overcome the shortcomings of the wired equivalent privacy (WEP) algorithm, the existing literature proposed security schemes for Wi-Fi protected access (WPA)/WPA2. However, in practical applications the WPA/WPA2 scheme still has weaknesses that attackers exploit. To break WPA/WPA2 security, it is necessary to obtain the pre-shared key (PSK) in pre-shared key mode, or the master session key (MSK) in authentication mode; brute-force cracking attacks can recover a PSK or an MSK. In real-world applications, many wireless local area networks (LANs) use the pre-shared key mode, so brute-force cracking of WPA/WPA2-PSK is important in that context. This article proposes a new mechanism to crack Wi-Fi passwords using a graphics processing unit (GPU) and enhances efficiency through parallel computing on multiple GPU chips. Experimental results show that the proposed algorithm is effective and provides a procedure to enhance the security of Wi-Fi networks.
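The computational core of WPA/WPA2-PSK cracking is the standard key derivation: the 256-bit pairwise master key is derived from the passphrase with PBKDF2-HMAC-SHA1, 4096 iterations, using the SSID as salt (IEEE 802.11i). A serial Python sketch of the candidate search (the paper's contribution is parallelizing this on multiple GPUs; the SSID and passwords below are invented):

```python
import hashlib
import hmac

def derive_psk(passphrase: str, ssid: str) -> bytes:
    # WPA/WPA2-PSK key derivation per IEEE 802.11i:
    # PBKDF2-HMAC-SHA1, SSID as salt, 4096 iterations, 256-bit output.
    return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)

def brute_force(candidates, ssid, target_pmk):
    # Serial stand-in for the paper's GPU search: each candidate's PBKDF2
    # derivation is independent of the others, which is exactly why the
    # workload maps well onto parallel GPU threads.
    for pw in candidates:
        if hmac.compare_digest(derive_psk(pw, ssid), target_pmk):
            return pw
    return None

target = derive_psk("correct horse", "TestAP")  # hypothetical network
print(brute_force(["12345678", "password", "correct horse"], "TestAP", target))
```

The 4096-iteration PBKDF2 is deliberately slow per guess, which is why throughput, and hence GPU parallelism, dominates the practical cost of the attack.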
Wireless sensor networks (WSNs) have received much attention as useful tools for gathering data. The energy problem has been a fundamental constraint and challenge for many WSN applications due to the size and cost constraints of the sensor nodes. This paper proposes a data fusion model based on the back-propagation neural network (BPNN) to address the problem of large volumes of invalid or redundant data. Using three-layered BPNNs and a TEEN threshold, the proposed model describes the cluster structure and filters out unnecessary details. During information transmission, the neural network's output function is used to process the large amount of sensing data: the feature value of the sensing data is extracted and transmitted to the sink node. Simulation results show that the proposed data fusion model outperforms the traditional TEEN protocol in terms of life cycle, data traffic, and network usage. As a result, the proposed scheme extends the life cycle of the network, thereby lowering energy usage and traffic.
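The TEEN threshold mechanism the model builds on suppresses redundant transmissions with two thresholds: a node reports only when a reading exceeds a hard threshold and differs from the last reported value by at least a soft threshold. A toy sketch with hypothetical readings (not the paper's data):

```python
def teen_filter(readings, hard_threshold, soft_threshold):
    """TEEN-style transmission filter: report a reading only when it exceeds
    the hard threshold AND changes from the last reported value by at least
    the soft threshold, suppressing redundant transmissions."""
    reported, last = [], None
    for value in readings:
        if value >= hard_threshold and (last is None or abs(value - last) >= soft_threshold):
            reported.append(value)
            last = value
    return reported

# Hypothetical temperature stream: only significant changes above 30 are sent,
# so 7 sensed values shrink to 3 transmissions.
print(teen_filter([25, 31, 31.2, 35, 35.1, 29, 40], hard_threshold=30, soft_threshold=2))
```

The paper's BPNN layer goes one step further than this filter by fusing the surviving readings into a single feature value before forwarding to the sink.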
The growing need for renewable energy and zero carbon dioxide emissions has fueled the development of thermoelectric generators with improved power-generating capability. Alongside the endeavor to develop thermoelectric materials with greater figures of merit, the geometrical and structural optimization of thermoelectric generators is equally critical for maximum power output and efficiency. Constantly updated green energy strategies are a viable option for addressing the global energy issue while also protecting the environment. In recent years there has been significant focus on the development of thermoelectric modules for a range of solar, automotive, military, and aerospace applications, thanks to advantages such as low vibration, high reliability and durability, and the absence of moving parts. To enhance the system performance of the thermoelectric generator, an artificial neural network (ANN) based algorithm is proposed. Furthermore, to achieve high efficiency and system stability, a buck converter is designed and deployed. Simulation and experimental findings demonstrate that the suggested method is feasible, tracks the real value closely in the steady state, and incurs minimal power losses, making it well suited to vehicle exhaust thermoelectric generator applications. The proposed hybrid algorithm is thus a useful reference for the development of a dependable and efficient car exhaust thermoelectric generation system.
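The reason a controlled buck converter helps at all is the load-matching behaviour of a thermoelectric generator: delivered power peaks when the load resistance presented to the module equals its internal resistance. A minimal sketch of that textbook relation, with illustrative (not the paper's) parameter values:

```python
def teg_output_power(seebeck, delta_t, r_internal, r_load):
    """Electrical power a thermoelectric generator delivers to a resistive
    load: P = (alpha * dT)^2 * R_L / (R_in + R_L)^2, where alpha*dT is the
    open-circuit voltage."""
    v_oc = seebeck * delta_t
    return v_oc**2 * r_load / (r_internal + r_load) ** 2

# Power is maximized at the matched load R_L = R_in; this is the operating
# point an MPPT stage (here, the ANN-controlled buck converter) tries to hold
# as the exhaust-side temperature difference varies.
p_matched = teg_output_power(seebeck=0.05, delta_t=100.0, r_internal=2.0, r_load=2.0)
p_mismatched = teg_output_power(seebeck=0.05, delta_t=100.0, r_internal=2.0, r_load=5.0)
print(p_matched > p_mismatched)  # True
```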
Wireless communication is one of the most rapidly growing fields of the communication industry. This continuous growth motivates the antenna community to design new radiating structures to meet the needs of the market. 5G wireless communication has received a lot of attention from both academia and industry, and significant efforts have been made to improve different aspects such as data rate, latency, mobility, reliability, and QoS. Antenna design has received renewed attention in the last decade due to its potential applications in 5G, IoT, mmWave, and massive MIMO. This paper proposes a novel broadband antenna design for 5G mmWave and optical communication networks. It is a hybrid structure that works in both spectra and contains an absorbing dielectric material of electrically large size. A hybrid transmission-line-theory ray-tracing technique is proposed for efficient and rapid simulation and optimization of the proposed antenna design. The operating frequency and wavelength of the proposed antenna are 28 GHz in the mmWave band and 1550 nm in the optical spectrum. The spatial frequency is 30 lp/mm when the contrast transfer function falls to 0.7 for the optical signal. The effective focal length and aperture are 816.86 mm and 200 mm. The half-power beamwidth is 3.29° and the gain is 32.97 dBi in the mmWave band. Simulation results show that the proposed hybrid antenna can effectively serve optical and mmWave 5G communication networks simultaneously.
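As a sanity check on the reported figures, the quoted beamwidth and gain can be compared via Kraus's rule of thumb relating directivity to the two principal-plane half-power beamwidths. This is a rough approximation, and the abstract gives only a single HPBW, so assuming it applies in both planes is our own simplification:

```python
import math

def kraus_gain_estimate_dbi(hpbw_e_deg, hpbw_h_deg):
    """Kraus's approximation for directivity from the two principal-plane
    half-power beamwidths: D ~ 41253 / (theta_E * theta_H), returned in dBi."""
    return 10 * math.log10(41253 / (hpbw_e_deg * hpbw_h_deg))

# Assuming 3.29 degrees in both planes gives an ideal-directivity estimate of
# about 35.8 dBi; the reported 32.97 dBi gain sitting a few dB below that is
# consistent with aperture and absorption losses in the hybrid structure.
print(round(kraus_gain_estimate_dbi(3.29, 3.29), 2))
```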
Deep neural networks (DNNs) are widely employed in a broad range of intelligent applications, including image and video recognition. However, because of the enormous amount of computation DNNs require, performing DNN inference tasks locally is problematic for resource-constrained Internet of Things (IoT) devices. Existing cloud approaches are sensitive to problems such as erratic communication delays and unreliable remote server performance. Using IoT device collaboration to create distributed and scalable DNN task inference is a very promising strategy. Existing research, however, looks exclusively at static split methods in the scenario of homogeneous IoT devices. There is therefore a pressing need to investigate how to divide DNN tasks adaptively among IoT devices with varying capabilities and resource constraints, and to execute the task inference cooperatively. Two major obstacles confront this research problem: 1) in a heterogeneous, dynamic multi-device environment, it is difficult to estimate the multi-layer inference delay of DNN tasks; 2) it is difficult to adapt the collaborative inference approach intelligently in real time. Accordingly, a multi-layer delay prediction model with fine-grained interpretability is proposed first. Then, for DNN inference tasks, evolutionary reinforcement learning (ERL) is employed to adaptively discover an approximately optimal split strategy. Experiments show that, in a heterogeneous dynamic environment, the proposed framework can provide considerable DNN inference acceleration. When the number of devices is 2, 3, and 4, the delay acceleration of the proposed algorithm is 1.81 times, 1.98 times, and 5.28 times that of the EE algorithm, respectively.
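To make the split-selection problem concrete, here is a toy two-device version: given predicted per-layer delays on each device and a transfer cost for handing over the intermediate tensor, enumerate every split point and keep the cheapest. The paper uses ERL rather than exhaustive search (the search space explodes with more devices), and the delay numbers below are invented:

```python
def best_split(layer_delays_a, layer_delays_b, transfer_delay):
    """Exhaustive stand-in for the paper's ERL search: try every split point k
    of a layer sequence, where device A runs layers [0:k] and device B runs
    layers [k:], and return the split with the lowest predicted total delay."""
    n = len(layer_delays_a)
    assert len(layer_delays_b) == n
    best_k, best_delay = 0, float("inf")
    for k in range(n + 1):
        delay = sum(layer_delays_a[:k]) + sum(layer_delays_b[k:])
        if 0 < k < n:  # an intermediate tensor must be transferred
            delay += transfer_delay
        if delay < best_delay:
            best_k, best_delay = k, delay
    return best_k, best_delay

# Hypothetical per-layer delays (ms): device A is fast on early layers, slow
# on late ones; device B is the reverse. Splitting after layer 2 wins.
print(best_split([1, 1, 3, 3], [2, 2, 2, 2], transfer_delay=1))  # (2, 7)
```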
The Internet of Things has become an unavoidable trend due to the rapid growth of networking technology. Smart home technology encompasses a variety of sectors, including intelligent transportation, allowing users to communicate with anybody or any device at any time and from anywhere. Background: Structured data is a form of separated storage that slows down the rate at which everything is connected. Data pattern matching is commonly used in data connectivity and can help with the issues mentioned above. Aim: The present pattern matching approach is ineffective given the heterogeneity and rapid expansion of large IoT data: it requires a lot of manual work and matches real-world applications poorly, and in the modern IoT context automatic pattern matching is a complex challenge. Methodology: A three-layer mapping matching approach is proposed for heterogeneous IoT data, together with a hierarchical pattern matching technique. The three layers are feature classification matching, relational feature clustering matching, and mixed element matching. Through layer-by-layer matching, the algorithm gradually narrows the matching space, improving matching quality, reducing both the number of comparisons between components and the degree of manual participation, and producing better automatic schema matching. Results: The algorithm's efficiency and performance are tested on a large number of data samples, and the results show that the technique is practical and effective. Conclusion: The proposed algorithm utilizes the instance information of the data pattern. It deploys a three-layer mapping matching approach with mixed element matching and realizes automatic pattern matching of heterogeneous data, which reduces the matching space between elements in complex patterns and improves the efficiency and accuracy of automatic matching.
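The layer-by-layer narrowing idea can be illustrated with a two-layer toy (the paper's layers and schemas are more elaborate; the field names below are hypothetical): a cheap first layer prunes candidates by data type, and only the survivors reach a costlier name-similarity layer.

```python
from difflib import SequenceMatcher

def hierarchical_match(source_fields, target_fields):
    """Toy layer-by-layer schema matching: layer 1 (type classification)
    prunes the candidate space, then layer 2 (name similarity) picks the
    final pairing among the survivors only."""
    matches = {}
    for name, ftype in source_fields.items():
        # Layer 1: keep only target fields of the same data type.
        candidates = [t for t, tt in target_fields.items() if tt == ftype]
        # Layer 2: among survivors, pick the most similar field name.
        if candidates:
            matches[name] = max(
                candidates, key=lambda t: SequenceMatcher(None, name, t).ratio()
            )
    return matches

# Hypothetical schemas from two heterogeneous IoT data sources.
src = {"temp_c": "float", "device_id": "str"}
dst = {"temperature_c": "float", "humidity": "float", "dev_id": "str"}
print(hierarchical_match(src, dst))
```

Because layer 1 shrinks the candidate set before the expensive comparison runs, the number of pairwise comparisons drops, which is the efficiency claim the abstract makes.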
At present, the microwave band bandwidth used for mobile communication is only 600 MHz. In 2020, 5G mobile communication required about 1 GHz of bandwidth, so new spectrum resources must be tapped to meet the needs of mobile Internet traffic that is expected to increase 1,000-fold over the next 10 years. Utilizing the potentially large bandwidth (30–300 GHz) of the millimeter-wave band to provide higher data rates is regarded as a likely development path for future wireless communication technology. A microstrip patch implementation approach based on electromagnetic coupling feeding is presented to increase the bandwidth of a dual-polarized millimeter-wave antenna. To extend the antenna unit's impedance bandwidth, coplanar parasitic patches and spatially parallel parasitic patches are used, and a 2×2 sub-array antenna is developed using paired inverse-feed technology. The standing-wave ratio at the centre frequency of 37.5 GHz is less than 2. The antenna array's relative bandwidth is 6.13%, the isolation is greater than 30 dB, the cross-polarization is −23.6 dB, and the gain is 11.5 dBi. The proposed dual-polarized microstrip antenna combines wide bandwidth, high port isolation, low cross-polarization, and high gain, and its performance meets the general engineering requirements of millimeter-wave dual-polarized antennas.
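The reported 6.13% relative bandwidth is the standard fractional-bandwidth figure, (f_high − f_low)/f_centre. A quick sketch back-calculating what it implies at the 37.5 GHz centre (the band-edge frequencies below are our reconstruction, not values from the abstract):

```python
def fractional_bandwidth_percent(f_low_ghz, f_high_ghz):
    """Relative (fractional) bandwidth about the arithmetic centre frequency:
    BW% = (f_high - f_low) / f_centre * 100."""
    f_centre = (f_low_ghz + f_high_ghz) / 2
    return (f_high_ghz - f_low_ghz) / f_centre * 100

# 6.13% of a 37.5 GHz centre corresponds to roughly 2.3 GHz of absolute
# bandwidth; band edges of ~36.35 and ~38.65 GHz reproduce the figure.
print(round(fractional_bandwidth_percent(36.35, 38.65), 2))  # 6.13
```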
Funding: The authors would like to thank the Royal Academy of Engineering (UK) for supporting this research under the Transforming Systems through Partnership grant, Project Number TSP1040.