The evolution of smart mobile devices has significantly impacted the way we generate and share content and has introduced a huge volume of Internet traffic. To address this issue and take advantage of the short-range communication capabilities of smart mobile devices, the decentralized content sharing approach has emerged as a suitable and promising alternative. Decentralized content sharing uses a peer-to-peer network among co-located smart mobile device users to fulfil content requests. Several articles have been published to date that address its different aspects, including group management, interest extraction, message forwarding, participation incentives, and content replication. This survey paper summarizes and critically analyzes recent advancements in decentralized content sharing and highlights potential research issues that need further consideration.
Due to the fading characteristics of wireless channels and the burstiness of data traffic, how to deal with congestion in ad-hoc networks with effective algorithms remains open and challenging. In this paper, we focus on enabling congestion control to minimize network transmission delays through flexible power control. To effectively solve the congestion problem, we propose a distributed cross-layer scheduling algorithm empowered by graph-based multi-agent deep reinforcement learning. The transmit power is adaptively adjusted in real time by our algorithm based only on local information (i.e., channel state information and queue length) and local communication (i.e., information exchanged with neighbors). Moreover, the training complexity of the algorithm is low due to regional cooperation based on the graph attention network. In the evaluation, we show that our algorithm can reduce the transmission delay of data flows under severe signal interference and drastically changing channel states, and we demonstrate its adaptability and stability in different topologies. The method is general and can be extended to various types of topologies.
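To make the graph-attention step concrete, here is a minimal sketch (assuming small invented feature sizes, random weights, and a four-level power codebook, none of which come from the paper) of how an agent could weight its neighbors' local observations and map the result to a transmit-power action:

```python
import numpy as np

def gat_aggregate(obs, adj, W, a):
    """One graph-attention layer over local observations.

    obs: (N, F) per-node features, e.g. [channel gain, queue length]
    adj: (N, N) 0/1 adjacency with self-loops (1-hop neighbors only)
    W:   (F, H) projection weights;  a: (2*H,) attention vector
    """
    z = obs @ W                                    # project features
    n = z.shape[0]
    logits = np.full((n, n), -1e9)                 # mask non-neighbors
    for i in range(n):
        for j in range(n):
            if adj[i, j]:
                logits[i, j] = np.tanh(a @ np.concatenate([z[i], z[j]]))
    alpha = np.exp(logits - logits.max(axis=1, keepdims=True))
    alpha *= adj                                   # zero out non-neighbors exactly
    alpha /= alpha.sum(axis=1, keepdims=True)
    return alpha @ z                               # neighbor-weighted embedding

def power_action(embedding, V, levels=np.array([5.0, 10.0, 15.0, 20.0])):
    """Map each node's embedding to a discrete transmit-power level (dBm)."""
    scores = embedding @ V                         # (N, len(levels)) action scores
    return levels[scores.argmax(axis=1)]

rng = np.random.default_rng(0)
N, F, H = 6, 2, 8
obs = rng.random((N, F))                           # [CSI, queue length] per node
adj = (rng.random((N, N)) < 0.4).astype(float); np.fill_diagonal(adj, 1)
W, a, V = rng.normal(size=(F, H)), rng.normal(size=2 * H), rng.normal(size=(H, 4))
print(power_action(gat_aggregate(obs, adj, W, a), V))
```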
Most existing work on survivability in mobile ad-hoc networks (MANETs) focuses on two-dimensional (2D) networks. However, many real applications run in three-dimensional (3D) networks, e.g., climate and ocean monitoring and air defense systems. The impact of node behaviors on network survivability was analyzed, and a quantitative survivability analysis method for 3D MANETs was developed by modeling node behaviors and analyzing 3D network connectivity. Node behaviors were modeled using a semi-Markov process, and the minimum node degree of 3D MANETs was discussed. An effective approach to derive the survivability of k-connected networks was proposed by analyzing the connectivity of 3D MANETs under node misbehaviors, based on a node-isolation model. The quantitative effect of node misbehaviors on the survivability of 3D MANETs is obtained through mathematical description, and the effectiveness and rationality of the proposed approach are verified through numerical analysis. The analytical results show that the effect of black and gray attacks on network survivability is much more severe than that of other misbehaviors.
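For intuition about the connectivity side of such an analysis, the following toy Monte-Carlo sketch drops nodes in a 3D cube, removes a fraction of them as misbehaving, and estimates the connectivity probability and minimum degree of what remains; the deployment region, radio range, and misbehavior fraction are invented for illustration and do not reproduce the paper's analytical derivation:

```python
import numpy as np
from collections import deque

def connected(adj):
    """BFS check that every remaining node is reachable from node 0."""
    n = len(adj)
    seen, q = {0}, deque([0])
    while q:
        u = q.popleft()
        for v in np.flatnonzero(adj[u]):
            if v not in seen:
                seen.add(v); q.append(v)
    return len(seen) == n

def trial(n=60, side=100.0, radio_range=30.0, p_misbehave=0.2, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    pts = rng.uniform(0, side, size=(n, 3))                # 3D deployment
    pts = pts[rng.random(n) > p_misbehave]                 # misbehaving nodes drop out
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    adj = (d <= radio_range) & (d > 0)                     # geometric links within radio range
    return connected(adj), adj.sum(axis=1).min()

rng = np.random.default_rng(1)
results = [trial(rng=rng) for _ in range(500)]
print("P(connected) ~", np.mean([c for c, _ in results]))
print("mean minimum degree ~", np.mean([m for _, m in results]))
```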
This work presents a multi-criteria analysis of MAC (media access control) layer misbehavior in the IEEE (Institute of Electrical and Electronics Engineers) 802.11 standard, in which greedy nodes cheat the protocol to increase their transmission rate at the expense of other, honest nodes. IEEE 802.11 forces nodes to wait for a backoff interval, randomly selected from a specified range, before initiating a transmission; greedy nodes may wait for smaller backoff intervals than honest nodes and thereby obtain an unfair share of the channel. First, a state of the art of research on IEEE 802.11 MAC layer misbehavior is presented. Then the impact of this misbehavior at the receiver is given, and we generalize this impact to a large scale. An analysis of the correlation between throughput and inter-packet times is given. Finally, we define a new metric for measuring the performance and capacity of the network.
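A small contention simulation illustrates the unfairness this misbehavior creates: each node draws a backoff slot per round and the smallest draw wins the channel, with the greedy node sampling from a deliberately narrower window. The window sizes and round count below are illustrative assumptions:

```python
import random

def contend(rounds=100_000, honest=4, cw_honest=32, cw_greedy=8, seed=0):
    """Count channel wins when one greedy node uses a smaller contention window."""
    rng = random.Random(seed)
    wins = [0] * (honest + 1)                      # index 0 is the greedy node
    for _ in range(rounds):
        backoffs = [rng.randrange(cw_greedy)]      # greedy: narrow backoff range
        backoffs += [rng.randrange(cw_honest) for _ in range(honest)]
        low = min(backoffs)
        winners = [i for i, b in enumerate(backoffs) if b == low]
        wins[rng.choice(winners)] += 1             # ties resolved at random (collisions ignored)
    total = sum(wins)
    print(f"greedy share: {wins[0] / total:.2%}, fair share would be {1 / (honest + 1):.2%}")

contend()
```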
A dynamical model is constructed to depict the spatial-temporal evolution of malware in mobile wireless sensor networks (MWSNs). Based on this model, we design a hybrid control scheme combining parameter perturbation and state feedback to effectively manipulate the spatiotemporal dynamics of malware propagation. The hybrid control can not only suppress the Turing instability caused by the diffusion factor but also adjust the occurrence of the Hopf bifurcation induced by time delay. Numerical simulation results show that the hybrid control strategy can efficiently steer the transmission dynamics toward the desired properties, thus reducing the harm of malware propagation to MWSNs.
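As a rough sketch of how such controlled spatiotemporal dynamics can be simulated, the code below integrates a generic susceptible-infected reaction-diffusion system on a 1-D ring with a simple state-feedback term that drives the infected density toward a target; the equations, rates, and feedback gain are placeholders and are not the model or controller from the paper:

```python
import numpy as np

def simulate(nx=100, steps=5000, dt=0.01, dx=1.0,
             beta=0.6, gamma=0.3, d_s=1.0, d_i=0.2, k_fb=0.5, i_target=0.05):
    """Explicit-Euler integration of a generic controlled SI reaction-diffusion model."""
    rng = np.random.default_rng(2)
    S = np.full(nx, 0.9) + 0.01 * rng.standard_normal(nx)   # susceptible density
    I = np.full(nx, 0.1) + 0.01 * rng.standard_normal(nx)   # infected density
    lap = lambda u: (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2  # ring Laplacian
    for _ in range(steps):
        control = -k_fb * (I - i_target)                     # state feedback on I
        dS = -beta * S * I + gamma * I + d_s * lap(S)
        dI = beta * S * I - gamma * I + d_i * lap(I) + control
        S, I = S + dt * dS, I + dt * dI
    return I

I = simulate()
print(f"final infected density: mean={I.mean():.3f}, spread={I.std():.3f}")
```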
Mobile edge computing (MEC)-enabled satellite-terrestrial networks (STNs) can provide Internet of Things (IoT) devices with global computing services. Sometimes the network state information is uncertain or unknown. To deal with this situation, we investigate online learning-based offloading decisions and resource allocation in MEC-enabled STNs. The problem of minimizing the average sum task completion delay of all IoT devices over all time periods is formulated. We decompose this optimization problem into a task offloading decision problem and a computing resource allocation problem. A joint optimization scheme for offloading decisions and resource allocation is then proposed, consisting of a task offloading decision algorithm based on a device-cooperation-aided upper confidence bound (UCB) algorithm and a computing resource allocation algorithm based on the Lagrange multiplier method. Simulation results validate that the proposed scheme outperforms other baseline schemes.
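For reference, the classical UCB1 index that such an offloading decision can build on picks, at each step, the option with the best optimistic estimate; the reward model below (the negative of the observed delay) and the toy set of offloading targets are assumptions for illustration:

```python
import math
import random

def ucb1_offload(delay_of, n_arms, rounds=2000, seed=3):
    """UCB1 over candidate offloading targets; reward = -observed delay."""
    rng = random.Random(seed)
    counts = [0] * n_arms
    means = [0.0] * n_arms
    for t in range(1, rounds + 1):
        if t <= n_arms:
            arm = t - 1                                     # play each arm once first
        else:
            arm = max(range(n_arms),
                      key=lambda a: means[a] + math.sqrt(2 * math.log(t) / counts[a]))
        reward = -delay_of(arm, rng)                        # smaller delay -> larger reward
        counts[arm] += 1
        means[arm] += (reward - means[arm]) / counts[arm]   # incremental mean update
    return counts

# Toy delay model: three offloading targets with different mean delays (ms).
delays = [120.0, 80.0, 150.0]
counts = ucb1_offload(lambda a, rng: rng.gauss(delays[a], 10.0), n_arms=3)
print("plays per target:", counts)   # the lowest-delay target should dominate
```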
Traffic in today's cities is a serious problem that increases travel times, negatively affects the environment, and drains financial resources. This study presents an Artificial Intelligence (AI)-augmented Mobile Ad Hoc Networks (MANETs)-based real-time prediction paradigm for urban traffic challenges. MANETs are wireless networks that are based on mobile devices and may self-organize. The distributed nature of MANETs and the power of AI approaches are leveraged in this framework to provide reliable and timely traffic congestion forecasts. This study proposes a Chaotic Spatial Fuzzy Polynomial Neural Network (CSFPNN) technique to assess real-time data acquired from various sources within the MANETs. The framework uses the proposed approach to learn from the data and create prediction models that detect possible traffic problems and their severity in real time. Real-time traffic prediction allows proactive actions such as resource allocation, dynamic route advice, and traffic signal optimization to reduce congestion. The framework supports effective decision-making, decreases travel time, lowers fuel use, and enhances overall urban mobility by giving timely information to pedestrians, drivers, and urban planners. Extensive simulations and real-world datasets are used to test the proposed framework's prediction accuracy, responsiveness, and scalability. Experimental results show that the suggested framework successfully anticipates urban traffic issues in real time, enables proactive traffic management, and aids in creating smarter, more sustainable cities.
In an era where digital technology is paramount, higher education institutions like the University of Zambia (UNZA) are employing advanced computer networks to enhance their operational capacity and offer cutting-edge services to their academic fraternity. Spanning the Great East Road campus, UNZA has established one of the most extensive computer networks in Zambia, serving a burgeoning community of over 20,000 active users through a Metropolitan Area Network (MAN). However, as the digital landscape continues to evolve, it is beset by growing challenges that threaten the very fabric of network integrity: cyber security threats and the imperative of maintaining high Quality of Service (QoS). In an effort to mitigate these threats and ensure network efficiency, the development of a mobile application to monitor temperatures in the server room was imperative. According to L. Wei, X. Zeng, and T. Shen, the use of wireless sensor networks to monitor the temperature of train switchgear contact points represents a cost-effective solution; their system is based on wireless communication technology and is detailed in "A wireless solution for train switchgear contact temperature monitoring and alarming system based on wireless communication technology", International Journal of Communications, Network and System Sciences, vol. 8, no. 4, pp. 79-87, 2015 [1]. Therefore, in this study, a mobile application technology was explored for monitoring temperatures in the server room in order to aid Cisco device performance. Additionally, this paper explores the hardening of Cisco device security and QoS, which are the cornerstones of this study.
Mobile and Internet network coverage plays an important role in digital transformation and the exploitation of new services. The evolution of mobile networks from the first generation (1G) to the fifth generation (5G) is still a long process. 2G networks introduced the messaging service, which complements the already operational voice service. 2G technology progressed rapidly to the third generation (3G), incorporating multimedia data transmission techniques, and then to the fourth generation (4G) and LTE (Long Term Evolution), increasing the transmission speed to improve on 3G. Developed countries have already moved to 5G. In developing countries, including Burundi, a member of the East African Community (EAC), more than 80% of users are connected to 2G technologies, 40% to the 3G network, and 25% to the 4G network; 5G is not yet available and its deployment is still in progress. The objective of this article is to analyze the coverage of 2G, 3G and 4G networks in Burundi. This analysis will make it possible to identify possible deficits in order to reduce the digital divide between connected urban areas and remote rural areas. Furthermore, it will draw the attention of decision-makers to the need to deploy networks and coverage to allow the population to access mobile and Internet services and thus enable the digitalization of services for the population. Finally, this article shows the level of coverage, the digital divide, and an overview of the deployment of base transceiver stations (BTS) throughout the country to promote the transformation and digital inclusion of services.
Survivability refers to the ability of a network system to fulfill critical services in a timely manner to end users in the presence of failures and/or attacks. In order to establish a highly survivable system, it is necessary to measure its survivability to evaluate the performance of the system's services under adverse conditions. According to the survivability requirements of large-scale mobile ad-hoc networks (MANETs), we propose a novel model for the quantitative evaluation of survivability. The proposed model considers various types of faults and connection states of mobile hosts, and uses a continuous-time Markov chain (CTMC) to describe the survivability of MANETs in a precise manner. We introduce reliability theory to perform quantitative analysis and survivability evaluation of segment-by-segment routing (SSR), multipath-based segment-by-segment routing (MP-SSR), and segment-by-segment-based multipath routing (SS-MPR) in large-scale MANETs. The proposed model can be used to analyze network performance much more easily than a simulation-based approach. Numerical validation shows that the proposed model yields a better evaluation of the survivability of large-scale MANETs.
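As a small illustration of the CTMC machinery such a model relies on, the sketch below solves pi Q = 0 for a toy three-state host model (healthy, degraded, failed) and reads off the long-run availability; the generator-matrix values are invented for the example and are not taken from the paper:

```python
import numpy as np

def stationary_distribution(Q):
    """Solve pi Q = 0 with sum(pi) = 1 for a CTMC generator matrix Q."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])        # balance equations plus normalization
    b = np.zeros(n + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Toy generator: states 0=healthy, 1=degraded (e.g. faulty/selfish), 2=failed.
# Off-diagonal entries are transition rates; each row sums to zero.
Q = np.array([[-0.30,  0.20,  0.10],
              [ 0.50, -0.70,  0.20],
              [ 0.80,  0.00, -0.80]])
pi = stationary_distribution(Q)
print("stationary probabilities:", np.round(pi, 3))
print("long-run availability (healthy or degraded):", round(pi[0] + pi[1], 3))
```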
Mobile computing is one of the most powerful applications of network communication and connectivity, given recent breakthroughs in the field of wireless networks and mobile ad-hoc networks (MANETs). Effective networks face several obstacles, and they must be able to transport data from one system to another with adequate precision; for most applications, a framework must ensure that the received data reflect the transmitted data. If a frame between two nodes is corrupted in the data-link layer, it should be repaired before being forwarded to other nodes, yet most link-layer protocols simply discard the frame and leave retransmission to the higher-layer protocols. In other words, information is a valuable asset and must be secured from threats. In MANETs, some applications require a network-level method for detecting and blocking such attacks. Building a secure intrusion detection system that protects the nodes and routing paths is a major challenge in MANETs: attacks that jeopardize security are discovered by the intrusion detection engine and then blocked by the network's intrusion prevention engine. By introducing a Secure Intrusion Detection System (S-IDS) into the network, a new technique for achieving security goals and preventing attacks is developed. The Secure Energy Routing (SER) protocol for MANETs is introduced in this study. The protocol addresses network security by detecting and preventing attacks in the network, and data transmission in the MANET is protected using Elliptic Curve Cryptography (ECC) with the objective of improving the level of security. Network Simulator-2 is used to simulate the network, and the experiments are compared with existing methods.
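One common way to apply elliptic-curve cryptography to payload protection between neighboring nodes, sketched with the Python cryptography package, is an ECDH key agreement followed by AES-GCM encryption of each frame; the curve, key length, and framing here are illustrative assumptions rather than the SER protocol's actual design:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def make_keypair():
    """Generate an ECC key pair on the NIST P-256 curve."""
    priv = ec.generate_private_key(ec.SECP256R1())
    return priv, priv.public_key()

def derive_session_key(my_priv, peer_pub):
    """ECDH shared secret -> 128-bit AES key via HKDF."""
    shared = my_priv.exchange(ec.ECDH(), peer_pub)
    return HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
                info=b"manet-hop-key").derive(shared)

# Two neighboring nodes agree on a key and protect a routed payload.
a_priv, a_pub = make_keypair()
b_priv, b_pub = make_keypair()
key_ab = derive_session_key(a_priv, b_pub)
key_ba = derive_session_key(b_priv, a_pub)
assert key_ab == key_ba                                  # both ends derive the same key

nonce = os.urandom(12)
packet = AESGCM(key_ab).encrypt(nonce, b"route-reply payload", b"hdr:src=A,dst=B")
plain = AESGCM(key_ba).decrypt(nonce, packet, b"hdr:src=A,dst=B")
print(plain)
```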
A jamming attack is a serious threat to mobile networks, as it can collapse the entire communication infrastructure. Since mobile nodes in mobile ad hoc networks (MANETs) communicate in a multi-hop mode, there is always a possibility for an intruder to launch a jamming attack in order to intercept communication among the nodes. In this study, a network simulation was carried out to explore and evaluate the possible impacts of a jamming attack on the MACAW protocol. Ad-hoc network modelling is used to provide the communication infrastructure among mobile nodes when building the simulation scenarios. In the simulation model, the nodes in one scenario use the AODV routing protocol, which is designed for MANETs, while the second scenario contains simulated MACAW node models for comparison. To the best of our knowledge, this paper is the first study to evaluate the performance of the MACAW protocol under a constant jamming attack. The performance of the MACAW protocol is simulated with the OPNET Modeler 14.5 software.
Power saving is one of the key issues in mobile ad-hoc networks (MANETs). It can be realized in the Medium Access Control (MAC) layer and in the network layer. However, previous work mainly addressed the MAC layer or the network layer alone, aiming to improve channel utilization by adopting variable-range transmission power control. In this paper we focus on power saving in both the MAC layer and the network layer, and propose a Power Adjusting Algorithm (PAA). In the presence of host mobility, PAA conserves energy by adjusting the transmission power to maintain the route's connectivity and by periodically restarting route discovery to find a new route with better energy efficiency. After analyzing the operation of PAA, we find that the length of the route discovery restart period is a critical parameter affecting power saving, and an energy consumption model is abstracted to find the optimal value of the restart period by analyzing the energy consumption of the algorithm. PAA handles the mobility of a MANET by adjusting the transmission power while saving energy by restarting route discovery periodically, thus balancing the energy consumed on route discovery against that consumed on packet delivery. Simulation results show that PAA saves nearly 40% energy compared with the Dynamic Source Routing protocol when the maximum speed of mobile hosts is larger than 8 m/s.
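A toy version of the restart-period trade-off can be written as an average cost per unit time: a fixed discovery cost amortized over the period plus a delivery cost that grows as the route ages and transmission power is raised to keep it connected. The cost shapes and constants below are assumptions chosen only to show how the optimum is located, not the paper's energy model:

```python
import numpy as np

def average_power(T, e_discovery=50.0, p_base=1.0, drift=0.08):
    """Average energy per unit time for restart period T (arbitrary units).

    The discovery cost e_discovery is paid once per period; delivery power is
    assumed to grow linearly with route age as nodes drift apart.
    """
    delivery = p_base * T + 0.5 * drift * T**2      # integral of p_base + drift*t over [0, T]
    return (e_discovery + delivery) / T

periods = np.linspace(5, 100, 500)
costs = np.array([average_power(T) for T in periods])
best = periods[costs.argmin()]
print(f"numerical optimum ~ {best:.1f}")
print(f"closed form sqrt(2*E/drift) = {np.sqrt(2 * 50.0 / 0.08):.1f}")
```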
In coded caching, users cache pieces of files under a specific arrangement so that the server can satisfy their requests simultaneously in the broadcast scenario via the eXclusive OR (XOR) operation and thereby reduce the amount of transmitted data. However, when users' locations change, the uploading of caching information becomes so frequent and extensive that the traffic increase outweighs the traffic reduction achieved by traditional coded caching. In this paper, we propose mobile coded caching schemes that reduce network traffic in mobility scenarios and achieve a lower cost for uploading caching information. In the cache placement phase, the proposed scheme first constructs caching patterns and then assigns the caching patterns to users, either according to the graph coloring method and the four color theorem in our centralized cache placement algorithm or randomly in our decentralized cache placement algorithm. Users are then divided into groups based on their caching patterns. As a benefit, when user movements occur, only the type of caching pattern, rather than the complete record of which file pieces are cached, is uploaded. In the content delivery phase, XOR coded caching messages are reconstructed. The transmission data volume is derived to measure the performance of the proposed schemes. Numerical results show that the proposed schemes achieve great improvement in traffic offloading.
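The sketch below shows both ingredients in miniature: a greedy coloring that assigns one of a small number of caching patterns to cells so that neighboring cells differ, and an XOR combination of file pieces that serves two requests with one broadcast. The cell adjacency, pattern count, and file pieces are invented for illustration:

```python
def greedy_coloring(neighbors, num_patterns=4):
    """Assign a caching-pattern index to each cell so adjacent cells differ."""
    pattern = {}
    for cell in sorted(neighbors):
        used = {pattern[n] for n in neighbors[cell] if n in pattern}
        pattern[cell] = next(c for c in range(num_patterns) if c not in used)
    return pattern

def xor_pieces(a: bytes, b: bytes) -> bytes:
    """Coded broadcast payload: piece wanted by user 1 XOR piece wanted by user 2."""
    return bytes(x ^ y for x, y in zip(a, b))

# Toy cell adjacency; four patterns suffice (cf. the four color theorem for planar maps).
cells = {"A": ["B", "C"], "B": ["A", "C", "D"], "C": ["A", "B", "D"], "D": ["B", "C"]}
print(greedy_coloring(cells))            # e.g. {'A': 0, 'B': 1, 'C': 2, 'D': 0}

piece_for_u1, piece_for_u2 = b"file1-part2", b"file2-part1"
coded = xor_pieces(piece_for_u1, piece_for_u2)
# Each user cancels the piece it already caches to recover the one it requested.
print(xor_pieces(coded, piece_for_u2))   # user 1 recovers b"file1-part2"
print(xor_pieces(coded, piece_for_u1))   # user 2 recovers b"file2-part1"
```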
A large number of Web APIs have been released as services in mobile communications, but the service provided by a single Web API is usually limited. To enrich the services in mobile communications, developers combine Web APIs to build new services known as mashups. The emergence of mashups greatly increases the number of services in mobile communications, especially in mobile networks and the Internet of Things (IoT), and has encouraged companies and individuals to develop even more mashups, leading to a dramatic increase in their number. Such a trend brings with it big data, such as the massive text data from the mashups themselves and the continually generated usage data. Thus, how to determine the most suitable mashups from big data has become a challenging problem. In this paper, we propose a mashup recommendation framework for big data in mobile networks and the IoT. The proposed framework is driven by machine learning techniques, including neural embedding, clustering, and matrix factorization. We employ neural embedding to learn distributed representations of mashups and propose cluster analysis to learn the relationships among mashups. We also develop a novel Joint Matrix Factorization (JMF) model to complete the mashup recommendation task, for which we design a new objective function and an optimization algorithm. We then crawl a large real-world mashup dataset and perform experiments. The experimental results demonstrate that our framework achieves high accuracy in mashup recommendation and performs better than all compared baselines.
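A minimal matrix-factorization core of the kind such a recommender builds on is sketched below: low-rank user and mashup factors are learned by stochastic gradient descent on observed interactions. The interaction matrix, rank, and hyper-parameters are toy assumptions, and the joint objective and clustering terms of the paper's JMF model are omitted:

```python
import numpy as np

def factorize(R, rank=3, lr=0.01, reg=0.05, epochs=300, seed=4):
    """Plain matrix factorization R ~= U @ V.T on the observed (non-zero) entries."""
    rng = np.random.default_rng(seed)
    n_users, n_items = R.shape
    U = 0.1 * rng.standard_normal((n_users, rank))
    V = 0.1 * rng.standard_normal((n_items, rank))
    obs = list(zip(*np.nonzero(R)))                     # observed (user, mashup) pairs
    for _ in range(epochs):
        for u, i in obs:
            err = R[u, i] - U[u] @ V[i]
            u_old = U[u].copy()
            U[u] += lr * (err * V[i] - reg * U[u])      # gradient step with L2 regularization
            V[i] += lr * (err * u_old - reg * V[i])
    return U, V

# Toy user-mashup interaction matrix (0 = unobserved).
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
U, V = factorize(R)
scores = U @ V.T
print("predicted score for user 0, mashup 2:", round(scores[0, 2], 2))
```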
Accurate traffic pattern prediction in large-scale networks is of great importance for intelligent system management and automatic resource allocation. System-level mobile traffic forecasting poses significant challenges due to the tremendous temporal and spatial dynamics introduced by diverse Internet user behaviors and frequent traffic migration. Spatial-temporal graph modeling is an efficient approach for analyzing the spatial relations and temporal trends of mobile traffic in a large system. Previous research may not reflect the optimal dependency structure, either by ignoring inter-base-station dependency or by pre-determining the explicit geographical distance as the interrelationship of base stations. To overcome the limitations of a fixed graph structure, this study proposes an adaptive graph convolutional network (AGCN) that captures the latent spatial dependency by developing self-adaptive dependency matrices and acquires temporal dependency using recurrent neural networks. Evaluated on two mobile network datasets, the experimental results demonstrate that this method outperforms other baselines and reduces the mean absolute error by 3.7% and 5.6% compared to time-series-based approaches.
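One widely used way to realize a self-adaptive dependency matrix, assumed here for illustration (the paper's exact construction may differ), is to learn two node-embedding tables and form a row-normalized adjacency from their product, which the graph convolution then uses instead of a distance-based graph:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def adaptive_adjacency(E1, E2):
    """Self-adaptive dependency matrix built from learnable node embeddings."""
    return softmax(np.maximum(E1 @ E2.T, 0.0))        # row-normalized ReLU(E1 E2^T)

def graph_conv(X, A, W):
    """One adaptive graph convolution: aggregate neighbor traffic features, then project."""
    return np.maximum(A @ X @ W, 0.0)

rng = np.random.default_rng(5)
n_stations, emb_dim, feat_in, feat_out = 8, 4, 6, 16
E1, E2 = rng.standard_normal((n_stations, emb_dim)), rng.standard_normal((n_stations, emb_dim))
X = rng.standard_normal((n_stations, feat_in))        # per-base-station traffic features
W = rng.standard_normal((feat_in, feat_out))
A = adaptive_adjacency(E1, E2)                        # latent spatial dependency (learned)
print(graph_conv(X, A, W).shape)                      # (8, 16) hidden representation
```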
Cognitive Radio Networks (CRNs) have become a successful platform in recent years for a diverse range of future systems, in particular industrial Internet of Things (IIoT) applications. In order to provide efficient connections among IIoT devices, CRNs enhance spectrum utilization by using licensed spectrum. However, the routing protocol in these networks is considered one of the main problems due to node mobility and time-variant channel selection. Specifically, channel selection for the routing protocol is indispensable in CRNs to provide adequate adaptation to Primary User (PU) activity and to create a robust routing path. This study aims to construct a robust routing path by minimizing PU interference and routing delay to maximize throughput within the IIoT domain. A generic routing framework is therefore investigated from a cross-layer perspective; it shares information resources by exploiting a recently proposed method, namely the Channel Availability Probability. Moreover, a novel cross-layer-oriented routing protocol is proposed that uses a time-variant channel estimation technique. This protocol combines lower-layer (physical layer and data link layer) sensing derived from the channel estimation model, and it periodically updates and stores the routing table for optimal route decision-making. In order to achieve higher throughput and lower delay, a new routing metric is also presented. To evaluate the performance of the proposed protocol, network simulations have been conducted and compared with widely used routing protocols as benchmarks. The simulation results of different routing scenarios demonstrate that the proposed solution outperforms the existing protocols in terms of standard network performance metrics, including packet delivery ratio (with an improvement of approximately 5-20%), under varying numbers of PUs and cognitive users in Mobile Cognitive Radio Networks (MCRNs). Moreover, the cross-layer routing protocol successfully achieves high routing performance in finding a robust route, selecting channels with high stability, and reducing the probability of PU interference for continued communication.
Mobile Industrial Internet of Things (IIoT) applications have achieved explosive growth in recent years. The mobile IIoT has flourished and become the backbone of industry, laying a solid foundation for the interconnection of all things. The variety of application scenarios brings serious challenges to mobile IIoT networks, which face complex and changeable communication environments. Ensuring secure data transmission is critical for mobile IIoT networks. This paper investigates the prediction of the secure data transmission performance of mobile IIoT networks. To cut down computational complexity, we propose a secure data transmission scheme employing Transmit Antenna Selection (TAS). Novel secrecy performance expressions are first derived. Then, to realize real-time secrecy analysis, we design an improved Convolutional Neural Network (CNN) model and propose an intelligent prediction algorithm for secure transmission performance. For mobile signals, important features may be removed by pooling layers, which negatively affects secrecy performance prediction. A novel nine-layer improved CNN model is therefore designed: apart from the input and output layers, it removes the pooling layers and contains six convolutional layers. Elman, Back-Propagation (BP), and LeNet methods are compared with the proposed algorithm. Simulation analysis shows that the CNN algorithm achieves good prediction accuracy, with an increase of 59% in prediction accuracy.
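A pooling-free model with six convolutional layers in the spirit described can be sketched in PyTorch as follows; the input shape (two feature channels of 64 samples), hidden width, and single regression output are assumptions, since the exact layer dimensions are not given here:

```python
import torch
import torch.nn as nn

class SecrecyCNN(nn.Module):
    """Pooling-free CNN for secrecy-performance regression (illustrative layout)."""
    def __init__(self, in_channels=2, hidden=32):
        super().__init__()
        convs = []
        c = in_channels
        for _ in range(6):                        # six convolution layers, no pooling
            convs += [nn.Conv1d(c, hidden, kernel_size=3, padding=1), nn.ReLU()]
            c = hidden
        self.features = nn.Sequential(*convs)
        self.head = nn.Sequential(nn.Flatten(), nn.LazyLinear(1))   # output layer

    def forward(self, x):                         # x: (batch, channels, samples)
        return self.head(self.features(x))

model = SecrecyCNN()
batch = torch.randn(8, 2, 64)                     # e.g. SNR/CSI feature sequences
print(model(batch).shape)                         # torch.Size([8, 1]) predicted secrecy metric
```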
For mobile satellite networks, an appropriate handover scheme should be devised to shorten handover delay while optimizing the use of network resources. By introducing a handover cost model for services, this article proposes a rerouting triggering scheme for path optimization after handover and a new minimum-cost handover algorithm for mobile satellite networks. This algorithm guarantees quality of service (QoS) parameters such as delay during the handover and minimizes the handover cost. Simulation indicates that this algorithm is superior to other current algorithms in guaranteeing QoS and decreasing handover costs.
Satellite-terrestrial networks can transcend the geographical constraints inherent in traditional communication networks, enabling global coverage and offering users ubiquitous computing power support, which is an important development direction for future communications. In this paper, we consider a multi-scenario network model under the coverage of a low earth orbit (LEO) satellite, which can provide computing resources to users in remote areas to improve task processing efficiency. However, LEO satellites have limited computing and communication resources, and the channels are time-varying and complex, which makes the extraction of state information a daunting task. Therefore, we explore the dynamic resource management issue of joint computing and communication resource allocation and power control for multi-access edge computing (MEC). To tackle this issue, we transform it into a Markov decision process (MDP) problem and propose the self-attention based dynamic resource management (SABDRM) algorithm, which effectively extracts state-information features to enhance the training process. Simulation results show that the proposed algorithm effectively reduces the long-term average delay and energy consumption of the tasks.
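The state-feature extraction step can be pictured with plain scaled dot-product self-attention over a short history of per-slot network states (for example queue backlog, channel quality, free computing capacity, and power budget); the feature layout and dimensions below are assumptions for illustration and not the SABDRM network itself:

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of state vectors."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])       # pairwise relevance of time slots
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                            # context-aware state features

rng = np.random.default_rng(6)
T, d_state, d_model = 10, 4, 8                    # 10 recent slots, 4 raw state features
X = rng.standard_normal((T, d_state))             # e.g. [queue, channel gain, CPU free, power]
Wq, Wk, Wv = (rng.standard_normal((d_state, d_model)) for _ in range(3))
features = self_attention(X, Wq, Wk, Wv)
print(features.shape)                             # (10, 8) attended state representation
```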