The application and development of wide-area measurement systems (WAMS) has enabled many applications and created several requirements based on dynamic measurement data. Such data are transmitted as a big-data information flow. To ensure effective transmission of wide-frequency electrical information by the communication protocol of a WAMS, this study performs real-time traffic monitoring and analysis of the data network of a power information system and establishes corresponding network optimization strategies to solve existing transmission problems. The study uses the traffic analysis results obtained from the current real-time dynamic monitoring system to design an optimization strategy covering three progressive levels: the underlying communication protocol, the source data, and the transmission process. Optimization of the system structure and scheduling optimization of data information are validated as feasible and practical through tests.
Tidal traffic in Heterogeneous Wireless Networks (HWNs) has radically increased the complexity of radio resource management and its performance analysis. In this paper, a Simplified Dynamic Hierarchy Resource Management (SDHRM) algorithm that exploits resources dynamically and intelligently is proposed with consideration of tidal traffic. In network-level resource allocation, the proposed algorithm first adopts a wavelet neural network to forecast the traffic of each sub-area and then allocates resources to those sub-areas to maximise network utility. In connection-level network selection, based on this resource allocation and pre-defined QoS requirements, three typical network selection policies are provided to assign traffic flows to the most appropriate network. Furthermore, based on a multidimensional Markov model, we analyse the performance of SDHRM in HWNs with heavy-tailed traffic. Numerical results show that our theoretical values coincide with the simulation results and that SDHRM can improve resource utilization.
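The abstract does not state the network utility function used in the allocation step. As a purely illustrative sketch, assume each sub-area i has forecast traffic t_i and the objective is the weighted log utility Σ t_i log x_i subject to a total capacity constraint; that assumed objective admits a closed-form proportional optimum:

```python
def utility_max_allocation(forecast, capacity):
    """Allocate capacity across sub-areas under a weighted log utility.

    Maximizing sum_i t_i * log(x_i) subject to sum_i x_i = capacity
    gives the closed form x_i = capacity * t_i / sum(t).
    (The utility model is a hypothetical stand-in; the paper's exact
    objective may differ.)
    """
    total = sum(forecast)
    return [capacity * t / total for t in forecast]

# Sub-areas forecast to carry more traffic receive proportionally more resources.
allocation = utility_max_allocation([30.0, 10.0, 10.0], capacity=100.0)
```

Under this assumption the wavelet-network forecast only needs to produce the weights t_i; the allocation itself is a one-line closed form.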
The development of communication technologies that support traffic-intensive applications presents new challenges in designing a real-time traffic analysis architecture and an accurate method suitable for a wide variety of traffic types. Current traffic analysis methods are executed on the cloud, which requires uploading the traffic data. Fog computing is a more promising way to save bandwidth resources by offloading these tasks to fog nodes. However, traffic analysis models based on traditional machine learning must retrain on all traffic data whenever the trained model is updated, which is unsuitable for fog computing because of its limited computing power. In this study, we design a novel fog-computing-based traffic analysis system using broad learning. For one thing, fog computing provides a distributed architecture that saves bandwidth resources. For another, we use broad learning to train on the traffic data incrementally, which is better suited to fog computing because it supports incremental model updates without retraining on all data. We implement our system on the Raspberry Pi, and experimental results show that it identifies the traffic data accurately with 98% probability. Moreover, our method trains faster than a Convolutional Neural Network (CNN).
Recent economic crises, such as the 2008 financial tsunami, have demonstrated a critical need for better understanding of the topologies and the various economic, social, and technical mechanisms of the increasingly interconnected global financial system. Such a system largely relies on the interconnectedness of financial entities such as banks, firms, and investors through complex financial relationships such as interbank payment networks, investment relations, and supply chains. A network-based perspective or approach is needed to study these financial networks in order to improve or extend financial theories, as well as to develop business applications. Moreover, with the advance of big-data-related technologies and the availability of huge amounts of financial and economic network data, advanced computing technologies and data analytics that can comprehend such big data are also needed. We refer to this approach as financial network analytics. We suggest that it will enable stakeholders to better understand the network dynamics within the interconnected global financial system, help design financial policies such as managing and monitoring banking systemic risk, and support the development of intelligent business applications such as banking advisory systems. In this paper, we review the existing research on financial network analytics and then discuss its main research challenges from the economic, social, and technological perspectives.
In this paper, on the basis of the implementation of national chemical industry standard analytical test methods and the analysis of test items, a network management application for food additive quality analysis and inspection is developed using the Visual Basic language and its computer system operating environment, providing users with network management software for the quality analytical testing of food additives. The software sets up an information-sharing network platform for enterprises and quality management departments, which is a major innovation in the management methods and tools for food additive quality analysis and testing.
A network analyzer can often comprehend many protocols, which enables it to display conversations taking place between hosts over a network. A network analyzer analyzes the device or network response and takes measurements so that the operator can keep an eye on the performance of the network or of an object in an RF circuit. The purpose of the following research is to analyze the capabilities of the NetFlow Analyzer to measure various components, including filters, mixers, frequency-sensitive networks, transistors, and other RF-based instruments. The NetFlow Analyzer is a network traffic analyzer that measures the network parameters of electrical networks. Although there are other types of network parameter sets, including Y-, Z-, and H-parameters, these instruments are typically employed to measure S-parameters, since the transmission and reflection of electrical networks are simple to calculate at high frequencies. Such analyzers are widely employed to characterize two-port networks, including filters and amplifiers. By allowing the user to view the actual data sent over a network, packet by packet, a network analyzer shows what is happening on it. This research also presents a design model of the NetFlow Analyzer for measurements involving transmission and reflection. Gain, insertion loss, and the transmission coefficient are measured in transmission measurements, whereas return loss, the reflection coefficient, impedance, and other variables are measured in reflection measurements. The operational frequencies of these analyzers range from 1 Hz to 1.5 THz. They can also be used to examine stability in measurements of open loops, audio components, and ultrasonics.
The performance model proposed by this study represents an innovative approach to performance assessment in ATM (air traffic management). It is based on the Bayesian network methodology, which presents several advantages but also some drawbacks, as highlighted throughout the paper. We illustrate the main steps required for building the model and present a number of interesting results. The contribution of the paper is two-fold: (1) it presents a new methodological approach to a problem of strategic importance for ANSPs (air navigation service providers); (2) it provides insights into the interdependencies between the factors influencing performance. Both results are considered particularly important nowadays, due to the SES (Single European Sky) performance scheme and its related target-setting process.
With the advent of large-scale, high-speed IPv6 network technology, effective multi-point traffic sampling is becoming a necessity. A distributed multi-point traffic sampling method that provides an accurate and efficient solution for measuring IPv6 traffic is proposed. The method samples IPv6 traffic based on an analysis of the bit randomness of each byte in the packet header. It offers a way to consistently select the same subset of packets at each measurement point, which satisfies the requirement of distributed multi-point measurement. Finally, using real IPv6 traffic traces, it is shown that the sampled traffic data have good uniformity, satisfying the randomness requirement, and correctly reflect the packet size distribution of the full packet trace.
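The abstract does not spell out which header bytes are used, but the consistency requirement it describes (every measurement point independently selecting the same packet subset) is commonly met by hashing invariant, high-randomness header fields and comparing the hash against a threshold. A minimal sketch under that assumption:

```python
import hashlib

def should_sample(header_bytes: bytes, ratio: float = 0.01) -> bool:
    """Decide deterministically whether to sample a packet.

    Hashing invariant header fields means every measurement point that
    sees the same packet makes the same decision, with no coordination.
    (The choice of fields here is illustrative, not the paper's method.)
    """
    digest = hashlib.sha256(header_bytes).digest()
    value = int.from_bytes(digest[:4], "big")  # roughly uniform in [0, 2^32)
    return value < ratio * 2**32

# Two independent monitors sampling the same packet always agree.
packet = bytes.fromhex("6000000000140640") + bytes(32)  # toy IPv6 header
```

Because the decision is a pure function of the packet contents, the sampled subsets at different vantage points can be correlated afterwards without clock synchronisation.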
The Deep Packet Inspection (DPI) method is a popular method that can accurately identify flow data and its corresponding application. Currently, the DPI method is widely used in common network management systems. However, the major limitation of DPI systems is that their signature libraries are mainly extracted manually, which makes it hard to efficiently obtain the signatures of new applications. Hence, in this paper, we propose an automatic signature extraction mechanism using Principal Component Analysis (PCA) technology. In the proposed method, the signatures are expressed as serial consistent sequences constructed from principal components, rather than the normally separated substrings of the original data extracted by traditional methods. Extensive experiments based on numerous data sets have been carried out to evaluate the performance of the proposed scheme, and the results prove that the newly proposed method achieves good performance in terms of accuracy and efficiency.
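The paper's exact sequence construction is not reproduced here, but the core idea (treating payload bytes as feature vectors and letting PCA surface the byte positions that vary together across flows of one application) can be sketched as follows; the payload length, component count, and sample payloads are illustrative assumptions:

```python
import numpy as np

def pca_signature(payloads, n_bytes=32, n_components=2):
    """Derive an application 'signature' from raw payload bytes via PCA.

    The first n_bytes of each flow payload become one feature vector
    (zero-padded if short); the top principal directions capture the
    byte positions that co-vary across flows. This is an illustrative
    sketch, not the paper's exact sequence encoding.
    """
    rows = [p[:n_bytes].ljust(n_bytes, b"\x00") for p in payloads]
    X = np.array([list(r) for r in rows], dtype=float)
    X -= X.mean(axis=0)                      # center before PCA
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return vt[:n_components]                 # principal directions

sig = pca_signature([b"GET /index.html", b"GET /style.css",
                     b"POST /login", b"GET /favicon.ico"])
```

Matching a new flow would then amount to projecting its payload vector onto these directions and thresholding the reconstruction error.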
The lack of current network dynamics studies that evaluate the effects of new application and protocol deployment, or of long-term studies that observe the effect of incremental changes on the Internet and the change in its overall stability under various conditions and threats, has made network monitoring challenging. A good understanding of the nature and type of network traffic is the key to solving congestion problems. In this paper we describe the architecture and implementation of a scalable network traffic monitoring and analysis system. The gigabit interface on the monitoring system was configured to capture network traffic, and the Multi Router Traffic Grapher (MRTG) and Webalizer produce graphical and detailed traffic analysis. This system is in use at Obafemi Awolowo University, Ile-Ife, Nigeria; we describe how it can be replicated in another environment.
The phenomenon of data explosion represents a severe challenge for the upcoming big-data era. However, the current Internet architecture is insufficient for dealing with a huge amount of traffic, owing to the increase in redundant content transmission and the end-point-based communication model. Information-centric networking (ICN) is a paradigm for the future Internet that can be utilized to resolve the data explosion problem. In this paper, we focus on content-centric networking (CCN), one of the key candidate ICN architectures. CCN has been studied in various network environments with the aim of relieving network and server burden, especially through its name-based forwarding and in-network caching functionalities. This paper studies the effect of several caching strategies in the CCN domain from the perspective of network and server overhead. Thus, we comprehensively analyze the in-network caching performance of CCN under several popular cache replication methods (i.e., cache placement). We evaluate the performance with respect to well-known Internet traffic patterns that follow certain probabilistic distributions, such as the Zipf and Mandelbrot-Zipf distributions, and flash crowds. For the experiments, we developed an OPNET-based CCN simulator with a realistic Internet-like topology.
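The interplay the abstract evaluates (Zipf-skewed content popularity against an in-network cache) can be reproduced in miniature. The sketch below draws requests from a Zipf distribution and replays them against an LRU cache as a simple stand-in for the paper's cache replication methods; catalogue size, exponent, and cache sizes are made-up parameters:

```python
import random
from collections import OrderedDict

def zipf_requests(n_contents, n_requests, alpha=0.8, seed=42):
    """Draw content requests from a Zipf popularity distribution."""
    rng = random.Random(seed)
    weights = [rank ** -alpha for rank in range(1, n_contents + 1)]
    return rng.choices(range(n_contents), weights=weights, k=n_requests)

def lru_hit_ratio(requests, cache_size):
    """Replay requests against an LRU cache and report the hit ratio."""
    cache, hits = OrderedDict(), 0
    for item in requests:
        if item in cache:
            hits += 1
            cache.move_to_end(item)          # refresh recency
        else:
            cache[item] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)    # evict least recently used
    return hits / len(requests)

reqs = zipf_requests(n_contents=1000, n_requests=20000)
```

Because Zipf popularity is heavily skewed toward a few items, even a cache much smaller than the catalogue absorbs a disproportionate share of requests, which is exactly the server-offload effect CCN caching relies on.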
To address the ontology understanding problem, the structural features and potentially important terms of a large-scale ontology are investigated from the perspective of complex network analysis. Through empirical studies of the Gene Ontology from various perspectives, this paper shows that the whole Gene Ontology displays the same topological features as complex networks, including the "small world" and "scale-free" properties, while some sub-ontologies have the "scale-free" property but no "small world" effect. The potentially important terms in an ontology are discovered using several well-known complex network centralization methods. An evaluation method based on information retrieval in MEDLINE is designed to measure the effectiveness of the discovered important terms. Based on the relevant literature for the Gene Ontology terms, the suitability of these centralization methods for discovering important ontology concepts is quantitatively evaluated. The experimental results indicate that betweenness centrality is the most appropriate method among all the evaluated centralization measures.
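The measure the experiments single out, betweenness centrality, scores a term by how often it lies on shortest paths between other terms. A compact implementation of Brandes' algorithm for the unweighted case is sketched below; the toy adjacency list is illustrative, not Gene Ontology data:

```python
from collections import deque

def betweenness(adj):
    """Brandes' betweenness centrality for an unweighted graph.

    adj maps each node to its neighbour list. For an undirected graph,
    each unordered pair of endpoints contributes twice (once per direction).
    """
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack = []
        preds = {v: [] for v in adj}       # predecessors on shortest paths
        sigma = {v: 0 for v in adj}        # number of shortest paths from s
        sigma[s] = 1
        dist = {v: -1 for v in adj}
        dist[s] = 0
        queue = deque([s])
        while queue:                       # BFS from s
            v = queue.popleft()
            stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                       # back-propagate dependencies
            w = stack.pop()
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc
```

On a path graph a-b-c, only the middle node sits between other pairs, so it alone receives a positive score.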
In this paper we apply nonlinear time series analysis to small-time-scale traffic measurement data. A prediction-based method is used to determine the embedding dimension of the traffic data. Based on the reconstructed phase space, a local support vector machine prediction method is used to predict the traffic measurement data, and a BIC-based neighbouring-point selection method is used to choose the number of nearest neighbouring points for the local support vector machine regression model. The experimental results show that the local support vector machine prediction method, with its neighbouring points optimized, can effectively predict small-time-scale traffic measurement data and can reproduce the statistical features of real traffic measurements.
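The phase-space machinery the abstract relies on can be shown without the trained SVM. The sketch below reconstructs the attractor by delay embedding and predicts the next value as the mean successor of the k nearest reconstructed states, a nearest-neighbour stand-in for the paper's local SVM regressor; the embedding parameters and the sine series standing in for a traffic trace are made-up:

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Reconstruct phase-space vectors (x_t, x_{t+tau}, ..., x_{t+(dim-1)tau})."""
    m = (dim - 1) * tau
    return np.array([series[i:i + m + 1:tau] for i in range(len(series) - m)])

def local_predict(series, dim=3, tau=1, k=5):
    """Predict the next value from the k nearest phase-space neighbours."""
    series = np.asarray(series, dtype=float)
    m = (dim - 1) * tau
    states = delay_embed(series[:-1], dim, tau)   # states with a known successor
    successors = series[m + 1:]
    query = series[len(series) - 1 - m::tau]      # the current state
    dists = np.linalg.norm(states - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return successors[nearest].mean()

# A smooth periodic series stands in for the traffic trace.
trace = np.sin(0.1 * np.arange(300))
next_value = local_predict(trace)
```

The paper's method replaces the averaging step with a support vector regression fitted only on those k neighbours, and chooses k itself by the BIC criterion.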
Cloud-based satellite and terrestrial spectrum-shared networks (CB-STSSN) combine the triple advantages of the efficient and flexible network management of heterogeneous cloud radio access networks (H-CRAN), the vast coverage of satellite networks, and the good communication quality of terrestrial networks. Thanks to their complementary coverage characteristics, anytime, anywhere high-speed communications can be achieved to meet the various needs of users. The scarcity of spectrum resources is a common problem in both satellite and terrestrial networks. In order to improve resource utilization, the spectrum is shared not only within each component but also between satellite beams and terrestrial cells, which introduces inter-component interference. To this end, this paper first proposes an analytical framework that considers the inter-component interference induced by spectrum sharing (SS). An intelligent SS scheme based on a radio map (RM) is then proposed, consisting of LSTM-based beam prediction (BP), transfer-learning-based spectrum prediction (SP), and joint non-preemptive-priority and preemptive-priority (J-NPAP)-based proportional-fair spectrum allocation. The simulation results show that the proposed scheme improves the spectrum utilization rate of CB-STSSN and decreases the user blocking rate and waiting probability.
Urban buildings and the urban traffic network are considered the vital arteries of cities, with particular importance after a crisis for search and rescue operations. The aim of this study is to determine the vulnerability of urban areas, especially buildings and traffic networks, using multi-criteria geographic information systems and decision-making methods. Since many of the criteria affecting seismic vulnerability have uncertain and vague properties, the method of this paper applies the fuzzy ordered weighted average (OWA) to model the seismic vulnerability of urban buildings and traffic networks in the most optimistic and most pessimistic states. The study area is District 6 of Tehran, which is affected by four major faults and will thus be threatened by earthquakes. The results illustrate vulnerability at different risk levels: very high, high, medium, low, and very low. They show that in the most optimistic case 14%, and in the pessimistic case 1%, of buildings fall in the very low vulnerability class. The vulnerability of the urban street network likewise indicates that in the optimistic case 12%, and in the pessimistic case at most 9%, of the area is in appropriate condition, and that the north and northeast of the study area are more vulnerable than the south.
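The defining trick of the OWA operator used above is that its weights attach to the rank positions of the sorted criterion scores rather than to fixed criteria, so one weight vector yields the optimistic aggregate and another the pessimistic one. A minimal numeric sketch (the criterion scores are made-up values, not study data):

```python
def owa(scores, weights):
    """Ordered weighted average: weights apply to sorted scores, not to criteria."""
    ordered = sorted(scores, reverse=True)           # descending order
    total = sum(weights)
    return sum(s * w for s, w in zip(ordered, weights)) / total

vulnerability_scores = [0.2, 0.9, 0.5]   # hypothetical criterion scores for one parcel
optimistic  = owa(vulnerability_scores, [1, 0, 0])   # all weight on the largest: acts like max
pessimistic = owa(vulnerability_scores, [0, 0, 1])   # all weight on the smallest: acts like min
neutral     = owa(vulnerability_scores, [1, 1, 1])   # plain average
```

The fuzzy variant in the paper additionally derives the weight vector from a linguistic quantifier, sweeping the aggregation continuously between these two extremes.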
With the rapid growth of network bandwidth, traffic identification is currently an important challenge for network management and security. In recent years, packet sampling has been widely used in most network management systems. In this paper, in order to improve the accuracy of network traffic identification, sampled NetFlow data are applied to traffic identification, and the impact of packet sampling on the accuracy of the identification method is studied. This study includes feature selection, a metric correlation analysis for application behavior, and a traffic identification algorithm. Theoretical analysis and experimental results show that the significance of behavior characteristics becomes lower in a packet-sampling environment. Meanwhile, the correlation analysis shows different trends for different features. However, as long as the number of flows meets the statistical requirement, the feature selection and the correlation degree are independent of the sampling ratio. At a high sampling ratio, where less effective information is available, the identification accuracy is much lower than with unsampled packets. Finally, in order to improve identification accuracy, we propose a Deep Belief Network Application Identification (DBNAI) method, which achieves better classification performance than other state-of-the-art methods.
The continual growth in the use of technological appliances during the COVID-19 pandemic has resulted in a massive volume of data flow on the Internet, as many employees have transitioned to working from home. Furthermore, with the increasing adoption of encrypted data transmission by people who use a Virtual Private Network (VPN) or the Tor Browser (dark web) to keep their data private and hidden, network traffic encryption is rapidly becoming a universal approach. This affects and complicates the quality of service (QoS), traffic monitoring, and network security provided by Internet Service Providers (ISPs), particularly for analysis and anomaly detection approaches based on the nature of the network traffic. Categorizing encrypted traffic is one of the most challenging issues introduced by VPNs, which are used to bypass censorship as well as to gain access to geo-locked services. Therefore, an efficient approach is needed that enables the identification of encrypted network traffic data and the extraction and selection of valuable features, in order to improve quality of service and network management and to oversee overall performance. In this paper, the classification of network traffic data into VPN and non-VPN traffic is studied based on the efficiency of time-based features extracted from network packets. The paper proposes two machine learning models that categorize network traffic into encrypted and non-encrypted traffic. The proposed models utilize statistical features (SF), Pearson Correlation (PC), and a Genetic Algorithm (GA), preprocessing the traffic samples into NetFlow traffic to accomplish the experiment's objectives. The GA-based method uses a stochastic search based on natural genetics and biological evolution to extract essential features. The PC-based method performs well in removing weakly correlated features of network traffic. With a microsecond per-packet prediction time, the best model achieved an accuracy of more than 95.02 percent in the most demanding traffic classification task, a drop in accuracy of only 2.37 percent compared with the full statistical-feature-based machine learning approach. This is extremely promising for the development of real-time traffic analyzers.
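The PC-based selection step can be sketched as a simple correlation filter: keep the feature columns whose absolute Pearson correlation with the label exceeds a threshold. The threshold value and the rows-are-flows layout below are illustrative assumptions, not the paper's exact configuration:

```python
import numpy as np

def pearson_select(X, y, threshold=0.3):
    """Keep feature columns whose |Pearson r| with the label exceeds threshold.

    A minimal stand-in for the paper's PC-based feature selection;
    X has one row per flow and one column per time-based feature.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    denom = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    r = Xc.T @ yc / denom                 # per-column Pearson correlations
    return np.flatnonzero(np.abs(r) > threshold)

# Column 0 tracks the label; column 1 is label-independent noise.
y = np.arange(100, dtype=float)
X = np.column_stack([y + 0.1, np.tile([0.0, 1.0], 50)])
selected = pearson_select(X, y, threshold=0.5)
```

Filters like this are cheap enough to run at ingestion time, which matters for the microsecond per-packet budget the paper reports.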
1 Introduction
Nowadays in China, there are more than six hundred million netizens [1]. On April 11, 2015, the number of simultaneous online users of the Chinese instant message application QQ reached two hundred million [2]. The fast growth of the Internet pushes the rapid development of information technology (IT) and communication technology (CT). Many traditional IT service and CT equipment providers are facing the fusion of IT and CT in the age of digital transformation, and are heading toward becoming ICT enterprises. Large global ICT enterprises, such as Apple, Google, Microsoft, Amazon, Verizon, and AT&T, have been contributing to the performance improvement of IT services and CT equipment.
Many organizations are struggling to provide high-bandwidth, reliable Internet connectivity at their branch offices and business locations while getting the most out of their operational expense. Internet connectivity at branch offices and business locations is no longer a luxury but a necessity. This article examines how to plan and document an SD-WAN (Software-Defined Wide Area Network) implementation in an organization, and why it is worthwhile to implement the new technology instead of continuing to invest in existing MPLS (Multi-Protocol Label Switching), taking a retail organization as an example. Methods: This project/research was performed using the capabilities of software-defined network technology and the options available in MPLS. Technical project management principles were adopted as per the PMI (Project Management Institute) waterfall methodology. Results/Conclusion: SD-WAN technology provides an effective replacement for MPLS network connections in providing WAN connectivity for office locations. It is essential to follow a documented process for appropriate vendor selection based on the available features and the other attributes listed in the article. To be successful in the implementation, it is essential to perform a POC (Proof of Concept) in a controlled environment and validate the results. SD-WAN provides better network performance and improves reliability, as the links operate in an active-active configuration.
The development of scientific inquiry and research has yielded numerous benefits in the realm of intelligent traffic control systems, particularly in automatic license plate recognition for vehicles. The design of license plate recognition algorithms has been digitalized through the use of neural networks. There is now a growing demand for vehicle surveillance due to the need for efficient vehicle processing and traffic management, so the design, development, and implementation of a license plate recognition system hold significant social, economic, and academic importance. The study aims to present contemporary methodologies and empirical findings pertaining to automated license plate recognition. The primary focus of the automatic license plate recognition algorithm was on image extraction, character segmentation, and recognition. Based on our observations, character segmentation is the most challenging step. The license plate recognition project we designed demonstrated the effectiveness of this method across various observed conditions, particularly in low-light environments, such as periods of limited illumination or inclement weather with precipitation. The method was tested on a sample of fifty images, resulting in a 100% accuracy rate. The findings demonstrate the project's ability to determine the optimal outcomes of the simulations.
Funding: This research was partially supported by the Department of Informatics, Faculty of Economics, Business Administration and Information Technology, University of Zurich.
Abstract: Recent economic crises such as the 2008 financial tsunami have demonstrated a critical need for a better understanding of the topologies and the various economic, social, and technical mechanisms of the increasingly interconnected global financial system. Such a system largely relies on the interconnectedness of financial entities such as banks, firms, and investors through complex financial relationships such as interbank payment networks, investment relations, and supply chains. A network-based perspective or approach is needed to study these financial networks in order to improve or extend financial theories, as well as to develop business applications. Moreover, with the advance of big-data-related technologies and the availability of huge amounts of financial and economic network data, advanced computing technologies and data analytics that can comprehend such big data are also needed. We refer to this approach as financial network analytics. We suggest that it will enable stakeholders to better understand the network dynamics within the interconnected global financial system and help design financial policies, such as managing and monitoring systemic banking risk, as well as intelligent business applications like banking advisory systems. In this paper, we review the existing research on financial network analytics and then discuss its main research challenges from the economic, social, and technological perspectives.
Abstract: In this paper, on the basis of the national chemical industry standard analytical test methods and an analysis of the test items, a networked management application for food additive quality analysis and inspection is developed using the Visual Basic language and the computer system operating environment, providing users with network management software for the quality analytical testing of food additives. The software sets up an information-sharing network platform for enterprises and quality management departments, which is a major innovation in the management methods and tools for food additive quality analysis and testing.
Abstract: A network analyzer can often comprehend many protocols, which enables it to display conversations taking place between hosts over a network. A network analyzer analyzes the device or network response and takes measurements so the operator can monitor the performance of the network or of an object in an RF circuit. The purpose of the following research includes analyzing the capabilities of the NetFlow Analyzer to measure various parts, including filters, mixers, frequency-sensitive networks, transistors, and other RF-based instruments. NetFlow Analyzer is a network traffic analyzer that measures the network parameters of electrical networks. Although there are other network parameter sets, including Y-, Z-, and H-parameters, these instruments are typically employed to measure S-parameters, since the transmission and reflection of electrical networks are simple to calculate at high frequencies. These analyzers are widely employed to characterize two-port networks, including filters and amplifiers. By allowing the user to view the actual data sent over a network, packet by packet, a network analyzer shows what is happening on the network. This research also contains the design model of the NetFlow Analyzer used for transmission and reflection measurements. Gain, insertion loss, and the transmission coefficient are measured in transmission measurements, whereas return loss, the reflection coefficient, impedance, and other variables are measured in reflection measurements. The operational frequencies of these analyzers range from 1 Hz to 1.5 THz. They can also be used to examine stability in measurements of open loops, audio components, and ultrasonics.
Abstract: The performance model proposed in this study represents an innovative approach to performance assessment in ATM (air traffic management). It is based on the Bayesian network methodology, which offers several advantages but also some drawbacks, as highlighted throughout the paper. We illustrate the main steps required for building the model and present a number of interesting results. The contribution of the paper is two-fold: (1) it presents a new methodological approach to a problem of strategic importance for ANSPs (air navigation service providers); and (2) it provides insights into the interdependencies between factors influencing performance. Both results are considered particularly important nowadays because of the SES (Single European Sky) performance scheme and its related target-setting process.
Funding: This project was supported by the National Natural Science Foundation of China (60572147, 60132030).
Abstract: With the advent of large-scale, high-speed IPv6 network technology, effective multi-point traffic sampling is becoming a necessity. A distributed multi-point traffic sampling method that provides an accurate and efficient solution for measuring IPv6 traffic is proposed. The method samples IPv6 traffic based on an analysis of the bit randomness of each byte in the packet header. It offers a way to consistently select the same subset of packets at each measurement point, which satisfies the requirement of distributed multi-point measurement. Finally, using real IPv6 traffic traces, it is shown that the sampled traffic data have a uniformity that satisfies the randomness requirement of sampling and correctly reflect the packet size distribution of the full packet trace.
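One generic way to realize the consistent-selection property described above (a common stand-in, not the paper's byte-randomness analysis) is hash-based sampling: every monitor hashes the same invariant header fields and keeps a packet iff the hash falls below a shared threshold, so all measurement points select the identical packet subset. A minimal sketch with made-up packet identifiers:

```python
# Hash-based trajectory sampling: the keep/drop decision depends only on
# invariant packet content, so every monitor reaches the same decision.
import hashlib

SAMPLE_RATE = 0.25  # keep roughly 1 in 4 packets

def keep(packet_id: bytes) -> bool:
    """Deterministic sampling decision shared by all measurement points."""
    h = hashlib.sha256(packet_id).digest()
    value = int.from_bytes(h[:4], "big") / 2**32   # map hash to [0, 1)
    return value < SAMPLE_RATE

# Synthetic invariant header fields (source, destination, packet id).
packets = [f"src{i}|dst{i % 7}|id{i}".encode() for i in range(10000)]
sampled = [p for p in packets if keep(p)]

rate = len(sampled) / len(packets)
print(f"observed rate: {rate:.3f}")   # close to the configured 0.25
```

Because the decision is a pure function of packet content, a second monitor running the same code over the same packets reproduces `sampled` exactly, which is the multi-point consistency requirement.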
基金supported by the National Natural Science Foundation of China under Grant No.61003282Beijing Higher Education Young Elite Teacher Project+3 种基金China Next Generation Internet(CNGI)Project"Research and Trial on Evolving Next Generation Network Intelligence Capability Enhancement(NICE)"the National Basic Research Program(973 Program)under Grant No.2009CB320-505the National Science and Technology Major Project"Research about Architecture of Mobile Internet"under Grant No.2011ZX03-002-001-01the National High Technology Research and Development Program(863 Program)under Grant No.2011AA010704
Abstract: The Deep Packet Inspection (DPI) method is a popular method that can accurately identify flow data and its corresponding application. Currently, the DPI method is widely used in common network management systems. However, the major limitation of DPI systems is that their signature libraries are mainly extracted manually, which makes it hard to obtain the signatures of new applications efficiently. Hence, in this paper, we propose an automatic signature extraction mechanism using Principal Component Analysis (PCA), which is able to extract signatures automatically. In the proposed method, the signatures are expressed in the form of serial consistent sequences constructed from principal components, instead of the normally separated substrings in the original data extracted by traditional methods. Extensive experiments based on numerous data sets have been carried out to evaluate the performance of the proposed scheme, and the results prove that the newly proposed method achieves good performance in terms of accuracy and efficiency.
Abstract: Network monitoring has been made challenging by the lack of studies of current network dynamics that evaluate the effects of new application and protocol deployment, of long-term studies that observe the effect of incremental changes on the Internet, and of studies of the change in the overall stability of the Internet under various conditions and threats. A good understanding of the nature and type of network traffic is the key to solving congestion problems. In this paper we describe the architecture and implementation of a scalable network traffic monitoring and analysis system. The gigabit interface on the monitoring system was configured to capture network traffic, and the Multi Router Traffic Grapher (MRTG) and Webalizer produce graphical and detailed traffic analysis. This system is in use at Obafemi Awolowo University, Ile-Ife, Nigeria; we describe how it can be replicated in another environment.
Funding: Supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (2014R1A1A2057796 and 2015R1D1A1A01059049).
Abstract: The phenomenon of data explosion represents a severe challenge for the upcoming big data era. However, the current Internet architecture is insufficient for dealing with a huge amount of traffic, owing to the increase in redundant content transmission and the end-point-based communication model. Information-centric networking (ICN) is a paradigm for the future Internet that can be utilized to resolve the data explosion problem. In this paper, we focus on content-centric networking (CCN), one of the key candidate ICN architectures. CCN has been studied in various network environments with the aim of relieving network and server burden, especially through its name-based forwarding and in-network caching functionalities. This paper studies the effect of several caching strategies in the CCN domain from the perspective of network and server overhead. Thus, we comprehensively analyze the in-network caching performance of CCN under several popular cache replication methods (i.e., cache placement). We evaluate the performance with respect to well-known Internet traffic patterns that follow certain probabilistic distributions, such as the Zipf/Mandelbrot-Zipf distributions, and flash crowds. For the experiments, we developed an OPNET-based CCN simulator with a realistic Internet-like topology.
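A minimal simulation conveys the shape of such experiments: requests drawn from a Zipf popularity distribution hitting a single LRU cache, one simple stand-in for an in-network caching node. Nothing here is taken from the paper beyond the Zipf workload; catalogue size, cache size, and the skew parameter are illustrative.

```python
# Zipf-distributed content requests against one LRU cache node.
import bisect
import random
from collections import OrderedDict

def zipf_cdf(n_items, alpha):
    """Cumulative (unnormalized) Zipf weights for items 1..n_items."""
    acc, cdf = 0.0, []
    for i in range(1, n_items + 1):
        acc += 1.0 / i ** alpha
        cdf.append(acc)
    return cdf

def lru_hit_ratio(requests, cache_size):
    """Replay a request trace through an LRU cache and report the hit ratio."""
    cache, hits = OrderedDict(), 0
    for item in requests:
        if item in cache:
            hits += 1
            cache.move_to_end(item)          # refresh recency
        else:
            cache[item] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)    # evict least recently used
    return hits / len(requests)

cdf = zipf_cdf(1000, alpha=0.8)              # 1000 content items, mild skew
rng = random.Random(42)
reqs = [bisect.bisect_left(cdf, rng.random() * cdf[-1]) for _ in range(20000)]
ratio = lru_hit_ratio(reqs, cache_size=50)
print(f"LRU hit ratio: {ratio:.2f}")
```

Even a cache holding 5% of the catalogue absorbs a large share of requests under a skewed popularity distribution, which is the effect in-network caching exploits.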
Funding: The National Basic Research Program of China (973 Program) (No. 2005CB321802); the Program for New Century Excellent Talents in University (No. NCET-06-0926); and the National Natural Science Foundation of China (Nos. 60873097 and 90612009).
Abstract: To address the ontology understanding problem, the structural features and the potentially important terms of a large-scale ontology are investigated from the perspective of complex network analysis. Through empirical studies of the Gene Ontology from various perspectives, this paper shows that the whole Gene Ontology displays the same topological features as complex networks, including the "small world" and "scale-free" properties, while some sub-ontologies have the "scale-free" property but no "small world" effect. The potentially important terms in an ontology are discovered by several well-known complex network centralization methods. An evaluation method based on information retrieval in MEDLINE is designed to measure the effectiveness of the discovered important terms. According to the relevant literature on Gene Ontology terms, the suitability of these centralization methods for discovering important ontology concepts is quantitatively evaluated. The experimental results indicate that betweenness centrality is the most appropriate of all the evaluated centralization measures.
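Betweenness centrality, the measure the study found most suitable, counts how often a node lies on shortest paths between other pairs of nodes. A compact pure-Python version of Brandes' algorithm, run on a made-up toy graph rather than the Gene Ontology:

```python
# Brandes' algorithm for betweenness centrality (unweighted, undirected).
from collections import deque

def betweenness(adj):
    """adj maps each node to its neighbour list; returns raw betweenness."""
    bc = {v: 0.0 for v in adj}
    for s in adj:
        stack, pred = [], {v: [] for v in adj}
        sigma = {v: 0 for v in adj}; sigma[s] = 1   # shortest-path counts
        dist = {v: -1 for v in adj}; dist[s] = 0
        q = deque([s])
        while q:                                    # BFS from the source
            v = q.popleft(); stack.append(v)
            for w in adj[v]:
                if dist[w] < 0:
                    dist[w] = dist[v] + 1; q.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]; pred[w].append(v)
        delta = {v: 0.0 for v in adj}
        while stack:                                # back-propagate dependencies
            w = stack.pop()
            for v in pred[w]:
                delta[v] += sigma[v] / sigma[w] * (1 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return bc   # undirected graphs count each pair twice

# Toy "ontology" graph: b bridges a three-leaf cluster to d's two-leaf cluster,
# so b should score highest.
adj = {"a": ["b"], "c": ["b"], "g": ["b"],
       "b": ["a", "c", "g", "d"],
       "d": ["b", "e", "f"], "e": ["d"], "f": ["d"]}
bc = betweenness(adj)
top = max(bc, key=bc.get)
print(top)
```

On the toy graph the bridging node `b` dominates, mirroring how high-betweenness terms sit on the paths connecting sub-ontologies.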
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 60573065), the Natural Science Foundation of Shandong Province, China (Grant No. Y2007G33), and the Key Subject Research Foundation of Shandong Province, China (Grant No. XTD0708).
Abstract: In this paper, we apply nonlinear time series analysis to small-time-scale traffic measurement data. The prediction-based method is used to determine the embedding dimension of the traffic data. Based on the reconstructed phase space, the local support vector machine prediction method is used to predict the traffic measurement data, and the BIC-based neighbouring-point selection method is used to choose the number of nearest neighbouring points for the local support vector machine regression model. The experimental results show that the local support vector machine prediction method, with its neighbouring points optimized, can effectively predict small-time-scale traffic measurement data and can reproduce the statistical features of real traffic measurements.
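The pipeline's first two steps, delay-coordinate phase-space reconstruction followed by a local predictor over nearest neighbours, can be sketched as below. A plain nearest-neighbour lookup stands in for the local support vector machine regressor; the series and parameters are illustrative only.

```python
# Phase-space reconstruction and one-step local prediction.

def delay_vectors(series, m, tau=1):
    """Delay embedding: x_i = (s_i, s_{i+tau}, ..., s_{i+(m-1)tau})."""
    last = len(series) - (m - 1) * tau
    return [tuple(series[i + j * tau] for j in range(m)) for i in range(last)]

def predict_next(history, m=3, tau=1):
    """Find the past delay vector nearest to the current one and return the
    value that followed it (nearest-neighbour stand-in for a local SVM)."""
    vecs = delay_vectors(history, m, tau)
    query = vecs[-1]
    best_i, best_d = None, float("inf")
    for i, v in enumerate(vecs[:-1]):            # exclude the query itself
        d = sum((a - b) ** 2 for a, b in zip(v, query))
        if d < best_d:
            best_i, best_d = i, d
    follow = best_i + (m - 1) * tau + 1          # index just after the neighbour
    return history[follow]

# Periodic toy "traffic": after the window (0, 1, 2) the next value is 3.
series = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2]
print(predict_next(series))
```

In the real method the embedding dimension `m` comes from the prediction-based procedure and the neighbour count from the BIC criterion; here both are fixed by hand to keep the sketch short.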
基金the National Nat-ural Science Foundation of China under Grants 61771163the Natural Science Foundation for Out-standing Young Scholars of Heilongjiang Province un-der Grant YQ2020F001the Science and Technol-ogy on Communication Networks Laboratory under Grants SXX19641X072 and SXX18641X028.(Cor-respondence author:Min Jia)。
Abstract: Cloud-based satellite and terrestrial spectrum shared networks (CB-STSSN) combine the triple advantages of the efficient and flexible network management of heterogeneous cloud radio access networks (H-CRAN), the vast coverage of satellite networks, and the good communication quality of terrestrial networks. Thanks to their complementary coverage characteristics, anytime, anywhere high-speed communications can be achieved to meet the various needs of users. The scarcity of spectrum resources is a common problem in both satellite and terrestrial networks. In order to improve resource utilization, the spectrum is shared not only within each component but also between satellite beams and terrestrial cells, which introduces inter-component interference. To this end, this paper first proposes an analytical framework that considers the inter-component interference induced by spectrum sharing (SS). An intelligent SS scheme based on a radio map (RM), consisting of LSTM-based beam prediction (BP), transfer-learning-based spectrum prediction (SP), and joint non-preemptive priority and preemptive priority (J-NPAP)-based proportional fair spectrum allocation, is then proposed. The simulation results show that the proposed scheme improves the spectrum utilization rate of CB-STSSN and decreases the user blocking rate and waiting probability.
Abstract: Urban buildings and the urban traffic network are considered the vital arteries of cities and are particularly important for search and rescue operations after a crisis. The aim of this study is to determine the vulnerability of urban areas, especially buildings and traffic networks, using multi-criteria geographic information systems and decision-making methods. As the many criteria affecting seismic vulnerability have uncertain and vague properties, the method of this paper is to apply the fuzzy ordered weighted average (OWA) to model the seismic vulnerability of urban buildings and traffic networks in the most optimistic and most pessimistic states. The study area is District 6 of Tehran, which is affected by four major faults and is thus threatened by earthquakes. The results illustrate vulnerability at different risk levels: very high, high, medium, low, and very low. They show that in the most optimistic case 14%, and in the pessimistic case 1%, of buildings fall into the very low vulnerability class. The vulnerability of the urban street network likewise indicates that in the optimistic case 12%, and in the pessimistic case at most 9%, of the area is in an appropriate condition, and that the north and northeast of the study area are more vulnerable than its south.
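The OWA operator at the heart of such methods aggregates criterion scores after sorting them, so the weights encode a degree of optimism or pessimism rather than the importance of any particular criterion. A minimal sketch with invented vulnerability scores:

```python
# Ordered weighted averaging (OWA): weights apply to sorted scores.

def owa(scores, weights):
    """Aggregate scores with order weights (weights must sum to 1)."""
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(scores, reverse=True)       # largest score first
    return sum(w * s for w, s in zip(weights, ordered))

# Scores of one location under three criteria (purely illustrative).
scores = [0.9, 0.5, 0.2]
high_estimate = owa(scores, [0.7, 0.2, 0.1])   # emphasis on the largest scores
low_estimate = owa(scores, [0.1, 0.2, 0.7])    # emphasis on the smallest scores
print(round(high_estimate, 2), round(low_estimate, 2))
```

Sliding the weight mass between the top and bottom of the sorted list is what lets the study report results for its most optimistic and most pessimistic states from the same underlying scores; the fuzzy variant additionally derives the weights from linguistic quantifiers.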
Funding: Supported by the Key Scientific and Technological Research Projects in Henan Province (Grant No. 192102210125), the Key Scientific Research Projects of Colleges and Universities in Henan Province (23A520054), and the Open Foundation of the State Key Laboratory of Networking and Switching Technology (Beijing University of Posts and Telecommunications) (SKLNST-2020-2-01).
Abstract: With the rapid growth of network bandwidth, traffic identification is an important challenge for network management and security. In recent years, packet sampling has been widely used in most network management systems. In this paper, to improve the accuracy of network traffic identification, sampled NetFlow data are applied to traffic identification, and the impact of packet sampling on the accuracy of the identification method is studied. This study includes feature selection, a metric correlation analysis of application behavior, and a traffic identification algorithm. Theoretical analysis and experimental results show that the significance of behavior characteristics decreases in a packet sampling environment. Meanwhile, the correlation analysis shows different trends for different features. However, as long as the number of flows meets the statistical requirement, the feature selection and the correlation degree are independent of the sampling ratio. At a high sampling ratio, where less effective information remains, the identification accuracy is much lower than with unsampled packets. Finally, to improve identification accuracy, we propose a Deep Belief Networks Application Identification (DBNAI) method, which achieves better classification performance than other state-of-the-art methods.
Abstract: The continual growth of the use of technological appliances during the COVID-19 pandemic has resulted in a massive volume of data flow on the Internet, as many employees have transitioned to working from home. Furthermore, with the increasing adoption of encrypted data transmission by people who use a Virtual Private Network (VPN) or the Tor Browser (dark web) to keep their data private and hidden, network traffic encryption is rapidly becoming a universal approach. This affects and complicates the quality of service (QoS), traffic monitoring, and network security provided by Internet Service Providers (ISPs), particularly for analysis and anomaly detection approaches based on the nature of the network traffic. Categorizing encrypted traffic is one of the most challenging issues introduced by VPNs, which are used to bypass censorship as well as to access geo-locked services. Therefore, an efficient approach is needed that identifies encrypted network traffic data and extracts and selects valuable features, improving the quality of service and network management as well as overall performance monitoring. In this paper, the classification of network traffic data into VPN and non-VPN traffic is studied based on the efficiency of time-based features extracted from network packets. The paper proposes two machine learning models that categorize network traffic into encrypted and non-encrypted traffic. The proposed models utilize statistical features (SF), Pearson Correlation (PC), and a Genetic Algorithm (GA), preprocessing the traffic samples into NetFlow traffic to accomplish the experiment's objectives. The GA-based method utilizes a stochastic approach based on natural genetics and biological evolution to extract essential features. The PC-based method performs well in removing redundant features of the network traffic. With a per-packet prediction time of microseconds, the best model achieved an accuracy of more than 95.02 percent in the most demanding traffic classification task, a drop in accuracy of only 2.37 percent compared with the full statistical-based machine learning approach. This is extremely promising for the development of real-time traffic analyzers.
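The Pearson-correlation-based feature selection mentioned above can be sketched as a greedy redundancy filter: a feature is kept only if it is not strongly correlated with any feature already kept. The flow feature names and values below are invented for illustration; this is the general technique, not the paper's exact pipeline.

```python
# Greedy redundancy filtering of flow features by Pearson correlation.
import math

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length columns."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def drop_redundant(features, threshold=0.9):
    """Keep a feature only if |r| < threshold against every kept feature."""
    kept = []
    for name, col in features:
        if all(abs(pearson(col, kc)) < threshold for _, kc in kept):
            kept.append((name, col))
    return [name for name, _ in kept]

# Toy per-flow features: duration and byte count move together, while the
# mean inter-arrival time is roughly independent of both.
features = [
    ("duration", [1.0, 2.0, 3.0, 4.0, 5.0]),
    ("bytes",    [2.1, 4.0, 6.2, 7.9, 10.1]),   # nearly proportional to duration
    ("mean_iat", [5.0, 1.0, 4.0, 2.0, 3.0]),
]
print(drop_redundant(features))
```

`bytes` is discarded because it tracks `duration` almost perfectly, shrinking the feature vector each packet's classifier must evaluate, which matters when the stated goal is microsecond-scale per-packet prediction.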
Funding: Supported in part by a Ministry of Education/China Mobile joint research grant under Project No. 5-10, and by Nanjing University of Posts and Telecommunications under Grants No. NY214135 and NY215045.
Abstract: 1 Introduction. Nowadays in China, there are more than six hundred million netizens [1]. On April 11, 2015, the number of simultaneous online users of the Chinese instant messaging application QQ reached two hundred million [2]. The fast growth of the Internet pushes the rapid development of information technology (IT) and communication technology (CT). Many traditional IT service and CT equipment providers are facing the fusion of IT and CT in the age of digital transformation and are heading toward becoming ICT enterprises. Large global ICT enterprises, such as Apple, Google, Microsoft, Amazon, Verizon, and AT&T, have been contributing to the performance improvement of IT services and CT equipment.
Abstract: Many organizations are struggling to provide high-bandwidth, reliable Internet connectivity at their branch offices and business locations while getting the most out of their operational expense. Internet connectivity at branch offices and business locations is no longer a luxury but a necessity. This article examines how to plan and document an SDWAN (Software-Defined Wide Area Network) implementation in an organization, and why it is essential to implement the new technology instead of investing in the existing MPLS (Multi-Protocol Label Switching), taking a retail organization as an example. Methods: This project/research was performed using the capabilities of Software-Defined Networking technology and the options available in MPLS. Technical project management principles were adopted per the PMI (Project Management Institute) waterfall methodology. Results/Conclusion: SDWAN technology provides an effective replacement for an MPLS network connection in providing WAN connectivity for office locations. It is essential to follow a documented process for appropriate vendor selection based on the available features and the other attributes listed in the article. To succeed in the implementation, it is essential to perform a POC (Proof of Concept) in a controlled environment and validate the results. SDWAN provides better network performance and improves reliability, as the links operate in an active-active configuration.
Abstract: The development of scientific inquiry and research has yielded numerous benefits in the realm of intelligent traffic control systems, particularly in automatic license plate recognition for vehicles. The design of license plate recognition algorithms has been digitalized through the utilization of neural networks. In contemporary times, there is a growing demand for vehicle surveillance due to the need for efficient vehicle processing and traffic management. The design, development, and implementation of a license plate recognition system hold significant social, economic, and academic importance. This study aims to present contemporary methodologies and empirical findings pertaining to automated license plate recognition. The primary focus of the automatic license plate recognition algorithm was on image extraction, character segmentation, and recognition. Based on our observations, character segmentation is the most challenging function. The license plate recognition project that we designed demonstrated the effectiveness of this method across various observed conditions, particularly in low-light environments such as periods of limited illumination or inclement weather with precipitation. The method was tested on a sample of fifty images, resulting in a 100% accuracy rate. The findings of this study demonstrate the project's ability to effectively determine the optimal outcomes of the simulations.