Time-series data provide important information in many fields, and their processing and analysis have been the focus of much research. However, detecting anomalies is very difficult due to data imbalance, temporal dependence, and noise. Therefore, methodologies for data augmentation and for converting time-series data into images for analysis have been studied. This paper proposes a fault detection model that uses time-series data augmentation and transformation to address the problems of data imbalance, temporal dependence, and robustness to noise. Data augmentation is performed by adding Gaussian noise, with the noise level set to 0.002, to maximize the generalization performance of the model. In addition, we use the Markov Transition Field (MTF) method to effectively visualize the dynamic transitions of the data while converting the time series into images. This enables the identification of patterns in the time series and assists in capturing its sequential dependencies. For anomaly detection, the PatchCore model is applied, showing excellent performance, and the detected anomalous regions are represented as heat maps. By overlaying the anomaly map on the original image, the regions where anomalies occur can be localized. The performance evaluation shows that both F1-score and accuracy are high when the time series is converted to images. Additionally, processing the data as images rather than as raw time series significantly reduced both the data size and the training time. The proposed method can provide an important springboard for research on anomaly detection with time-series data, and it helps address problems such as analyzing complex patterns in data in a lightweight manner.
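The two preprocessing steps described above, Gaussian-noise augmentation and the Markov Transition Field, can be sketched in plain Python. This is a minimal illustration, not the paper's implementation; the quantile-binning scheme, the number of bins, and the fixed random seed are assumptions made here.

```python
import math
import random

def add_gaussian_noise(series, sigma=0.002, seed=42):
    """Augment a series by adding zero-mean Gaussian noise (sigma 0.002 as in the paper)."""
    rng = random.Random(seed)
    return [x + rng.gauss(0.0, sigma) for x in series]

def markov_transition_field(series, n_bins=4):
    """Markov Transition Field: quantile-bin the series, estimate the bin-to-bin
    transition matrix W from consecutive samples, then set M[i][j] = W[bin(x_i)][bin(x_j)]."""
    n = len(series)
    # Quantile binning: assign each value a bin index 0..n_bins-1 by rank.
    order = sorted(range(n), key=lambda i: series[i])
    bins = [0] * n
    for rank, idx in enumerate(order):
        bins[idx] = min(rank * n_bins // n, n_bins - 1)
    # First-order Markov transition counts between consecutive bins.
    counts = [[0.0] * n_bins for _ in range(n_bins)]
    for a, b in zip(bins, bins[1:]):
        counts[a][b] += 1.0
    W = []
    for row in counts:
        s = sum(row)
        W.append([c / s if s else 0.0 for c in row])
    # Spread the transition probabilities over all time-index pairs.
    return [[W[bins[i]][bins[j]] for j in range(n)] for i in range(n)]
```

The MTF entry M[i][j] is the probability of moving from the bin of x_i to the bin of x_j, so the resulting n x n image encodes the dynamic transitions of the series.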
With the development of the integration of aviation safety and artificial intelligence, research on the combination of risk assessment and artificial intelligence is particularly important in the field of risk management, but searching for an efficient and accurate risk assessment algorithm has become a challenge for the civil aviation industry. Therefore, an improved risk assessment algorithm (PS-AE-LSTM) based on a long short-term memory network (LSTM) with an autoencoder (AE) is proposed, addressing the fact that the various supervised deep learning algorithms in flight safety cannot adequately handle the problem of the quality of risk-level labels. Firstly, based on the normal distribution characteristics of flight data, a probability severity (PS) model is established to enhance the quality of risk assessment labels. Secondly, an autoencoder is introduced to reconstruct the flight parameter data to improve data quality. Finally, utilizing the time-series nature of flight data, a long short-term memory network is used to classify the risk level and improve the accuracy of risk assessment. A risk assessment experiment was then conducted on a fleet landing-phase dataset using the PS-AE-LSTM algorithm to assess the risk level associated with aircraft hard landing events. The results show that the proposed algorithm achieves an accuracy of 86.45% compared with seven baseline models and has excellent risk assessment capability.
Various mobile devices and applications are now used in daily life. These devices require high-speed data processing, low energy consumption, low communication latency, and secure data transmission, especially in 5G and 6G mobile networks. High-security cryptography guarantees that essential data can be transmitted securely; however, it increases energy consumption and reduces data processing speed. Therefore, this study proposes a low-energy data encryption (LEDE) algorithm based on the Advanced Encryption Standard (AES) for improving data transmission security and reducing the energy consumption of encryption in Internet-of-Things (IoT) devices. In the proposed LEDE algorithm, the system time parameter is employed to create a dynamic S-Box to replace the static S-Box of AES. Tests indicated that six-round LEDE encryption achieves the same security level as 10-round conventional AES encryption. This reduction in encryption time results in the LEDE algorithm having 67.4% lower energy consumption and 43.9% shorter encryption time than conventional AES; thus, the proposed LEDE algorithm can improve the performance and energy consumption of IoT edge devices.
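A dynamic, time-keyed S-Box can be sketched as follows. The paper's exact construction is not reproduced here; this sketch simply assumes that both endpoints derive the same 256-entry permutation from a shared system-time parameter.

```python
import random
import time

def dynamic_sbox(timestamp=None):
    """Build a 256-entry substitution box (a bijection on 0..255) keyed by the
    system time, plus its inverse for decryption. Illustrative only: the real
    LEDE construction is not specified here."""
    if timestamp is None:
        timestamp = int(time.time())
    # Both sides derive the same box from the shared time parameter.
    rng = random.Random(timestamp)
    sbox = list(range(256))
    rng.shuffle(sbox)
    inv = [0] * 256
    for i, v in enumerate(sbox):
        inv[v] = i
    return sbox, inv
```

Because the permutation changes with the time parameter, the substitution layer differs between sessions, which is the property the LEDE design exploits to reach a given security level in fewer rounds.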
This paper investigates data collection in an unmanned aerial vehicle (UAV)-aided Internet of Things (IoT) network, where a UAV is dispatched to collect data from ground sensors under a practical and accurate probabilistic line-of-sight (LoS) channel. In particular, access points (APs) are introduced to collect data from some sensors in the unlicensed band to improve data collection efficiency. We formulate a mixed-integer non-convex optimization problem to minimize the UAV flight time by jointly designing the UAV 3D trajectory and the sensors' scheduling, while ensuring that the required amount of data can be collected under the limited UAV energy. To solve this non-convex problem, we recast the objective problem into a tractable form. The problem is then divided into several sub-problems that are solved iteratively, and the successive convex approximation (SCA) scheme is applied to solve each non-convex subproblem. Finally, bisection search is adopted to speed up the search for the minimum UAV flight time. Simulation results verify that the UAV flight time can be shortened effectively by the proposed method.
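The final bisection step relies on a monotonicity property: if a flight time T suffices to collect the required data, any longer time does too. A generic sketch, in which the feasibility check stands in for one SCA-based sub-problem solve:

```python
def min_feasible_time(feasible, t_lo, t_hi, tol=1e-6):
    """Bisection for the smallest T with feasible(T) True, assuming feasibility
    is monotone in T (longer flights can always collect at least as much data)."""
    assert not feasible(t_lo) and feasible(t_hi)
    while t_hi - t_lo > tol:
        mid = 0.5 * (t_lo + t_hi)
        if feasible(mid):
            t_hi = mid      # shrink from above: mid already works
        else:
            t_lo = mid      # shrink from below: mid is too short
    return t_hi
```

Each bisection iteration halves the search interval, so the minimum flight time is located in O(log((t_hi - t_lo)/tol)) feasibility checks rather than a linear scan.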
Ocean temperature is an important physical variable in marine ecosystems, and ocean temperature prediction is an important research objective in ocean-related fields. Currently, one of the commonly used approaches to ocean temperature prediction is data-driven modeling, but research on this approach is mostly limited to the sea surface, with few studies on the prediction of internal ocean temperature. Existing graph neural network-based methods usually use predefined graphs or learned static graphs, which cannot capture the dynamic associations among the data. In this study, we propose a novel dynamic spatiotemporal graph neural network (DSTGN) to predict three-dimensional ocean temperature (3D-OT), which combines static graph learning and dynamic graph learning to automatically mine two kinds of unknown dependencies between sequences from the original 3D-OT data, without prior knowledge. Temporal and spatial dependencies in the time series are then captured using temporal and graph convolutions. We integrate dynamic graph learning, static graph learning, graph convolution, and temporal convolution into an end-to-end framework for 3D-OT prediction using time-series grid data. We conducted prediction experiments using high-resolution 3D-OT from the Copernicus global ocean physical reanalysis, with data covering the vertical variation of temperature from the sea surface to 1000 m below. Compared with five mainstream models commonly used for ocean temperature prediction, the proposed method achieved the best results at all prediction scales.
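The graph-convolution building block mentioned above can be illustrated with a single propagation step in plain Python; the normalization below (averaging over self-looped neighbours) is one common choice, not necessarily the one used in DSTGN.

```python
def graph_conv(adj, feats):
    """One propagation step H' = D^-1 (A + I) H: each node averages its own
    feature vector with its neighbours', weighted by the self-looped adjacency."""
    n = len(adj)
    out = []
    for i in range(n):
        # Add a self-loop so the node keeps part of its own signal.
        row = [adj[i][j] + (1.0 if i == j else 0.0) for j in range(n)]
        deg = sum(row)
        out.append([sum(row[j] * feats[j][k] for j in range(n)) / deg
                    for k in range(len(feats[0]))])
    return out
```

Stacking such steps (with learned weight matrices between them) is how spatial dependencies along the learned graph are propagated; a learned dynamic adjacency simply replaces `adj` at each time step.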
In order to provide important parameters for schedule design, decision-making bases for transit operation management, and references for passengers traveling by bus, bus transit travel time reliability is analyzed and evaluated based on automatic vehicle location (AVL) data. Based on a statistical analysis of bus transit travel times, six indices are proposed: the coefficient of variation, the width of the travel time distribution, the mean commercial speed, the congestion frequency, the planning time index, and the buffer time index. Moreover, a framework for evaluating bus transit travel time reliability is constructed. Finally, a case study on a bus route in Suzhou is conducted. Results show that the proposed evaluation index system is simple and intuitive, and it can effectively reflect the efficiency and stability of bus operations. A distinguishing feature of bus transit travel time reliability is its temporal pattern: it varies across different time periods.
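Three of the six indices have standard closed forms and can be computed directly from observed travel times; the exact definitions used in the paper may differ slightly, so treat the formulas below as common textbook versions.

```python
import statistics

def reliability_indices(travel_times, free_flow_time):
    """Three common travel-time reliability indices computed from observed
    link travel times and a free-flow reference time."""
    mean = statistics.fmean(travel_times)
    p95 = statistics.quantiles(travel_times, n=20)[18]   # 95th percentile
    return {
        # Spread relative to the mean: dispersion of travel times.
        "coefficient_of_variation": statistics.pstdev(travel_times) / mean,
        # 95th-percentile budget relative to free-flow conditions.
        "planning_time_index": p95 / free_flow_time,
        # Extra cushion a traveler must add on top of the mean trip.
        "buffer_time_index": (p95 - mean) / mean,
    }
```

A perfectly stable route gives a coefficient of variation and buffer time index of 0, while congestion inflates the planning time index above 1.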
In the field of global change, the relationship between plant phenology and climate, which reflects the response of terrestrial ecosystems to global climate change, has become a key subject of great concern. Using the moderate-resolution imaging spectroradiometer (MODIS) enhanced vegetation index (EVI) collected every eight days during January-July from 2005 to 2008 and the corresponding remote sensing data as experimental materials, we constructed cloud-free images via Harmonic Analysis of Time Series (HANTS). The cloud-free images were then processed with a dynamic threshold method to obtain the green-up phenology of the vegetation and its distribution pattern. The distribution patterns in a freezing-disaster year and in a normal year were then compared to reveal the effect of the freezing disaster on vegetation phenology in the experimental plot. The results showed that the treated EVI data performed well in monitoring the effect of the freezing disaster on vegetation phenology, accurately reflecting the regions that suffered from the freezing disaster. This suggests that processing remote sensing data with the HANTS method can effectively monitor the ecological characteristics of vegetation.
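The core of HANTS is a harmonic least-squares fit; the full algorithm iteratively rejects cloud-contaminated samples below the fitted curve and refits. The sketch below shows only a single fit of the mean plus one harmonic to evenly spaced samples, for which the Fourier coefficients give the exact least-squares solution.

```python
import math

def harmonic_fit(values):
    """Fit mean + one harmonic to evenly spaced samples, the core operation
    behind HANTS-style gap/cloud filling (the outlier-rejection loop of the
    full algorithm is omitted)."""
    n = len(values)
    a0 = sum(values) / n
    # Fourier coefficients of the first harmonic; exact least squares on an even grid.
    a1 = 2.0 / n * sum(v * math.cos(2 * math.pi * i / n) for i, v in enumerate(values))
    b1 = 2.0 / n * sum(v * math.sin(2 * math.pi * i / n) for i, v in enumerate(values))
    return [a0 + a1 * math.cos(2 * math.pi * i / n) + b1 * math.sin(2 * math.pi * i / n)
            for i in range(n)]
```

In the cloud-filling setting, samples lying far below this smooth seasonal curve are treated as cloud-contaminated, removed, and the fit is repeated until the curve stabilizes.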
An earthquake of Ms 6.9 occurred at Gonghe, Qinghai Province, China on April 26, 1990. Three larger aftershocks took place in the same region: Ms 5.0 on May 7, 1990, Ms 6.0 on Jan. 3, 1994 and Ms 5.7 on Feb. 16, 1994. The long-period recordings of the main shock from the China Digital Seismograph Network (CDSN) are deconvolved for the source time functions, using the corresponding recordings of the three aftershocks as empirical Green's functions (EGFs). No matter which aftershock is taken as the EGF, the relative source time functions (RSTFs) obtained are nearly identical. The RSTFs suggest the Ms 6.9 event consists of at least two subevents of approximately equal size whose occurrence times are about 30 s apart; the first has a duration of 12 s and a rise time of about 5 s, and the second has a duration of 17 s and a rise time of about 8 s. Comparing the RSTFs obtained from P- and SH-phases, we notice that those from SH-phases are slightly more complex than those from P-phases, implying that finer subevents exist during the rupture process of the main shock. It is interesting that the results from the EGF deconvolution of long-period waveform data are in good agreement with the results from the moment tensor inversion and from the EGF deconvolution of broadband waveform data. Additionally, the two larger aftershocks are deconvolved for their RSTFs; the deconvolution shows that the rupture processes of the Ms 6.0 event on Jan. 3, 1994 and the Ms 5.7 event on Feb. 16, 1994 are quite simple, both RSTFs being single impulses.
The RSTFs of the Ms 6.9 main shock obtained from different stations are azimuthally dependent, their shapes differing slightly from station to station. However, the RSTFs of the two smaller aftershocks are not azimuthally dependent. The integrals of the RSTFs over the rupture process are quite close to each other, i.e., the scalar seismic moments estimated from different stations are in good agreement. Finally, the scalar seismic moments of the three aftershocks are compared. The relative scalar seismic moments of the three aftershocks deduced from those of the Ms 6.9 main shock are very close to those inverted directly from the EGF deconvolution. The relative scalar seismic moments of the Ms 6.9 main shock calculated using the three aftershocks as EGFs are 22 (Ms 6.0 aftershock as EGF), 26 (Ms 5.7 aftershock as EGF) and 66 (Ms 5.5 aftershock as EGF), respectively. Deducing from these results, the relative scalar seismic moments of the Ms 6.0 to the Ms 5.7 event, the Ms 6.0 to the Ms 5.5 event and the Ms 5.7 to the Ms 5.5 event are 1.18, 3.00 and 2.54, respectively; the corresponding values calculated directly from the waveform recordings are 1.15, 3.43 and 3.05.
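Water-level spectral division is one standard way to carry out this kind of EGF deconvolution: divide the main-shock spectrum by the aftershock spectrum while flooring the denominator to keep the division stable where the EGF spectrum is weak. A small pure-Python sketch (the paper's actual deconvolution scheme may differ):

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (fine for short illustrative signals)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT, returning the real part (inputs here are real signals)."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def egf_deconvolve(mainshock, egf, water=1e-6):
    """Relative source time function by water-level spectral division:
    RSTF(f) = M(f) * conj(E(f)) / max(|E(f)|^2, water * max|E|^2)."""
    M, E = dft(mainshock), dft(egf)
    floor = water * max(abs(e) ** 2 for e in E)
    R = [m * e.conjugate() / max(abs(e) ** 2, floor) for m, e in zip(M, E)]
    return idft(R)
```

If the main shock is (circularly) the convolution of a source time function with the EGF waveform, this division recovers that source time function up to the water-level regularization.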
In this paper, we present a cluster-based algorithm for time-series outlier mining. We use the discrete Fourier transform (DFT) to map time series from the time domain to the frequency domain, so that each series becomes a point in k-dimensional space. For these points, a cluster-based algorithm is developed to mine the outliers: it first partitions the input points into disjoint clusters and then prunes the clusters that are judged unable to contain outliers. The algorithm has been run on the electrical-load time series of a steel enterprise and proved to be effective.
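The mapping-then-pruning idea can be sketched as follows; flagging points far from the feature centroid is a crude stand-in for the paper's cluster-then-prune step, and the choices of k and the threshold factor are assumptions.

```python
import cmath
import math

def dft_features(series, k=3):
    """Map a time series to a point in k-dimensional space using the magnitudes
    of its first k non-DC DFT coefficients."""
    n = len(series)
    return [abs(sum(series[t] * cmath.exp(-2j * cmath.pi * f * t / n)
                    for t in range(n))) for f in range(1, k + 1)]

def flag_outliers(series_list, k=3, factor=3.0):
    """Flag series whose feature vector lies far from the centroid, relative to
    the median distance; a simplified stand-in for cluster-based pruning."""
    pts = [dft_features(s, k) for s in series_list]
    centroid = [sum(p[d] for p in pts) / len(pts) for d in range(k)]
    dists = [math.dist(p, centroid) for p in pts]
    med = sorted(dists)[len(dists) // 2]
    return [i for i, d in enumerate(dists) if d > factor * med]
```

Because only the first few DFT coefficients are kept, nearby points in the k-dimensional space correspond to series with similar coarse shapes, which is what makes distance-based outlier mining meaningful here.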
Recent advances in intelligent transportation systems allow traffic safety studies to extend from historical data-based analyses to real-time applications. This study presents a new method to predict crash likelihood using traffic data collected by discrete loop detectors as well as web-crawled weather data. The matched case-control method and support vector machines (SVMs) were employed to identify the risk status. The adaptive synthetic over-sampling technique was applied to address the imbalanced dataset issue, and the random forest technique was applied to select the contributing factors and avoid over-fitting. The results indicate that the SVM classifier could successfully classify 76.32% of the crashes on the test dataset and 87.52% of the crashes on the overall dataset, which is relatively satisfactory compared with the results of previous studies. Compared with the SVM classifier without the weather data, the classifier with the web-crawled weather data increased the crash prediction accuracy by 1.32% and decreased the false alarm rate by 1.72%, showing the potential value of massive web weather data. The mean impact value method was employed to evaluate the variable effects, and the results are consistent with those of most previous studies. The emerging technique based on discrete traffic data and web weather data proves to be more applicable to real-time safety management on freeways.
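The over-sampling step generates synthetic minority-class (crash) samples. The sketch below shows the basic interpolation move shared by SMOTE-style methods; the adaptive density-weighted selection that distinguishes the adaptive synthetic technique is omitted here.

```python
import random

def synthetic_oversample(minority, n_new, seed=0):
    """Generate synthetic minority samples by linear interpolation between
    random pairs of existing minority samples (each a list of feature values)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_new):
        a, b = rng.sample(minority, 2)
        lam = rng.random()                       # interpolation weight in [0, 1)
        out.append([x + lam * (y - x) for x, y in zip(a, b)])
    return out
```

Each synthetic sample lies on a segment between two real crash samples, so the classifier sees a denser minority class without exact duplicates.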
Efficient real-time data exchange over the Internet plays a crucial role in the successful application of web-based systems. In this paper, a data transfer mechanism over the Internet is proposed for real-time web-based applications. The mechanism incorporates the eXtensible Markup Language (XML) and the Hierarchical Data Format (HDF) to provide a flexible and efficient data format. Heterogeneous transfer data is classified into light and heavy data, which are stored using XML and HDF respectively; the HDF data format is then mapped to Java Document Object Model (JDOM) XML objects in the Java environment. These JDOM data objects are sent across computer networks with the support of the Java Remote Method Invocation (RMI) data transfer infrastructure. Client-defined data priority levels are implemented in RMI, guiding the server to transfer data objects at different priorities. A remote monitoring system for an industrial reactor process simulator is used as a case study to illustrate the proposed data transfer mechanism.
In the big data environment, enterprises must constantly assimilate big data knowledge and private knowledge through multiple knowledge transfers to maintain their competitive advantage. The optimal time of knowledge transfer is one of the most important aspects of improving knowledge transfer efficiency. Based on an analysis of the complex characteristics of knowledge transfer in the big data environment, multiple knowledge transfers can be divided into two categories: the simultaneous transfer of various types of knowledge, and multiple knowledge transfers at different time points. Taking into consideration influential factors such as the knowledge type, knowledge structure, knowledge absorptive capacity, knowledge update rate, discount rate, market share, profit contribution of each type of knowledge, transfer costs, and product life cycle, time optimization models of multiple knowledge transfers in the big data environment are presented by maximizing the total discounted expected profits (DEPs) of an enterprise. Simulation experiments have been performed to verify the validity of the models, which can help enterprises determine the optimal time of multiple knowledge transfers in the big data environment.
Data Mining (DM) methods are being increasingly used for prediction with time series data, in addition to traditional statistical approaches. This paper presents a literature review of the use of DM with time series data, focusing on short-term stock prediction. This is an area that has been attracting a great deal of attention from researchers in the field. The main contribution of this paper is to provide an outline of the use of DM with time series data, using mainly examples related to short-term stock prediction, which is important for a better understanding of the field. Some of the main trends and open issues are also introduced.
On the assumption that random interruptions in the observation process are modeled by a sequence of independent Bernoulli random variables, we first generalize two kinds of nonlinear filtering methods with random interruption failures in the observation, based on the extended Kalman filter (EKF) and the unscented Kalman filter (UKF), abbreviated as GEKF and GUKF in this paper, respectively. The nonlinear filtering model is then established by using the radial basis function neural network (RBFNN) prototypes and the network weights as the state equation and the RBFNN output as the observation equation. Finally, we treat the filtering problem under missing observed data as a special case of nonlinear filtering with random intermittent failures by setting each missing observation to zero, without needing to pre-estimate the missing data, and use the GEKF-based RBFNN and the GUKF-based RBFNN to predict a ground radioactivity time series with missing data. Experimental results demonstrate that the predictions of the GUKF-based RBFNN accord well with the real ground radioactivity time series, while the predictions of the GEKF-based RBFNN diverge.
Travel time and delay are among the most important measures for gauging a transportation system's performance. To address the growing problem of congestion in the US, transportation planning legislation mandated the monitoring and analysis of system performance and produced a renewed interest in travel time and delay studies. The use of traditional sensors installed on major roads (e.g., inductive loops) for collecting data is necessary but not sufficient because of their limited coverage and the expensive costs of setting up and maintaining the required infrastructure. The GPS-based techniques employed by the University of Delaware have evolved into an automated system, which provides a more realistic picture of traffic flow along road links. However, human error and the weaknesses of using GPS devices in urban settings still have the potential to create inaccuracies. By simultaneously collecting data using three different techniques, the accuracy of the GPS positioning data and the resulting travel time and delay values could be objectively compared for automation and statistically compared for accuracy. It was found that the new technique provided the greatest automation, requiring minimal attention from the data collectors and automatically processing the data sets. The data samples were statistically analyzed using a combination of parametric and nonparametric tests. This analysis greatly favored the GeoStats GPS method over the other methods.
An existence time of order ε^(-4/3)|log ε|^(-2) is obtained for solutions of semilinear Klein-Gordon systems with different propagation speeds in one space dimension, for weakly decaying Cauchy data of size ε, under certain conditions on the nonlinearity.
The metro system has experienced rapid global growth over the past decades. However, few studies have paid attention to the evolution of system usage as the network expands. The paper's main objectives are to analyze passenger flow characteristics and evaluate travel time reliability for the Nanjing Metro network by visualizing the smart card data of April 2014, April 2015 and April 2016. We applied visualization techniques and comparative analyses to examine the changes in system usage before and after the system expansion; workdays, holidays and weekends were segmented separately for analysis. Results showed that workdays had obvious morning and evening peak hours due to daily commuting, while weekends and holidays had no obvious peak hours and daily traffic was evenly distributed. Some metro stations showed a serious directional imbalance, especially during the morning and evening peak hours of workdays. Serious unreliability occurred in morning peaks on workdays, the reliability of new lines was relatively low, and new stations had negative effects on existing stations in terms of reliability. Monitoring the evolution of system usage over the years enables the identification of system performance and can serve as an input for improving metro system quality.
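Two of the quantities discussed, peak-hour identification and directional imbalance, reduce to one-liners over smart-card counts. Both formulas below are illustrative assumptions; the paper does not state exact definitions.

```python
def peak_hours(hourly_counts, factor=1.5):
    """Hours whose ridership exceeds `factor` times the daily mean: a simple
    way to surface the morning/evening commuting peaks seen on workdays."""
    mean = sum(hourly_counts) / len(hourly_counts)
    return [h for h, c in enumerate(hourly_counts) if c > factor * mean]

def directional_imbalance(entries, exits):
    """Directional imbalance of a station in one period: |in - out| / (in + out),
    0 for balanced two-way demand, approaching 1 for one-directional flow."""
    total = entries + exits
    return abs(entries - exits) / total if total else 0.0
```

On workday data these functions would pick out the commuting hours and the residential/employment stations where tap-ins dominate tap-outs (or vice versa).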
For the accurate extraction of the cavity decay time, a selection of data points is added to the weighted least-squares method. We derive the expected precision, accuracy and computation cost of this improved method, and examine these performances by simulation. By comparing this method with the nonlinear least-squares fitting (NLSF) method and the linear regression of the sum (LRS) method in derivations and simulations, we find that it can achieve the same or even better precision, comparable accuracy, and lower computation cost. We test the method on experimental decay signals; the results agree with those obtained from the NLSF method.
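A common weighted linearization for extracting a decay time: take logs of y = A*exp(-t/tau) and fit a weighted line, with weights y^2 compensating for the noise distortion introduced by the log transform. The data-point selection rule that the paper adds on top of this is not reproduced here.

```python
import math

def fit_decay_time(ts, ys):
    """Weighted least-squares estimate of the decay time tau from samples of
    y = A*exp(-t/tau): fit a line to ln(y) with weights y^2, then tau = -1/slope."""
    ws = [y * y for y in ys]           # standard weights for the log transform
    ls = [math.log(y) for y in ys]
    sw = sum(ws)
    tbar = sum(w * t for w, t in zip(ws, ts)) / sw
    lbar = sum(w * l for w, l in zip(ws, ls)) / sw
    slope = (sum(w * (t - tbar) * (l - lbar) for w, t, l in zip(ws, ts, ls))
             / sum(w * (t - tbar) ** 2 for w, t in zip(ws, ts)))
    return -1.0 / slope
```

Because the fit is linear in the transformed variables, it needs only a single pass over the data, which is the computational advantage over iterative nonlinear least-squares fitting.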
Funding: This research was financially supported by the Ministry of Trade, Industry, and Energy (MOTIE), Korea, under the "Project for Research and Development with Middle Markets Enterprises and DNA (Data, Network, AI) Universities" (AI-based Safety Assessment and Management System for Concrete Structures) (Reference Number P0024559), supervised by the Korea Institute for Advancement of Technology (KIAT).
Funding: Supported by the National Natural Science Foundation of China (U2033213) and the Fundamental Research Funds for the Central Universities (FZ2021ZZ01, FZ2022ZX50).
Funding: This work was supported by the National Science and Technology Council, Taiwan, under Project NSTC 112-2221-E-029-015.
Funding: Supported by the National Key Research and Development Program under Grant 2022YFB3303702, the Key Program of the National Natural Science Foundation of China under Grant 61931001, the National Natural Science Foundation of China under Grant No. 62203368, and the Natural Science Foundation of Sichuan Province under Grant No. 2023NSFSC1440.
Funding: The National Key R&D Program of China under contract No. 2021YFC3101603.
Abstract: Ocean temperature is an important physical variable in marine ecosystems, and ocean temperature prediction is an important research objective in ocean-related fields. Currently, one of the commonly used approaches to ocean temperature prediction is data-driven, but research on such methods is mostly limited to the sea surface, with few studies on the prediction of internal ocean temperature. Existing graph neural network-based methods usually use predefined graphs or learned static graphs, which cannot capture the dynamic associations among the data. In this study, we propose a novel dynamic spatiotemporal graph neural network (DSTGN) to predict three-dimensional ocean temperature (3D-OT), which combines static graph learning and dynamic graph learning to automatically mine two types of unknown dependencies between sequences from the original 3D-OT data, without prior knowledge. Temporal and spatial dependencies in the time series are then captured using temporal and graph convolutions. We integrate dynamic graph learning, static graph learning, graph convolution, and temporal convolution into an end-to-end framework for 3D-OT prediction using time-series grid data. We conducted prediction experiments using high-resolution 3D-OT data from the Copernicus global ocean physical reanalysis, covering the vertical variation of temperature from the sea surface to 1000 m below. We compared five mainstream models commonly used for ocean temperature prediction, and the results show that our method achieves the best prediction results at all prediction scales.
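The abstract does not give DSTGN's graph-learning formulation; one common sketch of the idea (an assumption here, not necessarily the paper's exact form) derives a row-normalized dynamic adjacency matrix from learnable node embeddings and feeds it to a graph-convolution step:

```python
import numpy as np

def dynamic_adjacency(E1: np.ndarray, E2: np.ndarray) -> np.ndarray:
    """Row-normalized adjacency learned from node embeddings:
    A = softmax(relu(E1 @ E2.T)) along each row, so rows sum to 1."""
    logits = np.maximum(E1 @ E2.T, 0.0)          # non-negative affinities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    expl = np.exp(logits)
    return expl / expl.sum(axis=1, keepdims=True)

def graph_conv(A: np.ndarray, X: np.ndarray, W: np.ndarray) -> np.ndarray:
    """One graph-convolution step: aggregate neighbor features, then project."""
    return A @ X @ W

rng = np.random.default_rng(0)
n_nodes, emb_dim, feat_dim = 6, 4, 3
A = dynamic_adjacency(rng.normal(size=(n_nodes, emb_dim)),
                      rng.normal(size=(n_nodes, emb_dim)))
H = graph_conv(A, rng.normal(size=(n_nodes, feat_dim)),
               rng.normal(size=(feat_dim, feat_dim)))
```

In a dynamic variant the embeddings would themselves be functions of the recent input window, so the adjacency changes over time; a static learned graph keeps them fixed.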
Funding: The Soft Science Research Project of the Ministry of Housing and Urban-Rural Development of China (No. 2008-k5-14).
Abstract: In order to provide important parameters for schedule design, decision-making bases for transit operation management, and references for passengers traveling by bus, bus transit travel time reliability is analyzed and evaluated based on automatic vehicle location (AVL) data. Based on a statistical analysis of bus transit travel times, six indices are proposed: the coefficient of variation, the width of the travel time distribution, the mean commercial speed, the congestion frequency, the planning time index, and the buffer time index. Moreover, a framework for evaluating bus transit travel time reliability is constructed. Finally, a case study on a bus route in Suzhou is conducted. Results show that the proposed evaluation index system is simple and intuitive, and that it can effectively reflect the efficiency and stability of bus operations. A distinguishing feature of bus transit travel time reliability is its temporal pattern: it varies across different time periods.
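Three of the listed indices can be computed directly from observed run times. A minimal sketch, assuming the standard FHWA-style definitions (the paper's exact formulas are not given in the abstract) and using the best observed run as the free-flow proxy:

```python
import numpy as np

def reliability_indices(travel_times):
    """Planning time index = 95th-percentile / free-flow travel time;
    buffer time index = (95th-percentile - mean) / mean;
    coefficient of variation = std / mean."""
    tt = np.asarray(travel_times, dtype=float)
    t95 = np.percentile(tt, 95)
    free_flow = tt.min()          # proxy: best observed run on the route
    mean = tt.mean()
    return {
        "planning_time_index": t95 / free_flow,
        "buffer_time_index": (t95 - mean) / mean,
        "coefficient_of_variation": tt.std() / mean,
    }

# Ten AVL-derived run times (minutes) for one route and period.
idx = reliability_indices([20, 22, 21, 25, 30, 23, 22, 21, 24, 40])
```

Computing these per time-of-day bin is what exposes the temporal pattern the case study reports: peak-period bins show larger buffer and planning time indices than off-peak bins.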
Abstract: In the field of global change, the relationship between plant phenology and climate, which reflects the response of terrestrial ecosystems to global climate change, has become a key subject of intense study. Using the moderate-resolution imaging spectroradiometer (MODIS) enhanced vegetation index (EVI) collected every eight days during January-July from 2005 to 2008, together with the corresponding remote sensing data, we constructed cloud-free images via harmonic analysis of time series (HANTS). The cloud-free images were then treated with a dynamic threshold method to obtain the vegetation phenology in the green-up period and its distribution pattern. The distribution patterns in a freezing-disaster year and a normal year were then compared to reveal the effect of the freezing disaster on vegetation phenology in the experimental plot. The results show that the treated EVI data performed well in monitoring the effect of the freezing disaster on vegetation phenology, accurately reflecting the regions that suffered from the disaster. This suggests that processing remote sensing data with the HANTS method can effectively monitor the ecological characteristics of vegetation.
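The core of HANTS is a least-squares fit of a few harmonics to the composite series, which fills cloud-contaminated samples with values from the smooth seasonal curve. A simplified, single-pass sketch of the harmonic fit (full HANTS iterates and down-weights low outliers; that loop is omitted here):

```python
import numpy as np

def harmonic_fit(t, y, period, n_harmonics=2):
    """Least-squares fit of y(t) ~ a0 + sum_k [a_k cos + b_k sin](2*pi*k*t/period),
    returning the smoothed reconstruction at the sample times."""
    t = np.asarray(t, float)
    cols = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        w = 2 * np.pi * k * t / period
        cols += [np.cos(w), np.sin(w)]
    A = np.stack(cols, axis=1)
    coef, *_ = np.linalg.lstsq(A, np.asarray(y, float), rcond=None)
    return A @ coef

t = np.arange(0, 184, 8.0)                       # 8-day composites, Jan-Jul
clean = 0.3 + 0.2 * np.sin(2 * np.pi * t / 368)  # idealized seasonal EVI
noisy = clean.copy()
noisy[5] -= 0.15                                 # a cloud-contaminated dip
smoothed = harmonic_fit(t, noisy, period=368)
```

The fit pulls the contaminated sample back toward the seasonal curve, which is what makes the reconstructed images effectively cloud-free before the dynamic threshold step.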
Abstract: An earthquake of Ms = 6.9 occurred at Gonghe, Qinghai Province, China, on April 26, 1990. Three large aftershocks took place in the same region: Ms = 5.0 on May 7, 1990, Ms = 6.0 on Jan. 3, 1994, and Ms = 5.7 on Feb. 16, 1994. The long-period recordings of the main shock from the China Digital Seismograph Network (CDSN) are deconvolved for the source time functions using the corresponding recordings of the three aftershocks as empirical Green's functions (EGFs). No matter which aftershock is taken as the EGF, the relative source time functions (RSTFs) obtained are nearly identical. The RSTFs suggest that the Ms = 6.9 event consists of at least two subevents of approximately equal size whose occurrence times are about 30 s apart; the first has a duration of 12 s and a rise time of about 5 s, and the second has a duration of 17 s and a rise time of about 8 s. Comparing the RSTFs obtained from P- and SH-phases, we notice that those from SH-phases are slightly more complex than those from P-phases, implying that other, finer subevents exist during the process of the main shock. It is interesting that the results from the EGF deconvolution of long-period waveform data are in good agreement with the results from the moment tensor inversion and from the EGF deconvolution of broadband waveform data. Additionally, the two larger aftershocks are deconvolved for their RSTFs. The deconvolution results show that the processes of the Ms = 6.0 event on Jan. 3, 1994 and the Ms = 5.7 event on Feb. 16, 1994 are quite simple: both RSTFs are single impulses. The RSTFs of the Ms = 6.9 main shock obtained from different stations are noticed to be azimuthally dependent, their shapes differing slightly from station to station. However, the RSTFs of the two smaller aftershocks are not azimuthally dependent. The integrations of the RSTFs over the source processes are quite close to each other, i.e., the scalar seismic moments estimated from different stations are in good agreement.
Finally, the scalar seismic moments of the three aftershocks are compared. The relative scalar seismic moments of the three aftershocks deduced from the relative scalar seismic moments of the Ms = 6.9 main shock are very close to those inverted directly from the EGF deconvolution. The relative scalar seismic moments of the Ms = 6.9 main shock calculated using the three aftershocks as EGFs are 22 (the Ms = 6.0 aftershock as EGF), 26 (the Ms = 5.7 aftershock as EGF), and 66 (the Ms = 5.5 aftershock as EGF), respectively. Deducing from these results, the relative scalar seismic moments of the Ms = 6.0 to the Ms = 5.7 event, the Ms = 6.0 to the Ms = 5.5 event, and the Ms = 5.7 to the Ms = 5.5 event are 1.18, 3.00, and 2.54, respectively. The corresponding relative scalar seismic moments calculated directly from the waveform recordings are 1.15, 3.43, and 3.05.
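The EGF deconvolution itself is commonly implemented as water-level spectral division: the mainshock spectrum is divided by the aftershock spectrum, with small spectral amplitudes floored so the quotient does not blow up. A sketch on synthetic traces (the specific water-level value and the Gaussian test signals are assumptions for illustration):

```python
import numpy as np

def egf_deconvolve(mainshock, egf, water_level=1e-4):
    """Estimate a relative source time function (RSTF) by spectral division
    of the mainshock by the EGF, with a water level on the EGF power."""
    M = np.fft.rfft(mainshock)
    G = np.fft.rfft(egf)
    power = np.abs(G) ** 2
    floor = water_level * power.max()           # water level on |G|^2
    return np.fft.irfft(M * np.conj(G) / np.maximum(power, floor),
                        n=len(mainshock))

n = 512
t = np.arange(n)
egf = np.exp(-(t - 100.0) ** 2 / 50.0)          # smooth aftershock pulse
rstf_true = (np.exp(-(t - 30.0) ** 2 / 40.0)    # two subevents, ~equal size
             + 0.7 * np.exp(-(t - 150.0) ** 2 / 40.0))
# Synthetic mainshock = EGF convolved with the true RSTF (circularly).
mainshock = np.fft.irfft(np.fft.rfft(egf) * np.fft.rfft(rstf_true), n=n)
rstf = egf_deconvolve(mainshock, egf)
```

The two recovered pulses and their relative amplitudes are the analogue of the two subevents and the moment ratios discussed above; integrating the RSTF gives the relative scalar moment.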
Abstract: In this paper, we present a cluster-based algorithm for time series outlier mining. We use the discrete Fourier transform (DFT) to map time series from the time domain to the frequency domain, so that each time series can be represented as a point in k-dimensional space. For these points, a cluster-based algorithm is developed to mine the outliers. The algorithm first partitions the input points into disjoint clusters and then prunes the clusters that it judges cannot contain outliers. Our algorithm has been run on the electrical load time series of a steel enterprise and proved to be effective.
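A compact sketch of this pipeline follows. The clustering rule here (greedy grouping by a distance radius, flagging tiny clusters) is a simple stand-in for the paper's partition-and-prune algorithm, and the feature count `k` and radius are assumed parameters:

```python
import numpy as np

def dft_features(series, k=4):
    """Map a time series to the magnitudes of its first k DFT
    coefficients, i.e., a point in k-dimensional frequency space."""
    return np.abs(np.fft.rfft(np.asarray(series, float)))[:k]

def outliers_by_clustering(points, radius, min_cluster_size=2):
    """Greedy radius clustering; points falling in clusters smaller than
    min_cluster_size are reported as outliers (large clusters are pruned)."""
    pts = np.asarray(points)
    unassigned = list(range(len(pts)))
    flagged = []
    while unassigned:
        seed = unassigned.pop(0)
        member = [seed] + [j for j in unassigned
                           if np.linalg.norm(pts[j] - pts[seed]) <= radius]
        unassigned = [j for j in unassigned if j not in member]
        if len(member) < min_cluster_size:
            flagged.extend(member)
    return sorted(flagged)

# Nine similar daily load curves plus one anomalous day.
t = np.linspace(0, 1, 64, endpoint=False)
load = [np.sin(2 * np.pi * 3 * t) + 0.001 * i for i in range(9)]
load.append(3.0 * np.sin(2 * np.pi * 9 * t))
feats = np.array([dft_features(s) for s in load])
anoms = outliers_by_clustering(feats, radius=2.0)
```

Truncating to the first k DFT coefficients preserves most of the energy of smooth load curves (low-frequency dominance), which is why distances in the reduced space still separate the anomalous day.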
Funding: Supported by the National Natural Science Foundation of China (71301119) and the Shanghai Natural Science Foundation (12ZR1434100).
Abstract: Recent advances in intelligent transportation systems allow traffic safety studies to extend from historical data-based analyses to real-time applications. This study presents a new method to predict crash likelihood with traffic data collected by discrete loop detectors as well as web-crawled weather data. The matched case-control method and support vector machines (SVMs) were employed to identify the risk status. The adaptive synthetic over-sampling technique was applied to address the imbalanced dataset, and random forests were used to select the contributing factors and avoid over-fitting. The results indicate that the SVM classifier could successfully classify 76.32% of the crashes on the test dataset and 87.52% of the crashes on the overall dataset, which is relatively satisfactory compared with previous studies. Compared with the SVM classifier without the weather data, the classifier with the web-crawled weather data increased the crash prediction accuracy by 1.32% and decreased the false alarm rate by 1.72%, showing the potential value of massive web weather data. The mean impact value method was employed to evaluate the variable effects, and the results are consistent with those of most previous studies. The emerging technique based on discrete traffic data and web weather data proves to be more applicable to real-time safety management on freeways.
Abstract: Efficient real-time data exchange over the Internet plays a crucial role in the successful application of web-based systems. In this paper, a data transfer mechanism over the Internet is proposed for real-time web-based applications. The mechanism incorporates the eXtensible Markup Language (XML) and the Hierarchical Data Format (HDF) to provide a flexible and efficient data format. Heterogeneous transfer data are classified into light and heavy data, which are stored using XML and HDF respectively; the HDF data format is then mapped to Java Document Object Model (JDOM) objects in XML in the Java environment. These JDOM data objects are sent across computer networks with the support of the Java Remote Method Invocation (RMI) data transfer infrastructure. Client-defined data priority levels are implemented in RMI, which guide a server to transfer data objects at different priorities. A remote monitoring system for an industrial reactor process simulator is used as a case study to illustrate the proposed data transfer mechanism.
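The client-defined priority scheme can be sketched with a server-side priority queue (the class name `TransferQueue` and the example payloads are illustrative; the paper's Java RMI implementation is not reproduced here):

```python
import heapq
import itertools

class TransferQueue:
    """Serve pending data objects in client-defined priority order
    (lower number = higher priority), FIFO within the same priority."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker preserves FIFO order

    def enqueue(self, priority: int, payload):
        heapq.heappush(self._heap, (priority, next(self._counter), payload))

    def next_object(self):
        return heapq.heappop(self._heap)[2]

q = TransferQueue()
q.enqueue(2, "bulk-HDF-array")      # heavy data, low priority
q.enqueue(0, "alarm-XML-event")     # light data, high priority
q.enqueue(1, "trend-XML-snapshot")
order = [q.next_object() for _ in range(3)]
```

Splitting light (XML) from heavy (HDF) data and letting clients set priorities means an alarm event is never stuck behind a bulk array transfer, which matters for remote monitoring.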
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 71704016, 71331008, 71402010), the Natural Science Foundation of Hunan Province (Grant No. 2017JJ2267), the Educational Economy and Financial Research Base of Hunan Province (Grant No. 13JCJA2), and the Project of the China Scholarship Council for Overseas Studies (201508430121, 201208430233).
Abstract: In the big data environment, enterprises must constantly assimilate big data knowledge and private knowledge through multiple knowledge transfers to maintain their competitive advantage. The optimal time of knowledge transfer is one of the most important aspects of improving knowledge transfer efficiency. Based on an analysis of the complex characteristics of knowledge transfer in the big data environment, multiple knowledge transfers can be divided into two categories: the simultaneous transfer of various types of knowledge, and multiple knowledge transfers at different time points. Taking into consideration influential factors such as the knowledge type, knowledge structure, knowledge absorptive capacity, knowledge update rate, discount rate, market share, profit contributions of each type of knowledge, transfer costs, and product life cycle, time optimization models of multiple knowledge transfers in the big data environment are presented by maximizing the total discounted expected profits (DEPs) of an enterprise. Simulation experiments have been performed to verify the validity of the models, which can help enterprises determine the optimal time of multiple knowledge transfers in the big data environment.
Abstract: Data mining (DM) methods are being increasingly used for prediction with time series data, in addition to traditional statistical approaches. This paper presents a literature review of the use of DM with time series data, focusing on short-term stock prediction. This is an area that has been attracting a great deal of attention from researchers in the field. The main contribution of this paper is to provide an outline of the use of DM with time series data, using mainly examples related to short-term stock prediction, which is important for a better understanding of the field. Some of the main trends and open issues are also introduced.
Funding: Project supported by the State Key Program of the National Natural Science Foundation of China (Grant No. 60835004), the Natural Science Foundation of Jiangsu Province, China (Grant No. BK2009727), the Natural Science Foundation of Higher Education Institutions of Jiangsu Province, China (Grant No. 10KJB510004), and the National Natural Science Foundation of China (Grant No. 61075028).
Abstract: On the assumption that random interruptions in the observation process are modeled by a sequence of independent Bernoulli random variables, we first generalize two kinds of nonlinear filtering methods with random interruption failures in the observation, based on the extended Kalman filter (EKF) and the unscented Kalman filter (UKF), shortened here as GEKF and GUKF, respectively. The nonlinear filtering model is then established by using radial basis function neural network (RBFNN) prototypes and the network weights as the state equation, with the RBFNN output as the observation equation. Finally, we treat the filtering problem under missing observed data as a special case of nonlinear filtering with random intermittent failures by setting each missing datum to zero, without needing to pre-estimate the missing data, and use the GEKF-based RBFNN and the GUKF-based RBFNN to predict a ground radioactivity time series with missing data. Experimental results demonstrate that the predictions of the GUKF-based RBFNN accord well with the real ground radioactivity time series, while the predictions of the GEKF-based RBFNN diverge.
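The intermittent-observation model can be illustrated with a scalar linear Kalman filter in which each measurement arrives with a Bernoulli indicator γ, and the measurement update is simply skipped when γ = 0. This is a linear stand-in for the GEKF/GUKF nonlinear case, with all noise parameters chosen for illustration:

```python
import numpy as np

def kalman_intermittent(zs, gammas, q=1e-4, r=0.05, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter with Bernoulli observation losses:
    when gamma = 0 the observation is missing and only the time update runs,
    so the missing sample never needs to be pre-estimated or filled in."""
    x, p, est = x0, p0, []
    for z, g in zip(zs, gammas):
        p = p + q                      # time update (random-walk state model)
        if g:                          # measurement update only if observed
            k = p / (p + r)
            x = x + k * (z - x)
            p = (1 - k) * p
        # when g == 0, x and p simply propagate through the interruption
        est.append(x)
    return np.array(est)

rng = np.random.default_rng(1)
truth = 5.0
zs = truth + rng.normal(0, 0.2, size=200)   # noisy measurements of a constant
gammas = rng.random(200) > 0.3              # ~30% of observations are lost
est = kalman_intermittent(zs, gammas)
```

Setting a missing datum to zero in the generalized filter is equivalent to this skipped update: the zero carries no innovation because the Bernoulli indicator multiplies the gain.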
Abstract: Travel time and delay are among the most important measures for gauging a transportation system's performance. To address the growing problem of congestion in the US, transportation planning legislation mandated the monitoring and analysis of system performance and produced a renewed interest in travel time and delay studies. The use of traditional sensors installed on major roads (e.g., inductive loops) for collecting data is necessary but not sufficient because of their limited coverage and the expensive costs of setting up and maintaining the required infrastructure. The GPS-based techniques employed by the University of Delaware have evolved into an automated system that provides a more realistic picture of traffic flow along road links. However, human error and the weaknesses of GPS devices in urban settings still have the potential to create inaccuracies. By simultaneously collecting data using three different techniques, the accuracy of the GPS positioning data and the resulting travel time and delay values could be objectively compared for automation and statistically compared for accuracy. It was found that the new technique provided the greatest automation, requiring minimal attention from the data collectors and automatically processing the data sets. The data samples were statistically analyzed using a combination of parametric and nonparametric tests. This analysis greatly favored the GeoStats GPS method over the other methods.
Abstract: An ε^(-4/3)|log ε|^(-2) result is obtained for the existence time of solutions of semilinear Klein-Gordon systems with different propagation speeds in one space dimension, for weakly decaying Cauchy data of size ε, under certain conditions on the nonlinearity.
Funding: Sponsored by the Projects of International Cooperation and Exchange of the National Natural Science Foundation of China (Grant No. 51561135003) and the Key Project of the National Natural Science Foundation of China (Grant No. 51338003).
Abstract: Metro systems have experienced rapid global growth over the past decades. However, few studies have paid attention to the evolution of system usage as networks expand. This paper's main objectives are to analyze passenger flow characteristics and evaluate travel time reliability for the Nanjing Metro network by visualizing the smart card data of April 2014, April 2015, and April 2016. We performed visualization techniques and comparative analyses to examine the changes in system usage before and after the system expansion. Workdays, holidays, and weekends were segmented for separate analysis. Results show that workdays had obvious morning and evening peak hours due to daily commuting, while no obvious peak hours existed on weekends and holidays, when daily traffic was evenly distributed. Some metro stations had a serious directional imbalance, especially during the morning and evening peak hours of workdays. Serious unreliability occurred in morning peaks on workdays, and the reliability of new lines was relatively low; meanwhile, new stations had negative effects on existing stations in terms of reliability. Monitoring the evolution of system usage over the years enables the identification of system performance and can serve as an input for improving metro system quality.
Funding: Supported by the Preeminent Youth Fund of Sichuan Province, China (Grant No. 2012JQ0012), the National Natural Science Foundation of China (Grant Nos. 11173008, 10974202, and 60978049), and the National Key Scientific and Research Equipment Development Project of China (Grant No. ZDYZ2013-2).
Abstract: For the accurate extraction of the cavity decay time, a data-point selection step is added to the weighted least-squares method. We derive the expected precision, accuracy, and computation cost of this improved method, and examine these performances by simulation. By comparing this method with the nonlinear least-squares fitting (NLSF) method and the linear regression of the sum (LRS) method in derivations and simulations, we find that it can achieve the same or even better precision, comparable accuracy, and lower computation cost. We test the method on experimental decay signals; the results agree with those obtained from the NLSF method.
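The weighted least-squares extraction can be sketched by linearizing the exponential decay and weighting each log-sample by its squared amplitude. The selection rule shown (discarding points below a noise floor) is an assumed example of the kind of data-point selection the paper supplements; the exact criterion is not given in the abstract:

```python
import numpy as np

def fit_decay_time(t, y, floor=0.05):
    """Fit y = A * exp(-t / tau) by weighted linear regression on ln(y).
    Points below floor * max(y) are discarded (data-point selection),
    and weights w = y^2 offset the noise amplification of the log transform."""
    t, y = np.asarray(t, float), np.asarray(y, float)
    keep = y > floor * y.max()          # selection of data points
    t, y = t[keep], y[keep]
    w = y ** 2
    X = np.stack([np.ones_like(t), t], axis=1)
    W = np.diag(w)
    # weighted normal equations: (X^T W X) beta = X^T W ln(y)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ np.log(y))
    return -1.0 / beta[1]               # beta = [ln A, -1/tau]

t = np.linspace(0, 5e-5, 500)           # 50 us ring-down record
y = 2.0 * np.exp(-t / 1e-5)             # noiseless decay with tau = 10 us
tau = fit_decay_time(t, y)
```

Because the fit reduces to one linear solve, its computation cost is far below iterative NLSF, which is consistent with the cost advantage the abstract reports.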