The prediction for Multivariate Time Series (MTS) explores the interrelationships among variables at historical moments, extracts their relevant characteristics, and is widely used in finance, weather, complex industries and other fields; it is also important for constructing digital twin systems. However, existing methods do not take full advantage of the potential properties of variables, which results in poor prediction accuracy. In this paper, we propose the Adaptive Fused Spatial-Temporal Graph Convolutional Network (AFSTGCN). First, to address the problem of the unknown spatial-temporal structure, we construct the Adaptive Fused Spatial-Temporal Graph (AFSTG) layer. Specifically, we fuse the spatial-temporal graph based on the interrelationship of spatial graphs. Simultaneously, we construct the adaptive adjacency matrix of the spatial-temporal graph using node-embedding methods. Subsequently, to overcome the insufficient extraction of disordered correlation features, we construct the Adaptive Fused Spatial-Temporal Graph Convolutional (AFSTGC) module. The module forces the reordering of disordered temporal, spatial and spatial-temporal dependencies into rule-like data. AFSTGCN dynamically and synchronously acquires potential temporal, spatial and spatial-temporal correlations, thereby fully extracting rich hierarchical feature information to enhance prediction accuracy. Experiments on different types of MTS datasets demonstrate that the model achieves state-of-the-art single-step and multi-step performance compared with eight other deep learning models.
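As a rough illustration of the adaptive adjacency idea described in this abstract, the sketch below builds an adjacency matrix from two node-embedding tables, A = softmax(relu(E1·E2ᵀ)), a construction commonly used in adaptive graph convolution models. It is a minimal NumPy stand-in: in the actual AFSTG layer the embeddings would be trainable parameters, and the function name and sizes here are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def adaptive_adjacency(num_nodes, emb_dim, seed=0):
    """Build an adaptive adjacency matrix from two node-embedding tables,
    A = softmax(relu(E1 @ E2.T)). In a real model E1 and E2 would be trainable
    parameters; here they are random placeholders."""
    rng = np.random.default_rng(seed)
    E1 = rng.normal(size=(num_nodes, emb_dim))
    E2 = rng.normal(size=(num_nodes, emb_dim))
    scores = np.maximum(E1 @ E2.T, 0.0)              # ReLU keeps non-negative affinities
    scores -= scores.max(axis=1, keepdims=True)      # stabilise the softmax
    A = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    return A                                         # row-normalised, usable as a graph adjacency

A = adaptive_adjacency(num_nodes=8, emb_dim=4)
print(A.shape, A.sum(axis=1))                        # (8, 8), each row sums to 1
```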
In the Industrial Internet of Things (IIoT), sensors generate time series data to reflect the working state. When the systems are attacked, timely identification of outliers in the time series is critical to ensure security. Although many anomaly detection methods have been proposed, the temporal correlation of the time series over the same sensor and the state (spatial) correlation between different sensors are rarely considered simultaneously. Owing to the superior capability of the Transformer in learning time series features, this paper proposes a time series anomaly detection method based on a spatial-temporal network and an improved Transformer. Additionally, methods based on graph neural networks typically include a graph structure learning module and an anomaly detection module, which are interdependent. However, in the initial phase of training, since neither module has reached an optimal state, their performance may influence each other. This makes it hard for end-to-end training to effectively direct the learning trajectory of each module. The interdependence between the modules, coupled with the initial instability, may prevent the model from finding the optimal solution during training, resulting in unsatisfactory results. We therefore introduce an adaptive graph structure learning method to obtain the optimal model parameters and graph structure. Experiments on two publicly available datasets demonstrate that the proposed method attains better anomaly detection results than other methods.
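The abstract does not spell out the graph structure learning step; a common way such sensor graphs are learned in GNN-based detectors is to keep, for every sensor, edges to its top-k most similar learned embeddings. The sketch below shows only that generic idea; the function name `topk_similarity_graph`, the embedding size and k are illustrative assumptions rather than the adaptive method proposed here.

```python
import numpy as np

def topk_similarity_graph(sensor_emb, k=3):
    """Derive a sparse directed graph over sensors from embedding similarity.
    Each sensor keeps edges to its k most similar peers (cosine similarity)."""
    norm = sensor_emb / np.linalg.norm(sensor_emb, axis=1, keepdims=True)
    sim = norm @ norm.T
    np.fill_diagonal(sim, -np.inf)                   # no self-loops
    adj = np.zeros_like(sim)
    for i, row in enumerate(sim):
        neighbours = np.argsort(row)[-k:]            # indices of the k largest similarities
        adj[i, neighbours] = 1.0
    return adj

emb = np.random.default_rng(1).normal(size=(10, 16))  # 10 sensors, 16-dim embeddings
print(topk_similarity_graph(emb).sum(axis=1))          # each row has exactly k = 3 edges
```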
Classification models for multivariate time series have drawn the interest of many researchers to the field with the objective of developing accurate and efficient models. However, limited research has been conducted on generating adversarial samples for multivariate time series classification models. Adversarial samples could become a security concern in systems with complex sets of sensors. This study proposes extending the existing gradient adversarial transformation network (GATN) in combination with adversarial autoencoders to attack multivariate time series classification models. The proposed model attacks classification models by utilizing a distilled model to imitate the output of the multivariate time series classification model. In addition, the adversarial generator function is replaced with a variational autoencoder to enhance the adversarial samples. The developed methodology is tested on two multivariate time series classification models: 1-nearest neighbor dynamic time warping (1-NN DTW) and a fully convolutional network (FCN). This study utilizes 30 multivariate time series benchmarks provided by the University of East Anglia (UEA) and University of California Riverside (UCR). The use of adversarial autoencoders shows an increase in the fraction of successful adversaries generated on multivariate time series. To the best of our knowledge, this is the first study to explore adversarial attacks on multivariate time series. Additionally, we recommend future research utilizing the generated latent space from the variational autoencoders.
Time series data are widely used in various fields such as electricity forecasting, exchange rate forecasting, and solar power generation forecasting, and therefore time series prediction is of great significance. Recently, the encoder-decoder model combined with long short-term memory (LSTM) has been widely used for multivariate time series prediction. However, the encoder can only encode information into fixed-length vectors, hence the performance of the model decreases rapidly as the length of the input or output sequence increases. To solve this problem, we propose a combination model named AR_CLSTM based on the encoder-decoder structure and linear autoregression. The model uses a time-step-based attention mechanism to enable the decoder to adaptively select past hidden states and extract useful information, and then uses a convolution structure to learn the internal relationships between different dimensions of the multivariate time series. In addition, AR_CLSTM combines the traditional linear autoregressive method to learn the linear relationship of the time series, so as to further reduce the prediction error of the encoder-decoder structure and improve multivariate time series prediction. Experiments show that the AR_CLSTM model performs well on different time series prediction tasks, and its root mean square error, mean square error, and mean absolute error all decrease significantly.
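To make the autoregressive component of the hybrid concrete, the following sketch fits a univariate AR(p) model by least squares and produces a one-step forecast that would be added to the neural (encoder-decoder) output. It is a minimal stand-in under assumed settings (lag order, a zero placeholder for the network forecast), not the AR_CLSTM implementation.

```python
import numpy as np

def fit_ar(series, p):
    """Fit a univariate AR(p) model by ordinary least squares and return its
    coefficients plus a one-step-ahead forecast."""
    X = np.column_stack([series[p - i - 1: len(series) - i - 1] for i in range(p)])
    X = np.column_stack([X, np.ones(len(X))])        # add intercept
    y = series[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    next_inputs = np.append(series[-1:-p - 1:-1], 1.0)
    return coef, float(next_inputs @ coef)

rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=200))                  # toy series
coef, ar_forecast = fit_ar(x, p=5)
nn_forecast = 0.0                                    # placeholder for the encoder-decoder output
print(ar_forecast + nn_forecast)                     # combined prediction, as in the hybrid idea
```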
In order to effectively analyse the multivariate time series data of complex processes, a generic reconstruction technique based on the reduction theory of rough sets was proposed. Firstly, the phase space of the multivariate time series was reconstructed by a classical reconstruction technique. Then, the original decision table of rough set theory was set up according to the embedding dimensions and time delays of the original reconstructed phase space, and rough set reduction was used to delete the redundant dimensions and irrelevant variables and to reconstruct the generic phase space. Finally, the input vectors for the prediction of the multivariate time series were extracted according to the generic reconstruction results to identify the parameters of the prediction model. Verification results show that the developed reconstruction method leads to better generalization ability for the prediction model and is feasible and worthwhile in application.
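A minimal sketch of the classical delay reconstruction mentioned above: each variable is embedded with its own dimension and delay, and the per-variable blocks are stacked so that a reduction step (omitted here) could prune redundant columns. Function names, dimensions and delays are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Classical delay reconstruction of a univariate series: each row is
    [x(t), x(t - tau), ..., x(t - (dim - 1) * tau)]."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[(dim - 1 - j) * tau: (dim - 1 - j) * tau + n]
                            for j in range(dim)])

def embed_multivariate(data, dims, taus):
    """Stack the delay reconstructions of every variable side by side; rough-set
    reduction would then prune redundant columns of this matrix."""
    blocks = [delay_embed(data[:, k], dims[k], taus[k]) for k in range(data.shape[1])]
    n = min(b.shape[0] for b in blocks)
    return np.hstack([b[-n:] for b in blocks])       # align blocks on the most recent rows

x = np.random.default_rng(0).normal(size=(500, 3))   # toy 3-variable series
print(embed_multivariate(x, dims=[3, 2, 4], taus=[2, 1, 3]).shape)
```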
Sensors produce a large amount of multivariate time series data to record the states of Internet of Things (IoT) systems. Multivariate time series timestamp anomaly detection (TSAD) can identify the timestamps of attacks and malfunctions. However, it is necessary to determine which sensor or indicator is abnormal to facilitate a more detailed diagnosis, a process referred to as fine-grained anomaly detection (FGAD). Although FGAD can be extended from TSAD methods, existing works do not provide a quantitative evaluation, and the performance is unknown. Therefore, to tackle the FGAD problem, this paper first verifies that TSAD methods achieve low performance when applied directly to the FGAD task because of the excessive fusion of features and the neglect of dynamic changes in the relationships between indicators. Accordingly, this paper proposes a multivariate time series fine-grained anomaly detection (MFGAD) framework. To avoid excessive fusion of features, MFGAD constructs two sub-models that independently identify the abnormal timestamp and the abnormal indicator instead of a single model, and then combines the two kinds of abnormal results to detect the fine-grained anomaly. Based on this framework, an algorithm combining a Graph Attention Neural Network (GAT) and an Attention Convolutional Long Short-Term Memory (A-ConvLSTM) is proposed, in which GAT learns temporal features of multiple indicators to detect abnormal timestamps and A-ConvLSTM captures the dynamic relationship between indicators to identify abnormal indicators. Extensive simulations on a real-world dataset demonstrate that the proposed algorithm achieves a higher F1 score and hit rate than extensions of existing TSAD methods, with the benefit of two independent sub-models for timestamp and indicator detection.
Some reconstruction-based anomaly detection models for multivariate time series have brought impressive performance advancements but suffer from weak generalization ability and a lack of anomaly identification. These limitations can result in misjudgments by the models, leading to degraded overall detection performance. This paper proposes a novel transformer-like anomaly detection model adopting a contrastive learning module and a memory block (CLME) to overcome the above limitations. The contrastive learning module, tailored for time series data, learns contextual relationships to generate temporally fine-grained representations. The memory block records normal patterns of these representations through attention-based addressing and reintegration mechanisms. Together, these two modules effectively alleviate the generalization problem. Furthermore, this paper introduces a fusion anomaly detection strategy that comprehensively takes into account both the residual and the feature spaces. Such a strategy enlarges the discrepancies between normal and abnormal data, which is more conducive to anomaly identification. The proposed CLME model not only efficiently enhances generalization performance but also improves anomaly detection ability. To validate the efficacy of the proposed approach, extensive experiments are conducted on well-established benchmark datasets, including SWaT, PSM, WADI, and MSL. The results demonstrate outstanding performance, with F1 scores of 90.58%, 94.83%, 91.58%, and 91.75%, respectively. These findings affirm the superiority of the CLME model over existing state-of-the-art anomaly detection methodologies in accurately detecting anomalies within complex datasets.
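One way to picture a fusion of residual and feature spaces is a weighted sum of a reconstruction error and the distance of a latent code to its nearest memorised normal pattern, as sketched below. The weighting `alpha` and the nearest-memory distance are assumptions made for illustration; the CLME paper's exact scoring may differ.

```python
import numpy as np

def fused_anomaly_score(x, x_hat, z, z_mem, alpha=0.5):
    """Combine a residual-space score (reconstruction error) with a feature-space
    score (distance of the latent code to its closest memorised normal pattern)."""
    residual = np.linalg.norm(x - x_hat, axis=-1)                 # per-timestamp reconstruction error
    dists = np.linalg.norm(z[:, None, :] - z_mem[None, :, :], axis=-1)
    feature = dists.min(axis=1)                                   # distance to the closest memory item
    return alpha * residual + (1.0 - alpha) * feature

rng = np.random.default_rng(0)
x, x_hat = rng.normal(size=(100, 8)), rng.normal(size=(100, 8))   # inputs and reconstructions
z, z_mem = rng.normal(size=(100, 16)), rng.normal(size=(20, 16))  # latent codes and memory slots
scores = fused_anomaly_score(x, x_hat, z, z_mem)
print((scores > np.quantile(scores, 0.99)).sum(), "timestamps flagged")
```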
A forecasting method for oil well production based on multivariate time series (MTS) analysis and a vector autoregressive (VAR) machine learning model for waterflooding reservoirs is proposed, and an example application is carried out. This method first uses MTS analysis to optimize injection and production data on the basis of well pattern analysis. The oil production of different production wells and the water injection of injection wells in the well group are regarded as mutually related time series. A VAR model is then established to mine the linear relationships from the MTS data and forecast oil well production by model fitting. Analysis of historical production data of waterflooding reservoirs shows that, compared with the history matching results of numerical reservoir simulation, the production forecasts from the machine learning model are more accurate, and uncertainty analysis can improve the safety of the forecasting results. Furthermore, impulse response analysis can evaluate the oil production contribution of the injection well, which can provide theoretical guidance for adjusting the waterflooding development plan.
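A minimal NumPy sketch of the VAR idea: fit a first-order vector autoregression to the joint injection/production series by least squares and roll it forward for a multi-step forecast. The actual method selects the lag order and additionally performs impulse response and uncertainty analysis, which are not shown; the sizes here are toy assumptions (libraries such as statsmodels also offer a full VAR implementation).

```python
import numpy as np

def fit_var1(Y):
    """Fit a first-order vector autoregression Y_t = c + A @ Y_{t-1} + e_t by
    least squares and return (c, A)."""
    X = np.column_stack([np.ones(len(Y) - 1), Y[:-1]])   # intercept + lagged values
    B, *_ = np.linalg.lstsq(X, Y[1:], rcond=None)
    return B[0], B[1:].T                                  # c has shape (m,), A has shape (m, m)

def forecast(c, A, y_last, steps):
    """Roll the fitted VAR(1) forward for a multi-step forecast."""
    out, y = [], y_last
    for _ in range(steps):
        y = c + A @ y
        out.append(y)
    return np.array(out)

rng = np.random.default_rng(0)
Y = np.cumsum(rng.normal(size=(120, 4)), axis=0)          # toy monthly rates for 4 wells
c, A = fit_var1(Y)
print(forecast(c, A, Y[-1], steps=6))                     # six-step-ahead production forecast
```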
Long-term multivariate time series forecasting is an important task in engineering applications. It helps grasp the future development trend of data in real time, which is of great significance for a wide variety of fields. Due to the non-linear and unstable characteristics of multivariate time series, existing methods encounter difficulties in analyzing complex high-dimensional data and capturing latent relationships between the variables in a time series, thus affecting long-term prediction performance. In this paper, we propose a novel time series forecasting model based on multilayer perceptrons that combines spatio-temporal decomposition and doubly residual stacking, namely the Spatio-Temporal Decomposition Neural Network (STDNet). We decompose the originally complex and unstable time series into two parts, a temporal term and a spatial term. We design a temporal module based on an auto-correlation mechanism to discover temporal dependencies at the sub-series level, and a spatial module based on a convolutional neural network and a self-attention mechanism to integrate multivariate information from two dimensions, global and local, respectively. We then integrate the results obtained from the different modules to get the final forecast. Extensive experiments on four real-world datasets show that STDNet significantly outperforms other state-of-the-art methods, providing an effective solution for long-term time series forecasting.
Methods to determine the time delays and embedding dimensions in the phase space delay reconstruction of multivariate chaotic time series are proposed. Three nonlinear prediction methods for multivariate chaotic time series, namely local mean prediction, local linear prediction and BP neural network prediction, are considered. Simulation results on the Lorenz system show that, no matter which nonlinear prediction method is used, the prediction error of the multivariate chaotic time series is much smaller than that of the univariate time series, even when the multivariate series uses only half as much data as the univariate one. The results also verify, from the viewpoint of minimizing the prediction error, that the methods for determining the time delays and embedding dimensions are correct.
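Local mean prediction, the simplest of the three methods listed, can be sketched as a nearest-neighbour average in the reconstructed phase space: find the k states closest to the current state and average their known one-step-ahead values. The embedding dimension, delay and k below are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

def local_mean_predict(embedded, targets, query, k=5):
    """Local mean prediction: find the k nearest reconstructed state vectors to the
    query state and average their known one-step-ahead values."""
    dists = np.linalg.norm(embedded - query, axis=1)
    nearest = np.argsort(dists)[:k]
    return targets[nearest].mean()

# toy usage on a delay-reconstructed scalar series with dim = 3, tau = 1
rng = np.random.default_rng(0)
x = np.sin(np.linspace(0, 30, 600)) + 0.05 * rng.normal(size=600)
states = np.column_stack([x[2:-1], x[1:-2], x[:-3]])     # rows are [x(t), x(t-1), x(t-2)]
targets = x[3:]                                          # x(t+1) for each state
print(local_mean_predict(states[:-1], targets[:-1], states[-1], k=5), x[-1])
```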
Wind power is one of the fastest-growing renewable energy sectors and is instrumental in the ongoing decarbonization process. However, wind turbines are subjected to a wide range of dynamic loads which can cause more frequent failures and downtime periods, leading to ever-increasing attention to effective condition monitoring strategies. In this paper, we propose a novel unsupervised deep anomaly detection framework to detect anomalies in wind turbines based on SCADA data. We introduce a promising neural architecture, namely a Graph Convolutional Autoencoder for Multivariate Time series, to model the sensor network as a dynamical functional graph. This structure improves the unsupervised learning capabilities of autoencoders by considering individual sensor measurements together with the nonlinear correlations existing among signals. On this basis, we developed a deep anomaly detection framework that was validated on 12 failure events that occurred during 20 months of operation of four wind turbines. The results show that the proposed framework successfully detects anomalies and anticipates SCADA alarms, outperforming two other recent neural approaches.
Detection and clarification of cause-effect relationships among variables is an important problem in time series analysis. This paper provides a method that employs both mutual information and conditional mutual information to identify the causal structure of multivariate time series causal graphical models. A three-step procedure is developed to learn the contemporaneous and lagged causal relationships of time series causal graphs. Contrary to conventional constraint-based algorithms, the proposed algorithm does not assume any special kind of distribution and is nonparametric. These properties are especially appealing for the inference of time series causal graphs when prior knowledge about the data model is not available. Simulations and a case analysis demonstrate the effectiveness of the method.
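A nonparametric plug-in estimate of mutual information from a 2-D histogram is a simple way to score pairwise dependence, as sketched below; the paper's three-step procedure additionally uses conditional mutual information (omitted here) to separate contemporaneous from lagged links. The bin count and toy data are assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram plug-in estimate of I(X; Y) in nats; nonparametric, in the
    distribution-free spirit of the method."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
y = 0.8 * x + 0.6 * rng.normal(size=5000)     # y depends on x
z = rng.normal(size=5000)                     # independent of x
print(mutual_information(x, y), mutual_information(x, z))   # large vs. close to zero
```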
The UK is the most important partner of the EU in economic and other fields due to geographical proximity. It was one of the largest economies in the EU and its per capita income is higher than the EU average, so it was a net contributor to the EU. With the UK's membership of the EU having ended on 31 January 2019, there are concerns that Brexit may have a significant impact on the EU, resulting in social, economic, political, and institutional changes in the EU. While the impact of Brexit on the UK has been the subject of considerable scholarly interest in recent years, there is relatively little literature on the impact of Brexit on the EU. This paper focuses on evaluating the impact of Brexit on the EU economy and other relevant aspects along three dimensions: GDP, PPP, and quarterly GDP growth. Employing quantitative analysis techniques that include a vector autoregression model, a multivariate time series model with intervention variables, and autoregressive integrated moving average models, this paper obtains important and novel evidence about the potential impact of Brexit on the EU economy, pointing out that Brexit is of far-reaching significance to the EU. The analysis uses several statistical models to screen out key influencing factors, which can be used to predict the total GDP of the EU over the next five years. The results show that the EU economy will react negatively to a "no-deal" Brexit, and its economic growth rate will slow down significantly over the next five years. Finally, we put forward relevant policy suggestions on how to deal with the negative impact of Brexit on the EU.
Multivariate time series segmentation is an important problem in data mining and has arisen in more and more practical applications in recent years. The task of time series segmentation is to partition a time series into segments by detecting abrupt changes or anomalies in the series. Multivariate time series segmentation can provide meaningful information for further data analysis, prediction and policy decisions. Since a time series can be considered a piecewise continuous function, it is natural to take its total variation norm as prior information about the series. In this paper, by minimizing the negative log-likelihood function of a time series, we propose a total variation based model for multivariate time series segmentation. An iterative process is applied to solve the proposed model, and a search combined with dynamic programming is designed to determine the breakpoints. The experimental results show that the proposed method is efficient for multivariate time series segmentation and is competitive with existing methods.
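The breakpoint search can be sketched with classical dynamic programming: minimise the within-segment squared deviation from the segment mean plus a fixed penalty per breakpoint. This uses a squared-error cost as a stand-in for the paper's negative log-likelihood plus total variation formulation; the penalty value and data are toy assumptions.

```python
import numpy as np

def segment(X, penalty):
    """Optimal segmentation by dynamic programming: minimise the sum of within-segment
    squared deviations from the segment mean plus a penalty per segment."""
    n = len(X)
    csum = np.vstack([np.zeros(X.shape[1]), np.cumsum(X, axis=0)])
    csq = np.vstack([np.zeros(X.shape[1]), np.cumsum(X ** 2, axis=0)])

    def cost(i, j):                             # squared-error cost of segment X[i:j]
        s, q, m = csum[j] - csum[i], csq[j] - csq[i], j - i
        return float((q - s ** 2 / m).sum())

    best = np.full(n + 1, np.inf)
    best[0], prev = 0.0, np.zeros(n + 1, dtype=int)
    for j in range(1, n + 1):
        for i in range(j):
            c = best[i] + cost(i, j) + penalty
            if c < best[j]:
                best[j], prev[j] = c, i
    cuts, j = [], n
    while j > 0:
        cuts.append(j)
        j = prev[j]
    return sorted(cuts)[:-1]                    # interior breakpoints only

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (80, 2)), rng.normal(4, 1, (60, 2)), rng.normal(-3, 1, (60, 2))])
print(segment(X, penalty=20.0))                 # expected breakpoints near 80 and 140
```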
Long-term prediction is still a difficult problem in data mining. Various kinds of recurrent neural networks are usually used for prediction. However, as the number of prediction steps increases, the prediction accuracy decreases rapidly. In order to improve the accuracy of long-term prediction, we propose the Variational Auto-Encoder Conditional Generative Adversarial Network (VAECGAN) framework. Our model is divided into three parts. The first part is the encoder net, which encodes the exogenous sequence into latent space vectors and fully preserves the information carried by the exogenous sequence. The second part is the generator net, which is responsible for generating prediction data. In the third part, the discriminator net is used to classify and provide feedback, adjusting data generation and improving prediction accuracy. Finally, extensive empirical studies on five real-world datasets (NASDAQ, SML, Energy, EEG, KDDCUP) demonstrate the effectiveness and robustness of our proposed approach.
Multivariate Time Series (MTS) forecasting is an essential problem in many fields. Accurate forecasting results can effectively help decision-making. To date, many MTS forecasting methods have been proposed and widely applied. However, these methods assume that the predicted value of a single variable is affected by all other variables, ignoring the causal relationships among variables. To address this issue, we propose a novel end-to-end deep learning model, termed graph neural network with neural Granger causality (CauGNN), in this paper. To characterize the causal information among variables, we introduce a neural Granger causality graph into our model. Each variable is regarded as a graph node, and each edge represents the causal relationship between variables. In addition, convolutional neural network filters with different perception scales are used for time series feature extraction to generate the feature of each node. Finally, a graph neural network is adopted to tackle the forecasting problem on the graph structure generated from the MTS. Three benchmark datasets from the real world are used to evaluate the proposed CauGNN, and comprehensive experiments show that the proposed method achieves state-of-the-art results on the MTS forecasting task.
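The edges of a Granger causality graph can be illustrated with the classical linear test: an edge x → y is supported when adding lags of x reduces the residual variance of an autoregression for y. The paper uses a neural Granger formulation instead; the sketch below is a linear stand-in with an assumed lag order.

```python
import numpy as np

def granger_score(x_cause, y_effect, p=3):
    """Simple Granger-style score: how much adding lags of x reduces the residual
    variance of an AR model for y. Larger scores suggest a stronger edge x -> y."""
    n = len(y_effect)
    lags_y = np.column_stack([y_effect[p - i - 1: n - i - 1] for i in range(p)])
    lags_x = np.column_stack([x_cause[p - i - 1: n - i - 1] for i in range(p)])
    target = y_effect[p:]

    def rss(X):
        X = np.column_stack([X, np.ones(len(X))])
        beta, *_ = np.linalg.lstsq(X, target, rcond=None)
        return float(((target - X @ beta) ** 2).sum())

    restricted, full = rss(lags_y), rss(np.column_stack([lags_y, lags_x]))
    return np.log(restricted / full)

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = np.zeros(1000)
for t in range(2, 1000):                           # y is driven by lagged x
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 2] + 0.1 * rng.normal()
print(granger_score(x, y), granger_score(y, x))    # the first score should be clearly larger
```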
This paper introduces an approach to analyzing multivariate time series (MVTS) data through progressive temporal abstraction of the data into patterns characterizing the behavior of the studied dynamic phenomenon. The paper focuses on two core challenges: identifying basic behavior patterns of individual attributes, and examining the temporal relations between these patterns across the range of attributes to derive higher-level abstractions of multi-attribute behavior. The proposed approach combines existing methods for univariate pattern extraction, computation of temporal relations according to Allen's time interval algebra, visual displays of the temporal relations, and interactive query operations into a cohesive visual analytics workflow. The paper describes the application of the approach to real-world examples of population mobility data during the COVID-19 pandemic and the characteristics of episodes in a football match, illustrating its versatility and effectiveness in understanding composite patterns of interrelated attribute behaviors in MVTS data.
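Allen's interval algebra, used here to relate attribute patterns in time, distinguishes thirteen relations between two intervals. A small self-contained classifier over closed intervals is sketched below; the function is a hypothetical helper for illustration, not part of the described workflow.

```python
def allen_relation(a, b):
    """Classify the Allen interval relation between a = (a_start, a_end) and
    b = (b_start, b_end); inverse relations are reported from a's point of view."""
    (as_, ae), (bs, be) = a, b
    if ae < bs: return "before"
    if be < as_: return "after"
    if ae == bs: return "meets"
    if be == as_: return "met-by"
    if as_ == bs and ae == be: return "equals"
    if as_ == bs: return "starts" if ae < be else "started-by"
    if ae == be: return "finishes" if as_ > bs else "finished-by"
    if bs < as_ and ae < be: return "during"
    if as_ < bs and be < ae: return "contains"
    return "overlaps" if as_ < bs else "overlapped-by"

print(allen_relation((0, 5), (5, 9)))   # meets
print(allen_relation((2, 4), (1, 7)))   # during
print(allen_relation((0, 6), (3, 9)))   # overlaps
```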
Correlated multivariate time series prediction is an effective tool for discovering the change rules of temporal data, but it is challenging to find these rules. Recently, deep learning methods have made it possible to predict high-dimensional and complex multivariate time series data. However, these methods cannot capture or predict potential mutation signals of a time series, leading to a lag in the predicted trend and large errors. Moreover, it is difficult to capture dependencies in the data, especially when the data is sparse and the time intervals are large. In this paper, we propose a prediction approach that leverages both propagation dynamics and deep learning, called Rolling Iterative Prediction (RIP). In the RIP method, the Time-Delay Moving Average (TDMA) is used to carry out maximum likelihood reduction on the raw data, the propagation dynamics model is applied to obtain the potential propagation parameter data, and the dynamic properties of the correlated multivariate time series are clearly established. Long Short-Term Memory (LSTM) is applied to capture the time dependencies of the data, and a medium- and long-term rolling iterative prediction method is established by alternately estimating parameters and predicting the time series. Experiments are performed on data for Corona Virus Disease 2019 (COVID-19) in China, France, and South Korea. Experimental results show that the real distribution of the epidemic data is well restored and the prediction accuracy is better than that of baseline methods.
Multivariate time series with missing values are common in a wide range of applications, including energy data. Existing imputation methods often fail to focus on the temporal dynamics and the cross-dimensional correlation simultaneously. In this paper, we propose a two-step method based on an attention model to impute missing values in multivariate energy time series. First, the underlying distribution of the missing values in the data is learned. This information is then used to train an attention-based imputation model. By learning the distribution prior to the imputation process, the model can respond flexibly to the specific characteristics of the underlying data. The developed model is applied to European energy data obtained from the European Network of Transmission System Operators for Electricity. Using different evaluation metrics and benchmarks, the conducted experiments show that the proposed model is preferable to the benchmarks and is able to accurately impute missing values.
In the context of rapid digitization in industrial environments, how effective are advanced unsupervised learning models, particularly hybrid autoencoder models, at detecting anomalies in industrial control system (ICS) datasets? This question matters because it addresses the challenge of identifying rare and complex anomalous patterns in the vast amounts of time series data generated by Internet of Things (IoT) devices, which can significantly improve the reliability and safety of these systems. In this paper, we propose a hybrid autoencoder model, called ConvBiLSTM-AE, which combines a convolutional neural network (CNN) and bidirectional long short-term memory (BiLSTM) to learn complex temporal data patterns more effectively for anomaly detection. On the hardware-in-the-loop-based extended industrial control system dataset, the ConvBiLSTM-AE model demonstrated remarkable anomaly detection performance, achieving F1 scores of 0.78 and 0.41 for the first and second datasets, respectively. The results suggest that hybrid autoencoder models are not only viable but potentially superior alternatives for unsupervised anomaly detection in complex industrial systems, offering a promising approach to improving their reliability and safety.
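A compact PyTorch sketch of a Conv + BiLSTM autoencoder of the kind described: a 1-D convolution extracts local features, a bidirectional LSTM encodes them, and a decoder reconstructs the window so that reconstruction error can serve as the anomaly score. Layer sizes, the single-layer decoder and the training objective are assumptions; the paper's exact architecture may differ.

```python
import torch
import torch.nn as nn

class ConvBiLSTMAE(nn.Module):
    """Minimal sketch of a Conv + BiLSTM autoencoder for windows of shape
    (batch, time, features); layer sizes are illustrative, not the paper's."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.conv = nn.Conv1d(n_features, 32, kernel_size=3, padding=1)
        self.encoder = nn.LSTM(32, hidden, batch_first=True, bidirectional=True)
        self.decoder = nn.LSTM(2 * hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x):                       # x: (batch, time, features)
        h = torch.relu(self.conv(x.transpose(1, 2))).transpose(1, 2)
        z, _ = self.encoder(h)                  # (batch, time, 2 * hidden)
        d, _ = self.decoder(z)
        return self.out(d)                      # reconstruction of the input window

model = ConvBiLSTMAE(n_features=10)
x = torch.randn(4, 50, 10)                      # 4 windows of 50 timestamps, 10 sensors
loss = nn.functional.mse_loss(model(x), x)      # at test time, reconstruction error is the anomaly score
print(float(loss))
```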