Journal Articles
36 articles found
1. Deep Learning for Financial Time Series Prediction: A State-of-the-Art Review of Standalone and Hybrid Models
Authors: Weisi Chen, Walayat Hussain, Francesco Cauteruccio, Xu Zhang — Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 4, pp. 187-224.
Financial time series prediction, whether for classification or regression, has been a hot research topic over the last decade. While traditional machine learning algorithms have achieved mediocre results, deep learning has largely contributed to the elevation of prediction performance. An up-to-date review of advanced machine learning techniques for financial time series prediction is still lacking, making it challenging for finance domain experts and practitioners to determine which model is likely to perform better, what techniques and components are involved, and how the model can be designed and implemented. This review article provides an overview of techniques, components and frameworks for financial time series prediction, with an emphasis on state-of-the-art deep learning models in the literature from 2015 to 2023, including standalone models such as convolutional neural networks (CNN), which extract spatial dependencies within data, and long short-term memory (LSTM), which is designed for handling temporal dependencies, as well as hybrid models integrating CNN, LSTM, the attention mechanism (AM) and other techniques. For illustration and comparison, models proposed in recent studies are mapped to the relevant elements of a generalized framework comprising input, output, feature extraction, prediction, and related processes. Among the state-of-the-art models, hybrid models such as CNN-LSTM and CNN-LSTM-AM have generally been reported to outperform standalone models such as the CNN-only model. Remaining challenges are discussed, including non-friendliness for finance domain experts, delayed prediction, neglect of domain knowledge, lack of standards, and inability to deliver real-time and high-frequency predictions. The principal contributions of this paper are to provide a one-stop guide for both academia and industry to review, compare and summarize technologies and recent advances in this area, to facilitate smooth and informed implementation, and to highlight future research directions.
Keywords: financial time series prediction; convolutional neural network; long short-term memory; deep learning; attention mechanism; finance
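The generalized framework described in this review treats prediction as mapping a window of past observations to future values. A minimal sketch of that windowing step (the function name and toy series are illustrative, not from the paper):

```python
import numpy as np

def make_windows(series, lookback, horizon=1):
    """Frame a 1-D series as supervised pairs: X holds `lookback` past
    values, y the value `horizon` steps ahead."""
    X, y = [], []
    for t in range(len(series) - lookback - horizon + 1):
        X.append(series[t:t + lookback])
        y.append(series[t + lookback + horizon - 1])
    return np.array(X), np.array(y)

prices = np.arange(10.0)          # toy "price" series 0..9
X, y = make_windows(prices, lookback=3)
print(X.shape, y.shape)           # (7, 3) (7,)
print(y[0])                       # 3.0
```

Any of the models surveyed (CNN, LSTM, or hybrids) would then be fit on the resulting (X, y) pairs.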
2. Chaotic time series prediction using fuzzy sigmoid kernel-based support vector machines (Cited by 2)
Authors: LIU Han (刘涵), LIU Ding (刘丁), DENG Lingfeng (邓凌峰) — Chinese Physics B (SCIE, EI, CAS, CSCD), 2006, No. 6, pp. 1196-1200.
Support vector machines (SVM) have been widely used in chaotic time series prediction in recent years. To enhance the prediction efficiency of this method and implement it in hardware, the sigmoid kernel in the SVM is derived in a more natural way using the fuzzy logic method proposed in this paper. This method allows easy hardware implementation and offers straightforward interpretability. Experiments on two typical chaotic time series show that the average CPU time can be reduced significantly at the cost of a small decrease in prediction accuracy, which is favourable for hardware implementation of chaotic time series prediction.
Keywords: support vector machines; chaotic time series prediction; fuzzy sigmoid kernel
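For reference, the baseline sigmoid (tanh) kernel that the paper's fuzzy-logic method approximates can be computed as below; the parameter values and data here are illustrative only, not the paper's settings:

```python
import numpy as np

def sigmoid_kernel(X, Z, gamma=0.01, coef0=0.0):
    """Standard sigmoid (tanh) SVM kernel: k(x, z) = tanh(gamma * <x, z> + coef0).
    The paper replaces this with a fuzzy-logic approximation for hardware;
    this shows only the baseline kernel it starts from."""
    return np.tanh(gamma * X @ Z.T + coef0)

X = np.random.default_rng(0).normal(size=(5, 3))
K = sigmoid_kernel(X, X)
print(K.shape)               # (5, 5)
print(np.allclose(K, K.T))   # True — symmetric Gram matrix
```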
3. LS-SVR and AGO Based Time Series Prediction Method (Cited by 2)
Authors: ZHANG Shou-peng, LIU Shan, CHAI Wang-xu, ZHANG Jia-qi, GUO Yang-ming — International Journal of Plant Engineering and Management, 2016, No. 1, pp. 1-13.
Recently, fault and health condition prediction for complex systems has become an interesting research topic. However, it is difficult to establish a precise physical model for a complex system, and time series properties often need to be incorporated for prediction in practice. Currently, LS-SVR is widely adopted for prediction of systems with time series data. In this paper, to improve prediction accuracy, the accumulated generating operation (AGO) from grey system theory is carried out to improve the quality and regularity of the raw time series data; the inverse accumulated generating operation (IAGO) is then performed to obtain the prediction results. In addition, because an appropriate kernel function plays an important role in the accuracy of LS-SVR prediction, a modified Gaussian radial basis function (RBF) is proposed. It satisfies the requirements of distance-function-based kernels, ensuring fast damping near the test point and moderate damping at infinity. The presented model is applied to benchmark problems. As the results indicate, the proposed method is an effective prediction method with good precision.
Keywords: time series prediction; least squares support vector regression (LS-SVR); Gaussian radial basis function (RBF); accumulated generating operation (AGO)
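The AGO/IAGO pair from grey system theory is simply a running cumulative sum and its inverse first difference. A minimal sketch (function names ours, data invented):

```python
import numpy as np

def ago(x):
    """Accumulated generating operation (grey system theory):
    the running cumulative sum smooths a raw series into a more regular one."""
    return np.cumsum(x)

def iago(x1):
    """Inverse AGO: first differences recover the original series."""
    return np.diff(x1, prepend=0.0)

raw = np.array([2.0, 1.5, 3.0, 2.5])
smoothed = ago(raw)                      # [2.0, 3.5, 6.5, 9.0]
print(np.allclose(iago(smoothed), raw))  # True — IAGO inverts AGO
```

In the paper's pipeline, the LS-SVR model is trained and evaluated on the AGO-transformed series, and IAGO maps its forecasts back to the original scale.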
4. AFSTGCN: Prediction for multivariate time series using an adaptive fused spatial-temporal graph convolutional network
Authors: Yuteng Xiao, Kaijian Xia, Hongsheng Yin, Yu-Dong Zhang, Zhenjiang Qian, Zhaoyang Liu, Yuehan Liang, Xiaodan Li — Digital Communications and Networks (SCIE, CSCD), 2024, No. 2, pp. 292-303.
Prediction for multivariate time series (MTS) explores the interrelationships among variables at historical moments, extracts their relevant characteristics, and is widely used in finance, weather, complex industries and other fields; it is also important for constructing digital twin systems. However, existing methods do not take full advantage of the potential properties of the variables, which results in poor prediction accuracy. In this paper, we propose the Adaptive Fused Spatial-Temporal Graph Convolutional Network (AFSTGCN). First, to address the problem of the unknown spatial-temporal structure, we construct the Adaptive Fused Spatial-Temporal Graph (AFSTG) layer. Specifically, we fuse the spatial-temporal graph based on the interrelationships of the spatial graphs, and simultaneously construct the adaptive adjacency matrix of the spatial-temporal graph using node-embedding methods. Subsequently, to overcome the insufficient extraction of disordered correlation features, we construct the Adaptive Fused Spatial-Temporal Graph Convolutional (AFSTGC) module. The module forces the reordering of disordered temporal, spatial and spatial-temporal dependencies into rule-like data. AFSTGCN dynamically and synchronously acquires potential temporal, spatial and spatial-temporal correlations, thereby fully extracting rich hierarchical feature information to enhance prediction accuracy. Experiments on different types of MTS datasets demonstrate that the model achieves state-of-the-art single-step and multi-step performance compared with eight other deep learning models.
Keywords: adaptive adjacency matrix; digital twin; graph convolutional network; multivariate time series prediction; spatial-temporal graph
5. Prediction of Time Series Empowered with a Novel SREKRLS Algorithm (Cited by 3)
Authors: Bilal Shoaib, Yasir Javed, Muhammad Adnan Khan, Fahad Ahmad, Rizwan Majeed, Muhammad Saqib Nawaz, Muhammad Adeel Ashraf, Abid Iqbal, Muhammad Idrees — Computers, Materials & Continua (SCIE, EI), 2021, No. 5, pp. 1413-1427.
For the unforced dynamical non-linear state-space model, a new and efficient square root extended kernel recursive least squares (SREKRLS) estimation algorithm is developed in this article. The proposed algorithm lends itself to parallel implementation, as in FPGA systems. With the help of an orthonormal triangularization method that relies on numerically stable Givens rotations, the computational burden caused by matrix inversion is reduced. The matrix computations in this algorithm preserve excellent numerical properties such as symmetry, skew symmetry, and triangularity. The proposed method is validated on the prediction of stationary and non-stationary Mackey-Glass time series, and the x-component of the Lorenz time series is also predicted to illustrate its usefulness. Learning curves of the mean square error (MSE) demonstrate the prediction performance of the proposed algorithm, from which it is concluded that it performs better than the extended kernel recursive least squares (EKRLS) algorithm. This new SREKRLS-based design opens the way towards non-linear systolic arrays, which are efficient for developing very-large-scale integration (VLSI) applications with non-linear input data. Multiple experiments at different noise levels are carried out to validate the reliability, effectiveness, and applicability of the proposed algorithm in comparison with EKRLS.
Keywords: kernel methods; square root adaptive filtering; Givens rotation; Mackey-Glass time series prediction; recursive least squares; kernel recursive least squares; extended kernel recursive least squares; square root extended kernel recursive least squares
6. STUDY ON THE PREDICTION METHOD OF LOW-DIMENSION TIME SERIES THAT ARISE FROM THE INTRINSIC NONLINEAR DYNAMICS (Cited by 2)
Authors: MA Junhai (马军海), CHEN Yushu (陈予恕) — Applied Mathematics and Mechanics (English Edition) (SCIE, EI), 2001, No. 5, pp. 501-509.
Prediction methods for nonlinear dynamic systems determined from low-dimensional chaotic time series, and their applications, are the main subject of this paper. Building on earlier work, the chaotic time series is reconstructed in phase space using a nonlinear chaotic model. First, the model parameters are estimated using an improved least squares method; once the required precision is met, an optimization method is used to refine these parameters. Finally, the obtained chaotic model is used to predict future values of the chaotic time series in phase space. Representative experimental examples are analyzed to test the models and algorithms developed in this paper. The results show that with these algorithms the parameters of the corresponding chaotic model can be calculated easily and accurately. Prediction of chaotic series in phase space turns the traditional outer iteration into interpolation, and if the optimal model rank is chosen, the prediction precision increases notably. Claims of superior long-term predictability for nonlinear chaotic models are shown to be unfounded.
Keywords: nonlinear chaotic model; parameter identification; time series prediction
7. A framework based on sparse representation model for time series prediction in smart city (Cited by 1)
Authors: Zhiyong YU, Xiangping ZHENG, Fangwan HUANG, Wenzhong GUO, Lin SUN, Zhiwen YU — Frontiers of Computer Science (SCIE, EI, CSCD), 2021, No. 1, pp. 99-111.
The smart city, driven by Big Data and the Internet of Things (IoT), has become a most promising trend for the future. As one important function of the smart city, event alert based on time series prediction faces the challenge of extracting and representing discriminative features of sensing knowledge from the massive sequential data generated by IoT devices. In this paper, a framework based on a sparse representation model (SRM) for time series prediction is proposed as an efficient approach to tackle this challenge. After dividing the over-complete dictionary into upper and lower parts, the main idea of SRM is to first obtain the sparse representation of the time series based on the upper part, and then predict the future values based on the lower part. The choice of dictionary has a significant impact on the performance of SRM, so this paper focuses on dictionary construction strategies and summarizes eight variants of SRM. Experimental results demonstrate that SRM can handle different types of time series prediction flexibly and effectively.
Keywords: sparse representation; smart city; time series prediction; dictionary construction
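The upper/lower dictionary split described above can be sketched as follows. A ridge least-squares fit stands in here for a true sparse solver (e.g. OMP), and all sizes, names and the random dictionary are illustrative, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(1)

# Dictionary columns are full windows of length h + f drawn from training data
h, f, n_atoms = 8, 2, 40
D = rng.normal(size=(h + f, n_atoms))
D_upper, D_lower = D[:h], D[h:]          # split: history part / future part

def srm_predict(history, lam=0.1):
    """Code the observed history against the upper dictionary (ridge here as
    a simple stand-in for a sparse solver), then synthesize the future
    values from the lower dictionary with the same code."""
    a = np.linalg.solve(D_upper.T @ D_upper + lam * np.eye(n_atoms),
                        D_upper.T @ history)
    return D_lower @ a

y_hist = rng.normal(size=h)
y_future = srm_predict(y_hist)
print(y_future.shape)   # (2,)
```

The eight SRM variants in the paper differ mainly in how the dictionary D is built from training windows.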
8. Research on Prediction of Sentiment Trend of Food Safety Public Opinion
Authors: Chaofan Jiang, Hu Wang, Changbin Jiang, Di Li — Journal of Computer and Communications, 2023, No. 3, pp. 189-201.
Emotion plays a nearly decisive role in behavior: it directly affects netizens' views on food safety public opinion events and thereby the direction in which public opinion on an event develops, so predicting emotional trends is of great significance for guiding online public opinion on food safety. In this paper, the dynamic text representation method XLNet is used to generate context-dependent word vectors for the text of food safety online public opinion. The word vectors are then fed into a CNN-BiLSTM network for local semantic feature and context extraction, an attention mechanism assigns weights according to the importance of features, and sentiment tendency analysis is carried out. Based on the sentiment analysis, a time series of sentiment values is obtained, and a time series model is constructed to predict the sentiment trend. The proposed sentiment analysis model classifies the sentiment of food safety online public opinion well, and the time series model predicts the sentiment trend effectively.
Keywords: network public opinion; sentiment analysis; time series prediction; XLNet
9. STUDY ON PREDICTION METHODS FOR DYNAMIC SYSTEMS OF NONLINEAR CHAOTIC TIME SERIES
Authors: MA Junhai (马军海), CHEN Yushu (陈予恕), XIN Baogui (辛宝贵) — Applied Mathematics and Mechanics (English Edition) (SCIE, EI), 2004, No. 6, pp. 605-611.
Prediction methods for nonlinear dynamic systems determined by chaotic time series are studied, together with the structures of nonlinear self-related chaotic models and their dimensions. By combining neural networks and wavelet theory, the structure of wavelet transform neural networks is studied and a learning method for wavelet neural networks is given. Based on wavelet networks, a new parameter identification method is suggested that can selectively extract different scales of frequency and time from a time series, in order to predict either the tendencies or the details of the original series. Through pre-treatment and comparison of results before and after treatment, several useful conclusions are reached: applying wavelet networks to identify the parameters of self-related chaotic models guarantees highly accurate identification, and accordingly yields more valid prediction of chaotic time series containing noise.
Keywords: nonlinear self-related chaotic model; wavelet neural network; parameter identification; time series prediction
10. Performance Degradation Prediction of Proton Exchange Membrane Fuel Cell Based on CEEMDAN-KPCA and DA-GRU Networks
Authors: Tingwei Zhao, Juan Wang, Jiangxuan Che, Yingjie Bian, Tianyu Chen — Instrumentation, 2024, No. 1, pp. 51-61.
To improve the accuracy of performance degradation prediction for the proton exchange membrane fuel cell (PEMFC), a fusion prediction method (CKDG) based on complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN), kernel principal component analysis (KPCA) and a dual-attention gated recurrent unit neural network (DA-GRU) is proposed. CEEMDAN and KPCA are used to extract the input feature sequence, reduce the influence of random factors, and capture the essential feature components, reducing model complexity. The DA-GRU network helps to learn the feature mapping of long time series and to predict the trend of performance degradation more accurately. Actual aging experimental data verify the performance of the CKDG method. The results show that, when predicting from 20% training data under steady-state conditions, the CKDG method reduces the root mean square error (RMSE) by 52.7% and 34.6% compared with traditional LSTM and GRU neural networks, respectively. Compared with the plain DA-GRU network, RMSE is reduced by 15% and over-fitting is reduced, giving higher accuracy. The method also shows excellent prediction performance on a dynamic-condition data set and has good universality.
Keywords: proton exchange membrane fuel cell; dual-attention gated recurrent unit; data-driven model; time series prediction
11. New prediction of chaotic time series based on local Lyapunov exponent (Cited by 9)
Authors: ZHANG Yong (张勇) — Chinese Physics B (SCIE, EI, CAS, CSCD), 2013, No. 5, pp. 191-197.
A new method of predicting chaotic time series is presented based on the local Lyapunov exponent, which quantitatively measures the exponential rate of separation or attraction of two infinitely close trajectories in state space. After reconstructing the state space from a one-dimensional chaotic time series, multiple neighboring state vectors of the prediction point are selected to derive the prediction formula from the definition of the local Lyapunov exponent. Numerical simulations are carried out to test its effectiveness and verify its higher precision over two older methods. The effects of the number of reference state vectors and of added noise on forecasting accuracy are also studied numerically.
Keywords: chaotic time series; prediction of chaotic time series; local Lyapunov exponent; least squares method
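The state-space reconstruction step that precedes this kind of prediction is a standard delay embedding. A small sketch with illustrative parameter values (the embedding dimension and delay would in practice be chosen for the specific series):

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Reconstruct state vectors [x_t, x_{t+tau}, ..., x_{t+(dim-1)tau}]
    from a scalar series (Takens-style delay embedding)."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

x = np.arange(10.0)
V = delay_embed(x, dim=3, tau=2)
print(V.shape)   # (6, 3)
print(V[0])      # [0. 2. 4.]
```

Nearest neighbors of the current state vector in V would then drive the local prediction formula.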
12. Generalized unscented Kalman filtering based radial basis function neural network for the prediction of ground radioactivity time series with missing data (Cited by 2)
Authors: WU Xuedong (伍雪冬), WANG Yaonan (王耀南), LIU Weiting (刘维亭), ZHU Zhiyu (朱志宇) — Chinese Physics B (SCIE, EI, CAS, CSCD), 2011, No. 6, pp. 546-551.
On the assumption that random interruptions in the observation process are modeled by a sequence of independent Bernoulli random variables, we first generalize two kinds of nonlinear filtering methods with random interruption failures in the observation, based on the extended Kalman filter (EKF) and the unscented Kalman filter (UKF), abbreviated here as GEKF and GUKF, respectively. A nonlinear filtering model is then established by taking the radial basis function neural network (RBFNN) prototypes and network weights as the state equation, and the output of the RBFNN as the observation equation. Finally, we treat the filtering problem with missing observed data as a special case of nonlinear filtering with random intermittent failures, setting each missing datum to zero without pre-estimating it, and use the GEKF-based and GUKF-based RBFNNs to predict a ground radioactivity time series with missing data. Experimental results demonstrate that the predictions of the GUKF-based RBFNN accord well with the real ground radioactivity time series, while the predictions of the GEKF-based RBFNN diverge.
Keywords: prediction of time series with missing data; random interruption failures in the observation; neural network approximation
13. Multimodality Prediction of Chaotic Time Series with Sparse Hard-Cut EM Learning of the Gaussian Process Mixture Model (Cited by 1)
Authors: ZHOU Yatong (周亚同), FAN Yu (樊煜), CHEN Ziyi (陈子一), SUN Jiancheng (孙建成) — Chinese Physics Letters (SCIE, CAS, CSCD), 2017, No. 5, pp. 22-26.
The contribution of this work is twofold: (1) a multimodality prediction method for chaotic time series with the Gaussian process mixture (GPM) model is proposed, which employs a divide-and-conquer strategy. It automatically divides the chaotic time series into multiple modalities with different extrinsic patterns and intrinsic characteristics, and thus can fit the chaotic time series more precisely. (2) An effective sparse hard-cut expectation maximization (SHC-EM) learning algorithm for the GPM model is proposed to improve prediction performance. SHC-EM replaces a large learning sample set with fewer pseudo inputs, accelerating model learning based on these pseudo inputs. Experiments on Lorenz and Chua time series demonstrate that the proposed method yields not only accurate multimodality prediction but also a prediction confidence interval. SHC-EM outperforms traditional variational learning in terms of both prediction accuracy and speed, and in addition is more robust and less susceptible to noise than variational learning.
Keywords: Gaussian process mixture (GPM) model; multimodality prediction; chaotic time series; sparse hard-cut EM (SHC-EM) learning
14. Adaptive watermark generation mechanism based on time series prediction for stream processing
Authors: Yang SONG, Yunchun LI, Hailong YANG, Jun XU, Zerong LUAN, Wei LI — Frontiers of Computer Science (SCIE, EI, CSCD), 2021, No. 6, pp. 59-73.
A data stream processing framework processes stream data based on event time to ensure that requests can be responded to in real time. In reality, streaming data usually arrives out of order due to factors such as network delay. Stream processing frameworks commonly adopt the watermark mechanism to address this disorderedness: a watermark is a special kind of record inserted into the data stream with a timestamp, which helps the framework decide whether received data is late and should thus be discarded. Traditional watermark generation strategies are periodic and cannot dynamically adjust watermark distribution to balance responsiveness and accuracy. This paper proposes an adaptive watermark generation mechanism based on a time series prediction model to address this limitation. The mechanism dynamically adjusts the frequency and timing of watermark distribution using the disordered-data ratio and other lateness properties of the stream, improving system responsiveness while ensuring acceptable result accuracy. We implement the proposed mechanism on top of Flink and evaluate it with real-world datasets. The experimental results show that our mechanism is superior to existing watermark distribution strategies in terms of both system responsiveness and result accuracy.
Keywords: data stream processing; watermark; time-series-based prediction; dynamic adjustment
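The core idea of adapting the watermark's lateness bound to the observed disorder ratio can be illustrated with a toy Python class. The thresholds and update rule below are invented for illustration; they are neither Flink's policy nor the paper's prediction-based mechanism:

```python
# Toy event-time watermark that widens its lateness bound when the
# out-of-order ratio grows and shrinks it when the stream is tame.
class AdaptiveWatermark:
    def __init__(self, base_delay=5, window=100):
        self.delay = base_delay   # current allowed lateness
        self.max_ts = 0           # largest event timestamp seen so far
        self.events = []          # recent event timestamps
        self.window = window

    def observe(self, ts):
        self.events.append(ts)
        self.events = self.events[-self.window:]
        late = sum(1 for t in self.events if t < self.max_ts - self.delay)
        ratio = late / len(self.events)
        if ratio > 0.1:                       # disorder growing: be patient
            self.delay += 1
        elif ratio == 0 and self.delay > 1:   # stream orderly: be responsive
            self.delay -= 1
        self.max_ts = max(self.max_ts, ts)
        return self.max_ts - self.delay       # current watermark

wm = AdaptiveWatermark()
for ts in [1, 2, 3, 10, 4, 11, 12]:
    w = wm.observe(ts)
print(w)           # 8 — trails the newest event (12) by the adapted bound
```

The paper goes further by *predicting* the lateness properties with a time series model instead of merely reacting to them.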
15. The Prediction of Non-stationary Climate Series Based on Empirical Mode Decomposition (Cited by 10)
Authors: YANG Peicai (杨培才), WANG Geli (王革丽), BIAN Jianchun (卞建春), ZHOU Xiuji (周秀骥) — Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 2010, No. 4, pp. 845-854.
This paper proposes a new approach, which we refer to as "segregated prediction", for predicting nonstationary climate time series. The approach is based on the empirical mode decomposition (EMD) method, which can decompose a time signal into a finite and usually small number of basic oscillatory components. To test the capabilities of this approach, prediction experiments are carried out for several climate time series. The experimental results show that the approach can decompose the nonstationarity of the climate time series and segregate nonlinear interactions between the different mode components, thereby improving the prediction accuracy of the original climate time series.
Keywords: EMD; nonstationarity; nonlinear system; climate prediction; time series prediction
16. Continuous-Time Prediction of Industrial Paste Thickener System With Differential ODE-Net (Cited by 1)
Authors: Zhaolin Yuan, Xiaorui Li, Di Wu, Xiaojuan Ban, Nai-Qi Wu, Hong-Ning Dai, Hao Wang — IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2022, No. 4, pp. 686-698.
It is crucial to predict the outputs of a thickening system, including the underflow concentration (UC) and mud pressure, for optimal control of the process. The proliferation of industrial sensors and the availability of thickening-system data make this possible. However, the unique properties of thickening systems, such as nonlinearities, long time delays, partially observed data, and continuous time evolution, pose challenges for building data-driven predictive models. To address these challenges, we establish an integrated, deep-learning, continuous-time network structure that consists of a sequential encoder, a state decoder, and a derivative module to learn a deterministic state-space model of thickening systems. In a case study, we examine our methods on a tailing thickener manufactured by FLSmidth and instrumented with massive sensors, obtaining extensive experimental results. The results demonstrate that the proposed continuous-time model with the sequential encoder achieves better prediction performance than existing discrete-time models and reduces the negative effects of long time delays by extracting features from historical system trajectories. The proposed method also demonstrates outstanding performance for both short- and long-term prediction tasks with the two proposed derivative types.
Keywords: industrial paste thickener; ordinary differential equation (ODE)-net; recurrent neural network; time series prediction
17. A Hybrid Deep Learning Approach for PM2.5 Concentration Prediction in Smart Environmental Monitoring
Authors: Minh Thanh Vo, Anh H. Vo, Huong Bui, Tuong Le — Intelligent Automation & Soft Computing (SCIE), 2023, No. 6, pp. 3029-3041.
Air pollution is a serious environmental problem in developing countries, and particulate matter 2.5 (PM2.5) is a major air pollutant. When its concentration in the air is high, as in developing countries like Vietnam, it harms everyone's health, so accurate prediction of PM2.5 concentrations can support correct decisions for protecting citizens' health. This study develops a hybrid deep learning approach named the PM25-CBL model for PM2.5 concentration prediction in Ho Chi Minh City, Vietnam. First, the study analyzes the effects of variables on PM2.5 concentrations in the Air Quality HCMC dataset; only variables that affect the results are selected for PM2.5 concentration prediction. Second, an efficient PM25-CBL model that integrates a convolutional neural network (CNN) and Bidirectional Long Short-Term Memory (Bi-LSTM) is developed, consisting of three modules: CNN, Bi-LSTM, and fully connected modules. Finally, experiments compare the performance of this approach with several state-of-the-art deep learning models for time series prediction, namely LSTM, Bi-LSTM, a combination of CNN and LSTM (CNN-LSTM), and ARIMA. The empirical results confirm that the PM25-CBL model outperforms the other methods on the Air Quality HCMC dataset in terms of several metrics, including Mean Squared Error (MSE), Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Mean Absolute Percentage Error (MAPE).
Keywords: time series prediction; PM2.5 concentration prediction; CNN; Bi-LSTM network
18. Exploring reservoir computing: Implementation via double stochastic nanowire networks
Authors: TANG Jianfeng (唐健峰), XIA Lei (夏磊), LI Guangli (李广隶), FU Jun (付军), DUAN Shukai (段书凯), WANG Lidan (王丽丹) — Chinese Physics B (SCIE, EI, CAS, CSCD), 2024, No. 3, pp. 572-582.
Neuromorphic computing, inspired by the human brain, uses memristor devices for complex tasks. Recent studies show that self-organizing random nanowires can implement neuromorphic information processing, enabling data analysis. This paper presents a model based on these nanowire networks, with an improved conductance variation profile. We suggest using these networks for temporal information processing via a reservoir computing scheme and propose an efficient data encoding method using voltage pulses. The nanowire network layer generates dynamic behaviors in response to pulse voltages, allowing time series prediction analysis. Our experiments use a double stochastic nanowire network architecture for processing multiple input signals, outperforming traditional reservoir computing with fewer nodes, enriched dynamics and improved prediction accuracy. Experimental results confirm the high accuracy of this architecture on multiple real time series datasets, making neuromorphic nanowire networks promising for the physical implementation of reservoir computing.
Keywords: double-layer stochastic (DS) nanowire network architecture; neuromorphic computation; nanowire network; reservoir computing; time series prediction
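A software analogue of such a physical reservoir is the echo state network: a fixed random recurrent layer whose states feed a trained linear readout. A minimal numpy sketch of one-step-ahead prediction (sizes, scalings, and the sine task are arbitrary illustrative choices, not the paper's device parameters):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: only the linear readout is trained.
n_res, n_in = 50, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u, collect its states."""
    x = np.zeros(n_res)
    states = []
    for v in u:
        x = np.tanh(W_in @ np.atleast_1d(v) + W @ x)
        states.append(x.copy())
    return np.array(states)

# One-step-ahead prediction of a sine wave
t = np.linspace(0, 8 * np.pi, 400)
u, target = np.sin(t[:-1]), np.sin(t[1:])
S = run_reservoir(u)
W_out, *_ = np.linalg.lstsq(S[50:], target[50:], rcond=None)  # skip warm-up
pred = S[50:] @ W_out
mse = np.mean((pred - target[50:]) ** 2)
print(mse < 1e-2)   # True on this toy task
```

In the paper, the nanowire network plays the role of the random recurrent layer, with voltage pulses as the input encoding.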
TSE-Tran:Prediction method of telecommunication-network fraud crime based on time series representation and transformer
19
Authors: Tuo Shi, Jie Fu, Xiaofeng Hu 《Journal of Safety Science and Resilience》 EI CSCD, 2023, No. 4, pp. 340-347 (8 pages)
Telecom network fraud has become the most common and concerning type of crime and is an important public security incident that threatens urban resilience. Therefore, preventing a continuous rise in telecommunications and network fraud will help establish a resilient urban governance system. Undertaking the spatiotemporal forecasting of telecommunications-network fraud trends is of significant importance for aiding public security agencies in proactive crime prevention and implementing targeted anti-fraud campaigns. This study presents a telecommunication-network fraud crime prediction method called TSE-Tran, which integrates temporal representation and a Transformer architecture. The time-series data of telecommunication-network fraud occurrences are input into the TimesNet module, which maps the sequence data to a more precise feature representation tensor that accounts for both intra- and inter-cycle features. Subsequently, the data are fed into the Transformer-encoder module for further encoding, capturing long-range dependencies in the time-series data. Finally, occurrences of future telecommunication network frauds are predicted by a fully connected layer. The results of the study demonstrate that the proposed TSE-Tran method outperforms benchmark methods in terms of prediction accuracy. The results of this study are expected to aid in the prevention and control of telecommunications and network frauds, effectively strengthening the resilience of urban development and the ability to respond to public security incidents.
Keywords: urban resilience, TimesNet, Transformer, time series prediction, attention mechanism
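The representation → Transformer-encoder → fully-connected flow can be sketched as below. This is a minimal PyTorch stand-in: a plain linear embedding replaces the TimesNet representation step (which is not reproduced here), and all dimensions and layer counts are assumptions, not the authors' configuration.

```python
import torch
import torch.nn as nn

class CrimeSeriesTransformer(nn.Module):
    """Sketch of a TSE-Tran-style pipeline: a learned feature projection
    (standing in for TimesNet), a Transformer encoder for long-range
    dependencies, and a fully connected head for the next-step forecast.
    All dimensions are illustrative assumptions."""
    def __init__(self, d_model=32, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)       # scalar counts -> features
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=64,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)        # fully connected forecast

    def forward(self, x):                        # x: (batch, time, 1)
        h = self.encoder(self.embed(x))          # self-attention over time
        return self.head(h[:, -1, :])            # forecast from last position

model = CrimeSeriesTransformer()
out = model(torch.randn(2, 30, 1))   # 2 series of 30 fraud-count steps
print(out.shape)                     # torch.Size([2, 1])
```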
Rolling Iterative Prediction for Correlated Multivariate Time Series
20
Authors: Peng Liu, Qiong Han, Xiao Yang 《国际计算机前沿大会会议论文集》 EI, 2023, No. 1, pp. 433-452 (20 pages)
Correlated multivariate time series prediction is an effective tool for discovering the changing patterns of temporal data, but it is challenging to find these patterns. Recently, deep learning methods have made it possible to predict high-dimensional and complex multivariate time series data. However, these methods cannot capture or predict potential mutation signals of the time series, leading to a lag in predicted trends and large errors. Moreover, it is difficult to capture dependencies in the data, especially when the data are sparse and the time intervals are large. In this paper, we propose a prediction approach that leverages both propagation dynamics and deep learning, called Rolling Iterative Prediction (RIP). In the RIP method, the Time-Delay Moving Average (TDMA) is used to carry out maximum likelihood reduction on the raw data, the propagation dynamics model is applied to obtain the potential propagation parameters, and the dynamic properties of the correlated multivariate time series are clearly established. Long Short-Term Memory (LSTM) is applied to capture the time dependencies of the data, and the medium- and long-term Rolling Iterative Prediction method is established by alternately estimating parameters and predicting the time series. Experiments are performed on Corona Virus Disease 2019 (COVID-19) data from China, France, and South Korea. Experimental results show that the real distribution of the epidemic data is well restored, and the prediction accuracy is better than that of baseline methods.
Keywords: time series prediction, correlated multivariate time series, trend prediction of infectious disease, rolling circulation
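The core rolling-iteration loop (re-estimate the model on all data seen so far, predict one step, feed the prediction back in, repeat) can be sketched generically. This NumPy sketch plugs in a simple least-squares AR(2) model as a hypothetical stand-in for the paper's propagation dynamics + LSTM components; the function and model names are illustrative, not from the paper.

```python
import numpy as np

def rolling_iterative_forecast(series, fit, horizon):
    """Generic rolling-iteration loop in the spirit of the RIP scheme:
    refit on all observed + predicted data, predict one step ahead,
    append the prediction, and repeat for `horizon` steps."""
    history = list(series)
    preds = []
    for _ in range(horizon):
        predict_next = fit(np.array(history))   # re-estimate parameters
        y_hat = predict_next()                  # one-step prediction
        preds.append(y_hat)
        history.append(y_hat)                   # feed prediction back in
    return preds

def ar2_fit(data):
    # AR(2): y_t ≈ a*y_{t-1} + b*y_{t-2}, coefficients by least squares
    X = np.column_stack([data[1:-1], data[:-2]])
    y = data[2:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return lambda: float(coef @ [data[-1], data[-2]])

series = np.arange(1.0, 11.0)        # 1, 2, ..., 10: a linear trend
print(rolling_iterative_forecast(series, ar2_fit, 3))
# continues the trend: approximately [11.0, 12.0, 13.0]
```

In the paper's setting, `fit` would alternate between estimating the propagation-dynamics parameters and training the LSTM, rather than solving a least-squares problem.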
原文传递