Financial time series prediction, whether for classification or regression, has been a heated research topic over the last decade. While traditional machine learning algorithms have achieved only mediocre results, deep learning has contributed greatly to raising prediction performance. An up-to-date review of advanced machine learning techniques for financial time series prediction is still lacking, making it challenging for finance domain experts and practitioners to determine which models are likely to perform better, what techniques and components are involved, and how a model can be designed and implemented. This review article provides an overview of techniques, components and frameworks for financial time series prediction, with an emphasis on state-of-the-art deep learning models in the literature from 2015 to 2023, including stand-alone models such as convolutional neural networks (CNN), which can extract spatial dependencies within data, and long short-term memory (LSTM), which is designed to handle temporal dependencies, as well as hybrid models integrating CNN, LSTM, the attention mechanism (AM) and other techniques. For illustration and comparison, models proposed in recent studies are mapped to the relevant elements of a generalized framework comprising input, output, feature extraction, prediction, and related processes. Among the state-of-the-art models, hybrid models such as CNN-LSTM and CNN-LSTM-AM have generally been reported to outperform stand-alone models such as the CNN-only model. Remaining challenges are discussed, including unfriendliness to finance domain experts, delayed prediction, neglect of domain knowledge, lack of standards, and the inability to make real-time and high-frequency predictions. The principal contributions of this paper are to provide a one-stop guide for both academia and industry to review, compare and summarize technologies and recent advances in this area, to facilitate smooth and informed implementation, and to highlight future research directions.
Historically, landslides have been the primary type of geological disaster worldwide. The stability of reservoir banks is primarily affected by rainfall and reservoir water level fluctuations, and it changes with the long-term dynamics of these external disaster-causing factors. Assessing the time-varying reliability of reservoir landslides therefore remains a challenge. In this paper, a machine learning (ML) based approach is proposed to analyze the long-term reliability of reservoir bank landslides in spatially variable soils through time series prediction. This study systematically investigated the prediction performance of three ML algorithms: the multilayer perceptron (MLP), convolutional neural network (CNN), and long short-term memory (LSTM) network. The effects of data quantity and data ratio on the predictive power of the deep learning models are also considered. The results show that all three ML models can accurately depict the changes in the time-varying failure probability of reservoir landslides. The CNN model outperforms both the MLP and LSTM models in predicting the failure probability. Furthermore, selecting the right data ratio can improve the prediction accuracy of the failure probability obtained by the ML models.
Time series prediction has been successfully used in several application areas, such as meteorological forecasting, market prediction, and network traffic forecasting, and a number of techniques have been developed for modeling and predicting time series. In the traditional exponential smoothing method, a fixed weight is assigned to the data history, and trend changes in the time series are ignored. In this paper, an uncertainty reasoning method based on the cloud model is employed in time series prediction; it uses a cloud logic controller to dynamically adjust the smoothing coefficient of simple exponential smoothing to fit the current trend of the series. The validity of this solution was demonstrated by experiments on various data sets.
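The adaptive-coefficient idea described above can be sketched in a few lines. The paper's cloud logic controller is not reproduced here; the adaptation rule below (nudging the smoothing coefficient by a fixed step whenever one-step errors grow) is only a hypothetical stand-in for it:

```python
def adaptive_ses(series, alpha0=0.5, step=0.1):
    """Simple exponential smoothing with a trend-adaptive coefficient.

    A hypothetical rule stands in for the cloud logic controller: alpha is
    raised when the one-step absolute error grows (the series is trending)
    and lowered when it shrinks, clamped to [0.1, 0.9].
    """
    alpha = alpha0
    s = series[0]                 # initial smoothed value
    preds = [s]
    prev_err = 0.0
    for x in series[1:]:
        err = abs(x - s)
        if err > prev_err:
            alpha = min(0.9, alpha + step)
        else:
            alpha = max(0.1, alpha - step)
        s = alpha * x + (1 - alpha) * s   # standard SES update
        preds.append(s)
        prev_err = err
    return preds
```

On a steadily trending input the coefficient ramps up, so the smoothed value tracks the trend more closely than fixed-weight smoothing would.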
Support vector machines (SVM) have been widely used in chaotic time series prediction in recent years. To improve the prediction efficiency of this method and implement it in hardware, the sigmoid kernel in the SVM is derived in a more natural way using the fuzzy logic method proposed in this paper. The method provides easy hardware implementation and straightforward interpretability. Experiments on two typical chaotic time series show that the average CPU time can be reduced significantly at the cost of a small decrease in prediction accuracy, which is favourable for hardware implementation of chaotic time series prediction.
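The paper's fuzzy-logic construction of the kernel is not reproduced here, but the flavour of a hardware-friendly sigmoid kernel can be illustrated with a piecewise-linear approximation of tanh (a common trick in fixed-point hardware; the specific form below is an assumption, not the paper's method):

```python
def tanh_pwl(x):
    """Crude piecewise-linear stand-in for tanh: clip to [-1, 1]."""
    if x > 1.0:
        return 1.0
    if x < -1.0:
        return -1.0
    return x

def sigmoid_kernel_approx(u, v, gamma=1.0, c=0.0):
    """Sigmoid (hyperbolic tangent) kernel k(u, v) = tanh(gamma*<u, v> + c),
    with the tanh replaced by the piecewise-linear approximation above."""
    s = gamma * sum(a * b for a, b in zip(u, v)) + c
    return tanh_pwl(s)
```

Replacing the transcendental function with comparisons and a pass-through is what makes such kernels cheap to realize in hardware, at the cost of some accuracy near the saturation region.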
Nonlinear time series prediction is studied using an improved least squares support vector machine (LS-SVM) regression, with a chaotic mutation evolutionary programming (CMEP) approach for parameter optimization. We analyze how the prediction error varies with different parameters (σ, γ) of the LS-SVM. To select appropriate parameters for the prediction model, we employ the CMEP algorithm. Finally, Nasdaq stock data are predicted using this CMEP-based LS-SVM regression, and satisfactory results are obtained.
Recently, fault or health condition prediction for complex systems has become an interesting research topic. However, it is difficult to establish precise physical models for complex systems, and time series properties often need to be incorporated in practice. Currently, LS-SVR is widely adopted for predicting systems with time series data. In this paper, to improve prediction accuracy, the accumulated generating operation (AGO) from grey system theory is first applied to improve the quality and regularity of the raw time series data; the inverse accumulated generating operation (IAGO) is then performed to obtain the prediction results. In addition, because an appropriate kernel function plays an important role in the accuracy of LS-SVR prediction, a modified Gaussian radial basis function (RBF) is proposed. It satisfies the requirements of distance-function-based kernels, ensuring fast damping in the vicinity of the test point and moderate damping at infinity. The presented model is applied to benchmark problems, and the results indicate that the proposed method is an effective prediction method with good precision.
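The AGO/IAGO pre- and post-processing pair from grey system theory is simple: the 1-AGO is a running cumulative sum, and the inverse operation is first differencing, so the two compose to the identity. A minimal sketch:

```python
def ago(xs):
    """Accumulated generating operation (1-AGO): running cumulative sum.
    Smooths a noisy series into a monotone one before model fitting."""
    out, total = [], 0.0
    for x in xs:
        total += x
        out.append(total)
    return out

def iago(ys):
    """Inverse AGO: first differences recover the original series
    (the first element is kept as-is)."""
    return [ys[0]] + [ys[i] - ys[i - 1] for i in range(1, len(ys))]
```

In the workflow described above, the regressor (LS-SVR in the paper) would be trained and evaluated on `ago(xs)`, and its forecasts passed through `iago` to return to the original scale.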
The theory of nu-support vector regression (Nu-SVR) is employed to model time series variation for prediction. To avoid the prediction performance degradation caused by improper parameters, a parallel multidimensional step search (PMSS) method is proposed for selecting the best parameters when training the support vector machine to obtain a prediction model. A series of tests evaluating the modeling mechanism indicate that Nu-SVR models can reflect the variation tendency of a time series with low prediction error on both familiar and unfamiliar data. Statistical analysis is also employed to verify the optimization performance of the PMSS algorithm; comparative results indicate that the training error attains its minimum over the interval around the planar data point corresponding to the selected parameters. Moreover, parallelization remarkably speeds up the optimization procedure.
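The core of a multidimensional step search can be sketched as a greedy coordinate walk with step refinement. This is a serial simplification of the idea (the PMSS of the abstract evaluates candidates in parallel, and its exact schedule is not reproduced); the loss function here is an arbitrary surrogate for the cross-validation error of a Nu-SVR model:

```python
def step_search(loss, start, steps, iters=50):
    """Greedy multidimensional step search (serial sketch).

    Repeatedly tries +/- one step along each parameter dimension, keeps any
    improvement, and halves the step sizes when no move helps. `loss` maps a
    parameter vector to a scalar, e.g. a validation error.
    """
    best, best_loss = list(start), loss(start)
    for _ in range(iters):
        improved = False
        for d in range(len(best)):
            for delta in (-steps[d], steps[d]):
                cand = list(best)
                cand[d] += delta
                cand_loss = loss(cand)
                if cand_loss < best_loss:
                    best, best_loss, improved = cand, cand_loss, True
        if not improved:                       # refine the grid and retry
            steps = [s / 2.0 for s in steps]
    return best, best_loss
```

A parallel variant would evaluate all 2*d candidate moves concurrently, which is where the reported speed-up comes from.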
In order to study the dynamics of surface movements over coal mines due to mining activities, a dynamic prediction model of surface movements was established based on support vector machine (SVM) theory and time-series analysis, and an engineering application was used to verify the correctness of the model. Measurements from observation stations were analyzed and processed to obtain equal-time-interval surface movement data, which were subjected to tests for stationarity, zero mean and normality. The data were then used to train the SVM model. A time series model was established to predict mining subsidence through rational choices of the embedding dimension and SVM parameters. MAPE and WIA were used as indicators to evaluate the accuracy and generalization performance of the model. Finally, the model was used to predict future surface movements, with data from observation stations in the Huaibei coal mining area as an example. The results show that the maximum absolute error of subsidence is 9 mm, the maximum relative error 1.5%, the maximum absolute error of displacement 7 mm, and the maximum relative error 1.8%. The accuracy and reliability of the model meet the requirements of on-site engineering. The results provide a new approach to investigating the dynamics of surface movements.
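The "embedding dimension" choice mentioned above amounts to turning the series into supervised (lag-vector, next-value) pairs before SVM training. A minimal sketch of that windowing step (the SVM fitting itself is omitted):

```python
def make_windows(series, dim):
    """Build a supervised dataset from a series: each input is the `dim`
    most recent observations (the embedding dimension) and the target is
    the value that follows."""
    X, y = [], []
    for i in range(len(series) - dim):
        X.append(series[i:i + dim])
        y.append(series[i + dim])
    return X, y
```

The resulting `X`/`y` pairs can be fed to any regressor; choosing `dim` too small loses dynamics, while choosing it too large wastes data and adds noise, which is why the abstract stresses a rational choice.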
Prediction for multivariate time series (MTS) explores the interrelationships among variables at historical moments and extracts their relevant characteristics; it is widely used in finance, weather, complex industries and other fields, and is important for constructing digital twin systems. However, existing methods do not take full advantage of the potential properties of the variables, which results in poor prediction accuracy. In this paper, we propose the Adaptive Fused Spatial-Temporal Graph Convolutional Network (AFSTGCN). First, to address the problem of the unknown spatial-temporal structure, we construct the Adaptive Fused Spatial-Temporal Graph (AFSTG) layer: we fuse the spatial-temporal graph based on the interrelationships of the spatial graphs, and simultaneously construct the adaptive adjacency matrix of the spatial-temporal graph using node embedding methods. Subsequently, to overcome the insufficient extraction of disordered correlation features, we construct the Adaptive Fused Spatial-Temporal Graph Convolutional (AFSTGC) module, which forces the reordering of disordered temporal, spatial and spatial-temporal dependencies into rule-like data. AFSTGCN dynamically and synchronously acquires potential temporal, spatial and spatial-temporal correlations, thereby fully extracting rich hierarchical feature information to enhance prediction accuracy. Experiments on different types of MTS datasets demonstrate that the model achieves state-of-the-art single-step and multi-step performance compared with eight other deep learning models.
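The node-embedding trick for learning an adjacency matrix when the graph structure is unknown is commonly written as a row-softmax over ReLU-rectified embedding similarities. The sketch below shows that generic form; AFSTGCN's exact parameterization may differ, and the embeddings here are plain arrays rather than learned parameters:

```python
import numpy as np

def adaptive_adjacency(E1, E2):
    """Generic adaptive adjacency from node embeddings:
    A = row_softmax(ReLU(E1 @ E2.T)).
    E1, E2: (num_nodes, embed_dim) arrays (learnable in a real model)."""
    scores = np.maximum(E1 @ E2.T, 0.0)                # ReLU-rectified similarity
    e = np.exp(scores - scores.max(axis=1, keepdims=True))  # stable softmax
    return e / e.sum(axis=1, keepdims=True)            # each row sums to 1
```

Because each row is a probability distribution over neighbors, the matrix can be dropped directly into a graph convolution as a learned, dense substitute for a fixed adjacency.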
Deficiencies in applying the traditional least squares support vector machine (LS-SVM) to time series online prediction were identified. Based on a property of the kernel function matrix and the recursive calculation of block matrices, a new time series online prediction algorithm using an improved LS-SVM was proposed, which fully reuses historical training results and raises the computing speed of the LS-SVM. The improved algorithm was then applied to time series online prediction: using operational data provided by the Northwest Power Grid of China, the method was used for transient stability prediction of the electric power system. The results show that, compared with the calculation time of the traditional LS-SVM (75-1600 ms), that of the proposed method in different time windows is 40-60 ms, and the prediction accuracy (normalized root mean squared error) of the proposed method is above 0.8. The improved method is therefore better than the traditional LS-SVM and more suitable for time series online prediction.
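The speed-up from "recursive calculation of block matrices" comes from updating the inverse of a kernel matrix incrementally when a sample arrives, instead of re-inverting from scratch. A sketch of that standard block-inversion update (the paper's full LS-SVM recursion also handles the bias term and regularization, which are omitted here):

```python
import numpy as np

def grow_inverse(K_inv, k_new, k_self):
    """Update the inverse of a symmetric kernel matrix when one sample is
    appended, via the block-matrix inversion identity.

    K_inv : (n, n) inverse of the current kernel matrix
    k_new : (n,) kernel values between the new sample and the old ones
    k_self: scalar kernel value of the new sample with itself
    Returns the (n+1, n+1) inverse, in O(n^2) instead of O(n^3).
    """
    s = k_self - k_new @ K_inv @ k_new          # Schur complement (scalar)
    u = K_inv @ k_new
    top_left = K_inv + np.outer(u, u) / s
    return np.block([[top_left,            -u[:, None] / s],
                     [-u[None, :] / s,     np.array([[1.0 / s]])]])
```

Reusing `K_inv` this way is exactly the "historical training results were fully utilized" point in the abstract: each online step costs a matrix-vector product rather than a full factorization.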
For the unforced dynamical non-linear state-space model, a new and efficient square root extended kernel recursive least squares (SREKRLS) estimation algorithm is developed in this article. The proposed algorithm lends itself to parallel implementation, as in FPGA systems. With the help of an orthonormal triangularization method relying on numerically stable Givens rotations, the computational burden caused by matrix inversion is reduced, and matrix computations with many excellent numerical properties, such as singularity, symmetry, skew symmetry, and triangularity, are achieved. The proposed method is validated on the prediction of stationary and non-stationary Mackey-Glass time series; the x-component of the Lorenz time series is also predicted to illustrate its usefulness. Learning curves of the mean square error (MSE) demonstrate that the proposed algorithm outperforms the extended kernel recursive least-squares (EKRLS) algorithm, and multiple experiments at different noise levels validate its reliability, effectiveness, and applicability. This new SREKRLS-based design opens an innovative path towards non-linear systolic arrays, which are efficient for developing very-large-scale integration (VLSI) applications with non-linear input data.
The prediction methods for nonlinear dynamic systems determined from low-dimensional chaotic time series, and their applications, are mainly discussed. Building on the work of previous researchers, the chaotic time series were reconstructed in phase space by adopting one kind of nonlinear chaotic model. First, the model parameters were estimated using an improved least squares method; once the precision was satisfactory, an optimization method was used to refine these parameters. Finally, using the obtained chaotic model, future data of the chaotic time series in phase space were predicted. Some representative experimental examples were analyzed to test the models and algorithms developed in this paper. The results show that, when the algorithms developed here are adopted, the parameters of the corresponding chaotic model can be calculated easily and accurately. Prediction of chaotic series in phase space turns the traditional methods from outer iteration into interpolation, and if the optimal model rank is chosen, the prediction precision increases notably. Claims of superior long-term predictability for nonlinear chaotic models are shown to be unfounded.
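The "model parameters estimated by least squares" step can be illustrated on a toy chaotic system. The quadratic map below is a hypothetical stand-in for the paper's nonlinear chaotic model: data generated by the logistic map x[n+1] = r*x[n]*(1 - x[n]) fit the form x[n+1] = a*x[n] + b*x[n]^2 exactly with a = r, b = -r, so least squares recovers the parameters:

```python
import numpy as np

def fit_quadratic_map(xs):
    """Least-squares fit of x[n+1] = a*x[n] + b*x[n]^2 to an observed orbit.
    Returns the coefficient vector (a, b)."""
    x = np.asarray(xs[:-1])
    y = np.asarray(xs[1:])
    A = np.stack([x, x ** 2], axis=1)          # design matrix, one row per step
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef
```

On noise-free data the residual is zero and the recovery is exact up to floating-point error; with noisy observations this is where the abstract's follow-up optimization step would refine the estimate.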
Smart cities driven by Big Data and the Internet of Things (IoT) have become one of the most promising trends of the future. As an important function of smart cities, event alerting based on time series prediction faces the challenge of how to extract and represent discriminative features of sensing knowledge from the massive sequential data generated by IoT devices. In this paper, a framework based on a sparse representation model (SRM) for time series prediction is proposed as an efficient approach to this challenge. After dividing the over-complete dictionary into upper and lower parts, the main idea of the SRM is to obtain the sparse representation of the time series from the upper part first, and then predict future values from the lower part. The choice of dictionary has a significant impact on the performance of the SRM, so this paper focuses on dictionary construction strategies and summarizes eight variants of the SRM. Experimental results demonstrate that the SRM can handle different types of time series prediction flexibly and effectively.
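The upper/lower dictionary split can be sketched directly: code the observed window against the upper rows of the dictionary, then decode future values with the lower rows. A plain least-squares fit stands in for the sparse solver the paper would use (e.g. orthogonal matching pursuit), so this is a structural sketch rather than the SRM itself:

```python
import numpy as np

def srm_predict(D_up, D_low, observed):
    """Structural sketch of the SRM idea.

    D_up    : (window_len, n_atoms) upper dictionary (observed portion)
    D_low   : (horizon, n_atoms) lower dictionary (future portion)
    observed: (window_len,) most recent observations
    A dense least-squares code stands in for the sparse representation.
    """
    code, *_ = np.linalg.lstsq(D_up, observed, rcond=None)
    return D_low @ code          # decode the future values
```

When each dictionary atom is a full training window (past stacked over future), any observed window expressible in terms of the atoms' pasts yields a prediction built from the same atoms' futures, which is the mechanism the abstract describes.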
Emotion plays a nearly decisive role in behavior: it directly affects netizens' views on food safety public opinion events and thereby the direction in which public opinion develops, so predicting emotional trends is of great significance for guiding food safety network public opinion. In this paper, the dynamic text representation method XLNet is used to generate context-dependent word vectors representing the text of food safety network public opinion. The word vectors are then input into a CNN-BiLSTM network to extract local semantic features and context semantics. An attention mechanism is introduced to assign different weights according to feature importance, and sentiment tendency analysis is carried out. Based on the sentiment analysis, sentiment value time series data are obtained, and a time series model is constructed to predict sentiment trends. The sentiment analysis model proposed in this paper classifies the sentiment of food safety network public opinion well, and the time series model predicts the sentiment trend effectively.
In order to improve the accuracy of performance degradation prediction for proton exchange membrane fuel cells (PEMFC), a fusion prediction method (CKDG) was proposed based on complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN), kernel principal component analysis (KPCA) and a dual-attention gated recurrent unit neural network (DA-GRU). CEEMDAN and KPCA were used to extract the input feature sequence, reduce the influence of random factors, and capture the essential feature components, thereby reducing model complexity. The DA-GRU network helps to learn the feature mapping of data over long time series and predict the trend of performance degradation data more accurately. Actual aging experimental data verify the performance of the CKDG method. The results show that, when predicting from 20% of the training data under steady-state conditions, the CKDG method reduces the root mean square error (RMSE) by 52.7% and 34.6% compared with traditional LSTM and GRU neural networks, respectively. Compared with the plain DA-GRU network, RMSE is reduced by 15% and over-fitting is reduced, giving higher accuracy. The method also shows excellent prediction performance on the dynamic-condition data set and has good universality.
Mill vibration is a common problem in rolling production; it directly affects the thickness accuracy of the strip and may even lead to strip fracture accidents in serious cases. Existing vibration prediction models do not consider the features contained in the data, which limits improvements in model accuracy. To address these challenges, this paper proposes a multi-dimensional multi-modal cold rolling vibration time series prediction model (MDMMVPM) based on the deep fusion of multi-level networks. The model considers the long-term and short-term modal features of multi-dimensional data, and appropriate prediction algorithms are selected for different data features. Based on the established prediction model, the effects of tension and rolling force on mill vibration are analyzed. Taking the 5th stand of a cold mill in a steel mill as the research object, the model is applied to predict mill vibration for the first time. The experimental results show that the correlation coefficient (R^2) of the proposed model is 92.5% and the root-mean-square error (RMSE) is 0.0011, which significantly improves modeling accuracy compared with existing models. The proposed model is also suitable for the hot rolling process, providing a new method for predicting strip rolling vibration.
The growing global requirement for food and the need for sustainable farming in an era of changing climate and scarce resources have inspired substantial crop yield prediction research. Deep learning (DL) and machine learning (ML) models deal effectively with such challenges. This research paper comprehensively analyses advancements in crop yield prediction from January 2016 to March 2024, and analyses the effectiveness of the various input parameters considered in crop yield prediction models. We conducted an in-depth search and gathered studies that employed crop modeling and AI-based methods to predict crop yield; in total, 125 articles on crop yield prediction using ML, meta-modeling (crop models coupled with ML/DL) and DL-based prediction models and input parameter selection were reviewed. We set up five objectives for this research and discuss them after analyzing the selected papers. Each study is assessed based on the crop type, the input parameters employed for prediction, the modeling techniques adopted, and the evaluation metrics used to estimate model performance. We also discuss the ethical and social impacts of AI on agriculture. Although the approaches presented in the scientific literature have delivered impressive predictions, they are complicated by the intricate, multifactorial influences on crop growth and the need for accurate data-driven models. Thorough research is therefore required to deal with the challenges of predicting agricultural output.
The prediction methods for nonlinear dynamic systems determined by chaotic time series are studied, together with the structures and dimensions of nonlinear self-related chaotic models. By combining neural networks with wavelet theory, the structure of wavelet transform neural networks is examined and a learning method for wavelet neural networks is given. Based on wavelet networks, a new parameter identification method is suggested, which can selectively extract different scales of frequency and time in a time series in order to predict either the tendencies or the details of the original series. Through pre-treatment and comparison of results before and after treatment, several useful conclusions are reached: wavelet networks can identify the parameters of self-related chaotic models with high accuracy, and correspondingly valid prediction of chaotic time series containing noise can be achieved.
A new method of predicting chaotic time series is presented based on the local Lyapunov exponent, which quantitatively measures the exponential rate of separation or attraction of two infinitesimally close trajectories in state space. After reconstructing the state space from the one-dimensional chaotic time series, multiple state vectors neighboring the prediction point are selected, and the prediction formula is deduced from the definition of the local Lyapunov exponent. Numerical simulations are carried out to test its effectiveness and verify its higher precision compared with two older methods. The effects of the number of reference state vectors and of added noise on forecasting accuracy are also studied numerically.
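The state-space reconstruction step that precedes the neighbor search is the standard delay-coordinate embedding: each state vector stacks `dim` samples spaced `tau` steps apart. A minimal sketch (the Lyapunov-based prediction formula itself is not reproduced):

```python
def reconstruct(series, dim, tau):
    """Delay-coordinate phase-space reconstruction.

    Each state vector is (x[i], x[i+tau], ..., x[i+(dim-1)*tau]);
    dim is the embedding dimension and tau the delay.
    """
    n = len(series) - (dim - 1) * tau          # number of complete vectors
    return [tuple(series[i + j * tau] for j in range(dim)) for i in range(n)]
```

Neighboring state vectors of the prediction point are then found among these tuples, typically by Euclidean distance, before applying the local-exponent prediction rule.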
Data stream processing frameworks process stream data based on event time to ensure that requests can be responded to in real time. In reality, streaming data usually arrives out of order due to factors such as network delay, and frameworks commonly adopt the watermark mechanism to address this disorder. A watermark is a special kind of record inserted into the data stream with a timestamp, which helps the framework decide whether received data is late and should be discarded. Traditional watermark generation strategies are periodic; they cannot dynamically adjust watermark distribution to balance responsiveness and accuracy. This paper proposes an adaptive watermark generation mechanism based on a time series prediction model to address this limitation. The mechanism dynamically adjusts the frequency and timing of watermark distribution using the disordered-data ratio and other lateness properties of the stream, improving system responsiveness while ensuring acceptable result accuracy. We implement the proposed mechanism on top of Flink and evaluate it with real-world datasets. The experiment results show that our mechanism is superior to existing watermark distribution strategies in terms of both system responsiveness and result accuracy.
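The basic watermark rule being adapted here can be sketched in a few lines: an event whose timestamp falls behind (maximum event time seen) minus an allowed-lateness bound is treated as late and dropped. The paper's adaptive mechanism tunes that bound and the emission timing from observed disorder; in this sketch the bound is a fixed parameter:

```python
def run_watermarks(events, allowed_lateness):
    """Minimal watermark sketch over a stream of (timestamp, payload) pairs.

    watermark = max event time seen - allowed_lateness; events behind the
    watermark are dropped. A fixed bound stands in for the paper's
    prediction-driven adaptive bound.
    """
    max_ts = float("-inf")
    kept, dropped = [], []
    for ts, payload in events:
        if ts < max_ts - allowed_lateness:   # behind the current watermark
            dropped.append(payload)
        else:
            kept.append(payload)
            max_ts = max(max_ts, ts)
    return kept, dropped
```

A small bound makes the system responsive but drops more disordered events; a large bound keeps more events at the cost of latency, which is exactly the trade-off the adaptive mechanism targets.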
Funding (financial time series prediction review): Natural Science Foundation of Fujian Province, China (Grant No. 2022J05291); Xiamen Scientific Research Funding for Overseas Chinese Scholars.
Funding (reservoir landslide reliability study): National Natural Science Foundation of China (Grant No. 52308340); Innovative Projects of Universities in Guangdong (Grant No. 2022KTSCX208); Sichuan Transportation Science and Technology Project (Grant No. 2018-ZL-01).
Abstract: Support vector machines (SVM) have been widely used in chaotic time series prediction in recent years. In order to enhance the prediction efficiency of this method and implement it in hardware, the sigmoid kernel in SVM is derived in a more natural way by using the fuzzy logic method proposed in this paper. This method provides easy hardware implementation and straightforward interpretability. Experiments on two typical chaotic time series predictions have been carried out, and the obtained results show that the average CPU time can be reduced significantly at the cost of a small decrease in prediction accuracy, which is favourable for the hardware implementation of chaotic time series prediction.
Funding: The project was supported by the National Natural Science Foundation of China under Grant No. 90203008 and the Doctoral Foundation of the Ministry of Education of China.
Abstract: Nonlinear time series prediction is studied by using an improved least squares support vector machine (LS-SVM) regression based on a chaotic mutation evolutionary programming (CMEP) approach for parameter optimization. We analyze how the prediction error varies with different parameters (σ, γ) in LS-SVM. In order to select appropriate parameters for the prediction model, we employ the CMEP algorithm. Finally, Nasdaq stock data are predicted by using this LS-SVM regression based on CMEP, and satisfactory results are obtained.
基金supported by National Natural Science Foundation(NNSF)of China under Grant No.61371024Aviation Science Fund of China under Grant No.2013ZD53051+1 种基金Aerospace Technology Support Fund of Chinathe Industry-Academy-Research Project of AVIC(cxy2013XGD14)
Abstract: Recently, fault or health condition prediction of complex systems has become an interesting research topic. However, it is difficult to establish a precise physical model for complex systems, and the time series properties often need to be incorporated for prediction in practice. Currently, LS-SVR is widely adopted for the prediction of systems with time series data. In this paper, in order to improve the prediction accuracy, accumulated generating operation (AGO) is carried out to improve the data quality and regularity of the raw time series data based on grey system theory; then, the inverse accumulated generating operation (IAGO) is performed to obtain the prediction results. In addition, because an appropriate kernel function plays an important role in improving the accuracy of prediction through LS-SVR, a modified Gaussian radial basis function (RBF) is proposed. The requirements of distance-function-based kernel functions are satisfied, which ensures fast damping adjacent to the test point and moderate damping at infinity. The presented model is applied to the analysis of benchmarks. As indicated by the results, the proposed method is an effective prediction method with good precision.
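The AGO/IAGO pre- and post-processing pair from grey system theory is a simple cumulative-sum transform and its inverse, sketched below (function names and toy data are my own; the paper applies these around an LS-SVR predictor, which is omitted here).

```python
def ago(xs):
    """1-AGO (first-order accumulated generating operation): cumulative
    sums smooth a noisy series into a more regular, monotone one."""
    out, s = [], 0.0
    for x in xs:
        s += x
        out.append(s)
    return out

def iago(ys):
    """Inverse 1-AGO: first differences recover the original series,
    so predictions made in AGO space can be mapped back."""
    return [ys[0]] + [ys[i] - ys[i - 1] for i in range(1, len(ys))]

raw = [3.0, 1.0, 4.0, 1.0, 5.0]
assert iago(ago(raw)) == raw       # exact round trip
print(ago(raw))                    # [3.0, 4.0, 8.0, 9.0, 14.0]
```

In the paper's pipeline, the regressor is trained and queried on `ago(raw)`, and its forecast is passed through `iago` to obtain the prediction in the original units.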
基金Supported by the National Natural Science Foundation of China (No. 60873235&60473099)the Science-Technology Development Key Project of Jilin Province of China (No. 20080318)the Program of New Century Excellent Talents in University of China (No. NCET-06-0300).
Abstract: The theory of nu-support vector regression (Nu-SVR) is employed in modeling time series variation for prediction. In order to avoid prediction performance degradation caused by improper parameters, the method of parallel multidimensional step search (PMSS) is proposed for users to select the best parameters in training the support vector machine to get a prediction model. A series of tests are performed to evaluate the modeling mechanism, and the prediction results indicate that Nu-SVR models can reflect the variation tendency of time series with low prediction error on both familiar and unfamiliar data. Statistical analysis is also employed to verify the optimization performance of the PMSS algorithm, and comparative results indicate that the training error takes its minimum over the interval around the planar data point corresponding to the selected parameters. Moreover, the introduction of parallelization can remarkably speed up the optimizing procedure.
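The core of a multidimensional step search, stripped of the parallelism that PMSS adds, is an exhaustive scan over a lattice of candidate parameters. The sketch below is a sequential simplification of my own (the toy quadratic loss and grid values are assumptions, not the paper's), meant only to show the structure being parallelized.

```python
from itertools import product

def grid_search(loss, grids):
    """Multidimensional step search (sequential sketch of PMSS):
    evaluate the loss on every lattice point of the candidate
    parameter grids and keep the best point."""
    best, best_loss = None, float("inf")
    for point in product(*grids):
        l = loss(point)
        if l < best_loss:
            best, best_loss = point, l
    return best, best_loss

# toy validation loss with a known minimum at (nu, gamma) = (0.5, 1.0)
loss = lambda p: (p[0] - 0.5) ** 2 + (p[1] - 1.0) ** 2
grids = [[0.1, 0.3, 0.5, 0.7], [0.5, 1.0, 2.0]]
print(grid_search(loss, grids))  # ((0.5, 1.0), 0.0)
```

Because the lattice points are independent, the outer loop parallelizes trivially, which is where PMSS gets its reported speedup.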
基金supported by the Research and Innovation Program for College and University Graduate Students in Jiangsu Province (No.CX10B-141Z)the National Natural Science Foundation of China (No. 41071273)
Abstract: In order to study the dynamic laws of surface movements over coal mines due to mining activities, a dynamic prediction model of surface movements was established, based on the theory of support vector machines (SVM) and time-series analysis. An engineering application was used to verify the correctness of the model. Measurements from observation stations were analyzed and processed to obtain equal-time-interval surface movement data and subjected to tests of stationarity, zero means and normality. Then the data were used to train the SVM model. A time series model was established to predict mining subsidence through rational choices of embedding dimensions and SVM parameters. MAPE and WIA were used as indicators to evaluate the accuracy and generalization performance of the model. In the end, the model was used to predict future surface movements. Data from observation stations in the Huaibei coal mining area were used as an example. The results show that the maximum absolute error of subsidence is 9 mm, the maximum relative error 1.5%, the maximum absolute error of displacement 7 mm and the maximum relative error 1.8%. The accuracy and reliability of the model meet the requirements of on-site engineering. The results of the study provide a new approach to investigate the dynamics of surface movements.
基金supported by the China Scholarship Council and the CERNET Innovation Project under grant No.20170111.
Abstract: Prediction for Multivariate Time Series (MTS) explores the interrelationships among variables at historical moments, extracts their relevant characteristics, and is widely used in finance, weather, complex industries and other fields. Furthermore, it is important for constructing a digital twin system. However, existing methods do not take full advantage of the potential properties of variables, which results in poor prediction accuracy. In this paper, we propose the Adaptive Fused Spatial-Temporal Graph Convolutional Network (AFSTGCN). First, to address the problem of the unknown spatial-temporal structure, we construct the Adaptive Fused Spatial-Temporal Graph (AFSTG) layer. Specifically, we fuse the spatial-temporal graph based on the interrelationship of spatial graphs. Simultaneously, we construct the adaptive adjacency matrix of the spatial-temporal graph using node embedding methods. Subsequently, to overcome the insufficient extraction of disordered correlation features, we construct the Adaptive Fused Spatial-Temporal Graph Convolutional (AFSTGC) module. The module forces the reordering of disordered temporal, spatial and spatial-temporal dependencies into rule-like data. AFSTGCN dynamically and synchronously acquires potential temporal, spatial and spatial-temporal correlations, thereby fully extracting rich hierarchical feature information to enhance prediction accuracy. Experiments on different types of MTS datasets demonstrate that the model achieves state-of-the-art single-step and multi-step performance compared with eight other deep learning models.
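One common way to build an adaptive adjacency matrix from node embeddings, used in several spatial-temporal GCNs, is a ReLU-then-softmax over embedding similarities. Whether AFSTGCN uses exactly this form is not stated in the abstract; the sketch below is a generic illustration under that assumption, with all shapes and names my own.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes, dim = 4, 3
E1 = rng.normal(size=(n_nodes, dim))   # learnable source node embeddings
E2 = rng.normal(size=(n_nodes, dim))   # learnable target node embeddings

# ReLU keeps only positive affinities; row-softmax normalizes each
# node's outgoing weights so the matrix acts like a learned adjacency.
scores = np.maximum(E1 @ E2.T, 0.0)
A = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
print(A.shape, bool(np.allclose(A.sum(axis=1), 1.0)))  # (4, 4) True
```

During training the embeddings `E1`, `E2` are updated by gradient descent, so the graph structure itself is learned rather than supplied.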
Funding: Project (SGKJ[200301-16]) supported by the State Grid Corporation of China.
Abstract: Deficiencies in applying the traditional least squares support vector machine (LS-SVM) to time series online prediction were specified. According to the properties of the kernel function matrix and using the recursive calculation of block matrices, a new time series online prediction algorithm based on an improved LS-SVM was proposed. The historical training results were fully utilized and the computing speed of LS-SVM was enhanced. Then, the improved algorithm was applied to time series online prediction. Based on the operational data provided by the Northwest Power Grid of China, the method was used in the transient stability prediction of the electric power system. The results show that, compared with the calculation time of the traditional LS-SVM (75 1 600 ms), that of the proposed method in different time windows is 40-60 ms, and the prediction accuracy (normalized root mean squared error) of the proposed method is above 0.8. So the improved method is better than the traditional LS-SVM and more suitable for time series online prediction.
Funding: Funded by Prince Sultan University, Riyadh, Saudi Arabia.
Abstract: For the unforced dynamical nonlinear state-space model, a new and efficient square root extended kernel recursive least squares (SREKRLS) estimation algorithm is developed in this article. The proposed algorithm lends itself to parallel implementation, as in FPGA systems. With the help of an orthonormal triangularization method, which relies on numerically stable Givens rotations, the computational burden caused by matrix inversion is reduced. With this algorithm, matrix computations preserve many excellent numerical properties, such as singularity, symmetry, skew symmetry, and triangularity. The proposed method is validated on the prediction of stationary and non-stationary Mackey-Glass time series; in addition, the x-component of the Lorenz time series is predicted to illustrate its usefulness. Learning curves of the mean square error (MSE) demonstrate the prediction performance of the proposed algorithm, from which it is concluded that the proposed algorithm performs better than EKRLS. This new SREKRLS-based design offers an innovative path toward nonlinear systolic arrays, which are efficient for developing very-large-scale integration (VLSI) applications with nonlinear input data. Multiple experiments with different noise levels are carried out to validate the reliability, effectiveness, and applicability of the proposed algorithm compared to the extended kernel recursive least squares (EKRLS) algorithm.
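The Givens rotation at the heart of such square-root (QR-style) triangularizations is a small, self-contained computation. The sketch below shows the standard rotation only, not the full SREKRLS recursion; the function name and test values are my own.

```python
import math

def givens(a, b):
    """Plane rotation (c, s) with c*a + s*b = r and -s*a + c*b = 0:
    the numerically stable building block used to zero one matrix
    entry at a time during triangularization."""
    r = math.hypot(a, b)       # avoids overflow/underflow of sqrt(a*a + b*b)
    if r == 0.0:
        return 1.0, 0.0
    return a / r, b / r

c, s = givens(3.0, 4.0)
# the rotation maps (3, 4) onto the first axis with norm preserved
print(round(c * 3.0 + s * 4.0, 9), round(abs(-s * 3.0 + c * 4.0), 9))  # 5.0 0.0
```

Because each rotation touches only two rows, a sequence of them maps naturally onto the systolic-array/FPGA parallelism the abstract mentions.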
Abstract: The prediction methods and applications of nonlinear dynamic systems determined from low-dimensional chaotic time series are mainly discussed. Based on the work of foreign researchers, the chaotic time series were reconstructed in phase space by adopting one kind of nonlinear chaotic model. At first, the model parameters were estimated by using the improved least squares method. Then, once the precision was satisfied, the optimization method was used to estimate these parameters. Finally, by using the obtained chaotic model, the future data of the chaotic time series in phase space were predicted. Some representative experimental examples were analyzed to test the models and the algorithms developed in this paper. The results show that if the algorithms developed here are adopted, the parameters of the corresponding chaotic model can be calculated easily and accurately. Prediction of chaotic series in phase space changes the traditional methods from outer iteration to interpolation. And if the optimal model rank is chosen, the prediction precision increases notably. Claims of superior long-term predictability for nonlinear chaotic models are shown to be unfounded.
Funding: This work was supported by the National Natural Science Foundation of China (Grant Nos. 61772136, 61672159), the Technology Innovation Platform Project of Fujian Province (2014H2005), the Research Project for Young and Middle-aged Teachers of Fujian Province (JT 180045), the Fujian Collaborative Innovation Center for Big Data Application in Governments, and the Fujian Engineering Research Center of Big Data Analysis and Processing.
Abstract: The smart city, driven by Big Data and the Internet of Things (IoT), has become a most promising trend for the future. As one important function of the smart city, event alerting based on time series prediction is faced with the challenge of how to extract and represent discriminative features of sensing knowledge from the massive sequential data generated by IoT devices. In this paper, a framework based on the sparse representation model (SRM) for time series prediction is proposed as an efficient approach to tackle this challenge. After dividing the over-complete dictionary into upper and lower parts, the main idea of SRM is to first obtain the sparse representation of a time series based on the upper part, and then realize the prediction of future values based on the lower part. The choice of dictionary has a significant impact on the performance of SRM. This paper focuses on the study of dictionary construction strategies and summarizes eight variants of SRM. Experimental results demonstrate that SRM can deal with different types of time series prediction flexibly and effectively.
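The split-dictionary idea can be sketched numerically: code the observed history against the upper rows of the dictionary, then apply the same coefficients to the lower rows to read off the predicted future. The sketch below uses a dense least-squares solve in place of a proper sparse solver (e.g. OMP), and all dimensions and values are my own toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# toy dictionary: each column is one training window of length 6;
# the top 4 rows ("upper part") hold the history segment and the
# bottom 2 rows ("lower part") hold the future segment.
D = rng.normal(size=(6, 4))
D_up, D_low = D[:4], D[4:]

true_code = np.array([0.0, 2.0, 0.0, -1.0])  # sparse combination of atoms
history = D_up @ true_code                   # the observed history window

# least-squares coding stands in for a sparse solver on the upper part
code, *_ = np.linalg.lstsq(D_up, history, rcond=None)
future = D_low @ code                        # prediction via the lower part
print(bool(np.allclose(future, D_low @ true_code)))  # True
```

In a real SRM the dictionary is over-complete (more columns than rows), which is exactly why a sparsity-inducing solver rather than plain least squares is needed for the coding step.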
Abstract: Emotion plays a nearly decisive role in behavior, directly affecting netizens' views on food safety public opinion events and thereby the direction in which public opinion on an event develops, so predicting emotional trends is of great significance for guiding food safety network public opinion. In this paper, the dynamic text representation method XLNet is used to generate word vectors with context-dependent dependencies to represent the text of food safety network public opinion. Then, the word vectors are input into a CNN-BiLSTM network for local semantic feature and context semantic extraction. An attention mechanism is introduced to assign different weights according to the importance of features, and sentiment tendency analysis is carried out. Based on the sentiment analysis, sentiment value time series data are obtained, and a time series model is constructed to predict sentiment trends. The sentiment analysis model proposed in this paper classifies the sentiment of food safety network public opinion well, and the time series model performs well in predicting the sentiment trend of food safety network public opinion.
Funding: Funded by the Shaanxi Province Key Industrial Chain Project (2023-ZDLGY-24), the Industrialization Project of Shaanxi Provincial Education Department (21JC018), the Shaanxi Province Key Research and Development Program (2021ZDLGY13-02), and the Open Foundation of the State Key Laboratory for Advanced Metals and Materials (2022-Z01).
Abstract: In order to improve the performance degradation prediction accuracy of the proton exchange membrane fuel cell (PEMFC), a fusion prediction method (CKDG) based on complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN), kernel principal component analysis (KPCA) and a dual-attention gated recurrent unit neural network (DA-GRU) was proposed. CEEMDAN and KPCA were used to extract the input feature data sequence, reduce the influence of random factors, and capture essential feature components to reduce the model complexity. The DA-GRU network helps to learn the feature mapping relationship of data in long time series and predict the changing trend of performance degradation data more accurately. Actual aging experimental data verify the performance of the CKDG method. The results show that, when predicting from 20% training data under the steady-state condition, the CKDG method can reduce the root mean square error (RMSE) by 52.7% and 34.6%, respectively, compared with the traditional LSTM and GRU neural networks. Compared with the simple DA-GRU network, RMSE is reduced by 15% and the degree of over-fitting is reduced, giving higher accuracy. The method also shows excellent prediction performance on the dynamic condition data set and has good universality.
Funding: Project (2023JH26-10100002) supported by the Liaoning Science and Technology Major Project, China; Projects (U21A20117, 52074085) supported by the National Natural Science Foundation of China; Project (2022JH2/101300008) supported by the Liaoning Applied Basic Research Program Project, China; Project (22567612H) supported by the Hebei Provincial Key Laboratory Performance Subsidy Project, China.
Abstract: Mill vibration is a common problem in rolling production, which directly affects the thickness accuracy of the strip and may even lead to strip fracture accidents in serious cases. The existing vibration prediction models do not consider the features contained in the data, resulting in limited improvement of model accuracy. To address these challenges, this paper proposes a multi-dimensional multi-modal cold rolling vibration time series prediction model (MDMMVPM) based on the deep fusion of multi-level networks. In the model, the long-term and short-term modal features of the multi-dimensional data are considered, and appropriate prediction algorithms are selected for different data features. Based on the established prediction model, the effects of tension and rolling force on mill vibration are analyzed. Taking the 5th stand of a cold mill in a steel mill as the research object, the innovative model is applied to predict the mill vibration for the first time. The experimental results show that the correlation coefficient (R²) of the model proposed in this paper is 92.5%, and the root-mean-square error (RMSE) is 0.0011, which significantly improves the modeling accuracy compared with the existing models. The proposed model is also suitable for the hot rolling process, providing a new method for the prediction of strip rolling vibration.
Abstract: The growing global requirement for food and the need for sustainable farming in an era of a changing climate and scarce resources have inspired substantial crop yield prediction research. Deep learning (DL) and machine learning (ML) models effectively deal with such challenges. This research paper comprehensively analyses recent advancements in crop yield prediction from January 2016 to March 2024. In addition, it analyses the effectiveness of various input parameters considered in crop yield prediction models. We conducted an in-depth search and gathered studies that employed crop modeling and AI-based methods to predict crop yield. The total number of articles reviewed for crop yield prediction using ML, meta-modeling (crop models coupled with ML/DL), and DL-based prediction models and input parameter selection is 125. We conduct the research by setting up five objectives for this research and discussing them after analyzing the selected research papers. Each study is assessed based on the crop type, the input parameters employed for prediction, the modeling techniques adopted, and the evaluation metrics used for estimating model performance. We also discuss the ethical and social impacts of AI on agriculture. Although various approaches presented in the scientific literature have delivered impressive predictions, they are complicated due to intricate, multifactorial influences on crop growth and the need for accurate data-driven models. Therefore, thorough research is required to deal with challenges in predicting agricultural output.
Abstract: The prediction methods for nonlinear dynamic systems determined by chaotic time series are mainly studied, as well as the structures of nonlinear self-related chaotic models and their dimensions. By combining neural networks and wavelet theory, the structures of wavelet transform neural networks were studied and a wavelet neural network learning method was given. Based on wavelet networks, a new method for parameter identification was suggested, which can be used selectively to extract different scales of frequency and time in time series in order to realize prediction of tendencies or details of the original time series. Through pre-treatment and comparison of results before and after the treatment, several useful conclusions are reached: high-accuracy identification can be guaranteed by applying wavelet networks to identify the parameters of self-related chaotic models, and more valid prediction of chaotic time series including noise can be achieved accordingly.
Funding: Project supported by the National Natural Science Foundation of China (Grant No. 61201452).
Abstract: A new method of predicting chaotic time series is presented based on a local Lyapunov exponent, by quantitatively measuring the exponential rate of separation or attraction of two infinitely close trajectories in state space. After reconstructing the state space from a one-dimensional chaotic time series, neighboring multiple-state vectors of the predicting point are selected to deduce the prediction formula by using the definition of the local Lyapunov exponent. Numerical simulations are carried out to test its effectiveness and verify its higher precision over two older methods. The effects of the number of referential state vectors and of added noise on forecasting accuracy are also studied numerically.
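The state-space reconstruction step that this method (and several of the chaotic-series methods above) starts from is the standard time-delay embedding. A minimal sketch follows; the embedding dimension and delay are illustrative choices of my own, not values from the paper.

```python
def delay_embed(series, dim, tau):
    """Takens-style time-delay embedding: build state vectors
    x_i = (s_i, s_{i+tau}, ..., s_{i+(dim-1)*tau}) from a scalar series,
    so neighbors in this space can drive local prediction rules."""
    n = len(series) - (dim - 1) * tau
    return [[series[i + j * tau] for j in range(dim)] for i in range(n)]

s = list(range(10))                 # stand-in for a measured scalar series
vecs = delay_embed(s, dim=3, tau=2)
print(vecs[0], vecs[-1])            # [0, 2, 4] [5, 7, 9]
```

Once embedded, the "neighboring multiple-state vectors of the predicting point" are simply the nearest neighbors of the last vector in `vecs`, from which the local-Lyapunov prediction formula is derived.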
Funding: This work was supported by the National Key Research and Development Program of China (2020YFB1506703) and the National Natural Science Foundation of China (Grant No. 62072018).
Abstract: The data stream processing framework processes stream data based on event time to ensure that requests can be responded to in real time. In reality, streaming data usually arrive out of order due to factors such as network delay. The data stream processing framework commonly adopts the watermark mechanism to address data disorderedness. A watermark is a special kind of data inserted into the data stream with a timestamp, which helps the framework decide whether received data are late and thus should be discarded. Traditional watermark generation strategies are periodic; they cannot dynamically adjust the watermark distribution to balance responsiveness and accuracy. This paper proposes an adaptive watermark generation mechanism based on a time series prediction model to address the above limitation. This mechanism dynamically adjusts the frequency and timing of watermark distribution using the disordered data ratio and other lateness properties of the data stream, improving system responsiveness while ensuring acceptable result accuracy. We implement the proposed mechanism on top of Flink and evaluate it with real-world datasets. The experimental results show that our mechanism is superior to the existing watermark distribution strategies in terms of both system responsiveness and result accuracy.
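The basic watermark contract described above can be sketched in a few lines. The `adapt_delay` heuristic below is a deliberately simple stand-in of my own for the paper's time-series-prediction model (it merely widens the allowed lateness when observed delays grow); it is not Flink's API and not the proposed mechanism itself.

```python
def make_watermark(max_delay):
    """Bounded-out-of-orderness watermark: highest event time seen so
    far minus the allowed lateness. A watermark never moves backwards,
    so a late event cannot re-open already-finalized windows."""
    high = float("-inf")
    def on_event(ts):
        nonlocal high
        high = max(high, ts)
        return high - max_delay
    return on_event

def adapt_delay(observed_delays, base=2, cap=10):
    """Toy adaptive rule: track the worst observed delay, floored at a
    base lateness and capped so results stay timely."""
    if not observed_delays:
        return base
    return min(cap, max(base, max(observed_delays)))

wm = make_watermark(adapt_delay([1, 3, 5]))       # allowed lateness = 5
marks = [wm(100), wm(104), wm(103)]
print(marks)  # [95, 99, 99] -- the out-of-order event at t=103 does not lower it
```

An event with timestamp below the current watermark is treated as late; choosing the lateness bound is exactly the responsiveness-versus-accuracy trade-off the paper's adaptive mechanism targets.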