There are two technical challenges in predicting slope deformation.The first one is the random displacement,which could not be decomposed and predicted by numerically resolving the observed accumulated displacement an...There are two technical challenges in predicting slope deformation.The first one is the random displacement,which could not be decomposed and predicted by numerically resolving the observed accumulated displacement and time series of a landslide.The second one is the dynamic evolution of a landslide,which could not be feasibly simulated simply by traditional prediction models.In this paper,a dynamic model of displacement prediction is introduced for composite landslides based on a combination of empirical mode decomposition with soft screening stop criteria(SSSC-EMD)and deep bidirectional long short-term memory(DBi-LSTM)neural network.In the proposed model,the time series analysis and SSSC-EMD are used to decompose the observed accumulated displacements of a slope into three components,viz.trend displacement,periodic displacement,and random displacement.Then,by analyzing the evolution pattern of a landslide and its key factors triggering landslides,appropriate influencing factors are selected for each displacement component,and DBi-LSTM neural network to carry out multi-datadriven dynamic prediction for each displacement component.An accumulated displacement prediction has been obtained by a summation of each component.For accuracy verification and engineering practicability of the model,field observations from two known landslides in China,the Xintan landslide and the Bazimen landslide were collected for comparison and evaluation.The case study verified that the model proposed in this paper can better characterize the"stepwise"deformation characteristics of a slope.As compared with long short-term memory(LSTM)neural network,support vector machine(SVM),and autoregressive integrated moving average(ARIMA)model,DBi-LSTM neural network has higher accuracy in predicting the periodic displacement of slope deformation,with the mean absolute percentage error reduced by 3.063%,14.913%,and 13.960%respectively,and the root mean square error reduced by 1.951 mm,8.954 mm and 7.790 mm respectively.Conclusively,this model not only has high prediction accuracy but also is more stable,which can provide new insight for practical landslide prevention and control engineering.展开更多
In this paper, a filtering method is presented to estimate time-varying parameters of a missile dual control system with tail fins and reaction jets as control variables. In this method, the long-short-term memory(LST...In this paper, a filtering method is presented to estimate time-varying parameters of a missile dual control system with tail fins and reaction jets as control variables. In this method, the long-short-term memory(LSTM) neural network is nested into the extended Kalman filter(EKF) to modify the Kalman gain such that the filtering performance is improved in the presence of large model uncertainties. To avoid the unstable network output caused by the abrupt changes of system states,an adaptive correction factor is introduced to correct the network output online. In the process of training the network, a multi-gradient descent learning mode is proposed to better fit the internal state of the system, and a rolling training is used to implement an online prediction logic. Based on the Lyapunov second method, we discuss the stability of the system, the result shows that when the training error of neural network is sufficiently small, the system is asymptotically stable. With its application to the estimation of time-varying parameters of a missile dual control system, the LSTM-EKF shows better filtering performance than the EKF and adaptive EKF(AEKF) when there exist large uncertainties in the system model.展开更多
An accurate landslide displacement prediction is an important part of landslide warning system. Aiming at the dynamic characteristics of landslide evolution and the shortcomings of traditional static prediction models...An accurate landslide displacement prediction is an important part of landslide warning system. Aiming at the dynamic characteristics of landslide evolution and the shortcomings of traditional static prediction models, this paper proposes a dynamic prediction model of landslide displacement based on singular spectrum analysis(SSA) and stack long short-term memory(SLSTM) network. The SSA is used to decompose the landslide accumulated displacement time series data into trend term and periodic term displacement subsequences. A cubic polynomial function is used to predict the trend term displacement subsequence, and the SLSTM neural network is used to predict the periodic term displacement subsequence. At the same time, the Bayesian optimization algorithm is used to determine that the SLSTM network input sequence length is 12 and the number of hidden layer nodes is 18. The SLSTM network is updated by adding predicted values to the training set to achieve dynamic displacement prediction. Finally, the accumulated landslide displacement is obtained by superimposing the predicted value of each displacement subsequence. The proposed model was verified on the Xintan landslide in Hubei Province, China. The results show that when predicting the displacement of the periodic term, the SLSTM network has higher prediction accuracy than the support vector machine(SVM) and auto regressive integrated moving average(ARIMA). The mean relative error(MRE) is reduced by 4.099% and 3.548% respectively, while the root mean square error(RMSE) is reduced by 5.830 mm and 3.854 mm respectively. It is concluded that the SLSTM network model can better simulate the dynamic characteristics of landslides.展开更多
In dense pedestrian tracking,frequent object occlusions and close distances between objects cause difficulty when accurately estimating object trajectories.In this study,a conditional random field tracking model is es...In dense pedestrian tracking,frequent object occlusions and close distances between objects cause difficulty when accurately estimating object trajectories.In this study,a conditional random field tracking model is established by using a visual long short term memory network in the three-dimensional(3D)space and the motion estimations jointly performed on object trajectory segments.Object visual field information is added to the long short term memory network to improve the accuracy of the motion related object pair selection and motion estimation.To address the uncertainty of the length and interval of trajectory segments,a multimode long short term memory network is proposed for the object motion estimation.The tracking performance is evaluated using the PETS2009 dataset.The experimental results show that the proposed method achieves better performance than the tracking methods based on the independent motion estimation.展开更多
Speaker separation in complex acoustic environment is one of challenging tasks in speech separation.In practice,speakers are very often unmoving or moving slowly in normal communication.In this case,the spatial featur...Speaker separation in complex acoustic environment is one of challenging tasks in speech separation.In practice,speakers are very often unmoving or moving slowly in normal communication.In this case,the spatial features among the consecutive speech frames become highly correlated such that it is helpful for speaker separation by providing additional spatial information.To fully exploit this information,we design a separation system on Recurrent Neural Network(RNN)with long short-term memory(LSTM)which effectively learns the temporal dynamics of spatial features.In detail,a LSTM-based speaker separation algorithm is proposed to extract the spatial features in each time-frequency(TF)unit and form the corresponding feature vector.Then,we treat speaker separation as a supervised learning problem,where a modified ideal ratio mask(IRM)is defined as the training function during LSTM learning.Simulations show that the proposed system achieves attractive separation performance in noisy and reverberant environments.Specifically,during the untrained acoustic test with limited priors,e.g.,unmatched signal to noise ratio(SNR)and reverberation,the proposed LSTM based algorithm can still outperforms the existing DNN based method in the measures of PESQ and STOI.It indicates our method is more robust in untrained conditions.展开更多
The unloading relaxation caused by excavation for construction of high arch dams is an important factor influencing the foundation’s integrity and strength.To evaluate the degree of unloading relaxation,the long-shor...The unloading relaxation caused by excavation for construction of high arch dams is an important factor influencing the foundation’s integrity and strength.To evaluate the degree of unloading relaxation,the long-short term memory(LSTM)network was used to estimate the depth of unloading relaxation zones on the left bank foundation of the Baihetan Arch Dam.Principal component analysis indicates that rock charac-teristics,the structural plane,the protection layer,lithology,and time are the main factors.The LSTM network results demonstrate the unloading relaxation characteristics of the left bank,and the relationships with the factors were also analyzed.The structural plane has the most significant influence on the distribution of unloading relaxation zones.Compared with massive basalt,the columnar jointed basalt experiences a more significant unloading relaxation phenomenon with a clear time effect,with the average unloading relaxation period being 50 d.The protection layer can effectively reduce the unloading relaxation depth by approximately 20%.展开更多
RF power amplifiers (PAs) are usually considered as memoryless devices in most existing predistortion techniques. Nevertheless, in wideband communication systems, PA memory effects can no longer be ignored and memoryl...RF power amplifiers (PAs) are usually considered as memoryless devices in most existing predistortion techniques. Nevertheless, in wideband communication systems, PA memory effects can no longer be ignored and memoryless predistortion cannot linearize PAs effectively. After analyzing PA memory effects, a novel predistortion method based on wavelet networks (WNs) is proposed to linearize wideband RF power amplifiers. A complex wavelet network with tapped delay lines is applied to construct the predistorter and then a complex backpropagation algorithm is developed to train the predistorter parameters. The simulation results show that compared with the previously published feed-forward neural network predistortion method, the proposed method provides faster convergence rate and better performance in reducing out-of-band spectral regrowth.展开更多
Time series forecasting and analysis are widely used in many fields and application scenarios.Time series historical data reflects the change pattern and trend,which can serve the application and decision in each appl...Time series forecasting and analysis are widely used in many fields and application scenarios.Time series historical data reflects the change pattern and trend,which can serve the application and decision in each application scenario to a certain extent.In this paper,we select the time series prediction problem in the atmospheric environment scenario to start the application research.In terms of data support,we obtain the data of nearly 3500 vehicles in some cities in China fromRunwoda Research Institute,focusing on the major pollutant emission data of non-road mobile machinery and high emission vehicles in Beijing and Bozhou,Anhui Province to build the dataset and conduct the time series prediction analysis experiments on them.This paper proposes a P-gLSTNet model,and uses Autoregressive Integrated Moving Average model(ARIMA),long and short-term memory(LSTM),and Prophet to predict and compare the emissions in the future period.The experiments are validated on four public data sets and one self-collected data set,and the mean absolute error(MAE),root mean square error(RMSE),and mean absolute percentage error(MAPE)are selected as the evaluationmetrics.The experimental results show that the proposed P-gLSTNet fusion model predicts less error,outperforms the backbone method,and is more suitable for the prediction of time-series data in this scenario.展开更多
In this study,an optimized long short-term memory(LSTM)network is proposed to predict the reliability and remaining useful life(RUL)of rolling bearings based on an improved whale-optimized algorithm(IWOA).The multi-do...In this study,an optimized long short-term memory(LSTM)network is proposed to predict the reliability and remaining useful life(RUL)of rolling bearings based on an improved whale-optimized algorithm(IWOA).The multi-domain features are extracted to construct the feature dataset because the single-domain features are difficult to characterize the performance degeneration of the rolling bearing.To provide covariates for reliability assessment,a kernel principal component analysis is used to reduce the dimensionality of the features.A Weibull distribution proportional hazard model(WPHM)is used for the reliability assessment of rolling bearing,and a beluga whale optimization(BWO)algorithm is combined with maximum likelihood estimation(MLE)to improve the estimation accuracy of the model parameters of the WPHM,which provides the data basis for predicting reliability.Considering the possible gradient explosion by training the rolling bearing lifetime data and the difficulties in selecting the key network parameters,an optimized LSTM network called the improved whale optimization algorithm-based long short-term memory(IWOA-LSTM)network is proposed.As IWOA better jumps out of the local optimization,the fitting and prediction accuracies of the network are correspondingly improved.The experimental results show that compared with the whale optimization algorithm-based long short-term memory(WOA-LSTM)network,the reliability prediction and RUL prediction accuracies of the rolling bearing are improved by the proposed IWOA-LSTM network.展开更多
Ionosphere delay is one of the main sources of noise affecting global navigation satellite systems, operation of radio detection and ranging systems and very-long-baseline-interferometry. One of the most important and...Ionosphere delay is one of the main sources of noise affecting global navigation satellite systems, operation of radio detection and ranging systems and very-long-baseline-interferometry. One of the most important and common methods to reduce this phase delay is to establish accurate nowcasting and forecasting ionospheric total electron content models. For forecasting models, compared to mid-to-high latitudes, at low latitudes, an active ionosphere leads to extreme differences between long-term prediction models and the actual state of the ionosphere. To solve the problem of low accuracy for long-term prediction models at low latitudes, this article provides a low-latitude, long-term ionospheric prediction model based on a multi-input-multi-output, long-short-term memory neural network. To verify the feasibility of the model, we first made predictions of the vertical total electron content data 24 and 48 hours in advance for each day of July 2020 and then compared both the predictions corresponding to a given day, for all days. Furthermore, in the model modification part, we selected historical data from June 2020 for the validation set, determined a large offset from the results that were predicted to be active, and used the ratio of the mean absolute error of the detected results to that of the predicted results as a correction coefficient to modify our multi-input-multi-output long short-term memory model. The average root mean square error of the 24-hour-advance predictions of our modified model was 4.4 TECU, which was lower and better than5.1 TECU of the multi-input-multi-output, long short-term memory model and 5.9 TECU of the IRI-2016 model.展开更多
Some reconstruction-based anomaly detection models in multivariate time series have brought impressive performance advancements but suffer from weak generalization ability and a lack of anomaly identification.These li...Some reconstruction-based anomaly detection models in multivariate time series have brought impressive performance advancements but suffer from weak generalization ability and a lack of anomaly identification.These limitations can result in the misjudgment of models,leading to a degradation in overall detection performance.This paper proposes a novel transformer-like anomaly detection model adopting a contrastive learning module and a memory block(CLME)to overcome the above limitations.The contrastive learning module tailored for time series data can learn the contextual relationships to generate temporal fine-grained representations.The memory block can record normal patterns of these representations through the utilization of attention-based addressing and reintegration mechanisms.These two modules together effectively alleviate the problem of generalization.Furthermore,this paper introduces a fusion anomaly detection strategy that comprehensively takes into account the residual and feature spaces.Such a strategy can enlarge the discrepancies between normal and abnormal data,which is more conducive to anomaly identification.The proposed CLME model not only efficiently enhances the generalization performance but also improves the ability of anomaly detection.To validate the efficacy of the proposed approach,extensive experiments are conducted on well-established benchmark datasets,including SWaT,PSM,WADI,and MSL.The results demonstrate outstanding performance,with F1 scores of 90.58%,94.83%,91.58%,and 91.75%,respectively.These findings affirm the superiority of the CLME model over existing stateof-the-art anomaly detection methodologies in terms of its ability to detect anomalies within complex datasets accurately.展开更多
The classification of infrasound events has considerable importance in improving the capability to identify the types of natural disasters.The traditional infrasound classification mainly relies on machine learning al...The classification of infrasound events has considerable importance in improving the capability to identify the types of natural disasters.The traditional infrasound classification mainly relies on machine learning algorithms after artificial feature extraction.However,guaranteeing the effectiveness of the extracted features is difficult.The current trend focuses on using a convolution neural network to automatically extract features for classification.This method can be used to extract signal spatial features automatically through a convolution kernel;however,infrasound signals contain not only spatial information but also temporal information when used as a time series.These extracted temporal features are also crucial.If only a convolution neural network is used,then the time dependence of the infrasound sequence will be missed.Using long short-term memory networks can compensate for the missing time-series features but induces spatial feature information loss of the infrasound signal.A multiscale squeeze excitation–convolution neural network–bidirectional long short-term memory network infrasound event classification fusion model is proposed in this study to address these problems.This model automatically extracted temporal and spatial features,adaptively selected features,and also realized the fusion of the two types of features.Experimental results showed that the classification accuracy of the model was more than 98%,thus verifying the effectiveness and superiority of the proposed model.展开更多
Deep neural networks have been widely applied to bearing fault diagnosis systems and achieved impressive success recently.To address the problem that the insufficient fault feature extraction ability of traditional fa...Deep neural networks have been widely applied to bearing fault diagnosis systems and achieved impressive success recently.To address the problem that the insufficient fault feature extraction ability of traditional fault diagnosis methods results in poor diagnosis effect under variable load and noise interference scenarios,a rolling bearing fault diagnosis model combining Multi-Scale Convolutional Neural Network(MSCNN)and Long Short-Term Memory(LSTM)fused with attention mechanism is proposed.To adaptively extract the essential spatial feature information of various sizes,the model creates a multi-scale feature extraction module using the convolutional neural network(CNN)learning process.The learning capacity of LSTM for time information sequence is then used to extract the vibration signal’s temporal feature information.Two parallel large and small convolutional kernels teach the system spatial local features.LSTM gathers temporal global features to thoroughly and painstakingly mine the vibration signal’s characteristics,thus enhancing model generalization.Lastly,bearing fault diagnosis is accomplished by using the SoftMax classifier.The experiment outcomes demonstrate that the model can derive fault properties entirely from the initial vibration signal.It can retain good diagnostic accuracy under variable load and noise interference and has strong generalization compared to other fault diagnosis models.展开更多
With the advent of physics informed neural networks(PINNs),deep learning has gained interest for solving nonlinear partial differential equations(PDEs)in recent years.In this paper,physics informed memory networks(PIM...With the advent of physics informed neural networks(PINNs),deep learning has gained interest for solving nonlinear partial differential equations(PDEs)in recent years.In this paper,physics informed memory networks(PIMNs)are proposed as a new approach to solving PDEs by using physical laws and dynamic behavior of PDEs.Unlike the fully connected structure of the PINNs,the PIMNs construct the long-term dependence of the dynamics behavior with the help of the long short-term memory network.Meanwhile,the PDEs residuals are approximated using difference schemes in the form of convolution filter,which avoids information loss at the neighborhood of the sampling points.Finally,the performance of the PIMNs is assessed by solving the Kd V equation and the nonlinear Schr?dinger equation,and the effects of difference schemes,boundary conditions,network structure and mesh size on the solutions are discussed.Experiments show that the PIMNs are insensitive to boundary conditions and have excellent solution accuracy even with only the initial conditions.展开更多
Aiming at the problems of low accuracy and slow convergence speed of current intrusion detection models,SpiralConvolution is combined with Long Short-Term Memory Network to construct a new intrusion detection model.Th...Aiming at the problems of low accuracy and slow convergence speed of current intrusion detection models,SpiralConvolution is combined with Long Short-Term Memory Network to construct a new intrusion detection model.The dataset is first preprocessed using solo thermal encoding and normalization functions.Then the spiral convolution-Long Short-Term Memory Network model is constructed,which consists of spiral convolution,a two-layer long short-term memory network,and a classifier.It is shown through experiments that the model is characterized by high accuracy,small model computation,and fast convergence speed relative to previous deep learning models.The model uses a new neural network to achieve fast and accurate network traffic intrusion detection.The model in this paper achieves 0.9706 and 0.8432 accuracy rates on the NSL-KDD dataset and the UNSWNB-15 dataset under five classifications and ten classes,respectively.展开更多
Maintaining a steady power supply requires accurate forecasting of solar irradiance,since clean energy resources do not provide steady power.The existing forecasting studies have examined the limited effects of weathe...Maintaining a steady power supply requires accurate forecasting of solar irradiance,since clean energy resources do not provide steady power.The existing forecasting studies have examined the limited effects of weather conditions on solar radiation such as temperature and precipitation utilizing convolutional neural network(CNN),but no comprehensive study has been conducted on concentrations of air pollutants along with weather conditions.This paper proposes a hybrid approach based on deep learning,expanding the feature set by adding new air pollution concentrations,and ranking these features to select and reduce their size to improve efficiency.In order to improve the accuracy of feature selection,a maximum-dependency and minimum-redundancy(mRMR)criterion is applied to the constructed feature space to identify and rank the features.The combination of air pollution data with weather conditions data has enabled the prediction of solar irradiance with a higher accuracy.An evaluation of the proposed approach is conducted in Istanbul over 12 months for 43791 discrete times,with the main purpose of analyzing air data,including particular matter(PM10 and PM25),carbon monoxide(CO),nitric oxide(NOX),nitrogen dioxide(NO_(2)),ozone(O₃),sulfur dioxide(SO_(2))using a CNN,a long short-term memory network(LSTM),and MRMR feature extraction.Compared with the benchmark models with root mean square error(RMSE)results of 76.2,60.3,41.3,32.4,there is a significant improvement with the RMSE result of 5.536.This hybrid model presented here offers high prediction accuracy,a wider feature set,and a novel approach based on air concentrations combined with weather conditions for solar irradiance prediction.展开更多
In the fast-evolving landscape of digital networks,the incidence of network intrusions has escalated alarmingly.Simultaneously,the crucial role of time series data in intrusion detection remains largely underappreciat...In the fast-evolving landscape of digital networks,the incidence of network intrusions has escalated alarmingly.Simultaneously,the crucial role of time series data in intrusion detection remains largely underappreciated,with most systems failing to capture the time-bound nuances of network traffic.This leads to compromised detection accuracy and overlooked temporal patterns.Addressing this gap,we introduce a novel SSAE-TCN-BiLSTM(STL)model that integrates time series analysis,significantly enhancing detection capabilities.Our approach reduces feature dimensionalitywith a Stacked Sparse Autoencoder(SSAE)and extracts temporally relevant features through a Temporal Convolutional Network(TCN)and Bidirectional Long Short-term Memory Network(Bi-LSTM).By meticulously adjusting time steps,we underscore the significance of temporal data in bolstering detection accuracy.On the UNSW-NB15 dataset,ourmodel achieved an F1-score of 99.49%,Accuracy of 99.43%,Precision of 99.38%,Recall of 99.60%,and an inference time of 4.24 s.For the CICDS2017 dataset,we recorded an F1-score of 99.53%,Accuracy of 99.62%,Precision of 99.27%,Recall of 99.79%,and an inference time of 5.72 s.These findings not only confirm the STL model’s superior performance but also its operational efficiency,underpinning its significance in real-world cybersecurity scenarios where rapid response is paramount.Our contribution represents a significant advance in cybersecurity,proposing a model that excels in accuracy and adaptability to the dynamic nature of network traffic,setting a new benchmark for intrusion detection systems.展开更多
Due to the unpredictable output characteristics of distributed photovoltaics,their integration into the grid can lead to voltage fluctuations within the regional power grid.Therefore,the development of spatial-tempora...Due to the unpredictable output characteristics of distributed photovoltaics,their integration into the grid can lead to voltage fluctuations within the regional power grid.Therefore,the development of spatial-temporal coordination and optimization control methods for distributed photovoltaics and energy storage systems is of utmost importance in various scenarios.This paper approaches the issue from the perspective of spatiotemporal forecasting of distributed photovoltaic(PV)generation and proposes a Temporal Convolutional-Long Short-Term Memory prediction model that combines Temporal Convolutional Networks(TCN)and Long Short-Term Memory(LSTM).To begin with,an analysis of the spatiotemporal distribution patterns of PV generation is conducted,and outlier data is handled using the 3σ rule.Subsequently,a novel approach that combines temporal convolution and LSTM networks is introduced,with TCN extracting spatial features and LSTM capturing temporal features.Finally,a real spatiotemporal dataset from Gansu,China,is established to compare the performance of the proposed network against other models.The results demonstrate that the model presented in this paper exhibits the highest predictive accuracy,with a single-step Mean Absolute Error(MAE)of 1.782 and an average Root Mean Square Error(RMSE)of 3.72 for multi-step predictions.展开更多
Nowadays,with the rapid development of industrial Internet technology,on the one hand,advanced industrial control systems(ICS)have improved industrial production efficiency.However,there are more and more cyber-attack...Nowadays,with the rapid development of industrial Internet technology,on the one hand,advanced industrial control systems(ICS)have improved industrial production efficiency.However,there are more and more cyber-attacks targeting industrial control systems.To ensure the security of industrial networks,intrusion detection systems have been widely used in industrial control systems,and deep neural networks have always been an effective method for identifying cyber attacks.Current intrusion detection methods still suffer from low accuracy and a high false alarm rate.Therefore,it is important to build a more efficient intrusion detection model.This paper proposes a hybrid deep learning intrusion detection method based on convolutional neural networks and bidirectional long short-term memory neural networks(CNN-BiLSTM).To address the issue of imbalanced data within the dataset and improve the model’s detection capabilities,the Synthetic Minority Over-sampling Technique-Edited Nearest Neighbors(SMOTE-ENN)algorithm is applied in the preprocessing phase.This algorithm is employed to generate synthetic instances for the minority class,simultaneously mitigating the impact of noise in the majority class.This approach aims to create a more equitable distribution of classes,thereby enhancing the model’s ability to effectively identify patterns in both minority and majority classes.In the experimental phase,the detection performance of the method is verified using two data sets.Experimental results show that the accuracy rate on the CICIDS-2017 data set reaches 97.7%.On the natural gas pipeline dataset collected by Lan Turnipseed from Mississippi State University in the United States,the accuracy rate also reaches 85.5%.展开更多
Accurate load forecasting forms a crucial foundation for implementing household demand response plans andoptimizing load scheduling. When dealing with short-term load data characterized by substantial fluctuations,a s...Accurate load forecasting forms a crucial foundation for implementing household demand response plans andoptimizing load scheduling. When dealing with short-term load data characterized by substantial fluctuations,a single prediction model is hard to capture temporal features effectively, resulting in diminished predictionaccuracy. In this study, a hybrid deep learning framework that integrates attention mechanism, convolution neuralnetwork (CNN), improved chaotic particle swarm optimization (ICPSO), and long short-term memory (LSTM), isproposed for short-term household load forecasting. Firstly, the CNN model is employed to extract features fromthe original data, enhancing the quality of data features. Subsequently, the moving average method is used for datapreprocessing, followed by the application of the LSTM network to predict the processed data. Moreover, the ICPSOalgorithm is introduced to optimize the parameters of LSTM, aimed at boosting the model’s running speed andaccuracy. Finally, the attention mechanism is employed to optimize the output value of LSTM, effectively addressinginformation loss in LSTM induced by lengthy sequences and further elevating prediction accuracy. According tothe numerical analysis, the accuracy and effectiveness of the proposed hybrid model have been verified. It canexplore data features adeptly, achieving superior prediction accuracy compared to other forecasting methods forthe household load exhibiting significant fluctuations across different seasons.展开更多
文摘There are two technical challenges in predicting slope deformation.The first one is the random displacement,which could not be decomposed and predicted by numerically resolving the observed accumulated displacement and time series of a landslide.The second one is the dynamic evolution of a landslide,which could not be feasibly simulated simply by traditional prediction models.In this paper,a dynamic model of displacement prediction is introduced for composite landslides based on a combination of empirical mode decomposition with soft screening stop criteria(SSSC-EMD)and deep bidirectional long short-term memory(DBi-LSTM)neural network.In the proposed model,the time series analysis and SSSC-EMD are used to decompose the observed accumulated displacements of a slope into three components,viz.trend displacement,periodic displacement,and random displacement.Then,by analyzing the evolution pattern of a landslide and its key factors triggering landslides,appropriate influencing factors are selected for each displacement component,and DBi-LSTM neural network to carry out multi-datadriven dynamic prediction for each displacement component.An accumulated displacement prediction has been obtained by a summation of each component.For accuracy verification and engineering practicability of the model,field observations from two known landslides in China,the Xintan landslide and the Bazimen landslide were collected for comparison and evaluation.The case study verified that the model proposed in this paper can better characterize the"stepwise"deformation characteristics of a slope.As compared with long short-term memory(LSTM)neural network,support vector machine(SVM),and autoregressive integrated moving average(ARIMA)model,DBi-LSTM neural network has higher accuracy in predicting the periodic displacement of slope deformation,with the mean absolute percentage error reduced by 3.063%,14.913%,and 13.960%respectively,and the root mean square error reduced by 1.951 mm,8.954 mm and 7.790 mm respectively.Conclusively,this model not only has high prediction accuracy but also is more stable,which can provide new insight for practical landslide prevention and control engineering.
文摘In this paper, a filtering method is presented to estimate time-varying parameters of a missile dual control system with tail fins and reaction jets as control variables. In this method, the long-short-term memory(LSTM) neural network is nested into the extended Kalman filter(EKF) to modify the Kalman gain such that the filtering performance is improved in the presence of large model uncertainties. To avoid the unstable network output caused by the abrupt changes of system states,an adaptive correction factor is introduced to correct the network output online. In the process of training the network, a multi-gradient descent learning mode is proposed to better fit the internal state of the system, and a rolling training is used to implement an online prediction logic. Based on the Lyapunov second method, we discuss the stability of the system, the result shows that when the training error of neural network is sufficiently small, the system is asymptotically stable. With its application to the estimation of time-varying parameters of a missile dual control system, the LSTM-EKF shows better filtering performance than the EKF and adaptive EKF(AEKF) when there exist large uncertainties in the system model.
基金supported by the Natural Science Foundation of Shaanxi Province under Grant 2019JQ206in part by the Science and Technology Department of Shaanxi Province under Grant 2020CGXNG-009in part by the Education Department of Shaanxi Province under Grant 17JK0346。
文摘An accurate landslide displacement prediction is an important part of landslide warning system. Aiming at the dynamic characteristics of landslide evolution and the shortcomings of traditional static prediction models, this paper proposes a dynamic prediction model of landslide displacement based on singular spectrum analysis(SSA) and stack long short-term memory(SLSTM) network. The SSA is used to decompose the landslide accumulated displacement time series data into trend term and periodic term displacement subsequences. A cubic polynomial function is used to predict the trend term displacement subsequence, and the SLSTM neural network is used to predict the periodic term displacement subsequence. At the same time, the Bayesian optimization algorithm is used to determine that the SLSTM network input sequence length is 12 and the number of hidden layer nodes is 18. The SLSTM network is updated by adding predicted values to the training set to achieve dynamic displacement prediction. Finally, the accumulated landslide displacement is obtained by superimposing the predicted value of each displacement subsequence. The proposed model was verified on the Xintan landslide in Hubei Province, China. The results show that when predicting the displacement of the periodic term, the SLSTM network has higher prediction accuracy than the support vector machine(SVM) and auto regressive integrated moving average(ARIMA). The mean relative error(MRE) is reduced by 4.099% and 3.548% respectively, while the root mean square error(RMSE) is reduced by 5.830 mm and 3.854 mm respectively. It is concluded that the SLSTM network model can better simulate the dynamic characteristics of landslides.
文摘In dense pedestrian tracking,frequent object occlusions and close distances between objects cause difficulty when accurately estimating object trajectories.In this study,a conditional random field tracking model is established by using a visual long short term memory network in the three-dimensional(3D)space and the motion estimations jointly performed on object trajectory segments.Object visual field information is added to the long short term memory network to improve the accuracy of the motion related object pair selection and motion estimation.To address the uncertainty of the length and interval of trajectory segments,a multimode long short term memory network is proposed for the object motion estimation.The tracking performance is evaluated using the PETS2009 dataset.The experimental results show that the proposed method achieves better performance than the tracking methods based on the independent motion estimation.
基金This work is supported by the National Nature Science Foundation of China(NSFC)under Grant Nos.61571106,61501169,41706103the Fundamental Research Funds for the Central Universities under Grant No.2242013K30010.
文摘Speaker separation in complex acoustic environment is one of challenging tasks in speech separation.In practice,speakers are very often unmoving or moving slowly in normal communication.In this case,the spatial features among the consecutive speech frames become highly correlated such that it is helpful for speaker separation by providing additional spatial information.To fully exploit this information,we design a separation system on Recurrent Neural Network(RNN)with long short-term memory(LSTM)which effectively learns the temporal dynamics of spatial features.In detail,a LSTM-based speaker separation algorithm is proposed to extract the spatial features in each time-frequency(TF)unit and form the corresponding feature vector.Then,we treat speaker separation as a supervised learning problem,where a modified ideal ratio mask(IRM)is defined as the training function during LSTM learning.Simulations show that the proposed system achieves attractive separation performance in noisy and reverberant environments.Specifically,during the untrained acoustic test with limited priors,e.g.,unmatched signal to noise ratio(SNR)and reverberation,the proposed LSTM based algorithm can still outperforms the existing DNN based method in the measures of PESQ and STOI.It indicates our method is more robust in untrained conditions.
基金This work was supported by the National Key Research and Development Program of China(Grant No.2018YFC0407004)the Natural Science Foundation of China(Grants No.51939004 and 11772116).
文摘The unloading relaxation caused by excavation for construction of high arch dams is an important factor influencing the foundation’s integrity and strength.To evaluate the degree of unloading relaxation,the long-short term memory(LSTM)network was used to estimate the depth of unloading relaxation zones on the left bank foundation of the Baihetan Arch Dam.Principal component analysis indicates that rock charac-teristics,the structural plane,the protection layer,lithology,and time are the main factors.The LSTM network results demonstrate the unloading relaxation characteristics of the left bank,and the relationships with the factors were also analyzed.The structural plane has the most significant influence on the distribution of unloading relaxation zones.Compared with massive basalt,the columnar jointed basalt experiences a more significant unloading relaxation phenomenon with a clear time effect,with the average unloading relaxation period being 50 d.The protection layer can effectively reduce the unloading relaxation depth by approximately 20%.
基金Project (No. 60372026) supported by the National Natural ScienceFoundation of China
文摘RF power amplifiers (PAs) are usually considered as memoryless devices in most existing predistortion techniques. Nevertheless, in wideband communication systems, PA memory effects can no longer be ignored and memoryless predistortion cannot linearize PAs effectively. After analyzing PA memory effects, a novel predistortion method based on wavelet networks (WNs) is proposed to linearize wideband RF power amplifiers. A complex wavelet network with tapped delay lines is applied to construct the predistorter and then a complex backpropagation algorithm is developed to train the predistorter parameters. The simulation results show that compared with the previously published feed-forward neural network predistortion method, the proposed method provides faster convergence rate and better performance in reducing out-of-band spectral regrowth.
基金the Beijing Chaoyang District Collaborative Innovation Project(No.CYXT2013)the subject support of Beijing Municipal Science and Technology Key R&D Program-Capital Blue Sky Action Cultivation Project(Z19110900910000)+1 种基金“Research and Demonstration ofHigh Emission Vehicle Monitoring Equipment System Based on Sensor Integration Technology”(Z19110000911003)This work was supported by the Academic Research Projects of Beijing Union University(No.ZK80202103).
文摘Time series forecasting and analysis are widely used in many fields and application scenarios.Time series historical data reflects the change pattern and trend,which can serve the application and decision in each application scenario to a certain extent.In this paper,we select the time series prediction problem in the atmospheric environment scenario to start the application research.In terms of data support,we obtain the data of nearly 3500 vehicles in some cities in China fromRunwoda Research Institute,focusing on the major pollutant emission data of non-road mobile machinery and high emission vehicles in Beijing and Bozhou,Anhui Province to build the dataset and conduct the time series prediction analysis experiments on them.This paper proposes a P-gLSTNet model,and uses Autoregressive Integrated Moving Average model(ARIMA),long and short-term memory(LSTM),and Prophet to predict and compare the emissions in the future period.The experiments are validated on four public data sets and one self-collected data set,and the mean absolute error(MAE),root mean square error(RMSE),and mean absolute percentage error(MAPE)are selected as the evaluationmetrics.The experimental results show that the proposed P-gLSTNet fusion model predicts less error,outperforms the backbone method,and is more suitable for the prediction of time-series data in this scenario.
基金supported by the Department of Education of Liaoning Province under Grant JDL2020020the Transportation Science and Technology Project of Liaoning Province under Grant 202243.
文摘In this study,an optimized long short-term memory(LSTM)network is proposed to predict the reliability and remaining useful life(RUL)of rolling bearings based on an improved whale-optimized algorithm(IWOA).The multi-domain features are extracted to construct the feature dataset because the single-domain features are difficult to characterize the performance degeneration of the rolling bearing.To provide covariates for reliability assessment,a kernel principal component analysis is used to reduce the dimensionality of the features.A Weibull distribution proportional hazard model(WPHM)is used for the reliability assessment of rolling bearing,and a beluga whale optimization(BWO)algorithm is combined with maximum likelihood estimation(MLE)to improve the estimation accuracy of the model parameters of the WPHM,which provides the data basis for predicting reliability.Considering the possible gradient explosion by training the rolling bearing lifetime data and the difficulties in selecting the key network parameters,an optimized LSTM network called the improved whale optimization algorithm-based long short-term memory(IWOA-LSTM)network is proposed.As IWOA better jumps out of the local optimization,the fitting and prediction accuracies of the network are correspondingly improved.The experimental results show that compared with the whale optimization algorithm-based long short-term memory(WOA-LSTM)network,the reliability prediction and RUL prediction accuracies of the rolling bearing are improved by the proposed IWOA-LSTM network.
基金Project supported by the National Key Research and Development Program of China(Grant No.2016YFA0302101)the Initiative Program of State Key Laboratory of Precision Measurement Technology and Instrument。
文摘Ionosphere delay is one of the main sources of noise affecting global navigation satellite systems, operation of radio detection and ranging systems and very-long-baseline-interferometry. One of the most important and common methods to reduce this phase delay is to establish accurate nowcasting and forecasting ionospheric total electron content models. For forecasting models, compared to mid-to-high latitudes, at low latitudes, an active ionosphere leads to extreme differences between long-term prediction models and the actual state of the ionosphere. To solve the problem of low accuracy for long-term prediction models at low latitudes, this article provides a low-latitude, long-term ionospheric prediction model based on a multi-input-multi-output, long-short-term memory neural network. To verify the feasibility of the model, we first made predictions of the vertical total electron content data 24 and 48 hours in advance for each day of July 2020 and then compared both the predictions corresponding to a given day, for all days. Furthermore, in the model modification part, we selected historical data from June 2020 for the validation set, determined a large offset from the results that were predicted to be active, and used the ratio of the mean absolute error of the detected results to that of the predicted results as a correction coefficient to modify our multi-input-multi-output long short-term memory model. The average root mean square error of the 24-hour-advance predictions of our modified model was 4.4 TECU, which was lower and better than5.1 TECU of the multi-input-multi-output, long short-term memory model and 5.9 TECU of the IRI-2016 model.
基金support from the Major National Science and Technology Special Projects(2016ZX02301003-004-007)the Natural Science Foundation of Hebei Province(F2020202067)。
文摘Some reconstruction-based anomaly detection models in multivariate time series have brought impressive performance advancements but suffer from weak generalization ability and a lack of anomaly identification.These limitations can result in the misjudgment of models,leading to a degradation in overall detection performance.This paper proposes a novel transformer-like anomaly detection model adopting a contrastive learning module and a memory block(CLME)to overcome the above limitations.The contrastive learning module tailored for time series data can learn the contextual relationships to generate temporal fine-grained representations.The memory block can record normal patterns of these representations through the utilization of attention-based addressing and reintegration mechanisms.These two modules together effectively alleviate the problem of generalization.Furthermore,this paper introduces a fusion anomaly detection strategy that comprehensively takes into account the residual and feature spaces.Such a strategy can enlarge the discrepancies between normal and abnormal data,which is more conducive to anomaly identification.The proposed CLME model not only efficiently enhances the generalization performance but also improves the ability of anomaly detection.To validate the efficacy of the proposed approach,extensive experiments are conducted on well-established benchmark datasets,including SWaT,PSM,WADI,and MSL.The results demonstrate outstanding performance,with F1 scores of 90.58%,94.83%,91.58%,and 91.75%,respectively.These findings affirm the superiority of the CLME model over existing stateof-the-art anomaly detection methodologies in terms of its ability to detect anomalies within complex datasets accurately.
文摘The classification of infrasound events has considerable importance in improving the capability to identify the types of natural disasters.The traditional infrasound classification mainly relies on machine learning algorithms after artificial feature extraction.However,guaranteeing the effectiveness of the extracted features is difficult.The current trend focuses on using a convolution neural network to automatically extract features for classification.This method can be used to extract signal spatial features automatically through a convolution kernel;however,infrasound signals contain not only spatial information but also temporal information when used as a time series.These extracted temporal features are also crucial.If only a convolution neural network is used,then the time dependence of the infrasound sequence will be missed.Using long short-term memory networks can compensate for the missing time-series features but induces spatial feature information loss of the infrasound signal.A multiscale squeeze excitation–convolution neural network–bidirectional long short-term memory network infrasound event classification fusion model is proposed in this study to address these problems.This model automatically extracted temporal and spatial features,adaptively selected features,and also realized the fusion of the two types of features.Experimental results showed that the classification accuracy of the model was more than 98%,thus verifying the effectiveness and superiority of the proposed model.
文摘Deep neural networks have been widely applied to bearing fault diagnosis systems and achieved impressive success recently.To address the problem that the insufficient fault feature extraction ability of traditional fault diagnosis methods results in poor diagnosis effect under variable load and noise interference scenarios,a rolling bearing fault diagnosis model combining Multi-Scale Convolutional Neural Network(MSCNN)and Long Short-Term Memory(LSTM)fused with attention mechanism is proposed.To adaptively extract the essential spatial feature information of various sizes,the model creates a multi-scale feature extraction module using the convolutional neural network(CNN)learning process.The learning capacity of LSTM for time information sequence is then used to extract the vibration signal’s temporal feature information.Two parallel large and small convolutional kernels teach the system spatial local features.LSTM gathers temporal global features to thoroughly and painstakingly mine the vibration signal’s characteristics,thus enhancing model generalization.Lastly,bearing fault diagnosis is accomplished by using the SoftMax classifier.The experiment outcomes demonstrate that the model can derive fault properties entirely from the initial vibration signal.It can retain good diagnostic accuracy under variable load and noise interference and has strong generalization compared to other fault diagnosis models.
文摘With the advent of physics informed neural networks(PINNs),deep learning has gained interest for solving nonlinear partial differential equations(PDEs)in recent years.In this paper,physics informed memory networks(PIMNs)are proposed as a new approach to solving PDEs by using physical laws and dynamic behavior of PDEs.Unlike the fully connected structure of the PINNs,the PIMNs construct the long-term dependence of the dynamics behavior with the help of the long short-term memory network.Meanwhile,the PDEs residuals are approximated using difference schemes in the form of convolution filter,which avoids information loss at the neighborhood of the sampling points.Finally,the performance of the PIMNs is assessed by solving the Kd V equation and the nonlinear Schr?dinger equation,and the effects of difference schemes,boundary conditions,network structure and mesh size on the solutions are discussed.Experiments show that the PIMNs are insensitive to boundary conditions and have excellent solution accuracy even with only the initial conditions.
基金the Gansu University of Political Science and Law Key Research Funding Project in 2018(GZF2018XZDLW20)Gansu Provincial Science and Technology Plan Project(Technology Innovation Guidance Plan)(20CX9ZA072).
文摘Aiming at the problems of low accuracy and slow convergence speed of current intrusion detection models,SpiralConvolution is combined with Long Short-Term Memory Network to construct a new intrusion detection model.The dataset is first preprocessed using solo thermal encoding and normalization functions.Then the spiral convolution-Long Short-Term Memory Network model is constructed,which consists of spiral convolution,a two-layer long short-term memory network,and a classifier.It is shown through experiments that the model is characterized by high accuracy,small model computation,and fast convergence speed relative to previous deep learning models.The model uses a new neural network to achieve fast and accurate network traffic intrusion detection.The model in this paper achieves 0.9706 and 0.8432 accuracy rates on the NSL-KDD dataset and the UNSWNB-15 dataset under five classifications and ten classes,respectively.
文摘Maintaining a steady power supply requires accurate forecasting of solar irradiance,since clean energy resources do not provide steady power.The existing forecasting studies have examined the limited effects of weather conditions on solar radiation such as temperature and precipitation utilizing convolutional neural network(CNN),but no comprehensive study has been conducted on concentrations of air pollutants along with weather conditions.This paper proposes a hybrid approach based on deep learning,expanding the feature set by adding new air pollution concentrations,and ranking these features to select and reduce their size to improve efficiency.In order to improve the accuracy of feature selection,a maximum-dependency and minimum-redundancy(mRMR)criterion is applied to the constructed feature space to identify and rank the features.The combination of air pollution data with weather conditions data has enabled the prediction of solar irradiance with a higher accuracy.An evaluation of the proposed approach is conducted in Istanbul over 12 months for 43791 discrete times,with the main purpose of analyzing air data,including particular matter(PM10 and PM25),carbon monoxide(CO),nitric oxide(NOX),nitrogen dioxide(NO_(2)),ozone(O₃),sulfur dioxide(SO_(2))using a CNN,a long short-term memory network(LSTM),and MRMR feature extraction.Compared with the benchmark models with root mean square error(RMSE)results of 76.2,60.3,41.3,32.4,there is a significant improvement with the RMSE result of 5.536.This hybrid model presented here offers high prediction accuracy,a wider feature set,and a novel approach based on air concentrations combined with weather conditions for solar irradiance prediction.
基金supported in part by the Gansu Province Higher Education Institutions Industrial Support Program:Security Situational Awareness with Artificial Intelligence and Blockchain Technology.Project Number(2020C-29).
文摘In the fast-evolving landscape of digital networks,the incidence of network intrusions has escalated alarmingly.Simultaneously,the crucial role of time series data in intrusion detection remains largely underappreciated,with most systems failing to capture the time-bound nuances of network traffic.This leads to compromised detection accuracy and overlooked temporal patterns.Addressing this gap,we introduce a novel SSAE-TCN-BiLSTM(STL)model that integrates time series analysis,significantly enhancing detection capabilities.Our approach reduces feature dimensionalitywith a Stacked Sparse Autoencoder(SSAE)and extracts temporally relevant features through a Temporal Convolutional Network(TCN)and Bidirectional Long Short-term Memory Network(Bi-LSTM).By meticulously adjusting time steps,we underscore the significance of temporal data in bolstering detection accuracy.On the UNSW-NB15 dataset,ourmodel achieved an F1-score of 99.49%,Accuracy of 99.43%,Precision of 99.38%,Recall of 99.60%,and an inference time of 4.24 s.For the CICDS2017 dataset,we recorded an F1-score of 99.53%,Accuracy of 99.62%,Precision of 99.27%,Recall of 99.79%,and an inference time of 5.72 s.These findings not only confirm the STL model’s superior performance but also its operational efficiency,underpinning its significance in real-world cybersecurity scenarios where rapid response is paramount.Our contribution represents a significant advance in cybersecurity,proposing a model that excels in accuracy and adaptability to the dynamic nature of network traffic,setting a new benchmark for intrusion detection systems.
基金The Science and Technology Project of the State Grid Corporation of China(Research and Demonstration of Loss Reduction Technology Based on Reactive Power Potential Exploration and Excitation of Distributed Photovoltaic-Energy Storage Converters:5400-202333241 A-1-1-ZN).
文摘Due to the unpredictable output characteristics of distributed photovoltaics,their integration into the grid can lead to voltage fluctuations within the regional power grid.Therefore,the development of spatial-temporal coordination and optimization control methods for distributed photovoltaics and energy storage systems is of utmost importance in various scenarios.This paper approaches the issue from the perspective of spatiotemporal forecasting of distributed photovoltaic(PV)generation and proposes a Temporal Convolutional-Long Short-Term Memory prediction model that combines Temporal Convolutional Networks(TCN)and Long Short-Term Memory(LSTM).To begin with,an analysis of the spatiotemporal distribution patterns of PV generation is conducted,and outlier data is handled using the 3σ rule.Subsequently,a novel approach that combines temporal convolution and LSTM networks is introduced,with TCN extracting spatial features and LSTM capturing temporal features.Finally,a real spatiotemporal dataset from Gansu,China,is established to compare the performance of the proposed network against other models.The results demonstrate that the model presented in this paper exhibits the highest predictive accuracy,with a single-step Mean Absolute Error(MAE)of 1.782 and an average Root Mean Square Error(RMSE)of 3.72 for multi-step predictions.
基金support from the Liaoning Province Nature Fund Project(No.2022-MS-291)the Scientific Research Project of Liaoning Province Education Department(LJKMZ20220781,LJKMZ20220783,LJKQZ20222457,JYTMS20231488).
文摘Nowadays,with the rapid development of industrial Internet technology,on the one hand,advanced industrial control systems(ICS)have improved industrial production efficiency.However,there are more and more cyber-attacks targeting industrial control systems.To ensure the security of industrial networks,intrusion detection systems have been widely used in industrial control systems,and deep neural networks have always been an effective method for identifying cyber attacks.Current intrusion detection methods still suffer from low accuracy and a high false alarm rate.Therefore,it is important to build a more efficient intrusion detection model.This paper proposes a hybrid deep learning intrusion detection method based on convolutional neural networks and bidirectional long short-term memory neural networks(CNN-BiLSTM).To address the issue of imbalanced data within the dataset and improve the model’s detection capabilities,the Synthetic Minority Over-sampling Technique-Edited Nearest Neighbors(SMOTE-ENN)algorithm is applied in the preprocessing phase.This algorithm is employed to generate synthetic instances for the minority class,simultaneously mitigating the impact of noise in the majority class.This approach aims to create a more equitable distribution of classes,thereby enhancing the model’s ability to effectively identify patterns in both minority and majority classes.In the experimental phase,the detection performance of the method is verified using two data sets.Experimental results show that the accuracy rate on the CICIDS-2017 data set reaches 97.7%.On the natural gas pipeline dataset collected by Lan Turnipseed from Mississippi State University in the United States,the accuracy rate also reaches 85.5%.
基金the Shanghai Rising-Star Program(No.22QA1403900)the National Natural Science Foundation of China(No.71804106)the Noncarbon Energy Conversion and Utilization Institute under the Shanghai Class IV Peak Disciplinary Development Program.
文摘Accurate load forecasting forms a crucial foundation for implementing household demand response plans andoptimizing load scheduling. When dealing with short-term load data characterized by substantial fluctuations,a single prediction model is hard to capture temporal features effectively, resulting in diminished predictionaccuracy. In this study, a hybrid deep learning framework that integrates attention mechanism, convolution neuralnetwork (CNN), improved chaotic particle swarm optimization (ICPSO), and long short-term memory (LSTM), isproposed for short-term household load forecasting. Firstly, the CNN model is employed to extract features fromthe original data, enhancing the quality of data features. Subsequently, the moving average method is used for datapreprocessing, followed by the application of the LSTM network to predict the processed data. Moreover, the ICPSOalgorithm is introduced to optimize the parameters of LSTM, aimed at boosting the model’s running speed andaccuracy. Finally, the attention mechanism is employed to optimize the output value of LSTM, effectively addressinginformation loss in LSTM induced by lengthy sequences and further elevating prediction accuracy. According tothe numerical analysis, the accuracy and effectiveness of the proposed hybrid model have been verified. It canexplore data features adeptly, achieving superior prediction accuracy compared to other forecasting methods forthe household load exhibiting significant fluctuations across different seasons.