Journal Articles
645 articles found
1. An Enhanced Ensemble-Based Long Short-Term Memory Approach for Traffic Volume Prediction
Authors: Duy Quang Tran, Huy Q. Tran, Minh Van Nguyen. Computers, Materials & Continua (SCIE, EI), 2024, Issue 3, pp. 3585-3602 (18 pages)
With the advancement of artificial intelligence, traffic forecasting is gaining more and more interest in optimizing route planning and enhancing service quality. Traffic volume is an influential parameter for planning and operating traffic structures. This study proposed an improved ensemble-based deep learning method to solve traffic volume prediction problems. A set of optimal hyperparameters is also applied for the suggested approach to improve the performance of the learning process. The fusion of these methodologies aims to harness ensemble empirical mode decomposition's capacity to discern complex traffic patterns and long short-term memory's proficiency in learning temporal relationships. Firstly, a dataset for automatic vehicle identification is obtained and utilized in the preprocessing stage of the ensemble empirical mode decomposition model. The second aspect involves predicting traffic volume using the long short-term memory algorithm. Next, the study employs a trial-and-error approach to select a set of optimal hyperparameters, including the lookback window, the number of neurons in the hidden layers, and the gradient descent optimization. Finally, the fusion of the obtained results leads to a final traffic volume prediction. The experimental results show that the proposed method outperforms other benchmarks regarding various evaluation measures, including mean absolute error, root mean squared error, mean absolute percentage error, and R-squared. The achieved R-squared value reaches an impressive 98%, while the other evaluation indices surpass the competing methods. These findings highlight the accuracy of traffic pattern prediction. Consequently, this offers promising prospects for enhancing transportation management systems and urban infrastructure planning.
Keywords: ensemble empirical mode decomposition; traffic volume prediction; long short-term memory; optimal hyperparameters; deep learning
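The pipeline in this abstract decomposes the traffic series with EEMD and then forecasts with an LSTM tuned over a lookback window, hidden-layer size, and optimizer. As a rough, hedged illustration (not the authors' code), the sketch below frames one decomposed component as a supervised lookback-window problem in Keras; the synthetic series, window length, and layer size are assumptions.

```python
# A minimal sketch (not the paper's code): frame one decomposed traffic-volume
# component as a lookback-window problem and fit a small LSTM regressor.
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def make_windows(series, lookback):
    """Slice a 1-D series into (samples, lookback, 1) inputs and next-step targets."""
    X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
    return X[..., None], series[lookback:]

lookback = 12                                 # hypothetical lookback window
series = np.sin(np.linspace(0, 60, 500))      # stand-in for one EEMD component

X, y = make_windows(series, lookback)
model = Sequential([
    Input(shape=(lookback, 1)),
    LSTM(64),                                 # hidden-layer size: illustrative choice
    Dense(1),
])
model.compile(optimizer="adam", loss="mse")   # gradient-descent optimizer to be tuned
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```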
2. Slope stability prediction based on a long short-term memory neural network: comparisons with convolutional neural networks, support vector machines and random forest models (Cited by: 4)
Authors: Faming Huang, Haowen Xiong, Shixuan Chen, Zhitao Lv, Jinsong Huang, Zhilu Chang, Filippo Catani. International Journal of Coal Science & Technology (EI, CAS, CSCD), 2023, Issue 2, pp. 83-96 (14 pages)
The numerical simulation and slope stability prediction are the focus of slope disaster research. Recently, machine learning models are commonly used in slope stability prediction. However, these machine learning models have some problems, such as poor nonlinear performance, local optima and incomplete factor feature extraction. These issues can affect the accuracy of slope stability prediction. Therefore, a deep learning algorithm called long short-term memory (LSTM) has been innovatively proposed to predict slope stability. Taking Ganzhou City in China as the study area, the landslide inventory and its characteristics of geotechnical parameters, slope height and slope angle are analyzed. Based on these characteristics, typical soil slopes are constructed using the Geo-Studio software. Five control factors affecting slope stability, including slope height, slope angle, internal friction angle, cohesion and volumetric weight, are selected to form different slopes and construct model input variables. Then, the limit equilibrium method is used to calculate the stability coefficients of these typical soil slopes under different control factors. Each slope stability coefficient and its corresponding control factors is a slope sample. As a result, a total of 2160 training samples and 450 testing samples are constructed. These sample sets are imported into LSTM for modelling and compared with the support vector machine (SVM), random forest (RF) and convolutional neural network (CNN). The results show that the LSTM overcomes the problem that the commonly used machine learning models have difficulty extracting global features. Furthermore, LSTM has a better prediction performance for slope stability compared to SVM, RF and CNN models.
Keywords: slope stability prediction; long short-term memory; deep learning; Geo-Studio software; machine learning model
3. Landslide displacement prediction based on optimized empirical mode decomposition and deep bidirectional long short-term memory network
Authors: ZHANG Ming-yue, HAN Yang, YANG Ping, WANG Cong-ling. Journal of Mountain Science (SCIE, CSCD), 2023, Issue 3, pp. 637-656 (20 pages)
There are two technical challenges in predicting slope deformation. The first one is the random displacement, which could not be decomposed and predicted by numerically resolving the observed accumulated displacement and time series of a landslide. The second one is the dynamic evolution of a landslide, which could not be feasibly simulated simply by traditional prediction models. In this paper, a dynamic model of displacement prediction is introduced for composite landslides based on a combination of empirical mode decomposition with soft screening stop criteria (SSSC-EMD) and a deep bidirectional long short-term memory (DBi-LSTM) neural network. In the proposed model, time series analysis and SSSC-EMD are used to decompose the observed accumulated displacements of a slope into three components, viz. trend displacement, periodic displacement, and random displacement. Then, by analyzing the evolution pattern of a landslide and the key factors triggering landslides, appropriate influencing factors are selected for each displacement component, and the DBi-LSTM neural network is used to carry out multi-data-driven dynamic prediction for each displacement component. An accumulated displacement prediction is obtained by summation of the components. For accuracy verification and engineering practicability of the model, field observations from two known landslides in China, the Xintan landslide and the Bazimen landslide, were collected for comparison and evaluation. The case study verified that the model proposed in this paper can better characterize the "stepwise" deformation characteristics of a slope. As compared with the long short-term memory (LSTM) neural network, support vector machine (SVM), and autoregressive integrated moving average (ARIMA) model, the DBi-LSTM neural network has higher accuracy in predicting the periodic displacement of slope deformation, with the mean absolute percentage error reduced by 3.063%, 14.913%, and 13.960% respectively, and the root mean square error reduced by 1.951 mm, 8.954 mm and 7.790 mm respectively. Conclusively, this model not only has high prediction accuracy but also is more stable, which can provide new insight for practical landslide prevention and control engineering.
Keywords: landslide displacement; empirical mode decomposition; soft screening stop criteria; deep bidirectional long short-term memory neural network; Xintan landslide; Bazimen landslide
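The core idea above is to split the accumulated displacement into trend, periodic, and random components, forecast each component separately, and sum the forecasts. A minimal sketch of that combine step follows, with a bidirectional LSTM standing in for the paper's DBi-LSTM; the SSSC-EMD decomposition is assumed to have been done elsewhere, and the toy components and network sizes are placeholders.

```python
# Sketch: forecast each displacement component with a small Bi-LSTM and sum the
# component forecasts. The components would come from SSSC-EMD in the paper;
# here they are synthetic stand-ins, and all sizes are placeholders.
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, LSTM, Dense

def windows(series, lookback=10):
    X = np.array([series[i:i + lookback] for i in range(len(series) - lookback)])
    return X[..., None], series[lookback:]

t = np.arange(300, dtype=float)
components = {
    "trend": 0.05 * t,                          # trend displacement
    "periodic": np.sin(2 * np.pi * t / 30),     # periodic displacement
    "random": 0.1 * np.random.randn(300),       # random displacement
}

total_forecast = 0.0
for name, series in components.items():
    X, y = windows(series)
    model = Sequential([Input(shape=(10, 1)), Bidirectional(LSTM(32)), Dense(1)])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=3, verbose=0)
    # one-step-ahead forecast for this component from its latest window
    total_forecast += float(model.predict(series[-10:][None, :, None], verbose=0)[0, 0])

print("accumulated displacement forecast:", total_forecast)
```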
4. Solar cycle prediction using a long short-term memory deep learning model (Cited by: 1)
Authors: Qi-Jie Wang, Jia-Chen Li, Liang-Qi Guo. Research in Astronomy and Astrophysics (SCIE, CAS, CSCD), 2021, Issue 1, pp. 119-126 (8 pages)
In this paper, we propose a long short-term memory (LSTM) deep learning model to deal with the smoothed monthly sunspot number (SSN), aiming to address the problem whereby the prediction results of the existing sunspot prediction methods are not uniform and have large deviations. Our method optimizes the number of hidden nodes and batch sizes of the LSTM network structures to 19 and 20, respectively. The best length of time series and the value of the timesteps were then determined for the network training, and one-step and multi-step predictions for Cycle 22 to Cycle 24 were made using the well-established network. The results showed that the maximum root-mean-square error (RMSE) of the one-step prediction model was 6.12 and the minimum was only 2.45. The maximum amplitude prediction error of the multi-step prediction was 17.2% and the minimum was only 3.0%. Finally, the next solar cycle's (Cycle 25) peak amplitude was predicted to occur around 2023, with a peak value of about 114.3. The accuracy of this prediction method is better than that of the other commonly used methods, and the method has high applicability.
Keywords: Sun: solar activity; Sun: sunspot number; techniques: deep learning; techniques: long short-term memory
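The abstract distinguishes one-step from multi-step prediction with the same trained network. A common way to obtain the multi-step forecast is to feed each one-step prediction back into the input window recursively; the sketch below illustrates that loop, assuming a trained one-step Keras model `model` and a recent stretch `history` of the smoothed series (both hypothetical names).

```python
# Sketch: recursive multi-step forecasting with a trained one-step LSTM.
# `model` is assumed to map a (1, lookback, 1) window to the next value, and
# `history` is assumed to hold the most recent stretch of the (scaled) series.
import numpy as np

def multi_step_forecast(model, history, lookback, horizon):
    window = list(history[-lookback:])
    preds = []
    for _ in range(horizon):
        x = np.asarray(window[-lookback:], dtype=float).reshape(1, lookback, 1)
        next_val = float(model.predict(x, verbose=0)[0, 0])
        preds.append(next_val)
        window.append(next_val)        # feed the prediction back into the window
    return np.asarray(preds)

# Hypothetical usage: forecast 36 months ahead from a 120-month window.
# forecast = multi_step_forecast(model, history, lookback=120, horizon=36)
```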
5. Deep Learning for Financial Time Series Prediction: A State-of-the-Art Review of Standalone and Hybrid Models
Authors: Weisi Chen, Walayat Hussain, Francesco Cauteruccio, Xu Zhang. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, Issue 4, pp. 187-224 (38 pages)
Financial time series prediction, whether for classification or regression, has been a heated research topic over the last decade. While traditional machine learning algorithms have experienced mediocre results, deep learning has largely contributed to the elevation of the prediction performance. Currently, the most up-to-date review of advanced machine learning techniques for financial time series prediction is still lacking, making it challenging for finance domain experts and relevant practitioners to determine which model potentially performs better, what techniques and components are involved, and how the model can be designed and implemented. This review article provides an overview of techniques, components and frameworks for financial time series prediction, with an emphasis on state-of-the-art deep learning models in the literature from 2015 to 2023, including standalone models like convolutional neural networks (CNN) that are capable of extracting spatial dependencies within data, and long short-term memory (LSTM) that is designed for handling temporal dependencies; and hybrid models integrating CNN, LSTM, attention mechanism (AM) and other techniques. For illustration and comparison purposes, models proposed in recent studies are mapped to relevant elements of a generalized framework comprised of input, output, feature extraction, prediction, and related processes. Among the state-of-the-art models, hybrid models like CNN-LSTM and CNN-LSTM-AM in general have been reported superior in performance to standalone models like the CNN-only model. Some remaining challenges have been discussed, including non-friendliness for finance domain experts, delayed prediction, domain knowledge negligence, lack of standards, and inability of real-time and high-frequency predictions. The principal contributions of this paper are to provide a one-stop guide for both academia and industry to review, compare and summarize technologies and recent advances in this area, to facilitate smooth and informed implementation, and to highlight future research directions.
Keywords: financial time series prediction; convolutional neural network; long short-term memory; deep learning; attention mechanism; finance
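The review's taxonomy separates standalone CNN or LSTM models from hybrids such as CNN-LSTM and CNN-LSTM-AM. Purely for orientation, the sketch below wires a generic Conv1D feature extractor in front of an LSTM in Keras; the layer sizes and window length are arbitrary, and the attention block is omitted, so this is not a model taken from the reviewed literature.

```python
# Sketch of a generic CNN-LSTM hybrid for a univariate price window:
# Conv1D extracts local patterns, the LSTM models temporal dependence.
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv1D, MaxPooling1D, LSTM, Dense

lookback = 60  # hypothetical number of past time steps per sample

model = Sequential([
    Input(shape=(lookback, 1)),
    Conv1D(32, kernel_size=3, activation="relu"),
    MaxPooling1D(pool_size=2),
    LSTM(64),                 # temporal aggregation of the convolved features
    Dense(1),                 # next-step regression target (e.g., return or price)
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```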
6. Credit Card Fraud Detection Using Improved Deep Learning Models
Authors: Sumaya S. Sulaiman, Ibraheem Nadher, Sarab M. Hameed. Computers, Materials & Continua (SCIE, EI), 2024, Issue 1, pp. 1049-1069 (21 pages)
Fraud of credit cards is a major issue for financial organizations and individuals. As fraudulent actions become more complex, a demand for better fraud detection systems is rising. Deep learning approaches have shown promise in several fields, including detecting credit card fraud. However, the efficacy of these models is heavily dependent on the careful selection of appropriate hyperparameters. This paper introduces models that integrate deep learning models with hyperparameter tuning techniques to learn the patterns and relationships within credit card transaction data, thereby improving fraud detection. Three deep learning models, AutoEncoder (AE), Convolution Neural Network (CNN), and Long Short-Term Memory (LSTM), are proposed to investigate how hyperparameter adjustment impacts the efficacy of deep learning models used to identify credit card fraud. The experiments conducted on a European credit card fraud dataset using different hyperparameters and three deep learning models demonstrate that the proposed models achieve a tradeoff between detection rate and precision, leading these models to be effective in accurately predicting credit card fraud. The results demonstrate that LSTM significantly outperformed AE and CNN in terms of accuracy (99.2%), detection rate (93.3%), and area under the curve (96.3%). These proposed models have surpassed those of existing studies and are expected to make a significant contribution to the field of credit card fraud detection.
Keywords: card fraud detection; hyperparameter tuning; deep learning; autoencoder; convolution neural network; long short-term memory; resampling
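The abstract's central claim is that detection quality depends strongly on hyperparameter choice. A minimal grid-search sketch over two LSTM hyperparameters is shown below; the synthetic transaction windows, candidate values, and validation metric are placeholders rather than the paper's tuning procedure.

```python
# Sketch: naive grid search over two LSTM hyperparameters for a binary
# fraud/legitimate classifier. The data is a random stand-in; a real run would
# use (resampled) transaction sequences from the credit card dataset.
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 10, 5)).astype("float32")   # 400 windows, 10 steps, 5 features
y = rng.integers(0, 2, size=400)

best = None
for units in (32, 64):
    for batch in (32, 64):
        model = Sequential([Input(shape=(10, 5)), LSTM(units),
                            Dense(1, activation="sigmoid")])
        model.compile(optimizer="adam", loss="binary_crossentropy",
                      metrics=["accuracy"])
        hist = model.fit(X, y, epochs=2, batch_size=batch,
                         validation_split=0.2, verbose=0)
        score = hist.history["val_accuracy"][-1]
        if best is None or score > best[0]:
            best = (score, units, batch)

print("best (val_accuracy, units, batch):", best)
```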
7. Short-Term Household Load Forecasting Based on Attention Mechanism and CNN-ICPSO-LSTM
Authors: Lin Ma, Liyong Wang, Shuang Zeng, Yutong Zhao, Chang Liu, Heng Zhang, Qiong Wu, Hongbo Ren. Energy Engineering (EI), 2024, Issue 6, pp. 1473-1493 (21 pages)
Accurate load forecasting forms a crucial foundation for implementing household demand response plans and optimizing load scheduling. When dealing with short-term load data characterized by substantial fluctuations, a single prediction model struggles to capture temporal features effectively, resulting in diminished prediction accuracy. In this study, a hybrid deep learning framework that integrates an attention mechanism, convolution neural network (CNN), improved chaotic particle swarm optimization (ICPSO), and long short-term memory (LSTM) is proposed for short-term household load forecasting. Firstly, the CNN model is employed to extract features from the original data, enhancing the quality of data features. Subsequently, the moving average method is used for data preprocessing, followed by the application of the LSTM network to predict the processed data. Moreover, the ICPSO algorithm is introduced to optimize the parameters of LSTM, aimed at boosting the model's running speed and accuracy. Finally, the attention mechanism is employed to optimize the output value of LSTM, effectively addressing information loss in LSTM induced by lengthy sequences and further elevating prediction accuracy. According to the numerical analysis, the accuracy and effectiveness of the proposed hybrid model have been verified. It can explore data features adeptly, achieving superior prediction accuracy compared to other forecasting methods for household load exhibiting significant fluctuations across different seasons.
Keywords: short-term household load forecasting; long short-term memory network; attention mechanism; hybrid deep learning framework
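Here the attention mechanism is applied to the LSTM output to counter information loss over long sequences. The functional-API sketch below shows one common way to express that in Keras, self-attending over the LSTM's sequence outputs before pooling; the CNN front end and ICPSO tuning from the abstract are left out, and all sizes are assumptions.

```python
# Sketch: LSTM with a self-attention layer over its sequence outputs,
# then pooling to a single load forecast. Sizes are illustrative only.
from tensorflow.keras import Model, Input
from tensorflow.keras.layers import LSTM, Attention, GlobalAveragePooling1D, Dense

lookback, n_features = 48, 4          # e.g., 48 half-hourly steps, 4 exogenous features

inputs = Input(shape=(lookback, n_features))
seq = LSTM(64, return_sequences=True)(inputs)   # keep the full output sequence
attended = Attention()([seq, seq])              # self-attention over time steps
pooled = GlobalAveragePooling1D()(attended)
output = Dense(1)(pooled)                       # next-period household load

model = Model(inputs, output)
model.compile(optimizer="adam", loss="mse")
model.summary()
```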
8. Detecting the backfill pipeline blockage and leakage through an LSTM-based deep learning model (Cited by: 1)
Authors: Bolin Xiao, Shengjun Miao, Daohong Xia, Huatao Huang, Jingyu Zhang. International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2023, Issue 8, pp. 1573-1583 (11 pages)
Detecting a pipeline's abnormal status, which is typically a blockage or leakage accident, is important for the continuity and safety of mine backfill. The pipeline system for gravity-transport high-density backfill (GHB) is complex. Specifically designed, efficient, and accurate abnormal pipeline detection methods for GHB are rare. This work presents a long short-term memory-based deep learning (LSTM-DL) model for GHB pipeline blockage and leakage diagnosis. First, an industrial pipeline monitoring system was introduced using pressure and flow sensors. Second, blockage and leakage field experiments were designed to solve the problem of negative sample deficiency. The pipeline's statistical characteristics under different working statuses were analyzed to show their complexity. Third, the architecture of the LSTM-DL model was elaborated on and evaluated. Finally, the LSTM-DL model was compared with state-of-the-art (SOTA) learning algorithms. The results show that the backfilling cycle comprises multiple working phases and is intermittent. Although pressure and flow signals fluctuate stably in a normal cycle, their values are diverse in different cycles. Plugging causes a sudden change in interval signal features; leakage results in a long variation duration and a wide fluctuation range. Among the SOTA models, the LSTM-DL model has the highest detection accuracy of 98.31% for all states and the lowest misjudgment or false positive rate of 3.21% for blockage and leakage states. The proposed model can accurately recognize various pipeline statuses of complex GHB systems.
Keywords: mine backfill; blockage and leakage; pipeline detection; long short-term memory networks; deep learning
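The diagnosis task described above is a three-way classification (normal, blockage, leakage) over windows of pressure and flow signals. A bare-bones Keras sketch of such a classifier follows; the two-channel windows and class labels mirror the abstract, but the architecture and random stand-in data are assumptions.

```python
# Sketch: LSTM classifier over (pressure, flow) sensor windows with three
# working-status classes: 0 = normal, 1 = blockage, 2 = leakage.
import numpy as np
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

window, channels, n_classes = 120, 2, 3
rng = np.random.default_rng(1)
X = rng.normal(size=(600, window, channels)).astype("float32")  # stand-in sensor windows
y = rng.integers(0, n_classes, size=600)                        # stand-in status labels

model = Sequential([
    Input(shape=(window, channels)),
    LSTM(64),
    Dense(n_classes, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, verbose=0)
```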
9. Device Anomaly Detection Algorithm Based on Enhanced Long Short-Term Memory Network
Authors: 罗辛, 陈静, 袁德鑫, 杨涛. Journal of Donghua University (English Edition) (CAS), 2023, Issue 5, pp. 548-559 (12 pages)
The problems in equipment fault detection include data dimension explosion, computational complexity, low detection accuracy, etc. To solve these problems, a device anomaly detection algorithm based on enhanced long short-term memory (LSTM) is proposed. The algorithm first reduces the dimensionality of the device sensor data by principal component analysis (PCA), extracts the strongly correlated variable data among the multidimensional sensor data with the lowest possible information loss, and then uses the enhanced stacked LSTM to predict the extracted temporal data, thus improving the accuracy of anomaly detection. To improve the efficiency of the anomaly detection, a genetic algorithm (GA) is used to adjust the magnitude of the enhancements made by the LSTM model. Validation on actual data from pumps shows that the algorithm has significantly improved the recall rate and the detection speed of device anomaly detection, with a recall rate of 97.07%, which indicates that the algorithm is effective and efficient for device anomaly detection in the actual production environment.
Keywords: anomaly detection; production equipment; genetic algorithm (GA); long short-term memory (LSTM); principal component analysis (PCA)
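The algorithm above reduces the sensor data with PCA, predicts the reduced series with a stacked LSTM, and flags deviations as anomalies, with a GA tuning the enhancement magnitude. Leaving the GA aside, a compact sketch of the PCA-then-predict-then-threshold idea follows; the 3-sigma threshold and all sizes are assumptions.

```python
# Sketch: PCA dimensionality reduction, one-step stacked-LSTM prediction on the
# first principal component, and a residual threshold as the anomaly criterion.
import numpy as np
from sklearn.decomposition import PCA
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

rng = np.random.default_rng(2)
sensors = rng.normal(size=(1000, 12))                 # stand-in multidimensional sensor data
pc1 = PCA(n_components=1).fit_transform(sensors).ravel()

lookback = 20
X = np.array([pc1[i:i + lookback] for i in range(len(pc1) - lookback)])[..., None]
y = pc1[lookback:]

model = Sequential([Input(shape=(lookback, 1)),
                    LSTM(32, return_sequences=True),  # stacked LSTM layers
                    LSTM(32),
                    Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=3, verbose=0)

residual = np.abs(model.predict(X, verbose=0).ravel() - y)
threshold = residual.mean() + 3 * residual.std()      # assumed 3-sigma rule
print("suspected anomalies:", np.flatnonzero(residual > threshold).size)
```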
10. Long Short-Term Memory Recurrent Neural Network-Based Acoustic Model Using Connectionist Temporal Classification on a Large-Scale Training Corpus (Cited by: 8)
Authors: Donghyun Lee, Minkyu Lim, Hosung Park, Yoseb Kang, Jeong-Sik Park, Gil-Jin Jang, Ji-Hwan Kim. China Communications (SCIE, CSCD), 2017, Issue 9, pp. 23-31 (9 pages)
A Long Short-Term Memory (LSTM) Recurrent Neural Network (RNN) has driven tremendous improvements on an acoustic model based on the Gaussian Mixture Model (GMM). However, these models based on a hybrid method require a forced-aligned Hidden Markov Model (HMM) state sequence obtained from the GMM-based acoustic model. Therefore, it requires a long computation time for training both the GMM-based acoustic model and a deep learning-based acoustic model. In order to solve this problem, an acoustic model using the CTC algorithm is proposed. The CTC algorithm does not require the GMM-based acoustic model because it does not use the forced-aligned HMM state sequence. However, previous works on an LSTM RNN-based acoustic model using CTC used a small-scale training corpus. In this paper, the LSTM RNN-based acoustic model using CTC is trained on a large-scale training corpus and its performance is evaluated. The implemented acoustic model has a performance of 6.18% and 15.01% in terms of Word Error Rate (WER) for clean speech and noisy speech, respectively. This is similar to the performance of the acoustic model based on the hybrid method.
Keywords: acoustic model; connectionist temporal classification; large-scale training corpus; long short-term memory recurrent neural network
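The point of CTC here is to avoid the forced-aligned HMM state sequence: the loss marginalizes over all alignments between the frame-level outputs and the label sequence. A minimal PyTorch sketch of an LSTM acoustic model trained with nn.CTCLoss follows; the feature dimension, label inventory, and random batch are stand-ins, not the paper's corpus or topology.

```python
# Sketch: bidirectional LSTM over acoustic frames trained with CTC loss.
# Class 0 is reserved as the CTC blank symbol.
import torch
import torch.nn as nn

n_feats, n_classes, hidden = 40, 30, 128   # e.g., 40 filterbank features, 29 labels + blank

class CTCAcousticModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(n_feats, hidden, num_layers=2,
                            bidirectional=True, batch_first=True)
        self.proj = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                  # x: (batch, time, n_feats)
        out, _ = self.lstm(x)
        return self.proj(out).log_softmax(dim=-1)

model = CTCAcousticModel()
ctc = nn.CTCLoss(blank=0)

batch, T, U = 4, 200, 25                   # frames per utterance, labels per utterance
feats = torch.randn(batch, T, n_feats)     # stand-in acoustic features
targets = torch.randint(1, n_classes, (batch, U))
input_lengths = torch.full((batch,), T, dtype=torch.long)
target_lengths = torch.full((batch,), U, dtype=torch.long)

log_probs = model(feats).permute(1, 0, 2)  # CTCLoss expects (time, batch, classes)
loss = ctc(log_probs, targets, input_lengths, target_lengths)
loss.backward()
print("CTC loss:", float(loss))
```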
11. Min-Nan (Hokkien) Lip-Reading Recognition Based on Transformer-LSTM
Authors: 曾蔚, 罗仙仙, 王鸿伟. 泉州师范学院学报 (Journal of Quanzhou Normal University), 2024, Issue 2, pp. 10-17 (8 pages)
To address end-to-end sentence-level lip-reading recognition for Min-Nan (Hokkien), an encoder-decoder model based on the Transformer and the long short-term memory (LSTM) network is proposed. The encoder uses a spatiotemporal convolutional neural network together with a Transformer encoder to extract spatiotemporal features from the lip-reading sequence, while the decoder uses an LSTM network combined with a cross-attention mechanism to predict the text sequence. Finally, experiments are conducted on a self-built Min-Nan lip-reading dataset. The results show that the model effectively improves the accuracy of lip-reading recognition.
Keywords: lip-reading recognition; Min-Nan (Hokkien); Transformer; long short-term memory (LSTM) network; spatiotemporal convolutional neural network; attention mechanism; end-to-end model
12. Application of a Prophet-LSTM Combined Model to Transport Aviation Incident Prediction
Authors: 杜红兵, 邢梦柯, 赵德超. 安全与环境学报 (Journal of Safety and Environment) (CAS, CSCD, PKU Core), 2024, Issue 5, pp. 1878-1885 (8 pages)
To accurately forecast the incident rate per 10,000 flight hours of China's transport aviation, a forecasting method combining a time-series model with a neural network model is proposed. First, a Prophet model is built from the incident-rate data for January 2008 to December 2020 and fitted with RStudio to obtain the linear part of the series. Second, a long short-term memory (LSTM) network is used to model the nonlinear part. Finally, the Prophet-LSTM combined model is built with the inverse-variance weighting method, used to forecast the incident rate for January-December 2021, and the forecasts are compared against the observed values. The results show that the E_MA, E_MAP, and E_RMS (mean absolute, mean absolute percentage, and root-mean-square errors) of the Prophet-LSTM combined model are 0.0973, 16.1285%, and 0.1287, respectively. Compared with the existing auto-regressive integrated moving average (ARIMA) + back-propagation neural network (BPNN) combined model and the GM(1,1) + ARIMA + LSTM combined model, the E_MA, E_MAP, and E_RMS are reduced by 0.0259, 10.4874 percentage points, and 0.0143, and by 0.0128, 2.0599 percentage points, and 0.0086, respectively, verifying that the Prophet-LSTM combined model is more accurate and performs better.
Keywords: safety social engineering; transport aviation incident; Prophet model; long short-term memory (LSTM) network model; combined forecasting model
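The abstract combines the Prophet forecast (linear part) and the LSTM forecast (nonlinear part) by the inverse-variance method: each model is weighted in proportion to the reciprocal of its error variance. The exact procedure is not given in the abstract, so the sketch below only illustrates that weighting arithmetic with made-up numbers.

```python
# Sketch: inverse-variance weighting of two forecasts (Prophet-like and LSTM-like).
# Weights w_i = (1 / var_i) / sum_j (1 / var_j), applied to the individual forecasts.
import numpy as np

# Hypothetical validation errors of each model on the same held-out months.
errors_prophet = np.array([0.12, -0.08, 0.10, -0.05, 0.07])
errors_lstm    = np.array([0.05, -0.04, 0.06, -0.03, 0.02])

inv_var = np.array([1.0 / errors_prophet.var(), 1.0 / errors_lstm.var()])
weights = inv_var / inv_var.sum()

# Hypothetical next-month forecasts from each model.
forecast_prophet, forecast_lstm = 0.52, 0.47
combined = weights @ np.array([forecast_prophet, forecast_lstm])
print("weights:", weights, "combined forecast:", combined)
```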
13. Rolling Prediction of Ship Material Costs Based on an LSTM Model
Authors: 潘燕华, 李公卿, 王平. 造船技术, 2024, Issue 3, pp. 71-77 (7 pages)
Ship construction has a long build cycle and a large share of material cost, and it is easily affected by multiple factors such as commodity price indices and exchange rates, so the actual completion cost can deviate considerably from the quoted estimate. Grey correlation analysis (GCA) is used to identify the factors influencing material cost, and a rolling prediction model for ship material cost is built on a long short-term memory (LSTM) network. The model is tested on 63 months of material cost data and the corresponding influencing-factor data for 53 bulk carriers of 64,000 t from a shipbuilding enterprise. The results show that the error between the predicted and actual data is within an acceptable range, demonstrating the effectiveness of the chosen method and the constructed model. The findings have practical significance for real-time cost prediction and control in the manufacturing process.
Keywords: ship; material cost; rolling prediction; long short-term memory network model; grey correlation analysis
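Grey correlation analysis (GCA) is used here to rank candidate cost drivers before the LSTM model is built. One standard way to compute the grey relational grade of each factor against the reference (cost) series is sketched below; the min-max normalization, the resolution coefficient of 0.5, and the toy data are textbook defaults, not details from the paper.

```python
# Sketch: grey relational grades of candidate influencing factors vs. a reference
# series (material cost), with min-max normalization and resolution rho = 0.5.
import numpy as np

def grey_relational_grade(reference, factors, rho=0.5):
    data = np.column_stack([reference, factors])
    norm = (data - data.min(axis=0)) / (data.max(axis=0) - data.min(axis=0))
    ref, cmp_ = norm[:, :1], norm[:, 1:]
    diff = np.abs(cmp_ - ref)                      # absolute difference series
    coeff = (diff.min() + rho * diff.max()) / (diff + rho * diff.max())
    return coeff.mean(axis=0)                      # one grade per factor

cost = np.array([4.1, 4.4, 4.9, 5.3, 5.8, 6.0])              # toy cost series
factors = np.array([[1.0, 98.0], [1.1, 97.0], [1.3, 99.0],    # toy commodity index
                    [1.4, 101.0], [1.6, 103.0], [1.7, 102.0]]) # and exchange rate
print("grey relational grades:", grey_relational_grade(cost, factors))
```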
14. LSTM-DPPO based deep reinforcement learning controller for path following optimization of unmanned surface vehicle (Cited by: 1)
Authors: XIA Jiawei, ZHU Xufang, LIU Zhong, XIA Qingtao. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2023, Issue 5, pp. 1343-1358 (16 pages)
To solve the path following control problem for unmanned surface vehicles (USVs), a control method based on deep reinforcement learning (DRL) with long short-term memory (LSTM) networks is proposed. A distributed proximal policy optimization (DPPO) algorithm, which is a modified actor-critic-based type of reinforcement learning algorithm, is adapted to improve the controller performance in repeated trials. The LSTM network structure is introduced to solve the strongly temporally correlated USV control problem. In addition, a specially designed path dataset, including straight and curved paths, is established to simulate various sailing scenarios so that the reinforcement learning controller can obtain as much handling experience as possible. Extensive numerical simulation results demonstrate that the proposed method has better control performance under missions involving complex maneuvers than controllers trained with limited scenarios and can potentially be applied in practice.
Keywords: unmanned surface vehicle (USV); deep reinforcement learning (DRL); path following; path dataset; proximal policy optimization; long short-term memory (LSTM)
15. Short-Term Forecasting of Sea Surface Temperature in the Equatorial Pacific Using a Long Short-Term Memory (LSTM) Network
Authors: 张桃, 林鹏飞, 刘海龙, 郑伟鹏, 王鹏飞, 徐天亮, 李逸文, 刘娟, 陈铖. 大气科学 (Chinese Journal of Atmospheric Sciences) (CSCD, PKU Core), 2024, Issue 2, pp. 745-754 (10 pages)
Sea surface temperature (SST), one of the most important ocean variables, strongly affects global climate and marine ecosystems, so forecasting SST is highly necessary. Deep learning offers efficient data processing, yet studies that use it for short-term SST forecasts over the whole equatorial Pacific, and of the attainable forecast skill, remain scarce. Based on daily mean SST from the Optimum Interpolation SST (OISST) dataset, a long short-term memory (LSTM) network is used to build a day-by-day forecast model of SST over the equatorial Pacific (10°S-10°N, 120°E-80°W) for the next 10 days. The LSTM model is trained on observations from 1982-2010, and observations from 2011-2020 serve as initial conditions for forecasting and evaluation. The results show that the forecast root-mean-square error (RMSE) in the eastern equatorial Pacific is larger than in the central and western parts: the day-1 RMSE is about 0.6°C in the east and below 0.3°C in the central and western regions. Across phases of interannual variability, the forecast RMSE is largest during La Niña episodes, next largest in normal years, and smallest during El Niño episodes; the RMSE during La Niña can be about 20% larger than during El Niño. The forecast bias is generally positive in the east and negative in the west. Forecast skill is best in the central region, where the predictable lead time generally exceeds 10 days; near the equatorial cold tongue it is 4-7 days, and in parts of the western equatorial Pacific it is 3 days. The model's skill in the eastern equatorial Pacific is generally lower than in the west in every month, and skill is lowest in October and November for all regions. Overall, the LSTM-based SST forecast model captures the temporal evolution of SST well and performs well in the different cases. Being data-driven, it can quickly and skillfully forecast the short-term variation of daily mean SST up to 10 days ahead.
Keywords: sea surface temperature; LSTM (long short-term memory); short-term forecast; equatorial Pacific
16. Bi-LSTM-Based Detection of Shallow Underground Double Cavities
Authors: 梁靖, 张红, 叶晨, 周立成, 刘泽佳, 汤立群. 合肥工业大学学报(自然科学版) (Journal of Hefei University of Technology, Natural Science) (CAS, PKU Core), 2024, Issue 6, pp. 778-783 (6 pages)
This paper explores a deep-learning-based technique for detecting shallow underground cavities, which pose a serious threat to the safety of pile foundation construction. Based on the principle of the shallow seismic reflection method, pile-hammer impacts during foundation construction are used as the excitation source, a small number of accelerometers placed on the ground surface of the survey area collect the cavity reflection signals, and the reflected signals are fed into a deep learning model that outputs the cavity information, establishing a new intelligent cavity detection method. The results show that a bidirectional long short-term memory (Bi-LSTM) prediction model achieves high recognition accuracy for the underground double-cavity case: with a tolerance of 2 m, the prediction accuracy of cavity position and diameter reaches 95.3%. The study verifies the feasibility of deep-learning-based multi-cavity detection and is expected to provide technical support for assessing soil and geological conditions before construction.
Keywords: underground cavity detection; pile-hammer excitation; deep learning; bidirectional long short-term memory (Bi-LSTM) neural network; finite element simulation
17. Navigation jamming signal recognition based on long short-term memory neural networks (Cited by: 3)
Authors: FU Dong, LI Xiangjun, MOU Weihua, MA Ming, OU Gang. Journal of Systems Engineering and Electronics (SCIE, EI, CSCD), 2022, Issue 4, pp. 835-844 (10 pages)
This paper introduces the time-frequency analyzed long short-term memory (TF-LSTM) neural network method for jamming signal recognition over the Global Navigation Satellite System (GNSS) receiver. The method introduces the long short-term memory (LSTM) neural network into the recognition algorithm and combines time-frequency (TF) analysis for signal preprocessing. Five kinds of navigation jamming signals, including white Gaussian noise (WGN), pulse jamming, sweep jamming, audio jamming, and spread spectrum jamming, are used as input for training and recognition. Since the signal parameters and quantity are unknown in the actual scenario, this work builds a data set containing jamming of multiple kinds and parameters to train the TF-LSTM. The performance of this method is evaluated by simulations and experiments. The method has higher recognition accuracy and better robustness than the existing methods, such as LSTM and the convolutional neural network (CNN).
Keywords: satellite navigation; jamming recognition; time-frequency (TF) analysis; long short-term memory (LSTM)
18. Adaptive Deep Learning Model for Software Bug Detection and Classification
Authors: S. Sivapurnima, D. Manjula. Computer Systems Science & Engineering (SCIE, EI), 2023, Issue 5, pp. 1233-1248 (16 pages)
Software bugs are unavoidable in software development and maintenance. In the literature, many methods are discussed which fail to achieve efficient software bug detection and classification. In this paper, an efficient Adaptive Deep Learning Model (ADLM) is developed for automatic duplicate bug report detection and classification. The proposed ADLM is a combination of Conditional Random Fields decoding with Long Short-Term Memory (CRF-LSTM) and the Dingo Optimizer (DO). In the CRF, the DO is used to choose efficient weight values in the network. The proposed automatic bug report detection proceeds in three stages: pre-processing, feature extraction, and bug detection with classification. Initially, the bug report input dataset is gathered from an online source system. In the pre-processing phase, unwanted information is removed from the input data by cleaning text, converting data types, and replacing null values. The pre-processed data is sent to the feature extraction phase, in which four types of features are extracted: contextual, categorical, temporal, and textual. Finally, the features are sent to the proposed ADLM for automatic duplicate bug report detection and classification. The proposed methodology proceeds in two phases, training and testing, through which the bugs are detected and classified from the input data. The proposed technique is assessed by analyzing performance metrics such as accuracy, precision, recall, F-measure, and kappa.
Keywords: software bug detection; classification; pre-processing; feature extraction; deep belief neural network; long short-term memory
19. Bi-LSTM-Based Deep Stacked Sequence-to-Sequence Autoencoder for Forecasting Solar Irradiation and Wind Speed
Authors: Neelam Mughees, Mujtaba Hussain Jaffery, Abdullah Mughees, Anam Mughees, Krzysztof Ejsmont. Computers, Materials & Continua (SCIE, EI), 2023, Issue 6, pp. 6375-6393 (19 pages)
Wind and solar energy are two popular forms of renewable energy used in microgrids, facilitating the transition towards net-zero carbon emissions by 2050. However, they are exceedingly unpredictable since they rely highly on weather and atmospheric conditions. In microgrids, smart energy management systems, such as integrated demand response programs, are permanently established on a step-ahead basis, which means that accurate forecasting of wind speed and solar irradiance intervals is becoming increasingly crucial to the optimal operation and planning of microgrids. With this in mind, a novel bidirectional long short-term memory network (Bi-LSTM)-based, deep stacked, sequence-to-sequence autoencoder (S2SAE) forecasting model for predicting short-term solar irradiation and wind speed was developed and evaluated in MATLAB. To create a deep stacked S2SAE prediction model, a deep Bi-LSTM-based encoder and decoder are stacked on top of one another to reduce the dimension of the input sequence, extract its features, and then reconstruct it to produce the forecasts. Hyperparameters of the proposed deep stacked S2SAE forecasting model were optimized using the Bayesian optimization algorithm. Moreover, the forecasting performance of the proposed Bi-LSTM-based deep stacked S2SAE model was compared to three other deep and shallow stacked S2SAEs, i.e., the LSTM-based deep stacked S2SAE model, the gated recurrent unit-based deep stacked S2SAE model, and the Bi-LSTM-based shallow stacked S2SAE model. All these models were also optimized and modeled in MATLAB. The results simulated based on actual data confirmed that the proposed model outperformed the alternatives by achieving an accuracy of up to 99.7%, which evidenced the high reliability of the proposed forecasting.
Keywords: deep stacked autoencoder; sequence-to-sequence autoencoder; bidirectional long short-term memory network; wind speed forecasting; solar irradiation forecasting
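The forecasting model above is a stacked sequence-to-sequence autoencoder whose encoder and decoder are both Bi-LSTMs; the paper implements it in MATLAB. Purely to illustrate the architecture pattern, an analogous Keras sketch follows, with the layer sizes, sequence length, and repeat-vector bottleneck as assumptions.

```python
# Sketch: Bi-LSTM sequence-to-sequence autoencoder. The encoder compresses the
# input window into a latent vector; the decoder expands it back to a sequence.
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Bidirectional, LSTM, RepeatVector,
                                     TimeDistributed, Dense)

timesteps, n_features, latent = 24, 1, 32   # e.g., 24 hourly wind-speed samples

model = Sequential([
    Input(shape=(timesteps, n_features)),
    # encoder: stacked Bi-LSTMs down to a latent vector
    Bidirectional(LSTM(64, return_sequences=True)),
    Bidirectional(LSTM(latent)),
    # bridge: repeat the latent vector once per output step
    RepeatVector(timesteps),
    # decoder: stacked Bi-LSTMs back to a full sequence
    Bidirectional(LSTM(latent, return_sequences=True)),
    Bidirectional(LSTM(64, return_sequences=True)),
    TimeDistributed(Dense(n_features)),
])
model.compile(optimizer="adam", loss="mse")
model.summary()
```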
20. Modeling of the Desulfurization Process Based on a Deep AttLSTM Network
Authors: 刘泉伯, 李晓理, 王康. 北京工业大学学报 (Journal of Beijing University of Technology) (CAS, CSCD, PKU Core), 2024, Issue 2, pp. 140-151 (12 pages)
Flue gas desulfurization is a complex industrial process with strong dynamic nonlinearity and long time delays. To address the modeling of this process, a deep attention mechanism-based long short-term memory (AttLSTM) network is designed, and an autoencoder built on this network is used to detect abnormal points in the desulfurization process. The paper is the first to use an AttLSTM autoencoder for outlier detection in the desulfurization process, and the first to apply this network model to the identification of the process. More broadly, the paper attempts to identify a complex system with a deep learning model; the proposed AttLSTM network has not previously appeared in the system identification field, so it enriches the choice of identification models and provides a reference for applying artificial intelligence techniques in system identification and control. Experimental results show that, compared with desulfurization process modeling methods in the previous literature, the proposed method performs better on various performance metrics, demonstrating the effectiveness of the deep AttLSTM network in the desulfurization setting.
Keywords: wet flue gas desulfurization; process modeling; long short-term memory network; attention mechanism; autoencoder; air pollution