
Structural seismic response prediction based on long short-term memory network
Cited by: 7
Abstract: To provide continuous, stable, accurate, and real-time response inputs for the semi-active, active, and intelligent control systems of building structures under earthquake action, a seismic response prediction method based on a long short-term memory (LSTM) deep learning framework is proposed. Three seismic acceleration records with different data characteristics are applied to a finite element model of a steel beam to build a response database; the LSTM deep learning framework is then constructed, trained, and its parameters optimized, and the trained model is tested on this data. Finally, the framework is applied to predict the top-floor displacement response of a frame-shear wall structure model, and the predictions are compared with experimental data. The results show that the LSTM-based seismic response prediction strategy achieves high accuracy and stability under different external excitations and can supply accurate dynamic responses to the control system in advance, which is conducive to realizing online real-time vibration reduction control of engineering structures.
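The record above gives only the abstract, so as a rough illustration of the kind of model it describes, the following is a minimal sketch of a sequence-to-one LSTM that maps a window of ground-acceleration samples to the structural displacement at the next time step. It uses Keras with synthetic placeholder data; the window length, layer sizes, and training settings are assumptions made for illustration, not the configuration reported in the paper.

    # Minimal sketch (assumed configuration): an LSTM that maps a sliding window of
    # ground-acceleration samples to the structural response at the next time step.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers

    WINDOW = 50  # length of the input acceleration window (assumed value)

    def make_windows(acc, resp, window=WINDOW):
        # Slice an acceleration record and the measured response into
        # (acceleration window -> next response sample) training pairs.
        X, y = [], []
        for i in range(len(acc) - window):
            X.append(acc[i:i + window])
            y.append(resp[i + window])
        return np.array(X)[..., np.newaxis], np.array(y)

    # Placeholder arrays standing in for the steel-beam response database.
    acc = np.random.randn(5000).astype("float32")
    resp = np.random.randn(5000).astype("float32")
    X, y = make_windows(acc, resp)

    # Stacked LSTM regressor; layer sizes and epochs are illustrative assumptions.
    model = tf.keras.Sequential([
        layers.Input(shape=(WINDOW, 1)),
        layers.LSTM(64, return_sequences=True),
        layers.LSTM(32),
        layers.Dense(1),  # predicted displacement at the next time step
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=10, batch_size=64, validation_split=0.2, verbose=0)

    # One-step-ahead prediction for a new excitation window.
    pred = model.predict(X[:1], verbose=0)

In an online control setting, the same one-step-ahead call would be repeated as each new acceleration sample arrives, so a predicted response is available to the semi-active or active control system slightly ahead of the measured one.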
Authors: Gao Jing-wei (高经纬); Zhang Chun-tao (张春涛) (Hubei Key Laboratory of Roadway Bridge and Structure Engineering, Wuhan University of Technology, Wuhan 430070, China; College of Civil Engineering and Architecture, Southwest University of Science and Technology, Mianyang 621010, China)
Source: Earthquake Resistant Engineering and Retrofitting (《工程抗震与加固改造》, PKU Core Journal), 2020, No. 3, pp. 130-136 (7 pages)
Funding: National Natural Science Foundation of China (51568058); National Key Technology R&D Program of China (2015BAL03B03); Science and Technology Program of the Tibet Autonomous Region (CGZH2018000014).
Keywords: deep learning; long short-term memory (LSTM) network; structural seismic response; time series prediction

相关作者

内容加载中请稍等...

相关机构

内容加载中请稍等...

相关主题

内容加载中请稍等...

浏览历史

内容加载中请稍等...
;
使用帮助 返回顶部