
Density Limit Disruption Prediction of EAST Based on Neural Network (Cited by: 1)
Abstract: To address the lack of research on density limit disruption prediction for the Experimental Advanced Superconducting Tokamak (EAST), 972 density limit disruptive pulses were selected, according to the basic characteristics of density limit disruptions, from EAST discharge data of 2014 to 2019. Thirteen diagnostic signals were chosen as input features. A Multi-Layer Perceptron (MLP) and a Long Short-Term Memory (LSTM) network were used as models, with disruption probability as the output, to build disruption predictors. The experimental results show that, for density limit disruptive pulses, the successful prediction rate of the LSTM (around 95%) is higher than that of the MLP (around 85%) under different alarming times; for non-disruptive pulses, the false prediction rate is around 8% for both models. The LSTM thus substantially outperforms the MLP, demonstrating the feasibility of building an EAST density limit disruption prediction system with neural networks and of improving the response performance of the disruption avoidance and mitigation system.
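The predictor described in the abstract maps 13 diagnostic signals to a disruption probability, which is then compared against an alarm threshold. A minimal, illustrative sketch of such a predictor in plain Python is given below; the hidden-layer width, the random weights, and the 0.5 alarm threshold are assumptions for illustration only, not the paper's trained MLP/LSTM models or hyperparameters:

```python
import math
import random

N_FEATURES = 13        # number of diagnostic signals used as input (from the abstract)
N_HIDDEN = 8           # illustrative hidden-layer width (assumption, not from the paper)
ALARM_THRESHOLD = 0.5  # hypothetical alarm threshold on the disruption probability

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mlp_disruption_probability(features, w_hidden, b_hidden, w_out, b_out):
    """One-hidden-layer MLP forward pass: 13 features -> disruption probability in (0, 1)."""
    hidden = [math.tanh(sum(w * f for w, f in zip(w_row, features)) + b)
              for w_row, b in zip(w_hidden, b_hidden)]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)

# Illustrative random weights; a real predictor would be trained on the 972 pulses.
rng = random.Random(0)
w_hidden = [[rng.uniform(-1, 1) for _ in range(N_FEATURES)] for _ in range(N_HIDDEN)]
b_hidden = [rng.uniform(-1, 1) for _ in range(N_HIDDEN)]
w_out = [rng.uniform(-1, 1) for _ in range(N_HIDDEN)]
b_out = rng.uniform(-1, 1)

features = [rng.uniform(0, 1) for _ in range(N_FEATURES)]  # stand-in diagnostic sample
p = mlp_disruption_probability(features, w_hidden, b_hidden, w_out, b_out)
alarm = p > ALARM_THRESHOLD
print(f"disruption probability = {p:.3f}, alarm = {alarm}")
```

The LSTM variant differs in that it consumes the diagnostic signals as a time sequence and carries a recurrent state between time steps, which the paper's results suggest is what improves the successful prediction rate.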
Authors: CHEN Jun-Jie; HU Wen-Hui; XIAO Jian-Yuan; GUO Bi-Hao; XIAO Bing-Jia (Department of Engineering and Applied Physics, School of Physics Sciences, University of Science and Technology of China, Hefei 230026, China; Institute of Plasma Physics, Hefei Institutes of Physical Science, Chinese Academy of Sciences, Hefei 230031, China)
Source: Computer Systems & Applications, 2020, No. 11, pp. 21-28.
Funding: National Key R&D Program of China (2016YFA0400600, 2016YFA0400601, 2016YFA0400602); National Natural Science Foundation of China (11775219, 11575186); President's Fund of Hefei Institutes of Physical Science, Chinese Academy of Sciences (YZJJ2020QN11).
Keywords: tokamak; density limit disruption prediction; MLP; LSTM; machine learning