
Transient Assessment of Power Systems Based on Bi-LSTM-Attention and Rough Set
(基于粗糙集和Bi-LSTM-Attention网络的电力系统暂态评估; cited by: 3)
Abstract: To reduce the influence of unstable factors on the transient assessment of modern, complex power systems, this paper proposes a transient stability assessment model built on a fuzzy-neighborhood single-parameter rough set and a bidirectional long short-term memory network with an attention mechanism (Rs-Bi-LSTM-Attention). First, the fuzzy-neighborhood single-parameter rough set performs attribute reduction on the power-flow data of the power system, which preserves accuracy while increasing the weight of the core factors. Then the Bi-LSTM-Attention network establishes a mapping between the reduced data and the transient stability state of the power system; Layer Normalization is introduced to process the inputs of the higher network layers, so that those layers not only adapt to updates of the lower-layer parameters but also converge faster. Finally, the model is evaluated with two indicators, assessment accuracy and the F1-measure. Case studies on the IEEE 39-bus system show that the Rs-Bi-LSTM-Attention model outperforms machine learning models and some deep learning models.
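The abstract's two technical steps can be made concrete. First, attribute reduction: the paper's fuzzy-neighborhood single-parameter variant is not detailed in this record, so the sketch below shows only the generic greedy neighborhood-rough-set reduction it builds on; the neighborhood radius `delta`, the dependency measure, and the stopping rule are illustrative assumptions, not the authors' method.

```python
# Hedged sketch of greedy attribute reduction with a plain neighborhood
# rough set (an assumed stand-in for the paper's fuzzy-neighborhood,
# single-parameter variant). Features are assumed scaled to [0, 1].
import numpy as np

def dependency(X, y, attrs, delta=0.2):
    """Fraction of samples whose delta-neighborhood on `attrs` is label-pure
    (the size of the positive region, i.e. the dependency degree gamma)."""
    Xa = X[:, attrs]
    pure = 0
    for i in range(len(Xa)):
        d = np.linalg.norm(Xa - Xa[i], axis=1)   # distances on kept attributes
        if np.all(y[d <= delta] == y[i]):        # neighborhood agrees with label
            pure += 1
    return pure / len(Xa)

def greedy_reduct(X, y, delta=0.2):
    remaining = list(range(X.shape[1]))
    reduct, best = [], 0.0
    while remaining:
        gamma, a = max((dependency(X, y, reduct + [a], delta), a)
                       for a in remaining)
        if gamma <= best:            # no attribute raises the dependency: stop
            return reduct
        reduct.append(a)
        remaining.remove(a)
        best = gamma
    return reduct
```

Second, the classifier. Below is a minimal PyTorch sketch of a Bi-LSTM with additive attention and Layer Normalization applied to the inputs of the higher layers; the layer sizes, the single-layer additive attention, and the exact placement of the LayerNorm are assumptions, since the record does not give the architecture's dimensions.

```python
# Hedged sketch of the Rs-Bi-LSTM-Attention classifier head (assumed
# dimensions; not the authors' implementation).
import torch
import torch.nn as nn

class BiLSTMAttention(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64, n_classes: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden,
                            batch_first=True, bidirectional=True)
        # LayerNorm on the Bi-LSTM outputs, so the higher layers see
        # normalized inputs (the convergence aid the abstract describes).
        self.norm = nn.LayerNorm(2 * hidden)
        self.attn = nn.Linear(2 * hidden, 1)   # additive attention scores
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                       # x: (batch, time, features)
        h, _ = self.lstm(x)                     # (batch, time, 2*hidden)
        h = self.norm(h)
        w = torch.softmax(self.attn(h), dim=1)  # attention weights over time
        context = (w * h).sum(dim=1)            # weighted sum of hidden states
        return self.fc(context)                 # stable / unstable logits

model = BiLSTMAttention(n_features=10)          # 10 reduced attributes (assumed)
print(model(torch.randn(8, 20, 10)).shape)      # torch.Size([8, 2])
```

The two reported indicators are standard: assessment accuracy, and the F1-measure F1 = 2PR / (P + R), where P and R are the precision and recall on the unstable class.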
Authors: WANG Chen-yu (王晨宇), WANG Xi-huai (王锡淮), XIAO Jian-mei (肖健梅) (Logistics Engineering College, Shanghai Maritime University, Shanghai 201306, China)
Source: Control Engineering of China (《控制工程》; CSCD, Peking University Core Journal), 2022, Issue 2, pp. 330-338 (9 pages)
Funding: National Natural Science Foundation of China (71771143)
Keywords: fuzzy-neighborhood single-parameter rough set; transient assessment of power systems; Layer Normalization; Bi-LSTM; attention mechanism
