
Identification of Switching Operation Based on LSTM and MoE (融合LSTM和MoE的倒闸操作识别)
Abstract: To address the individual differences between different operators performing the same operation, as well as the differences when the same operator performs the same operation at different times, a switching operation recognition method, MoE-LSTM, based on the mixture of experts (MoE) and the long short-term memory (LSTM) network is proposed. LSTMs are integrated under a MoE framework to learn the feature distributions of data from different sources. Acceleration data are collected to build a switching operation dataset, and the action sequences are segmented with a sliding window. The segmented sequences are fed into MoE-LSTM, where different LSTMs independently learn the temporal dependencies of different actions, and a gating network selects the output of the LSTM that best classifies the current input as the recognition result. Simulation results show that, for action data collected at different times and from different operators, each LSTM excels in a particular region of the feature space, and experiments on the switching operation dataset demonstrate that the proposed method outperforms existing action recognition algorithms.
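A minimal sketch of the idea described in the abstract is given below: acceleration streams are cut into overlapping windows, several LSTM experts classify each window independently, and a gating network weights (or selects) the experts' outputs. This is an illustrative assumption, not the authors' published implementation; the input dimension, window length, stride, number of experts, hidden size, and class count are all made-up placeholder values, and the soft gate-weighted mixture shown here could be replaced by picking the single best-scoring expert, as the abstract suggests.

# Minimal MoE-LSTM sketch (PyTorch); names and sizes are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def sliding_windows(signal: torch.Tensor, win_len: int, stride: int) -> torch.Tensor:
    """Cut a (T, C) acceleration stream into overlapping (N, win_len, C) windows."""
    windows = signal.unfold(0, win_len, stride)        # (N, C, win_len)
    return windows.permute(0, 2, 1).contiguous()       # (N, win_len, C)


class MoELSTM(nn.Module):
    """A mixture of LSTM experts with a gating network over the input window."""

    def __init__(self, in_dim=3, hidden=64, n_classes=8, n_experts=4):
        super().__init__()
        self.experts = nn.ModuleList(
            nn.LSTM(in_dim, hidden, batch_first=True) for _ in range(n_experts)
        )
        self.heads = nn.ModuleList(
            nn.Linear(hidden, n_classes) for _ in range(n_experts)
        )
        # Gating network: scores each expert from a summary of the raw window.
        self.gate = nn.Sequential(
            nn.Linear(in_dim, 32), nn.ReLU(), nn.Linear(32, n_experts)
        )

    def forward(self, x):                               # x: (B, T, C)
        gate_w = F.softmax(self.gate(x.mean(dim=1)), dim=-1)   # (B, n_experts)
        # Each expert classifies independently from its final hidden state.
        expert_logits = []
        for lstm, head in zip(self.experts, self.heads):
            _, (h_n, _) = lstm(x)                       # h_n: (1, B, hidden)
            expert_logits.append(head(h_n[-1]))         # (B, n_classes)
        expert_logits = torch.stack(expert_logits, dim=1)      # (B, n_experts, n_classes)
        # Gate-weighted combination; at inference one could instead take the argmax expert.
        return (gate_w.unsqueeze(-1) * expert_logits).sum(dim=1)


if __name__ == "__main__":
    stream = torch.randn(1000, 3)                       # placeholder tri-axial acceleration
    batch = sliding_windows(stream, win_len=128, stride=64)
    logits = MoELSTM()(batch)
    print(batch.shape, logits.shape)                    # (N, 128, 3) (N, 8)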
Authors: Zhang Xiaoqing; Xiao Wanfang; Guo Yingjie; Liu Bowen; Han Xuesen; Ma Jingwei; Gao Gao; Huang He; Xia Shihong (Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190, China; College of Computer, Beijing University of Posts and Telecommunications, Beijing 100876, China; State Grid Beijing Urban Power Supply Company, Beijing 110102, China)
Source: Journal of System Simulation (《系统仿真学报》, CAS / CSCD / PKU Core indexed), 2022, No. 8, pp. 1899-1907 (9 pages)
Funding: National Key R&D Program of China (2020YFF0304701); Science and Technology Project of Beijing Electric Power Company (202021900T7)
Keywords: switching operation; long short-term memory network (LSTM); mixture of experts (MoE); neural network