
Review of Gated Recurrent Neural Networks for Processing Irregular Time Series Data
Abstract: With the development of multi-sensor systems and the continued use of unstructured manual data recording, irregular time series data are becoming increasingly common. Irregular sampling and the resulting missing values severely limit the ability to analyze and model such data for classification and prediction tasks. Traditional methods for processing time series often introduce bias and make strong assumptions about the underlying data-generating process, which can lead to poor predictions. Conventional machine learning and deep learning methods, although still at the forefront of data modeling, are at best hampered by irregular time series datasets and cannot model the temporal irregularity of incomplete series. Gated recurrent neural networks (RNNs), such as the LSTM and GRU, have achieved outstanding results in sequence modeling and have been applied in many domains, such as natural language processing. These models have become a strong choice for time series modeling and an important tool for handling irregular time series data. This paper focuses on two common approaches to processing irregular time series data: imputing missing values in the data preprocessing stage, and modifying the algorithm to handle missing values directly during learning. It aims to introduce the effective techniques that have emerged in this research branch so that researchers can develop new techniques for further processing of irregular time series data.
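The two approaches named in the abstract can be illustrated with a minimal sketch in Python (numpy and pandas). The toy series, the 1-hour resampling grid, and the forward-fill rule below are illustrative assumptions, not the paper's prescribed method; the mask/time-delta construction only sketches the general idea used by GRU-D-style gated RNNs.

```python
import numpy as np
import pandas as pd

# Toy irregularly sampled univariate series: uneven timestamps, some missing values.
times = np.array([0.0, 0.7, 1.5, 3.2, 3.9, 6.0])          # hours since start
values = np.array([1.2, np.nan, 0.8, np.nan, 1.9, 2.4])
series = pd.Series(values, index=pd.to_timedelta(times, unit="h"))

# Approach 1: impute during preprocessing.
# Resample onto a regular 1-hour grid, then forward-fill (one of many possible rules).
regular = series.resample(pd.Timedelta(hours=1)).mean().ffill()

# Approach 2: keep the irregular series and expose missingness to the model.
# Build an observation mask (1 = observed) and the time elapsed since the last
# observation, which GRU-D-style models consume alongside the raw values.
mask = (~np.isnan(values)).astype(float)
delta = np.zeros_like(times)
for i in range(1, len(times)):
    gap = times[i] - times[i - 1]
    delta[i] = gap if mask[i - 1] == 1.0 else gap + delta[i - 1]

# Replace NaNs with the last observed value (the mask still records missingness),
# then stack [value, mask, delta] as per-step input features for an LSTM/GRU.
filled = pd.Series(values).ffill().fillna(0.0).to_numpy()
features = np.stack([filled, mask, delta], axis=-1)   # shape: (6, 3)
print(regular)
print(features)
```

In the first approach the model never sees the irregularity, so the imputation rule can bias results; in the second, the mask and time-delta features let a gated RNN learn how missingness and sampling gaps should influence its hidden state.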
Authors: MA Yonghang (马永航); LIN Zhicheng (林志诚), Xinjiang University, Urumqi 830046, China
Affiliation: Xinjiang University (新疆大学)
Source: MOBILE INFORMATION (《移动信息》), 2023, No. 11, pp. 151-153, 157 (4 pages)
Keywords: Recurrent neural networks; Reinforcement learning; Model matching; Gated recurrent neural networks