
Early time series classification based on masked time attention and confidence loss function
Abstract: Early time series classification (ETSC) is significant for time-sensitive applications. The task aims to classify a continuously arriving time series as early as possible while meeting the expected classification accuracy. Deep learning has been widely applied to ETSC. Existing deep methods usually leverage recurrent neural networks to adapt to the varying length of streaming data and exit the classification process once a classification probability threshold is reached. However, these methods overlook that the critical identification regions of streaming data shift continuously as more information arrives. To address this problem, this paper proposes a temporal convolutional network with a masked time attention mechanism that dynamically focuses on the key identification regions. In addition, since the classification probability of the correct class should be monotonically non-decreasing as the model observes more data, this paper designs a confidence loss function that penalizes models violating this condition, further encouraging the model to extract more discriminative features. Experimental results on eight public datasets demonstrate the superior early classification performance of the proposed method.
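The monotonicity-based confidence loss described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes class probabilities are collected at each prefix length of the input series, and applies a hinge-style penalty to any decrease in the true-class probability between consecutive time steps. The function name and tensor layout are hypothetical.

```python
import numpy as np

def confidence_loss(probs, labels):
    """Sketch of a confidence loss penalizing non-monotonic confidence.

    probs:  (batch, time, classes) — class probabilities after observing
            each prefix of the series.
    labels: (batch,) — true class indices.

    Returns the mean positive part of successive decreases in the
    true-class probability; zero when confidence is non-decreasing.
    """
    batch_idx = np.arange(probs.shape[0])[:, None]   # (batch, 1)
    time_idx = np.arange(probs.shape[1])[None, :]    # (1, time)
    # Probability assigned to the correct class at every time step: (batch, time)
    true_probs = probs[batch_idx, time_idx, labels[:, None]]
    # Keep only decreases between consecutive steps (the violations).
    drops = np.maximum(true_probs[:, :-1] - true_probs[:, 1:], 0.0)
    return drops.mean()
```

In training, a term like this would typically be added to the standard cross-entropy loss with a weighting coefficient, so the network is rewarded for confidence trajectories that only grow as more of the series is observed.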
Authors: CHEN Huiling; ZHANG Ye; TIAN Aosheng; ZHAO Hanxin (College of Electronic Science and Technology, National University of Defense Technology, Changsha 410073, China)
Source: Intelligent Computer and Applications (《智能计算机与应用》), 2023, No. 7, pp. 27-32 (6 pages)
Keywords: early time series classification; masked time attention; confidence loss function; temporal convolutional network