
Research on Black Box Local Interpretative Method Based on Sequence Data Stability
(基于序列数据稳定的黑盒局部解释性方法研究)
Abstract: Machine learning has achieved great success in practical applications, but its inability to give clear explanations for its decisions limits its use in some fields. To address the incomprehensibility of its outputs, researchers have studied the interpretability of machine learning; however, existing interpretability methods designed for images have difficulty producing correct explanations for text data with temporal correlation. To address this problem, this paper proposes DLEMNA, a stable, deterministic local interpretable model-agnostic explanation method for time-series data. A clustering algorithm is used to resolve the instability caused by explanations built on random perturbations, a Fused Lasso constraint is introduced to account for the temporal correlation between features, and a linear model is constructed to compute the important features that influence the decision. Experiments on the 20newsgroups dataset show that the proposed DLEMNA method outperforms the classic LIME method in both fidelity and stability. (A rough illustrative sketch of this pipeline is given after the keyword list below.)
Authors: 邱玫媚 (QIU Meimei), 刘冬梅 (LIU Dongmei), Nanjing University of Science & Technology, Nanjing 210000
Institution: Nanjing University of Science & Technology (南京理工大学)
Source: Computer & Digital Engineering (《计算机与数字工程》), 2022, No. 11, pp. 2509-2514, 2520 (7 pages)
Keywords: sequence data; interpretability; clustering; Fused Lasso; linear model
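
The abstract describes a three-step pipeline: a clustering step that replaces purely random perturbations, a Fused Lasso constraint that ties together temporally adjacent features, and a weighted linear surrogate whose coefficients serve as feature importances. The sketch below illustrates that general idea only and is not the authors' implementation; the function name explain_instance, the use of KMeans cluster centres as the stabilised perturbation set, the Gaussian proximity kernel, the hyperparameter values, and the reliance on numpy, scikit-learn and cvxpy are all assumptions made for illustration.

# Minimal sketch of a DLEMNA-style local surrogate (illustrative only, assuming
# numpy, scikit-learn and cvxpy are available).
import numpy as np
import cvxpy as cp
from sklearn.cluster import KMeans


def explain_instance(x, predict_fn, n_samples=500, n_clusters=20,
                     lam_l1=0.01, lam_fused=0.01, random_state=0):
    """Return surrogate weights explaining predict_fn around instance x.

    x          : 1-D interpretable feature vector (e.g. bag-of-words of one document)
    predict_fn : black-box model, maps an (n, d) array to an (n,) array of scores
    """
    rng = np.random.default_rng(random_state)
    d = x.shape[0]

    # 1) Build a neighbourhood by switching random subsets of features off.
    mask = rng.integers(0, 2, size=(n_samples, d))
    neighbours = mask * x

    # 2) Cluster the neighbourhood and keep only the cluster centres as a
    #    compact, repeatable set of perturbations (the stabilising step).
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=random_state)
    km.fit(neighbours)
    Z = km.cluster_centers_

    # 3) Query the black box and weight each centre by its proximity to x.
    y = predict_fn(Z)
    dist = np.linalg.norm(Z - x, axis=1)
    weights = np.exp(-(dist ** 2) / (2 * (dist.std() + 1e-12) ** 2))

    # 4) Fit a weighted linear surrogate with an L1 penalty plus a Fused Lasso
    #    penalty on differences of adjacent coefficients (temporal smoothness).
    w = cp.Variable(d)
    b = cp.Variable()
    residual = cp.multiply(np.sqrt(weights), Z @ w + b - y)
    objective = (cp.sum_squares(residual)
                 + lam_l1 * cp.norm1(w)
                 + lam_fused * cp.norm1(cp.diff(w)))
    cp.Problem(cp.Minimize(objective)).solve()

    return w.value  # per-feature importance scores

In practice, predict_fn would be the trained text classifier's score for the class being explained, and Z would live in the same interpretable space that LIME uses; the paper's actual clustering strategy, proximity kernel, and hyperparameters are not specified here and may differ.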