
Causal Discovery of Mixed Data Based on Neural Network
Abstract: Causal inference is becoming an increasingly active research topic in machine learning. Current causal discovery work mainly infers the causal direction between variables from purely observational data under a single assumed generating model. The data observed in the real world, however, are often generated under a mixture of such assumptions, which makes traditional causal inference methods less accurate and less stable. To address this problem, a neural-network-based method for causal inference on mixed data is proposed. Under the assumption of the additive noise model mixture model (ANM-MM), gradient descent is used to optimize an improved loss function and obtain the abstract causal distribution parameters of the mixed data. These distribution parameters are then treated as a hidden variable between the cause variable and the effect variable, and the causal direction of the bivariate pair is determined by comparing the Hilbert-Schmidt independence between the cause variable and the distribution parameters. The feasibility of the method is proved theoretically, and experiments show that the proposed algorithm achieves better accuracy and stability on both synthetic and real-world data than the traditional IGCI, ANM, PNL, LiNGAM and SLOPE methods.
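A minimal sketch of the direction-decision step described above, assuming the ANM-MM fitting stage (gradient descent on the improved loss) has already produced per-sample distribution-parameter estimates for both candidate directions. The helper names rbf_kernel, hsic and decide_direction are illustrative, not the authors' code; the direction whose hypothesised cause is more independent of (has lower HSIC with) its recovered parameters is preferred.

import numpy as np

def rbf_kernel(x, sigma=None):
    # Gram matrix of an RBF kernel; sigma defaults to the median heuristic.
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    d2 = (x - x.T) ** 2
    if sigma is None:
        pos = d2[d2 > 0]
        sigma = np.sqrt(0.5 * np.median(pos)) if pos.size else 1.0
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y):
    # Biased empirical Hilbert-Schmidt independence criterion.
    n = len(x)
    K, L = rbf_kernel(x), rbf_kernel(y)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def decide_direction(x, y, params_xy, params_yx):
    # params_xy / params_yx: per-sample parameter estimates from the x -> y and
    # y -> x fits; the smaller cause/parameter dependence indicates the causal direction.
    return "x -> y" if hsic(x, params_xy) < hsic(y, params_yx) else "y -> x"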
Authors: GENG Jia-xing (耿家兴), WAN Ya-ping (万亚平), LI Hong-fei (李洪飞) (School of Computer Science, University of South China, Hengyang 421001, China; CNNC Key Laboratory on High Trusted Computing, Hengyang 421001, China)
Source: Computer Technology and Development (《计算机技术与发展》), 2020, No. 5, pp. 26-31 (6 pages)
Funding: National Natural Science Foundation of China (11805093); Innovation Special Zone Project of the Science and Technology Commission of the Central Military Commission (17-163-15-XJ-002-002-04); Key Project of the Hunan Provincial Department of Education (17A185); Natural Science Foundation of Hunan Province (2019JJ0486).
Keywords: neural network; mixed additive noise; causal inference; gradient descent; Hilbert-Schmidt independence
