
生物医学语义关系抽取方法综述 (Cited by: 9)

A Review of Methods for Semantic Relation Extraction in Biomedical Field
Abstract: Deep learning has made remarkable achievements in natural language processing (NLP) and is bringing a new research paradigm to information extraction in the biomedical field. This study systematically surveys biomedical semantic relation extraction methods and traces their development, providing a foundation and insights for the further application of deep learning. Relevant literature was retrieved from PubMed, Web of Science, and IEEE, together with major shared-task sites such as BioCreative and SemEval; representative extraction methods were then selected and analyzed along four dimensions: purpose, method, dataset, and performance. The methods fall into three stages: knowledge-based, traditional machine learning-based, and deep learning-based. Properly incorporating prior knowledge and domain resources into deep learning models is a promising direction for further improving semantic relation extraction.
Source: Library Tribune (《图书馆论坛》), CSSCI / Peking University Core Journal, 2017, No. 6, pp. 61-69 (9 pages)
Keywords: semantic relation extraction; biomedicine; deep learning; convolutional neural networks; natural language processing
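
Illustrative sketch (not from the article): the keywords point to convolutional neural networks as a representative deep-learning approach to relation extraction. The minimal Python/PyTorch example below, with all names, layer sizes, and label counts chosen hypothetically, shows the general shape of such a sentence-level CNN relation classifier; it is a sketch of the technique, not the authors' implementation.

# Hypothetical, minimal CNN relation classifier (a Kim-2014-style text CNN
# adapted to relation classification); not a model evaluated in the article.
import torch
import torch.nn as nn

class CNNRelationClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, num_filters=128,
                 kernel_sizes=(3, 4, 5), num_relations=5, dropout=0.5):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # One 1-D convolution per window size over the embedded sentence.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes]
        )
        self.dropout = nn.Dropout(dropout)
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_relations)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len); a common convention is to wrap the two
        # candidate entities in marker tokens such as <e1>...</e1>, <e2>...</e2>.
        x = self.embedding(token_ids).transpose(1, 2)            # (batch, embed, seq)
        pooled = [conv(x).relu().max(dim=2).values for conv in self.convs]
        features = self.dropout(torch.cat(pooled, dim=1))
        return self.classifier(features)                         # relation logits

model = CNNRelationClassifier(vocab_size=20000)
logits = model(torch.randint(1, 20000, (2, 40)))                 # 2 sentences, 40 tokens
print(logits.shape)                                              # torch.Size([2, 5])

Incorporating prior knowledge, as the abstract suggests, would typically mean concatenating extra features (for example, entity-type or ontology-derived embeddings) to the embedding layer before the convolutions; that extension is likewise only an assumption here.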
