Abstract
In recent years, neural networks, owing to their strong representational power, have gradually replaced classical machine learning models as the standard paradigm for natural language processing tasks. However, typical neural networks can only handle data in Euclidean space, whereas much of the information in natural language, such as discourse structure, syntax, and even the sentence itself, is naturally represented as graphs. Graph neural networks have therefore attracted wide attention and have been successfully applied in many areas of natural language processing. This paper presents a systematic survey of graph neural networks in natural language processing. It first introduces the core ideas of graph neural networks and reviews three classical families of models: graph recurrent networks, graph convolutional networks, and graph attention networks. It then describes, for specific tasks, how to construct suitable graph structures according to the characteristics of each task and how to apply graph-based representation models to them. The paper argues that, compared with exploring novel graph neural network architectures, investigating how to model the key information of different tasks as graphs is a more general and academically valuable direction for future work.
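A minimal sketch of the graph convolutional network family mentioned above, applied to a toy dependency graph: a sentence's parse is encoded as an adjacency matrix, over which one GCN layer propagates word representations. The function name, the toy graph, and all dimensions below are illustrative assumptions, not taken from the surveyed methods.

```python
# Minimal sketch: one graph convolutional layer, H' = ReLU(D^{-1/2} (A+I) D^{-1/2} H W),
# applied to a toy dependency graph of a 4-word sentence. All names and sizes are illustrative.
import numpy as np

def gcn_layer(A, H, W):
    """One GCN layer: add self-loops, symmetrically normalize, propagate, apply ReLU."""
    A_hat = A + np.eye(A.shape[0])                                 # adjacency with self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]     # D^{-1/2} (A+I) D^{-1/2}
    return np.maximum(A_norm @ H @ W, 0.0)                         # ReLU activation

# Toy undirected dependency graph over 4 words: edges (0-1), (1-2), (2-3).
A = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])
H = np.random.randn(4, 8)          # initial word representations, dimension 8
W = np.random.randn(8, 8)          # layer weight matrix
print(gcn_layer(A, H, W).shape)    # (4, 8): one updated representation per word
```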
Authors
陈雨龙
付乾坤
张岳
CHEN Yulong; FU Qiankun; ZHANG Yue (Zhejiang University, Hangzhou, Zhejiang 310027, China; School of Engineering, Westlake University, Hangzhou, Zhejiang 310024, China; Institute of Advanced Technology, Westlake Institute for Advanced Study, Hangzhou, Zhejiang 310024, China)
Source
《中文信息学报》
CSCD
Peking University Core Journals (北大核心)
2021, No. 3, pp. 1-23 (23 pages)
Journal of Chinese Information Processing
Funding
National Natural Science Foundation of China (61976180).
Keywords
survey
natural language processing
graph neural network