Abstract
Today's society is highly informatized and intelligent, and people's lives have long been inseparable from artificial intelligence: license plate recognition in parking lots, smart filters in beauty cameras, robots, language translation, and so on. Behind these intelligent systems, artificial intelligence plays the key role. Deep learning is a deeper layer of artificial intelligence theory, and natural language processing, the subject of this paper, is a major direction of deep learning as well as an important part of machine learning. Natural language processing (NLP) is the branch of artificial intelligence devoted to the study of human language; its scope covers the languages used in different countries, and it is applied to machine translation, opinion extraction, speech recognition, automatic summarization, text classification, and many other tasks. The main neural network models in NLP include the RNN (recurrent neural network), LSTM (long short-term memory), GRU (gated recurrent unit), and the Transformer. This paper combines the RNN, LSTM, and Transformer models to explain and discuss the development of natural language processing research.
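Purely as an illustrative aside (the cited paper provides no code), the minimal sketch below assumes PyTorch and invents all names, dimensions, and data; it shows how an LSTM, one of the model families listed in the abstract, can be wrapped into a small text classifier. A Transformer-based variant would replace the recurrent layer with an attention-based encoder.

```python
# Minimal illustrative sketch (not from the cited paper): a tiny LSTM text
# classifier in PyTorch, standing in for the RNN/LSTM family named in the
# abstract. Vocabulary size, dimensions, and the random batch are assumptions.
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=256, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embedding(token_ids)      # (batch, seq_len, embed_dim)
        _, (hidden, _) = self.lstm(embedded)      # hidden: (1, batch, hidden_dim)
        return self.fc(hidden[-1])                # (batch, num_classes) logits

# Smoke test on a random batch of 4 sequences of length 20.
model = LSTMClassifier()
dummy_batch = torch.randint(0, 10000, (4, 20))
print(model(dummy_batch).shape)  # torch.Size([4, 2])
```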
Author
Li Huaxu (School of Electronics and Information, Guangxi University for Nationalities, Nanning, Guangxi 530000, China)
Source
《信息记录材料》 (Information Recording Materials)
2021, No. 12, pp. 7-10 (4 pages)