
Research on Named Entity Recognition of Product Quality Inspection Based on Deep Learning
Abstract: Product quality inspection (PQI) is an important task for maintaining market order and safeguarding health and safety. Applying natural language processing to recognize entities in quality-inspection text data enables effective monitoring and control of product quality. To improve named entity recognition on PQI text data, a CNN-BiGRU-CRF model incorporating an attention mechanism is proposed. At the CNN layer, character/word vectors and word-length feature vectors are input jointly to fully capture text features. The attention mechanism focuses on target information features while suppressing useless information; the output sequence is weighted to obtain labeling scores, which improves recognition accuracy. A manually constructed PQI dataset serves as the experimental data, and the model is compared with other models. Experimental results show that the model achieves precision and F1 scores above 74.7% across eight entity types, outperforming other traditional models and achieving good recognition performance on PQI data.
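As a rough illustration of the attention step the abstract describes, namely weighting the encoder's output sequence before producing per-tag labeling scores for the CRF layer, the following NumPy sketch uses invented dimensions, random stand-in hidden states, and a hypothetical 3-tag set; it does not reproduce the paper's actual CNN-BiGRU-CRF implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy dimensions (hypothetical): 4 tokens, hidden size 6, 3 tags (e.g. B/I/O)
rng = np.random.default_rng(0)
T, H, K = 4, 6, 3
hidden = rng.standard_normal((T, H))      # stand-in for BiGRU output states

# Attention over the sequence: each position attends to all positions,
# so informative context is emphasized and noise is down-weighted.
W_q = rng.standard_normal((H, H))
W_k = rng.standard_normal((H, H))
scores = (hidden @ W_q) @ (hidden @ W_k).T / np.sqrt(H)
alpha = softmax(scores, axis=-1)          # (T, T) attention weights
weighted = alpha @ hidden                 # attention-weighted sequence

# Linear projection to per-tag emission scores, which a CRF layer
# would then decode into a globally consistent tag sequence.
W_out = rng.standard_normal((H, K))
emissions = weighted @ W_out              # (T, K) labeling scores

assert np.allclose(alpha.sum(axis=-1), 1.0)  # each row is a distribution
assert emissions.shape == (T, K)
```

The CRF decoding step is omitted here: in the model described, the emission scores above would be combined with learned tag-transition scores and decoded with the Viterbi algorithm.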
Authors: FANG Hong; ZHANG Lan; SU Ming; FENG Yibo (College of Arts and Sciences, Shanghai Polytechnic University, Shanghai 201209, China; College of Mathematics and Statistics, Xinjiang Kashi University, Kashi 844000, Xinjiang, China)
Source: Journal of Shanghai Polytechnic University, 2021, No. 4, pp. 307-312 (6 pages)
Funding: Supported by the Collaborative Innovation Platform Construction Project for Electronic Information Professional Master's Programs (A10GY21F015).
Keywords: natural language processing; named entity recognition; product quality inspection; attention mechanism; conditional random field