Journal Articles: 1 article found
ALBERT with Knowledge Graph Encoder Utilizing Semantic Similarity for Commonsense Question Answering (Cited by: 1)
Authors: Byeongmin Choi, YongHyun Lee, Yeunwoong Kyung, Eunchan Kim. Intelligent Automation & Soft Computing (SCIE), 2023, Issue 4, pp. 71-82 (12 pages).
Recently, pre-trained language representation models such as bidirectional encoder representations from transformers (BERT) have been performing well in commonsense question answering (CSQA). However, these models do not directly use explicit information from external knowledge sources. To address this, additional methods such as the knowledge-aware graph network (KagNet) and the multi-hop graph relation network (MHGRN) have been proposed. In this study, we propose using a recent pre-trained language model, a lite bidirectional encoder representations from transformers (ALBERT), together with a knowledge graph information extraction technique. We also propose applying a novel method, schema graph expansion, to recent language models. We then analyze the effect of applying knowledge graph-based knowledge extraction techniques to recent pre-trained language models and confirm that schema graph expansion is effective to some extent. Furthermore, we show that our proposed model achieves better performance than the existing KagNet and MHGRN models on the CommonsenseQA dataset.
Keywords: commonsense reasoning, question answering, knowledge graph, language representation model
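The abstract describes combining ALBERT with knowledge-graph-based evidence for CommonsenseQA. The paper's own knowledge graph encoder and schema graph expansion are not reproduced here; the following is only a minimal sketch, assuming the HuggingFace Transformers library and the public albert-base-v2 checkpoint, of how ALBERT can score a CommonsenseQA-style question as a multiple-choice problem. The example question and the untrained classification head are illustrative assumptions, so the prediction is meaningless until the model is fine-tuned.

# Minimal sketch (not the paper's implementation): scoring a CommonsenseQA-style
# question with ALBERT as a multiple-choice model via HuggingFace Transformers.
import torch
from transformers import AlbertTokenizer, AlbertForMultipleChoice

tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
# The multiple-choice head is randomly initialized until fine-tuned on CommonsenseQA.
model = AlbertForMultipleChoice.from_pretrained("albert-base-v2")

question = "Where would you put a plate after washing it?"  # toy example, not from the paper
choices = ["cupboard", "restaurant", "ocean", "table", "microwave"]

# Pair the question with each candidate answer; ALBERT scores every pair.
encoding = tokenizer([question] * len(choices), choices, return_tensors="pt", padding=True)
inputs = {k: v.unsqueeze(0) for k, v in encoding.items()}  # shape: (1, num_choices, seq_len)

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_choices)

print("Predicted answer:", choices[logits.argmax(dim=-1).item()])

Approaches like KagNet and MHGRN, as well as the paper's schema graph expansion, would additionally retrieve a subgraph from a knowledge graph (e.g., ConceptNet) for each question-answer pair and fuse its encoding with the language model's representation before scoring.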