Enhancing low-resource cross-lingual summarization from noisy data with fine-grained reinforcement learning (Cited: 1)
Authors: Yuxin HUANG, Huailing GU, Zhengtao YU, Yumeng GAO, Tong PAN, Jialong XU
Frontiers of Information Technology & Electronic Engineering (SCIE, EI, CSCD), 2024, Issue 1, pp. 121-134 (14 pages)
Abstract: Cross-lingual summarization (CLS) is the task of generating a summary in a target language from a document in a source language. Recently, end-to-end CLS models have achieved impressive results using large-scale, high-quality datasets typically constructed by translating monolingual summary corpora into CLS corpora. However, due to the limited performance of low-resource language translation models, translation noise can seriously degrade the performance of these models. In this paper, we propose a fine-grained reinforcement learning approach to address low-resource CLS based on noisy data. We introduce the source language summary as a gold signal to alleviate the impact of the translated noisy target summary. Specifically, we design a reinforcement reward by calculating the word correlation and word missing degree between the source language summary and the generated target language summary, and combine it with cross-entropy loss to optimize the CLS model. To validate the performance of our proposed model, we construct Chinese-Vietnamese and Vietnamese-Chinese CLS datasets. Experimental results show that our proposed model outperforms the baselines in terms of both the ROUGE score and BERTScore.
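The abstract does not give the exact formulas for the reward, so the following is only an illustrative sketch of the general idea: score the generated summary against the source-language summary by token overlap ("word correlation") and by how many reference tokens go unmentioned ("word missing degree"), then mix the resulting reward with cross-entropy loss. All function names and the mixing weight `alpha` are hypothetical, not taken from the paper.

```python
def word_correlation(source_tokens, generated_tokens):
    # Fraction of generated tokens that also appear in the source summary
    # (a crude precision-style overlap signal).
    if not generated_tokens:
        return 0.0
    src = set(source_tokens)
    return sum(t in src for t in generated_tokens) / len(generated_tokens)

def word_missing_degree(source_tokens, generated_tokens):
    # Fraction of source-summary tokens absent from the generated summary
    # (a crude recall-style "missing content" signal).
    if not source_tokens:
        return 0.0
    gen = set(generated_tokens)
    return sum(t not in gen for t in source_tokens) / len(source_tokens)

def combined_loss(ce_loss, source_tokens, generated_tokens, alpha=0.5):
    # Reward rises with word correlation and falls with word missing degree;
    # the final objective interpolates cross-entropy with (1 - reward),
    # so a better-aligned summary lowers the total loss.
    reward = (word_correlation(source_tokens, generated_tokens)
              - word_missing_degree(source_tokens, generated_tokens))
    return (1 - alpha) * ce_loss + alpha * (1.0 - reward)
```

In practice such a reward would be applied per training example (e.g. via a policy-gradient term over sampled summaries) rather than as a plain scalar, but the sketch shows how the two fine-grained signals and the cross-entropy term can be combined into one objective.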
Keywords: cross-lingual summarization; low-resource language; noisy data; fine-grained reinforcement learning; word correlation; word missing degree