
A low-resource Lao text normalization task based on BiLSTM
Abstract: Text normalization (TN) is an indispensable step in the text front-end analysis of speech synthesis. Lao text normalization converts non-standard words (NSW) in Lao text into spoken-form words (SFW). Text normalization has not previously been carried out for Lao, and the task faces two main problems: training data are difficult to obtain, and some non-standard words are ambiguous. To address these problems, a corpus for the Lao text normalization task is constructed and the task is cast as sequence labeling: a neural network predicts ambiguous non-standard Lao words from their context, a self-attention mechanism is added to deepen the relationships between characters in the sequence, and different strategies for introducing a pre-trained language model are explored. The BiLSTM model fused with self-attention achieves an accuracy of 67.59% on the test set.
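The approach described in the abstract (character-level sequence labeling with a BiLSTM whose outputs pass through self-attention before per-character tag prediction) can be sketched as below. This is a minimal illustration in PyTorch under stated assumptions: the class name, all dimensions, the residual connection, and the use of `nn.MultiheadAttention` are illustrative choices, not details taken from the paper.

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Sketch of a BiLSTM sequence tagger with self-attention.

    Hyperparameters here are illustrative assumptions, not the
    paper's settings.
    """
    def __init__(self, vocab_size, tagset_size,
                 embed_dim=64, hidden_dim=128, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Self-attention over BiLSTM outputs, deepening the
        # character-to-character relationships across the sequence.
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads,
                                          batch_first=True)
        self.fc = nn.Linear(2 * hidden_dim, tagset_size)

    def forward(self, token_ids):
        x = self.embed(token_ids)      # (B, T, E)
        h, _ = self.bilstm(x)          # (B, T, 2H)
        a, _ = self.attn(h, h, h)      # self-attention with Q = K = V = h
        return self.fc(a + h)          # residual add, per-character tag logits

# Usage: tag a batch of character-ID sequences.
model = BiLSTMTagger(vocab_size=100, tagset_size=5)
ids = torch.randint(0, 100, (2, 10))   # 2 sequences of length 10
logits = model(ids)                    # (2, 10, 5) per-character tag scores
print(tuple(logits.shape))
```

Each position in the output gives scores over the tag set, so an ambiguous NSW token can be resolved from the surrounding context that the BiLSTM and attention layers encode.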
Authors: WANG Jian; JIANG Lin; WANG Lin-qin; YU Zheng-tao; ZHANG Song; GAO Sheng-xiang (Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500; Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming 650500, China)
Source: Computer Engineering & Science (《计算机工程与科学》, CSCD, Peking University Core), 2023, No. 7, pp. 1292-1299 (8 pages)
Funding: National Natural Science Foundation of China (61732005, U21B2027, 61761026, 61972186, 61762056); National Key R&D Program of China (2019QY1802, 2019QY1801, 2019QY1800); Yunnan High-Tech Talent Project (201606, 202105AC160018); Yunnan Major Science and Technology Project (202002AD080001-5, 202103AA080015); Yunnan Basic Research Program (202001AS070014, 2018FB104).
Keywords: Lao; text normalization; neural network; self-attention mechanism