

Research on Chinese Medical Dialogue System Driven by Large Language Models in Medical E-Commerce Platforms
Abstract: With the rapid development of Internet technology and artificial intelligence, medical e-commerce platforms play an increasingly important role in modern pharmaceutical services. This study proposes MedAsst, a Chinese medical dialogue system based on a large language model (LLM), and explores its application in medical e-commerce platforms. The model builds on Qwen2-7B and is fine-tuned with supervision via LoRA on 1.47 million medical question-answer pairs. MedAsst's effectiveness is comprehensively evaluated on a medical multiple-choice benchmark and a custom medical question-answering dataset. Experimental results show that MedAsst outperforms the baseline models on BLEU-4, ROUGE-1, ROUGE-2, and ROUGE-L, with a particularly clear advantage in medical question answering. Compared with Llama-3-8B, Gemma-7B, Mistral-7B, and the unfine-tuned Qwen2-7B, MedAsst performs strongly on domain-specific tasks through a well-designed fine-tuning strategy, demonstrating the necessity and effectiveness of supervised fine-tuning. This work not only improves model performance on Chinese medical question answering but also demonstrates the application potential of large language models in medical e-commerce platforms, providing strong support for future optimization and practical deployment in more complex scenarios.
Source: E-Commerce Letters (《电子商务评论》), 2024, No. 4, pp. 1611-1620 (10 pages).
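The abstract evaluates generation quality with n-gram overlap metrics (BLEU-4, ROUGE-1, ROUGE-2, ROUGE-L). As a minimal, self-contained illustration of how such a metric works (a sketch only, not the paper's evaluation code; the function names here are hypothetical), ROUGE-N F1 can be computed from n-gram counts as follows:

```python
from collections import Counter


def ngrams(tokens, n):
    """Return the multiset of n-grams (as tuples) in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))


def rouge_n_f1(candidate, reference, n=1):
    """ROUGE-N F1: harmonic mean of n-gram precision and recall
    between a candidate and a single reference token list."""
    cand, ref = ngrams(candidate, n), ngrams(reference, n)
    overlap = sum((cand & ref).values())  # clipped n-gram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

For example, `rouge_n_f1("the cat sat".split(), "the cat ate".split())` yields 2/3, since two of three unigrams overlap on each side. The paper's reported scores would additionally average over the test set and, for BLEU-4, combine 1- to 4-gram precisions with a brevity penalty.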