
Artificial intelligence natural language generation experiment and instruction design for liberal arts students
Abstract: Against the backdrop of explosive growth in AI applications across industries, it is necessary to cultivate liberal arts students' AI thinking and interdisciplinary innovation capabilities. Taking AI natural language generation technology as its entry point, this paper designs an experimental teaching scheme for acrostic ("hidden-head") poem generation based on a large language model, covering the basic principles of large language models, fine-tuning methods, cloud deployment, and the final implementation of acrostic poem generation. The scheme has been successfully implemented in experimental teaching and has produced an open, operable, and replicable set of teaching materials that can support similar experimental teaching work.

[Significance] With the recent technological breakthroughs achieved by a series of large models in cross-media content generation and reasoning, artificial intelligence (AI) applications have emerged as a hot topic across various industries. Notably, natural language generation (NLG) technology, powered by large language models, has the remarkable ability to simulate human language behavior. This allows it to automatically generate texts that are not only grammatically accurate but also semantically clear and highly expressive, thereby facilitating smooth and efficient human-computer interaction. Given its vast potential, this technology is currently employed in a wide range of fields, including education, law, healthcare, finance, literature, and art. In this context, incorporating the teaching and training of AI NLG technology into the "New Liberal Arts" curriculum in universities has become paramount. This integration aims to help students cultivate AI thinking and interdisciplinary innovation capabilities, ultimately preparing them for future employment.

[Progress] This paper represents a groundbreaking endeavor in "New Liberal Arts" education by introducing an experimental course on creating acrostic (hidden-head) poems with a large language model. The objective of this curriculum is multifaceted: to impart a fundamental understanding of the concepts and principles underlying language models and to foster practical AI application skills among students. The coursework provides a stepwise approach to mastering the subject matter. First, students are introduced to the basics of language models, laying the foundation for more advanced concepts. This is followed by an exploration of fine-tuning techniques, which involve adapting pretrained models to specific datasets. Subsequently, students deploy the code, data, and training and prediction models on a Huawei cloud server; this hands-on experience equips them with the skills necessary to navigate the real-world challenges of AI implementation. The final step of the course is the generation of acrostic poetry using the fine-tuned model. This creative application demonstrates the capability of language models and kindles students' interest in the potential of AI across various fields. Overall, the course serves as a bridge between basic concepts and practical applications, enabling students to gain a holistic understanding of language models and their potential in the era of AI.

[Conclusions and Prospects] Supported by the second batch of the "Industry-University Cooperation for Collaborative Education: New Liberal Arts Construction Project (Category A)" jointly initiated by the Ministry of Education and Huawei, we successfully implemented this experimental program in the university-wide "Python Programming" public course during the spring and summer semesters of 2023, and it received strongly positive feedback from a wide range of students. Moreover, we developed and shared a set of operable and replicable experimental teaching materials online (URL: https://gitee.com/yaochw/python_course_zju/). This establishes a long-term teaching mechanism at our university and provides a valuable reference for other educators conducting similar experimental teaching work.
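As an illustration of the final step described above, the following is a minimal Python sketch of acrostic (hidden-head) poem generation with a fine-tuned causal language model. It is not the course's actual implementation (that is published in the Gitee repository above); the model path, prompt handling, and decoding settings are assumptions chosen only to show the core idea of forcing each line to begin with one character of the hidden phrase.

    from transformers import AutoModelForCausalLM, AutoTokenizer
    import torch

    # Hypothetical path: a causal LM fine-tuned on classical poetry during the
    # fine-tuning step of the course (not the authors' actual model).
    MODEL_DIR = "./finetuned_poem_model"

    tokenizer = AutoTokenizer.from_pretrained(MODEL_DIR)
    model = AutoModelForCausalLM.from_pretrained(MODEL_DIR)
    model.eval()

    def generate_acrostic(head_chars: str, line_len: int = 7) -> str:
        """Generate one poem line per character in head_chars; each line opens with that character."""
        context = ""
        lines = []
        for ch in head_chars:
            prompt = context + ch  # force the new line to start with the hidden-head character
            inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False)
            with torch.no_grad():
                output_ids = model.generate(
                    **inputs,
                    max_new_tokens=line_len,  # roughly one Chinese character per token
                    do_sample=True,           # sampling keeps the poems varied between runs
                    top_k=50,
                    top_p=0.95,
                )
            text = tokenizer.decode(output_ids[0], skip_special_tokens=True).replace(" ", "")
            line = text[len(prompt) - 1:][:line_len] + "，"  # keep the head character plus its continuation
            lines.append(line)
            context += line  # earlier lines become context, keeping the poem coherent
        return "\n".join(lines)

    if __name__ == "__main__":
        # The four head characters are hidden at the start of the four generated lines.
        print(generate_acrostic("春夏秋冬"))

In a course setting such a script could also be wrapped behind a simple web endpoint on the cloud server built in the deployment step, so that students interact with the model from a browser rather than the command line.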
Authors: YAO Chengwei (姚诚伟), CHEN Chunhui (陈春晖), CHEN Mei (陈梅); College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
Source: Experimental Technology and Management (《实验技术与管理》), 2024, Issue 4, pp. 177-184 (8 pages); CAS-indexed, Peking University core journal
Funding: Ministry of Education and Huawei, second batch of the "Industry-University Cooperation for Collaborative Education: New Liberal Arts Construction Project (Category A)" (220900007181709); Zhejiang University National Experimental Teaching Demonstration Center of Computer Technology and Engineering project (浙大发本2012790)
Keywords: artificial intelligence; new liberal arts; large language model; natural language generation