Holistic Alignment Based on Technological Diffusion: The Debate Between Open-source and Closed-source Large Language Models
Abstract The open-source versus closed-source debate surrounding large language models (LLMs) is a significant point of contention, focused primarily on issues of efficiency, security, and equality and democratization. The motivation for open-sourcing LLMs stems mainly from two sources: the economic benefits brought by the unique market advantages of open-source models, and the somewhat idealistic pursuit of technological equality and technological democracy by developers. Open-sourcing LLMs also poses several challenges, including challenges to economic benefit, societal acceleration, and social security. Addressing the latter two challenges requires human society to respond in two ways: first, by adopting appropriate closed-source strategies to buffer the speed of technological diffusion, and second, by using alignment techniques to constrain LLMs with the rules of human society. Alignment of LLMs should be holistic, encompassing three basic aspects: graded alignment, dual alignment between humans and LLMs, and alignment across the entire production process of LLMs. The mechanisms for achieving this include transparency mechanisms and negotiation mechanisms.
Authors GAO Qiqi (高奇琦); ZHANG Haomiao (张皓淼) (School of Government, East China University of Political Science and Law, Shanghai 201620, China; Institute of Political Science, East China University of Political Science and Law, Shanghai 201620, China)
Source Journal of Shanghai University (Social Sciences Edition) (Peking University Core Journal), 2024, No. 5, pp. 84-97 (14 pages)
Funding Key Project of the National Social Science Fund of China (21AZD021).
Keywords large language models (LLMs); open source; technological diffusion; holistic alignment; GPT