
The "Protection by Design" Approach to Personal Information in Human-Computer Communication: Taking the ChatGPT Model as the Starting Point
(Original title: 人机对话中个人信息的“设计保护”——以ChatGPT模型为切入点)

Cited by: 7
Abstract: As a new generation of general-purpose artificial intelligence technology, ChatGPT brings great opportunities for industry development, but the threat it poses to personal information security also deserves attention. The main personal information security risks raised by ChatGPT include unlawful collection, excessive mining, information leakage, and algorithmic bias. Traditional fair information practice principles fall short in addressing the challenges posed by ChatGPT; the "protection by design" approach, through the integrated governance of ethics, law, and technology, can offer a way out of this dilemma. "Protection by design" of personal information in the operation of ChatGPT should adhere to a people-oriented basic orientation and meet the algorithmic ethics requirements of security, fairness, controllability, and accountability; specific protection needs should be determined through investigation of application scenarios, classification of personal information, analysis of impacts on rights and interests, and assessment of security risks; and a holistic technical protection system combining points, lines, and planes should be formed through element identification, measure selection, process design, and scheme evaluation.
Author: ZHANG Yuxuan (张宇轩)
Affiliation: Law School, Nankai University
Source: Library Tribune (《图书馆论坛》, PKU Core Journal), 2023, No. 8, pp. 77-87 (11 pages)
Funding: Ministry of Justice National Rule of Law and Legal Theory Research Project, "Research on the Legal Protection of Personal Genetic Information" (Project No. 19SFB2045).
Keywords: artificial intelligence; human-computer communication; ChatGPT; personal information protection; protection by design