Abstract
Generative artificial intelligence, represented by Sora and ChatGPT, is becoming further integrated with criminal justice. Explorations of AI's deep participation in forensic appraisal have yielded initial results while also exposing new problems. The traditional model of generating appraisal opinions, centered on the appraiser and personal experience, is susceptible to cognitive bias, whereas a model centered on algorithms and data, in which artificial intelligence participates in generating appraisal opinions, can effectively overcome human cognitive bias: with artificial intelligence serving as the examiner in charge of testing and the appraiser serving as a "situational manager", the reliability of appraisal opinions is strengthened and the efficiency of forensic appraisal is improved. At the same time, new problems that may arise from applying generative artificial intelligence to forensic appraisal, such as the "algorithmic black box" and "algorithmic bias", must be guarded against, lest practice fall into a "new doctrine of legal evidence" of the digital age. First, a human-machine relationship of "the appraiser as the principal, the machine as the auxiliary" should be established, making clear that the appraiser bears the duty to review and supervise the algorithm's participation in generating appraisal opinions. Second, data standards and a traceability mechanism should be established to prevent "source pollution" of the content from which algorithmic models learn. Finally, with respect to the disclosure and effective review of algorithms, a feasible approach is to set up an independent algorithm review committee and an algorithm review hearing procedure, so as to ease the conflict between the demand for public scrutiny of algorithms and the protection of commercial interests.
Authors
ZHANG Dong (张栋); CHEN Xiuyong (陈修勇)
Criminal Justice College, East China University of Political Science and Law, Shanghai 200042, China
Source
Chinese Journal of Forensic Sciences (中国司法鉴定), 2024, No. 5, pp. 1-10.
Funding
Key Project of the National Social Science Fund of China (24AFX025).
Keywords
artificial intelligence
appraisal opinion
cognitive bias
right to cross-examination
algorithm review hearing