Abstract
The algorithmic black box poses a challenge to algorithmic decision-making, and algorithmic transparency requires that algorithms be explainable. Algorithmic transparency is not simply algorithm disclosure. Although it is widely hoped that disclosing algorithms will open the algorithmic black box and thereby achieve transparency, mere disclosure not only invites the objection that it exposes trade secrets, but also fails to genuinely explain algorithmic decisions. By comparison, building explainable artificial intelligence offers a new technical path toward algorithmic transparency. Explainable artificial intelligence can not only see into the algorithmic black box through different explanation approaches, but also satisfy audiences' differing demands for algorithmic explanation. Because technological development depends on institutional safeguards, it is necessary to construct a reasonable set of institutions to promote the development of explainable artificial intelligence. This requires legislation prescribing the fields in which explainable artificial intelligence must be applied; administratively, using government procurement to guide the production of explainable artificial intelligence; and, in market regulation, relying on third-party certification to press enterprises to build explainable artificial intelligence. On this basis, algorithmic transparency can ultimately be realized.
Author
YANG Zhi-hang (Jilin University School of Law, Changchun 130012)
Source
Administrative Law Review (《行政法学研究》)
CSSCI
PKU Core (Peking University Core Journals)
2024, Issue 3, pp. 154-163 (10 pages)
Funding
Supported by the 2017 National Social Science Fund of China Major Special Project "Research on Integrating Core Values into the Construction of the Rule of Law: An Examination Centered on Judicial Justice" (Project No. 17VHJ007).
Keywords
Explainable Artificial Intelligence
Algorithm Transparency
Algorithm Black Box
Explainable Algorithm