
Review of Human-Centered Explainable AI in Healthcare
Abstract: With the rapid development of artificial intelligence (AI), "black box" models have demonstrated capabilities that approach, or even surpass, human performance. Ensuring the explainability of AI is crucial for users to trust and understand its applications, particularly in high-risk scenarios such as healthcare. Although previous research has introduced numerous ante-hoc and post-hoc explainable AI methods, most of them follow a "one-fits-all" approach, disregarding the multidimensional understanding and trust requirements of diverse users in different contexts. In recent years, human-centered explainable AI, which analyzes and explains AI models according to the actual needs of users, has drawn growing attention from researchers worldwide. Focusing on healthcare applications, this article analyzes literature from the past five years published at top-tier international human-computer interaction conferences. It reviews existing human-centered explainable AI methods and systems for computer-aided diagnosis, computer-aided treatment, and preventive disease warning, and presents a systematic method for identifying and positioning explainability needs along three dimensions: decision time constraints, user expertise levels, and diagnosis workflow processes. From this analysis, it derives four typical user personas with corresponding examples, and offers suggestions for designing explainable medical diagnostic systems from three perspectives: resource constraints, the diverse needs of different users, and integration with existing clinical workflows.
Authors: Song Shuchao; Chen Yiqiang; Yu Hanchao; Zhang Yingwei; Yang Xiaodong (Key Laboratory of Mobile Computing and Pervasive Devices, Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100190; University of Chinese Academy of Sciences, Beijing 100190; Bureau of Frontier Sciences and Education, Chinese Academy of Sciences, Beijing 100864)
Source: Journal of Computer-Aided Design & Computer Graphics, 2024, No. 5, pp. 645-657 (13 pages); indexed in EI, CSCD, and the Peking University Core Journals list.
Funding: National Key R&D Program of China (2021YFC2501202); National Natural Science Foundation of China (62202455, 61972383, 62302487); Beijing Municipal Science & Technology Commission (Z221100002722009).
Keywords: explainable artificial intelligence; clinical decision support system; healthcare; human-AI interaction; user experience