Abstract
The new generation of artificial intelligence technology, represented by deep neural networks, provides the underlying support for the digital transformation and intelligent upgrading of education; intelligent education has become a salient feature and important trend of current global educational development. However, owing to the pervasive black-box nature of neural networks, it is difficult to explain a model's decision-making process or to express its internal knowledge explicitly, leaving educational practice "knowing what works but not why" and constraining the in-depth development of intelligent education. Meeting this challenge requires effort on the following fronts: revealing the multidimensional factors that drive education's transition from computable to explainably computable, establishing a logical framework for educational explainable computing that covers the core stages of intelligent modeling, and developing technical directions for educational explainable computing with causal effects. In addition, the long-term development of educational explainable computing also depends on theoretical guidance, adapted evaluation, and attention to individual needs.
Authors
SUN Jianwen; GAO Lu; LIU Sannyuya; LI Qing (Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan 430079)
Source
Educational Research and Experiment (《教育研究与实验》)
Indexed in CSSCI and the Peking University Core Journal list
2023, No. 4, pp. 100-107 (8 pages)
Funding
Supported by the 2022 National Social Science Fund of China general project in education, "Intelligent Assessment and Intervention of College Students' Learning Engagement in Blended Teaching Environments: A Multimodal Analysis Perspective" (BCA220210).
Keywords
intelligent education
explainable computing
logical framework
technical direction