Abstract
Statistical learning theory on probability spaces is an important part of machine learning. Within this theory, the bounds on the rate of uniform convergence are significant: under the empirical risk minimization induction principle, they determine the generalization ability of learning machines. In this paper we discuss the bounds on the risk for indicator loss functions on a possibility space, estimate the rate of uniform convergence of the learning process, and point out the relation between the rate of convergence and the capacity of the set of functions.
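For orientation only, the two risks named in the keywords can be sketched in their classical probability-space form; the paper itself works on a possibility space, where the probability measure is replaced by a credibility measure, so the following is a sketch of the standard setup rather than the paper's definitions, and the symbols L, f, α, F and ℓ are assumptions here:
\[
R(\alpha) = \int L\bigl(y, f(x,\alpha)\bigr)\,\mathrm{d}F(x,y),
\qquad
R_{\mathrm{emp}}(\alpha) = \frac{1}{\ell}\sum_{i=1}^{\ell} L\bigl(y_i, f(x_i,\alpha)\bigr),
\]
and the bounds in question control \(\sup_{\alpha}\lvert R(\alpha) - R_{\mathrm{emp}}(\alpha)\rvert\) uniformly over the function class.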
Source
Journal of Hebei University (Natural Science Edition)
CAS
2004, No. 1, pp. 1-6 (6 pages)
Funding
Science Research Program of the Hebei Provincial Education Department (2002159)
Natural Science Foundation of Hebei Province (603137)
Keywords
possibility space
credibility measure
expected risk
empirical risk