Abstract
The Hilbert-Schmidt independence criterion (HSIC) is a kernel-based independence measure with the advantages of simple computation, fast convergence, and low bias, and it is widely used in statistical analysis and machine learning. Feature selection is an effective dimensionality reduction technique that evaluates the importance of features and constructs an optimal feature subspace suited to the learning task. This paper systematically reviews HSIC-based feature selection methods, introduces their theoretical foundations, algorithmic models, and solution methods in detail, analyzes the advantages and shortcomings of HSIC-based feature selection, and gives an outlook on future research.
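For readers unfamiliar with HSIC, the following is a minimal sketch (not taken from any method surveyed in the paper) of the biased empirical HSIC estimator of Gretton et al., HSIC_b(X, Y) = tr(KHLH) / (n-1)^2, together with a simple filter that ranks features by their HSIC score with the target. The Gaussian kernel, its bandwidths, and the function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(A, sigma=1.0):
    """Gaussian (RBF) kernel matrix; the bandwidth sigma is an assumed default."""
    sq = np.sum(A ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * A @ A.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic_b(X, Y, sigma_x=1.0, sigma_y=1.0):
    """Biased empirical HSIC: tr(K H L H) / (n - 1)^2, with H = I - (1/n) 11^T."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma_x)
    L = rbf_kernel(Y, sigma_y)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def hsic_feature_ranking(X, y):
    """Score each feature by its HSIC with the target and rank in descending order."""
    Y = y.reshape(-1, 1).astype(float)
    scores = np.array([hsic_b(X[:, [j]], Y) for j in range(X.shape[1])])
    return np.argsort(-scores), scores
```

In this sketch, a larger HSIC score indicates stronger (possibly nonlinear) dependence between a feature and the target, so a filter-style selector would simply keep the top-ranked features; it illustrates only the basic idea behind HSIC-based feature scoring, not any specific method discussed in the review.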
Authors
HU Zhenwei; WANG Tinghua; ZHOU Huiying (School of Mathematics and Computer Science, Gannan Normal University, Ganzhou, Jiangxi 341000, China)
Source
Computer Engineering and Applications (《计算机工程与应用》)
CSCD
Peking University Core Journals (北大核心)
2022, No. 22, pp. 54-64 (11 pages)
Funding
National Natural Science Foundation of China (61966002)
Science and Technology Research Project of Jiangxi Provincial Department of Education (GJJ191659)
Jiangxi Provincial Postgraduate Innovation Special Fund Project (YC2021-S726)
Keywords
feature selection
Hilbert-Schmidt independence criterion (HSIC)
kernel method
machine learning