
结构稀疏模型及其算法研究进展 (Cited by: 5)

Research and Development on Structured Sparse Models and Algorithms
Abstract: Structured sparse models have important applications in statistics, signal processing, and machine learning. They achieve feature-group selection by introducing a penalty function that induces group sparsity into the objective function; interestingly, some group sparse models can perform feature-group selection and within-group feature selection simultaneously. According to the penalty function used, structured sparse models fall into two main categories: Group Lasso models and group sparse models with non-convex penalties. This paper systematically surveys the important group-structured sparse models, analyzes the differences and connections among them, and summarizes and compares their statistical properties (such as model selection consistency, parameter estimation consistency, and the oracle property) as well as the algorithms for solving them. At present, the Group Lasso family includes the standard Group Lasso, the L∞,1 Group Lasso, the overlapping Group Lasso, the tree-guided Group Lasso, the multi-output tree-guided Group Lasso, the mixed Group Lasso, the adaptive Group Lasso, the logistic Group Lasso, and the Bayesian Group Lasso; the non-convex penalty models include the group SCAD model, the group bridge model, and the group MCP model, among others. Algorithms for solving group sparse models include Group LARS, block coordinate descent (ascent), active set methods, interior point methods, projected gradient and spectral projected gradient methods, the alternating direction method of multipliers, and block coordinate gradient descent; these algorithms are analyzed in detail in combination with specific group sparse models. Before these optimization methods are applied, the objective function usually has to be preprocessed, i.e., the nonsmooth, non-convex, block-coordinate-inseparable objective of a group sparse model is transformed toward a smooth, convex, block-coordinate-separable form; techniques commonly used in this step include variational inequalities, Nesterov's smoothing approximation, local first-order Taylor approximation, local quadratic approximation, the dual norm, and the dual function. Some recently proposed group sparse models are then introduced, such as the Group Lasso for generalized additive models, the composite group bridge model, the square-root Group Lasso, and the Group Lasso for the Tobit model. Finally, future research directions for group sparse models are discussed.
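As a minimal illustration of the ideas summarized above (an editor's sketch, not material from the paper): the standard Group Lasso estimates beta by minimizing (1/2)*||y - X*beta||_2^2 + lambda * sum_g sqrt(p_g) * ||beta_g||_2, and the l2 penalty on each coefficient block beta_g drives whole groups to zero at once. The Python snippet below (all function and variable names are hypothetical) shows the block soft-thresholding step, i.e., the proximal operator of this penalty, inside a plain proximal gradient loop; this step is the basic building block behind the block coordinate descent and projected/proximal gradient algorithms listed in the abstract.

# Editor's illustrative sketch (not from the paper): proximal gradient
# descent for the non-overlapping Group Lasso.
import numpy as np

def group_soft_threshold(v, groups, step, lam):
    """Block soft-thresholding: proximal operator of lam * sum_g sqrt(|g|)*||v_g||_2."""
    out = v.copy()
    for g in groups:
        w = np.sqrt(len(g))                      # common per-group weight sqrt(p_g)
        norm_g = np.linalg.norm(v[g])
        # Shrink the whole block; it becomes exactly zero when its norm falls
        # below the threshold, which is what produces group sparsity.
        scale = max(0.0, 1.0 - step * lam * w / norm_g) if norm_g > 0 else 0.0
        out[g] = scale * v[g]
    return out

def group_lasso_proximal_gradient(X, y, groups, lam, n_iter=500):
    """Minimize 0.5*||y - X beta||^2 + lam * sum_g sqrt(|g|)*||beta_g||_2."""
    n, p = X.shape
    beta = np.zeros(p)
    step = 1.0 / (np.linalg.norm(X, 2) ** 2)     # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)              # gradient of the smooth least-squares term
        beta = group_soft_threshold(beta - step * grad, groups, step, lam)
    return beta

# Example: 6 features split into two groups of 3; coefficients in a group
# are kept or discarded together (feature-group selection).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 6))
beta_true = np.array([1.5, -2.0, 1.0, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
groups = [np.arange(0, 3), np.arange(3, 6)]
beta_hat = group_lasso_proximal_gradient(X, y, groups, lam=5.0)

For overlapping, tree-guided, or non-convex penalties this proximal step is no longer available in closed form, which is why the preprocessing techniques mentioned in the abstract (smoothing, local quadratic approximation, dual reformulations) are needed before the same first-order machinery can be applied.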
Source: Computer Science (《计算机科学》), CSCD, Peking University Core Journal, 2016, Issue S1, pp. 1-16 (16 pages)
Funding: Supported by the Fundamental Discipline Research Fund of China University of Petroleum, Beijing (JCXK-2011-07)
Keywords: Sparsity; Group sparsity; Penalty function; Group Lasso; Feature group selection; Feature selection within group; Algorithm

