Quantile Regression Based on Laplacian Manifold Regularizer with the Data Sparsity in l1 Spaces

Abstract: In this paper, we consider regularized learning schemes based on the l1-regularizer and the pinball loss in a data-dependent hypothesis space. The goal is an error analysis for quantile regression learning. No regularity condition is imposed on the kernel function beyond continuity and boundedness. The graph-based semi-supervised algorithm introduces an extra error term, called the manifold error. New error bounds and convergence rates are derived exactly using techniques involving the l1-empirical covering number and a boundedness decomposition.
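
As a reading aid only: the abstract describes a coefficient-based scheme combining the pinball loss, an l1 penalty on the expansion coefficients, and a graph-Laplacian manifold term built from labeled and unlabeled data. A plausible form of such an objective is sketched below; the edge weights w_ij, the parameters λ and γ, and the normalizations are illustrative assumptions, not the paper's exact constants.

```latex
% Pinball (check) loss at quantile level \tau \in (0,1)
\rho_\tau(u) =
\begin{cases}
  \tau u,      & u \ge 0,\\
  (\tau - 1)u, & u < 0.
\end{cases}

% Coefficient-based estimator with f(x) = \sum_{j=1}^{m} \alpha_j K(x, x_j),
% built from l labeled pairs (x_i, y_i) and m - l unlabeled points:
\hat f = \arg\min_{\alpha \in \mathbb{R}^m}\;
  \frac{1}{l}\sum_{i=1}^{l} \rho_\tau\bigl(y_i - f(x_i)\bigr)
  + \lambda \sum_{j=1}^{m} \lvert \alpha_j \rvert
  + \frac{\gamma}{m^2} \sum_{i,j=1}^{m} w_{ij}\bigl(f(x_i) - f(x_j)\bigr)^2 .
```

The first term measures quantile fit on the labeled sample, the l1 term enforces coefficient sparsity in the data-dependent hypothesis space, and the Laplacian term penalizes functions that vary sharply across the similarity graph; per the abstract, only continuity and boundedness of K are assumed.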
Source: Open Journal of Statistics, 2017, No. 5, pp. 786-802 (17 pages).
Keywords: Semi-Supervised Learning; Conditional Quantile Regression; l1-Regularizer; Manifold Regularizer; Pinball Loss
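
For readers who want to experiment, the following is a minimal NumPy sketch of one way to fit such an estimator by subgradient descent. The Gaussian kernel, the dense heat-kernel graph, the step size, and all parameter values are assumptions made for illustration (normalization constants are folded into gamma); the paper itself is concerned with the error analysis, not with a specific solver.

```python
# Hypothetical sketch (not the authors' code): subgradient descent on
#   (1/l) sum_i rho_tau(y_i - f(x_i)) + lam * ||alpha||_1 + gamma * alpha^T K L K alpha,
# where f(x) = sum_j alpha_j K(x, x_j) over labeled + unlabeled points.
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def graph_laplacian(X, sigma=1.0):
    """Unnormalized Laplacian of a dense heat-kernel similarity graph on X."""
    W = gaussian_kernel(X, X, sigma)
    np.fill_diagonal(W, 0.0)
    return np.diag(W.sum(axis=1)) - W

def pinball_subgradient(r, tau):
    """Subgradient w.r.t. f(x) of rho_tau(y - f(x)) at residuals r = y - f(x)."""
    g = np.where(r > 0, -tau, 1.0 - tau)
    g[r == 0] = 0.0   # any value in [-tau, 1 - tau] is a valid subgradient at 0
    return g

def fit_quantile_manifold(X_lab, y_lab, X_unlab, tau=0.5, lam=1e-2,
                          gamma=1e-2, sigma=1.0, lr=1e-2, n_iter=2000):
    """Fit the coefficient vector alpha by plain subgradient descent."""
    X_all = np.vstack([X_lab, X_unlab])           # labeled points first, then unlabeled
    l, m = len(X_lab), len(X_all)
    K_all = gaussian_kernel(X_all, X_all, sigma)  # m x m kernel section
    K_lab = K_all[:l]                             # rows that evaluate f at labeled points
    L = graph_laplacian(X_all, sigma)
    M = K_all @ L @ K_all                         # manifold penalty matrix (symmetric)
    alpha = np.zeros(m)
    for _ in range(n_iter):
        residuals = y_lab - K_lab @ alpha
        grad = (K_lab.T @ pinball_subgradient(residuals, tau)) / l \
               + lam * np.sign(alpha) \
               + 2.0 * gamma * (M @ alpha)
        alpha -= lr * grad
    return alpha, X_all, sigma

def predict(alpha, X_all, sigma, X_new):
    """Evaluate f(x) = sum_j alpha_j K(x, x_j) at new inputs."""
    return gaussian_kernel(X_new, X_all, sigma) @ alpha
```

Usage example: with labeled arrays X_lab (l x d) and y_lab (l,) and an unlabeled array X_unlab ((m - l) x d), calling alpha, X_all, sigma = fit_quantile_manifold(X_lab, y_lab, X_unlab, tau=0.5) and then predict(alpha, X_all, sigma, X_new) returns estimates of the conditional median at the new inputs; other values of tau target other conditional quantiles.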