
Robust reduced rank regression in a distributed setting

Abstract: This paper studies the reduced rank regression problem, which assumes a low-rank structure of the coefficient matrix, together with heavy-tailed noise. To address the heavy-tailed noise, we adopt the quantile loss function instead of the commonly used squared loss. However, the non-smooth quantile loss brings new challenges to both the computation and the development of statistical properties, especially when the data are large in size and distributed across different machines. To this end, we first transform the response variable and reformulate the problem into a trace-norm regularized least-squares problem, which greatly facilitates the computation. Based on this formulation, we further develop a distributed algorithm. Theoretically, we establish the convergence rate of the obtained estimator and a theoretical guarantee for rank recovery. A simulation analysis is provided to demonstrate the effectiveness of our method.
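To make the reformulation concrete, the following is a minimal sketch of a trace-norm regularized least-squares problem of the kind described in the abstract; it is not the paper's exact derivation, and the transformed response \(\widetilde{Y}\) and tuning parameter \(\lambda\) below are generic placeholders:

\[
\widehat{B} \;=\; \arg\min_{B \in \mathbb{R}^{p \times q}} \; \frac{1}{2n}\bigl\|\widetilde{Y} - XB\bigr\|_F^2 \;+\; \lambda \,\|B\|_*,
\]

where \(X \in \mathbb{R}^{n \times p}\) is the design matrix, \(\widetilde{Y} \in \mathbb{R}^{n \times q}\) denotes a response transformed to account for the quantile loss, \(\|\cdot\|_F\) is the Frobenius norm, \(\|B\|_*\) is the trace (nuclear) norm that encourages a low-rank coefficient matrix, and \(\lambda > 0\) controls the strength of the rank penalty.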
Source: Science China Mathematics (中国科学:数学(英文版)), SCIE, CSCD, 2022, Issue 8, pp. 1707-1730 (24 pages).
Funding: supported by the National Basic Research Program of China (973 Program) (Grant No. 2018AAA0100704), the National Natural Science Foundation of China (Grant Nos. 11825104 and 11690013), the Youth Talent Support Program, and the Australian Research Council; supported by the National Natural Science Foundation of China (Grant No. 12001109), the Shanghai Sailing Program (Grant No. 19YF1402800), and the Science and Technology Commission of Shanghai Municipality (Grant No. 20dz1200600).