Abstract
This paper studies the reduced rank regression problem, which assumes a low-rank structure of the coefficient matrix, together with heavy-tailed noise. To address the heavy-tailed noise, we adopt the quantile loss function instead of the commonly used squared loss. However, the non-smooth quantile loss brings new challenges to both the computation and the development of statistical properties, especially when the data are large in size and distributed across different machines. To this end, we first transform the response variable and reformulate the problem into a trace-norm regularized least-squares problem, which greatly facilitates the computation. Based on this formulation, we further develop a distributed algorithm. Theoretically, we establish the convergence rate of the obtained estimator and a theoretical guarantee for rank recovery. A simulation analysis is provided to demonstrate the effectiveness of our method.
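Trace-norm regularized least-squares problems of the kind mentioned above are commonly solved by proximal gradient descent with singular value thresholding. The following is a minimal sketch under that assumption, not the paper's algorithm; the names `trace_norm_ls`, `svt`, `Y_tilde`, `lam`, and `step` are illustrative, and the transformed response is assumed to be given.

```python
# Minimal sketch: proximal gradient for a trace-norm regularized
# least-squares objective (1/2n)||Y_tilde - X B||_F^2 + lam * ||B||_*.
# This is an illustrative solver, not the distributed algorithm of the paper.
import numpy as np

def svt(M, tau):
    """Singular value thresholding: prox operator of tau * (nuclear norm)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def trace_norm_ls(X, Y_tilde, lam, n_iter=500):
    """Proximal gradient descent on the trace-norm penalized objective."""
    n, p = X.shape
    q = Y_tilde.shape[1]
    B = np.zeros((p, q))
    # step size = 1 / Lipschitz constant of the smooth part's gradient
    step = n / (np.linalg.norm(X, 2) ** 2)
    for _ in range(n_iter):
        grad = X.T @ (X @ B - Y_tilde) / n
        B = svt(B - step * grad, step * lam)
    return B

# Toy usage on synthetic data with a rank-3 coefficient matrix.
rng = np.random.default_rng(0)
n, p, q, r = 200, 20, 15, 3
B_true = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
X = rng.standard_normal((n, p))
Y_tilde = X @ B_true + 0.1 * rng.standard_normal((n, q))
B_hat = trace_norm_ls(X, Y_tilde, lam=0.1)
print("estimated rank:", np.linalg.matrix_rank(B_hat, tol=1e-6))
```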
Funding
supported by National Basic Research Program of China (973 Program) (Grant No. 2018AAA0100704)
National Natural Science Foundation of China (Grant Nos. 11825104 and 11690013)
Youth Talent Support Program and Australian Research Council
supported by National Natural Science Foundation of China (Grant No. 12001109)
Shanghai Sailing Program (Grant No. 19YF1402800)
the Science and Technology Commission of Shanghai Municipality (Grant No. 20dz1200600)