
Nearly optimal stochastic approximation for online principal subspace estimation

Abstract: Principal component analysis (PCA) has been widely used in analyzing high-dimensional data. It converts a set of observed data points of possibly correlated variables into a set of linearly uncorrelated variables via an orthogonal transformation. To handle streaming data and reduce the complexity of PCA, (subspace) online PCA iterations were proposed to iteratively update the orthogonal transformation by taking one observed data point at a time. Existing works on the convergence of (subspace) online PCA iterations mostly focus on the case where the samples are almost surely uniformly bounded. In this paper, we analyze the convergence of a subspace online PCA iteration under more practical assumptions and obtain a nearly optimal finite-sample error bound. Our convergence rate almost matches the minimax information lower bound. We prove that the convergence is nearly global in the sense that the subspace online PCA iteration is convergent with high probability for random initial guesses. This work also leads to a simpler proof of recent results on online PCA for the first principal component alone.
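To illustrate the kind of iteration the abstract describes, here is a minimal sketch of a generic Oja-style subspace online PCA update in Python/NumPy. It is not the paper's exact scheme (the step size `eta` and the QR re-orthonormalization are illustrative choices): starting from a random orthonormal basis, each incoming sample triggers one rank-one stochastic update followed by re-orthonormalization.

```python
import numpy as np

def oja_subspace(samples, k, eta=0.01, seed=0):
    """Oja-style online subspace iteration (illustrative sketch).

    For each sample x, take one stochastic gradient step
    U <- U + eta * x (x^T U), then re-orthonormalize U via QR so its
    columns remain an orthonormal basis of the current estimate.
    """
    samples = np.asarray(samples, dtype=float)
    d = samples.shape[1]
    rng = np.random.default_rng(seed)
    # Random initial guess, as in the paper's high-probability analysis.
    U, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for x in samples:
        U = U + eta * np.outer(x, x @ U)   # one rank-one stochastic update
        U, _ = np.linalg.qr(U)             # keep columns orthonormal
    return U

# Demo: data whose covariance is dominated by the first two coordinates,
# so the true principal subspace is span(e1, e2).
rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 5)) * np.array([3.0, 2.0, 0.3, 0.3, 0.3])
U = oja_subspace(X, k=2, eta=0.01)
# Energy of the estimated basis inside the true top-2 coordinates
# (at most 2.0 for k = 2; close to 2.0 indicates good alignment):
print(round(float(np.linalg.norm(U[:2, :]) ** 2), 2))
```

Each step costs O(dk) for the update plus O(dk^2) for the QR factorization, which is the complexity reduction over batch PCA that the abstract alludes to.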
Source: Science China Mathematics (SCIE, CSCD), 2023, Issue 5, pp. 1087-1122 (36 pages)
Funding: supported by the National Natural Science Foundation of China (Grant No. 11901340), the National Science Foundation of USA (Grant Nos. DMS-1719620 and DMS-2009689), and the ST Yau Centre at the Yang Ming Chiao Tung University.