Abstract
This paper considers stochastic approximation (SA) for two classical supervised learning problems: least-squares regression and logistic regression. Without assuming strong convexity of the loss function, and under a weakened Lipschitz-continuity condition on the gradient, two accelerated stochastic gradient algorithms are proposed. Through a non-asymptotic analysis of the expected empirical risk, as in most existing work, the algorithms are shown to attain a convergence rate of O(1/n) for function values, where n is the number of samples. Compared with well-known results, fewer conditions are needed to obtain this tight convergence rate for the least-squares and logistic regression problems.
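The abstract does not spell out the algorithms themselves. As a rough illustration of the constant-step, iterate-averaged stochastic approximation family it refers to, here is a minimal sketch for least-squares regression; the step-size rule, the scaling constant, and the averaging scheme are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

def averaged_sgd_least_squares(X, y, step=0.5):
    """Averaged stochastic gradient descent for least-squares regression.

    One pass over the n samples with a constant step size, returning the
    running average of the iterates; averaging of this kind is what yields
    O(1/n) rates in expectation in Bach-Moulines-style analyses. The
    step-size choice below is an assumption for illustration only.
    """
    n, d = X.shape
    theta = np.zeros(d)        # current iterate
    theta_bar = np.zeros(d)    # running average of iterates
    # scale the step by an estimate of E[||x||^2] so it is dimension-robust
    gamma = step / np.mean(np.sum(X ** 2, axis=1))
    for i in range(n):
        xi, yi = X[i], y[i]
        grad = (xi @ theta - yi) * xi   # stochastic gradient of (1/2)(<x,theta> - y)^2
        theta = theta - gamma * grad
        theta_bar += (theta - theta_bar) / (i + 1)   # online average
    return theta_bar

# tiny usage example on synthetic data
rng = np.random.default_rng(0)
n, d = 10_000, 5
theta_star = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ theta_star + 0.1 * rng.normal(size=n)
theta_hat = averaged_sgd_least_squares(X, y)
print(np.linalg.norm(theta_hat - theta_star))  # shrinks as n grows
```

For logistic regression the only change is the stochastic gradient, which becomes (sigmoid(<x, theta>) - y) * x; the same single-pass averaging scheme applies.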
Authors
Cheng Yiyuan (程一元), Zha Xingxing (查星星), Zhang Yongquan (张永全)
(School of Mathematics and Statistics, Chaohu University, Hefei 238024; School of Data Sciences, Zhejiang University of Finance & Economics, Hangzhou 310018)
Source
Acta Mathematica Scientia (Series A)《数学物理学报(A辑)》
Indexed in CSCD and the Peking University Core Journals list (北大核心)
2021, Issue 5, pp. 1574-1584 (11 pages)
Funding
National Natural Science Foundation of China (61573324)
Natural Science Research Project of Anhui Provincial Universities (KJ2018A0455)
Young Talents Support Fund of Anhui Provincial Universities (gxyq2019082)
Scientific Research Fund of Chaohu University (XLY-201903)
Keywords
Least-squares regression
Logistic regression
Convergence rate