Funding: Supported by the National Natural Science Foundation of China under Grant Nos. 12226007, 12271271, 11925106, 12231011, 11931001 and 11971247, the Fundamental Research Funds for the Central Universities under Grant No. ZB22000105, and the China National Key R&D Program under Grant Nos. 2022YFA1003703, 2022YFA1003800, and 2019YFC1908502.
Abstract: In matrix completion, additional covariates often provide valuable information for completing the unobserved entries of a high-dimensional low-rank matrix A. In this paper, the authors consider the matrix recovery problem when there are multiple structural breaks in the coefficient matrix β under the column-space-decomposition model A = Xβ + B. A cumulative sum (CUSUM) statistic is constructed based on the penalized estimation of β. The CUSUM statistic is then incorporated into the Wild Binary Segmentation (WBS) algorithm to consistently estimate the locations of the breaks. Consequently, a nearly optimal recovery of A is achieved. The theoretical findings are further corroborated via numerical experiments and a real-data application.
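The abstract describes the detection step only at a high level. The short Python sketch below illustrates how a CUSUM contrast on per-column coefficient estimates can be plugged into Wild Binary Segmentation; it is not the authors' implementation. The ridge estimate of β stands in for the paper's penalized estimator, and the toy data-generating process, the number of random intervals, and the threshold are illustrative assumptions.

```python
import numpy as np

def cusum(beta_hat, s, e, t):
    """CUSUM contrast of the estimated coefficients on (s, e], split at t."""
    n1, n2 = t - s, e - t
    left = beta_hat[s:t].mean(axis=0)
    right = beta_hat[t:e].mean(axis=0)
    return np.sqrt(n1 * n2 / (n1 + n2)) * np.linalg.norm(left - right)

def wbs(beta_hat, s, e, intervals, threshold, breaks):
    """Wild Binary Segmentation: recursively locate breaks in beta_hat[s:e]."""
    if e - s < 2:
        return
    best_val, best_t = -np.inf, None
    for (a, b) in list(intervals) + [(s, e)]:
        if a < s or b > e:
            continue  # WBS only uses random intervals contained in the current segment
        for t in range(a + 1, b):
            val = cusum(beta_hat, a, b, t)
            if val > best_val:
                best_val, best_t = val, t
    if best_t is not None and best_val > threshold:
        breaks.append(best_t)
        wbs(beta_hat, s, best_t, intervals, threshold, breaks)
        wbs(beta_hat, best_t, e, intervals, threshold, breaks)

# Toy example: T columns of A, one break in beta at t = 60.
rng = np.random.default_rng(0)
T, p = 120, 5
beta = np.vstack([np.zeros((60, p)), np.ones((60, p))])
X = rng.normal(size=(T, 30, p))                       # covariates for each column
Y = np.einsum('tnp,tp->tn', X, beta) + 0.1 * rng.normal(size=(T, 30))

# Penalized (here: ridge) estimate of beta for each column.
lam = 0.1
beta_hat = np.stack([
    np.linalg.solve(X[t].T @ X[t] + lam * np.eye(p), X[t].T @ Y[t]) for t in range(T)
])

M = 50  # number of random intervals drawn for WBS
intervals = [tuple(sorted(rng.choice(T, size=2, replace=False))) for _ in range(M)]
breaks = []
wbs(beta_hat, 0, T, intervals, threshold=3.0, breaks=breaks)
print(sorted(breaks))  # expected to contain a point near 60
```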
Abstract: In recent years, online learning has been studied extensively because of its great practical value. In many open-environment applications, however, the data at the current moment may gain new features, while at the next moment only part of the original features are inherited. For example, in environmental monitoring, newly deployed sensors produce new data features; at the next moment some old sensors fail, and only part of the original features are retained. Such data are called streaming data with inherited, increasing and decreasing features. Most traditional online learning algorithms are built on the assumption that the data feature space is fixed and therefore cannot handle this setting directly. To address this problem, an online classification algorithm with feature inheritably increasing and decreasing (OFID) and two variants are proposed. When new features appear, the classifiers on the original features and on the new features are updated separately by combining the online passive-aggressive method with the structural risk minimization principle; when old features disappear, the data stream is completed with the Frequent-Directions algorithm so that the old classifier can continue to be updated. Loss bounds for the OFID family of algorithms are proved theoretically, and extensive experiments verify the effectiveness of the proposed algorithms.
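The abstract names the main ingredients of OFID without giving details; the Python sketch below combines their standard forms for illustration only. The passive-aggressive (PA-I) update and the Frequent-Directions shrink step follow their textbook definitions; the zero-padding of the weight vector when new features appear and the least-squares imputation of vanished features from the sketch's row space are illustrative assumptions rather than the paper's exact update and completion rules, and the class and method names (PAClassifier, FrequentDirections, impute) are hypothetical.

```python
import numpy as np

class PAClassifier:
    """Online passive-aggressive (PA-I) binary classifier on a growing feature space."""

    def __init__(self, dim, C=1.0):
        self.w = np.zeros(dim)
        self.C = C  # aggressiveness parameter

    def grow(self, new_dim):
        # New features appear: pad the weight vector with zeros.
        if new_dim > self.w.size:
            self.w = np.concatenate([self.w, np.zeros(new_dim - self.w.size)])

    def update(self, x, y):
        # Hinge loss and PA-I step size.
        loss = max(0.0, 1.0 - y * self.w @ x)
        if loss > 0:
            tau = min(self.C, loss / (x @ x + 1e-12))
            self.w += tau * y * x
        return loss

class FrequentDirections:
    """Frequent-Directions sketch of the stream, used here to fill in vanished features."""

    def __init__(self, dim, ell=4):
        self.B = np.zeros((ell, dim))

    def append(self, x):
        # Place x in a zero row; if none is free, shrink the sketch via SVD first.
        zero_rows = np.where(~self.B.any(axis=1))[0]
        if zero_rows.size == 0:
            _, s, Vt = np.linalg.svd(self.B, full_matrices=False)
            s = np.sqrt(np.maximum(s**2 - s[-1]**2, 0.0))
            self.B = s[:, None] * Vt           # the last row becomes zero
            zero_rows = np.where(~self.B.any(axis=1))[0]
        self.B[zero_rows[0]] = x

    def impute(self, x_obs, obs_idx):
        # Least-squares completion of a partially observed row using the sketch's row space
        # (an illustrative assumption, not the paper's exact completion rule).
        coef, *_ = np.linalg.lstsq(self.B[:, obs_idx].T, x_obs, rcond=None)
        x_full = self.B.T @ coef
        x_full[obs_idx] = x_obs                # keep the inherited entries as observed
        return x_full

# Toy stream: 3 original features, 2 new features appear, then feature 0 vanishes.
rng = np.random.default_rng(1)
W = rng.normal(size=(2, 5))                    # features share a 2-dim latent structure
clf = PAClassifier(dim=3)
fd = FrequentDirections(dim=5, ell=4)
for t in range(300):
    z = rng.normal(size=2)
    x = z @ W + 0.1 * rng.normal(size=5)
    y = 1.0 if z.sum() > 0 else -1.0
    if t < 100:                                # only the 3 original features are observed
        clf.update(x[:3], y)
    elif t < 200:                              # 2 new features appear; sketch the full rows
        clf.grow(5)
        clf.update(x, y)
        fd.append(x)
    else:                                      # feature 0 vanishes; impute it from the sketch
        obs_idx = np.arange(1, 5)
        clf.update(fd.impute(x[obs_idx], obs_idx), y)
print("learned weights:", np.round(clf.w, 2))
```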