Abstract
It is an important problem to extract the correlation between two random vectors X and Y, and kernel methods can be used to capture nonlinear correlations. In this paper, the maximal correlation is obtained by minimizing the variance Var[f(X)-g(Y)], which we call simultaneous regression, where f(X) and g(Y) are functions in two different reproducing kernel Hilbert spaces. The estimator is constructed by regularized empirical variance minimization. To make the estimated functions sparse, the l1-norm of the coefficients is used as the penalty term, and a learning rate is established under some mild conditions. Simultaneous regression is closely related to canonical correlation analysis, sliced inverse regression, and other methods.
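The scheme described in the abstract can be sketched numerically. The sketch below is an illustration only, not the paper's algorithm: it represents f and g by kernel expansions f(x) = Σ_i α_i K(x_i, x), g(y) = Σ_j β_j K(y_j, y), and alternately fits each coefficient vector to the other function by l1-penalized (ISTA/lasso) minimization of the centered empirical variance of f(X) - g(Y), renormalizing f each round to rule out the trivial zero solution. The Gaussian kernel, the alternating scheme, and all parameter values are assumptions of this sketch.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-||A[i]-B[j]||^2 / (2 sigma^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def soft_threshold(v, t):
    """Proximal operator of the l1-norm (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(Phi, target, lam, w, n_steps=100):
    """ISTA for min_w (1/n)||Phi w - target||^2 + lam * ||w||_1."""
    n = len(target)
    L = 2 * np.linalg.norm(Phi, 2) ** 2 / n + 1e-12  # gradient Lipschitz constant
    for _ in range(n_steps):
        grad = 2 * Phi.T @ (Phi @ w - target) / n
        w = soft_threshold(w - grad / L, lam / L)
    return w

def simultaneous_regression(X, Y, lam=0.01, sigma=1.0, n_rounds=20):
    """Alternating l1-penalized fit of kernel expansions f and g that
    minimizes the centered empirical variance of f(X) - g(Y).
    Hypothetical illustration, not the estimator analyzed in the paper."""
    n = X.shape[0]
    KX = gaussian_kernel(X, X, sigma)
    KY = gaussian_kernel(Y, Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n      # centering matrix (variance uses centered values)
    alpha = np.random.default_rng(0).normal(size=n) / n
    beta = np.zeros(n)
    for _ in range(n_rounds):
        f = H @ (KX @ alpha)                 # centered values f(X_i)
        f /= np.sqrt((f @ f) / n) + 1e-12    # normalize empirical variance of f to 1
        beta = ista(H @ KY, f, lam, beta)    # lasso step: fit g to f
        g = H @ (KY @ beta)
        alpha = ista(H @ KX, g, lam, alpha)  # lasso step: fit f to g
    return alpha, beta

# Usage on toy nonlinearly related data (assumed example).
X = np.linspace(-2, 2, 40).reshape(-1, 1)
Y = np.sin(2 * X) + 0.05 * np.random.default_rng(1).normal(size=X.shape)
alpha, beta = simultaneous_regression(X, Y)
print("nonzero alpha:", np.count_nonzero(alpha), "nonzero beta:", np.count_nonzero(beta))
```

The l1 penalty inside `ista` is what produces sparse coefficient vectors, mirroring the abstract's coefficient-regularization scheme; the normalization of f plays the role that a variance constraint plays in canonical correlation analysis.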
Authors
Heng Chen (陈珩); Wei Huang (黄尉); Dirong Chen (陈迪荣)
Source
《中国科学:数学》(Scientia Sinica Mathematica)
CSCD
Peking University Core Journal (北大核心)
2019, Issue 9, pp. 1251-1260 (10 pages)
Funding
Supported by the National Natural Science Foundation of China (Grant Nos. 11501380, 11571267 and 91538112)
Keywords
correlation
coefficient regularization
simultaneous regression
learning rate