Abstract: This note explores the relations between two different methods. The first one is the Alternating Least Squares (ALS) method for calculating a rank<em>-k</em> approximation of a real <em>m</em>×<em>n</em> matrix, <em>A</em>. This method has important applications in nonnegative matrix factorizations, in matrix completion problems, and in tensor approximations. The second method is called Orthogonal Iterations; it is also known as Subspace Iterations, Simultaneous Iterations, and the block-Power method. Given a real symmetric matrix, <em>G</em>, this method computes <em>k</em> dominant eigenvectors of <em>G</em>. To see the relation between these methods we assume that <em>G</em> = <em>A</em><sup>T</sup><em>A</em>. It is shown that in this case the two methods generate the same sequence of subspaces and the same sequence of low-rank approximations. This equivalence provides new insight into the convergence properties of both methods.
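The claimed equivalence can be checked numerically: starting both methods from the same subspace, one ALS sweep maps span(C) to span(GC), which is exactly what one step of orthogonal iteration on G = A<sup>T</sup>A does. A minimal numpy sketch (matrix sizes, seed, and iteration count are illustrative assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 30, 20, 3
A = rng.standard_normal((m, n))
G = A.T @ A                                  # real symmetric (PSD) matrix

# Common orthonormal starting basis for both methods
V0 = np.linalg.qr(rng.standard_normal((n, k)))[0]

# Orthogonal (subspace) iteration on G: V_{t+1} = qr(G V_t)
V = V0.copy()
for _ in range(5):
    V, _ = np.linalg.qr(G @ V)

# ALS for a rank-k approximation A ~ B C^T, started from C = V0
C = V0.copy()
for _ in range(5):
    B = A @ C @ np.linalg.inv(C.T @ C)       # fix C, least-squares solve for B
    C = A.T @ B @ np.linalg.inv(B.T @ B)     # fix B, least-squares solve for C

# The two k-dimensional subspaces coincide: compare orthogonal projectors
Q, _ = np.linalg.qr(C)
assert np.allclose(V @ V.T, Q @ Q.T, atol=1e-6)
```

One ALS sweep gives C_new = A<sup>T</sup>A C M for some invertible k×k matrix M, so both iterations span G<sup>t</sup>V0 after t steps; they differ only in the basis chosen within that subspace.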
Abstract: To address the high time complexity and long running time of current distributed latent-factor recommendation algorithms, this paper proposes a distributed singular value decomposition recommendation algorithm based on LU decomposition and Alternating Least Squares (ALS). Exploiting the fact that ALS lends itself to distributed optimization of the objective function, a grid-shaped distributed granularity-partitioning strategy is proposed to obtain mutually independent, uncorrelated feature vectors. When updating the feature matrices, LU decomposition is used to compute the matrix inverse, speeding up the algorithm. Experiments on the Tencent Weibo dataset from KDD CUP 2012 Track1 show that, while maintaining recommendation accuracy, the proposed algorithm substantially improves recommendation speed and efficiency.
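The core numerical idea, replacing an explicit matrix inverse in the ALS factor update with an LU factorization that is computed once and reused, can be sketched on a single machine. This is a generic illustration under assumed names and sizes, not the paper's distributed algorithm or its grid partitioning strategy:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(1)
n_users, n_items, k, lam = 50, 40, 5, 0.1
R = rng.random((n_users, n_items))           # toy dense rating matrix
U = rng.standard_normal((n_users, k))        # user factors
V = rng.standard_normal((n_items, k))        # item factors

for _ in range(10):
    # User update: every row of U solves (V^T V + lam*I) u = V^T r.
    # Factor the shared k-by-k normal matrix once with LU and reuse it
    # for all right-hand sides, instead of forming an explicit inverse.
    lu, piv = lu_factor(V.T @ V + lam * np.eye(k))
    U = lu_solve((lu, piv), V.T @ R.T).T
    # Symmetric item update
    lu, piv = lu_factor(U.T @ U + lam * np.eye(k))
    V = lu_solve((lu, piv), U.T @ R).T

err = np.linalg.norm(R - U @ V.T) / np.linalg.norm(R)
```

Because the k×k system matrix is shared by every row update, one O(k³) factorization amortizes over all users (or items), which is where the speedup over repeated inversion comes from.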
Abstract: Joint diagonalization is a powerful tool for solving blind source separation problems. However, most existing joint diagonalization algorithms can only handle real-valued blind source separation problems and impose many restrictions on the target matrices. To solve the more general complex-valued blind source separation problem, a Structural Traits Based Joint Diagonalization (STBJD) algorithm is proposed. It removes the pre-whitening step, and with it the positive-definiteness restriction on the target matrices, and also allows the target matrices to be complex-valued, giving it very wide applicability. First, a matrix transformation is introduced to convert the complex-valued target matrices into a new set of real symmetric target matrices with distinctive structural traits. Next, a least-squares joint-diagonalization cost function is constructed and minimized with an alternating least squares iteration, fully exploiting the structural traits of the parameters involved during the optimization. Finally, an estimate of the mixing matrix is obtained and used to recover the source signals. Simulations show that, compared with FAJD and CVFFDIAG, two representative existing complex-valued joint diagonalization algorithms that place no special restrictions on the target matrices, the STBJD algorithm achieves higher convergence accuracy and effectively solves the blind source separation problem.
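The general shape of such a method, a least-squares joint-diagonalization cost minimized by alternating least squares over the diagonal factors and the mixing matrix, can be sketched for real symmetric targets. This is a generic ALS scheme under assumed names and sizes, not the STBJD algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(2)
k, n_mat = 4, 6
A_true = rng.standard_normal((k, k))
# Exactly jointly diagonalizable real symmetric targets C_i = A D_i A^T
Cs = [A_true @ np.diag(rng.standard_normal(k)) @ A_true.T for _ in range(n_mat)]

def cost(A, Ls):
    """Least-squares joint-diagonalization cost sum_i ||C_i - A diag(l_i) A^T||^2."""
    return sum(np.linalg.norm(C - A @ np.diag(l) @ A.T) ** 2 for C, l in zip(Cs, Ls))

# Start from a perturbed copy of the true mixing matrix (illustrative setup)
A = A_true + 0.05 * rng.standard_normal((k, k))

KR = np.einsum('ik,jk->ijk', A, A).reshape(k * k, k)   # Khatri-Rao product A (.) A
Ls = [np.linalg.lstsq(KR, C.ravel(), rcond=None)[0] for C in Cs]
c0 = cost(A, Ls)

for _ in range(20):
    # Diagonal step: with A fixed, vec(C_i) = (A (.) A) l_i is linear
    # least squares in each diagonal vector l_i.
    KR = np.einsum('ik,jk->ijk', A, A).reshape(k * k, k)
    Ls = [np.linalg.lstsq(KR, C.ravel(), rcond=None)[0] for C in Cs]
    # Mixing step: hold M_i = diag(l_i) A^T fixed; C_i = A M_i is linear
    # in A, with a closed-form pooled least-squares solution.
    Ms = [np.diag(l) @ A.T for l in Ls]
    num = sum(C @ M.T for C, M in zip(Cs, Ms))
    den = sum(M @ M.T for M in Ms)
    A = num @ np.linalg.inv(den)

# Refit the diagonals to the final A before evaluating the cost
KR = np.einsum('ik,jk->ijk', A, A).reshape(k * k, k)
Ls = [np.linalg.lstsq(KR, C.ravel(), rcond=None)[0] for C in Cs]
final_cost = cost(A, Ls)
```

Each alternation solves an exact linear least-squares subproblem, which is what makes the ALS formulation attractive; the paper's contribution lies in the complex-to-real transformation and in exploiting the resulting structure, which this sketch omits.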