
Approximate Weighted Kernel k-means for Large-Scale Spectral Clustering (cited 30 times)
Abstract: Spectral clustering, a method rooted in algebraic graph theory, recasts the clustering problem as a graph partitioning problem. To optimize the graph-cut objective, the properties of the Rayleigh quotient are typically exploited: the original data points are mapped into a low-dimensional eigenspace by computing the eigenvectors of the Laplacian matrix, and clustering is then performed in that space. However, storing the similarity matrix requires O(n^2) space, and the eigendecomposition of the Laplacian matrix generally takes O(n^3) time; such complexity is unacceptable for large-scale data sets. It can be proved that both normalized-cut graph clustering and weighted kernel k-means are equivalent to a matrix trace maximization problem, so the weighted kernel k-means algorithm can be used to optimize the normalized-cut objective without eigendecomposing the Laplacian matrix. Nonetheless, weighted kernel k-means still needs to compute the kernel matrix, whose space complexity remains O(n^2). To address this challenge, this study proposes an approximate weighted kernel k-means algorithm that uses only part of the kernel matrix to solve large-scale spectral clustering problems. Theoretical analysis and experimental comparison show that approximate weighted kernel k-means achieves clustering performance similar to that of weighted kernel k-means while greatly reducing time and space complexity.
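To make the pipeline in the abstract concrete, the following NumPy sketch shows (a) weighted kernel k-means computed purely from kernel values, and (b) a landmark-based (Nyström-style) approximation built from a few sampled columns of the kernel matrix. This is a minimal illustration, not the authors' implementation: the helper names (`rbf_kernel`, `weighted_kernel_kmeans`), the toy data, and the choice of landmark columns are our assumptions, and the Nyström-style sampling stands in generically for the paper's approximation scheme.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Pairwise RBF (Gaussian) kernel between the rows of A and the rows of B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def weighted_kernel_kmeans(K, w, k, init, n_iter=50):
    """Weighted kernel k-means using only kernel evaluations.

    The squared distance from phi(x_i) to the weighted mean m_c of cluster c is
        ||phi(x_i) - m_c||^2 = K_ii - (2/s_c) * sum_j w_j K_ij
                               + (1/s_c^2) * sum_{j,l} w_j w_l K_jl,
    where the sums run over members of cluster c and s_c = sum_j w_j.
    """
    n = K.shape[0]
    labels = init.copy()
    diag = np.diag(K)
    for _ in range(n_iter):
        dist = np.full((n, k), np.inf)
        for c in range(k):
            mask = labels == c
            wc = w[mask]
            s = wc.sum()
            if s == 0:
                continue  # empty cluster: leave its distances at infinity
            second = K[:, mask] @ wc / s
            third = wc @ K[np.ix_(mask, mask)] @ wc / s**2
            dist[:, c] = diag - 2.0 * second + third
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break  # assignments stable: converged
        labels = new_labels
    return labels

# Two well-separated blobs of three points each (toy data for illustration).
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [10.0, 10.0], [10.1, 10.0], [10.0, 10.1]])
K = rbf_kernel(X, X, gamma=0.5)
w = np.ones(len(X))  # uniform weights; the normalized-cut equivalence uses degree weights

# Deliberately bad alternating initialization; the iterations recover the blobs.
labels = weighted_kernel_kmeans(K, w, k=2, init=np.array([0, 1, 0, 1, 0, 1]))

# Nyström-style approximation from a few sampled columns of K: with landmark
# indices idx, K ~= C @ pinv(W) @ C.T, where C = K[:, idx] and W = K[idx][:, idx].
# Only the n-by-m block C is ever stored, avoiding the O(n^2) kernel matrix.
idx = [0, 3]  # one landmark per blob (a hypothetical sampling choice)
C = K[:, idx]
W = K[np.ix_(idx, idx)]
K_approx = C @ np.linalg.pinv(W) @ C.T
```

Under the trace-maximization equivalence the abstract invokes, running this same assignment loop with degree-based weights and a suitably constructed (possibly regularized) kernel derived from the graph affinity matrix optimizes the normalized-cut objective without any eigendecomposition; the column-sampling step is what removes the remaining O(n^2) kernel storage.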
Source: Journal of Software (《软件学报》; EI, CSCD, Peking University Core Journal), 2015, Issue 11, pp. 2836-2846 (11 pages)
Funding: National Basic Research Program of China (973 Program) (2013CB329502); National Natural Science Foundation of China (61379101); Graduate Research and Innovation Program of Jiangsu Province Universities (KYLX15_1442)
Keywords: spectral clustering; trace maximization; weighted kernel k-means; approximate kernel matrix; big data
