
A Model Selection Optimization Criterion and Its Application to Unsupervised Classification of Hyperspectral Images (cited by: 1)

An Unsupervised Classification Method Based on a Model Selection Criterion for Hyperspectral Data
Abstract: Choosing an appropriate number of classes is a key problem in unsupervised classification. For unsupervised classification of hyperspectral images under Gaussian mixture modeling, this paper proposes a principal component analysis (PCA)-based minimum description length (MDL) type model selection criterion, termed PMDL, to determine the number of classes. It rests on the fact that the principal components retained after the PCA transform explain different amounts of the data variance and should therefore be encoded with different code lengths, so each dimension is weighted accordingly when the description length is computed. During classification, the mixture model is estimated on the PCA-projected data with the expectation-maximization (EM) algorithm under a merging strategy, and the proposed criterion is used for model selection, which determines the number of classes. Experiments on synthetic data confirm the validity and good performance of the new criterion, and on real data the algorithm performs effectively, selecting a proper class number without any prior information.
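The pipeline the abstract describes (PCA projection, Gaussian mixtures fit by EM, and a variance-weighted MDL-type score to pick the class number) can be sketched as follows. The abstract does not give the paper's exact PMDL formula or its merge strategy, so the details below are illustrative assumptions: the per-dimension weights come from the explained-variance ratios of the retained principal components, the parameter-coding cost is scaled by those weights inside an otherwise standard MDL score, and scikit-learn's stock EM (without merging) is used for fitting.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

# Synthetic stand-in for hyperspectral pixels: 3 well-separated
# Gaussian classes in 10 spectral bands, 200 pixels each.
rng = np.random.default_rng(0)
means = rng.normal(0.0, 5.0, size=(3, 10))
X = np.vstack([rng.normal(m, 1.0, size=(200, 10)) for m in means])

# Step 1: PCA projection, keeping the leading principal components.
pca = PCA(n_components=4).fit(X)
Z = pca.transform(X)

# Per-dimension weights from the variance each retained PC explains
# (an illustrative stand-in for the paper's weighting scheme).
dim_w = Z.shape[1] * pca.explained_variance_ratio_

def pmdl(gmm, Z, dim_w):
    """MDL-style score with per-dimension weighted parameter cost.

    Each component has one mean and one (diagonal) variance parameter
    per dimension; dimension j's two parameters are charged a coding
    cost scaled by dim_w[j]. Mixing proportions are charged uniformly.
    """
    n, d = Z.shape
    k = gmm.n_components
    neg_loglik = -gmm.score(Z) * n                # total -log likelihood
    n_params = (k - 1) + 2.0 * k * dim_w.sum()    # weighted parameter count
    return neg_loglik + 0.5 * n_params * np.log(n)

# Step 2: fit mixtures of increasing order and pick the PMDL minimum.
scores = {}
for k in range(1, 7):
    gmm = GaussianMixture(n_components=k, covariance_type="diag",
                          n_init=3, random_state=0).fit(Z)
    scores[k] = pmdl(gmm, Z, dim_w)
best_k = min(scores, key=scores.get)
print("selected number of classes:", best_k)
```

Because the three synthetic classes are well separated relative to the within-class noise, the likelihood term stops improving meaningfully beyond three components while the weighted penalty keeps growing, so the score is minimized at the true class number.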
Source: Acta Electronica Sinica (《电子学报》; indexed in EI, CAS, CSCD; Peking University core journal), 2003, Supplement 1, pp. 2154-2157 (4 pages).
Keywords: unsupervised classification; Gaussian mixture model; expectation-maximization algorithm; principal component analysis; minimum description length principle

