Abstract
Existing manifold learning algorithms use the Euclidean distance to measure the proximity of data points. In high-dimensional space, however, Minkowski metrics are no longer stable, because the ratio of the distances from a given query to its nearest and farthest neighbors approaches one. This degrades the performance of manifold learning algorithms when they are applied to dimensionality reduction of high-dimensional data. We introduce to manifold learning a new distance function, named shrinkage-divergence-proximity (SDP), which remains meaningful in any high-dimensional space. In light of this theoretical result, an improved locally linear embedding (LLE) algorithm, named SDP-LLE, is proposed. Experiments are conducted on a hyperspectral data set and an image segmentation data set. The results show that the proposed method efficiently reduces the dimensionality while achieving higher classification accuracy.
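The distance-concentration effect mentioned above can be checked empirically. The following is a minimal sketch, not the authors' code: it draws uniform random points, computes the ratio of the nearest to the farthest Euclidean distance from a random query, and shows the ratio tending toward one as the dimensionality grows. The function name `nearest_to_farthest_ratio` and all parameters are illustrative assumptions.

```python
# Minimal sketch (not from the paper): empirically illustrates the
# distance-concentration effect referred to in the abstract, using
# uniformly distributed random data.
import numpy as np

def nearest_to_farthest_ratio(n_points=1000, dim=2, seed=0):
    """Ratio of nearest to farthest Euclidean distance from a random query."""
    rng = np.random.default_rng(seed)
    data = rng.random((n_points, dim))
    query = rng.random(dim)
    dists = np.linalg.norm(data - query, axis=1)
    return dists.min() / dists.max()

# As the dimensionality grows, the ratio tends toward 1, i.e. the nearest
# and farthest neighbors become nearly indistinguishable under the L2 metric.
for d in (2, 10, 100, 1000):
    print(d, round(nearest_to_farthest_ratio(dim=d), 3))
```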
Funding
the Graduate Starting Seed Fund of Northwestern Polytechnical University (No. Z200760)
the Innovation Fund of Northwestern Polytechnical University.