Abstract: Retrieving the most similar objects in a large-scale database for a given query is a fundamental building block in many application domains, ranging from web search and visual and cross-media retrieval to document retrieval. State-of-the-art approaches have mainly focused on capturing the underlying geometry of the data manifolds. Graph-based approaches, in particular, define various diffusion processes on weighted data graphs. Despite their success, these approaches rely on fixed-weight graphs, making the ranking sensitive to the input affinity matrix. In this study, we propose a new ranking algorithm that simultaneously learns the data affinity matrix and the ranking scores. The proposed optimization formulation assigns adaptive neighbors to each data point based on the local connectivity, and the smoothness constraint assigns similar ranking scores to similar data points. We develop a novel and efficient algorithm to solve the optimization problem. Evaluations on synthetic and real datasets suggest that the proposed algorithm can outperform existing methods.
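The abstract does not give the update equations, but the idea of coupling adaptive neighbor assignment with score propagation can be illustrated with a small sketch. The code below is a hedged illustration, not the paper's algorithm: it assigns each point sparse neighbor weights from its local distances and then diffuses a query indicator over the resulting graph with a manifold-ranking-style iteration. The function name, the parameters `k`, `alpha`, `n_iter`, and the specific weight formula are all assumptions.

```python
import numpy as np

def adaptive_neighbor_ranking(X, query_idx, k=10, alpha=0.9, n_iter=50):
    """Hedged sketch: sparse adaptive neighbor weights + score diffusion."""
    n = X.shape[0]
    # pairwise squared Euclidean distances
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    np.fill_diagonal(D, np.inf)

    # Step 1: each point gets k nonzero weights that sum to one and
    # decrease with distance (assumed rule, not the paper's exact formula).
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[:k + 1]
        d = D[i, idx]
        w = d[k] - d[:k]                     # closer neighbors -> larger weight
        W[i, idx[:k]] = w / (w.sum() + 1e-12)
    W = (W + W.T) / 2                        # symmetrize the learned graph

    # Step 2: manifold-ranking-style diffusion of a query indicator vector.
    deg = W.sum(axis=1)
    S = W / (np.sqrt(np.outer(deg, deg)) + 1e-12)
    y = np.zeros(n)
    y[query_idx] = 1.0
    f = y.copy()
    for _ in range(n_iter):
        f = alpha * S @ f + (1 - alpha) * y  # smoothness vs. fitting trade-off
    return f                                 # higher score = more similar

# usage: rank 200 random points against point 0 as the query
X = np.random.RandomState(0).randn(200, 5)
ranking = np.argsort(-adaptive_neighbor_ranking(X, query_idx=0))
```

In the paper the affinity matrix and the ranking scores are learned jointly in one optimization; here the two steps are simply performed in sequence to keep the sketch short.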
Funding: Sponsored by the National Natural Science Foundation of China (Grant Nos. 61101122 and 61071105), the Fundamental Research Funds for the Central Universities (Grant No. HIT.NSRIF.2010090), the Science and Technology on Information Transmission and Dissemination in Communication Networks Laboratory (Grant No. ITD-U12004), and the Postdoctoral Science Research Development Foundation of Heilongjiang Province (Grant No. LBH-Q12080).
Abstract: Recently, manifold learning algorithms for dimensionality reduction have attracted growing interest, and various linear and nonlinear, global and local algorithms have been proposed. The key step of a manifold learning algorithm is the selection of the neighboring region. However, to the best of our knowledge, no generally accepted algorithm for selecting the neighboring region well has been proposed in the literature. In this paper, we therefore propose an adaptive neighbor selection algorithm, which we successfully apply to the LLE and ISOMAP algorithms in our tests. The algorithm finds the optimal number K of nearest neighbors for each data point on the manifold; its theoretical basis is the approximated curvature of the data point on the manifold. Based on Riemannian geometry, the Jacobian matrix is a proper mathematical tool for estimating this approximated curvature. We verify the proposed algorithm by embedding the Swiss roll from R^3 to R^2 with LLE and ISOMAP. The simulation results show that the proposed adaptive neighbor selection algorithm is feasible and able to find the optimal value of K, yielding a relatively small residual variance and better visualization of the results. By quantitative analysis, the embedding quality measured by residual variance is improved by 45.45% when the proposed algorithm is used with LLE.
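As a point of reference for the quality measure quoted above, the sketch below sweeps K by brute force and scores each LLE embedding of a Swiss roll by its residual variance. This is not the paper's curvature-based (Jacobian) selection rule, only a baseline showing how residual variance varies with K; the K range and sample size are arbitrary choices.

```python
import numpy as np
from scipy.spatial.distance import pdist
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

def residual_variance(X_high, X_low):
    """1 - R^2 between pairwise distances before and after embedding
    (the quality measure quoted in the abstract)."""
    d_high, d_low = pdist(X_high), pdist(X_low)
    r = np.corrcoef(d_high, d_low)[0, 1]
    return 1.0 - r ** 2

# Swiss roll from R^3 to R^2, as in the abstract's experiment
X, _ = make_swiss_roll(n_samples=1000, random_state=0)

# Brute-force sweep over K; the paper instead estimates a suitable K from the
# local curvature, which avoids trying every value.
best_k, best_rv = None, np.inf
for k in range(5, 31, 5):
    Y = LocallyLinearEmbedding(n_neighbors=k, n_components=2,
                               random_state=0).fit_transform(X)
    rv = residual_variance(X, Y)
    if rv < best_rv:
        best_k, best_rv = k, rv
print(f"best K = {best_k}, residual variance = {best_rv:.4f}")
```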
Funding: This study was financially supported by the National Natural Science Foundation of China (61172127), the Research Fund for the Doctoral Program of Higher Education (KJQN1114), the Anhui Provincial Natural Science Foundation (1308085QC58), the 211 Project Youth Scientific Research Fund of Anhui University, and the Provincial Natural Science Foundation of Anhui Universities (KJ2013A026).
Abstract: The locally linear embedding (LLE) algorithm has a distinct deficiency in practical applications: it requires users to select the neighborhood parameter k, which denotes the number of nearest neighbors. In this article, a new adaptive method based on supervised LLE is presented. A similarity measure is formed using the Fisher projection distance and is then used as a threshold for selecting k, so different samples produce different values of k adaptively, according to the density of the data distribution. The method is applied to classifying plant leaves. The experimental results show that the average classification rate of the new method reaches 92.4%, which is much better than the results of the traditional LLE and supervised LLE.
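A minimal sketch of a per-sample, density-dependent k is given below, assuming an LDA projection as the Fisher projection and a global distance quantile as the threshold. The paper derives its threshold from the Fisher projection distance itself, so the `quantile` parameter and the counting rule here are purely illustrative.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def adaptive_k_per_sample(X, y, quantile=0.1):
    """Per-sample neighborhood size from distances in the Fisher (LDA)
    projection space; the quantile-based threshold is an assumption."""
    Z = LinearDiscriminantAnalysis().fit_transform(X, y)   # Fisher projection
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    tau = np.quantile(D[np.isfinite(D)], quantile)         # distance threshold
    # k_i = number of samples closer than tau; denser regions get a larger k
    return np.maximum((D < tau).sum(axis=1), 1)

# usage on a toy labelled set with three classes
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(50, 4) + c for c in (0, 3, 6)])
y = np.repeat([0, 1, 2], 50)
print(adaptive_k_per_sample(X, y)[:10])
```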
Funding: Supported in part by the National Natural Science Foundation of China (Nos. 61373093, 61402310, 61672364, and 61672365) and the National Key Research and Development Program of China (No. 2018YFA0701701).
Abstract: In this paper, we propose an Unsupervised Nonlinear Adaptive Manifold Learning method (UNAML) that considers both global and local information. In this approach, we use unlabeled training samples to learn nonlinear manifold features, taking global pairwise distances into account while maintaining the local topological structure. Our method aims at minimizing the global pairwise distance errors as well as the local structural errors. In order to make UNAML more efficient and able to extract manifold features from new, out-of-sample data, we add a feature approximation error that can be used to learn a linear extractor. In addition, we use an adaptive neighbor selection method to compute the local structural errors, and we optimize the original algorithm with the kernel matrix method. Experimental results on real face and object datasets show that our algorithm is more effective than other feature extraction methods.
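To make the kind of objective described above concrete, the sketch below fits a linear extractor P that balances a global pairwise-distance error against a local LLE-style reconstruction error on toy data. The trade-off weight `lam`, the neighbor-weight construction, and the use of a generic L-BFGS solver are assumptions; the paper's formulation (including its kernelization) differs in the details.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import pdist, squareform

rng = np.random.RandomState(0)
X = rng.randn(120, 10)                  # toy data in 10-D
n, d, m = X.shape[0], X.shape[1], 2     # embed into 2-D

# local reconstruction weights from k nearest neighbors (LLE-style)
k = 8
D = squareform(pdist(X))
W = np.zeros((n, n))
for i in range(n):
    idx = np.argsort(D[i])[1:k + 1]
    G = X[idx] - X[i]
    w = np.linalg.solve(G @ G.T + 1e-3 * np.eye(k), np.ones(k))
    W[i, idx] = w / w.sum()

DX = pdist(X)
lam = 0.5                               # assumed global/local trade-off

def objective(p):
    P = p.reshape(d, m)
    Y = X @ P                           # linear feature extractor
    global_err = np.mean((pdist(Y) - DX) ** 2)              # global distances
    local_err = np.mean(np.sum((Y - W @ Y) ** 2, axis=1))   # local structure
    return global_err + lam * local_err

res = minimize(objective, rng.randn(d * m), method="L-BFGS-B")
Y = X @ res.x.reshape(d, m)             # low-dimensional features
```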
Abstract: Since Beijing and Zhangjiakou in neighboring Hebei Province won the bid to host the 2022 Winter Olympics, the entire country has become increasingly enthusiastic about winter sports. Skiing and skating are popular winter activities, while lesser-known games such as curling and ice hockey have also become familiar to sports lovers.