Dimensionality reduction and data visualization are useful and important processes in pattern recognition. Many techniques have been developed in recent years. The self-organizing map (SOM) can be an efficient method for this purpose. This paper reviews recent advances in this area and related approaches such as multidimensional scaling (MDS), nonlinear PCA, and principal manifolds, as well as the connections of the SOM and its recent variant, the visualization induced SOM (ViSOM), with these approaches. The SOM is shown to produce a quantized, qualitative scaling, while the ViSOM produces a quantitative or metric scaling and approximates a principal curve/surface. The SOM can also be regarded as a generalized MDS that relates two metric spaces by forming a topological mapping between them. The relationships among various recently proposed techniques such as ViSOM, Isomap, LLE, and the Laplacian eigenmap are discussed and compared.
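For context, the following is a minimal sketch of the basic online SOM update that the abstract builds on; the grid size, learning-rate and neighbourhood schedules, and the function name train_som are illustrative assumptions, not the settings or the ViSOM variant discussed in the paper.

```python
import numpy as np

def train_som(data, grid_shape=(10, 10), n_iter=1000, lr0=0.5, sigma0=3.0, seed=0):
    """Minimal online SOM sketch (illustrative; not the paper's ViSOM)."""
    rng = np.random.default_rng(seed)
    n_rows, n_cols = grid_shape
    dim = data.shape[1]
    # One codebook (weight) vector per grid node, randomly initialized.
    weights = rng.normal(size=(n_rows, n_cols, dim))
    # Grid coordinates, used by the neighbourhood function.
    grid = np.stack(np.meshgrid(np.arange(n_rows), np.arange(n_cols), indexing="ij"), axis=-1)

    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        # Best-matching unit (BMU): node whose weight is closest to x.
        dists = np.linalg.norm(weights - x, axis=-1)
        bmu = np.unravel_index(np.argmin(dists), grid_shape)
        # Exponentially decaying learning rate and neighbourhood radius.
        lr = lr0 * np.exp(-t / n_iter)
        sigma = sigma0 * np.exp(-t / n_iter)
        # Gaussian neighbourhood on the grid, centred at the BMU.
        grid_dist2 = np.sum((grid - np.array(bmu)) ** 2, axis=-1)
        h = np.exp(-grid_dist2 / (2 * sigma ** 2))
        # Move every node's weight towards x, weighted by the neighbourhood.
        weights += lr * h[..., None] * (x - weights)
    return weights

# Example: map 3-D points onto a 10 x 10 grid.
W = train_som(np.random.default_rng(1).normal(size=(200, 3)))
```

Each data point is then visualized at the grid position of its best-matching unit, which yields the quantized, qualitative scaling mentioned above; the ViSOM additionally regularizes the map so that distances between nodes on the grid reflect distances in data space, giving the metric scaling and principal-surface approximation.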
In this paper, we propose a novel uplink power control algorithm, SMST, for multiple-input multiple-output orthogonal frequency-division multiple access (MIMO-OFDMA). We perform extensive system-level simulations to compare different uplink power control algorithms, including the fractional power control (FPC) adopted in 3GPP LTE and LTE-Advanced. Simulations show that SMST, adopted in IEEE 802.16m, outperforms the other algorithms in terms of spectral efficiency, cell-edge performance, interference control, and the trade-off between sector-accumulated throughput and cell-edge user throughput. The SMST performance gain over FPC can be more than 40%.
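For context, the FPC baseline referred to above can be summarized by the simplified open-loop LTE rule in which a user's transmit power is set from a target received power P0, its path loss, and a fractional compensation factor alpha. The sketch below illustrates only that baseline rule (the SMST algorithm itself is not reproduced here), and the function name and numeric defaults are assumptions for illustration.

```python
import math

def fpc_tx_power_dbm(path_loss_db, n_prb, p0_dbm=-80.0, alpha=0.8, p_max_dbm=23.0):
    """Simplified open-loop LTE fractional power control (FPC).

    P = min(P_max, P0 + 10*log10(M) + alpha * PL)  [dBm]
    where M is the number of allocated resource blocks and alpha in [0, 1]
    controls how much of the path loss is compensated. Closed-loop
    corrections and MCS-dependent offsets are omitted in this sketch.
    """
    p = p0_dbm + 10.0 * math.log10(n_prb) + alpha * path_loss_db
    return min(p_max_dbm, p)

# Example: a cell-edge user with 110 dB path loss scheduled on 10 PRBs.
print(fpc_tx_power_dbm(path_loss_db=110.0, n_prb=10))
```

With alpha < 1 the path loss is only partially compensated, trading cell-edge throughput against the inter-cell interference generated by edge users; this is the kind of trade-off against which SMST is evaluated in the simulations.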
Recent decades have witnessed a much increased demand for advanced, effective and efficient methods and tools for analyzing, understanding and dealing with data of increasing complexity, high dimensionality and large volume. Whether in biology, neuroscience, modern medicine and the social sciences, or in engineering and computer vision, data are being sampled, collected and accumulated at an unprecedented speed. It is no longer a trivial task to analyze huge amounts of high-dimensional data. A systematic, automated way of interpreting and representing data has become a great challenge facing almost all fields, and research in this emerging area has flourished. Several lines of research have embarked on this timely challenge, and tremendous progress has been made recently. Traditional and linear methods are being extended or enhanced to meet the new challenges. This paper elaborates on these recent advances and discusses various state-of-the-art algorithms proposed from statistics, geometry and adaptive neural networks. The developments mainly follow three lines: multidimensional scaling, eigen-decomposition, and principal manifolds. Neural approaches and adaptive or incremental methods are also reviewed. In the first line, traditional multidimensional scaling (MDS) has been extended not only to be more adaptive, as in NeuroScale, curvilinear component analysis (CCA) and the visualization induced self-organizing map (ViSOM) for online learning, but also to use more local scaling, as in Isomap, for enhanced flexibility on nonlinear data sets. The second line extends linear principal component analysis (PCA) and has attracted a huge amount of interest and enjoyed flourishing advances with methods like kernel PCA (KPCA), locally linear embedding (LLE) and the Laplacian eigenmap. The advantage is obvious: a nonlinear problem is transformed into a linear one, and a unique solution can then be sought. The third line starts with nonlinear principal curves and surfaces and links up with adaptive neural network approaches such as the self-organizing map (SOM) and ViSOM. Many of these frameworks have been further improved and enhanced for incremental learning and mapping function generalization. This paper discusses these recent advances and their connections. Their application and implementation issues are also briefly discussed and commented on.
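As a concrete anchor for the first of the three lines above, classical (metric) MDS recovers a low-dimensional configuration from pairwise distances by double-centring the squared-distance matrix and taking its leading eigenvectors. The snippet below is a minimal sketch of that classical step only, not of the adaptive extensions (NeuroScale, CCA, ViSOM) or of Isomap's geodesic preprocessing; the function name and the toy example are illustrative assumptions.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (metric) MDS from an n x n matrix of pairwise distances D.

    Steps: square the distances, double-centre, eigendecompose, and embed
    using the k leading eigenvectors scaled by the square roots of their
    eigenvalues.
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centring matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centred Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)         # eigenvalues in ascending order
    idx = np.argsort(eigvals)[::-1][:k]          # top-k eigenpairs
    scale = np.sqrt(np.clip(eigvals[idx], 0, None))  # guard against tiny negatives
    return eigvecs[:, idx] * scale               # n x k embedding

# Example: distances between four collinear points are recovered in 1-D.
X = np.array([[0.0], [1.0], [3.0], [6.0]])
D = np.abs(X - X.T)
print(classical_mds(D, k=1))
```

Isomap replaces the input distances with graph-based geodesic estimates before this eigendecomposition, while KPCA, LLE and the Laplacian eigenmap solve related local or kernelized eigenproblems; this is the sense in which the second line above turns a nonlinear problem into a linear one with a unique solution.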