Funding: National Natural Science Foundation of China (No. 19901018)
Abstract: This paper introduces a dynamical-system (neural-network) algorithm for solving a least squares problem with orthogonality constraints, which has wide applications in computer vision and signal processing. A rigorous analysis of the convergence and stability of the algorithm is provided. Moreover, a so-called zero-extension technique is presented to keep the algorithm convergent to the desired result for any randomly chosen initial data. Numerical experiments illustrate the effectiveness and efficiency of the algorithm.
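For illustration, when the problem above is read as the balanced orthogonal Procrustes problem min_X ||AX - B||_F subject to X^T X = I, a closed-form baseline solution is available via the SVD of A^T B. The sketch below shows only this baseline; it is not the paper's dynamical-system (neural-network) algorithm, and the matrix sizes are arbitrary assumptions.

```python
# Illustrative baseline only: closed-form solution of the balanced orthogonal
# Procrustes problem  min_X ||A X - B||_F  s.t.  X^T X = I  via the SVD of A^T B.
import numpy as np

def procrustes_orthogonal(A, B):
    U, _, Vt = np.linalg.svd(A.T @ B)
    return U @ Vt

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 5))
X_true = np.linalg.qr(rng.standard_normal((5, 5)))[0]   # a random orthogonal target
B = A @ X_true
X = procrustes_orthogonal(A, B)
print(np.allclose(X.T @ X, np.eye(5)), np.linalg.norm(A @ X - B))
```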
Funding: This work is supported by the NNSF of China (No. 10271022).
Abstract: In this paper the new notion of multivariate least-squares orthogonal polynomials from the rectangular form is introduced. Their existence and uniqueness are studied and some methods for their recursive computation are given. As an application, a new family of Padé-type approximants in multivariables from the rectangular form is constructed.
Abstract: This paper introduces a new notion of weighted least-squares orthogonal polynomials in multivariables from the triangular form. Their existence and uniqueness are studied and some methods for their recursive computation are given. As an application, this paper constructs a new family of Padé-type approximants in multivariables from the triangular form.
Abstract: This paper is concerned with the application of the forward Orthogonal Least Squares (OLS) algorithm to the design of Finite Impulse Response (FIR) filters. The focus of this study is a new FIR filter design procedure and its comparison with the traditional method provided by MATLAB's fir2() routine.
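As a hedged illustration of the kind of comparison described above, the sketch below designs the same lowpass specification two ways in SciPy: firwin2 (the counterpart of MATLAB's fir2()) and firls (a standard least-squares design). Neither is the paper's forward-OLS procedure; the filter length and band edges are assumptions for the example.

```python
import numpy as np
from scipy import signal

numtaps = 65                          # odd length required by firls
freq = [0.0, 0.3, 0.4, 1.0]           # band edges (Nyquist normalized to 1)
gain = [1.0, 1.0, 0.0, 0.0]           # desired lowpass amplitude at the edges

h_fir2 = signal.firwin2(numtaps, freq, gain)   # frequency sampling + window, like fir2()
h_ls = signal.firls(numtaps, freq, gain)       # weighted least-squares design

w, H2 = signal.freqz(h_fir2, worN=512)
_, HL = signal.freqz(h_ls, worN=512)
print("max magnitude-response difference:",
      float(np.max(np.abs(np.abs(H2) - np.abs(HL)))))
```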
Abstract: The objective of modelling from data is not that the model simply fits the training data well. Rather, the goodness of a model is characterized by its generalization capability, interpretability and ease of knowledge extraction. All these desired properties depend crucially on the ability to construct appropriate parsimonious models in the modelling process, and a basic principle in practical nonlinear data modelling is the parsimonious principle of ensuring the smallest possible model that explains the training data. There exists a vast amount of work in the area of sparse modelling, and a widely adopted approach is based on linear-in-the-parameters data models, which include the radial basis function network, the neurofuzzy network and all the sparse kernel modelling techniques. A well-tested strategy for parsimonious modelling from data is the orthogonal least squares (OLS) algorithm for forward selection modelling, which is capable of constructing sparse models that generalise well. This contribution continues this theme and provides a unified framework for sparse modelling from data that includes regression and classification, which belong to supervised learning, and probability density function estimation, which is an unsupervised learning problem. The OLS forward selection method based on the leave-one-out test criteria is presented within this unified data-modelling framework. Examples from regression, classification and density estimation applications are used to illustrate the effectiveness of this generic approach to parsimonious modelling from data.
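A minimal sketch of the forward OLS selection loop at the core of this framework is given below. The candidate dictionary P, the toy data, and the simple error-reduction-ratio stopping rule are illustrative assumptions standing in for the leave-one-out criteria used in the paper.

```python
import numpy as np

def ols_forward_select(P, y, max_terms, err_tol=1e-4):
    """Greedy forward selection of columns of P by error reduction ratio (ERR)."""
    n, m = P.shape
    selected, Q = [], []                       # chosen indices / orthogonalized regressors
    yy = float(y @ y)
    for _ in range(max_terms):
        best_err, best_j, best_q = 0.0, None, None
        for j in range(m):
            if j in selected:
                continue
            q = P[:, j].copy()
            for qk in Q:                       # Gram-Schmidt against already selected terms
                q -= (qk @ P[:, j]) / (qk @ qk) * qk
            qq = float(q @ q)
            if qq < 1e-12:
                continue
            err = (q @ y) ** 2 / (qq * yy)     # error reduction ratio of candidate j
            if err > best_err:
                best_err, best_j, best_q = err, j, q
        if best_j is None or best_err < err_tol:
            break                              # no candidate reduces the error enough
        selected.append(best_j)
        Q.append(best_q)
    theta = np.linalg.lstsq(P[:, selected], y, rcond=None)[0]
    return selected, theta

# Toy usage: identify which of 20 candidate regressors actually generate y.
rng = np.random.default_rng(1)
P = rng.standard_normal((200, 20))
y = 3.0 * P[:, 4] - 2.0 * P[:, 11] + 0.05 * rng.standard_normal(200)
print(ols_forward_select(P, y, max_terms=5)[0])
```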
Abstract: This paper presents a two-level learning method for designing an optimal Radial Basis Function Network (RBFN) using the Adaptive Velocity Update Relaxation Particle Swarm Optimization (AVURPSO) algorithm and the Orthogonal Least Squares (OLS) algorithm, called the OLS-AVURPSO method. The novelty is to develop an AVURPSO algorithm and combine it with OLS into the hybrid OLS-AVURPSO method for designing an optimal RBFN. At the upper level, the proposed method finds the global optimum of the spread factor parameter using AVURPSO, while at the lower level it automatically constructs the RBFN using the OLS algorithm. Simulation results confirm that the RBFN is superior to the Multilayered Perceptron Network (MLPN) in terms of network size and computing time. To demonstrate the effectiveness of the proposed OLS-AVURPSO method in the design of the RBFN, the Mackey-Glass chaotic time series is modelled as an example by both MLPN and RBFN.
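The two-level idea can be sketched as follows, with a plain grid search standing in for AVURPSO at the upper level and a ridge-regularized fit over all training inputs as candidate centres standing in for the OLS construction at the lower level; all settings are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_design(X, centres, spread):
    """Gaussian RBF design matrix between inputs X and the given centres."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * spread ** 2))

def fit_rbfn(X, y, spread, lam=1e-6):
    Phi = rbf_design(X, X, spread)              # all training points as candidate centres
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(len(X)), Phi.T @ y)

rng = np.random.default_rng(2)
X = rng.uniform(-3, 3, size=(80, 1)); y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(80)
Xv = rng.uniform(-3, 3, size=(40, 1)); yv = np.sin(Xv[:, 0])

best = (np.inf, None)
for spread in (0.1, 0.3, 0.5, 1.0, 2.0):        # upper level: search over the spread factor
    w = fit_rbfn(X, y, spread)                   # lower level: construct the network weights
    mse = float(np.mean((rbf_design(Xv, X, spread) @ w - yv) ** 2))
    best = min(best, (mse, spread))
print("chosen spread:", best[1], " validation MSE:", round(best[0], 5))
```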
Abstract: This note explores the relations between two different methods. The first one is the Alternating Least Squares (ALS) method for calculating a rank-k approximation of a real m×n matrix A. This method has important applications in nonnegative matrix factorizations, in matrix completion problems, and in tensor approximations. The second method is called Orthogonal Iterations; other names for this method are Subspace Iterations, Simultaneous Iterations, and the block power method. Given a real symmetric matrix G, this method computes k dominant eigenvectors of G. To see the relation between these methods we assume that G = A^T A. It is shown that in this case the two methods generate the same sequence of subspaces, and the same sequence of low-rank approximations. This equivalence provides new insight into the convergence properties of both methods.
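The claimed equivalence is easy to check numerically: starting both methods from the same subspace, an ALS sweep on A and an orthogonal-iteration step on G = A^T A span the same k-dimensional subspace at every step. The sketch below (sizes and seed are arbitrary assumptions) compares the two subspaces through the cosines of their principal angles.

```python
import numpy as np

rng = np.random.default_rng(3)
m, n, k = 60, 40, 5
A = rng.standard_normal((m, n))
G = A.T @ A

V0 = np.linalg.qr(rng.standard_normal((n, k)))[0]   # common starting subspace
V_als, V_oi = V0.copy(), V0.copy()

for _ in range(5):
    # one ALS sweep for  min ||A - U V^T||_F
    U = np.linalg.lstsq(V_als, A.T, rcond=None)[0].T   # solve for U with V fixed
    V_als = np.linalg.lstsq(U, A, rcond=None)[0].T     # solve for V with U fixed
    # one orthogonal-iteration (subspace-iteration) step on G
    V_oi = np.linalg.qr(G @ V_oi)[0]

# principal angles between the two k-dimensional subspaces: all cosines should be ~1
Q_als = np.linalg.qr(V_als)[0]
print(np.round(np.linalg.svd(Q_als.T @ V_oi, compute_uv=False), 10))
```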
Abstract: A decomposition of a graph H is a partition of the edge set of H into edge-disjoint subgraphs G_1, ..., G_s. If every G_i is isomorphic to a fixed graph G, the partition is a decomposition of H by G. Two decompositions of the complete bipartite graph are orthogonal if any member of the first and any member of the second share at most one edge. A set of k decompositions of the complete bipartite graph is a set of k mutually orthogonal graph squares (MOGS) if every two of them are orthogonal. For any bipartite graph G with n edges, the maximum number k attainable in a largest possible set of such MOGS by G is considered. Our objective in this paper is to compute this number when G is a path of length d with d + 1 vertices (i.e., every edge of this path corresponds one-to-one to an edge of a subgraph isomorphic to a certain graph F).
Abstract: Analysis of stock recruitment (SR) data is most often done by fitting various SR relationship curves to the data. Fish population dynamics data often have stochastic variations and measurement errors, which usually result in a biased regression analysis. This paper presents a robust regression method, least median of squared orthogonal distance (LMD), which is insensitive to abnormal values in the dependent and independent variables in a regression analysis. Outliers that have significantly different variance from the rest of the data can be identified in a residual analysis. Then, the least squares (LS) method is applied to the SR data with the identified outliers down-weighted. The application of LMD and the LMD-based Reweighted Least Squares (RLS) method to simulated and real fisheries SR data is explored.
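A hedged sketch of a least-median-of-squared-orthogonal-distances fit for a straight line is shown below, using the common elemental-set search over all point pairs. It is an approximation for illustration only, not necessarily the authors' LMD algorithm.

```python
import numpy as np
from itertools import combinations

def lmd_line(x, y):
    """Line y = a + b x minimizing the median of squared orthogonal distances."""
    best = (np.inf, None, None)
    for i, j in combinations(range(len(x)), 2):
        if x[i] == x[j]:
            continue
        b = (y[j] - y[i]) / (x[j] - x[i])
        a = y[i] - b * x[i]
        d2 = (b * x - y + a) ** 2 / (b ** 2 + 1)   # squared orthogonal distances
        med = float(np.median(d2))
        if med < best[0]:
            best = (med, a, b)
    return best                                    # (median distance^2, intercept, slope)

rng = np.random.default_rng(4)
x = np.linspace(0, 10, 40)
y = 2.0 + 0.5 * x + rng.normal(0, 0.2, 40)
y[:5] += 8.0                                       # a few gross outliers
print(lmd_line(x, y)[1:])                          # roughly (2.0, 0.5) despite the outliers
```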
Abstract: For square contingency tables with ordered categories, this article proposes new models, which are extensions of Tomizawa's [1] diagonal exponent symmetry model. It also gives the decomposition of the proposed models, and shows the orthogonality of the test statistics for the decomposed models. Examples are given, and simulation studies based on the bivariate normal distribution are also presented.
Funding: Supported by the National Natural Science Foundation of China (61907014, 11871248, 11701410, 61901160) and the Youth Science Foundation of Henan Normal University (2019QK03).
Abstract: As a greedy algorithm for the recovery of sparse signals, multiple orthogonal least squares (MOLS) has recently attracted quite a bit of attention. In this paper, we consider the number of iterations required by the MOLS algorithm for the recovery of a K-sparse signal x∈R^(n). We show that MOLS provides stable reconstruction of all K-sparse signals x from y = Ax + w in ⌈6K/M⌉ iterations when the matrix A satisfies the restricted isometry property (RIP) with isometry constant δ_(7K) ≤ 0.094. Compared with the existing results, our sufficient condition does not depend on the sparsity level K.
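A minimal sketch of an MOLS-style recovery loop is given below: at each iteration the L columns with the largest OLS selection statistic are added to the support, and the residual is refreshed by a least-squares fit. The dictionary sizes, the noiseless setting, and the stopping tolerance are illustrative assumptions.

```python
import numpy as np

def mols(A, y, K, L=2):
    """Recover a K-sparse x from y = A x by selecting L atoms per iteration."""
    m, n = A.shape
    support, r = [], y.copy()
    x_s = np.zeros(0)
    for _ in range(K):                            # at most K iterations
        if support:
            Q = np.linalg.qr(A[:, support])[0]
            A_perp = A - Q @ (Q.T @ A)            # columns orthogonal to current span
        else:
            A_perp = A.copy()
        norms = np.linalg.norm(A_perp, axis=0)
        norms[norms < 1e-12] = np.inf             # never re-select spanned columns
        stat = np.abs(A_perp.T @ r) / norms       # OLS selection statistic per column
        if support:
            stat[support] = -np.inf
        support.extend(int(j) for j in np.argsort(stat)[-L:])
        x_s = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
        r = y - A[:, support] @ x_s
        if np.linalg.norm(r) < 1e-10:
            break
    x = np.zeros(n)
    x[support] = x_s
    return x

# Noiseless toy example; the recovery error is expected to be ~0 here.
rng = np.random.default_rng(5)
A = rng.standard_normal((64, 256)); A /= np.linalg.norm(A, axis=0)
x0 = np.zeros(256); x0[rng.choice(256, 6, replace=False)] = 1.0 + rng.random(6)
print(round(float(np.linalg.norm(mols(A, A @ x0, K=6) - x0)), 8))
```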
Abstract: We present a numerical method for solving the indefinite least squares problem. We first normalize the coefficient matrix. Then we compute the hyperbolic QR factorization of the normalized matrix. Finally, we compute the solution by solving several triangular systems. We give a first-order error analysis to show that the method is backward stable. The method is more efficient than the backward stable method proposed by Chandrasekaran, Gu and Sayed.
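For orientation, the indefinite least squares problem min_x (b - Ax)^T J (b - Ax) with J = diag(I_p, -I_q) can also be solved through its normal equations A^T J A x = A^T J b whenever A^T J A is positive definite. The sketch below uses that simple baseline for illustration; it is not the hyperbolic QR method of the paper, and the problem sizes are assumptions.

```python
import numpy as np

def ils_normal_equations(A, b, p):
    """Solve min_x (b - A x)^T J (b - A x), J = diag(I_p, -I_q), via normal equations."""
    m = A.shape[0]
    J = np.diag(np.r_[np.ones(p), -np.ones(m - p)])
    M = A.T @ J @ A                          # assumed symmetric positive definite
    return np.linalg.solve(M, A.T @ J @ b)

rng = np.random.default_rng(6)
p, q, n = 100, 5, 4                          # many "+" rows keep A^T J A positive definite
A = rng.standard_normal((p + q, n))
x_true = rng.standard_normal(n)
b = A @ x_true + 0.01 * rng.standard_normal(p + q)
print(np.linalg.norm(ils_normal_equations(A, b, p) - x_true))
```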
Funding: Supported by the Scientific and Technological Research Project of Henan Province (grant no. 242102310549), the Key Research and Development Programme of Henan Province (grant no. 231111312700), the National Natural Science Foundation of China (grant no. 82104329), the National Key Research and Development Programme of China (grant no. 2017YFC1702800), and the special funds for starting scientific research of Henan University of Chinese Medicine (grant no. 00104311-2021-1-41).
Abstract: Lonicerae Japonicae Flos is a significant food and traditional Chinese medicine, known as a plant antibiotic. It has rich chemical constituents and significant pharmacological effects. The antitumor activity of Lonicerae Japonicae Flos has been established, but its spectrum-effect relationship has not been reported, and the compounds responsible for its antitumor activity are still unknown. In this study, products of Lonicerae Japonicae Flos processed at different temperatures were taken as experimental materials and tested on SMMC-7721, A549, and MGC80-3 cells. The orthogonal partial least squares regression method was used to relate the compounds shared by the different processed products to the antitumor activity. The results show that the processed products have a stronger inhibitory effect on A549 and MGC80-3 cells than on SMMC-7721 cells. Compounds such as secologanic acid, isochlorogenic acid A, serotonin, and chlorogenic acid play an important role in the antitumor effects.