Abstract: This letter investigates an improved blind source separation algorithm based on the Maximum Entropy (ME) criterion. The original ME algorithm chooses a fixed exponential or sigmoid function as the nonlinear mapping function, which cannot match the original signal very well. A parameter estimation method is employed in this letter to approximate the probability density function of any signal with a parameter-steered generalized exponential function. An improved learning rule and a natural gradient update formula for the unmixing matrix are also presented. The algorithm of this letter can separate mixtures of super-Gaussian signals as well as mixtures of sub-Gaussian signals. Simulation experiments demonstrate the efficiency of the algorithm.
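The natural-gradient update for the unmixing matrix mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the letter's exact method: the learning rate, iteration count, and the particular generalized-exponential score function phi(y) = sign(y)·|y|^(alpha-1) are illustrative assumptions (alpha would be estimated from the data in the paper's approach, with alpha > 2 suiting sub-Gaussian and alpha < 2 super-Gaussian sources).

```python
import numpy as np

def natural_gradient_ica(X, n_iter=200, lr=0.01, alpha=3.0):
    """Hypothetical sketch of natural-gradient ME separation.

    X : array of shape (n_sources, n_samples), the mixed observations.
    alpha : assumed exponent of a generalized-exponential score
            phi(y) = sign(y) * |y|**(alpha - 1).
    Returns an estimated unmixing matrix W.
    """
    n, n_samples = X.shape
    W = np.eye(n)
    for _ in range(n_iter):
        Y = W @ X                                  # current source estimates
        phi = np.sign(Y) * np.abs(Y) ** (alpha - 1)
        # Natural-gradient direction: dW = (I - E[phi(y) y^T]) W
        dW = (np.eye(n) - (phi @ Y.T) / n_samples) @ W
        W += lr * dW
    return W
```

The natural gradient multiplies the ordinary gradient by W on the right, which avoids inverting W at each step and keeps the update well conditioned.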
Funding: Project supported by the National Natural Science Foundation of China (No. 10171051) and the Youth Teacher Foundation of Nankai University.
Abstract: In order to construct estimating functions in some parametric models, this paper introduces two classes of information matrices. Some necessary and sufficient conditions for the information matrices to achieve their upper bounds are given. For the problem of estimating the median, some optimum estimating functions based on the information matrices are acquired. Under some regularity conditions, an approach to finding the best basis function is introduced. In nonlinear regression models, an optimum estimating function based on the information matrices is obtained. Some examples are given to illustrate the results. Finally, the concept of the optimum estimating function and the methods of constructing optimum estimating functions are developed in more general statistical models.
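The median-estimation problem mentioned in the abstract can be illustrated with a small sketch. A classical estimating function for the median theta is g(theta) = sum_i sign(x_i - theta), whose root is the sample median; the bisection solver and the test data below are hypothetical choices, not taken from the paper.

```python
import numpy as np

def g(theta, x):
    # Estimating function for the median: sum of sign(x_i - theta).
    # Its root (in theta) is a sample median of the data x.
    return np.sum(np.sign(x - theta))

def solve_median(x, tol=1e-8):
    # g is monotone non-increasing in theta, so bisection finds a root.
    lo, hi = x.min(), x.max()
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid, x) > 0:
            lo = mid       # theta too small: more points lie above it
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Unlike the mean (the root of sum_i (x_i - theta)), this estimating function is bounded in the observations, which is why the resulting estimator is robust to outliers.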