Abstract: Compositional data, such as relative information, is a crucial aspect of machine learning and related fields. It is typically recorded as closed data, i.e., data that sum to a constant such as 100%. The linear regression model is a widely used statistical technique for identifying relationships between variables of interest, and maximum likelihood estimation (MLE) is the method of choice for estimating its parameters, which are useful for tasks such as prediction and analyzing the partial effects of independent variables. However, data quality is a significant challenge in machine learning, and many datasets contain missing observations whose recovery can be costly and time-consuming. The expectation-maximization (EM) algorithm has been proposed for such situations: it iteratively finds maximum likelihood or maximum a posteriori (MAP) estimates of parameters in statistical models that depend on unobserved variables or data. Using the current parameter estimate, the expectation (E) step constructs the expected log-likelihood function; the maximization (M) step then finds the parameters that maximize this expected log-likelihood. This study examined how well the EM algorithm performed on a simulated compositional dataset with missing observations, using both ordinary least squares and robust least squares regression. The efficacy of the EM algorithm was compared with two alternative imputation techniques, k-Nearest Neighbor (k-NN) and mean imputation, in terms of Aitchison distances and covariance.
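The Aitchison distance used for this comparison has a simple closed form: it is the Euclidean distance between the centered log-ratio (clr) transforms of the two compositions. A minimal sketch (NumPy assumed; the function names are illustrative, not from the study):

```python
import numpy as np

def clr(x):
    # centered log-ratio transform: log of each part over the geometric mean
    x = np.asarray(x, dtype=float)
    g = np.exp(np.mean(np.log(x)))
    return np.log(x / g)

def aitchison_distance(x, y):
    # Euclidean distance in clr space; scale-invariant, so compositions
    # that differ only by closure (proportions vs. percentages) agree
    return float(np.linalg.norm(clr(x) - clr(y)))
```

Because clr is invariant to rescaling a composition by a positive constant, the distance is unaffected by whether the parts sum to 1 or to 100%.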
Abstract: This paper proposes applying the genetic algorithm and the firefly algorithm to enhance the estimation of the direction of arrival (DOA) of electromagnetic signals at a smart antenna array. This estimation is essential for beamforming, where the antenna array's radiation pattern is steered to provide faster and more reliable data transmission with increased coverage. The work uses metaheuristics to improve a maximum likelihood DOA estimator for an antenna array arranged in a uniform cuboidal geometry. The DOA estimation performance of the proposed algorithm was compared with that of MUSIC in different two-dimensional scenarios. The metaheuristic algorithms outperform the well-known MUSIC algorithm.
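The maximum likelihood criterion that such metaheuristics optimize can be sketched for the simpler single-source, uniform linear array case (a hedged illustration with NumPy; the geometry, spacing, and function names are assumptions, not the paper's cuboidal setup):

```python
import numpy as np

def steering(theta, m, d=0.5):
    # ULA steering vector for angle theta (radians), element spacing d wavelengths
    return np.exp(2j * np.pi * d * np.arange(m) * np.sin(theta))

def ml_doa(X, grid):
    # single-source ML estimate: maximize a(theta)^H R a(theta)
    # over candidate angles, where R is the sample covariance of snapshots X
    m = X.shape[0]
    R = X @ X.conj().T / X.shape[1]
    vals = [np.real(steering(t, m).conj() @ R @ steering(t, m)) for t in grid]
    return grid[int(np.argmax(vals))]
```

A metaheuristic such as the firefly or genetic algorithm would replace the exhaustive sweep over `grid` with a guided search over the same criterion.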
Funding: Supported by the Fundamental Research Funds for the Central Universities (2014RC042, 2015JBM109)
Abstract: In this article, we consider a lifetime distribution, the Weibull-Logarithmic distribution introduced by [6]. We investigate some new statistical characterizations and properties, and develop maximum likelihood inference using the EM algorithm. Asymptotic properties of the MLEs are obtained, and extensive simulations are conducted to assess the performance of parameter estimation. A numerical example illustrates the application.
Abstract: Maximum likelihood estimation is a method of estimating the parameters of a statistical model. It has been widely used across many disciplines, such as econometrics, data modelling in nuclear and particle physics, and geographical satellite image classification. Over the past decade, although many conventional numerical approximation approaches have been successfully developed for maximum likelihood parameter estimation, bio-inspired optimization techniques have shown promising performance and gained recognition as an attractive alternative for such problems. This review paper offers a comprehensive perspective on conventional and bio-inspired optimization techniques in maximum likelihood parameter estimation, highlighting the challenges and key issues and encouraging further research.
Abstract: This paper discusses the estimation of parameters in the zero-inflated Poisson (ZIP) model by the method of moments. The method of moments estimators (MMEs) are analytically compared with the maximum likelihood estimators (MLEs). The results of a modest simulation study are presented.
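The ZIP MMEs have a closed form: since E[X] = (1-π)λ and E[X²] = (1-π)λ(1+λ), the ratio of the first two raw sample moments gives λ̂ = m₂/m₁ - 1 and π̂ = 1 - m₁/λ̂. A minimal sketch (NumPy assumed; names are illustrative):

```python
import numpy as np

def zip_mme(x):
    # method-of-moments estimates (lam, pi) for the zero-inflated Poisson:
    # lam = m2/m1 - 1 and pi = 1 - m1/lam, from the first two raw moments
    x = np.asarray(x, dtype=float)
    m1, m2 = x.mean(), np.mean(x ** 2)
    lam = m2 / m1 - 1.0
    pi = 1.0 - m1 / lam
    return lam, pi
```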
Abstract: Maximum likelihood estimation (MLE) is an effective method for localizing radioactive sources in a given area. However, it requires an exhaustive search for parameter estimation, which is time-consuming. In this study, heuristic techniques were employed to search, using a network of sensors, for the radiation source parameters that maximize the likelihood, thereby reducing the time consumption of MLE. First, the radiation source was detected using the k-sigma method. Subsequently, MLE was applied for parameter estimation using the readings and positions of the detectors that had detected the source. A comparative study evaluated the estimation accuracy and time consumption of MLE for traditional methods and heuristic techniques. Traditional MLE was performed via a grid search using fixed and multiple resolutions. Additionally, four commonly used heuristic algorithms were applied: the firefly algorithm (FFA), particle swarm optimization (PSO), ant colony optimization (ACO), and artificial bee colony (ABC). The experiment used real data collected by the Low Scatter Irradiator facility at the Savannah River National Laboratory as part of the Intelligent Radiation Sensing System program. The comparative study showed that the estimation time was 3.27 s using fixed-resolution MLE and 0.59 s using multi-resolution MLE. The time consumption for heuristic-based MLE was 0.75, 0.03, 0.02, and 0.059 s for FFA, PSO, ACO, and ABC, respectively. The location estimation error was approximately 0.4 m using either grid search-based or heuristic-based MLE. Hence, heuristic-based MLE can provide comparable estimation accuracy through a less time-consuming process than traditional MLE.
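The grid-search MLE can be sketched for a simplified model in which detector counts are Poisson with an inverse-square intensity law (a hedged illustration with an assumed measurement model, known source strength, and illustrative names; not the SRNL setup):

```python
import numpy as np

def log_likelihood(pos, strength, detectors, counts):
    # Poisson log-likelihood (constant terms dropped) under mu_i = S / d_i^2
    d2 = np.sum((detectors - pos) ** 2, axis=1) + 1e-9
    mu = strength / d2
    return float(np.sum(counts * np.log(mu) - mu))

def grid_search_mle(detectors, counts, strength, lo, hi, n=101):
    # fixed-resolution grid search over candidate source positions
    xs = np.linspace(lo, hi, n)
    best, best_ll = None, -np.inf
    for x in xs:
        for y in xs:
            ll = log_likelihood(np.array([x, y]), strength, detectors, counts)
            if ll > best_ll:
                best, best_ll = np.array([x, y]), ll
    return best
```

The heuristic variants (FFA, PSO, ACO, ABC) optimize the same log-likelihood but replace the exhaustive double loop with guided sampling, which is where the reported speedups come from.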
Funding: Supported by the National Natural Science Foundation of China
Abstract: Following the principle that failure data is the basis of software reliability analysis, we built a software reliability expert system (SRES) using artificial intelligence technology. By reasoning from the fitting results of the failure data of a software project, the SRES can recommend to users the most suitable software reliability measurement model. We believe the SRES can largely overcome the inconsistency seen in applications of software reliability models. We report investigation results on the singularity and parameter estimation methods of the experimental models in the SRES.
Abstract: The Exponentiated Generalized Weibull distribution is a probability distribution that generalizes the Weibull distribution by introducing two additional shape parameters to better accommodate non-monotonic shapes. The parameters of the new probability distribution are estimated by the maximum likelihood method under progressive type II censored data via the expectation-maximization algorithm.
Abstract: A nonzero intermediate frequency (IF) likelihood acquisition scheme designed for the S-band Single Access (SSA) link of China's Tracking and Data Relay Satellite System (CTDRSS) is introduced. The received signal is downconverted to IF and then directly sampled at IF using a 1-bit A/D converter. After digitization, the sampled data is detected using a hybrid likelihood acquisition scheme. With this structure, the large noise figure of analog mixers or active filters, as well as amplitude and phase imbalance between the low-frequency in-phase and quadrature-phase channels, can be avoided. A simple design algorithm for the acquisition scheme is also derived. The performance and algorithm are verified by computer simulation.
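The robustness of correlation-based acquisition to 1-bit quantization can be illustrated with a toy baseband code-phase search (a simplified sketch: no IF stage, Doppler search, or hybrid scheme; the code and names are illustrative):

```python
import numpy as np

def one_bit(x):
    # 1-bit A/D: keep only the sign of each sample
    return np.where(x >= 0, 1.0, -1.0)

def acquire_code_phase(rx, code):
    # test every circular shift of the local code replica,
    # return the shift with the largest correlation
    corrs = [np.dot(rx, np.roll(code, k)) for k in range(len(code))]
    return int(np.argmax(corrs))
```

Even though each sample carries only its sign, the correlation peak at the true code phase survives, which is why 1-bit IF sampling is viable for acquisition.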
Funding: Supported by the National CNC Special Project, China (No. 2010ZX04001-032) and the Youth Science and Technology Foundation of Gansu Province, China (No. 145RJYA307)
Abstract: To address the difficulty of solving an improved nonhomogeneous Poisson process (NHPP) model in engineering applications, an immune clone maximum likelihood estimation (MLE) method for solving the model parameters was proposed. The minimum negative log-likelihood function was used as the objective function, so that the immune clone algorithm could optimize it directly instead of iteratively solving a complex system of equations, thereby solving the parameter estimation problem of the improved NHPP model. Interval estimates of the reliability indices were obtained using the Fisher information matrix and the delta method. An example of failure-truncated data from multiple numerical control (NC) machine tools was used to validate the method, and the results show that the algorithm has a higher convergence rate and computational accuracy, demonstrating its feasibility.
Funding: Supported by the National Natural Science Foundation of China (11261025, 11561075), the Natural Science Foundation of Yunnan Province (2016FB005), and the Program for Middle-aged Backbone Teachers, Yunnan University
Abstract: Normal mixture regression models are among the most important statistical tools for analyzing data from a heterogeneous population. When the data involve asymmetric outcomes, the skew normal distribution has, over the last two decades, proven beneficial in various theoretical and applied problems. In this paper, we propose and study a novel class of models: a skew-normal mixture of joint location, scale, and skewness models for analyzing heteroscedastic skew-normal data from a heterogeneous population. Maximum likelihood estimation is addressed; in particular, an Expectation-Maximization (EM) algorithm for estimating the model parameters is developed. Properties of the estimators of the regression coefficients are evaluated through Monte Carlo experiments. Results from the analysis of a real Body Mass Index (BMI) data set are presented.
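As a simplified stand-in for the EM algorithm developed in the paper, here is EM for a plain two-component univariate normal mixture (the paper's skew-normal model additionally places regression structure on the location, scale, and skewness; the initialization and names here are illustrative):

```python
import numpy as np

def em_gaussian_mixture(x, n_iter=200):
    # EM for a two-component univariate normal mixture
    x = np.asarray(x, dtype=float)
    mu = np.quantile(x, [0.25, 0.75])        # crude initialization
    sigma = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) \
               / (sigma * np.sqrt(2 * np.pi))
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted updates of the mixture parameters
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sigma
```

Each iteration alternates the E-step (compute responsibilities under the current parameters) and the M-step (maximize the expected complete-data log-likelihood), exactly the scheme the paper extends to the skew-normal setting.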
Abstract: The decoupled coherent Maximum Likelihood (ML) detection algorithm presented in this letter can sharply reduce the complexity of the receiver and provide better error performance, provided that the channel is estimated first. Given the bandwidth inefficiency of Frequency Shift Keying (FSK), acquiring channel state information through training sequences would further decrease transmission efficiency. This letter therefore presents a blind channel estimation algorithm based on noise subspace theory, which acquires channel information without any training symbols. Simulation shows that the algorithm yields smaller channel estimation errors while increasing spectral efficiency.