Abstract: In order to improve the fitting accuracy of college students' test scores, this paper proposes a two-component mixed generalized normal distribution, uses the maximum likelihood method and the Expectation Conditional Maximization (ECM) algorithm to estimate the parameters and conduct numerical simulations, and performs a fitting analysis on Linear Algebra and Advanced Mathematics test scores from F University. The empirical results show that the two-component mixed generalized normal distribution fits college students' test data better than the commonly used two-component mixed normal distribution and has good application value.
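As an illustration only (not the authors' implementation), the sketch below evaluates a two-component generalized normal mixture density with scipy.stats.gennorm and fits its seven parameters (mixing weight, two locations, two scales, two shapes) by direct numerical maximum likelihood with scipy.optimize.minimize rather than by the ECM algorithm used in the paper; the simulated scores, starting values, and bounds are hypothetical.

```python
# Minimal sketch: two-component mixture of generalized normal distributions,
# fitted by direct maximum likelihood (not the paper's ECM algorithm).
import numpy as np
from scipy.stats import gennorm
from scipy.optimize import minimize

def mixture_pdf(x, w, mu1, a1, b1, mu2, a2, b2):
    """Density of a two-component generalized normal mixture."""
    return (w * gennorm.pdf(x, b1, loc=mu1, scale=a1)
            + (1.0 - w) * gennorm.pdf(x, b2, loc=mu2, scale=a2))

def neg_log_likelihood(theta, x):
    w, mu1, a1, b1, mu2, a2, b2 = theta
    pdf = mixture_pdf(x, w, mu1, a1, b1, mu2, a2, b2)
    return -np.sum(np.log(pdf + 1e-300))   # guard against log(0)

# Hypothetical test-score data on a 0-100 scale (illustration only).
rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(55, 12, 300), rng.normal(82, 6, 200)])

theta0 = [0.5, 50.0, 10.0, 2.0, 80.0, 8.0, 2.0]          # initial guess
bounds = [(0.01, 0.99), (0, 100), (1, 50), (0.5, 10),
          (0, 100), (1, 50), (0.5, 10)]
res = minimize(neg_log_likelihood, theta0, args=(scores,),
               bounds=bounds, method="L-BFGS-B")
print(res.x)   # estimated (w, mu1, alpha1, beta1, mu2, alpha2, beta2)
```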
Funding: This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. NRF-2021R1A2C1010362) and the Soonchunhyang University Research Fund.
Abstract: This paper proposes a new pre-processing technique to separate the most effective features from those that might deteriorate the performance of machine learning classifiers, in terms of computational cost and classification accuracy, because of their irrelevance, redundancy, or low information content; this pre-processing step is known as feature selection. The technique adopts a recent optimization algorithm, generalized normal distribution optimization (GNDO), made binary by an arctangent transfer function that converts the continuous values into binary ones. Further, a novel restarting strategy (RS) is proposed to preserve diversity among the solutions in the population by identifying solutions that exceed a specific distance from the best-so-far solution and replacing them with new ones created by an effective updating scheme. This strategy is integrated with GNDO to form another binary variant, improved GNDO (IGNDO), which has a high ability to preserve solution diversity, avoiding stagnation in local minima and accelerating convergence. The proposed GNDO and IGNDO algorithms are extensively compared with seven state-of-the-art algorithms on thirteen medical instances taken from the UCI repository. IGNDO is shown to be superior in terms of fitness value and classification accuracy and competitive with the others in the number of selected features. Since the principal goal in the feature selection (FS) problem is to find the subset of features that maximizes classification accuracy, IGNDO is considered the best.
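The abstract does not spell out the binarization step, so the following is only a minimal sketch, assuming the common V-shaped arctangent transfer T(x) = |(2/pi) * arctan((pi/2) * x)|; it maps a continuous position vector to a 0/1 feature mask and does not reproduce the GNDO/IGNDO position-update or restarting rules.

```python
# Minimal sketch: binarizing a continuous position vector with an arctangent
# (V-shaped) transfer function, as commonly done in wrapper feature selection.
# The specific GNDO/IGNDO update and restarting rules are not reproduced here.
import numpy as np

def arctan_transfer(x):
    """Map continuous values into (0, 1) with a V-shaped arctangent function."""
    return np.abs((2.0 / np.pi) * np.arctan((np.pi / 2.0) * x))

def binarize(position, rng):
    """Turn a continuous position into a 0/1 feature-selection mask."""
    probs = arctan_transfer(position)
    mask = (rng.random(position.shape) < probs).astype(int)
    if mask.sum() == 0:                  # keep at least one feature selected
        mask[rng.integers(position.size)] = 1
    return mask

rng = np.random.default_rng(42)
position = rng.normal(size=13)           # e.g., 13 candidate features
print(binarize(position, rng))           # hypothetical binary feature mask
```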
Abstract: GMM inference procedures based on the square of the modulus of the model characteristic function are developed, with sample moments selected using estimating function theory, bypassing the empirical characteristic function used by other GMM procedures in the literature. The procedures are relatively simple to implement and less simulation-oriented than simulated methods of inference, yet have the potential for good efficiency for models whose densities lack a closed form. They also yield better estimators than method-of-moments estimators for models with more than three parameters, since higher-order sample moments tend to be unstable.
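One standard identity that makes such sample moments available without the empirical characteristic function is sketched below in LaTeX; how the paper actually selects the evaluation points t_1, ..., t_K via estimating function theory is not reproduced here, and the moment conditions shown are only an assumed illustrative form.

```latex
% For i.i.d. $X_1, X_2$ with model characteristic function $\varphi(t;\theta)$,
\[
  |\varphi(t;\theta)|^{2}
  = \varphi(t;\theta)\,\overline{\varphi(t;\theta)}
  = \mathbb{E}\!\left[e^{\,\mathrm{i}t(X_{1}-X_{2})}\right]
  = \mathbb{E}\!\left[\cos\!\big(t(X_{1}-X_{2})\big)\right],
\]
% so an unbiased sample counterpart is the U-statistic over distinct pairs
\[
  \widehat{m}(t) \;=\; \frac{2}{n(n-1)}\sum_{i<j}\cos\!\big(t(X_{i}-X_{j})\big),
\]
% and moment conditions of the assumed form
% $\widehat{m}(t_k) - |\varphi(t_k;\theta)|^{2}$, $k = 1,\dots,K$,
% can be stacked at selected points for GMM estimation of $\theta$.
```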