There are four serious problems in discriminant analysis. We developed an optimal linear discriminant function (optimal LDF) based on the minimum number of misclassifications (minimum NM, or MNM) using integer programming (IP), and we call this LDF Revised IP-OLDF. Only this LDF can discriminate the cases on the discriminant hyperplane (Problem 1). This LDF and a hard-margin SVM (H-SVM) can discriminate linearly separable data (LSD) exactly; other LDFs may not discriminate the LSD theoretically (Problem 2). When Revised IP-OLDF discriminates the Swiss banknote data with six variables, we find that the MNM of a two-variable model such as (X4, X6) is zero. Because MNMk decreases monotonically (MNMk ≥ MNM(k+1)), the sixteen models including (X4, X6) have MNM = 0. Until now, because there has been no research on LSD, we surveyed three other linearly separable data sets: 18 exam score data sets, the Japanese 44 cars data, and six microarray data sets. When we discriminate the exam scores with MNM = 0, we find that the generalized inverse matrix technique causes the serious Problem 3, and we confirmed this fact with the cars data. Finally, we claim that discriminant analysis is not inferential statistics, because there are no standard errors (SEs) of the error rates and discriminant coefficients (Problem 4). Therefore, we proposed the "100-fold cross-validation for small samples" method (the Method). With this breakthrough, we can choose the best model, namely the one with the minimum mean error rate (M2) in the validation samples, and obtain two 95% confidence intervals (CIs): for the error rate and for the discriminant coefficients. When we discriminate the exam scores by this new method, we obtain the surprising result that the seven LDFs other than Fisher's LDF are almost the same as the trivial LDFs.
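As a minimal illustration of checking whether a two-variable model is linearly separable (MNM = 0), a perceptron can be trained on the candidate pair of variables: on linearly separable data it is guaranteed to reach zero misclassifications. The synthetic data below is a stand-in for a pair such as (X4, X6) of the Swiss banknote data, and this is not the authors' IP formulation of Revised IP-OLDF.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a linearly separable two-variable model
# (the study itself uses pairs such as (X4, X6) of the Swiss banknote data).
X = np.vstack([rng.normal([2.0, 2.0], 0.5, size=(20, 2)),
               rng.normal([-2.0, -2.0], 0.5, size=(20, 2))])
y = np.array([1] * 20 + [-1] * 20)

def perceptron_nm(X, y, epochs=100):
    """Train a perceptron; on linearly separable data NM reaches 0."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        updated = False
        for xi, yi in zip(Xb, y):
            if yi * (xi @ w) <= 0:       # misclassified or on the hyperplane
                w += yi * xi
                updated = True
        if not updated:
            break                        # a separating hyperplane was found
    nm = int(np.sum(y * (Xb @ w) <= 0))  # number of misclassifications (NM)
    return w, nm

w, nm = perceptron_nm(X, y)
print("NM =", nm)  # 0: the two classes are linearly separable
```

Note that the perceptron only certifies MNM = 0; when the data are not separable, finding the exact MNM requires an IP formulation such as the one used here.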
In this research, we discriminate the Japanese 44 cars data because it allows us to discuss all four problems. There are six independent variables to discriminate 29 regular cars from 15 small cars. The data are linearly separable by the emission rate (X1) and the number of seats (X3). We examine the validity of the new model selection procedure for discriminant analysis: we proposed that the model with the minimum mean error rate (M2) in the validation samples is the best model. We had already examined this procedure with the exam scores and obtained good results. Moreover, the 95% CIs of the eight LDFs offer us a real perception of discriminant theory. However, the exam scores differ from ordinary data. Therefore, we apply our theory and procedure to the Japanese 44 cars data and confirm the same conclusion.
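The model-selection criterion can be sketched numerically: evaluate an LDF on many resampled validation samples, take the mean error rate as M2, and take the 2.5th/97.5th percentiles as a 95% CI. The error rates below are simulated under assumed values (44 cases, a 10% true misclassification probability), not taken from the cars data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 100 resampled validation samples of n = 44 cases,
# with an assumed true misclassification probability of 10%.
n_cases, p_err = 44, 0.10
err_rates = rng.binomial(n_cases, p_err, size=100) / n_cases

m2 = err_rates.mean()                        # mean error rate (M2)
ci = np.percentile(err_rates, [2.5, 97.5])   # 95% CI of the error rate

print(f"M2 = {m2:.3f}, 95% CI = [{ci[0]:.3f}, {ci[1]:.3f}]")
```

Applied per candidate model, the model with the smallest M2 is selected; the same percentile computation over the resampled discriminant coefficients yields their CIs.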
In this paper, image space analysis (for short, ISA) is employed to investigate variational inequalities (for short, VI) with cone constraints. Linear separation for VI with cone constraints is characterized by using the normal cone to a regularization of the image and the saddle points of the generalized Lagrangian function. Lagrangian-type necessary and sufficient optimality conditions for VI with cone constraints are presented by using a separation theorem. Gap functions and weak sharpness for VI with cone constraints are also investigated. Finally, the obtained results are applied to the standard and time-dependent traffic equilibria introduced by Daniele, Maugeri and Oettli.
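For reference, the problem class can be written out in a standard form. Given a map F, a constraint map g, a set Ω, and a closed convex cone C, the notation below is the usual formulation of a VI with cone constraints and may differ from the paper's exact symbols.

```latex
\text{find } \bar{x} \in K \text{ such that }
\langle F(\bar{x}),\, y - \bar{x} \rangle \ge 0 \quad \forall\, y \in K,
\qquad
K := \{\, x \in \Omega : g(x) \in C \,\}.
```

ISA studies such a problem through its image: the point $\bar{x}$ solves the VI exactly when a suitable separation holds between the image of the constraint/objective maps at $\bar{x}$ and a fixed convex set, which is where the linear separation and saddle-point characterizations enter.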
In recent years, the alternating direction method of multipliers (ADMM) and its variants have become popular through their extensive use in image processing and statistical learning. One variant, symmetric ADMM, which updates the Lagrange multiplier twice in one iteration, is always faster whenever it converges. In this paper, combined with Nesterov's accelerating strategy, an accelerated symmetric ADMM is proposed. We prove its O(1/k^2) convergence rate under a strong convexity condition. For the general situation, an accelerated method with a restart rule is proposed. Some preliminary numerical experiments show the efficiency of our algorithms.
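A minimal sketch of the symmetric ADMM update order (the dual variable is updated twice per iteration, once after each primal subproblem) on a toy consensus problem. The penalty, the two multiplier step sizes, and the problem itself are illustrative assumptions; the paper's Nesterov acceleration and restart rule are omitted.

```python
import numpy as np

# Toy consensus problem with closed-form subproblems:
#   min 0.5*||x - a||^2 + 0.5*||z - b||^2   s.t.  x = z
# whose solution is x = z = (a + b) / 2.
a, b = np.array([1.0, 3.0]), np.array([5.0, -1.0])
rho, s, t = 1.0, 0.9, 0.9      # penalty and the two multiplier step sizes
x = z = u = np.zeros(2)        # u is the scaled Lagrange multiplier

for _ in range(200):
    x = (a + rho * (z - u)) / (1 + rho)   # x-subproblem (closed form)
    u = u + s * (x - z)                   # first multiplier update
    z = (b + rho * (x + u)) / (1 + rho)   # z-subproblem (closed form)
    u = u + t * (x - z)                   # second multiplier update

print(x, z)  # both approach (a + b) / 2 = [3., 1.]
```

Classic ADMM performs only the second multiplier update; the extra intermediate update is what makes the method "symmetric" and is the ingredient the accelerated variant builds on.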
Funding: supported by the National Natural Science Foundation of China (Grant Nos. 60804065, 70831005), the Key Project of the Chinese Ministry of Education (Grant No. 211163), the Sichuan Youth Science and Technology Foundation, and the Research Foundation of China West Normal University (Grant No. 08B075).
Funding: this research is partly supported by the National Natural Science Foundation of China (Grant No. 11671217) and the Natural Science Foundation of Xinjiang (Grant No. 2017D01A14).