Funding: Project supported by the National Basic Research Program (973) of China (No. 2002CB312200) and the Center for Bioinformatics Program Grant of the Harvard Center of Neurodegeneration and Repair, Harvard Medical School, Harvard University, Boston, USA
Abstract: In microarray-based cancer classification, gene selection is an important issue owing to the large number of variables, the small number of samples, and the non-linearity of the problem. It is difficult to obtain satisfactory results with conventional linear statistical methods. Recursive feature elimination based on the support vector machine (SVM-RFE) is an effective algorithm that integrates gene selection and cancer classification into a consistent framework. In this paper, we propose a new method to select the parameters of this algorithm, implemented with Gaussian-kernel SVMs, as a better alternative to the common practice of picking the apparently best parameters: a genetic algorithm searches for the pair of optimal parameters. Fast implementation issues for this method are also discussed for pragmatic reasons. The proposed method was tested on two representative datasets, hereditary breast cancer and acute leukaemia. The experimental results indicate that the proposed method performs well in selecting genes and achieves high classification accuracy with these genes.
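The recursive elimination loop described above can be sketched with scikit-learn. This is a minimal illustration on synthetic "gene" data, not the paper's method: sklearn's RFE needs per-feature weights, so a linear-kernel SVC stands in for the Gaussian-kernel variant, and the dataset, parameters, and elimination step are all assumptions.

```python
# SVM-RFE sketch: repeatedly train an SVM, rank features by weight,
# and drop the lowest-ranked half until a target subset size remains.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

# Toy microarray-like data: many variables, few samples (hypothetical).
X, y = make_classification(n_samples=60, n_features=200,
                           n_informative=10, random_state=0)

rfe = RFE(estimator=SVC(kernel="linear", C=1.0),
          n_features_to_select=10,
          step=0.5)  # eliminate half the remaining features each round
rfe.fit(X, y)

selected = np.where(rfe.support_)[0]  # indices of the surviving "genes"
print(len(selected))  # 10
```

A Gaussian-kernel SVM-RFE would instead rank features by a kernel-specific criterion (e.g. the change in the margin when a feature is removed), since no per-feature weight vector exists in the nonlinear case.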
Abstract: Motivated by the facts that automatic parameter selection is essential to make the Support Vector Machine (SVM) practically useful, and that the commonly used Leave-One-Out (LOO) method is computationally complex and time-consuming, this paper proposes an effective strategy for automatic SVM parameter selection based on Particle Swarm Optimization (PSO). Simulation results on a practical data model demonstrate the effectiveness and high efficiency of the proposed approach.
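A minimal version of this idea can be sketched as follows: particles fly through the (log C, log gamma) plane, and k-fold cross-validation accuracy (a cheaper proxy than LOO) serves as the fitness. The swarm size, inertia, bounds, and dataset are all illustrative assumptions, not the paper's settings.

```python
# PSO over SVM hyperparameters: each particle is a (log10 C, log10 gamma)
# pair scored by 3-fold cross-validation accuracy.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=120, n_features=8, random_state=0)

def fitness(p):
    C, gamma = 10.0 ** p
    return cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=3).mean()

n, dims, iters = 10, 2, 15
pos = rng.uniform(-3, 3, (n, dims))        # exponents searched in [-3, 3]
vel = np.zeros((n, dims))
pbest = pos.copy()
pbest_f = np.array([fitness(p) for p in pos])
g = pbest[pbest_f.argmax()].copy()         # global best position

for _ in range(iters):
    r1, r2 = rng.random((2, n, dims))
    # standard velocity update: inertia + cognitive + social terms
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (g - pos)
    pos = np.clip(pos + vel, -3, 3)
    f = np.array([fitness(p) for p in pos])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    g = pbest[pbest_f.argmax()].copy()

print("best (C, gamma):", 10.0 ** g, "cv accuracy:", pbest_f.max())
```

Each generation costs one cross-validation per particle, versus n model fits per candidate under LOO, which is the efficiency argument the abstract makes.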
Funding: the National Natural Science Foundation of China (60775047, 60402024)
Abstract: The support vector machine (SVM) is a novel machine learning method with the ability to approximate nonlinear functions with arbitrary accuracy. Setting the parameters well is crucial for SVM learning results and generalization ability, yet there is currently no systematic, general method for parameter selection. In this article, SVM parameter selection for function approximation is treated as a compound optimization problem, and a mutative-scale chaos optimization algorithm is employed to search for optimal parameter values. Chaos optimization is an effective approach to global optimization, and the mutative-scale variant improves search efficiency and accuracy. Several simulation examples show the sensitivity of the SVM parameters and demonstrate the superiority of the proposed method for nonlinear function approximation.
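The mechanism can be sketched like this: a logistic map generates chaotic candidate points inside a search window, and the window is repeatedly shrunk around the best point found (the "mutative scale"). The target function, bounds, shrink factor, and iteration counts below are illustrative assumptions, not the article's experiments.

```python
# Mutative-scale chaos search for SVR parameters (C, gamma) on a toy
# nonlinear function-approximation task.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
y = np.sinc(X).ravel()                     # nonlinear target function

def score(C, gamma):
    return cross_val_score(SVR(C=C, gamma=gamma), X, y, cv=3,
                           scoring="r2").mean()

lo = np.array([-2.0, -2.0])                # log10 search bounds
hi = np.array([2.0, 2.0])
z = np.array([0.123, 0.456])               # chaos variables in (0, 1)
best_p, best_f = None, -np.inf

for _ in range(4):                         # mutative-scale rounds
    for _ in range(20):
        z = 4.0 * z * (1.0 - z)            # logistic map, chaotic at mu = 4
        p = lo + z * (hi - lo)             # map chaos variable into window
        f = score(*10.0 ** p)
        if f > best_f:
            best_p, best_f = p, f
    span = (hi - lo) * 0.25                # shrink window around the best point
    lo, hi = best_p - span / 2, best_p + span / 2

print("best (C, gamma):", 10.0 ** best_p, "cv R^2:", round(best_f, 3))
```

The chaotic sequence is dense and non-repeating, so early rounds cover the window broadly, while the shrinking window refines the solution, which is the efficiency/accuracy trade-off the abstract attributes to the mutative-scale variant.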
Funding: supported by the National Basic Research Program (973) of China (No. 2009CB724006) and the National Natural Science Foundation of China (No. 60977010)
Abstract: Selecting the optimal parameters for the support vector machine (SVM) has long been a hot research topic. For support vector classification/regression (SVC/SVR) with the radial basis function (RBF) kernel, we summarize a rough guideline for the penalty parameter and kernel width, and propose a novel linear search method to obtain these two optimal parameters. A direct-setting method with thresholds is used to set the epsilon parameter of SVR. The proposed method directly locates the right search region, which greatly reduces computing time and achieves stable, high accuracy. The method is competitive for both SVC and SVR, easy to use, and applicable to a new data set without any adjustment, since it requires no parameters to be set.
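For contrast with a guided linear search, the baseline it improves on looks like this: a plain scan over powers of ten for the penalty parameter C and kernel width gamma, scored by cross-validation. This is a generic exhaustive sketch on an assumed dataset, not the paper's search rule.

```python
# Baseline exhaustive scan over (C, gamma) exponents for an RBF SVC.
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Score every (10^c, 10^g) pair on a coarse logarithmic lattice.
scores = {(c, g): cross_val_score(SVC(C=10.0**c, gamma=10.0**g),
                                  X, y, cv=5).mean()
          for c in range(-2, 4) for g in range(-4, 2)}

(best_c, best_g), best_acc = max(scores.items(), key=lambda kv: kv[1])
print(f"C=1e{best_c}, gamma=1e{best_g}, cv accuracy={best_acc:.3f}")
```

This scan costs one cross-validation per lattice point; a linear search that first locates the right region, as the abstract proposes, avoids evaluating most of these points.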
Abstract: The Support Vector Machine (SVM) has become one of the most widely used traditional machine learning algorithms for prediction and classification tasks. However, its behavior depends strongly on a few parameters, making parameter tuning a sensitive step in maintaining good performance. As with any other classifier, the performance of SVM is also affected by the input feature set used to build the learning model, which makes the selection of relevant features important both to preserve good classification accuracy and to reduce the dimensionality of datasets. In this paper, the MRFO+SVM algorithm is introduced: the recent manta ray foraging optimizer is used to fine-tune the SVM parameters and identify the optimal feature subset simultaneously. The proposed approach is validated and compared with four SVM-based algorithms on eight benchmark datasets, and is additionally applied to a COVID-19 disease dataset. The experimental results show the strong ability of the proposed algorithm to find appropriate SVM parameters, and its acceptable performance on the feature selection problem.
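The key idea above is the joint encoding: each candidate solution carries both the SVM parameters and a binary feature mask, scored together by cross-validation. The sketch below uses simple random search as a stand-in for the manta ray foraging optimizer (MRFO is not implemented here), and the dataset, bounds, and budget are all assumptions.

```python
# Joint SVM parameter tuning + feature selection: each candidate is a
# (C, gamma, feature-mask) triple; random search stands in for MRFO.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=150, n_features=20,
                           n_informative=5, random_state=1)

best_f, best = -1.0, None
for _ in range(40):
    mask = rng.random(X.shape[1]) < 0.5        # candidate feature subset
    if not mask.any():
        continue                               # skip the empty subset
    C, gamma = 10.0 ** rng.uniform(-2, 2, 2)   # candidate SVM parameters
    f = cross_val_score(SVC(C=C, gamma=gamma), X[:, mask], y, cv=3).mean()
    if f > best_f:
        best_f, best = f, (C, gamma, mask)

C, gamma, mask = best
print(f"cv accuracy={best_f:.3f} with {int(mask.sum())} of {X.shape[1]} features")
```

An actual MRFO would update the candidate population with its chain-, cyclone-, and somersault-foraging moves instead of independent random draws, but the fitness function and solution encoding stay the same.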