Analyzing big data, especially medical data, helps provide good health care to patients and reduce the risk of death. The COVID-19 pandemic has had a significant impact on public health worldwide, emphasizing the need for effective risk prediction models. Machine learning (ML) techniques have shown promise in analyzing complex data patterns and predicting disease outcomes. The accuracy of these techniques is strongly affected by their parameter settings, so hyperparameter optimization plays a crucial role in improving model performance. In this work, the Particle Swarm Optimization (PSO) algorithm was used to search the hyperparameter space efficiently and improve the predictive power of machine learning models by identifying the hyperparameters that yield the highest accuracy. A dataset with a variety of clinical and epidemiological characteristics linked to COVID-19 cases was used in this study. Various machine learning models, including Random Forests, Decision Trees, Support Vector Machines, and Neural Networks, were employed to capture the complex relationships present in the data, and accuracy was used to evaluate their predictive performance. The experimental findings showed that the suggested method of estimating COVID-19 risk is effective: compared with the baseline models, the optimized machine learning models performed better.
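Several of the abstracts in this collection rely on the canonical PSO update rule for hyperparameter search. As an illustration only, a minimal PSO loop over a two-dimensional box-constrained hyperparameter space might look like the sketch below; the objective function, bounds, and parameter names are hypothetical stand-ins for a model's validation error, not details taken from the paper:

```python
import numpy as np

def pso_minimize(objective, bounds, n_particles=20, n_iters=100,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer over a box-constrained space."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # positions
    v = np.zeros_like(x)                               # velocities
    pbest = x.copy()                                   # personal bests
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()           # global best
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return gbest, pbest_f.min()

# Hypothetical stand-in for a model's validation-error surface,
# with its minimum at learning_rate = 0.1, regularization = 1.0.
def val_error(h):
    return (h[0] - 0.1) ** 2 + (h[1] - 1.0) ** 2

bounds = np.array([[0.001, 1.0], [0.0, 10.0]])
best, err = pso_minimize(val_error, bounds)
```

In a real pipeline, `val_error` would train and score a model per candidate, which is where almost all of the runtime goes.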
Magnetite nanoparticles show promising applications in drug delivery, catalysis, and spintronics. The surface of magnetite plays an important role in these applications; therefore, it is critical to understand the surface structure of Fe3O4 at the atomic scale. Here, using a combination of first-principles calculations, the particle swarm optimization (PSO) method, and machine learning, we investigate the possible reconstructions and stability of the Fe3O4(001) surface. The results show that, besides the subsurface cation vacancy (SCV) reconstruction, an A layer with an Fe vacancy (A-layer-V_Fe) reconstruction of the (001) surface also shows very low surface energy, especially under oxygen-poor conditions. Molecular dynamics simulations based on an iron-oxygen interaction potential fitted by machine learning further confirm the thermodynamic stability of the A-layer-V_Fe reconstruction. Our results are also instructive for the study of surface reconstruction in other metal oxides.
WiFi fingerprinting localization has been a hot topic in indoor positioning because of its universality and its location-related features. The basic assumption of fingerprinting localization is that the received signal strength indication (RSSI) distance accords with the physical distance between locations. Therefore, efficiently matching the current RSSI of the user against the RSSI in the fingerprint database is the key to achieving high-accuracy localization. In this paper, a particle swarm optimization-extreme learning machine (PSO-ELM) algorithm is proposed on the basis of the original fingerprinting localization. Firstly, we collect the RSSI of the experimental area to construct the fingerprint database, and the ELM algorithm is applied in the online stage to determine the relation between the location of the terminal and the RSSI it receives. Secondly, the PSO algorithm is used to optimize the biases and weights of the ELM neural network and obtain a globally optimal result. Finally, extensive simulation results are presented, showing that the proposed algorithm can effectively reduce the mean localization error and improve positioning accuracy compared with the K-Nearest Neighbor (KNN), K-means, and back-propagation (BP) algorithms.
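As background for the PSO-ELM approach, a minimal ELM regressor can be sketched as follows: the hidden-layer weights and biases are drawn at random (these are exactly the quantities PSO would tune instead of leaving random), and the output weights are solved in closed form by least squares. The toy data and layer size below are assumptions for illustration, not the paper's configuration:

```python
import numpy as np

class ELM:
    """Single-hidden-layer extreme learning machine (regression)."""
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        d = X.shape[1]
        # Random input weights and biases -- in PSO-ELM, the swarm would
        # search over these instead of sampling them once at random.
        self.W = self.rng.normal(size=(d, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                    # hidden activations
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Illustrative toy data: a noisy 1-D sine wave.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.normal(size=200)
model = ELM(n_hidden=50).fit(X, y)
mse = np.mean((model.predict(X) - y) ** 2)
```

The closed-form output-weight solve is what makes ELM training fast; only the hidden layer is left to the optimizer.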
Power transformers are among the most crucial devices in the power grid, so it is important to detect their incipient faults quickly and accurately. Input features play a critical role in fault diagnosis accuracy. To further improve the fault diagnosis performance of power transformers, a random forest feature selection method coupled with an optimized kernel extreme learning machine is presented in this study. Firstly, the random forest feature selection approach is adopted to rank 42 related input features derived from gas concentrations, gas ratios, and energy-weighted dissolved gas analysis. Afterwards, a kernel extreme learning machine tuned by the Aquila optimization algorithm is implemented to adjust crucial parameters and select the optimal feature subsets. Diagnosis accuracy is used to assess the fault diagnosis capability of each candidate feature subset. Finally, the optimal feature subsets are applied to establish the fault diagnosis model. Experimental results on two public datasets, with comparison against five conventional approaches, show that the average accuracy of the proposed method reaches 94.5%, superior to that of the conventional approaches. These results verify that the optimal feature subset obtained by the presented method can dramatically improve power transformer fault diagnosis accuracy.
The Extreme Learning Machine (ELM) is popular in batch, sequential, and progressive learning due to its speed, easy integration, and generalization ability. However, the traditional ELM cannot train on massive data rapidly and efficiently because of its memory residence and high time and space complexity. In ELM, the hidden layer typically requires a huge number of nodes, and there is no guarantee that the randomly assigned weights and biases of the hidden layer are optimal. To address this problem, the traditional ELM has been hybridized with swarm intelligence optimization techniques. This paper presents five hybrid algorithms: Salp Swarm Algorithm (SSA-ELM), Grasshopper Optimization Algorithm (GOA-ELM), Grey Wolf Optimizer (GWO-ELM), Whale Optimization Algorithm (WOA-ELM), and Moth Flame Optimization (MFO-ELM). These five optimizers are hybridized with the standard ELM methodology to solve tumor type classification from gene expression data. The proposed models were also applied to the prediction of electricity load data describing the energy use of a single residence over a four-year period. In the hidden layer, the swarm algorithms are used to pick a smaller number of nodes to speed up the execution of the ELM, and they compute the best weights and biases for the hidden layer. Experimental results demonstrated that the proposed MFO-ELM achieved 98.13% accuracy, the highest among the models for tumor type classification on gene expression data, while in prediction the proposed GOA-ELM achieved an RMSE of 0.397, the lowest among the compared models.
The corrosion rate is a crucial factor that impacts the longevity of materials in different applications. After friction stir processing (FSP), the refined grain structure leads to a notable decrease in corrosion rate; however, the correlation between the FSP process parameters and the corrosion rate is still poorly understood. The current study used machine learning to establish the relationship between the corrosion rate and the FSP process parameters (rotational speed, traverse speed, and shoulder diameter) for the WE43 alloy. A Taguchi L27 design of experiments was used for the experimental analysis. In addition, synthetic data were generated using particle swarm optimization for virtual sample generation (VSG), which increased the prediction accuracy of the machine learning models. A sensitivity analysis was performed using Shapley Additive Explanations to determine the key factors affecting the corrosion rate; the shoulder diameter had a significant impact compared with the traverse speed. A graphical user interface (GUI) has been created to predict the corrosion rate from the identified factors. This study focuses on the WE43 alloy, but its findings can also be used to predict the corrosion rate of other magnesium alloys.
Image classification is a core field in image processing and computer vision, within which vehicle classification is a critical domain. The purpose of vehicle categorization is to formulate a compact system to assist in real-world problems and applications such as security, traffic analysis, and self-driving and autonomous vehicles. The recent revolution in machine learning and artificial intelligence has provided immense support for image processing problems and has overtaken conventional, handcrafted means of solving image analysis tasks. In this paper, a combination of the pre-trained CNN GoogleNet and a nature-inspired optimization scheme, particle swarm optimization (PSO), was employed for autonomous vehicle classification. The model was trained on a suitably augmented vehicle image dataset obtained from Kaggle. Features from the trained model were fed to several classifiers; the Cubic SVM (CSVM) classifier was found to outperform the others in both time consumption and accuracy (94.8%). The results of empirical evaluations and statistical tests reveal that the model outperforms related models not only in accuracy (94.8%) but also in training time (82.7 s) and prediction speed (380 obs/sec).
Aero-engine direct thrust control can not only improve thrust control precision but also save operating cost by reducing the margin reserved in design and making full use of the engine's potential performance. However, estimating engine thrust accurately is a big challenge. To tackle this problem, this paper proposes an ensemble of improved wavelet extreme learning machines (EW-ELM) for aircraft engine thrust estimation. The extreme learning machine (ELM) has proved to be an emerging learning technique with high efficiency. Since combining ELM with wavelet theory brings the excellent properties of both, wavelet activation functions are used in the hidden nodes to enhance the ability to handle non-linearity. Besides, as the original ELM may suffer ill-conditioning and robustness problems due to the random determination of the hidden-node parameters, the particle swarm optimization (PSO) algorithm is adopted to select the input weights and hidden biases. Furthermore, an ensemble of the improved wavelet ELMs is used to construct the relationship between the sensor measurements and thrust. The simulation results verify the effectiveness and efficiency of the developed method and show that aero-engine thrust estimation using EW-ELM can satisfy the requirements of direct thrust control in terms of estimation accuracy and computation time.
The extreme learning machine (ELM) allows for fast learning and better generalization performance than conventional gradient-based learning. However, the possible inclusion of non-optimal weights and biases due to random selection, together with the need for many hidden neurons, adversely affects network usability. Further, choosing the optimal number of hidden nodes usually requires intensive human intervention, which may lead to an ill-conditioned situation. In this context, chemical reaction optimization (CRO) is a meta-heuristic paradigm with increasing success in a large number of application areas; it is characterized by faster convergence and requires fewer tunable parameters. This study develops a learning framework combining the advantages of ELM and CRO, called extreme learning with chemical reaction optimization (ELCRO). ELCRO simultaneously optimizes the weight and bias vectors and the number of hidden neurons of a single-layer feed-forward neural network without compromising prediction accuracy. We evaluate its performance by predicting the daily volatility and closing prices of BSE indices, and compare it with three similarly developed models (ELM based on particle swarm optimization, on a genetic algorithm, and on gradient descent), finding the performance of the proposed algorithm superior. Wilcoxon signed-rank and Diebold-Mariano tests are then conducted to verify the statistical significance of the results. Hence, this model can be used as a promising tool for financial forecasting.
An improved particle swarm optimization (PSO) algorithm based on an ensemble technique is presented. The algorithm combines some of the particles' previous best positions (pbest) to obtain an ensemble position (Epbest), which replaces the global best position (gbest) in the velocity update. It is compared with the standard PSO algorithm introduced by Kennedy and Eberhart and with several improved PSO algorithms on three different benchmark functions. The simulation results show that the ensemble-based PSO obtains better solutions than the standard PSO and the other improved algorithms in all test cases.
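The abstract does not specify how the pbest positions are combined into Epbest; the sketch below assumes a simple mean of the personal bests as the ensemble rule, purely for illustration, and exercises the variant on a sphere benchmark function:

```python
import numpy as np

def ensemble_pso(objective, lo, hi, dim, n_particles=30, n_iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """PSO variant where an ensemble of pbest positions (here their mean,
    an assumed combination rule) replaces gbest in the velocity update."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    for _ in range(n_iters):
        epbest = pbest.mean(axis=0)        # ensemble position Epbest
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (epbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
    return pbest[np.argmin(pbest_f)], pbest_f.min()

# Sphere function: a standard PSO benchmark with its optimum at the origin.
best, fbest = ensemble_pso(lambda p: np.sum(p ** 2), -5.0, 5.0, dim=5)
```

Swapping `epbest` back to the best single pbest recovers the standard gbest-form PSO, which makes the two easy to compare side by side.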
Deep deterministic policy gradient (DDPG) has been proved effective in optimizing particle swarm optimization (PSO), but whether DDPG can optimize multi-objective discrete particle swarm optimization (MODPSO) remains to be determined. The present work probes into this topic. Experiments showed that DDPG can not only quickly improve the convergence speed of MODPSO but also overcome the local-optimum problem that MODPSO may suffer. The findings are of great significance for the theoretical research and application of MODPSO.
During construction, shield tunnel linings often move upward, locally or as a whole, after leaving the shield tail in soft soil areas or in some large-diameter shield projects. Differential floating increases the initial stress on the segments and bolts, which is harmful to the service performance of the tunnel. In this study we used a random forest (RF) algorithm combined with particle swarm optimization (PSO) and 5-fold cross-validation (5-fold CV) to predict the maximum upward displacement of tunnel linings induced by shield tunnel excavation. The mechanisms and factors causing upward movement of the tunnel lining are comprehensively summarized, and twelve input variables were selected according to the analysis of influencing factors. The prediction performance of two models, PSO-RF and RF with default parameters, was compared, and the Gini value was used to represent the relative importance of the influencing factors for the upward displacement of the linings. The PSO-RF model successfully predicted the maximum upward displacement of the tunnel linings with low error (mean absolute error (MAE) = 4.04 mm, root mean square error (RMSE) = 5.67 mm) and high correlation (R^2 = 0.915). The thrust and depth of the tunnel were the most important factors in the prediction model influencing the upward displacement of the tunnel linings.
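The 5-fold cross-validation step used to score candidate hyperparameters can be sketched as a plain index-splitting routine (an illustrative implementation, not the authors' code):

```python
import numpy as np

def k_fold_indices(n_samples, k=5, seed=0):
    """Yield (train_idx, val_idx) pairs for shuffled k-fold cross-validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, k)
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, val

# Each sample appears in exactly one validation fold across the 5 splits.
splits = list(k_fold_indices(100, k=5))
```

In a PSO-RF loop, each candidate hyperparameter vector would be scored by averaging the model's error over these five train/validation splits before the swarm update.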
Fault diagnosis technology plays an important role in industry, because a sudden machine fault can bring heavy losses to both people and companies. A fault diagnosis model based on multi-manifold learning and a particle swarm optimization support vector machine (PSO-SVM) is studied. The model is applied to rolling bearing experiments with three kinds of faults. The results verify that this model, based on multi-manifold learning and PSO-SVM, acquires the fault-sensitive features with good accuracy.
Objective: By optimizing an extreme learning machine network with particle swarm optimization, we established a syndrome classification and prediction model for primary liver cancer (PLC), classified and predicted the syndrome diagnoses in PLC medical record data, and compared the prediction results of different algorithms with the clinical diagnoses. This work provides modern technical support for clinical diagnosis and treatment, and improves the objectivity, accuracy, and rigor of traditional Chinese medicine (TCM) syndrome classification. Methods: From three top-level TCM hospitals in Nanchang, 10,602 electronic medical records from patients with PLC were collected, dating from January 2009 to May 2020. We removed the electronic medical records of 542 cases and adopted the cross-validation method on the remaining 10,060 records, which were randomly divided into a training set and a test set. Based on fuzzy mathematics theory, we quantified the syndrome-related factors of TCM symptoms and signs and the information from the four TCM diagnostic methods. Next, using an extreme learning machine network with particle swarm optimization, we constructed a neural network syndrome classification and prediction model that used "TCM symptoms + signs + tongue diagnosis information + pulse diagnosis information" as input and PLC syndrome as output. This approach was used to mine the nonlinear relationship between the clinical data in electronic medical records and the different syndrome types. The classification accuracy rate was used to compare this model with other machine learning classification models. Results: The classification accuracy rate of the model developed here was 86.26%, compared with 82.79% and 85.84% for models using a support vector machine and Bayesian networks, respectively. The classification accuracy rates of the models for all syndromes in this paper were between 82.15% and 93.82%. Conclusion: Compared with data processed using traditional binary inputs, the experiments show that medical record data processed by fuzzy mathematics were more accurate and closer to clinical findings. In addition, the model developed here was more refined, more accurate, and quicker than the other classification models. This model provides reliable diagnosis for the clinical treatment of PLC and a method to study the rules of syndrome differentiation and treatment in TCM.
Global warming is one of the most complicated challenges of our time, placing considerable stress on our societies and on the environment. Its impacts are felt in an unprecedented variety of ways, from shifting weather patterns that threaten food production to rising sea levels that increase the risk of catastrophic flooding. Among the aspects related to global warming, there is growing concern about water resource management, a field aimed at preventing future water crises. The first stage in such management is to recognize the prospective climate parameters influencing future water resource conditions, and numerous prediction models, methods, and tools have been developed and applied for this purpose. In line with this trend, the current study compares three optimization algorithms on the platform of a multilayer perceptron (MLP) network to explore any meaningful connection between large-scale climate indices (LSCIs) and precipitation in Tehran, the capital of Iran, a country located in an arid and semi-arid region that suffers from severe water scarcity caused by years of mismanagement and intensified by global warming. This situation has propelled a great deal of the population to migrate toward more developed cities within the country, especially Tehran; therefore, the current and future environmental conditions of this city, especially its water supply, are of great importance. To tackle this complication, an outlook for future precipitation should be provided and forecasting approaches compatible with the region's characteristics should be developed. To this end, the present study investigates three training methods, namely backpropagation (BP), genetic algorithms (GAs), and particle swarm optimization (PSO), on an MLP platform. Two frameworks distinguished by their input compositions are considered: the Concurrent Model Framework (CMF) and the Integrated Model Framework (IMF). Through these two frameworks, 13 cases are generated: 12 cases within CMF, each containing all selected LSCIs at the same lead time, and one case within IMF constituted from the combination of the LSCIs most correlated with Tehran precipitation at each lead time. After evaluating all model performances through the related statistical tests, a Taylor diagram is used to compare the final selected models across the three optimization algorithms; the best is found to be MLP-PSO in IMF.
The variable air volume (VAV) air conditioning system exhibits strong coupling and large time delays, for which model predictive control (MPC) is normally used to pursue performance improvement. Because it is difficult to select VAV MPC controller parameters that give the system the desired response, a novel tuning method based on machine learning and improved particle swarm optimization (PSO) is proposed. In this method, the relationship between the MPC controller parameters and the time-domain performance indices is established via machine learning, and PSO is then used to optimize the controller parameters for better performance in terms of those indices. In addition, the PSO algorithm is further modified under the principles of population attenuation and event triggering, which tunes the MPC parameters while reducing the computation time of the tuning method. Finally, the effectiveness of the proposed method is validated on a hardware-in-the-loop VAV system.
Funding (magnetite surface study): the National Natural Science Foundation of China (Grant Nos. 12004064, 12074053, and 91961204), the Fundamental Research Funds for the Central Universities (Grant No. DUT22LK11), and the XingLiaoYingCai Project of Liaoning Province, China (Grant No. XLYC1907163).
Funding (indoor positioning study): supported in part by the National Natural Science Foundation of China (U2001213 and 61971191), the Beijing Natural Science Foundation (Grants L182018 and L201011), the National Key Research and Development Project (2020YFB1807204), the Key Project of the Natural Science Foundation of Jiangxi Province (20202ACBL202006), and the Innovation Fund Designated for Graduate Students of Jiangxi Province (YC2020-S321).
Funding (power transformer study): the National Natural Science Foundation of China (No. 52067021), the Natural Science Foundation of Xinjiang (2022D01C35), the Excellent Youth Scientific and Technological Talents Plan of Xinjiang (No. 2019Q012), and the Major Science and Technology Special Project of Xinjiang Uygur Autonomous Region (2022A01002-2).
文摘Power transformer is one of the most crucial devices in power grid.It is significant to determine incipient faults of power transformers fast and accurately.Input features play critical roles in fault diagnosis accuracy.In order to further improve the fault diagnosis performance of power trans-formers,a random forest feature selection method coupled with optimized kernel extreme learning machine is presented in this study.Firstly,the random forest feature selection approach is adopted to rank 42 related input features derived from gas concentration,gas ratio and energy-weighted dissolved gas analysis.Afterwards,a kernel extreme learning machine tuned by the Aquila optimization algorithm is implemented to adjust crucial parameters and select the optimal feature subsets.The diagnosis accuracy is used to assess the fault diagnosis capability of concerned feature subsets.Finally,the optimal feature subsets are applied to establish fault diagnosis model.According to the experimental results based on two public datasets and comparison with 5 conventional approaches,it can be seen that the average accuracy of the pro-posed method is up to 94.5%,which is superior to that of other conventional approaches.Fault diagnosis performances verify that the optimum feature subset obtained by the presented method can dramatically improve power transformers fault diagnosis accuracy.
文摘Extreme Learning Machine(ELM)is popular in batch learning,sequential learning,and progressive learning,due to its speed,easy integration,and generalization ability.While,Traditional ELM cannot train massive data rapidly and efficiently due to its memory residence,high time and space complexity.In ELM,the hidden layer typically necessitates a huge number of nodes.Furthermore,there is no certainty that the arrangement of weights and biases within the hidden layer is optimal.To solve this problem,the traditional ELM has been hybridized with swarm intelligence optimization techniques.This paper displays five proposed hybrid Algorithms“Salp Swarm Algorithm(SSA-ELM),Grasshopper Algorithm(GOA-ELM),Grey Wolf Algorithm(GWO-ELM),Whale optimizationAlgorithm(WOA-ELM)andMoth Flame Optimization(MFO-ELM)”.These five optimizers are hybridized with standard ELM methodology for resolving the tumor type classification using gene expression data.The proposed models applied to the predication of electricity loading data,that describes the energy use of a single residence over a fouryear period.In the hidden layer,Swarm algorithms are used to pick a smaller number of nodes to speed up the execution of ELM.The best weights and preferences were calculated by these algorithms for the hidden layer.Experimental results demonstrated that the proposed MFO-ELM achieved 98.13%accuracy and this is the highest model in accuracy in tumor type classification gene expression data.While in predication,the proposed GOA-ELM achieved 0.397which is least RMSE compared to the other models.
Abstract: The corrosion rate is a crucial factor that impacts the longevity of materials in different applications. After undergoing friction stir processing (FSP), the refined grain structure leads to a notable decrease in corrosion rate. However, a better understanding of the correlation between the FSP process parameters and the corrosion rate is still lacking. The current study used machine learning to establish the relationship between the corrosion rate and the FSP process parameters (rotational speed, traverse speed, and shoulder diameter) for WE43 alloy. The Taguchi L27 design of experiments was used for the experimental analysis. In addition, synthetic data was generated using particle swarm optimization for virtual sample generation (VSG). The application of VSG led to an increase in the prediction accuracy of the machine learning models. A sensitivity analysis was performed using Shapley Additive Explanations to determine the key factors affecting the corrosion rate; the shoulder diameter had a significantly greater impact than the traverse speed. A graphical user interface (GUI) was created to predict the corrosion rate using the identified factors. This study focuses on the WE43 alloy, but its findings can also be used to predict the corrosion rate of other magnesium alloys.
Abstract: Image classification is a core field in the research areas of image processing and computer vision, in which vehicle classification is a critical domain. The purpose of vehicle categorization is to build a compact system to assist in real-world problems and applications such as security, traffic analysis, and self-driving and autonomous vehicles. The recent revolution in machine learning and artificial intelligence has provided immense support for image-processing problems and has overtaken conventional, handcrafted means of solving image analysis problems. In this paper, a combination of the pre-trained CNN GoogleNet and a nature-inspired optimization scheme, particle swarm optimization (PSO), was employed for autonomous vehicle classification. The model was trained on a vehicle image dataset obtained from Kaggle that was suitably augmented. The extracted features were classified using several classifiers; the Cubic SVM (CSVM) classifier was found to outperform the others in both time consumption and accuracy (94.8%). The results obtained from empirical evaluations and statistical tests reveal that the model outperforms other related models not only in terms of accuracy (94.8%) but also in training time (82.7 s) and prediction speed (380 obs/sec).
Funding: Supported by the National Natural Science Foundation of China (Nos. 51176075, 51576097) and the Funding of the Jiangsu Innovation Program for Graduate Education (No. KYLX_0305).
Abstract: Aero-engine direct thrust control can not only improve thrust control precision but also save operating cost, by reducing the reserved design margin and making full use of the aircraft engine's potential performance. However, estimating engine thrust accurately is a big challenge. To tackle this problem, this paper proposes an ensemble of improved wavelet extreme learning machines (EW-ELM) for aircraft engine thrust estimation. Extreme learning machine (ELM) has proven to be an efficient emerging learning technique. Since the combination of ELM and wavelet theory inherits the excellent properties of both, wavelet activation functions are used in the hidden nodes to enhance the ability to handle non-linearity. Besides, since the original ELM may suffer from ill-conditioning and robustness problems due to the random determination of the hidden-node parameters, a particle swarm optimization (PSO) algorithm is adopted to select the input weights and hidden biases. Furthermore, an ensemble of the improved wavelet ELMs is used to construct the relationship between the sensor measurements and thrust. The simulation results verify the effectiveness and efficiency of the developed method and show that aero-engine thrust estimation using EW-ELM can satisfy the requirements of direct thrust control in terms of estimation accuracy and computation time.
Abstract: Extreme learning machine (ELM) allows for fast learning and better generalization performance than conventional gradient-based learning. However, the possible inclusion of non-optimal weights and biases due to random selection, and the need for more hidden neurons, adversely influence network usability. Further, choosing the optimal number of hidden nodes for a network usually requires intensive human intervention, which may lead to an ill-conditioned situation. In this context, chemical reaction optimization (CRO) is a meta-heuristic paradigm with increasing success in a large number of application areas; it is characterized by faster convergence and requires fewer tunable parameters. This study develops a learning framework combining the advantages of ELM and CRO, called extreme learning with chemical reaction optimization (ELCRO). ELCRO simultaneously optimizes the weight and bias vectors and the number of hidden neurons of a single-layer feed-forward neural network without compromising prediction accuracy. We evaluate its performance by predicting the daily volatility and closing prices of BSE indices. Additionally, its performance is compared with three similarly developed models (ELM based on particle swarm optimization, genetic algorithm, and gradient descent), and the proposed algorithm is found to be superior. Wilcoxon signed-rank and Diebold-Mariano tests are then conducted to verify the statistical significance of the proposed model. Hence, this model can be used as a promising tool for financial forecasting.
Abstract: An improved particle swarm optimization (PSO) algorithm based on an ensemble technique is presented. The algorithm combines some of the particles' previous best positions (pbest) to get an ensemble position (Epbest), which is used to replace the global best position (gbest). It is compared with the standard PSO algorithm invented by Kennedy and Eberhart and with some other improved PSO algorithms on three different benchmark functions. The simulation results show that the improved PSO based on the ensemble technique obtains better solutions than the standard PSO and the other improved algorithms in all test cases.
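A hedged sketch of the idea above, assuming a simple averaging ensemble: the mean of the k best personal-best positions (Epbest) replaces gbest in the velocity update. The parameter values and the choice of plain averaging are illustrative, not necessarily the paper's exact scheme.

```python
import numpy as np

def ensemble_pso(f, dim, n_particles=30, iters=200, k=5,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """PSO minimizer where an ensemble position (Epbest), the mean of the
    k best personal-best positions, replaces gbest in the velocity update."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    for _ in range(iters):
        # Epbest: average of the k best personal-best positions so far
        order = np.argsort(pbest_val)[:k]
        epbest = pbest[order].mean(axis=0)
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (epbest - x)
        x = x + v
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    best = np.argmin(pbest_val)
    return pbest[best], float(pbest_val[best])

# Sphere benchmark: the true minimum is 0 at the origin
pos, val = ensemble_pso(lambda p: float(np.sum(p ** 2)), dim=5)
```

Averaging several pbest positions smooths out the pull toward any single (possibly premature) leader, which is the intuition behind replacing gbest.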
Abstract: Deep deterministic policy gradient (DDPG) has been proved effective in optimizing particle swarm optimization (PSO), but whether DDPG can optimize multi-objective discrete particle swarm optimization (MODPSO) remains to be determined. The present work aims to probe into this topic. Experiments showed that DDPG can not only quickly improve the convergence speed of MODPSO but also overcome the local-optimum problem that MODPSO may suffer from. The research findings are of great significance for the theoretical research and application of MODPSO.
Funding: Supported by the Basic Science Center Program for Multiphase Evolution in Hyper Gravity of the National Natural Science Foundation of China (No. 51988101), the National Natural Science Foundation of China (No. 52178306), and the Zhejiang Provincial Natural Science Foundation of China (No. LR19E080002).
Abstract: During construction, the shield linings of tunnels often face the problem of local or overall upward movement after leaving the shield tail, in soft soil areas or in some large-diameter shield projects. Differential floating increases the initial stress on the segments and bolts, which is harmful to the service performance of the tunnel. In this study we used a random forest (RF) algorithm combined with particle swarm optimization (PSO) and 5-fold cross-validation (5-fold CV) to predict the maximum upward displacement of tunnel linings induced by shield tunnel excavation. The mechanisms and factors causing upward movement of the tunnel lining are comprehensively summarized. Twelve input variables were selected according to the results of an analysis of influencing factors. The prediction performance of two models, PSO-RF and RF (default), was compared. The Gini value was obtained to represent the relative importance of the influencing factors to the upward displacement of the linings. The PSO-RF model successfully predicted the maximum upward displacement of the tunnel linings with low error (mean absolute error (MAE) = 4.04 mm, root mean square error (RMSE) = 5.67 mm) and high correlation (R^2 = 0.915). The thrust and depth of the tunnel were the most important factors influencing the upward displacement of the tunnel linings in the prediction model.
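The 5-fold cross-validation used to score each PSO hyperparameter candidate can be sketched as follows. The mean predictor here is a deliberately trivial stand-in for the random forest, and the MAE metric matches the error measure reported above; everything else about the pipeline is an illustrative assumption.

```python
import numpy as np

def kfold_cv_mae(fit_predict, X, y, k=5, seed=0):
    """k-fold CV sketch: shuffle indices, hold out each fold once,
    and return the average validation MAE across folds."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)
    maes = []
    for i in range(k):
        val = folds[i]
        trn = np.concatenate([folds[j] for j in range(k) if j != i])
        pred = fit_predict(X[trn], y[trn], X[val])
        maes.append(np.mean(np.abs(pred - y[val])))
    return float(np.mean(maes))

# Toy "model": predict the training-set mean (stands in for the RF regressor)
mean_predictor = lambda Xtr, ytr, Xval: np.full(len(Xval), ytr.mean())

X = np.arange(100, dtype=float).reshape(-1, 1)
y = np.full(100, 4.0)  # constant target, so the mean predictor is exact
score = kfold_cv_mae(mean_predictor, X, y)
```

In the PSO-RF setup, each particle encodes a candidate set of RF hyperparameters, and this CV score is the fitness the swarm minimizes.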
Funding: Beijing Natural Science Foundation (KZ201211232039), National Natural Science Foundation of China (51275052), the Funding Project for Academic Human Resources Development in Institutions of Higher Learning under the Jurisdiction of Beijing Municipality (PHR201106132), and PXM2014_014224_000080.
Abstract: Fault diagnosis technology plays an important role in industry because an emergency fault in a machine can bring heavy losses to people and companies. A fault diagnosis model based on multi-manifold learning and a particle swarm optimization support vector machine (PSO-SVM) is studied. This fault diagnosis model is applied to a rolling bearing experiment with three kinds of faults. The results verify that the model based on multi-manifold learning and PSO-SVM acquires fault-sensitive features with good accuracy.
Funding: Financially supported by the National Natural Science Foundation (No. 81660727).
Abstract: Objective: By optimizing an extreme learning machine network with particle swarm optimization, we established a syndrome classification and prediction model for primary liver cancer (PLC), classified and predicted the syndrome diagnosis of medical record data for PLC, and compared and analyzed the prediction results of different algorithms against the clinical diagnosis results. This paper provides modern technical support for clinical diagnosis and treatment, and improves the objectivity, accuracy, and rigor of the classification of traditional Chinese medicine (TCM) syndromes. Methods: From three top-level TCM hospitals in Nanchang, 10,602 electronic medical records from patients with PLC were collected, dating from January 2009 to May 2020. We removed the electronic medical records of 542 cases of syndromes and adopted the cross-validation method on the remaining 10,060 electronic medical records, which were randomly divided into a training set and a test set. Based on fuzzy mathematics theory, we quantified the syndrome-related factors of TCM symptoms and signs, and information from the TCM four diagnostic methods. Next, using an extreme learning machine network with particle swarm optimization, we constructed a neural network syndrome classification and prediction model that used "TCM symptoms + signs + tongue diagnosis information + pulse diagnosis information" as input and PLC syndrome as output. This approach was used to mine the nonlinear relationship between the clinical data in the electronic medical records and the different syndrome types. The classification accuracy rate was used to compare this model with other machine learning classification models. Results: The classification accuracy rate of the model developed here was 86.26%. The classification accuracy rates of models using support vector machines and Bayesian networks were 82.79% and 85.84%, respectively. The classification accuracy rates of the models for all syndromes in this paper were between 82.15% and 93.82%. Conclusion: Compared with data processed using traditional binary inputs, the experiment shows that the medical record data processed by fuzzy mathematics were more accurate and closer to clinical findings. In addition, the model developed here was more refined, more accurate, and quicker than the other classification models. This model provides reliable diagnosis for the clinical treatment of PLC and a method to study the rules of syndrome differentiation and treatment in TCM.
Abstract: Global warming is one of the most complicated challenges of our time, putting considerable strain on our societies and on the environment. The impacts of global warming are felt in an unprecedented variety of ways, from shifting weather patterns that threaten food production to rising sea levels that exacerbate the risk of catastrophic flooding. Among all aspects related to global warming, there is growing concern about water resource management, a field targeted at preventing future water crises threatening human beings. The very first stage in such management is to recognize the prospective climate parameters influencing future water resource conditions. Numerous prediction models, methods, and tools have been developed and applied for this purpose. In line with this trend, the current study compares three optimization algorithms on the platform of a multilayer perceptron (MLP) network to explore any meaningful connection between large-scale climate indices (LSCIs) and precipitation in the capital of Iran, a country located in an arid and semi-arid region that suffers from severe water scarcity caused by years of mismanagement and intensified by global warming. This situation has propelled a large share of the population to migrate to more developed cities within the country, especially Tehran. Therefore, the current and future environmental conditions of this city, especially its water supply conditions, are of great importance. To tackle this complication, an outlook for future precipitation should be provided and appropriate forecasting trajectories compatible with this region's characteristics should be developed. To this end, the present study investigates three training methods, namely backpropagation (BP), genetic algorithms (GAs), and particle swarm optimization (PSO), on an MLP platform. Two frameworks distinguished by their input compositions are considered in this study: the Concurrent Model Framework (CMF) and the Integrated Model Framework (IMF). Through these two frameworks, 13 cases are generated: 12 cases within CMF, each of which contains all selected LSCIs at the same lead-time, and one case within IMF constituted from the combination of the LSCIs most correlated with Tehran precipitation at each lead-time. Following the evaluation of all model performances through related statistical tests, a Taylor diagram is used to compare the final selected models across the three optimization algorithms, the best of which is found to be MLP-PSO in IMF.
Funding: Supported by the National Natural Science Foundation of China (No. 61903291) and the Key Research and Development Program of Shaanxi Province (No. 2022NY-094).
Abstract: The variable air volume (VAV) air conditioning system exhibits strong coupling and large time delays, so model predictive control (MPC) is normally used to pursue performance improvements. Because it is difficult to select MPC controller parameters that give the system the desired response, a novel tuning method based on machine learning and improved particle swarm optimization (PSO) is proposed. In this method, the relationship between the MPC controller parameters and time-domain performance indices is established via machine learning. Then PSO is used to optimize the MPC controller parameters to achieve better performance in terms of the time-domain indices. In addition, the PSO algorithm is further modified under the principles of population attenuation and event triggering to tune the MPC parameters and reduce the computation time of the tuning method. Finally, the effectiveness of the proposed method is validated via a hardware-in-the-loop VAV system.
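The tuning loop above, learn a map from controller parameters to a time-domain performance index and then search that map, can be sketched with a toy surrogate. The quadratic model, the 1-D parameter, and the grid search are illustrative stand-ins for the paper's learned model and modified PSO; the "true" optimum at p = 2.0 is a made-up value for demonstration.

```python
import numpy as np

# Hypothetical "measured" performance index for a 1-D controller parameter p,
# with a true optimum at p = 2.0 plus small measurement noise.
rng = np.random.default_rng(0)
p_samples = rng.uniform(0.0, 4.0, 40)
J_samples = (p_samples - 2.0) ** 2 + 0.01 * rng.normal(size=40)

# Step 1: learn the parameter-to-index map with a quadratic surrogate
# (plain least squares stands in for the paper's machine-learning model).
A = np.stack([p_samples ** 2, p_samples, np.ones_like(p_samples)], axis=1)
coef, *_ = np.linalg.lstsq(A, J_samples, rcond=None)

# Step 2: search the surrogate for the best parameter
# (a dense grid search stands in for the modified PSO).
grid = np.linspace(0.0, 4.0, 401)
p_best = grid[np.argmin(coef[0] * grid ** 2 + coef[1] * grid + coef[2])]
```

Searching the cheap surrogate instead of re-running the closed-loop system for every candidate is what makes this kind of tuning practical on hardware-in-the-loop setups.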