BACKGROUND: Intensive care unit-acquired weakness (ICU-AW) is a common complication that significantly impacts the patient's recovery process and can even lead to adverse outcomes. Currently, there is a lack of effective preventive measures. AIM: To identify significant risk factors for ICU-AW through iterative machine learning techniques and offer recommendations for its prevention and treatment. METHODS: Patients were categorized into ICU-AW and non-ICU-AW groups on the 14th day post-ICU admission. Relevant data from the initial 14 days of ICU stay, such as age, comorbidities, sedative dosage, vasopressor dosage, duration of mechanical ventilation, length of ICU stay, and rehabilitation therapy, were gathered. The relationships between these variables and ICU-AW were examined. Utilizing iterative machine learning techniques, a multilayer perceptron neural network model was developed, and its predictive performance for ICU-AW was assessed using the receiver operating characteristic curve. RESULTS: Within the ICU-AW group, age, duration of mechanical ventilation, lorazepam dosage, adrenaline dosage, and length of ICU stay were significantly higher than in the non-ICU-AW group. Additionally, the proportions of sepsis, multiple organ dysfunction syndrome, hypoalbuminemia, acute heart failure, respiratory failure, acute kidney injury, anemia, stress-related gastrointestinal bleeding, shock, hypertension, coronary artery disease, malignant tumors, and rehabilitation therapy were significantly higher in the ICU-AW group. The most influential factors contributing to ICU-AW were identified as the length of ICU stay (100.0%) and the duration of mechanical ventilation (54.9%). The neural network model predicted ICU-AW with an area under the curve of 0.941, sensitivity of 92.2%, and specificity of 82.7%. CONCLUSION: The main factors influencing ICU-AW are the length of ICU stay and the duration of mechanical ventilation. A primary preventive strategy, when feasible, involves minimizing both ICU stay and mechanical ventilation duration.
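To make the modelling step concrete, the sketch below trains a multilayer perceptron classifier on tabular ICU-style features and scores it with the area under the ROC curve, as described in the abstract; the feature names and synthetic data are illustrative assumptions, not the study's cohort.

```python
# Minimal sketch: MLP classifier for a binary ICU-AW label, evaluated by ROC AUC.
# Feature names and synthetic data are illustrative placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.normal(65, 12, n),      # age (years)
    rng.gamma(2.0, 4.0, n),     # days of mechanical ventilation
    rng.gamma(2.0, 5.0, n),     # length of ICU stay (days)
])
# Synthetic label loosely driven by ICU stay and ventilation time
y = (0.08 * X[:, 1] + 0.10 * X[:, 2] + rng.normal(0, 1, n) > 1.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
scaler = StandardScaler().fit(X_tr)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_tr), y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(scaler.transform(X_te))[:, 1])
print(f"ROC AUC on held-out data: {auc:.3f}")
```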
Machine learning (ML) is well suited for the prediction of high-complexity, high-dimensional problems such as those encountered in terminal ballistics. We evaluate the performance of four popular ML-based regression models, extreme gradient boosting (XGBoost), artificial neural network (ANN), support vector regression (SVR), and Gaussian process regression (GP), on two common terminal ballistics problems: (a) predicting the V50 ballistic limit of monolithic metallic armour impacted by small and medium calibre projectiles and fragments, and (b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness. To achieve this we utilise two datasets, each consisting of approximately 1000 samples, collated from public release sources. We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data and diverge when extrapolating outside this range. Although extrapolation is not advisable for ML-based regression models, for applications such as lethality/survivability analysis, such capability is required. To circumvent this, we implement expert knowledge and physics-based models via enforced monotonicity, as a Gaussian prior mean, and through a modified loss function. The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models, providing an ability to accurately fit experimental data when it is available and then revert to the physics-based model when not. The resulting models demonstrate high levels of predictive accuracy over a very wide range of projectile types, target materials and thicknesses, and impact conditions significantly more diverse than that achievable from any existing analytical approach. Compared with numerical analysis tools such as finite element solvers, the ML models run orders of magnitude faster. We provide some general guidelines throughout for the development, application, and reporting of ML models in terminal ballistics problems.
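The enforced-monotonicity idea mentioned above can be illustrated with a gradient-boosted model. The sketch below uses XGBoost's monotone_constraints option on placeholder features (target thickness, target density, projectile mass); these features and the synthetic data are assumptions for illustration, not the authors' feature set or model.

```python
# Sketch: enforcing expert-knowledge monotonicity in a gradient-boosted regressor,
# one of the physics-informed strategies named above. Data are synthetic placeholders.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(1)
n = 800
thickness = rng.uniform(2, 60, n)      # target thickness (mm), placeholder
density = rng.uniform(2.7, 7.9, n)     # target density (g/cm^3), placeholder
proj_mass = rng.uniform(5, 50, n)      # projectile mass (g), placeholder
# Synthetic "V50" that rises with thickness/density and falls with projectile mass
v50 = 80 * np.sqrt(thickness * density / proj_mass) + rng.normal(0, 20, n)

X = np.column_stack([thickness, density, proj_mass])
# +1: prediction must not decrease with the feature; -1: must not increase
model = xgb.XGBRegressor(
    n_estimators=400,
    max_depth=4,
    learning_rate=0.05,
    monotone_constraints="(1,1,-1)",
)
model.fit(X, v50)
print(model.predict(np.array([[20.0, 7.9, 10.0]])))
```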
Magnesium (Mg) alloys have shown great prospects as both structural and biomedical materials, while poor corrosion resistance limits their further application. In this work, to avoid time-consuming and laborious experimental trials, a high-throughput computational strategy based on first-principles calculations is designed for screening corrosion-resistant binary Mg alloys with intermetallics, from both the thermodynamic and kinetic perspectives. The stable binary Mg intermetallics with a low equilibrium potential difference with respect to the Mg matrix are first identified. Then, the hydrogen adsorption energies on the surfaces of these Mg intermetallics are calculated, and the corrosion exchange current density is further calculated by a hydrogen evolution reaction (HER) kinetic model. Several intermetallics, e.g. Y3Mg, Y2Mg and La5Mg, are identified to be promising intermetallics which might effectively hinder the cathodic HER. Furthermore, machine learning (ML) models are developed to predict Mg intermetallics with proper hydrogen adsorption energy employing the work function (W_f) and weighted first ionization energy (WFIE). The generalization of the ML models is tested on five new binary Mg intermetallics with an average root mean square error (RMSE) of 0.11 eV. This study not only predicts some promising binary Mg intermetallics which may suppress galvanic corrosion, but also provides a high-throughput screening strategy and ML models for the design of corrosion-resistant alloys, which can be extended to ternary Mg alloys or other alloy systems.
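A minimal sketch of the descriptor-based regression step follows, assuming synthetic values for the work function and WFIE rather than DFT data, and reporting a test RMSE in the same spirit as the 0.11 eV figure quoted above.

```python
# Sketch: fitting hydrogen adsorption energy against the two descriptors named above
# (work function W_f and weighted first ionization energy, WFIE) and reporting RMSE.
# All numbers are synthetic placeholders, not DFT results from the study.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 120
W_f = rng.uniform(2.5, 4.5, n)      # work function (eV), placeholder range
WFIE = rng.uniform(5.0, 8.0, n)     # weighted first ionization energy (eV), placeholder
E_ads = -0.9 + 0.35 * W_f - 0.12 * WFIE + rng.normal(0, 0.05, n)  # synthetic target (eV)

X = np.column_stack([W_f, WFIE])
X_tr, X_te, y_tr, y_te = train_test_split(X, E_ads, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
rmse = np.sqrt(mean_squared_error(y_te, model.predict(X_te)))
print(f"test RMSE: {rmse:.3f} eV")
```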
Reducing the aerodynamic drag and noise levels of high-speed pantographs is important for promoting environmentally friendly, energy-efficient and rapid advances in train technology. Using computational fluid dynamics theory and the K-FWH acoustic equation, a numerical simulation is conducted to investigate the aerodynamic characteristics of high-speed pantographs. A component optimization method is proposed as a possible solution to the problem of aerodynamic drag and noise in high-speed pantographs. The results of the study indicate that the panhead, base and insulator are the main contributors to aerodynamic drag and noise in high-speed pantographs. Therefore, a gradual optimization process is implemented to improve the most significant components that cause aerodynamic drag and noise. By optimizing the cross-sectional shape of the strips and insulators, the drag and noise caused by airflow separation and vortex shedding can be reduced. An insulator with a circular cross-section and strips with a rectangular cross-section produce the largest aerodynamic drag. Giving the insulators an elliptical cross-section and optimizing the chamfer angle and height of the windward surface of the strips can improve the aerodynamic performance of the pantograph. In addition, a streamlined fairing attached to the base can eliminate the complex flow and shield the radiated noise. In contrast to the original pantograph design, the improved pantograph shows a 21.1% reduction in aerodynamic drag and a 1.65 dBA reduction in aerodynamic noise.
This work constructed a machine learning (ML) model to predict the atmospheric corrosion rate of low-alloy steels (LAS). The material properties of LAS, environmental factors, and exposure time were used as the input, while the corrosion rate was the output. Six different ML algorithms were used to construct the proposed model. Through optimization and filtering, the eXtreme gradient boosting (XGBoost) model exhibited good corrosion rate prediction accuracy. The features of material properties were then transformed into atomic and physical features using the proposed property transformation approach, and the dominant descriptors that affected the corrosion rate were filtered using the recursive feature elimination (RFE) and XGBoost methods. The established ML models exhibited better prediction performance and generalization ability via the property transformation descriptors. In addition, the SHapley Additive exPlanations (SHAP) method was applied to analyze the relationship between the descriptors and the corrosion rate. The results showed that the property transformation model could effectively help with analyzing the corrosion behavior, thereby significantly improving the generalization ability of corrosion rate prediction models.
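A hedged sketch of the RFE-plus-SHAP workflow around an XGBoost regressor follows; the column names and data are placeholders, not the paper's transformed descriptors.

```python
# Sketch: feature ranking with RFE around an XGBoost regressor, plus SHAP values for
# interpretation. Column names and data are illustrative placeholders.
import numpy as np
import pandas as pd
import xgboost as xgb
import shap
from sklearn.feature_selection import RFE

rng = np.random.default_rng(2)
n = 600
X = pd.DataFrame({
    "Cr_content": rng.uniform(0, 1.5, n),          # wt.%, placeholder
    "Cu_content": rng.uniform(0, 0.6, n),          # wt.%, placeholder
    "chloride_deposition": rng.uniform(0, 80, n),  # mg/(m^2 d), placeholder
    "wetness_time": rng.uniform(0, 6000, n),       # h/yr, placeholder
    "exposure_years": rng.uniform(0.5, 16, n),
})
y = (0.3 * X["chloride_deposition"] + 0.01 * X["wetness_time"]
     - 20 * X["Cr_content"] + rng.normal(0, 5, n))

model = xgb.XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
selector = RFE(model, n_features_to_select=3).fit(X, y)
print("kept features:", list(X.columns[selector.support_]))

model.fit(X, y)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
print("mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0))
```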
The majority of the projectiles used in hypersonic penetration studies are solid flat-nosed cylindrical projectiles with a diameter of less than 20 mm. This study aims to fill the gap in the experimental and analytical study of the evolution of the nose shape of larger hollow projectiles under hypersonic penetration. In the hypersonic penetration test, eight ogive-nose AerMet100 steel projectiles with a diameter of 40 mm were launched to hit concrete targets with impact velocities that ranged from 1351 to 1877 m/s. Severe erosion of the projectiles was observed during high-speed penetration of the heterogeneous targets, and apparent localized mushrooming occurred in the front nose of the recovered projectiles. By examining the damage to the projectiles, a linear relationship was found between the relative length reduction rate and the initial kinetic energy of the projectiles in the different penetration tests. Furthermore, microscopic analysis revealed the forming mechanisms of the localized mushrooming phenomenon in eroding penetration, i.e., a material spall erosion abrasion mechanism, a material flow and redistribution abrasion mechanism, and a localized radial upsetting deformation mechanism. Finally, a model of high-speed penetration that includes erosion was established on the basis of a model of the evolution of the projectile nose that considers radial upsetting; the model was validated by test data from the literature and from the present study. Depending upon the impact velocity, v0, the projectile nose may behave as undistorted, radially distorted or hemispherical. Due to the effects of projectile abrasion and of the enhancement by radial upsetting of the duration and amplitude of the secondary rising segment in the pulse shape of the projectile deceleration, the predicted depth of penetration (DOP) had an upper limit.
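The reported linear dependence of erosion on impact energy can be written compactly as below; the coefficients a and b are empirical fit parameters that are not reproduced here, and m denotes the projectile mass.

```latex
% Relative length reduction of the eroded projectile vs. initial kinetic energy.
% a and b are empirical fit coefficients (values not reported here); m is projectile mass.
\[
E_{k0} = \tfrac{1}{2} m v_0^{2}, \qquad
\frac{\Delta L}{L_0} = a + b\, E_{k0}
\]
```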
In this paper, application examples of high-speed electrical machines are presented, and the machine structures are categorized. Key issues of design and control for high-speed permanent magnet machines are reviewed, including bearing selection, rotor dynamics analysis and design, rotor stress analysis and protection, thermal analysis and design, electromagnetic loss analysis and reduction, sensorless control strategies, as well as the comparison and selection of sine-wave and square-wave drive modes. Some remaining challenges are also discussed to indicate where future studies could be focused.
The reliable operation of high-speed wire rod finishing mills is crucial in steel production enterprises. As complex system-level equipment, high-speed wire rod finishing mills are difficult to monitor in real time and to localize faults in. To solve these problems, a hybrid fault diagnosis method for high-speed wire rod finishing mills, driven by both expert experience and data, is proposed in this paper. First, based on the mill's mechanical structure, time- and frequency-domain analyses are improved for fault feature extraction. Time-domain analysis combines the effective (root-mean-square) value and peak value with the kurtosis index. In frequency-domain analysis, speed adjustment and side-frequency analysis are proposed to obtain accurate component characteristic frequencies and their corresponding sidebands. Then, according to the time- and frequency-domain characteristics, fault localization based on expert experience is proposed to obtain an accurate fault result. Finally, the proposed method is implemented in an equipment intelligent diagnosis system. Taking an equipment fault on site as an example, the effectiveness of the proposed method is illustrated in the system.
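The time-domain indicators named above can be computed in a few lines; the sketch below uses a synthetic vibration signal and an assumed sampling rate, so the values are illustrative only.

```python
# Sketch: the time-domain indicators mentioned above (RMS, peak, kurtosis) computed
# from a vibration signal. The signal and sampling rate are synthetic placeholders.
import numpy as np
from scipy.stats import kurtosis

fs = 10_000                          # sampling rate (Hz), assumed
t = np.arange(0, 1.0, 1 / fs)
# Healthy-looking tone plus sparse impacts that mimic a bearing/gear defect
signal = np.sin(2 * np.pi * 120 * t) + 0.1 * np.random.default_rng(3).normal(size=t.size)
signal[::500] += 3.0                 # periodic impacts

rms = np.sqrt(np.mean(signal ** 2))
peak = np.max(np.abs(signal))
kurt = kurtosis(signal, fisher=False)   # kurtosis index; ~3.0 for a Gaussian signal

print(f"RMS={rms:.3f}, peak={peak:.3f}, kurtosis={kurt:.2f}, crest factor={peak / rms:.2f}")
```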
The high-throughput prediction of the thermodynamic phase behavior of active pharmaceutical ingredients (APIs) with pharmaceutically relevant excipients remains a major scientific challenge in the screening of pharmaceutical formulations. In this work, a machine-learning model is developed that efficiently predicts the solubility of APIs in polymers by learning the phase equilibrium principle and using a few molecular descriptors. Under a few-shot learning framework, thermodynamic theory (perturbed-chain statistical associating fluid theory) was used for data augmentation, and computational chemistry was applied for molecular descriptor screening. The results showed that the developed machine-learning model can predict the API-polymer phase diagram accurately, broaden the solubility data of APIs in polymers, and successfully reproduce the relationship between API solubility and the interaction mechanisms between API and polymer, providing efficient guidance for the development of pharmaceutical formulations.
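A minimal sketch of theory-aided data augmentation under a few-shot setting follows; the saft_like_solubility helper is a hypothetical stand-in for a thermodynamic (PC-SAFT-type) calculation, and all data are placeholders.

```python
# Sketch of theory-aided data augmentation for few-shot solubility prediction: a small
# set of "experimental" points is augmented with pseudo-data from a thermodynamic model,
# then a regressor is fit on the union. `saft_like_solubility` is a stand-in placeholder.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(4)

def saft_like_solubility(temperature_K, interaction_param):
    """Placeholder for a thermodynamic (e.g. PC-SAFT-type) solubility estimate."""
    return 0.02 * np.exp(-1500.0 / temperature_K) * (1.0 + 5.0 * interaction_param)

# A few experimental samples (features: temperature, a molecular interaction descriptor)
X_exp = np.array([[298.0, 0.10], [313.0, 0.20], [333.0, 0.15], [353.0, 0.30]])
y_exp = np.array([0.004, 0.009, 0.011, 0.025])

# Dense pseudo-data from the theory over the same feature space
T_grid = rng.uniform(290, 360, 200)
k_grid = rng.uniform(0.05, 0.35, 200)
X_aug = np.column_stack([T_grid, k_grid])
y_aug = saft_like_solubility(T_grid, k_grid)

X_train = np.vstack([X_exp, X_aug])
y_train = np.concatenate([y_exp, y_aug])
model = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)
print(model.predict([[323.0, 0.2]]))
```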
Amid the scarcity of lunar meteorites and the imperative to preserve their scientific value, nondestructive testing methods are essential. This translates into the application of microscale rock mechanics experiments and scanning electron microscopy for surface composition analysis. This study explores the application of machine learning algorithms in predicting the mineralogical and mechanical properties of the DHOFAR 1084, JAH 838, and NWA 11444 lunar meteorites based solely on their atomic percentage compositions. Leveraging a prior-data fitted network model, we achieved near-perfect classification scores for meteorites, mineral groups, and individual minerals. The regressor models, notably the KNeighbors model, provided an outstanding estimate of the mechanical properties previously measured by nanoindentation tests, such as hardness, reduced Young's modulus, and elastic recovery. Further considerations of the nature and physical properties of the minerals forming these meteorites, including porosity, crystal orientation, or shock degree, are essential for refining predictions. Our findings underscore the potential of machine learning in enhancing mineral identification and mechanical property estimation in lunar exploration, paving the way for new advancements and quick assessments in extraterrestrial mineral mining, processing, and research.
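As a hedged illustration of the regression step, the sketch below fits a k-nearest-neighbours model from atomic-percentage composition to a hardness-like property; the compositions and property values are synthetic, not meteorite measurements.

```python
# Sketch: a k-nearest-neighbours regressor mapping atomic-percentage composition to a
# mechanical property (here a synthetic "hardness"), in the spirit of the KNeighbors model.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(8)
n = 150
# Simplified composition vectors: at.% of O, Si, Mg, Fe, Ca (rows sum to 100)
comp = rng.dirichlet(alpha=[8, 5, 4, 2, 1], size=n) * 100
hardness = 6.0 + 0.05 * comp[:, 1] - 0.03 * comp[:, 3] + rng.normal(0, 0.3, n)  # GPa, synthetic

knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
scores = cross_val_score(knn, comp, hardness, cv=5, scoring="neg_root_mean_squared_error")
print("CV RMSE (GPa):", -scores.mean())
```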
In recent years, the global surge of High-speed Railway (HSR) has revolutionized ground transportation, providing secure, comfortable, and punctual services. The next-generation HSR, fueled by emerging services like video surveillance, emergency communication, and real-time scheduling, demands advanced capabilities in real-time perception, automated driving, and digitized services, which accelerate the integration and application of Artificial Intelligence (AI) in the HSR system. This paper first provides a brief overview of AI, covering its origin, evolution, and breakthrough applications. A comprehensive review is then given of the most advanced AI technologies and applications in three macro application domains of the HSR system: mechanical manufacturing and electrical control, communication and signal control, and transportation management. The literature is categorized and compared across nine application directions: intelligent manufacturing of trains and key components, forecasting of railroad maintenance, optimization of energy consumption in railroads and trains, communication security, communication dependability, channel modeling and estimation, passenger scheduling, traffic flow forecasting, and the high-speed railway smart platform. Finally, challenges associated with the application of AI are discussed, offering insights for future research directions.
Stroke is a leading cause of disability and mortality worldwide, necessitating the development of advanced technologies to improve its diagnosis, treatment, and patient outcomes. In recent years, machine learning techniques have emerged as promising tools in stroke medicine, enabling efficient analysis of large-scale datasets and facilitating personalized and precision medicine approaches. This abstract provides a comprehensive overview of machine learning's applications, challenges, and future directions in stroke medicine. Recently introduced machine learning algorithms have been extensively employed in all fields of stroke medicine. Machine learning models have demonstrated remarkable accuracy in imaging analysis, diagnosing stroke subtypes, risk stratification, guiding medical treatment, and predicting patient prognosis. Despite the tremendous potential of machine learning in stroke medicine, several challenges must be addressed. These include the need for standardized and interoperable data collection, robust model validation and generalization, and the ethical considerations surrounding privacy and bias. In addition, integrating machine learning models into clinical workflows and establishing regulatory frameworks are critical for ensuring their widespread adoption and impact in routine stroke care. Machine learning promises to revolutionize stroke medicine by enabling precise diagnosis, tailored treatment selection, and improved prognostication. Continued research and collaboration among clinicians, researchers, and technologists are essential for overcoming challenges and realizing the full potential of machine learning in stroke care, ultimately leading to enhanced patient outcomes and quality of life. This review aims to summarize the current implications of machine learning in stroke diagnosis, treatment, and prognostic evaluation, and to explore the future perspectives these techniques can provide in combating this disabling disease.
The high rate of early recurrence of hepatocellular carcinoma (HCC) after curative surgical intervention poses a substantial clinical hurdle, impacting patient outcomes and complicating postoperative management. The advent of machine learning provides a unique opportunity to harness vast datasets, identifying subtle patterns and factors that elude conventional prognostic methods. Machine learning models, equipped with the ability to analyse intricate relationships within datasets, have shown promise in predicting outcomes in various medical disciplines. In the context of HCC, the application of machine learning to predict early recurrence holds potential for personalized postoperative care strategies. This editorial comments on a study exploring the merits and efficacy of random survival forests (RSF) in identifying significant risk factors for recurrence, stratifying patients at low and high risk of HCC recurrence, and comparing this to traditional Cox proportional hazards models (CPH). In doing so, the study demonstrated that RSF models are superior to traditional CPH models in predicting recurrence of HCC and represent a giant leap towards precision medicine.
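A minimal sketch of the RSF-versus-CPH comparison using scikit-survival follows; the covariates, event times, and censoring pattern are synthetic placeholders, not the HCC cohort discussed in the study.

```python
# Sketch: comparing a random survival forest with a Cox proportional hazards model by
# concordance index. All covariates and survival times below are synthetic placeholders.
import numpy as np
from sksurv.util import Surv
from sksurv.ensemble import RandomSurvivalForest
from sksurv.linear_model import CoxPHSurvivalAnalysis

rng = np.random.default_rng(5)
n = 400
X = np.column_stack([
    rng.normal(5.0, 2.0, n),      # tumour size (cm), placeholder
    rng.integers(0, 2, n),        # microvascular invasion (0/1), placeholder
    rng.lognormal(3.0, 1.0, n),   # AFP level, placeholder
])
risk = 0.3 * X[:, 0] + 1.0 * X[:, 1]
time = rng.exponential(scale=np.exp(2.5 - 0.3 * risk))   # recurrence-free time, arbitrary units
event = rng.uniform(size=n) < 0.7                        # ~70% of recurrences observed
y = Surv.from_arrays(event=event, time=time)

X_tr, X_te, y_tr, y_te = X[:300], X[300:], y[:300], y[300:]
rsf = RandomSurvivalForest(n_estimators=200, random_state=0).fit(X_tr, y_tr)
cph = CoxPHSurvivalAnalysis().fit(X_tr, y_tr)
print("RSF concordance index:", round(rsf.score(X_te, y_te), 3))
print("CPH concordance index:", round(cph.score(X_te, y_te), 3))
```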
The martensitic transformation temperature is the basis for the application of shape memory alloys (SMAs), and the ability to quickly and accurately predict the transformation temperature of SMAs has very important practical significance. In this work, machine learning (ML) methods were utilized to accelerate the search for shape memory alloys with targeted properties (phase transition temperature). A group of composition data was selected to design shape memory alloys using a reverse design method from numerous unexplored data. Composition modeling and feature modeling were used to predict the phase transition temperature of the shape memory alloys. Experimental results for the shape memory alloys were obtained to verify the effectiveness of the support vector regression (SVR) model. The results show that the machine learning model can obtain target materials more efficiently and pertinently, and realize the accurate and rapid design of shape memory alloys with a specific target phase transition temperature. On this basis, the relationship between the phase transition temperature and the material descriptors is analyzed, and it is shown that the key factors affecting the phase transition temperature of shape memory alloys are based on the strength of the bond energy between atoms. This work provides new ideas for the controllable design and performance optimization of Cu-based shape memory alloys.
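A hedged SVR sketch in the spirit of the composition-to-transition-temperature model follows; the alloying elements, composition ranges, and temperatures are illustrative assumptions only.

```python
# Sketch: support vector regression from composition descriptors to a transformation
# temperature. Compositions and temperatures are synthetic placeholders.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(9)
n = 200
al = rng.uniform(10, 15, n)      # wt.% Al, placeholder
ni = rng.uniform(2, 6, n)        # wt.% Ni, placeholder
mn = rng.uniform(0, 3, n)        # wt.% Mn, placeholder
Ms = 350 - 12 * al + 6 * ni - 9 * mn + rng.normal(0, 5, n)   # synthetic transition temperature (K)

X = np.column_stack([al, ni, mn])
X_tr, X_te, y_tr, y_te = train_test_split(X, Ms, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=1.0))
model.fit(X_tr, y_tr)
print("MAE (K):", round(mean_absolute_error(y_te, model.predict(X_te)), 1))
```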
Fires, including wildfires, harm air quality and essential public services like transportation, communication, and utilities. These fires can also influence atmospheric conditions, including temperature and aerosols, potentially affecting severe convective storms. Here, we investigate the remote impacts of fires in the western United States (WUS) on the occurrence of large hail (size ≥ 2.54 cm) in the central US (CUS) over the 20-year period of 2001–20 using the Random Forest (RF) and Extreme Gradient Boosting (XGB) machine learning (ML) methods. The developed RF and XGB models demonstrate high accuracy (>90%) and F1 scores of up to 0.78 in predicting large hail occurrences when WUS fires and CUS hailstorms coincide, particularly in four states (Wyoming, South Dakota, Nebraska, and Kansas). The key contributing variables identified by both ML models include the meteorological variables in the fire region (temperature and moisture), the westerly wind over the plume transport path, and the fire features (i.e., the maximum fire power and burned area). The results confirm a linkage between WUS fires and severe weather in the CUS, corroborating the findings of our previous modeling study conducted on case simulations with a detailed physics model.
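The classification-and-scoring step can be sketched as below with a random forest and the F1 metric; the predictor names and data are placeholders, not the WUS-fire/CUS-hail dataset.

```python
# Sketch: a random-forest classifier for large-hail occurrence evaluated with accuracy
# and F1 score. Predictor names and data are illustrative placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(10)
n = 2000
X = np.column_stack([
    rng.normal(300, 5, n),        # fire-region temperature (K), placeholder
    rng.uniform(0, 1, n),         # fire-region moisture index, placeholder
    rng.normal(10, 4, n),         # westerly wind along transport path (m/s), placeholder
    rng.gamma(2.0, 50.0, n),      # maximum fire radiative power, placeholder
])
y = ((0.15 * X[:, 2] + 0.005 * X[:, 3] + rng.normal(0, 1, n)) > 2.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(f"accuracy={accuracy_score(y_te, pred):.2f}, F1={f1_score(y_te, pred):.2f}")
```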
Membrane technologies are becoming increasingly versatile and helpful today for sustainable development. Machine learning (ML), an essential branch of artificial intelligence (AI), has substantially impacted the research and development norm of new materials for energy and the environment. This review provides an overview of and perspectives on ML methodologies and their applications in membrane design and discovery. A brief overview of membrane technologies is first provided, with the current bottlenecks and potential solutions. Through an applications-based perspective of AI-aided membrane design and discovery, we further show how ML strategies are applied to the membrane discovery cycle (including membrane material design, membrane application, membrane process design, and knowledge extraction) in various membrane systems, ranging from gas and liquid separation membranes to fuel cell membranes. Furthermore, the best practices for integrating ML methods and specific application targets in membrane design and discovery are presented, with an ideal paradigm proposed. The challenges to be addressed and the prospects of AI applications in membrane discovery are also highlighted at the end.
Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning the entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, namely the GAN-aided GRU, was extensively evaluated for various predictive scenarios, such as interpolation, extrapolation, and a limited dataset size. The model exhibited significant predictability and improved generalizability for estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and for three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent predictivity of the GRU for extrapolation. As a first attempt to employ ML techniques other than artificial neural networks, this study proposes a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
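As a rough sketch of the sequence-model component (without the GAN augmentation or hyperparameter search), the code below defines a small GRU that maps a strain sequence plus a condition vector to a stress sequence; the architecture sizes and toy curves are assumptions for illustration, not the paper's model.

```python
# Minimal sketch of a GRU-based flow-curve regressor: given a strain sequence plus a
# loading/annealing condition encoding, predict the stress sequence. Toy stand-in only.
import torch
import torch.nn as nn

class FlowCurveGRU(nn.Module):
    def __init__(self, n_condition_features: int, hidden: int = 32):
        super().__init__()
        self.gru = nn.GRU(input_size=1 + n_condition_features,
                          hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, strain, condition):
        # strain: (batch, steps, 1); condition: (batch, n_condition_features)
        cond = condition.unsqueeze(1).expand(-1, strain.size(1), -1)
        out, _ = self.gru(torch.cat([strain, cond], dim=-1))
        return self.head(out)                      # predicted stress, (batch, steps, 1)

# Toy data: 16 synthetic curves, 50 strain steps, 4 condition features
strain = torch.linspace(0, 0.15, 50).view(1, 50, 1).repeat(16, 1, 1)
condition = torch.rand(16, 4)
stress = 200 * strain ** 0.3 + condition[:, :1].view(16, 1, 1)   # placeholder "flow curves"

model = FlowCurveGRU(n_condition_features=4)
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    optim.zero_grad()
    loss = nn.functional.mse_loss(model(strain, condition), stress)
    loss.backward()
    optim.step()
print("final MSE:", float(loss))
```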
As some recent information security legislation has endowed users with unconditional rights to be forgotten by any trained machine learning model, personalised IoT service providers have to take unlearning functionality into consideration. The most straightforward method to unlearn users' contributions is to retrain the model from the initial state, which is not realistic in high-throughput applications with frequent unlearning requests. Though some machine unlearning frameworks have been proposed to speed up the retraining process, they fail to match decentralised learning scenarios. A decentralised unlearning framework called the heterogeneous decentralised unlearning framework with seed (HDUS) is designed, which uses distilled seed models to construct erasable ensembles for all clients. Moreover, the framework is compatible with heterogeneous on-device models, representing stronger scalability in real-world applications. Extensive experiments on three real-world datasets show that our HDUS achieves state-of-the-art performance.
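The erasable-ensemble idea can be illustrated conceptually: each client holds a lightweight seed model and unlearning a client simply drops its sub-model. The sketch below is a generic illustration of that idea under synthetic data, not the HDUS algorithm itself.

```python
# Conceptual sketch of an "erasable ensemble": each client contributes a lightweight seed
# model, predictions are averaged, and unlearning a client amounts to dropping its sub-model.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(6)
X = rng.normal(size=(900, 10))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)

# Each client trains a small "seed" model on its own data shard
shards = np.array_split(np.arange(900), 3)
seed_models = {f"client_{i}": SGDClassifier(loss="log_loss", random_state=i).fit(X[idx], y[idx])
               for i, idx in enumerate(shards)}

def ensemble_predict(models, X):
    votes = np.mean([m.predict(X) for m in models.values()], axis=0)
    return (votes >= 0.5).astype(int)

X_test = rng.normal(size=(200, 10))
y_test = (X_test[:, 0] + 0.5 * X_test[:, 3] > 0).astype(int)
print("accuracy, all clients:", (ensemble_predict(seed_models, X_test) == y_test).mean())

# "Unlearning" client_1: remove its seed model; the other clients are untouched
del seed_models["client_1"]
print("accuracy, after unlearning:", (ensemble_predict(seed_models, X_test) == y_test).mean())
```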
Jet grouting is one of the most popular soil improvement techniques, but its design usually involves great uncertainties that can lead to economic cost overruns in construction projects. The high dispersion in the properties of the improved material leads designers to assume a conservative, arbitrary and unjustified strength, which is sometimes even conditioned on the results of test fields. The present paper presents an approach for prediction of the uniaxial compressive strength (UCS) of jet grouting columns based on the analysis of several machine learning algorithms on a database of 854 results, mainly collected from different research papers. The selected machine learning model (extremely randomized trees) relates the soil type and various parameters of the technique to the value of the compressive strength. Despite the complex mechanism that surrounds the jet grouting process, evidenced by the high dispersion and low correlation of the variables studied, the trained model allows the values of compressive strength to be predicted optimally, with a significant improvement with respect to existing works. Consequently, this work proposes, for the first time, a reliable and easily applicable approach for estimation of the compressive strength of jet grouting columns.
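A minimal sketch of an extremely-randomized-trees regressor for column UCS follows; the feature names, units, and synthetic records are assumptions, not the 854-result database.

```python
# Sketch: an extremely-randomized-trees regressor relating soil type and jet grouting
# parameters to column UCS. Features, units, and data are illustrative placeholders.
import numpy as np
import pandas as pd
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(11)
n = 500
df = pd.DataFrame({
    "soil_type": rng.integers(0, 3, n),           # 0=clay, 1=silt, 2=sand (encoded), placeholder
    "cement_content": rng.uniform(150, 500, n),   # kg/m^3, placeholder
    "injection_pressure": rng.uniform(20, 45, n), # MPa, placeholder
    "withdrawal_rate": rng.uniform(0.1, 0.6, n),  # m/min, placeholder
})
ucs = (1.5 + 0.01 * df["cement_content"] + 0.8 * df["soil_type"]
       - 2.0 * df["withdrawal_rate"] + rng.normal(0, 0.8, n))      # MPa, synthetic

model = ExtraTreesRegressor(n_estimators=400, random_state=0)
scores = cross_val_score(model, df, ucs, cv=5, scoring="r2")
print("mean CV R^2:", round(scores.mean(), 2))
```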
A large number of network security breaches in IoT networks have demonstrated the unreliability of current Network Intrusion Detection Systems (NIDSs). Consequently, network interruptions and loss of sensitive data have occurred, which has led to an active research area for improving NIDS technologies. In an analysis of related works, it was observed that most researchers aim to obtain better classification results by using a set of untried combinations of Feature Reduction (FR) and Machine Learning (ML) techniques on NIDS datasets. However, these datasets differ in feature sets, attack types, and network design. Therefore, this paper aims to discover whether these techniques can be generalised across various datasets. Six ML models are utilised: a Deep Feed Forward (DFF), Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), Decision Tree (DT), Logistic Regression (LR), and Naive Bayes (NB). The accuracy obtained with three Feature Extraction (FE) algorithms, Principal Component Analysis (PCA), Auto-encoder (AE), and Linear Discriminant Analysis (LDA), is evaluated using three benchmark datasets: UNSW-NB15, ToN-IoT and CSE-CIC-IDS2018. Although PCA and AE algorithms have been widely used, the determination of their optimal number of extracted dimensions has been overlooked. The results indicate that no clear FE method or ML model can achieve the best scores for all datasets. The optimal number of extracted dimensions has been identified for each dataset, and LDA degrades the performance of the ML models on two datasets. The variance is used to analyse the extracted dimensions of LDA and PCA. Finally, this paper concludes that the choice of datasets significantly alters the performance of the applied techniques. We believe that a universal (benchmark) feature set is needed to facilitate further advancement and progress of research in this field.
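To illustrate how the optimal number of extracted dimensions can be probed, the sketch below sweeps PCA component counts in front of two of the listed classifiers on synthetic data; the dataset, dimension choices, and feature counts are placeholders, not the benchmark NIDS datasets.

```python
# Sketch: sweeping the number of PCA components ahead of two of the classifiers named
# above (DT and LR). Synthetic flow-like features stand in for the NIDS datasets.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=3000, n_features=40, n_informative=12,
                           n_classes=2, random_state=0)

for n_comp in (5, 10, 20):
    for name, clf in (("DT", DecisionTreeClassifier(random_state=0)),
                      ("LR", LogisticRegression(max_iter=2000))):
        pipe = make_pipeline(StandardScaler(), PCA(n_components=n_comp), clf)
        acc = cross_val_score(pipe, X, y, cv=3, scoring="accuracy").mean()
        print(f"{name}, {n_comp} PCA dims: accuracy={acc:.3f}")
```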
基金Supported by Science and Technology Support Program of Qiandongnan Prefecture,No.Qiandongnan Sci-Tech Support[2021]12Guizhou Province High-Level Innovative Talent Training Program,No.Qiannan Thousand Talents[2022]201701.
文摘BACKGROUND Intensive care unit-acquired weakness(ICU-AW)is a common complication that significantly impacts the patient's recovery process,even leading to adverse outcomes.Currently,there is a lack of effective preventive measures.AIM To identify significant risk factors for ICU-AW through iterative machine learning techniques and offer recommendations for its prevention and treatment.METHODS Patients were categorized into ICU-AW and non-ICU-AW groups on the 14th day post-ICU admission.Relevant data from the initial 14 d of ICU stay,such as age,comorbidities,sedative dosage,vasopressor dosage,duration of mechanical ventilation,length of ICU stay,and rehabilitation therapy,were gathered.The relationships between these variables and ICU-AW were examined.Utilizing iterative machine learning techniques,a multilayer perceptron neural network model was developed,and its predictive performance for ICU-AW was assessed using the receiver operating characteristic curve.RESULTS Within the ICU-AW group,age,duration of mechanical ventilation,lorazepam dosage,adrenaline dosage,and length of ICU stay were significantly higher than in the non-ICU-AW group.Additionally,sepsis,multiple organ dysfunction syndrome,hypoalbuminemia,acute heart failure,respiratory failure,acute kidney injury,anemia,stress-related gastrointestinal bleeding,shock,hypertension,coronary artery disease,malignant tumors,and rehabilitation therapy ratios were significantly higher in the ICU-AW group,demonstrating statistical significance.The most influential factors contributing to ICU-AW were identified as the length of ICU stay(100.0%)and the duration of mechanical ventilation(54.9%).The neural network model predicted ICU-AW with an area under the curve of 0.941,sensitivity of 92.2%,and specificity of 82.7%.CONCLUSION The main factors influencing ICU-AW are the length of ICU stay and the duration of mechanical ventilation.A primary preventive strategy,when feasible,involves minimizing both ICU stay and mechanical ventilation duration.
文摘Machine learning(ML) is well suited for the prediction of high-complexity,high-dimensional problems such as those encountered in terminal ballistics.We evaluate the performance of four popular ML-based regression models,extreme gradient boosting(XGBoost),artificial neural network(ANN),support vector regression(SVR),and Gaussian process regression(GP),on two common terminal ballistics’ problems:(a)predicting the V50ballistic limit of monolithic metallic armour impacted by small and medium calibre projectiles and fragments,and(b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness.To achieve this we utilise two datasets,each consisting of approximately 1000samples,collated from public release sources.We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data and diverge when extrapolating outside this range.Although extrapolation is not advisable for ML-based regression models,for applications such as lethality/survivability analysis,such capability is required.To circumvent this,we implement expert knowledge and physics-based models via enforced monotonicity,as a Gaussian prior mean,and through a modified loss function.The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models,providing an ability to accurately fit experimental data when it is available and then revert to the physics-based model when not.The resulting models demonstrate high levels of predictive accuracy over a very wide range of projectile types,target materials and thicknesses,and impact conditions significantly more diverse than that achievable from any existing analytical approach.Compared with numerical analysis tools such as finite element solvers the ML models run orders of magnitude faster.We provide some general guidelines throughout for the development,application,and reporting of ML models in terminal ballistics problems.
基金financially supported by the National Key Research and Development Program of China(No.2016YFB0701202,No.2017YFB0701500 and No.2020YFB1505901)National Natural Science Foundation of China(General Program No.51474149,52072240)+3 种基金Shanghai Science and Technology Committee(No.18511109300)Science and Technology Commission of the CMC(2019JCJQZD27300)financial support from the University of Michigan and Shanghai Jiao Tong University joint funding,China(AE604401)Science and Technology Commission of Shanghai Municipality(No.18511109302).
文摘Magnesium(Mg)alloys have shown great prospects as both structural and biomedical materials,while poor corrosion resistance limits their further application.In this work,to avoid the time-consuming and laborious experiment trial,a high-throughput computational strategy based on first-principles calculations is designed for screening corrosion-resistant binary Mg alloy with intermetallics,from both the thermodynamic and kinetic perspectives.The stable binary Mg intermetallics with low equilibrium potential difference with respect to the Mg matrix are firstly identified.Then,the hydrogen adsorption energies on the surfaces of these Mg intermetallics are calculated,and the corrosion exchange current density is further calculated by a hydrogen evolution reaction(HER)kinetic model.Several intermetallics,e.g.Y_(3)Mg,Y_(2)Mg and La_(5)Mg,are identified to be promising intermetallics which might effectively hinder the cathodic HER.Furthermore,machine learning(ML)models are developed to predict Mg intermetallics with proper hydrogen adsorption energy employing work function(W_(f))and weighted first ionization energy(WFIE).The generalization of the ML models is tested on five new binary Mg intermetallics with the average root mean square error(RMSE)of 0.11 eV.This study not only predicts some promising binary Mg intermetallics which may suppress the galvanic corrosion,but also provides a high-throughput screening strategy and ML models for the design of corrosion-resistant alloy,which can be extended to ternary Mg alloys or other alloy systems.
基金supported by National Natural Science Foundation of China(12372049)Science and Technology Program of China National Accreditation Service for Confor-mity Assessment(2022CNAS15)+1 种基金Sichuan Science and Technology Program(2023JDRC0062)Independent Project of State Key Laboratory of Rail Transit Vehicle System(2023TPL-T06).
文摘Reducing the aerodynamic drag and noise levels of high-speed pantographs is important for promoting environmentally friendly,energy efficient and rapid advances in train technology.Using computational fluid dynamics theory and the K-FWH acoustic equation,a numerical simulation is conducted to investigate the aerodynamic characteristics of high-speed pantographs.A component optimization method is proposed as a possible solution to the problemof aerodynamic drag and noise in high-speed pantographs.The results of the study indicate that the panhead,base and insulator are the main contributors to aerodynamic drag and noise in high-speed pantographs.Therefore,a gradual optimization process is implemented to improve the most significant components that cause aerodynamic drag and noise.By optimizing the cross-sectional shape of the strips and insulators,the drag and noise caused by airflow separation and vortex shedding can be reduced.The aerodynamic drag of insulator with circular cross section and strips with rectangular cross section is the largest.Ellipsifying insulators and optimizing the chamfer angle and height of the windward surface of the strips can improve the aerodynamic performance of the pantograph.In addition,the streamlined fairing attached to the base can eliminate the complex flow and shield the radiated noise.In contrast to the original pantograph design,the improved pantograph shows a 21.1%reduction in aerodynamic drag and a 1.65 dBA reduction in aerodynamic noise.
基金the National Key R&D Program of China(No.2021YFB3701705).
文摘This work constructed a machine learning(ML)model to predict the atmospheric corrosion rate of low-alloy steels(LAS).The material properties of LAS,environmental factors,and exposure time were used as the input,while the corrosion rate as the output.6 dif-ferent ML algorithms were used to construct the proposed model.Through optimization and filtering,the eXtreme gradient boosting(XG-Boost)model exhibited good corrosion rate prediction accuracy.The features of material properties were then transformed into atomic and physical features using the proposed property transformation approach,and the dominant descriptors that affected the corrosion rate were filtered using the recursive feature elimination(RFE)as well as XGBoost methods.The established ML models exhibited better predic-tion performance and generalization ability via property transformation descriptors.In addition,the SHapley additive exPlanations(SHAP)method was applied to analyze the relationship between the descriptors and corrosion rate.The results showed that the property transformation model could effectively help with analyzing the corrosion behavior,thereby significantly improving the generalization ability of corrosion rate prediction models.
基金the National Natural Science Foundation of China(Grant No.12102050)the Open Fund of State Key Laboratory of Explosion Science and Technology(Grant No.SKLEST-ZZ-21-18).
文摘The majority of the projectiles used in the hypersonic penetration study are solid flat-nosed cylindrical projectiles with a diameter of less than 20 mm.This study aims to fill the gap in the experimental and analytical study of the evolution of the nose shape of larger hollow projectiles under hypersonic penetration.In the hypersonic penetration test,eight ogive-nose AerMet100 steel projectiles with a diameter of 40 mm were launched to hit concrete targets with impact velocities that ranged from 1351 to 1877 m/s.Severe erosion of the projectiles was observed during high-speed penetration of heterogeneous targets,and apparent localized mushrooming occurred in the front nose of recovered projectiles.By examining the damage to projectiles,a linear relationship was found between the relative length reduction rate and the initial kinetic energy of projectiles in different penetration tests.Furthermore,microscopic analysis revealed the forming mechanism of the localized mushrooming phenomenon for eroding penetration,i.e.,material spall erosion abrasion mechanism,material flow and redistribution abrasion mechanism and localized radial upsetting deformation mechanism.Finally,a model of highspeed penetration that included erosion was established on the basis of a model of the evolution of the projectile nose that considers radial upsetting;the model was validated by test data from the literature and the present study.Depending upon the impact velocity,v0,the projectile nose may behave as undistorted,radially distorted or hemispherical.Due to the effects of abrasion of the projectile and enhancement of radial upsetting on the duration and amplitude of the secondary rising segment in the pulse shape of projectile deceleration,the predicted DOP had an upper limit.
基金The authors'team acknowledges the continuous and invaluable support from the Natural Science Foundation of China under the grants of 51577165,51690182,51377140,and 51077116.
文摘In this paper,application examples of high-speed electrical machines are presented,and the machine structures are categorized.Key issues of design and control for the high-speed permanent magnet machines are reviewed,including bearings selection,rotor dynamics analysis and design,rotor stress analysis and protection,thermal analysis and design,electromagnetic losses analysis and reduction,sensorless control strategies,as well as comparison and selection of sine-wave and square-wave drive modes.Some challenges are also discussed,so that future studies could be focused.
基金the National Key Research and Development Program of China under Grant 2021YFB3301300the National Natural Science Foundation of China under Grant 62203213+1 种基金the Natural Science Foundation of Jiangsu Province under Grant BK20220332the Open Project Program of Fujian Provincial Key Laboratory of Intelligent Identification and Control of Complex Dynamic System under Grant 2022A0004.
文摘The reliable operation of high-speed wire rod finishing mills is crucial in the steel production enterprise.As complex system-level equipment,it is difficult for high-speed wire rod finishing mills to realize fault location and real-time monitoring.To solve the above problems,an expert experience and data-driven-based hybrid fault diagnosis method for high-speed wire rod finishing mills is proposed in this paper.First,based on its mechanical structure,time and frequency domain analysis are improved in fault feature extraction.The approach of combining virtual value,peak value with kurtosis value index,is adopted in time domain analysis.Speed adjustment and side frequency analysis are proposed in frequency domain analysis to obtain accurate component characteristic frequency and its corresponding sideband.Then,according to time and frequency domain characteristics,fault location based on expert experience is proposed to get an accurate fault result.Finally,the proposed method is implemented in the equipment intelligent diagnosis system.By taking an equipment fault on site,for example,the effectiveness of the proposed method is illustrated in the system.
基金the financial support from the National Natural Science Foundation of China(22278070,21978047,21776046)。
文摘The high throughput prediction of the thermodynamic phase behavior of active pharmaceutical ingredients(APIs)with pharmaceutically relevant excipients remains a major scientific challenge in the screening of pharmaceutical formulations.In this work,a developed machine-learning model efficiently predicts the solubility of APIs in polymers by learning the phase equilibrium principle and using a few molecular descriptors.Under the few-shot learning framework,thermodynamic theory(perturbed-chain statistical associating fluid theory)was used for data augmentation,and computational chemistry was applied for molecular descriptors'screening.The results showed that the developed machine-learning model can predict the API-polymer phase diagram accurately,broaden the solubility data of APIs in polymers,and reproduce the relationship between API solubility and the interaction mechanisms between API and polymer successfully,which provided efficient guidance for the development of pharmaceutical formulations.
基金EP-A and JMT-R acknowledges financial support from the project PID2021-128062NB-I00 funded by MCIN/AEI/10.13039/501100011033The lunar samples studied here were acquired in the framework of grant PGC2018-097374-B-I00(P.I.JMT-R)+3 种基金This project has received funding from the European Research Council(ERC)under the European Union’s Horizon 2020 research and innovation programme(No.865657)for the project“Quantum Chemistry on Interstellar Grains”(QUANTUMGRAIN),AR acknowledges financial support from the FEDER/Ministerio de Ciencia e Innovación-Agencia Estatal de Investigación(No.PID2021-126427NB-I00)Partial financial support from the Spanish Government(No.PID2020-116844RB-C21)the Generalitat de Catalunya(No.2021-SGR-00651)is acknowledgedThis work was supported by the LUMIO project funded by the Agenzia Spaziale Italiana(No.2024-6-HH.0).
文摘Amid the scarcity of lunar meteorites and the imperative to preserve their scientific value,nondestructive testing methods are essential.This translates into the application of microscale rock mechanics experiments and scanning electron microscopy for surface composition analysis.This study explores the application of Machine Learning algorithms in predicting the mineralogical and mechanical properties of DHOFAR 1084,JAH 838,and NWA 11444 lunar meteorites based solely on their atomic percentage compositions.Leveraging a prior-data fitted network model,we achieved near-perfect classification scores for meteorites,mineral groups,and individual minerals.The regressor models,notably the KNeighbor model,provided an outstanding estimate of the mechanical properties—previously measured by nanoindentation tests—such as hardness,reduced Young’s modulus,and elastic recovery.Further considerations on the nature and physical properties of the minerals forming these meteorites,including porosity,crystal orientation,or shock degree,are essential for refining predictions.Our findings underscore the potential of Machine Learning in enhancing mineral identification and mechanical property estimation in lunar exploration,which pave the way for new advancements and quick assessments in extraterrestrial mineral mining,processing,and research.
基金supported by the National Natural Science Foundation of China(62172033).
文摘In recent years,the global surge of High-speed Railway(HSR)revolutionized ground transportation,providing secure,comfortable,and punctual services.The next-gen HSR,fueled by emerging services like video surveillance,emergency communication,and real-time scheduling,demands advanced capabilities in real-time perception,automated driving,and digitized services,which accelerate the integration and application of Artificial Intelligence(AI)in the HSR system.This paper first provides a brief overview of AI,covering its origin,evolution,and breakthrough applications.A comprehensive review is then given regarding the most advanced AI technologies and applications in three macro application domains of the HSR system:mechanical manufacturing and electrical control,communication and signal control,and transportation management.The literature is categorized and compared across nine application directions labeled as intelligent manufacturing of trains and key components,forecast of railroad maintenance,optimization of energy consumption in railroads and trains,communication security,communication dependability,channel modeling and estimation,passenger scheduling,traffic flow forecasting,high-speed railway smart platform.Finally,challenges associated with the application of AI are discussed,offering insights for future research directions.
文摘Stroke is a leading cause of disability and mortality worldwide,necessitating the development of advanced technologies to improve its diagnosis,treatment,and patient outcomes.In recent years,machine learning techniques have emerged as promising tools in stroke medicine,enabling efficient analysis of large-scale datasets and facilitating personalized and precision medicine approaches.This abstract provides a comprehensive overview of machine learning’s applications,challenges,and future directions in stroke medicine.Recently introduced machine learning algorithms have been extensively employed in all the fields of stroke medicine.Machine learning models have demonstrated remarkable accuracy in imaging analysis,diagnosing stroke subtypes,risk stratifications,guiding medical treatment,and predicting patient prognosis.Despite the tremendous potential of machine learning in stroke medicine,several challenges must be addressed.These include the need for standardized and interoperable data collection,robust model validation and generalization,and the ethical considerations surrounding privacy and bias.In addition,integrating machine learning models into clinical workflows and establishing regulatory frameworks are critical for ensuring their widespread adoption and impact in routine stroke care.Machine learning promises to revolutionize stroke medicine by enabling precise diagnosis,tailored treatment selection,and improved prognostication.Continued research and collaboration among clinicians,researchers,and technologists are essential for overcoming challenges and realizing the full potential of machine learning in stroke care,ultimately leading to enhanced patient outcomes and quality of life.This review aims to summarize all the current implications of machine learning in stroke diagnosis,treatment,and prognostic evaluation.At the same time,another purpose of this paper is to explore all the future perspectives these techniques can provide in combating this disabling disease.
文摘The high rate of early recurrence in hepatocellular carcinoma(HCC)post curative surgical intervention poses a substantial clinical hurdle,impacting patient outcomes and complicating postoperative management.The advent of machine learning provides a unique opportunity to harness vast datasets,identifying subtle patterns and factors that elude conventional prognostic methods.Machine learning models,equipped with the ability to analyse intricate relationships within datasets,have shown promise in predicting outcomes in various medical disciplines.In the context of HCC,the application of machine learning to predict early recurrence holds potential for personalized postoperative care strategies.This editorial comments on the study carried out exploring the merits and efficacy of random survival forests(RSF)in identifying significant risk factors for recurrence,stratifying patients at low and high risk of HCC recurrence and comparing this to traditional COX proportional hazard models(CPH).In doing so,the study demonstrated that the RSF models are superior to traditional CPH models in predicting recurrence of HCC and represent a giant leap towards precision medicine.
基金financially supported by the National Natural Science Foundation of China(No.51974028)。
文摘The martensitic transformation temperature is the basis for the application of shape memory alloys(SMAs),and the ability to quickly and accurately predict the transformation temperature of SMAs has very important practical significance.In this work,machine learning(ML)methods were utilized to accelerate the search for shape memory alloys with targeted properties(phase transition temperature).A group of component data was selected to design shape memory alloys using reverse design method from numerous unexplored data.Component modeling and feature modeling were used to predict the phase transition temperature of the shape memory alloys.The experimental results of the shape memory alloys were obtained to verify the effectiveness of the support vector regression(SVR)model.The results show that the machine learning model can obtain target materials more efficiently and pertinently,and realize the accurate and rapid design of shape memory alloys with specific target phase transition temperature.On this basis,the relationship between phase transition temperature and material descriptors is analyzed,and it is proved that the key factors affecting the phase transition temperature of shape memory alloys are based on the strength of the bond energy between atoms.This work provides new ideas for the controllable design and performance optimization of Cu-based shape memory alloys.
基金supported by the U.S.Department of Energy,Office of Science,Office of Biological and Environmental Research program as part of the Regional and Global Model Analysis and Multi-Sector Dynamics program areas(Award Number DE-SC0016605)Argonne National Laboratory is operated for the DOE by UChicago Argonne,LLC,under contract DE-AC02-06CH11357+1 种基金the National Energy Research Scientific Computing Center(NERSC)NERSC is a U.S.DOE Office of Science User Facility operated under Contract DE-AC02-05CH11231.
文摘Fires,including wildfires,harm air quality and essential public services like transportation,communication,and utilities.These fires can also influence atmospheric conditions,including temperature and aerosols,potentially affecting severe convective storms.Here,we investigate the remote impacts of fires in the western United States(WUS)on the occurrence of large hail(size:≥2.54 cm)in the central US(CUS)over the 20-year period of 2001–20 using the machine learning(ML),Random Forest(RF),and Extreme Gradient Boosting(XGB)methods.The developed RF and XGB models demonstrate high accuracy(>90%)and F1 scores of up to 0.78 in predicting large hail occurrences when WUS fires and CUS hailstorms coincide,particularly in four states(Wyoming,South Dakota,Nebraska,and Kansas).The key contributing variables identified from both ML models include the meteorological variables in the fire region(temperature and moisture),the westerly wind over the plume transport path,and the fire features(i.e.,the maximum fire power and burned area).The results confirm a linkage between WUS fires and severe weather in the CUS,corroborating the findings of our previous modeling study conducted on case simulations with a detailed physics model.
Funding: This work was supported by the National Key R&D Program of China (No. 2022ZD0117501), the Singapore RIE2020 Advanced Manufacturing and Engineering Programmatic Grant from the Agency for Science, Technology and Research (A*STAR) under grant no. A1898b0043, the Tsinghua University Initiative Scientific Research Program, and the Low Carbon Energy Research Funding Initiative by A*STAR under grant number A-8000182-00-00.
Abstract: Membrane technologies are becoming increasingly versatile and helpful for sustainable development. Machine learning (ML), an essential branch of artificial intelligence (AI), has substantially shaped how new materials for energy and the environment are researched and developed. This review provides an overview of and perspectives on ML methodologies and their applications in membrane design and discovery. A brief overview of membrane technologies is first provided, together with current bottlenecks and potential solutions. Taking an applications-based perspective on AI-aided membrane design and discovery, we then show how ML strategies are applied across the membrane discovery cycle (including membrane material design, membrane application, membrane process design, and knowledge extraction) in various membrane systems, spanning gas separation, liquid separation, and fuel cell membranes. Furthermore, best practices for integrating ML methods with specific application targets in membrane design and discovery are presented, and an ideal paradigm is proposed. The challenges to be addressed and the prospects of AI applications in membrane discovery are highlighted at the end.
Funding: Supported by a Korea Institute of Energy Technology Evaluation and Planning (KETEP) grant funded by the Korea government (Grant No. 20214000000140, Graduate School of Convergence for Clean Energy Integrated Power Generation), a Korea Basic Science Institute (National Research Facilities and Equipment Center) grant funded by the Ministry of Education (2021R1A6C101A449), and a National Research Foundation of Korea grant funded by the Ministry of Science and ICT (2021R1A2C1095139), Republic of Korea.
Abstract: Mg alloys possess an inherent plastic anisotropy owing to the selective activation of deformation mechanisms depending on the loading condition. This characteristic results in a diverse range of flow curves that vary with the deformation condition. This study proposes a novel approach for accurately predicting the anisotropic deformation behavior of wrought Mg alloys using machine learning (ML) with data augmentation. The developed model combines four key strategies from data science: learning the entire flow curves, generative adversarial networks (GAN), algorithm-driven hyperparameter tuning, and a gated recurrent unit (GRU) architecture. The proposed model, termed GAN-aided GRU, was extensively evaluated in various predictive scenarios, such as interpolation, extrapolation, and limited dataset sizes. The model exhibited significant predictability and improved generalizability in estimating the anisotropic compressive behavior of ZK60 Mg alloys under 11 annealing conditions and three loading directions. The GAN-aided GRU results were superior to those of previous ML models and constitutive equations. The superior performance was attributed to hyperparameter optimization, GAN-based data augmentation, and the inherent suitability of the GRU for extrapolation. As a first attempt to employ ML techniques other than conventional artificial neural networks, this study offers a novel perspective on predicting the anisotropic deformation behaviors of wrought Mg alloys.
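The sketch below shows only the sequence-model component, assuming PyTorch is available: a GRU that maps a strain history plus condition features (such as annealing state and loading direction) to a flow-stress curve. The GAN-based augmentation and hyperparameter tuning described in the abstract are omitted, and all tensor shapes are illustrative.

```python
# Rough sketch of a flow-curve GRU; not the paper's GAN-aided architecture.
import torch
import torch.nn as nn

class FlowCurveGRU(nn.Module):
    def __init__(self, n_condition_features: int, hidden_size: int = 64):
        super().__init__()
        # at each time step the input is [strain, condition features]
        self.gru = nn.GRU(input_size=1 + n_condition_features,
                          hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)   # predicted stress at each step

    def forward(self, strain, conditions):
        # strain: (batch, steps, 1); conditions: (batch, n_condition_features)
        cond = conditions.unsqueeze(1).expand(-1, strain.size(1), -1)
        out, _ = self.gru(torch.cat([strain, cond], dim=-1))
        return self.head(out)

model = FlowCurveGRU(n_condition_features=4)
stress = model(torch.rand(8, 100, 1), torch.rand(8, 4))   # synthetic batch
print(stress.shape)  # torch.Size([8, 100, 1])
```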
Funding: Australian Research Council, Grant/Award Numbers: FT210100624, DP190101985, DE230101033.
Abstract: As recent information security legislation has endowed users with an unconditional right to be forgotten by any trained machine learning model, personalised IoT service providers have to take unlearning functionality into consideration. The most straightforward way to unlearn a user's contribution is to retrain the model from its initial state, which is not realistic in high-throughput applications with frequent unlearning requests. Although some machine unlearning frameworks have been proposed to speed up the retraining process, they do not fit decentralised learning scenarios. A decentralised unlearning framework called the heterogeneous decentralised unlearning framework with seed (HDUS) is designed, which uses distilled seed models to construct erasable ensembles for all clients. Moreover, the framework is compatible with heterogeneous on-device models, offering stronger scalability in real-world applications. Extensive experiments on three real-world datasets show that HDUS achieves state-of-the-art performance.
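The following is a conceptual sketch of the erasable-ensemble idea only: each client contributes a small seed model, predictions are averaged across clients, and unlearning a client simply removes its seed model without retraining. The class name, the use of logistic regression as a stand-in for distilled on-device models, and the data are all hypothetical; the actual HDUS distillation and decentralised communication protocol are not reproduced here.

```python
# Conceptual abstraction of an "erasable ensemble" of per-client seed models.
import numpy as np
from sklearn.linear_model import LogisticRegression

class ErasableEnsemble:
    def __init__(self):
        self.seed_models = {}                      # client_id -> seed model

    def add_client(self, client_id, X, y):
        # stand-in for distilling the client's heterogeneous on-device model
        self.seed_models[client_id] = LogisticRegression(max_iter=500).fit(X, y)

    def unlearn(self, client_id):
        # honour a "right to be forgotten" request without any retraining
        self.seed_models.pop(client_id, None)

    def predict(self, X):
        votes = np.mean([m.predict_proba(X)[:, 1] for m in self.seed_models.values()], axis=0)
        return (votes >= 0.5).astype(int)

rng = np.random.default_rng(3)
ens = ErasableEnsemble()
for cid in range(5):
    Xc = rng.normal(size=(200, 10)); yc = (Xc[:, 0] > 0).astype(int)
    ens.add_client(cid, Xc, yc)
ens.unlearn(2)                                     # client 2's contribution is removed instantly
print(ens.predict(rng.normal(size=(4, 10))))
```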
Funding: This work has been supported by the Conselleria de Innovación, Universidades, Ciencia y Sociedad Digital de la Generalitat Valenciana (CIAICO/2021/335).
Abstract: Jet grouting is one of the most popular soil improvement techniques, but its design usually involves great uncertainties that can lead to cost overruns in construction projects. The high dispersion in the properties of the improved material leads designers to assume a conservative, arbitrary and unjustified strength, which is sometimes made contingent on the results of field trials. This paper presents an approach for predicting the uniaxial compressive strength (UCS) of jet grouting columns based on the analysis of several machine learning algorithms applied to a database of 854 results collected mainly from published research. The selected machine learning model (extremely randomized trees) relates the soil type and various parameters of the technique to the compressive strength. Despite the complex mechanisms involved in the jet grouting process, evidenced by the high dispersion and low correlation of the variables studied, the trained model predicts compressive strength with a significant improvement over existing work. Consequently, this work proposes for the first time a reliable and easily applicable approach for estimating the compressive strength of jet grouting columns.
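A minimal sketch of the modelling step, assuming scikit-learn, is shown below: an extremely randomized trees regressor maps soil type and jet grouting parameters to column UCS and is scored by cross-validation. The feature names, units, and synthetic data are placeholders and do not reproduce the 854-record database.

```python
# Minimal sketch (not the authors' pipeline): extremely randomized trees for UCS.
import numpy as np
import pandas as pd
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(4)
n = 300
data = pd.DataFrame({
    "soil_type": rng.integers(0, 3, n),         # e.g. 0 = clay, 1 = silt, 2 = sand (assumed coding)
    "cement_content": rng.uniform(150, 450, n), # kg/m3 (illustrative range)
    "injection_pressure": rng.uniform(20, 45, n),
    "withdrawal_rate": rng.uniform(0.2, 0.8, n),
})
ucs = 2 + 0.01 * data["cement_content"] + rng.normal(0, 1.5, n)   # synthetic UCS (MPa)

model = ExtraTreesRegressor(n_estimators=500, random_state=0)
print(cross_val_score(model, data, ucs, cv=5, scoring="r2").mean())
```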
Abstract: A large number of network security breaches in IoT networks have demonstrated the unreliability of current Network Intrusion Detection Systems (NIDSs). The resulting network interruptions and loss of sensitive data have made improving NIDS technologies an active research area. An analysis of related work shows that most researchers aim to obtain better classification results by applying previously untried combinations of Feature Reduction (FR) and Machine Learning (ML) techniques to NIDS datasets. However, these datasets differ in feature sets, attack types, and network design. This paper therefore aims to discover whether these techniques generalise across datasets. Six ML models are utilised: a Deep Feed Forward (DFF) network, Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), Decision Tree (DT), Logistic Regression (LR), and Naive Bayes (NB). Three Feature Extraction (FE) algorithms, Principal Component Analysis (PCA), Auto-encoder (AE), and Linear Discriminant Analysis (LDA), are evaluated on three benchmark datasets: UNSW-NB15, ToN-IoT and CSE-CIC-IDS2018. Although PCA and AE have been widely used, the determination of their optimal number of extracted dimensions has been overlooked. The results indicate that no single FE method or ML model achieves the best scores across all datasets. The optimal number of extracted dimensions is identified for each dataset, and LDA degrades the performance of the ML models on two datasets. The variance is used to analyse the extracted dimensions of LDA and PCA. Finally, this paper concludes that the choice of dataset significantly alters the performance of the applied techniques, and we argue that a universal (benchmark) feature set is needed to facilitate further advancement of research in this field. A sketch of the kind of evaluation loop involved is given below.
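The sketch assumes scikit-learn: the number of PCA-extracted dimensions is swept in front of a simple classifier, with LDA included for comparison (in a binary problem LDA can extract at most one dimension, one reason its behaviour differs from PCA and AE). A synthetic dataset stands in for the three NIDS benchmarks, and only one of the six model types is shown.

```python
# Sketch of the FE-plus-classifier evaluation loop on a synthetic stand-in dataset.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=3000, n_features=40, n_informative=12,
                           n_classes=2, random_state=0)   # stand-in for a NIDS dataset

# sweep the number of PCA-extracted dimensions in front of one classifier
for n_dims in (2, 5, 10, 20):
    pipe = make_pipeline(StandardScaler(), PCA(n_components=n_dims),
                         LogisticRegression(max_iter=1000))
    print("PCA", n_dims, round(cross_val_score(pipe, X, y, cv=3).mean(), 3))

# LDA is limited to n_classes - 1 extracted dimensions (1 here)
lda_pipe = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis(),
                         LogisticRegression(max_iter=1000))
print("LDA", 1, round(cross_val_score(lda_pipe, X, y, cv=3).mean(), 3))
```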