Determination of the shear bond strength (SBS) at the interlayer of double-layer asphalt concrete is crucial in flexible pavement structures. This study used three machine learning (ML) models, K-Nearest Neighbors (KNN), Extra Trees (ET), and Light Gradient Boosting Machine (LGBM), to predict SBS from easily determinable input parameters. The Grid Search technique was employed for hyper-parameter tuning of the ML models, and cross-validation and learning-curve analysis were used to train them. The models were built on a database of 240 experimental results with three input variables: temperature, normal pressure, and tack coat rate. Model validation was performed using three statistical criteria: the coefficient of determination (R2), the root mean square error (RMSE), and the mean absolute error (MAE). Additionally, SHAP (SHapley Additive exPlanations) analysis was used to assess the importance of the input variables in the prediction of SBS. Results show that these models predict SBS accurately, with LGBM providing outstanding performance. SHAP analysis for LGBM indicates that temperature is the most influential factor on SBS. Consequently, the proposed ML models can quickly and accurately predict SBS between two layers of asphalt concrete, serving practical applications in flexible pavement structure design.
Funding: University of Transport Technology (grant DTTD2022-12).
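As a concrete illustration of the pipeline this abstract describes, the hedged sketch below grid-searches a LightGBM regressor on the three inputs and ranks their influence with SHAP. The data, value ranges, and parameter grid are synthetic stand-ins; the paper's 240-sample experimental database is not public.

```python
# Hedged sketch: grid-searched LightGBM regressor for SBS with SHAP ranking.
# All data below are synthetic stand-ins, not the paper's measurements.
import numpy as np
import shap
from lightgbm import LGBMRegressor
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)
n = 240
X = np.column_stack([
    rng.uniform(10, 60, n),    # temperature (assumed range, deg C)
    rng.uniform(0.1, 0.7, n),  # normal pressure (assumed range, MPa)
    rng.uniform(0.0, 1.0, n),  # tack coat rate (assumed range, kg/m^2)
])
# Placeholder response: SBS falling with temperature, rising with pressure.
y = 2.0 - 0.03 * X[:, 0] + 1.5 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0, 0.1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
grid = GridSearchCV(
    LGBMRegressor(random_state=0),
    param_grid={"n_estimators": [100, 300], "max_depth": [3, 5],
                "learning_rate": [0.05, 0.1]},
    cv=5, scoring="neg_root_mean_squared_error",
).fit(X_tr, y_tr)

pred = grid.predict(X_te)
print("R2  :", r2_score(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
print("MAE :", mean_absolute_error(y_te, pred))

# Mean |SHAP| per input ranks feature influence (temperature dominates here
# by construction of the placeholder response).
sv = shap.TreeExplainer(grid.best_estimator_).shap_values(X_te)
print("mean |SHAP|:", np.abs(sv).mean(axis=0))
```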
Diabetic retinopathy (DR) remains a leading cause of vision impairment and blindness among individuals with diabetes, necessitating innovative approaches to screening and management. This editorial explores the transformative potential of artificial intelligence (AI) and machine learning (ML) in revolutionizing DR care. AI and ML technologies have demonstrated remarkable advancements in enhancing the accuracy, efficiency, and accessibility of DR screening, helping to overcome barriers to early detection. These technologies leverage vast datasets to identify patterns and predict disease progression with unprecedented precision, enabling clinicians to make more informed decisions. Furthermore, AI-driven solutions hold promise in personalizing management strategies for DR, incorporating predictive analytics to tailor interventions and optimize treatment pathways. By automating routine tasks, AI can reduce the burden on healthcare providers, allowing for a more focused allocation of resources towards complex patient care. This review aims to evaluate the current advancements and applications of AI and ML in DR screening, and to discuss the potential of these technologies in developing personalized management strategies, ultimately aiming to improve patient outcomes and reduce the global burden of DR. The integration of AI and ML in DR care represents a paradigm shift, offering a glimpse into the future of ophthalmic healthcare.
Stroke is a leading cause of disability and mortality worldwide, necessitating the development of advanced technologies to improve its diagnosis, treatment, and patient outcomes. In recent years, machine learning techniques have emerged as promising tools in stroke medicine, enabling efficient analysis of large-scale datasets and facilitating personalized and precision medicine approaches. This abstract provides a comprehensive overview of machine learning's applications, challenges, and future directions in stroke medicine. Recently introduced machine learning algorithms have been extensively employed in all fields of stroke medicine. Machine learning models have demonstrated remarkable accuracy in imaging analysis, diagnosing stroke subtypes, risk stratification, guiding medical treatment, and predicting patient prognosis. Despite the tremendous potential of machine learning in stroke medicine, several challenges must be addressed. These include the need for standardized and interoperable data collection, robust model validation and generalization, and the ethical considerations surrounding privacy and bias. In addition, integrating machine learning models into clinical workflows and establishing regulatory frameworks are critical for ensuring their widespread adoption and impact in routine stroke care. Machine learning promises to revolutionize stroke medicine by enabling precise diagnosis, tailored treatment selection, and improved prognostication. Continued research and collaboration among clinicians, researchers, and technologists are essential for overcoming challenges and realizing the full potential of machine learning in stroke care, ultimately leading to enhanced patient outcomes and quality of life. This review aims to summarize the current implications of machine learning in stroke diagnosis, treatment, and prognostic evaluation, and to explore the future perspectives these techniques can provide in combating this disabling disease.
The martensitic transformation temperature is the basis for the application of shape memory alloys (SMAs), and the ability to quickly and accurately predict the transformation temperature of SMAs has very important practical significance. In this work, machine learning (ML) methods were utilized to accelerate the search for shape memory alloys with a targeted property (phase transition temperature). A group of composition data was selected from numerous unexplored candidates to design shape memory alloys by a reverse design method. Composition modeling and feature modeling were used to predict the phase transition temperature of the alloys, and experimental results were obtained to verify the effectiveness of the support vector regression (SVR) model. The results show that the machine learning model can identify target materials more efficiently and pertinently, realizing the accurate and rapid design of shape memory alloys with a specific target phase transition temperature. On this basis, the relationship between phase transition temperature and material descriptors is analyzed, showing that the key factors affecting the phase transition temperature derive from the strength of the bond energy between atoms. This work provides new ideas for the controllable design and performance optimization of Cu-based shape memory alloys.
Funding: National Natural Science Foundation of China (No. 51974028).
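A minimal sketch of the SVR step, assuming invented alloy descriptors and a synthetic composition-temperature relationship; the paper's actual descriptors and dataset are not reproduced here.

```python
# Illustrative SVR sketch for transformation-temperature prediction.
# Descriptors and the composition-temperature relation are invented.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 200
X = rng.uniform(size=(n, 3))  # e.g. composition fractions / bond-energy proxies
y = 300 + 80 * X[:, 0] - 50 * X[:, 1] + 20 * X[:, 2] + rng.normal(0, 5, n)  # K

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
scores = cross_val_score(model, X, y, cv=5, scoring="r2")
print("cross-validated R2: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```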
Membrane technologies are becoming increasingly versatile and helpful today for sustainable development. Machine Learning (ML), an essential branch of artificial intelligence (AI), has substantially impacted the research and development norm of new materials for energy and the environment. This review provides an overview and perspectives on ML methodologies and their applications in membrane design and discovery. A brief overview of membrane technologies is first provided with the current bottlenecks and potential solutions. Through an applications-based perspective of AI-aided membrane design and discovery, we further show how ML strategies are applied to the membrane discovery cycle (including membrane material design, membrane application, membrane process design, and knowledge extraction) in various membrane systems, ranging from gas and liquid separation membranes to fuel cell membranes. Furthermore, the best practices for integrating ML methods and specific application targets in membrane design and discovery are presented, with an ideal paradigm proposed. The challenges to be addressed and the prospects of AI applications in membrane discovery are highlighted at the end.
Funding: National Key R&D Program of China (No. 2022ZD0117501); Singapore RIE2020 Advanced Manufacturing and Engineering Programmatic Grant, A*STAR (No. A1898b0043); Tsinghua University Initiative Scientific Research Program; Low Carbon Energy Research Funding Initiative, A*STAR (No. A-8000182-00-00).
The safe and reliable operation of lithium-ion batteries necessitates accurate prediction of remaining useful life (RUL). However, this task is challenging due to diverse ageing mechanisms, various operating conditions, and limited measured signals. Although data-driven methods are perceived as a promising solution, they ignore intrinsic battery physics, leading to compromised accuracy, low efficiency, and low interpretability. In response, this study integrates domain knowledge into deep learning to enhance RUL prediction performance, demonstrating accurate RUL prediction using only a single charging curve. First, a generalisable physics-based model is developed to extract ageing-correlated parameters that can describe and explain battery degradation from battery charging data. These parameters inform a deep neural network (DNN) that predicts RUL with high accuracy and efficiency. The trained model is validated on three types of batteries working under seven conditions, considering both fully charged and partially charged cases. Using data from one cycle only, the proposed method achieves a root mean squared error (RMSE) of 11.42 cycles and a mean absolute relative error (MARE) of 3.19% on average, which are over 45% and 44% lower, respectively, than two state-of-the-art data-driven methods. Besides its accuracy, the proposed method also outperforms existing methods in terms of efficiency, input burden, and robustness. The inherent relationship between the model parameters and the battery degradation mechanism is further revealed, substantiating the intrinsic superiority of the proposed method.
Funding: National Natural Science Foundation of China (52207229); China Scholarship Council (202207550010).
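The two-stage idea (physics-derived ageing parameters feeding a neural network) can be sketched as below. The three "parameters" are stand-ins for the outputs of the paper's physics-based model, which is not reproduced here; the network size and data are assumptions.

```python
# Two-stage sketch: assumed physics-derived ageing parameters -> DNN -> RUL.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(2)
n_cells = 500
# Stand-ins for ageing-correlated parameters inferred per charging curve
# (e.g. loss of lithium inventory, loss of active material, resistance rise).
params = rng.uniform(size=(n_cells, 3))
rul = 500 + 1500 * (1 - params @ np.array([0.5, 0.3, 0.2])) \
      + rng.normal(0, 20, n_cells)

X_tr, X_te, y_tr, y_te = train_test_split(params, rul, test_size=0.2,
                                          random_state=0)
dnn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
dnn.fit(X_tr, y_tr)
pred = dnn.predict(X_te)
print("RMSE (cycles):", mean_squared_error(y_te, pred) ** 0.5)
print("MARE (%):", 100 * np.mean(np.abs(pred - y_te) / y_te))
```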
The advent of pandemics such as COVID-19 significantly impacts human behaviour and lives every day. It is therefore essential to make medical services connected to the internet available in every remote location during such situations. Moreover, security issues in the Internet of Medical Things (IoMT) used in these services make the situation even more critical, because cyberattacks on medical devices might cause treatment delays or clinical failures. Services in the healthcare ecosystem therefore need rapid, uninterrupted, and secure facilities. The solution provided in this research addresses security concerns and service availability for patients in critical health in remote areas. This research aims to develop an intelligent Software Defined Network (SDN)-enabled secure framework for the IoT healthcare ecosystem. We propose a hybrid of machine learning and deep learning techniques (DNN + SVM) to identify network intrusions in sensor-based healthcare data. In addition, the system can efficiently monitor connected devices and suspicious behaviours. Finally, we evaluate the performance of the proposed framework using various performance metrics based on healthcare application scenarios. The experimental results show that the proposed approach effectively detects and mitigates attacks in SDN-enabled IoT networks and performs better than other state-of-the-art approaches.
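The abstract does not specify how the DNN and SVM are coupled; one plausible reading, sketched below under that assumption, trains an MLP on flow features and reuses its first hidden layer as a learned representation for an SVM classifier. Traffic features and the intrusion rule are synthetic.

```python
# Assumed coupling: MLP hidden layer as learned features, SVM as classifier.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(3)
n = 2000
X = rng.normal(size=(n, 10))                          # flow statistics (toy)
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 1).astype(int)    # 1 = intrusion (toy rule)

X = StandardScaler().fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

mlp = MLPClassifier(hidden_layer_sizes=(32,), activation="relu",
                    max_iter=1000, random_state=0).fit(X_tr, y_tr)

def hidden_features(A):
    # First hidden layer of the trained MLP as a learned representation.
    return np.maximum(0, A @ mlp.coefs_[0] + mlp.intercepts_[0])

svm = SVC(kernel="rbf").fit(hidden_features(X_tr), y_tr)
print(classification_report(y_te, svm.predict(hidden_features(X_te))))
```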
BACKGROUND: Liver transplantation (LT) is a life-saving intervention for patients with end-stage liver disease. However, the equitable allocation of scarce donor organs remains a formidable challenge. Prognostic tools are pivotal in identifying the most suitable transplant candidates. Traditionally, scoring systems like the Model for End-stage Liver Disease have been instrumental in this process. Nevertheless, the landscape of prognostication is undergoing a transformation with the integration of machine learning (ML) and artificial intelligence models. AIM: To assess the utility of ML models in prognostication for LT, comparing their performance and reliability to established traditional scoring systems. METHODS: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, we conducted a thorough and standardized literature search of the PubMed/MEDLINE database. Our search imposed no restrictions on publication year, age, or gender. Exclusion criteria encompassed non-English studies, review articles, case reports, conference papers, studies with missing data, and those exhibiting evident methodological flaws. RESULTS: Our search yielded a total of 64 articles, with 23 meeting the inclusion criteria. Among the selected studies, 60.8% originated from the United States and China combined. Only one pediatric study met the criteria. Notably, 91% of the studies were published within the past five years. ML models consistently demonstrated satisfactory to excellent area under the receiver operating characteristic curve values (ranging from 0.6 to 1) across all studies, surpassing the performance of traditional scoring systems. Random forest exhibited superior predictive capabilities for 90-day mortality following LT, sepsis, and acute kidney injury (AKI). In contrast, gradient boosting excelled in predicting the risk of graft-versus-host disease, pneumonia, and AKI. CONCLUSION: This study underscores the potential of ML models in guiding decisions related to allograft allocation and LT, marking a significant evolution in the field of prognostication.
The travel time of rock compressional waves is an essential parameter used for estimating important rock properties, such as porosity, permeability, and lithology. Current methods, like wireline logging tests, provide broad measurements but lack finer resolution. Laboratory-based rock core measurements offer higher resolution but are resource-intensive. Conventionally, wireline logging and rock core measurements have been used independently. This study introduces a novel approach that integrates both data sources. The method leverages the detailed features from limited core data to enhance the resolution of wireline logging data. By combining machine learning with random field theory, it allows probabilistic predictions in regions with sparse data sampling. In this framework, 12 parameters from wireline tests are used to predict trends in rock core data. The residuals are modeled using random field theory. The outcomes are high-resolution predictions that combine both the predicted trend and probabilistic realizations of the residual. By utilizing unconditional and conditional random field theories, the method enables unconditional and conditional simulations of the underlying high-resolution rock compressional wave travel time profile and provides uncertainty estimates. This integrated approach optimizes the use of existing core and logging data. Its applicability is confirmed in an oil project in West China.
Funding: Australian Research Council Discovery Projects scheme (Project DP190101592); National Natural Science Foundation of China (Grants 41972280 and 52179103).
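A hedged sketch of the trend-plus-residual construction follows: a forest model learns the core travel-time trend from log parameters, and a Gaussian process stands in for the random field to conditionally simulate the depth-correlated residual. Logs, depths, and kernel choices are invented for illustration, not the paper's 12-parameter setup.

```python
# Trend + random-field residual sketch; a GP stands in for the random field.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)
depth = np.linspace(0, 100, 400)[:, None]                  # m (assumed)
logs = np.column_stack([np.sin(depth[:, 0] / 10),          # stand-in wireline logs
                        depth[:, 0] / 100])
travel = 200 + 30 * logs[:, 0] + 50 * logs[:, 1] + rng.normal(0, 3, 400)

core = rng.choice(400, size=40, replace=False)             # sparse core samples

trend = RandomForestRegressor(random_state=0).fit(logs[core], travel[core])
residual = travel[core] - trend.predict(logs[core])

# Conditional simulation of the residual field along depth.
gp = GaussianProcessRegressor(kernel=RBF(10.0) + WhiteKernel(1.0),
                              random_state=0).fit(depth[core], residual)
realizations = gp.sample_y(depth, n_samples=5)             # (400, 5)

high_res = trend.predict(logs)[:, None] + realizations     # trend + residual draws
print("high-resolution realizations:", high_res.shape)
```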
In-situ upgrading by heating is feasible for low-maturity shale oil, where the pore space dynamically evolves. We characterize this response for a heated substrate concurrently imaged by SEM. We systematically follow the evolution of pore quantity, size (length, width, and cross-sectional area), orientation, shape (aspect ratio, roundness, and solidity), and their anisotropy, interpreted by machine learning. Results indicate that heating generates new pores in both organic matter and inorganic minerals. However, the newly formed pores are smaller than the original pores and thus reduce the average lengths and widths of the bedding-parallel pore system; conversely, the average pore lengths and widths increase in the bedding-perpendicular direction. Heating also increases the cross-sectional area of pores in low-maturity oil shales, a growth tendency that fluctuates below 300 ℃ but becomes steady above 300 ℃. In addition, the orientation and shape of the newly formed, heating-induced pores follow the habit of the original pores and their initial probability distributions. Only limited anisotropy is detected in pore direction and shape, indicating similar modes of evolution both bedding-parallel and bedding-normal. We propose a straightforward but robust model to describe the evolution of the pore system in low-maturity oil shales during heating.
Funding: National Key Research and Development Program of China (Grant No. 2022YFE0129800); National Natural Science Foundation of China (Grant No. 42202204).
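The quoted pore metrics can be computed per pore from a binarized SEM image with scikit-image, as in this sketch; the segmentation step and the real micrographs are assumed, so two toy elliptical "pores" stand in.

```python
# Pore metrics from a binarized SEM image; toy elliptical pores stand in.
import numpy as np
from skimage.measure import label, regionprops

yy, xx = np.mgrid[0:200, 0:200]
pores = (((xx - 60) / 20) ** 2 + ((yy - 60) / 8) ** 2 < 1) | \
        (((xx - 140) / 10) ** 2 + ((yy - 140) / 25) ** 2 < 1)

for p in regionprops(label(pores)):
    aspect = p.major_axis_length / p.minor_axis_length
    roundness = 4 * np.pi * p.area / p.perimeter ** 2   # 1 for a perfect circle
    print(f"area={p.area:.0f}px  length={p.major_axis_length:.1f}  "
          f"width={p.minor_axis_length:.1f}  "
          f"orientation={np.degrees(p.orientation):.1f}deg  "
          f"aspect={aspect:.2f}  roundness={roundness:.2f}  "
          f"solidity={p.solidity:.2f}")
```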
WiFi fingerprinting localization has been a hot topic in indoor positioning because of its universality and location-related features. The basic assumption of fingerprinting localization is that distance in received signal strength indication (RSSI) space accords with distance in location space. Therefore, efficiently matching the current RSSI of the user with the RSSI in the fingerprint database is the key to achieving high-accuracy localization. In this paper, a particle swarm optimization-extreme learning machine (PSO-ELM) algorithm is proposed on the basis of the original fingerprinting localization. First, we collect the RSSI of the experimental area to construct the fingerprint database, and the ELM algorithm is applied in the online stage to determine the corresponding relation between the location of the terminal and the RSSI it receives. Second, the PSO algorithm is used to improve the biases and weights of the ELM neural network and obtain the global optimum. Finally, extensive simulation results are presented, showing that the proposed algorithm can effectively reduce the mean localization error and improve positioning accuracy compared with the K-Nearest Neighbor (KNN), K-means, and back-propagation (BP) algorithms.
Funding: National Natural Science Foundation of China (U2001213 and 61971191); Beijing Natural Science Foundation (L182018 and L201011); National Key Research and Development Project (2020YFB1807204); Natural Science Foundation of Jiangxi Province (20202ACBL202006); Innovation Fund Designated for Graduate Students of Jiangxi Province (YC2020-S321).
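A minimal ELM sketch for RSSI fingerprinting, with the PSO step indicated only in a comment: random hidden weights, a sigmoid hidden layer, and closed-form output weights via the pseudo-inverse. The RSSI values and position mapping are synthetic.

```python
# Minimal ELM sketch for RSSI -> (x, y) regression; synthetic data throughout.
import numpy as np

rng = np.random.default_rng(5)
n_aps, n_hidden = 6, 50
X = rng.uniform(-90, -30, size=(400, n_aps))          # RSSI (dBm) from 6 APs
Y = np.column_stack([X[:, :3].mean(1), X[:, 3:].mean(1)]) / -4  # toy positions
Xs = (X - X.mean(0)) / X.std(0)                       # standardize RSSI

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# ELM: random hidden layer, closed-form output weights via pseudo-inverse.
W = rng.normal(size=(n_aps, n_hidden))
b = rng.normal(size=n_hidden)
H = sigmoid(Xs @ W + b)
beta = np.linalg.pinv(H) @ Y

pred = sigmoid(Xs @ W + b) @ beta
err = np.linalg.norm(pred - Y, axis=1).mean()
print("mean localization error (toy units):", err)
# The paper's PSO step would treat (W, b) as particle positions and search
# for the pair minimizing this localization error instead of fixing them
# at random draws.
```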
Access to drinking water is a global issue. This study aims to contribute to the assessment of groundwater quality in the municipality of Za-Kpota (southern Benin) using remote sensing and machine learning. The methodological approach consisted in linking groundwater physico-chemical parameter data, collected in the field and in the laboratory using AFNOR 1994 standardized methods, to satellite data (Landsat) in order to sketch out a groundwater quality prediction model. The data were processed using QGIS (Semi-Automatic Classification Plugin: SCP) and Python (Jupyter Notebook) software. The results of water analysis from the sampled wells and boreholes indicated that most of the water is acidic (pH varying between 5.59 and 7.83). The water was moderately mineralized, with conductivity values generally below 1500 µS/cm (59 µS/cm to 1344 µS/cm) and locally high concentrations of nitrates and phosphates. The dynamics of groundwater quality in the municipality between 2008 and 2022 are also marked by a regression in land-use units (vegetation and marshland giving way to built-up areas, bare soil, crops, and fallow land), revealed by diachronic analysis of satellite images from 2008, 2013, 2018, and 2022. Field surveys of local residents revealed the use of herbicides and pesticides in agricultural fields, the main drivers of the groundwater quality deterioration observed in the study area. Among the groundwater quality prediction models developed (ANN, RF, and LR), the model based on Artificial Neural Networks (ANN: R2 = 0.97 and RMSE = 0) proved best for modelling groundwater quality changes in the Za-Kpota municipality.
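A hedged sketch of the ANN step: Landsat band reflectances regressed onto a measured quality parameter (conductivity here) with a small neural network. Band count, value ranges, and the band-conductivity relationship are assumptions; the field/lab dataset is not public.

```python
# ANN sketch: assumed Landsat reflectances -> conductivity regression.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(6)
n = 150
bands = rng.uniform(0, 0.4, size=(n, 6))  # surface reflectance, 6 bands (assumed)
cond = 600 + 1200 * bands[:, 3] - 500 * bands[:, 1] + rng.normal(0, 30, n)  # uS/cm

X_tr, X_te, y_tr, y_te = train_test_split(bands, cond, test_size=0.25,
                                          random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0)
ann.fit(X_tr, y_tr)
pred = ann.predict(X_te)
print("R2: %.2f  RMSE: %.1f uS/cm" % (r2_score(y_te, pred),
                                      mean_squared_error(y_te, pred) ** 0.5))
```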
The railway switch machine is essential for maintaining the safety and punctuality of train operations. A data-driven fault diagnosis scheme for railway switch machines using a tensor machine and multi-representation monitoring data is developed herein. Unlike existing methods, this approach takes into account the spatial information of the time-series monitoring data, aligning with the domain expertise of on-site manual monitoring. In addition, a multi-sensor fusion tensor machine is designed to overcome the limited information of single-signal data. First, the one-dimensional signal data are preprocessed and transformed into two-dimensional images. Afterward, a fusion feature tensor is created from the images of the three-phase current using the CANDECOMP/PARAFAC (CP) decomposition method. Then, the tensor-learning-based model is built using the extracted fusion feature tensor. The developed fault diagnosis scheme is validated on a field three-phase current dataset. The experiments indicate enhanced performance over the current approach, particularly in terms of recall, precision, and F1-score.
Funding: National Key Research and Development Program of China (Grant 2022YFB4300504-4); HKRGC Research Impact Fund (Grant R5020-18).
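The fusion step might look like the sketch below: each phase current rendered as a 2-D image, the three images stacked into a third-order tensor, and CP (CANDECOMP/PARAFAC) factors extracted with TensorLy as the fused feature. The signal-to-image transform and the signals themselves are toy stand-ins, and the downstream tensor classifier is omitted.

```python
# Fusion-feature sketch: phase-current "images" -> CP decomposition features.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(7)
t = np.linspace(0, 1, 64)
phases = [np.sin(2 * np.pi * 5 * t + k * 2 * np.pi / 3) for k in range(3)]
# Each 1-D trace becomes a 64x64 image; an outer product stands in for the
# paper's signal-to-image transform.
images = np.stack([np.outer(p, p) + 0.05 * rng.normal(size=(64, 64))
                   for p in phases])

tensor = tl.tensor(images)                    # (3, 64, 64): phase x H x W
weights, factors = parafac(tensor, rank=5)    # CP factor matrices
feature = np.concatenate([f.ravel() for f in factors])
print("fused feature length:", feature.size)  # input to the tensor classifier
```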
Additive manufacturing technology is highly regarded due to its advantages, such as high precision and the ability to address complex geometric challenges. However, the development of additive manufacturing processes is constrained by issues like unclear fundamental principles, complex experimental cycles, and high costs. Machine learning, as a novel artificial intelligence technology, has the potential to engage deeply in the development of additive manufacturing processes, assisting engineers in learning and developing new techniques. This paper provides a comprehensive overview of the research and applications of machine learning in the field of additive manufacturing, particularly in model design and process development. It first introduces the background and significance of machine-learning-assisted design in additive manufacturing, then delves into the application of machine learning in additive manufacturing with a focus on model design and process guidance, and finally summarizes and forecasts the development trends of machine learning technology in this field.
Funding: Technology Development Fund of China Academy of Machinery Science and Technology (No. 170221ZY01).
This paper presents an innovative approach to agricultural insurance underwriting and risk pricing through the development of an Extreme Learning Machine (ELM) actuarial intelligent model. The model integrates diverse datasets, including climate change scenarios, crop types, farm sizes, and various risk factors, to automate underwriting decisions and estimate loss reserves in agricultural insurance. The study conducts extensive exploratory data analysis, model building, feature engineering, and validation to demonstrate the effectiveness of the proposed approach. Additionally, the paper discusses the application of robustness tests, stress tests, and scenario tests to assess the model's resilience and adaptability to changing market conditions. Overall, the research contributes to advancing actuarial science in agricultural insurance by leveraging advanced machine learning techniques for enhanced risk management and decision-making.
The machine learning models of multiple linear regression (MLR), support vector regression (SVR), and extreme learning machine (ELM), together with the proposed ELM variants of online sequential ELM (OS-ELM) and OS-ELM with a forgetting mechanism (FOS-ELM), are applied to predict the lime utilization ratio of dephosphorization in the basic oxygen furnace steelmaking process. The ELM model exhibits the best performance compared with the MLR and SVR models. OS-ELM and FOS-ELM are applied for sequential learning and model updating. The optimal number of samples in the validity term of the FOS-ELM model is determined to be 1500, with the smallest mean absolute relative error (MARE) value of 0.058226 for the population. Variable importance analysis reveals lime weight, initial P content, and hot metal weight as the most important variables for the lime utilization ratio, which increases with decreasing lime weight and with increasing initial P content and hot metal weight. A prediction system based on FOS-ELM was applied in actual industrial production for one month. The hit ratios of the predicted lime utilization ratio within error ranges of ±1%, ±3%, and ±5% are 61.16%, 90.63%, and 94.11%, respectively, and the coefficient of determination, MARE, and root mean square error are 0.8670, 0.06823, and 1.4265, respectively. The system exhibits desirable performance for applications in actual industrial production.
Funding: National Natural Science Foundation of China (No. U1960202).
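The sequential-updating idea behind OS-ELM admits a compact closed-form sketch: after an initial batch solve, each new chunk of heats updates the output weights recursively without retraining. Feature names, network size, and data are assumptions; the FOS-ELM forgetting mechanism (dropping samples outside the validity term) is noted but not implemented.

```python
# OS-ELM recursion sketch: batch initialization, then closed-form updates.
import numpy as np

rng = np.random.default_rng(8)
n_in, n_hidden = 5, 40                 # e.g. lime weight, initial P content, ...
W = rng.normal(size=(n_in, n_hidden))  # fixed random hidden weights
b = rng.normal(size=n_hidden)

def hidden(X):
    return np.tanh(X @ W + b)

def os_elm_init(X, y):
    H = hidden(X)
    P = np.linalg.inv(H.T @ H + 1e-6 * np.eye(n_hidden))  # ridge for stability
    return P, P @ H.T @ y

def os_elm_update(P, beta, X, y):
    # Recursive least-squares update for a new data chunk (X, y).
    H = hidden(X)
    K = np.linalg.inv(np.eye(len(X)) + H @ P @ H.T)
    P = P - P @ H.T @ K @ H @ P
    beta = beta + P @ H.T @ (y - H @ beta)
    return P, beta

X0, y0 = rng.normal(size=(200, n_in)), rng.normal(size=200)
P, beta = os_elm_init(X0, y0)
for _ in range(5):                      # new production chunks arrive in turn
    Xk, yk = rng.normal(size=(50, n_in)), rng.normal(size=50)
    P, beta = os_elm_update(P, beta, Xk, yk)
print("updated output weights:", beta.shape)
# FOS-ELM would additionally "forget" chunks older than the validity term.
```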
The underwater wireless optical communication (UWOC) system has gradually become essential to underwater wireless communication technology. Unlike other existing works on UWOC systems, this paper evaluates the proposed machine-learning-based signal demodulation methods on a self-built experimental platform. Based on this platform, we first construct a real signal dataset with ten modulation methods. Then, we propose a deep belief network (DBN)-based demodulator for feature extraction and multi-class feature classification. We also design an adaptive boosting (AdaBoost) demodulator as an alternative scheme without feature filtering for multiple modulated signals. Extensive experimental results demonstrate that the AdaBoost demodulator significantly outperforms the other algorithms. They also reveal that demodulator accuracy decreases as the modulation order increases at a fixed received optical power, while a higher-order modulation may achieve a higher effective transmission rate when the signal-to-noise ratio (SNR) is higher.
Funding: Peng Cheng Laboratory (PCL2023AS31 and PCL2023AS1-2); National Key Research and Development Program of China (No. 2019YFA0706604); Natural Science Foundation of China (Nos. 61976169, 62293483, 62371451).
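A hedged sketch of the AdaBoost demodulator idea: raw received samples, without hand-filtered features, classified into signal classes. Real UWOC captures are replaced by noisy synthetic waveforms; raising the noise (lower SNR) or the class count (higher modulation order) degrades accuracy, mirroring the reported trend.

```python
# AdaBoost demodulation sketch on raw synthetic waveforms (no feature filtering).
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(9)
n_per, length = 300, 32
t = np.arange(length)
protos = [np.sin(2 * np.pi * f * t / length) for f in (1, 2, 3, 4)]  # toy classes
X = np.vstack([p + 0.4 * rng.normal(size=(n_per, length)) for p in protos])
y = np.repeat(np.arange(4), n_per)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0,
                                          stratify=y)
demod = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("demodulation accuracy:", accuracy_score(y_te, demod.predict(X_te)))
# Raising the noise (lower SNR) or the class count (higher order) lowers this.
```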
BACKGROUND: Sepsis is one of the main causes of mortality in intensive care units (ICUs). Early prediction is critical for reducing injury. As approximately 36% of sepsis cases occur within 24 h after emergency department (ED) admission in the Medical Information Mart for Intensive Care (MIMIC-IV), a prediction system for the ED triage stage would be helpful. Previous methods such as the quick Sequential Organ Failure Assessment (qSOFA) are more suitable for screening than for prediction in the ED, and we aimed to find a lightweight, convenient prediction method through machine learning. METHODS: We accessed MIMIC-IV for data on sepsis patients in the ED. Our dataset comprised demographic information, vital signs, and synthetic features. Extreme Gradient Boosting (XGBoost) was used to predict the risk of developing sepsis within 24 h after ED admission, and SHapley Additive exPlanations (SHAP) was employed to provide a comprehensive interpretation of the model's results. Ten percent of the patients were randomly selected as the testing set, while the remaining patients were used for training with 10-fold cross-validation. RESULTS: For 10-fold cross-validation on 14,957 samples, we reached an accuracy of 84.1% ± 0.3% and an area under the receiver operating characteristic (ROC) curve of 0.92 ± 0.02. The model achieved similar performance on the testing set of 1,662 patients. SHAP values showed that the five most important features were acuity, arrival transportation, age, shock index, and respiratory rate. CONCLUSION: Machine learning models such as XGBoost may be used for sepsis prediction using only a small amount of data conveniently collected at the ED triage stage. This may help reduce the workload in the ED and warn medical workers of sepsis risk in advance.
Funding: National Key Research and Development Program of China (2021YFC2500803); CAMS Innovation Fund for Medical Sciences (2021-I2M-1-056).
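The evaluation loop described above can be sketched as follows: XGBoost on triage-stage features with stratified 10-fold cross-validation and SHAP ranking. The feature names mirror the paper's top five, but the values and outcome model are synthetic; MIMIC-IV itself requires credentialed access.

```python
# XGBoost + stratified 10-fold CV + SHAP on synthetic triage-style features.
import numpy as np
import shap
from xgboost import XGBClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(10)
n = 3000
X = np.column_stack([
    rng.integers(1, 6, n),       # acuity (1 = most acute)
    rng.integers(0, 2, n),       # arrival by ambulance
    rng.uniform(18, 95, n),      # age
    rng.uniform(0.4, 1.6, n),    # shock index
    rng.uniform(10, 35, n),      # respiratory rate
]).astype(float)
logit = -6 + 0.8 * (5 - X[:, 0]) + X[:, 1] + 0.02 * X[:, 2] + 2.0 * X[:, 3]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)  # toy outcome

clf = XGBClassifier(n_estimators=300, max_depth=4, eval_metric="logloss")
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
print("10-fold AUROC: %.2f +/- %.2f" % (auc.mean(), auc.std()))

clf.fit(X, y)
sv = shap.TreeExplainer(clf).shap_values(X)
print("mean |SHAP| per feature:", np.abs(sv).mean(axis=0))
```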
Embracing software product lines (SPLs) is pivotal in the dynamic landscape of contemporary software development. However, the flexibility and global distribution inherent in modern systems pose significant challenges to managing SPL variability, underscoring the critical importance of robust cybersecurity measures. This paper advocates leveraging machine learning (ML) to address variability management issues and fortify the security of SPLs. In the context of the broader special issue theme on innovative cybersecurity approaches, the proposed ML-based framework offers an interdisciplinary perspective, blending insights from computing, social sciences, and business. Specifically, it employs ML for demand analysis, dynamic feature extraction, and enhanced feature selection in distributed settings, contributing to cyber-resilient ecosystems. Our experiments demonstrate the framework's superiority, emphasizing its potential to boost productivity and security in SPLs. As digital threats evolve, this research catalyzes interdisciplinary collaborations, aligning with the special issue's goal of breaking down academic barriers to strengthen digital ecosystems against sophisticated attacks while upholding ethics, privacy, and human values.
Funding: Ministry of Defense, Government of Pakistan (Project AHQ/95013/6/4/8/NASTP(ACP), GREENAI).
Traditional Enterprise Resource Planning (ERP) systems built on relational databases take weeks to deliver insights that advanced analytics can now provide almost instantly, examining the past and the future while capturing the present so that companies receive the most accurate information for their decisions. Integrating machine learning (ML) into financial ERP systems offers several benefits, including increased accuracy, efficiency, and cost savings. ERP systems are also crucial in overseeing different aspects of Human Capital Management (HCM) in organizations, where staff performance draws particular management interest: assigning the right employees to the right tasks at the right time, training and qualifying them, building evaluation systems to follow up on their performance, and retaining potential talent. Likewise, predicting employee salaries correctly is necessary for the efficient distribution of resources, retaining talent, and ensuring the success of the organization as a whole. Conventional ERP salary-forecasting methods typically rely on static reports that only show the system's current state, without analyzing employee data or providing recommendations. We designed and implemented a prototype that applies ML algorithms to Oracle EBS data to enhance employee evaluation using real-time data taken directly from the ERP system. Based on accuracy measurements, the Random Forest algorithm performed best, offering an accuracy of 90% on the balanced dataset.
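A hedged sketch of the evaluation-model step: a random forest classifier on balanced HR records, standing in for the 90%-accuracy model described above. The field names and scoring rule are invented; real Oracle EBS extracts would replace the synthetic data.

```python
# Random forest sketch for employee evaluation on a balanced toy HR dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, classification_report

rng = np.random.default_rng(11)
n = 1000
X = np.column_stack([
    rng.uniform(0, 30, n),     # years of service (invented field)
    rng.uniform(1, 5, n),      # last appraisal score (invented field)
    rng.integers(0, 4, n),     # job grade band (invented field)
])
score = 0.05 * X[:, 0] + 0.5 * X[:, 1] + 0.3 * X[:, 2] + rng.normal(0, 0.3, n)
y = (score > np.median(score)).astype(int)   # 1 = high performer, balanced split

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0,
                                          stratify=y)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("accuracy:", accuracy_score(y_te, rf.predict(X_te)))
print(classification_report(y_te, rf.predict(X_te)))
```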
基金the University of Transport Technology under grant number DTTD2022-12.
文摘Determination of Shear Bond strength(SBS)at interlayer of double-layer asphalt concrete is crucial in flexible pavement structures.The study used three Machine Learning(ML)models,including K-Nearest Neighbors(KNN),Extra Trees(ET),and Light Gradient Boosting Machine(LGBM),to predict SBS based on easily determinable input parameters.Also,the Grid Search technique was employed for hyper-parameter tuning of the ML models,and cross-validation and learning curve analysis were used for training the models.The models were built on a database of 240 experimental results and three input variables:temperature,normal pressure,and tack coat rate.Model validation was performed using three statistical criteria:the coefficient of determination(R2),the Root Mean Square Error(RMSE),and the mean absolute error(MAE).Additionally,SHAP analysis was also used to validate the importance of the input variables in the prediction of the SBS.Results show that these models accurately predict SBS,with LGBM providing outstanding performance.SHAP(Shapley Additive explanation)analysis for LGBM indicates that temperature is the most influential factor on SBS.Consequently,the proposed ML models can quickly and accurately predict SBS between two layers of asphalt concrete,serving practical applications in flexible pavement structure design.
文摘Diabetic retinopathy(DR)remains a leading cause of vision impairment and blindness among individuals with diabetes,necessitating innovative approaches to screening and management.This editorial explores the transformative potential of artificial intelligence(AI)and machine learning(ML)in revolutionizing DR care.AI and ML technologies have demonstrated remarkable advancements in enhancing the accuracy,efficiency,and accessibility of DR screening,helping to overcome barriers to early detection.These technologies leverage vast datasets to identify patterns and predict disease progression with unprecedented precision,enabling clinicians to make more informed decisions.Furthermore,AI-driven solutions hold promise in personalizing management strategies for DR,incorpo-rating predictive analytics to tailor interventions and optimize treatment path-ways.By automating routine tasks,AI can reduce the burden on healthcare providers,allowing for a more focused allocation of resources towards complex patient care.This review aims to evaluate the current advancements and applic-ations of AI and ML in DR screening,and to discuss the potential of these techno-logies in developing personalized management strategies,ultimately aiming to improve patient outcomes and reduce the global burden of DR.The integration of AI and ML in DR care represents a paradigm shift,offering a glimpse into the future of ophthalmic healthcare.
文摘Stroke is a leading cause of disability and mortality worldwide,necessitating the development of advanced technologies to improve its diagnosis,treatment,and patient outcomes.In recent years,machine learning techniques have emerged as promising tools in stroke medicine,enabling efficient analysis of large-scale datasets and facilitating personalized and precision medicine approaches.This abstract provides a comprehensive overview of machine learning’s applications,challenges,and future directions in stroke medicine.Recently introduced machine learning algorithms have been extensively employed in all the fields of stroke medicine.Machine learning models have demonstrated remarkable accuracy in imaging analysis,diagnosing stroke subtypes,risk stratifications,guiding medical treatment,and predicting patient prognosis.Despite the tremendous potential of machine learning in stroke medicine,several challenges must be addressed.These include the need for standardized and interoperable data collection,robust model validation and generalization,and the ethical considerations surrounding privacy and bias.In addition,integrating machine learning models into clinical workflows and establishing regulatory frameworks are critical for ensuring their widespread adoption and impact in routine stroke care.Machine learning promises to revolutionize stroke medicine by enabling precise diagnosis,tailored treatment selection,and improved prognostication.Continued research and collaboration among clinicians,researchers,and technologists are essential for overcoming challenges and realizing the full potential of machine learning in stroke care,ultimately leading to enhanced patient outcomes and quality of life.This review aims to summarize all the current implications of machine learning in stroke diagnosis,treatment,and prognostic evaluation.At the same time,another purpose of this paper is to explore all the future perspectives these techniques can provide in combating this disabling disease.
基金financially supported by the National Natural Science Foundation of China(No.51974028)。
文摘The martensitic transformation temperature is the basis for the application of shape memory alloys(SMAs),and the ability to quickly and accurately predict the transformation temperature of SMAs has very important practical significance.In this work,machine learning(ML)methods were utilized to accelerate the search for shape memory alloys with targeted properties(phase transition temperature).A group of component data was selected to design shape memory alloys using reverse design method from numerous unexplored data.Component modeling and feature modeling were used to predict the phase transition temperature of the shape memory alloys.The experimental results of the shape memory alloys were obtained to verify the effectiveness of the support vector regression(SVR)model.The results show that the machine learning model can obtain target materials more efficiently and pertinently,and realize the accurate and rapid design of shape memory alloys with specific target phase transition temperature.On this basis,the relationship between phase transition temperature and material descriptors is analyzed,and it is proved that the key factors affecting the phase transition temperature of shape memory alloys are based on the strength of the bond energy between atoms.This work provides new ideas for the controllable design and performance optimization of Cu-based shape memory alloys.
基金This work is supported by the National Key R&D Program of China(No.2022ZD0117501)the Singapore RIE2020 Advanced Manufacturing and Engineering Programmatic Grant by the Agency for Science,Technology and Research(A*STAR)under grant no.A1898b0043Tsinghua University Initiative Scientific Research Program and Low Carbon En-ergy Research Funding Initiative by A*STAR under grant number A-8000182-00-00.
文摘Membrane technologies are becoming increasingly versatile and helpful today for sustainable development.Machine Learning(ML),an essential branch of artificial intelligence(AI),has substantially impacted the research and development norm of new materials for energy and environment.This review provides an overview and perspectives on ML methodologies and their applications in membrane design and dis-covery.A brief overview of membrane technologies isfirst provided with the current bottlenecks and potential solutions.Through an appli-cations-based perspective of AI-aided membrane design and discovery,we further show how ML strategies are applied to the membrane discovery cycle(including membrane material design,membrane application,membrane process design,and knowledge extraction),in various membrane systems,ranging from gas,liquid,and fuel cell separation membranes.Furthermore,the best practices of integrating ML methods and specific application targets in membrane design and discovery are presented with an ideal paradigm proposed.The challenges to be addressed and prospects of AI applications in membrane discovery are also highlighted in the end.
基金the financial support from the National Natural Science Foundation of China(52207229)the financial support from the China Scholarship Council(202207550010)。
文摘The safe and reliable operation of lithium-ion batteries necessitates the accurate prediction of remaining useful life(RUL).However,this task is challenging due to the diverse ageing mechanisms,various operating conditions,and limited measured signals.Although data-driven methods are perceived as a promising solution,they ignore intrinsic battery physics,leading to compromised accuracy,low efficiency,and low interpretability.In response,this study integrates domain knowledge into deep learning to enhance the RUL prediction performance.We demonstrate accurate RUL prediction using only a single charging curve.First,a generalisable physics-based model is developed to extract ageing-correlated parameters that can describe and explain battery degradation from battery charging data.The parameters inform a deep neural network(DNN)to predict RUL with high accuracy and efficiency.The trained model is validated under 3 types of batteries working under 7 conditions,considering fully charged and partially charged cases.Using data from one cycle only,the proposed method achieves a root mean squared error(RMSE)of 11.42 cycles and a mean absolute relative error(MARE)of 3.19%on average,which are over45%and 44%lower compared to the two state-of-the-art data-driven methods,respectively.Besides its accuracy,the proposed method also outperforms existing methods in terms of efficiency,input burden,and robustness.The inherent relationship between the model parameters and the battery degradation mechanism is further revealed,substantiating the intrinsic superiority of the proposed method.
文摘The advent of pandemics such as COVID-19 significantly impacts human behaviour and lives every day.Therefore,it is essential to make medical services connected to internet,available in every remote location during these situations.Also,the security issues in the Internet of Medical Things(IoMT)used in these service,make the situation even more critical because cyberattacks on the medical devices might cause treatment delays or clinical failures.Hence,services in the healthcare ecosystem need rapid,uninterrupted,and secure facilities.The solution provided in this research addresses security concerns and services availability for patients with critical health in remote areas.This research aims to develop an intelligent Software Defined Networks(SDNs)enabled secure framework for IoT healthcare ecosystem.We propose a hybrid of machine learning and deep learning techniques(DNN+SVM)to identify network intrusions in the sensor-based healthcare data.In addition,this system can efficiently monitor connected devices and suspicious behaviours.Finally,we evaluate the performance of our proposed framework using various performance metrics based on the healthcare application scenarios.the experimental results show that the proposed approach effectively detects and mitigates attacks in the SDN-enabled IoT networks and performs better that other state-of-art-approaches.
文摘BACKGROUND Liver transplantation(LT)is a life-saving intervention for patients with end-stage liver disease.However,the equitable allocation of scarce donor organs remains a formidable challenge.Prognostic tools are pivotal in identifying the most suitable transplant candidates.Traditionally,scoring systems like the model for end-stage liver disease have been instrumental in this process.Nevertheless,the landscape of prognostication is undergoing a transformation with the integration of machine learning(ML)and artificial intelligence models.AIM To assess the utility of ML models in prognostication for LT,comparing their performance and reliability to established traditional scoring systems.METHODS Following the Preferred Reporting Items for Systematic Reviews and Meta-Analysis guidelines,we conducted a thorough and standardized literature search using the PubMed/MEDLINE database.Our search imposed no restrictions on publication year,age,or gender.Exclusion criteria encompassed non-English studies,review articles,case reports,conference papers,studies with missing data,or those exhibiting evident methodological flaws.RESULTS Our search yielded a total of 64 articles,with 23 meeting the inclusion criteria.Among the selected studies,60.8%originated from the United States and China combined.Only one pediatric study met the criteria.Notably,91%of the studies were published within the past five years.ML models consistently demonstrated satisfactory to excellent area under the receiver operating characteristic curve values(ranging from 0.6 to 1)across all studies,surpassing the performance of traditional scoring systems.Random forest exhibited superior predictive capabilities for 90-d mortality following LT,sepsis,and acute kidney injury(AKI).In contrast,gradient boosting excelled in predicting the risk of graft-versus-host disease,pneumonia,and AKI.CONCLUSION This study underscores the potential of ML models in guiding decisions related to allograft allocation and LT,marking a significant evolution in the field of prognostication.
基金the Australian Government through the Australian Research Council's Discovery Projects funding scheme(Project DP190101592)the National Natural Science Foundation of China(Grant Nos.41972280 and 52179103).
文摘The travel time of rock compressional waves is an essential parameter used for estimating important rock properties,such as porosity,permeability,and lithology.Current methods,like wireline logging tests,provide broad measurements but lack finer resolution.Laboratory-based rock core measurements offer higher resolution but are resource-intensive.Conventionally,wireline logging and rock core measurements have been used independently.This study introduces a novel approach that integrates both data sources.The method leverages the detailed features from limited core data to enhance the resolution of wireline logging data.By combining machine learning with random field theory,the method allows for probabilistic predictions in regions with sparse data sampling.In this framework,12 parameters from wireline tests are used to predict trends in rock core data.The residuals are modeled using random field theory.The outcomes are high-resolution predictions that combine both the predicted trend and the probabilistic realizations of the residual.By utilizing unconditional and conditional random field theories,this method enables unconditional and conditional simulations of the underlying high-resolution rock compressional wave travel time profile and provides uncertainty estimates.This integrated approach optimizes the use of existing core and logging data.Its applicability is confirmed in an oil project in West China.
基金financially supported by the National Key Research and Development Program of China(Grant No.2022YFE0129800)the National Natural Science Foundation of China(Grant No.42202204)。
文摘In-situ upgrading by heating is feasible for low-maturity shale oil,where the pore space dynamically evolves.We characterize this response for a heated substrate concurrently imaged by SEM.We systematically follow the evolution of pore quantity,size(length,width and cross-sectional area),orientation,shape(aspect ratio,roundness and solidity)and their anisotropy—interpreted by machine learning.Results indicate that heating generates new pores in both organic matter and inorganic minerals.However,the newly formed pores are smaller than the original pores and thus reduce average lengths and widths of the bedding-parallel pore system.Conversely,the average pore lengths and widths are increased in the bedding-perpendicular direction.Besides,heating increases the cross-sectional area of pores in low-maturity oil shales,where this growth tendency fluctuates at<300℃ but becomes steady at>300℃.In addition,the orientation and shape of the newly-formed heating-induced pores follow the habit of the original pores and follow the initial probability distributions of pore orientation and shape.Herein,limited anisotropy is detected in pore direction and shape,indicating similar modes of evolution both bedding-parallel and bedding-normal.We propose a straightforward but robust model to describe evolution of pore system in low-maturity oil shales during heating.
基金supported in part by the National Natural Science Foundation of China(U2001213 and 61971191)in part by the Beijing Natural Science Foundation under Grant L182018 and L201011+2 种基金in part by National Key Research and Development Project(2020YFB1807204)in part by the Key project of Natural Science Foundation of Jiangxi Province(20202ACBL202006)in part by the Innovation Fund Designated for Graduate Students of Jiangxi Province(YC2020-S321)。
文摘Wi Fi and fingerprinting localization method have been a hot topic in indoor positioning because of their universality and location-related features.The basic assumption of fingerprinting localization is that the received signal strength indication(RSSI)distance is accord with the location distance.Therefore,how to efficiently match the current RSSI of the user with the RSSI in the fingerprint database is the key to achieve high-accuracy localization.In this paper,a particle swarm optimization-extreme learning machine(PSO-ELM)algorithm is proposed on the basis of the original fingerprinting localization.Firstly,we collect the RSSI of the experimental area to construct the fingerprint database,and the ELM algorithm is applied to the online stages to determine the corresponding relation between the location of the terminal and the RSSI it receives.Secondly,PSO algorithm is used to improve the bias and weight of ELM neural network,and the global optimal results are obtained.Finally,extensive simulation results are presented.It is shown that the proposed algorithm can effectively reduce mean error of localization and improve positioning accuracy when compared with K-Nearest Neighbor(KNN),Kmeans and Back-propagation(BP)algorithms.
文摘Accessing drinking water is a global issue. This study aims to contribute to the assessment of groundwater quality in the municipality of Za-Kpota (southern Benin) using remote sensing and Machine Learning. The methodological approach used consisted in linking groundwater physico-chemical parameter data collected in the field and in the laboratory using AFNOR 1994 standardized methods to satellite data (Landsat) in order to sketch out a groundwater quality prediction model. The data was processed using QGis (Semi-Automatic Plugin: SCP) and Python (Jupyter Netebook: Prediction) softwares. The results of water analysis from the sampled wells and boreholes indicated that most of the water is acidic (pH varying between 5.59 and 7.83). The water was moderately mineralized, with conductivity values of less than 1500 μs/cm overall (59 µS/cm to 1344 µS/cm), with high concentrations of nitrates and phosphates in places. The dynamics of groundwater quality in the municipality of Za-Kpota between 2008 and 2022 are also marked by a regression in land use units (a regression in vegetation and marshland formation in favor of built-up areas, bare soil, crops and fallow land) revealed by the diachronic analysis of satellite images from 2008, 2013, 2018 and 2022. Surveys of local residents revealed the use of herbicides and pesticides in agricultural fields, which are the main drivers contributing to the groundwater quality deterioration observed in the study area. Field surveys revealed the use of herbicides and pesticides in agricultural fields, which are factors contributing to the deterioration in groundwater quality observed in the study area. The results of the groundwater quality prediction models (ANN, RF and LR) developed led to the conclusion that the model based on Artificial Neural Networks (ANN: R2 = 0.97 and RMSE = 0) is the best for groundwater quality changes modelling in the Za-Kpota municipality.
基金supported by the National Key Research and Development Program of China under Grant 2022YFB4300504-4the HKRGC Research Impact Fund under Grant R5020-18.
文摘Railway switch machine is essential for maintaining the safety and punctuality of train operations.A data-driven fault diagnosis scheme for railway switch machine using tensor machine and multi-representation monitoring data is developed herein.Unlike existing methods,this approach takes into account the spatial information of the time series monitoring data,aligning with the domain expertise of on-site manual monitoring.Besides,a multi-sensor fusion tensor machine is designed to improve single signal data’s limitations in insufficient information.First,one-dimensional signal data is preprocessed and transformed into two-dimensional images.Afterward,the fusion feature tensor is created by utilizing the images of the three-phase current and employing the CANDE-COMP/PARAFAC(CP)decomposition method.Then,the tensor learning-based model is built using the extracted fusion feature tensor.The developed fault diagnosis scheme is valid with the field three-phase current dataset.The experiment indicates an enhanced performance of the developed fault diagnosis scheme over the current approach,particularly in terms of recall,precision,and F1-score.
基金financially supported by the Technology Development Fund of China Academy of Machinery Science and Technology(No.170221ZY01)。
文摘Additive manufacturing technology is highly regarded due to its advantages,such as high precision and the ability to address complex geometric challenges.However,the development of additive manufacturing process is constrained by issues like unclear fundamental principles,complex experimental cycles,and high costs.Machine learning,as a novel artificial intelligence technology,has the potential to deeply engage in the development of additive manufacturing process,assisting engineers in learning and developing new techniques.This paper provides a comprehensive overview of the research and applications of machine learning in the field of additive manufacturing,particularly in model design and process development.Firstly,it introduces the background and significance of machine learning-assisted design in additive manufacturing process.It then further delves into the application of machine learning in additive manufacturing,focusing on model design and process guidance.Finally,it concludes by summarizing and forecasting the development trends of machine learning technology in the field of additive manufacturing.
文摘The paper presents an innovative approach towards agricultural insurance underwriting and risk pricing through the development of an Extreme Machine Learning (ELM) Actuarial Intelligent Model. This model integrates diverse datasets, including climate change scenarios, crop types, farm sizes, and various risk factors, to automate underwriting decisions and estimate loss reserves in agricultural insurance. The study conducts extensive exploratory data analysis, model building, feature engineering, and validation to demonstrate the effectiveness of the proposed approach. Additionally, the paper discusses the application of robust tests, stress tests, and scenario tests to assess the model’s resilience and adaptability to changing market conditions. Overall, the research contributes to advancing actuarial science in agricultural insurance by leveraging advanced machine learning techniques for enhanced risk management and decision-making.
基金supported by the National Natural Science Foundation of China (No.U1960202).
文摘The machine learning models of multiple linear regression(MLR),support vector regression(SVR),and extreme learning ma-chine(ELM)and the proposed ELM models of online sequential ELM(OS-ELM)and OS-ELM with forgetting mechanism(FOS-ELM)are applied in the prediction of the lime utilization ratio of dephosphorization in the basic oxygen furnace steelmaking process.The ELM model exhibites the best performance compared with the models of MLR and SVR.OS-ELM and FOS-ELM are applied for sequential learning and model updating.The optimal number of samples in validity term of the FOS-ELM model is determined to be 1500,with the smallest population mean absolute relative error(MARE)value of 0.058226 for the population.The variable importance analysis reveals lime weight,initial P content,and hot metal weight as the most important variables for the lime utilization ratio.The lime utilization ratio increases with the decrease in lime weight and the increases in the initial P content and hot metal weight.A prediction system based on FOS-ELM is applied in actual industrial production for one month.The hit ratios of the predicted lime utilization ratio in the error ranges of±1%,±3%,and±5%are 61.16%,90.63%,and 94.11%,respectively.The coefficient of determination,MARE,and root mean square error are 0.8670,0.06823,and 1.4265,respectively.The system exhibits desirable performance for applications in actual industrial pro-duction.
Funding: Supported by the major key project of Peng Cheng Laboratory under Grants PCL2023AS31 and PCL2023AS1-2, the National Key Research and Development Program of China (No. 2019YFA0706604), and the Natural Science Foundation (NSF) of China (Nos. 61976169, 62293483, 62371451).
Abstract: The underwater wireless optical communication (UWOC) system has gradually become essential to underwater wireless communication technology. Unlike other existing works on UWOC systems, this paper evaluates the proposed machine learning-based signal demodulation methods on a self-built experimental platform. On this platform, we first construct a real signal dataset covering ten modulation methods. We then propose a deep belief network (DBN)-based demodulator for feature extraction and multi-class feature classification, and design an adaptive boosting (AdaBoost) demodulator as an alternative scheme that requires no feature filtering for multiple modulated signals. Extensive experimental results demonstrate that the AdaBoost demodulator significantly outperforms the other algorithms. They also reveal that demodulator accuracy decreases as the modulation order increases at a fixed received optical power, while higher-order modulation may achieve a higher effective transmission rate when the signal-to-noise ratio (SNR) is higher.
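To make the AdaBoost demodulation idea concrete, here is a minimal sketch that treats windows of received samples as raw feature vectors and the transmitted symbols as class labels, assuming scikit-learn. The dataset, modulation order, and window length below are synthetic stand-ins, not the paper's experimental data.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Toy stand-in for received waveforms: each row is one symbol window whose
# mean level encodes the symbol, corrupted by additive noise.
rng = np.random.default_rng(0)
n_symbols, window = 4, 32                       # hypothetical modulation order / samples per symbol
labels = rng.integers(0, n_symbols, size=2000)
signals = labels[:, None] + 0.3 * rng.normal(size=(2000, window))

X_train, X_test, y_train, y_test = train_test_split(
    signals, labels, test_size=0.2, random_state=0)
clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("symbol accuracy:", clf.score(X_test, y_test))
```

Increasing `n_symbols` at a fixed noise level in this toy setup reproduces the qualitative trend the paper reports: accuracy drops as modulation order grows for a given received power.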
Funding: Supported by the National Key Research and Development Program of China (2021YFC2500803) and the CAMS Innovation Fund for Medical Sciences (2021-I2M-1-056).
Abstract: BACKGROUND: Sepsis is one of the main causes of mortality in intensive care units (ICUs). Early prediction is critical for reducing injury. As approximately 36% of sepsis cases occur within 24 h after emergency department (ED) admission in the Medical Information Mart for Intensive Care (MIMIC-IV), a prediction system for the ED triage stage would be helpful. Previous methods such as the quick Sequential Organ Failure Assessment (qSOFA) are more suitable for screening than for prediction in the ED, so we aimed to find a lightweight, convenient prediction method through machine learning. METHODS: We accessed MIMIC-IV for sepsis patient data in the EDs. Our dataset comprised demographic information, vital signs, and synthetic features. Extreme Gradient Boosting (XGBoost) was used to predict the risk of developing sepsis within 24 h after ED admission, and SHapley Additive exPlanations (SHAP) was employed to provide a comprehensive interpretation of the model's results. Ten percent of the patients were randomly selected as the testing set, while the remaining patients were used for training with 10-fold cross-validation. RESULTS: For 10-fold cross-validation on 14,957 samples, we reached an accuracy of 84.1% ± 0.3% and an area under the receiver operating characteristic (ROC) curve of 0.92 ± 0.02. The model achieved similar performance on the testing set of 1,662 patients. SHAP values showed that the five most important features were acuity, arrival transportation, age, shock index, and respiratory rate. CONCLUSION: Machine learning models such as XGBoost may be used for sepsis prediction with only a small amount of data conveniently collected at the ED triage stage. This may help reduce workload in the ED and warn medical workers of the risk of sepsis in advance.
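A minimal sketch of the described pipeline (a held-out test split, XGBoost with 10-fold cross-validation, then SHAP interpretation) is given below on synthetic stand-in data. The real triage features (acuity, shock index, etc.) come from MIMIC-IV; all names, shapes, and hyper-parameters here are hypothetical.

```python
import numpy as np
import xgboost as xgb
import shap
from sklearn.model_selection import train_test_split, cross_val_score

# Synthetic stand-in for triage records: 5 hypothetical features, binary sepsis label.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

# Hold out 10% for testing, as in the paper.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=0)
model = xgb.XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")

# 10-fold cross-validation on the training portion.
scores = cross_val_score(model, X_train, y_train, cv=10, scoring="roc_auc")
print("CV AUC: %.3f +/- %.3f" % (scores.mean(), scores.std()))

# Fit on all training data, then rank features by mean absolute SHAP value.
model.fit(X_train, y_train)
shap_values = shap.TreeExplainer(model).shap_values(X_test)
print("mean |SHAP| per feature:", np.abs(shap_values).mean(axis=0))
```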
Funding: Supported by the Ministry of Defense, Government of Pakistan, under Project No. AHQ/95013/6/4/8/NASTP (ACP), titled "Development of ICT and Artificial Intelligence Based Precision Agriculture Systems Utilizing Dual-Use Aerospace Technologies-GREENAI".
Abstract: Embracing software product lines (SPLs) is pivotal in the dynamic landscape of contemporary software development. However, the flexibility and global distribution inherent in modern systems pose significant challenges to managing SPL variability, underscoring the critical importance of robust cybersecurity measures. This paper advocates leveraging machine learning (ML) to address variability management issues and fortify the security of SPLs. In the context of the broader special issue theme on innovative cybersecurity approaches, the proposed ML-based framework offers an interdisciplinary perspective, blending insights from computing, the social sciences, and business. Specifically, it employs ML for demand analysis, dynamic feature extraction, and enhanced feature selection in distributed settings, contributing to cyber-resilient ecosystems. Our experiments demonstrate the framework's superiority, emphasizing its potential to boost productivity and security in SPLs. As digital threats evolve, this research catalyzes interdisciplinary collaborations, aligning with the special issue's goal of breaking down academic barriers to strengthen digital ecosystems against sophisticated attacks while upholding ethics, privacy, and human values.
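The abstract does not detail its feature-selection algorithm; as one plausible, simplified reading of "enhanced feature selection", the sketch below ranks candidate SPL features by mutual information with a target label on synthetic data. Everything here, including the number of features, is a stand-in assumption.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Synthetic stand-in: 30 candidate configuration features, of which 6 carry
# signal about a security-relevant outcome.
X, y = make_classification(n_samples=500, n_features=30, n_informative=6, random_state=0)

# Keep the 6 features with the highest mutual information with the target.
selector = SelectKBest(mutual_info_classif, k=6).fit(X, y)
print("selected feature indices:", selector.get_support(indices=True))
```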
Abstract: Traditional Enterprise Resource Planning (ERP) systems built on relational databases can take weeks to deliver insights that advanced analytics can provide almost instantly. Advanced analytics examine the past and the future while capturing information about the present, giving companies the most accurate information for making the best decisions. Integrating machine learning (ML) into financial ERP systems offers several benefits, including increased accuracy, efficiency, and cost savings. ERP systems are also crucial in overseeing different aspects of Human Capital Management (HCM) in organizations. Staff performance is a central concern for management: in particular, ensuring that the right employees are assigned to the right tasks at the right time, training and qualifying them, building evaluation systems to follow up on their performance, and retaining talented workers. Predicting employee salaries correctly is likewise necessary for the efficient distribution of resources, retaining talent, and ensuring the success of the organization as a whole. Conventional ERP salary-forecasting methods typically rely on static reports that only show the system's current state, without analyzing employee data or providing recommendations. We designed and implemented a prototype that applies ML algorithms to Oracle EBS data to enhance employee evaluation using real-time data taken directly from the ERP system. Based on accuracy measurements, the Random Forest algorithm delivered the best performance, achieving an accuracy of 90% on the balanced dataset.
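As an illustrative stand-in for the described prototype, the sketch below trains a Random Forest classifier on synthetic HR-style records. The column names, the target flag, and the data itself are hypothetical; the paper's prototype reads real records from Oracle EBS instead of generating them.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Hypothetical HR features standing in for Oracle EBS employee records.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "years_experience": rng.integers(0, 30, 1000),
    "training_hours": rng.integers(0, 200, 1000),
    "grade": rng.integers(1, 10, 1000),
})
# Toy target: a "high performer" flag loosely tied to the features.
df["high_performer"] = ((df.years_experience / 30 + df.grade / 10
                         + rng.normal(scale=0.3, size=1000)) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    df.drop(columns="high_performer"), df.high_performer,
    test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In a production setting the DataFrame would be populated by a query against the ERP schema, and class balancing (as the paper's "balanced dataset" implies) would be applied before training.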