Journal Articles
4,038 articles found
1. Enhancing Classification Algorithm Recommendation in Automated Machine Learning: A Meta-Learning Approach Using Multivariate Sparse Group Lasso
Authors: Irfan Khan, Xianchao Zhang, Ramesh Kumar Ayyasamy, Saadat M. Alhashmi, Azizur Rahim. Computer Modeling in Engineering & Sciences, 2025, No. 2, pp. 1611-1636 (26 pages).
The rapid growth of machine learning (ML) across fields has intensified the challenge of selecting the right algorithm for specific tasks, known as the Algorithm Selection Problem (ASP). Traditional trial-and-error methods have become impractical due to their resource demands. Automated Machine Learning (AutoML) systems automate this process but often neglect the group structures and sparsity in meta-features, leading to inefficiencies in algorithm recommendations for classification tasks. This paper proposes a meta-learning approach using Multivariate Sparse Group Lasso (MSGL) to address these limitations. Our method models both within-group and across-group sparsity among meta-features to manage high-dimensional data and reduce multicollinearity across eight meta-feature groups. The Fast Iterative Shrinkage-Thresholding Algorithm (FISTA) with adaptive restart efficiently solves the non-smooth optimization problem. Empirical validation on 145 classification datasets with 17 classification algorithms shows that our meta-learning method outperforms four state-of-the-art approaches, achieving 77.18% classification accuracy, 86.07% recommendation accuracy, and 88.83% normalized discounted cumulative gain.
Keywords: meta-learning; machine learning; automated machine learning; classification; meta-features
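A minimal single-task sketch of the optimization machinery named in entry 1: FISTA with adaptive restart applied to a sparse group lasso penalty under a least-squares loss. The penalty weights, toy data, and group layout below are hypothetical illustrations, not the paper's multivariate setup.

```python
import numpy as np

def soft_threshold(x, t):
    # elementwise shrinkage: produces within-group sparsity
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_sparse_group(v, step, lam1, lam2, groups):
    # proximal operator of lam1*||w||_1 + lam2*sum_g ||w_g||_2
    w = soft_threshold(v, step * lam1)
    for g in groups:  # groupwise shrinkage: produces across-group sparsity
        norm = np.linalg.norm(w[g])
        if norm > 0.0:
            w[g] *= max(0.0, 1.0 - step * lam2 / norm)
    return w

def fista_sgl(X, y, lam1, lam2, groups, n_iter=500):
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n  # Lipschitz constant of the least-squares gradient
    w = np.zeros(p)
    z = w.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = X.T @ (X @ z - y) / n
        w_new = prox_sparse_group(z - grad / L, 1.0 / L, lam1, lam2, groups)
        if np.dot(z - w_new, w_new - w) > 0.0:  # adaptive restart: reset momentum on oscillation
            t = 1.0
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = w_new + ((t - 1.0) / t_next) * (w_new - w)
        w, t = w_new, t_next
    return w

# hypothetical toy problem: 8 groups of 5 meta-features, only the first group active
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 40))
y = X[:, :5] @ np.ones(5) + 0.1 * rng.normal(size=100)
groups = [np.arange(i, i + 5) for i in range(0, 40, 5)]
print(np.round(fista_sgl(X, y, lam1=0.05, lam2=0.1, groups=groups), 2))
```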
2. Congruent Feature Selection Method to Improve the Efficacy of Machine Learning-Based Classification in Medical Image Processing
Authors: Mohd Anjum, Naoufel Kraiem, Hong Min, Ashit Kumar Dutta, Yousef Ibrahim Daradkeh. Computer Modeling in Engineering & Sciences (SCIE, EI), 2025, No. 1, pp. 357-384 (28 pages).
Machine learning (ML) is increasingly applied to medical image processing with appropriate learning paradigms. These applications include analyzing images of various organs, such as the brain, lung, and eye, to identify specific flaws or diseases for diagnosis. The primary concern of ML applications is the precise selection of flexible image features for pattern detection and region classification. Most of the extracted image features are irrelevant and increase computation time. This article therefore uses an analytical learning paradigm to design a Congruent Feature Selection Method that selects the most relevant image features. The process trains the learning paradigm using similarity- and correlation-based features over different textural intensities and pixel distributions. Pixel similarity across distribution patterns with high indexes is recommended for disease diagnosis. The correlation based on intensity and distribution is then analyzed to improve feature selection congruency, and the most congruent pixels are sorted in descending order of selection, which identifies better regions than the distribution alone. The learning paradigm is then trained using intensity- and region-based similarity to maximize the chances of selection, improving the probability of feature selection regardless of textures and medical image patterns and thereby enhancing the performance of ML applications for medical image processing. The proposed method improves accuracy, precision, and training rate by 13.19%, 10.69%, and 11.06%, respectively, compared to other models on the selected dataset. Mean error and selection time are also reduced by 12.56% and 13.56%, respectively, compared to the same models and dataset.
Keywords: computer vision; feature selection; machine learning; region detection; texture analysis; image classification; medical images
3. Prediction of Shear Bond Strength of Asphalt Concrete Pavement Using Machine Learning Models and Grid Search Optimization Technique
Authors: Quynh-Anh Thi Bui, Dam Duc Nguyen, Hiep Van Le, Indra Prakash, Binh Thai Pham. Computer Modeling in Engineering & Sciences (SCIE, EI), 2025, No. 1, pp. 691-712 (22 pages).
Determination of the Shear Bond Strength (SBS) at the interlayer of double-layer asphalt concrete is crucial in flexible pavement structures. This study used three Machine Learning (ML) models, K-Nearest Neighbors (KNN), Extra Trees (ET), and Light Gradient Boosting Machine (LGBM), to predict SBS based on easily determinable input parameters. The Grid Search technique was employed for hyper-parameter tuning of the ML models, and cross-validation and learning curve analysis were used for training. The models were built on a database of 240 experimental results and three input variables: temperature, normal pressure, and tack coat rate. Model validation was performed using three statistical criteria: the coefficient of determination (R2), the Root Mean Square Error (RMSE), and the Mean Absolute Error (MAE). Additionally, SHAP (Shapley Additive exPlanations) analysis was used to validate the importance of the input variables in the prediction of the SBS. Results show that these models accurately predict SBS, with LGBM providing outstanding performance. SHAP analysis for LGBM indicates that temperature is the most influential factor on SBS. Consequently, the proposed ML models can quickly and accurately predict SBS between two layers of asphalt concrete, serving practical applications in flexible pavement structure design.
Keywords: shear bond; asphalt pavement; grid search; optimization; machine learning
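Entry 3's tuning workflow (LightGBM with exhaustive grid search and cross-validation) follows a standard scikit-learn pattern. The sketch below is a stand-in with synthetic data mimicking the three stated inputs; the parameter grid and data-generating rule are assumptions, not the paper's search space or database.

```python
import numpy as np
from lightgbm import LGBMRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(42)
# hypothetical stand-in for the 240 experiments: temperature, normal pressure, tack coat rate
X = rng.uniform(low=[10.0, 0.1, 0.0], high=[60.0, 0.7, 0.9], size=(240, 3))
y = 1.5 - 0.02 * X[:, 0] + 0.8 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0.0, 0.05, 240)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

# exhaustive grid search with 5-fold cross-validation over an assumed grid
grid = GridSearchCV(
    estimator=LGBMRegressor(random_state=42),
    param_grid={
        "n_estimators": [100, 300, 500],
        "max_depth": [3, 5, 7],
        "learning_rate": [0.05, 0.1],
    },
    scoring="neg_root_mean_squared_error",
    cv=5,
)
grid.fit(X_tr, y_tr)
print("best params:", grid.best_params_)
print("test R2:", grid.best_estimator_.score(X_te, y_te))  # LGBMRegressor.score returns R2
```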
4. High-throughput screening of CO₂ cycloaddition MOF catalyst with an explainable machine learning model
Authors: Xuefeng Bai, Yi Li, Yabo Xie, Qiancheng Chen, Xin Zhang, Jian-Rong Li. Green Energy & Environment (SCIE, EI, CAS), 2025, No. 1, pp. 132-138 (7 pages).
The high porosity and tunable chemical functionality of metal-organic frameworks (MOFs) make them a promising catalyst design platform. High-throughput screening of catalytic performance is feasible now that large MOF structure databases are available. In this study, we report a machine learning model for high-throughput screening of MOF catalysts for the CO₂ cycloaddition reaction. The descriptors for model training were judiciously chosen according to the reaction mechanism, which leads to a high accuracy of up to 97% with the 75% quantile of the training set as the classification criterion. The feature contributions were further evaluated with SHAP and PDP analysis to provide physical insight. Using the model, 12,415 hypothetical MOF structures and 100 reported MOFs were evaluated under 100 °C and 1 bar within one day, and 239 potentially efficient catalysts were discovered. Among them, MOF-76(Y) achieved the top experimental performance among reported MOFs, in good agreement with the prediction.
Keywords: metal-organic frameworks; high-throughput screening; machine learning; explainable model; CO₂ cycloaddition
5. Machine Learning Techniques in Predicting Hot Deformation Behavior of Metallic Materials
Authors: Petr Opela, Josef Walek, Jaromír Kopecek. Computer Modeling in Engineering & Sciences (SCIE, EI), 2025, No. 1, pp. 713-732 (20 pages).
In engineering practice, it is often necessary to determine functional relationships between dependent and independent variables. These relationships can be highly nonlinear, and classical regression approaches cannot always provide sufficiently reliable solutions. Machine Learning (ML) techniques, which offer advanced regression tools for complicated engineering problems, have therefore been developed and widely explored. This study investigates selected ML techniques to evaluate their suitability for modeling the hot deformation behavior of metallic materials. The ML-based regression methods of Artificial Neural Networks (ANNs), Support Vector Machine (SVM), Decision Tree Regression (DTR), and Gaussian Process Regression (GPR) are applied to mathematically describe hot flow stress curve datasets acquired experimentally for a medium-carbon steel. Although the GPR method had not previously been used for such a regression task, the results show that its performance is the most favorable and practically unrivaled; neither the ANN method nor the other studied ML techniques provide such precise results for the solved regression analysis.
Keywords: machine learning; Gaussian process regression; artificial neural networks; support vector machine; hot deformation behavior
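A minimal sketch of the Gaussian process regression route highlighted in entry 5, using scikit-learn. The flow-stress inputs (strain, strain rate, temperature) and the synthetic response are illustrative assumptions, not the paper's medium-carbon-steel dataset.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

rng = np.random.default_rng(7)
# hypothetical flow-stress samples: columns are strain, log10(strain rate), temperature (°C)
X = np.column_stack([
    rng.uniform(0.05, 0.9, 200),
    rng.uniform(-2.0, 1.0, 200),
    rng.uniform(900.0, 1200.0, 200),
])
# synthetic stress response (MPa) standing in for experimental flow curves
y = 120.0 * X[:, 0] ** 0.2 * 10 ** (0.1 * X[:, 1]) * np.exp(-0.002 * (X[:, 2] - 900.0))
y += rng.normal(0.0, 1.0, 200)

# anisotropic RBF kernel plus a noise term; hyperparameters are fitted inside .fit()
# by maximizing the log marginal likelihood
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.1, 0.5, 50.0]) + WhiteKernel(1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True, random_state=7).fit(X, y)

# GPR returns a predictive mean plus an uncertainty estimate for each query point
stress_mean, stress_std = gpr.predict(X[:5], return_std=True)
print(stress_mean, stress_std)
```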
6. Interpretable Machine Learning Method for Compressive Strength Prediction and Analysis of Pure Fly Ash-based Geopolymer Concrete
Authors: SHI Yuqiong, LI Jingyi, ZHANG Yang, LI Li. Journal of Wuhan University of Technology (Materials Science) (SCIE, EI, CAS), 2025, No. 1, pp. 65-78 (14 pages).
To study the characteristics of pure fly ash-based geopolymer concrete (PFGC) conveniently, we used a machine learning method that can quantify the perception of characteristics to predict its compressive strength. In this study, 505 groups of data were collected and a new database of the compressive strength of PFGC was constructed. To establish an accurate prediction model, five different types of machine learning networks were compared. All five models showed good compressive strength prediction performance on PFGC. Among them, the R2, MSE, RMSE, and MAE of the decision tree model (DT) are 0.99, 1.58, 1.25, and 0.25, respectively, while those of the random forest model (RF) are 0.97, 5.17, 2.27, and 1.38, respectively; both models have high prediction accuracy and outstanding generalization ability. To enhance the interpretability of model decision-making, we used importance ranking to obtain each model's perception of 13 variables, including the chemical composition of fly ash (SiO₂/Al₂O₃, Si/Al), the ratio of alkaline liquid to binder, curing temperature, curing duration inside the oven, fly ash dosage, fine aggregate dosage, coarse aggregate dosage, extra water dosage, and sodium hydroxide dosage. Curing temperature, specimen age, and curing duration inside the oven have the greatest influence on the prediction results, indicating that curing conditions influence the compressive strength of PFGC more prominently than they do ordinary Portland cement concrete. The importance of the curing conditions of PFGC even exceeds that of the concrete mix proportion, due to the low reactivity of pure fly ash.
Keywords: machine learning; pure fly ash; geopolymer; compressive strength; feature perception
7. Revolutionizing diabetic retinopathy screening and management: The role of artificial intelligence and machine learning
Authors: Mona Mohamed Ibrahim Abdalla, Jaiprakash Mohanraj. World Journal of Clinical Cases (SCIE), 2025, No. 5, pp. 1-12 (12 pages).
Diabetic retinopathy (DR) remains a leading cause of vision impairment and blindness among individuals with diabetes, necessitating innovative approaches to screening and management. This editorial explores the transformative potential of artificial intelligence (AI) and machine learning (ML) in revolutionizing DR care. AI and ML technologies have demonstrated remarkable advancements in enhancing the accuracy, efficiency, and accessibility of DR screening, helping to overcome barriers to early detection. These technologies leverage vast datasets to identify patterns and predict disease progression with unprecedented precision, enabling clinicians to make more informed decisions. Furthermore, AI-driven solutions hold promise for personalizing DR management, incorporating predictive analytics to tailor interventions and optimize treatment pathways. By automating routine tasks, AI can reduce the burden on healthcare providers, allowing resources to be focused on complex patient care. This review evaluates current advancements and applications of AI and ML in DR screening and discusses the potential of these technologies for developing personalized management strategies, with the ultimate aims of improving patient outcomes and reducing the global burden of DR. The integration of AI and ML in DR care represents a paradigm shift, offering a glimpse into the future of ophthalmic healthcare.
Keywords: diabetic retinopathy; artificial intelligence; machine learning; screening; management; predictive analytics; personalized medicine
8. Machine learning in solid organ transplantation: Charting the evolving landscape
Authors: Badi Rawashdeh, Haneen Al-abdallat, Emre Arpali, Beje Thomas, Ty B Dunn, Matthew Cooper. World Journal of Transplantation, 2025, No. 1, pp. 165-177 (13 pages).
BACKGROUND Machine learning (ML), a major branch of artificial intelligence, has not only demonstrated the potential to significantly improve numerous sectors of healthcare but has also made significant contributions to the field of solid organ transplantation. ML provides revolutionary opportunities in areas such as donor-recipient matching, post-transplant monitoring, and patient care by automatically analyzing large amounts of data, identifying patterns, and forecasting outcomes. AIM To conduct a comprehensive bibliometric analysis of publications on the use of ML in transplantation to understand current research trends and their implications. METHODS On July 18, a thorough search strategy was used with the Web of Science database, using ML- and transplantation-related keywords. With the aid of the VOSviewer application, the identified articles were analyzed for bibliometric variables, including publication counts, citation counts, and contributing countries and institutions. RESULTS Of the 529 articles initially identified, 427 were deemed relevant for bibliometric analysis. A surge in publications was observed over the last four years, especially after 2018, signifying growing interest in this area. With 209 publications, the United States emerged as the top contributor. Notably, the Journal of Heart and Lung Transplantation and the American Journal of Transplantation published the highest numbers of relevant articles. Frequent keyword searches revealed that patient survival, mortality, outcomes, allocation, and risk assessment were significant themes of focus. CONCLUSION The growing body of pertinent publications highlights ML's expanding presence in the field of solid organ transplantation. This bibliometric analysis underscores the growing importance of ML in transplant research and its exciting potential to change medical practices and enhance patient outcomes. Encouraging collaboration among significant contributors could fast-track advancements in this interdisciplinary domain.
Keywords: machine learning; artificial intelligence; solid organ transplantation; bibliometric analysis
9. Machine learning model using immune indicators to predict outcomes in early liver cancer
Authors: Yi Zhang, Ke Shi, Ying Feng, Xian-Bo Wang. World Journal of Gastroenterology, 2025, No. 5, pp. 43-56 (14 pages).
BACKGROUND Patients with early-stage hepatocellular carcinoma (HCC) generally have good survival rates following surgical resection. However, a subset of these patients experience recurrence within five years post-surgery. AIM To develop predictive models utilizing machine learning (ML) methods to detect early-stage patients at a high risk of mortality. METHODS Eight hundred and eight patients with HCC at Beijing Ditan Hospital were randomly allocated to training and validation cohorts in a 2:1 ratio. Prognostic models were generated using random survival forests and artificial neural networks (ANNs). These ML models were compared with classic HCC scoring systems. A decision-tree model was established to validate the contribution of immune-inflammatory indicators to the long-term outlook of patients with early-stage HCC. RESULTS Immune-inflammatory markers, albumin-bilirubin scores, alpha-fetoprotein, tumor size, and International Normalized Ratio were closely associated with 5-year survival rates. Among the various predictive models, the ANN model generated from these indicators through ML algorithms exhibited superior performance, with a 5-year area under the curve (AUC) of 0.85 (95%CI: 0.82-0.88). In the validation cohort, the 5-year AUC was 0.82 (95%CI: 0.74-0.85). According to the ANN model, patients were classified into high-risk and low-risk groups, with an overall survival hazard ratio of 7.98 (95%CI: 5.85-10.93, P<0.0001) between the two cohorts.
Keywords: hepatocellular carcinoma; inflammation; machine learning; prognosis; artificial neural networks; immune biomarkers
10. Machine learning and deep learning to improve prevention of anastomotic leak after rectal cancer surgery
Authors: Francesco Celotto, Quoc R Bao, Giulia Capelli, Gaya Spolverato, Andrew A Gumbs. World Journal of Gastrointestinal Surgery, 2025, No. 1, pp. 25-31 (7 pages).
Anastomotic leakage (AL) is a significant complication following rectal cancer surgery, adversely affecting both quality of life and oncological outcomes. Recent advancements in artificial intelligence (AI), particularly machine learning and deep learning, offer promising avenues for predicting and preventing AL. These technologies can analyze extensive clinical datasets to identify preoperative and perioperative risk factors such as malnutrition, body composition, and radiological features. AI-based models have demonstrated superior predictive power compared to traditional statistical methods, potentially guiding clinical decision-making and improving patient outcomes. Additionally, AI can provide surgeons with intraoperative feedback on blood supply and anatomical dissection planes, minimizing the risk of intraoperative complications and reducing the likelihood of AL development.
Keywords: anastomotic leak; rectal cancer; surgery; machine learning; deep learning
11. A Machine Learning-Based Observational Constraint Correction Method for Seasonal Precipitation Prediction
Authors: Bofei ZHANG, Haipeng YU, Zeyong HU, Ping YUE, Zunye TANG, Hongyu LUO, Guantian WANG, Shanling CHENG. Advances in Atmospheric Sciences, 2025, No. 1, pp. 36-52 (17 pages).
Seasonal precipitation has always been a key focus of climate prediction. As a combined dynamic-statistical method, the existing observational constraint correction establishes a regression relationship between numerical model outputs and historical observations, which can partly predict seasonal precipitation. However, solving a nonlinear problem through linear regression introduces significant bias. This study implements a nonlinear optimization of an existing observational constraint correction model using the Light Gradient Boosting Machine (LightGBM) machine learning algorithm, based on output from the Beijing National Climate Center Climate System Model (BCC-CSM) and station observations, to improve the prediction of summer precipitation in China. The model was trained using a rolling approach, and LightGBM outperformed Linear Regression (LR), Extreme Gradient Boosting (XGBoost), and Categorical Boosting (CatBoost). After parameter tuning to optimize the machine learning model and predicting future summer precipitation using eight different predictors in BCC-CSM, the mean Anomaly Correlation Coefficient (ACC) score for the 2019-22 summer precipitation predictions was 0.17 and the mean Prediction Score (PS) reached 74. The PS score improved by 7.87% and 6.63% compared with BCC-CSM and the linear observational constraint approach, respectively. The observational constraint correction prediction strategy with LightGBM significantly and stably improved the prediction of summer precipitation in China compared to the previous linear observational constraint solution, providing a reference for flood control and drought relief during the flood season (summer) in China.
Keywords: observational constraint; LightGBM; seasonal prediction; summer precipitation; machine learning
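For reference, the Anomaly Correlation Coefficient reported in entry 11 is commonly computed as the correlation between forecast and observed anomalies relative to a shared climatology. The sketch below is a generic implementation; the station count and fields are invented for illustration.

```python
import numpy as np

def anomaly_correlation(forecast, observed, climatology):
    """ACC: correlation of forecast vs. observed anomalies w.r.t. a common climatology."""
    fa = forecast - climatology
    oa = observed - climatology
    return np.sum(fa * oa) / np.sqrt(np.sum(fa ** 2) * np.sum(oa ** 2))

# hypothetical example: summer precipitation (mm) at 160 stations
rng = np.random.default_rng(1)
clim = rng.uniform(100.0, 400.0, 160)                            # climatological mean
obs = clim + rng.normal(0.0, 50.0, 160)                          # observed field
fcst = clim + 0.4 * (obs - clim) + rng.normal(0.0, 40.0, 160)    # corrected model output
print(round(anomaly_correlation(fcst, obs, clim), 3))
```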
12. Modeling of Spring Phenology of Boreal Forest by Coupling Machine Learning and Diurnal Temperature Indicators
Authors: DENG Guorong, ZHANG Hongyan, HONG Ying, GUO Xiaoyi, YI Zhihua, EHSAN BINIYAZ. Chinese Geographical Science, 2025, No. 1, pp. 38-54 (17 pages).
The roles of diurnal temperature in providing heat accumulation and meeting chilling requirements for vegetation spring phenology differ. Although previous studies have established a stronger correlation between leaf onset and diurnal temperature than between leaf onset and average temperature, research on modeling spring phenology with diurnal temperature indicators remains limited. In this study, we confirmed the sensitivity of the start of the growing season (SOS) to diurnal and average temperature in boreal forest. SOS was estimated using K-Nearest Neighbor Regression (KNR-TDN), Random Forest Regression (RFR-TDN), eXtreme Gradient Boosting (XGB-TDN), and Light Gradient Boosting Machine (LightGBM-TDN) models driven by diurnal temperature indicators during 1982-2015, and SOS was projected from 2015 to 2100 based on the Coupled Model Intercomparison Project Phase 6 (CMIP6) climate scenario datasets. The sensitivity of boreal forest SOS to daytime temperature is greater than its sensitivity to average or nighttime temperature. The LightGBM-TDN model performed best across all vegetation types, exhibiting the lowest RMSE and bias compared to the KNR-TDN, RFR-TDN, and XGB-TDN models. Incorporating diurnal temperature indicators instead of relying only on average temperature indicators improved the accuracy of the simulation. Furthermore, preseason accumulated daytime temperature, daytime temperature, and snow cover end date emerged as significant drivers of the SOS simulation in the study area. The simulation results based on the LightGBM-TDN model exhibit a trend of advancing SOS followed by stabilization under future climate scenarios. This study underscores the potential of diurnal temperature indicators as a viable alternative to average temperature indicators in driving spring phenology models, offering a promising new method for simulating spring phenology.
Keywords: spring phenology; diurnal temperature; machine learning; future climate scenarios; boreal forest
13. Machine learning based damage state identification: A novel perspective on fragility analysis for nuclear power plants considering structural uncertainties
Authors: Zheng Zhi, Wang Yong, Pan Xiaolan, Ji Duofa. Earthquake Engineering and Engineering Vibration, 2025, No. 1, pp. 201-222 (22 pages).
Seismic fragility analysis (SFA) is an effective probabilistic approach for evaluating seismic fragility, and it involves various sources of uncertainty. A nuclear power plant (NPP) is an extremely important infrastructure system that contains many structural uncertainties due to construction issues or structural deterioration during service, and simulating the effects of these uncertainties is costly and time-consuming. A novel damage state-based approach to SFA for NPPs considering structural uncertainties is proposed and examined. The results suggest that considering structural uncertainties is essential in assessing the fragility of the NPP structure, and their impact tends to increase with the damage state. Machine learning (ML) is found to be superior for high-precision damage state identification of the NPP, reducing the time spent on nonlinear time-history analysis (NLTHA), and can be applied in damage state-based SFA. The impact of the various sources of uncertainty is also investigated through sensitivity analysis: the Sobol and SHapley Additive exPlanations (SHAP) methods complement each other and can quantify seismic and structural uncertainties simultaneously, along with the interaction effect of each parameter.
Keywords: seismic fragility analysis; damage state; structural uncertainties; machine learning; sensitivity analysis
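The Sobol half of entry 13's sensitivity analysis can be reproduced with the SALib package. In the sketch below, the parameter names, bounds, and the cheap stand-in response function are hypothetical, replacing the expensive nonlinear time-history analysis.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# hypothetical uncertain parameters of a structural model
problem = {
    "num_vars": 3,
    "names": ["concrete_strength", "damping_ratio", "steel_yield"],
    "bounds": [[30.0, 50.0], [0.02, 0.07], [300.0, 500.0]],
}

# Saltelli sampling: N * (2D + 2) model evaluations for D variables
param_values = saltelli.sample(problem, 1024)

def demand_model(x):
    # cheap stand-in for the structural demand from time-history analysis
    return 0.8 * x[:, 0] + 120.0 * x[:, 1] + 0.05 * x[:, 2] + 0.01 * x[:, 0] * x[:, 2]

Y = demand_model(param_values)
Si = sobol.analyze(problem, Y)
print("first-order indices:", Si["S1"])
print("total-order indices:", Si["ST"])
```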
14. Machine learning prediction of hepatic encephalopathy for long-term survival after transjugular intrahepatic portosystemic shunt in acute variceal bleeding
Authors: De-Jia Liu, Li-Xuan Jia, Feng-Xia Zeng, Wei-Xiong Zeng, Geng-Geng Qin, Qi-Feng Peng, Qing Tan, Hui Zeng, Zhong-Yue Ou, Li-Zi Kun, Jian-Bo Zhao, Wei-Guo Chen. World Journal of Gastroenterology, 2025, No. 4, pp. 59-71 (13 pages).
BACKGROUND Transjugular intrahepatic portosystemic shunt (TIPS) is an effective intervention for managing complications of portal hypertension, particularly acute variceal bleeding (AVB). While effective in reducing portal pressure and preventing rebleeding, TIPS is associated with a considerable risk of overt hepatic encephalopathy (OHE), a complication that significantly elevates mortality rates. AIM To develop a machine learning (ML) model to predict OHE occurrence post-TIPS in patients with AVB using a 5-year dataset. METHODS This retrospective single-center study included 218 patients with AVB who underwent TIPS. The dataset was divided into training (70%) and testing (30%) sets. Critical features were identified using embedded methods and recursive feature elimination. Three ML algorithms, random forest, extreme gradient boosting, and logistic regression, were validated via 10-fold cross-validation. SHapley Additive exPlanations (SHAP) analysis was employed to interpret the model's predictions. Survival analysis was conducted using Kaplan-Meier curves and stepwise Cox regression to compare overall survival (OS) between patients with and without OHE. RESULTS The median OS of the study cohort was 47.83±22.95 months. Among the models evaluated, logistic regression demonstrated the highest performance, with an area under the curve (AUC) of 0.825. Key predictors identified were Child-Pugh score, age, and portal vein thrombosis. Kaplan-Meier analysis revealed that patients without OHE had significantly longer OS (P=0.005). The 5-year survival rate was 78.4%, with an OHE incidence of 15.1%. Both actual OHE status and predicted OHE value were significant predictors in each Cox model, with model-predicted OHE achieving an AUC of 88.1 in survival prediction. CONCLUSION The ML model accurately predicts post-TIPS OHE and outperforms traditional models, supporting its use to improve outcomes in patients with AVB.
Keywords: transjugular intrahepatic portosystemic shunt; acute variceal bleeding; overt hepatic encephalopathy; machine learning; logistic regression
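Entry 14's model-selection step (candidate classifiers validated via 10-fold cross-validation on ROC AUC) maps onto a standard scikit-learn pattern, sketched below on synthetic data. The three columns mimic the reported key predictors (Child-Pugh score, age, portal vein thrombosis); the outcome-generating model and coefficients are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
n = 218
# hypothetical stand-ins for the key predictors
X = np.column_stack([
    rng.integers(5, 14, n),        # Child-Pugh score
    rng.normal(55.0, 12.0, n),     # age (years)
    rng.integers(0, 2, n),         # portal vein thrombosis (0/1)
]).astype(float)
# assumed logistic data-generating process for the OHE outcome
logit = -6.0 + 0.45 * X[:, 0] + 0.02 * X[:, 1] + 0.8 * X[:, 2]
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

models = {
    "logistic_regression": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "random_forest": RandomForestClassifier(n_estimators=300, random_state=5),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=10, scoring="roc_auc")  # 10-fold CV on ROC AUC
    print(f"{name}: mean AUC = {auc.mean():.3f}")
```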
15. Predicting Diabetic Retinopathy Using a Machine Learning Approach Informed by Whole-Exome Sequencing Studies
Authors: Chongyang She, Wenying Fan, Yunyun Li, Yong Tao, Zufei Li. Biomedical and Environmental Sciences, 2025, No. 1, pp. 67-78 (12 pages).
Objective To establish and validate a novel diabetic retinopathy (DR) risk-prediction model using a whole-exome sequencing (WES)-based machine learning (ML) method. Methods WES was performed to identify potential single nucleotide polymorphism (SNP) or mutation sites in a DR pedigree comprising 10 members. A prediction model was established and validated in a cohort of 420 type 2 diabetic patients based on both genetic and demographic features, and the contribution of each feature was assessed using SHapley Additive exPlanations analysis. The efficacies of the models with and without the SNPs were compared. Results WES revealed that seven SNPs/mutations (rs116911833 in TRIM7, 1997T>C in LRBA, 1643T>C in PRMT10, rs117858678 in C9orf152, rs201922794 in CLDN25, rs146694895 in SH3GLB2, and rs201407189 in FANCC) were associated with DR. Notably, the model including rs146694895 and rs201407189 achieved better performance in predicting DR (accuracy: 80.2%; sensitivity: 83.3%; specificity: 76.7%; area under the receiver operating characteristic curve [AUC]: 80.0%) than the model without these SNPs (accuracy: 79.4%; sensitivity: 80.3%; specificity: 78.3%; AUC: 79.3%). Conclusion Novel SNP sites associated with DR were identified in the DR pedigree, and inclusion of rs146694895 and rs201407189 significantly enhanced the performance of the ML-based DR prediction model.
Keywords: machine learning; diabetic retinopathy; whole-exome sequencing; type 2 diabetes mellitus
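Per-feature contributions of the kind assessed in entry 15 are commonly obtained with the shap package. A minimal sketch on a synthetic genotype-plus-demographics table follows; the column names, effect sizes, and the gradient-boosted model are illustrative assumptions rather than the study's pipeline.

```python
import numpy as np
import pandas as pd
import shap
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(3)
n = 420
# hypothetical features: two SNP genotypes (minor-allele counts 0/1/2) plus demographics
X = pd.DataFrame({
    "rs146694895": rng.integers(0, 3, n),
    "rs201407189": rng.integers(0, 3, n),
    "age": rng.normal(58.0, 10.0, n),
    "diabetes_duration": rng.uniform(1.0, 25.0, n),
})
# assumed logistic process generating the DR label
logit = -2.0 + 0.6 * X["rs146694895"] + 0.5 * X["rs201407189"] + 0.05 * X["diabetes_duration"]
y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

model = GradientBoostingClassifier(random_state=3).fit(X, y)

# TreeExplainer gives exact SHAP values for tree ensembles
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
print(np.abs(shap_values).mean(axis=0))  # mean |SHAP| per feature = global importance
shap.summary_plot(shap_values, X)        # beeswarm plot of per-sample contributions
```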
16. Improving performance of screening MM/PBSA in protein-ligand interactions via machine learning
Authors: Yuan-Qiang Chen, Yao Xu, Yu-Qiang Ma, Hong-Ming Ding. Chinese Physics B, 2025, No. 1, pp. 486-496 (11 pages).
Accurately estimating protein-ligand binding free energy is crucial for drug design and biophysics, yet remains a challenging task. In this study, we applied the screening molecular mechanics/Poisson-Boltzmann surface area (MM/PBSA) method in combination with various machine learning techniques to compute the binding free energies of protein-ligand interactions. Our results demonstrate that machine learning outperforms direct screening MM/PBSA calculations in predicting protein-ligand binding free energies. Notably, the random forest (RF) method exhibited the best predictive performance, with a Pearson correlation coefficient (rp) of 0.702 and a mean absolute error (MAE) of 1.379 kcal/mol. Furthermore, we analyzed feature importance rankings in the gradient boosting (GB), adaptive boosting (AdaBoost), and RF methods, and found that feature selection significantly impacted predictive performance. In particular, molecular weight (MW) and van der Waals (VDW) energies played a decisive role in the predictions. Overall, this study highlights the potential of combining machine learning methods with screening MM/PBSA for accurately predicting binding free energies in biosystems.
Keywords: molecular mechanics/Poisson-Boltzmann surface area (MM/PBSA); binding free energy; machine learning; protein-ligand interaction
17. Database of ternary amorphous alloys based on machine learning
Authors: Xuhe Gong, Ran Li, Ruijuan Xiao, Tao Zhang, Hong Li. Chinese Physics B, 2025, No. 1, pp. 129-133 (5 pages).
The unique long-range disordered atomic arrangement inherent in amorphous materials endows them with a range of superior properties, rendering them highly promising for applications in catalysis, medicine, and battery technology, among other fields. Since not all materials can be synthesized into an amorphous structure, the composition design of amorphous materials holds significant importance. Machine learning offers a valuable alternative to traditional trial-and-error methods by predicting properties from experimental data, thus providing efficient guidance in material design. In this study, we develop a machine learning workflow to predict the critical casting diameter, glass transition temperature, and Young's modulus for 45 reported ternary amorphous alloy systems. The predicted results have been organized into a database, enabling direct retrieval of predicted values based on compositional information. Applications are also demonstrated, including screening the high glass-forming-ability region of a specified system, screening systems against multiple property targets, and iteratively searching for high glass-forming-ability regions. By utilizing machine learning predictions, researchers can effectively narrow the experimental scope and expedite the exploration of compositions.
Keywords: amorphous alloys; machine learning; database
18. Machine learning for predicting the outcome of terminal ballistics events (cited by 2)
Authors: Shannon Ryan, Neeraj Mohan Sushma, Arun Kumar AV, Julian Berk, Tahrima Hashem, Santu Rana, Svetha Venkatesh. Defence Technology (SCIE, EI, CAS, CSCD), 2024, No. 1, pp. 14-26 (13 pages).
Machine learning (ML) is well suited to the prediction of high-complexity, high-dimensional problems such as those encountered in terminal ballistics. We evaluate the performance of four popular ML-based regression models, extreme gradient boosting (XGBoost), artificial neural network (ANN), support vector regression (SVR), and Gaussian process regression (GP), on two common terminal ballistics problems: (a) predicting the V50 ballistic limit of monolithic metallic armour impacted by small and medium calibre projectiles and fragments, and (b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness. To achieve this we utilise two datasets, each consisting of approximately 1000 samples, collated from public release sources. We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data and diverge when extrapolating outside this range. Although extrapolation is not advisable for ML-based regression models, for applications such as lethality/survivability analysis such capability is required. To circumvent this, we implement expert knowledge and physics-based models via enforced monotonicity, as a Gaussian prior mean, and through a modified loss function. The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models, providing the ability to accurately fit experimental data when available and to revert to the physics-based model when not. The resulting models demonstrate high levels of predictive accuracy over a very wide range of projectile types, target materials and thicknesses, and impact conditions, significantly more diverse than is achievable with any existing analytical approach. Compared with numerical analysis tools such as finite element solvers, the ML models run orders of magnitude faster. We provide general guidelines throughout for the development, application, and reporting of ML models in terminal ballistics problems.
Keywords: machine learning; artificial intelligence; physics-informed machine learning; terminal ballistics; armour
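One physics-informing mechanism named in entry 18, enforced monotonicity, is natively supported by gradient-boosting libraries. The sketch below shows it with XGBoost on synthetic penetration data; the feature set, constraint signs, and data-generating rule are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(9)
n = 1000
# hypothetical features: impact velocity (m/s), target thickness (mm), obliquity (deg)
X = np.column_stack([
    rng.uniform(300.0, 1800.0, n),
    rng.uniform(5.0, 50.0, n),
    rng.uniform(0.0, 60.0, n),
])
# synthetic penetration depth that rises with velocity and falls with thickness
y = 0.04 * X[:, 0] - 0.5 * X[:, 1] - 0.1 * X[:, 2] + rng.normal(0.0, 2.0, n)

# encode expert knowledge as monotonicity constraints: depth non-decreasing in
# velocity (+1), non-increasing in thickness (-1), unconstrained in obliquity (0)
model = XGBRegressor(
    n_estimators=400,
    max_depth=4,
    learning_rate=0.05,
    monotone_constraints=(1, -1, 0),
)
model.fit(X, y)
```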
19. Advancements in machine learning for material design and process optimization in the field of additive manufacturing
Authors: Hao-ran Zhou, Hao Yang, Huai-qian Li, Ying-chun Ma, Sen Yu, Jian Shi, Jing-chang Cheng, Peng Gao, Bo Yu, Zhi-quan Miao, Yan-peng Wei. China Foundry (SCIE, EI, CAS, CSCD), 2024, No. 2, pp. 101-115 (15 pages).
Additive manufacturing technology is highly regarded for its advantages, such as high precision and the ability to address complex geometric challenges. However, the development of additive manufacturing processes is constrained by issues like unclear fundamental principles, complex experimental cycles, and high costs. Machine learning, as a novel artificial intelligence technology, has the potential to engage deeply in the development of additive manufacturing processes, assisting engineers in learning and developing new techniques. This paper provides a comprehensive overview of the research and applications of machine learning in the field of additive manufacturing, particularly in model design and process development. It first introduces the background and significance of machine learning-assisted design in additive manufacturing processes, then delves into the application of machine learning in additive manufacturing with a focus on model design and process guidance, and finally summarizes and forecasts the development trends of machine learning technology in the field of additive manufacturing.
Keywords: additive manufacturing; machine learning; material design; process optimization; intersection of disciplines; embedded machine learning
20. Multimodal Machine Learning Guides Low Carbon Aeration Strategies in Urban Wastewater Treatment
Authors: Hong-Cheng Wang, Yu-Qi Wang, Xu Wang, Wan-Xin Yin, Ting-Chao Yu, Chen-Hao Xue, Ai-Jie Wang. Engineering (SCIE, EI, CAS, CSCD), 2024, No. 5, pp. 51-62 (12 pages).
The potential for reducing greenhouse gas (GHG) emissions and energy consumption in wastewater treatment can be realized through intelligent control, with machine learning (ML) and multimodality emerging as a promising solution. Here, we introduce an ML technique based on multimodal strategies, focusing specifically on intelligent aeration control in wastewater treatment plants (WWTPs). The generalization of the multimodal strategy is demonstrated on eight ML models. The results demonstrate that this multimodal strategy significantly enhances model indicators for ML in environmental science and the efficiency of aeration control, exhibiting exceptional performance and interpretability. Integrating random forest with visual models achieves the highest accuracy in forecasting aeration quantity among the multimodal models, with a mean absolute percentage error of 4.4% and a coefficient of determination of 0.948. Practical testing in a full-scale plant reveals that the multimodal model can reduce operating costs by 19.8% compared to traditional fuzzy control methods. The potential application of these strategies in critical water science domains is discussed. To foster accessibility and promote widespread adoption, the multimodal ML models are freely available on GitHub, eliminating technical barriers and encouraging the application of artificial intelligence in urban wastewater treatment.
Keywords: wastewater treatment; multimodal machine learning; deep learning; aeration control; interpretable machine learning
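Entry 20's pairing of a random forest with visual models can be illustrated with a simple early-fusion sketch: tabular sensor readings are concatenated with a precomputed image-embedding vector before regression. Everything below (sensor names, embedding size, target rule) is a hypothetical stand-in, not the plant's data or the paper's released models.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_percentage_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(11)
n = 2000
# hypothetical tabular modality: process sensor readings
tabular = np.column_stack([
    rng.uniform(2.0, 8.0, n),      # influent organic-load proxy
    rng.uniform(0.5, 4.0, n),      # ammonia sensor
    rng.uniform(10.0, 25.0, n),    # water temperature (°C)
])
# hypothetical visual modality: embedding vector precomputed by a CNN over tank images
image_embed = rng.normal(size=(n, 16))
# synthetic aeration-quantity target depending on both modalities
y = 50.0 + 8.0 * tabular[:, 0] + 12.0 * tabular[:, 1] + 5.0 * image_embed[:, 0]
y += rng.normal(0.0, 3.0, n)

# early fusion: concatenate modalities into one feature matrix
X = np.hstack([tabular, image_embed])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=11)

rf = RandomForestRegressor(n_estimators=300, random_state=11).fit(X_tr, y_tr)
pred = rf.predict(X_te)
print(f"MAPE = {mean_absolute_percentage_error(y_te, pred):.3f}, R2 = {r2_score(y_te, pred):.3f}")
```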