Journal Articles
6,533 articles found
Landslide susceptibility prediction using slope unit-based machine learning models considering the heterogeneity of conditioning factors (Cited by: 3)
1
Authors: Zhilu Chang, Filippo Catani, Faming Huang, Gengzhe Liu, Sansar Raj Meena, Jinsong Huang, Chuangbing Zhou. Journal of Rock Mechanics and Geotechnical Engineering (SCIE, CSCD), 2023(5): 1127-1143, 17 pages.
To perform landslide susceptibility prediction (LSP), it is important to select appropriate mapping units and landslide-related conditioning factors. The efficient and automatic multi-scale segmentation (MSS) method proposed by the authors promotes the application of slope units. However, LSP modeling based on these slope units has not been performed. Moreover, the heterogeneity of conditioning factors within slope units is neglected, leading to incomplete input variables for LSP modeling. In this study, the slope units extracted by the MSS method are used to construct LSP models, and the heterogeneity of conditioning factors is represented by the internal variations of conditioning factors within each slope unit using the descriptive statistics of mean, standard deviation and range. Thus, slope unit-based machine learning models considering internal variations of conditioning factors (variant Slope-machine learning) are proposed. Chongyi County is selected as the case study and is divided into 53,055 slope units. Fifteen original slope unit-based conditioning factors are expanded to 38 slope unit-based conditioning factors by considering their internal variations. Random forest (RF) and multi-layer perceptron (MLP) machine learning models are used to construct variant Slope-RF and Slope-MLP models. Meanwhile, the Slope-RF and Slope-MLP models without considering the internal variations of conditioning factors, and conventional grid unit-based machine learning (Grid-RF and Grid-MLP) models, are built for comparison through LSP performance assessments. Results show that the variant Slope-machine learning models achieve higher LSP performance than the Slope-machine learning models, and that the LSP results of the variant Slope-machine learning models have stronger directivity and greater practical applicability than those of the Grid-machine learning models. It is concluded that slope units extracted by the MSS method are appropriate for LSP modeling, and that the heterogeneity of conditioning factors within slope units more comprehensively reflects the relationships between conditioning factors and landslides. The research results provide an important reference for land use planning and landslide prevention.
Keywords: Landslide susceptibility prediction (LSP); Slope unit; Multi-scale segmentation method (MSS); Heterogeneity of conditioning factors; Machine learning models
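As a minimal sketch of the feature-expansion step described above, the snippet below aggregates grid-cell conditioning factors into per-slope-unit mean, standard deviation and range features and fits RF and MLP classifiers; the column names and synthetic data are illustrative assumptions, not material from the paper.

```python
# Sketch: expand slope-unit conditioning factors with mean/std/range statistics,
# then fit RF and MLP classifiers (hypothetical column names and synthetic data).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_cells = 5000
cells = pd.DataFrame({
    "unit_id": rng.integers(0, 300, n_cells),      # slope-unit ID of each grid cell
    "slope": rng.uniform(0, 60, n_cells),          # example conditioning factors
    "elevation": rng.uniform(100, 1500, n_cells),
    "ndvi": rng.uniform(-1, 1, n_cells),
})

def value_range(s):
    return s.max() - s.min()

# Mean, standard deviation and range of each factor within every slope unit.
agg = cells.groupby("unit_id").agg(["mean", "std", value_range])
agg.columns = [f"{factor}_{stat}" for factor, stat in agg.columns]
agg = agg.fillna(0.0)                              # guard against single-cell units

# Hypothetical landslide labels per slope unit (1 = landslide present).
y = rng.integers(0, 2, len(agg))
X_tr, X_te, y_tr, y_te = train_test_split(agg.values, y, test_size=0.3, random_state=0)

for name, model in [("Slope-RF", RandomForestClassifier(n_estimators=300, random_state=0)),
                    ("Slope-MLP", MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=1000))]:
    model.fit(X_tr, y_tr)
    print(name, "AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
```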
Significant risk factors for intensive care unit-acquired weakness: A processing strategy based on repeated machine learning (Cited by: 5)
2
Authors: Ling Wang, Deng-Yan Long. World Journal of Clinical Cases (SCIE), 2024(7): 1235-1242, 8 pages.
BACKGROUND: Intensive care unit-acquired weakness (ICU-AW) is a common complication that significantly impacts the patient's recovery process and can even lead to adverse outcomes. Currently, there is a lack of effective preventive measures. AIM: To identify significant risk factors for ICU-AW through iterative machine learning techniques and offer recommendations for its prevention and treatment. METHODS: Patients were categorized into ICU-AW and non-ICU-AW groups on the 14th day post-ICU admission. Relevant data from the initial 14 days of ICU stay, such as age, comorbidities, sedative dosage, vasopressor dosage, duration of mechanical ventilation, length of ICU stay, and rehabilitation therapy, were gathered. The relationships between these variables and ICU-AW were examined. Utilizing iterative machine learning techniques, a multilayer perceptron neural network model was developed, and its predictive performance for ICU-AW was assessed using the receiver operating characteristic curve. RESULTS: Within the ICU-AW group, age, duration of mechanical ventilation, lorazepam dosage, adrenaline dosage, and length of ICU stay were significantly higher than in the non-ICU-AW group. Additionally, the ratios of sepsis, multiple organ dysfunction syndrome, hypoalbuminemia, acute heart failure, respiratory failure, acute kidney injury, anemia, stress-related gastrointestinal bleeding, shock, hypertension, coronary artery disease, malignant tumors, and rehabilitation therapy were significantly higher in the ICU-AW group. The most influential factors contributing to ICU-AW were identified as the length of ICU stay (100.0%) and the duration of mechanical ventilation (54.9%). The neural network model predicted ICU-AW with an area under the curve of 0.941, sensitivity of 92.2%, and specificity of 82.7%. CONCLUSION: The main factors influencing ICU-AW are the length of ICU stay and the duration of mechanical ventilation. A primary preventive strategy, when feasible, involves minimizing both ICU stay and mechanical ventilation duration.
Keywords: Intensive care unit-acquired weakness; Risk factors; Machine learning; Prevention; Strategies
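The modeling step described in the abstract can be illustrated with a small sketch: an MLP classifier for a binary ICU-AW label scored by ROC AUC, sensitivity and specificity. The synthetic variables and threshold below are assumptions for illustration only.

```python
# Sketch: an MLP classifier for a binary ICU-AW label, evaluated with ROC AUC,
# sensitivity and specificity (synthetic data; variable names are illustrative).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(1)
n = 800
X = np.column_stack([
    rng.normal(65, 12, n),      # age
    rng.exponential(5, n),      # days of mechanical ventilation
    rng.exponential(8, n),      # length of ICU stay, days
    rng.normal(0, 1, n),        # other standardized covariates
])
# Hypothetical label loosely driven by ICU stay and ventilation time.
y = (0.2 * X[:, 1] + 0.15 * X[:, 2] + rng.normal(0, 1, n) > 2.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1, stratify=y)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=1).fit(X_tr, y_tr)

prob = clf.predict_proba(X_te)[:, 1]
tn, fp, fn, tp = confusion_matrix(y_te, (prob >= 0.5).astype(int)).ravel()
print("AUC:", roc_auc_score(y_te, prob))
print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))
```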
Machine learning applications in stroke medicine: advancements, challenges, and future prospectives (Cited by: 2)
3
Authors: Mario Daidone, Sergio Ferrantelli, Antonino Tuttolomondo. Neural Regeneration Research (SCIE, CAS, CSCD), 2024(4): 769-773, 5 pages.
Stroke is a leading cause of disability and mortality worldwide, necessitating the development of advanced technologies to improve its diagnosis, treatment, and patient outcomes. In recent years, machine learning techniques have emerged as promising tools in stroke medicine, enabling efficient analysis of large-scale datasets and facilitating personalized and precision medicine approaches. This abstract provides a comprehensive overview of machine learning's applications, challenges, and future directions in stroke medicine. Recently introduced machine learning algorithms have been extensively employed in all fields of stroke medicine. Machine learning models have demonstrated remarkable accuracy in imaging analysis, diagnosing stroke subtypes, risk stratification, guiding medical treatment, and predicting patient prognosis. Despite the tremendous potential of machine learning in stroke medicine, several challenges must be addressed. These include the need for standardized and interoperable data collection, robust model validation and generalization, and the ethical considerations surrounding privacy and bias. In addition, integrating machine learning models into clinical workflows and establishing regulatory frameworks are critical for ensuring their widespread adoption and impact in routine stroke care. Machine learning promises to revolutionize stroke medicine by enabling precise diagnosis, tailored treatment selection, and improved prognostication. Continued research and collaboration among clinicians, researchers, and technologists are essential for overcoming challenges and realizing the full potential of machine learning in stroke care, ultimately leading to enhanced patient outcomes and quality of life. This review aims to summarize the current implications of machine learning in stroke diagnosis, treatment, and prognostic evaluation, and to explore the future perspectives these techniques can provide in combating this disabling disease.
Keywords: Cerebrovascular disease; Deep learning; Machine learning; Reinforcement learning; Stroke; Stroke therapy; Supervised learning; Unsupervised learning
Prediction model for corrosion rate of low-alloy steels under atmospheric conditions using machine learning algorithms (Cited by: 1)
4
Authors: Jingou Kuang, Zhilin Long. International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2024(2): 337-350, 14 pages.
This work constructed a machine learning (ML) model to predict the atmospheric corrosion rate of low-alloy steels (LAS). The material properties of LAS, environmental factors, and exposure time were used as the input, while the corrosion rate was the output. Six different ML algorithms were used to construct the proposed model. Through optimization and filtering, the eXtreme gradient boosting (XGBoost) model exhibited good corrosion rate prediction accuracy. The features of material properties were then transformed into atomic and physical features using the proposed property transformation approach, and the dominant descriptors that affected the corrosion rate were filtered using the recursive feature elimination (RFE) and XGBoost methods. The established ML models exhibited better prediction performance and generalization ability via property transformation descriptors. In addition, the SHapley Additive exPlanations (SHAP) method was applied to analyze the relationship between the descriptors and the corrosion rate. The results showed that the property transformation model could effectively help with analyzing the corrosion behavior, thereby significantly improving the generalization ability of corrosion rate prediction models.
Keywords: Machine learning; Low-alloy steel; Atmospheric corrosion prediction; Corrosion rate; Feature fusion
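A hedged sketch of the workflow named above (XGBoost regression, recursive feature elimination, SHAP attribution); the descriptors and data are synthetic placeholders rather than the paper's dataset.

```python
# Sketch: XGBoost regression for corrosion rate, feature selection with RFE,
# and SHAP attribution (synthetic descriptors; column meanings are illustrative).
import numpy as np
import shap
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from xgboost import XGBRegressor

rng = np.random.default_rng(2)
n, p = 400, 12
X = rng.normal(size=(n, p))                 # material/atomic/environmental descriptors
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.5 * X[:, 7] + rng.normal(0, 0.3, n)  # corrosion rate

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=2)

# Recursive feature elimination keeps the dominant descriptors.
selector = RFE(XGBRegressor(n_estimators=300, max_depth=4), n_features_to_select=6).fit(X_tr, y_tr)
model = XGBRegressor(n_estimators=300, max_depth=4).fit(X_tr[:, selector.support_], y_tr)
print("R2 on test set:", model.score(X_te[:, selector.support_], y_te))

# SHAP values link each retained descriptor to the predicted corrosion rate.
shap_values = shap.TreeExplainer(model).shap_values(X_te[:, selector.support_])
print("mean |SHAP| per descriptor:", np.abs(shap_values).mean(axis=0))
```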
Assessment of compressive strength of jet grouting by machine learning (Cited by: 1)
5
Authors: Esteban Diaz, Edgar Leonardo Salamanca-Medina, Roberto Tomas. Journal of Rock Mechanics and Geotechnical Engineering (SCIE, CSCD), 2024(1): 102-111, 10 pages.
Jet grouting is one of the most popular soil improvement techniques, but its design usually involves great uncertainties that can lead to economic cost overruns in construction projects. The high dispersion in the properties of the improved material leads designers to assume a conservative, arbitrary and unjustified strength, which is sometimes even made subject to the results of test fields. The present paper presents an approach for prediction of the uniaxial compressive strength (UCS) of jet grouting columns based on the analysis of several machine learning algorithms on a database of 854 results mainly collected from different research papers. The selected machine learning model (extremely randomized trees) relates the soil type and various parameters of the technique to the value of the compressive strength. Despite the complex mechanism that surrounds the jet grouting process, evidenced by the high dispersion and low correlation of the variables studied, the trained model allows the values of compressive strength to be predicted optimally, with a significant improvement with respect to existing works. Consequently, this work proposes for the first time a reliable and easily applicable approach for estimating the compressive strength of jet grouting columns.
Keywords: Jet grouting; Ground improvement; Compressive strength; Machine learning
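A minimal sketch, assuming a tabular database of jet grouting records: extremely randomized trees relating soil type and operational parameters to UCS. The feature names and values are illustrative, not the authors' database.

```python
# Sketch: extremely randomized trees relating soil type and jet grouting
# parameters to UCS (synthetic records; the feature set is illustrative).
import numpy as np
import pandas as pd
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n = 854  # database size reported in the abstract
data = pd.DataFrame({
    "soil_type": rng.integers(0, 4, n),          # encoded soil class
    "grout_pressure_MPa": rng.uniform(20, 45, n),
    "wc_ratio": rng.uniform(0.6, 1.2, n),
    "lifting_speed_m_min": rng.uniform(0.1, 0.6, n),
})
ucs = (4 + 0.1 * data["grout_pressure_MPa"] - 2 * data["wc_ratio"]
       + rng.normal(0, 0.8, n))                  # hypothetical UCS in MPa

model = ExtraTreesRegressor(n_estimators=500, random_state=3)
scores = cross_val_score(model, data, ucs, cv=5, scoring="r2")
print("cross-validated R2:", scores.mean())
```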
Machine learning-assisted efficient design of Cu-based shape memory alloy with specific phase transition temperature (Cited by: 1)
6
Authors: Mengwei Wu, Wei Yong, Cunqin Fu, Chunmei Ma, Ruiping Liu. International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2024(4): 773-785, 13 pages.
The martensitic transformation temperature is the basis for the application of shape memory alloys (SMAs), and the ability to quickly and accurately predict the transformation temperature of SMAs has very important practical significance. In this work, machine learning (ML) methods were utilized to accelerate the search for shape memory alloys with targeted properties (phase transition temperature). A group of component data was selected to design shape memory alloys using a reverse design method from numerous unexplored data. Component modeling and feature modeling were used to predict the phase transition temperature of the shape memory alloys. Experimental results for the shape memory alloys were obtained to verify the effectiveness of the support vector regression (SVR) model. The results show that the machine learning model can obtain target materials more efficiently and pertinently, and realize the accurate and rapid design of shape memory alloys with a specific target phase transition temperature. On this basis, the relationship between the phase transition temperature and material descriptors is analyzed, and it is shown that the key factors affecting the phase transition temperature of shape memory alloys are related to the strength of the bond energy between atoms. This work provides new ideas for the controllable design and performance optimization of Cu-based shape memory alloys.
Keywords: Machine learning; Support vector regression; Shape memory alloys; Martensitic transformation temperature
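The SVR step can be sketched as below, assuming alloy composition features and a hypothetical martensitic start temperature; the composition ranges and the hyperparameter grid are illustrative assumptions.

```python
# Sketch: support vector regression mapping alloy composition to martensitic
# transformation temperature (synthetic compositions; features are illustrative).
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(4)
n = 200
X = np.column_stack([
    rng.uniform(68, 82, n),   # Cu content, wt%
    rng.uniform(10, 25, n),   # Al content, wt%
    rng.uniform(0, 10, n),    # Mn/Ni content, wt%
])
# Hypothetical martensitic start temperature, K.
Ms = 300 + 8 * (X[:, 0] - 75) - 12 * (X[:, 1] - 15) + rng.normal(0, 5, n)

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
grid = GridSearchCV(svr, {"svr__C": [1, 10, 100], "svr__gamma": ["scale", 0.1, 1.0]}, cv=5)
grid.fit(X, Ms)
print("best CV R2:", grid.best_score_)
print("predicted Ms for a new composition:", grid.predict([[76.0, 14.0, 5.0]]))
```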
Multifractal estimation of NMR T_(2) cut-off value in low-permeability rocks considering spectrum kurtosis: SMOTE-based oversampling integrated with machine learning
7
Authors: Xiao-Jun Chen, Rui-Xue Zhang, Xiao-Bo Zhao, Jun-Wei Yang, Zhang-Jian Lan, Cheng-Fei Luo, Jian-Chao Cai. Petroleum Science (SCIE, EI, CAS, CSCD), 2023(6): 3411-3427, 17 pages.
The transverse relaxation time (T_(2)) cut-off value plays a crucial role in nuclear magnetic resonance for identifying movable and immovable boundaries, evaluating permeability, and determining fluid saturation in petrophysical characterization of petroleum reservoirs. This study focuses on the systematic analysis of T_(2) spectra and T_(2) cut-off values in low-permeability reservoir rocks. Analysis of 36 low-permeability cores revealed a wide distribution of T_(2) cut-off values, ranging from 7 to 50 ms. Additionally, the T_(2) spectra exhibited multimodal characteristics, predominantly displaying unimodal and bimodal morphologies, with a few trimodal morphologies, which are inherently influenced by different pore types. Fractal characteristics of pore structure in fully water-saturated cores were captured through the T_(2) spectra, which were calculated using generalized fractal and multifractal theories. To augment the limited dataset of 36 cores, the synthetic minority oversampling technique was employed. Models for evaluating the T_(2) cut-off value were separately developed based on the classified T_(2) spectra, considering the number of peaks, and utilizing generalized fractal dimensions at the weight <0 and the singular intensity range. The underlying mechanism is that the singular intensity and generalized fractal dimensions at the weight <0 can detect the T_(2) spectral shift. However, the T_(2) spectral shift has negligible effects on the multifractal spectrum function difference and generalized fractal dimensions at the weight >0. The primary objective of this work is to gain insights into the relationship between the kurtosis of the T_(2) spectrum and pore types, as well as to predict the T_(2) cut-off value of low-permeability rocks using machine learning and data augmentation techniques.
Keywords: Nuclear magnetic resonance; Low-permeability porous media; T_(2) cut-off value; Fractal and multifractal; Data augmentation; Machine learning
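A sketch of the SMOTE-based augmentation idea under stated assumptions: imblearn's SMOTE oversamples classes, so the peak count of the T_(2) spectrum is used here as the class label and the regression target is carried through the interpolation; the fractal features and values are synthetic, not the study's core data.

```python
# Sketch: SMOTE-based augmentation of a small core dataset before regression.
# SMOTE works on class labels, so the T2-spectrum peak count serves as the class
# here (an assumption for illustration); the T2 cut-off target rides along as an
# extra column so it is interpolated together with the features.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
n = 36                                        # number of cores in the study
X = np.column_stack([
    rng.uniform(1.5, 2.5, n),                 # generalized fractal dimension (weight < 0)
    rng.uniform(0.5, 2.0, n),                 # singular intensity range
    rng.uniform(0.01, 10.0, n),               # permeability, mD
])
peaks = rng.integers(1, 4, n)                 # 1 = unimodal, 2 = bimodal, 3 = trimodal
t2_cutoff = rng.uniform(7, 50, n)             # ms, range reported in the abstract

# Oversample the minority peak-count classes; split the target back out afterwards.
X_aug, peaks_aug = SMOTE(k_neighbors=3, random_state=5).fit_resample(
    np.column_stack([X, t2_cutoff]), peaks)
X_new, y_new = X_aug[:, :-1], X_aug[:, -1]

model = RandomForestRegressor(n_estimators=300, random_state=5).fit(X_new, y_new)
print("augmented samples:", len(y_new), "fit R2:", model.score(X_new, y_new))
```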
Scheduling an Energy-Aware Parallel Machine System with Deteriorating and Learning Effects Considering Multiple Optimization Objectives and Stochastic Processing Time
8
Authors: Lei Wang, Yuxin Qi. Computer Modeling in Engineering & Sciences (SCIE, EI), 2023(4): 325-339, 15 pages.
Currently, energy conservation draws wide attention in industrial manufacturing systems. In recent years, many studies have aimed at saving energy consumption in the manufacturing process, and scheduling is regarded as an effective approach. This paper puts forward a multi-objective stochastic parallel machine scheduling problem that considers deteriorating and learning effects, in which the real processing time of jobs is calculated from their processing speed and normal processing time. To describe this problem mathematically, a multi-objective stochastic programming model aiming at minimizing makespan and energy consumption is formulated. Furthermore, we develop a multi-objective multi-verse optimization algorithm combined with a stochastic simulation method to deal with it. In this approach, the multi-verse optimization is adopted to find favorable solutions from the huge solution domain, while the stochastic simulation method is employed to assess them. Comparison experiments on test problems verify that the developed approach performs better on the considered problem than two classic multi-objective evolutionary algorithms.
Keywords: Energy consumption optimization; Parallel machine scheduling; Multi-objective optimization; Deteriorating and learning effects; Stochastic simulation
Use of machine learning models for the prognostication of liver transplantation: A systematic review (Cited by: 1)
9
Authors: Gidion Chongo, Jonathan Soldera. World Journal of Transplantation, 2024(1): 164-188, 25 pages.
BACKGROUND: Liver transplantation (LT) is a life-saving intervention for patients with end-stage liver disease. However, the equitable allocation of scarce donor organs remains a formidable challenge. Prognostic tools are pivotal in identifying the most suitable transplant candidates. Traditionally, scoring systems like the model for end-stage liver disease have been instrumental in this process. Nevertheless, the landscape of prognostication is undergoing a transformation with the integration of machine learning (ML) and artificial intelligence models. AIM: To assess the utility of ML models in prognostication for LT, comparing their performance and reliability to established traditional scoring systems. METHODS: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, we conducted a thorough and standardized literature search using the PubMed/MEDLINE database. Our search imposed no restrictions on publication year, age, or gender. Exclusion criteria encompassed non-English studies, review articles, case reports, conference papers, studies with missing data, or those exhibiting evident methodological flaws. RESULTS: Our search yielded a total of 64 articles, with 23 meeting the inclusion criteria. Among the selected studies, 60.8% originated from the United States and China combined. Only one pediatric study met the criteria. Notably, 91% of the studies were published within the past five years. ML models consistently demonstrated satisfactory to excellent area under the receiver operating characteristic curve values (ranging from 0.6 to 1) across all studies, surpassing the performance of traditional scoring systems. Random forest exhibited superior predictive capabilities for 90-d mortality following LT, sepsis, and acute kidney injury (AKI). In contrast, gradient boosting excelled in predicting the risk of graft-versus-host disease, pneumonia, and AKI. CONCLUSION: This study underscores the potential of ML models in guiding decisions related to allograft allocation and LT, marking a significant evolution in the field of prognostication.
Keywords: Liver transplantation; Machine learning models; Prognostication; Allograft allocation; Artificial intelligence
Application of the ArcCHECK Machine QA tool in quality assurance of medical linear accelerators
10
Authors: Zhang Shangchao (张上超), Zeng Huaqu (曾华驱), Wang Siyang (王思阳). Medical Equipment (医疗装备), 2024(7): 19-24, 6 pages.
OBJECTIVE: To investigate the application of the ArcCHECK Machine QA tool in quality assurance of medical linear accelerators. METHODS: The ArcCHECK Machine QA tool and the ArcCHECK phantom were used to test the performance of a medical linear accelerator. Test items included gantry angle, gantry rotation speed, gantry rotation isocenter, consistency of multi-leaf collimator and jaw positions, and beam flatness and symmetry during gantry rotation; the tool's effectiveness in linear accelerator quality assurance was then evaluated. RESULTS: In rotation mode, the mean gantry rotation speed was 3.6 deg/s with a maximum deviation of about 0.5 deg/s. The mean radius formed by the gantry rotation isocenter was 0.4 mm, and the mean positive and negative differences in the maximum distance between the multi-leaf collimator and the jaws were 0.7 mm and -0.7 mm, respectively. In rotational beam-on mode, flatness in the Y direction was 1.8%, symmetry in the Y direction was 1.1%, and symmetry in the X direction was 4.3%. CONCLUSION: The ArcCHECK Machine QA tool can be used for quality assurance of routine and volumetric modulated arc beam delivery performance of medical linear accelerators.
Keywords: ArcCHECK Machine QA tool; Quality assurance; Volumetric modulated arc therapy; Isocenter
Machine learning for predicting the outcome of terminal ballistics events
11
Authors: Shannon Ryan, Neeraj Mohan Sushma, Arun Kumar AV, Julian Berk, Tahrima Hashem, Santu Rana, Svetha Venkatesh. Defence Technology (SCIE, EI, CAS, CSCD), 2024(1): 14-26, 13 pages.
Machine learning (ML) is well suited for the prediction of high-complexity, high-dimensional problems such as those encountered in terminal ballistics. We evaluate the performance of four popular ML-based regression models, extreme gradient boosting (XGBoost), artificial neural network (ANN), support vector regression (SVR), and Gaussian process regression (GP), on two common terminal ballistics problems: (a) predicting the V50 ballistic limit of monolithic metallic armour impacted by small and medium calibre projectiles and fragments, and (b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness. To achieve this we utilise two datasets, each consisting of approximately 1000 samples, collated from public release sources. We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data and diverge when extrapolating outside this range. Although extrapolation is not advisable for ML-based regression models, for applications such as lethality/survivability analysis, such capability is required. To circumvent this, we implement expert knowledge and physics-based models via enforced monotonicity, as a Gaussian prior mean, and through a modified loss function. The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models, providing an ability to accurately fit experimental data when it is available and then revert to the physics-based model when not. The resulting models demonstrate high levels of predictive accuracy over a very wide range of projectile types, target materials and thicknesses, and impact conditions significantly more diverse than that achievable from any existing analytical approach. Compared with numerical analysis tools such as finite element solvers, the ML models run orders of magnitude faster. We provide some general guidelines throughout for the development, application, and reporting of ML models in terminal ballistics problems.
Keywords: Machine learning; Artificial intelligence; Physics-informed machine learning; Terminal ballistics; Armour
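The enforced-monotonicity idea can be sketched with XGBoost's monotone_constraints parameter; the features, the assumed signs of the constraints, and the synthetic penetration-depth relation below are illustrative assumptions, not the authors' physics models.

```python
# Sketch of the enforced-monotonicity idea: an XGBoost regressor for penetration
# depth constrained to increase with impact velocity and projectile mass and to
# decrease with target strength (synthetic data; the signs are an assumption).
import numpy as np
from xgboost import XGBRegressor

rng = np.random.default_rng(6)
n = 1000                                        # roughly the dataset size cited above
velocity = rng.uniform(300, 1800, n)            # m/s
mass = rng.uniform(0.005, 0.05, n)              # kg
strength = rng.uniform(300, 1600, n)            # target yield strength, MPa
X = np.column_stack([velocity, mass, strength])
depth = 1e-4 * mass * velocity**1.4 / np.sqrt(strength) + rng.normal(0, 0.2, n)

# (+1, +1, -1): monotonically increasing in velocity and mass, decreasing in strength.
model = XGBRegressor(n_estimators=400, max_depth=4,
                     monotone_constraints=(1, 1, -1)).fit(X, depth)

# Extrapolation beyond the training velocities now cannot bend downwards.
print(model.predict(np.array([[2500.0, 0.02, 800.0]])))
```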
Advancements in machine learning for material design and process optimization in the field of additive manufacturing
12
Authors: Hao-ran Zhou, Hao Yang, Huai-qian Li, Ying-chun Ma, Sen Yu, Jian Shi, Jing-chang Cheng, Peng Gao, Bo Yu, Zhi-quan Miao, Yan-peng Wei. China Foundry (SCIE, EI, CAS, CSCD), 2024(2): 101-115, 15 pages.
Additive manufacturing technology is highly regarded due to its advantages, such as high precision and the ability to address complex geometric challenges. However, the development of additive manufacturing processes is constrained by issues like unclear fundamental principles, complex experimental cycles, and high costs. Machine learning, as a novel artificial intelligence technology, has the potential to be deeply engaged in the development of additive manufacturing processes, assisting engineers in learning and developing new techniques. This paper provides a comprehensive overview of the research and applications of machine learning in the field of additive manufacturing, particularly in model design and process development. Firstly, it introduces the background and significance of machine learning-assisted design in additive manufacturing processes. It then delves further into the application of machine learning in additive manufacturing, focusing on model design and process guidance. Finally, it concludes by summarizing and forecasting the development trends of machine learning technology in the field of additive manufacturing.
Keywords: Additive manufacturing; Machine learning; Material design; Process optimization; Intersection of disciplines; Embedded machine learning
Prediction of lime utilization ratio of dephosphorization in BOF steelmaking based on online sequential extreme learning machine with forgetting mechanism
13
Authors: Runhao Zhang, Jian Yang, Han Sun, Wenkui Yang. International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2024(3): 508-517, 10 pages.
The machine learning models of multiple linear regression (MLR), support vector regression (SVR), and extreme learning machine (ELM), together with the proposed ELM variants of online sequential ELM (OS-ELM) and OS-ELM with forgetting mechanism (FOS-ELM), are applied to the prediction of the lime utilization ratio of dephosphorization in the basic oxygen furnace steelmaking process. The ELM model exhibits the best performance compared with the MLR and SVR models. OS-ELM and FOS-ELM are applied for sequential learning and model updating. The optimal number of samples in the validity term of the FOS-ELM model is determined to be 1500, with the smallest population mean absolute relative error (MARE) value of 0.058226. The variable importance analysis reveals lime weight, initial P content, and hot metal weight as the most important variables for the lime utilization ratio. The lime utilization ratio increases with decreasing lime weight and with increasing initial P content and hot metal weight. A prediction system based on FOS-ELM was applied in actual industrial production for one month. The hit ratios of the predicted lime utilization ratio within error ranges of ±1%, ±3%, and ±5% are 61.16%, 90.63%, and 94.11%, respectively. The coefficient of determination, MARE, and root mean square error are 0.8670, 0.06823, and 1.4265, respectively. The system exhibits desirable performance for applications in actual industrial production.
Keywords: Basic oxygen furnace steelmaking; Machine learning; Lime utilization ratio; Dephosphorization; Online sequential extreme learning machine; Forgetting mechanism
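For reference, a plain batch ELM regressor is sketched below (random hidden layer, least-squares output weights); the online sequential update and the forgetting mechanism of OS-ELM/FOS-ELM are not reproduced, and the input variables are hypothetical.

```python
# Sketch of a plain extreme learning machine regressor: random hidden layer,
# output weights solved by a Moore-Penrose pseudo-inverse (numpy only).
import numpy as np

rng = np.random.default_rng(7)

class ELMRegressor:
    def __init__(self, n_hidden=100):
        self.n_hidden = n_hidden

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = rng.normal(size=(n_features, self.n_hidden))  # random input weights
        self.b = rng.normal(size=self.n_hidden)                # random biases
        H = np.tanh(X @ self.W + self.b)                       # hidden-layer output matrix
        self.beta = np.linalg.pinv(H) @ y                      # output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Hypothetical standardized inputs: lime weight, initial P content, hot metal weight, etc.
X = rng.normal(size=(500, 6))
y = 0.6 - 0.1 * X[:, 0] + 0.08 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 0.02, 500)
model = ELMRegressor(n_hidden=80).fit(X[:400], y[:400])
pred = model.predict(X[400:])
print("MARE:", np.mean(np.abs((pred - y[400:]) / y[400:])))
```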
Improved PSO-Extreme Learning Machine Algorithm for Indoor Localization
14
Authors: Qiu Wanqing, Zhang Qingmiao, Zhao Junhui, Yang Lihua. China Communications (SCIE, CSCD), 2024(5): 113-122, 10 pages.
WiFi fingerprinting localization methods have been a hot topic in indoor positioning because of their universality and location-related features. The basic assumption of fingerprinting localization is that the received signal strength indication (RSSI) distance accords with the location distance. Therefore, how to efficiently match the current RSSI of the user with the RSSI in the fingerprint database is the key to achieving high-accuracy localization. In this paper, a particle swarm optimization-extreme learning machine (PSO-ELM) algorithm is proposed on the basis of the original fingerprinting localization. Firstly, we collect the RSSI of the experimental area to construct the fingerprint database, and the ELM algorithm is applied in the online stage to determine the corresponding relation between the location of the terminal and the RSSI it receives. Secondly, the PSO algorithm is used to improve the bias and weights of the ELM neural network, and the global optimal results are obtained. Finally, extensive simulation results are presented. It is shown that the proposed algorithm can effectively reduce the mean localization error and improve positioning accuracy compared with the K-Nearest Neighbor (KNN), K-means and Back-Propagation (BP) algorithms.
Keywords: Extreme learning machine; Fingerprinting localization; Indoor localization; Machine learning; Particle swarm optimization
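A compact sketch of the PSO-ELM idea under stated assumptions: a global-best PSO searches over the ELM input weights and biases, with fitness given by the validation localization error on a synthetic RSSI fingerprint database; the network sizes and propagation model are illustrative, not the paper's setup.

```python
# Sketch: PSO over flattened ELM input weights/biases for fingerprint localization.
import numpy as np

rng = np.random.default_rng(8)
n_ap, n_hidden = 6, 20                       # access points, hidden neurons
dim = n_ap * n_hidden + n_hidden             # flattened W and b

# Synthetic fingerprint database: RSSI from 6 APs -> 2-D position (log-distance model).
pos = rng.uniform(0, 30, size=(600, 2))
ap_pos = rng.uniform(0, 30, size=(n_ap, 2))
dist = np.linalg.norm(pos[:, None, :] - ap_pos, axis=2)
rssi = -40 - 20 * np.log10(dist + 1) + rng.normal(0, 2, (600, n_ap))
X_tr, y_tr, X_va, y_va = rssi[:450], pos[:450], rssi[450:], pos[450:]

def fitness(particle):
    W = particle[:n_ap * n_hidden].reshape(n_ap, n_hidden)
    b = particle[n_ap * n_hidden:]
    H = np.tanh(X_tr @ W + b)
    beta = np.linalg.pinv(H) @ y_tr                       # closed-form output weights
    err = np.tanh(X_va @ W + b) @ beta - y_va
    return np.mean(np.linalg.norm(err, axis=1))           # mean localization error, m

# Plain global-best PSO over the ELM parameters.
n_particles, iters = 20, 50
x = rng.uniform(-1, 1, (n_particles, dim))
v = np.zeros_like(x)
pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, 1))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    f = np.array([fitness(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()
print("best mean localization error (m):", pbest_f.min())
```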
Application of machine learning in perovskite materials and devices:A review
15
Authors: Ming Chen, Zhenhua Yin, Zhicheng Shan, Xiaokai Zheng, Lei Liu, Zhonghua Dai, Jun Zhang, Shengzhong (Frank) Liu, Zhuo Xu. Journal of Energy Chemistry (SCIE, EI, CAS, CSCD), 2024(7): 254-272, 19 pages.
Metal-halide hybrid perovskite materials are excellent candidates for solar cells and photoelectric devices. In recent years, machine learning (ML) techniques have developed rapidly in many fields and provided ideas for material discovery and design. ML can be applied to discover new materials quickly and effectively, with significant savings in resources and time compared with traditional experiments and density functional theory (DFT) calculations. In this review, we present the application of ML in perovskites and briefly review recent works in the field of ML-assisted perovskite design. Firstly, the advantages of perovskites in solar cells and the merits of ML applied to perovskites are discussed. Secondly, the workflow of ML in perovskite design and some basic ML algorithms are introduced. Thirdly, the applications of ML in predicting various properties of perovskite materials and devices are reviewed. Finally, we propose some prospects for the future development of this field. The rapid development of ML technology will largely promote the progress of materials science, and ML will become an increasingly popular method for predicting the target properties of materials and devices.
Keywords: Machine learning; Perovskite; Materials design; Bandgap engineering; Stability; Crystal structure
Comparative study of different machine learning models in landslide susceptibility assessment: A case study of Conghua District, Guangzhou, China
16
Authors: Ao Zhang, Xin-wen Zhao, Xing-yuezi Zhao, Xiao-zhan Zheng, Min Zeng, Xuan Huang, Pan Wu, Tuo Jiang, Shi-chang Wang, Jun He, Yi-yong Li. China Geology (CAS, CSCD), 2024(1): 104-115, 12 pages.
Machine learning is currently one of the research hotspots in the field of landslide prediction. To clarify and evaluate the differences in characteristics and prediction effects of different machine learning models, Conghua District, which is the most prone to landslide disasters in Guangzhou, was selected for landslide susceptibility evaluation. The evaluation factors were selected by using correlation analysis and the variance inflation factor method. Applying four machine learning methods, namely Logistic Regression (LR), Random Forest (RF), Support Vector Machines (SVM), and Extreme Gradient Boosting (XGB), landslide models were constructed. Comparative analysis and evaluation of the models were conducted through statistical indices and receiver operating characteristic (ROC) curves. The results showed that the LR, RF, SVM, and XGB models have good predictive performance for landslide susceptibility, with area under the curve (AUC) values of 0.752, 0.965, 0.996, and 0.998, respectively. The XGB model had the highest predictive ability, followed by the RF, SVM, and LR models. The frequency ratio (FR) accuracy of the LR, RF, SVM, and XGB models was 0.775, 0.842, 0.759, and 0.822, respectively. The RF and XGB models were superior to the LR and SVM models, indicating that ensemble algorithms have better predictive ability than single classification algorithms in regional landslide classification problems.
Keywords: Landslide susceptibility assessment; Machine learning; Logistic Regression; Random Forest; Support Vector Machines; XGBoost; Assessment model; Geological disaster investigation and prevention engineering
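The four-model comparison can be sketched as follows on a synthetic landslide inventory scored by ROC AUC; the conditioning factors and data are placeholders, not the Conghua District dataset.

```python
# Sketch: compare LR, RF, SVM and XGB classifiers by ROC AUC (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from xgboost import XGBClassifier

rng = np.random.default_rng(9)
n = 2000
X = rng.normal(size=(n, 8))                    # slope, aspect, lithology, rainfall, ...
y = (1.2 * X[:, 0] + 0.8 * X[:, 3] - 0.5 * X[:, 5] + rng.normal(0, 1, n) > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=9, stratify=y)

models = {
    "LR": make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000)),
    "RF": RandomForestClassifier(n_estimators=400, random_state=9),
    "SVM": make_pipeline(StandardScaler(), SVC(probability=True, random_state=9)),
    "XGB": XGBClassifier(n_estimators=400, max_depth=4, eval_metric="logloss"),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```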
Machine learning for membrane design and discovery
17
Authors: Haoyu Yin, Muzi Xu, Zhiyao Luo, Xiaotian Bi, Jiali Li, Sui Zhang, Xiaonan Wang. Green Energy & Environment (SCIE, EI, CAS, CSCD), 2024(1): 54-70, 17 pages.
Membrane technologies are becoming increasingly versatile and helpful today for sustainable development. Machine learning (ML), an essential branch of artificial intelligence (AI), has substantially impacted the research and development norm of new materials for energy and environment. This review provides an overview and perspectives on ML methodologies and their applications in membrane design and discovery. A brief overview of membrane technologies is first provided, together with the current bottlenecks and potential solutions. Through an applications-based perspective of AI-aided membrane design and discovery, we further show how ML strategies are applied to the membrane discovery cycle (including membrane material design, membrane application, membrane process design, and knowledge extraction) in various membrane systems, ranging from gas, liquid, and fuel cell separation membranes. Furthermore, the best practices for integrating ML methods and specific application targets in membrane design and discovery are presented, and an ideal paradigm is proposed. The challenges to be addressed and the prospects of AI applications in membrane discovery are also highlighted at the end.
Keywords: Machine learning; Membranes; AI for membrane; Data-driven; Design
Battery pack capacity estimation for electric vehicles based on enhanced machine learning and field data
18
Authors: Qingguang Qi, Wenxue Liu, Zhongwei Deng, Jinwen Li, Ziyou Song, Xiaosong Hu. Journal of Energy Chemistry (SCIE, EI, CAS, CSCD), 2024(5): 605-618, 14 pages.
Accurate capacity estimation is of great importance for the reliable state monitoring, timely maintenance, and second-life utilization of lithium-ion batteries. Despite numerous works on battery capacity estimation using laboratory datasets, most of them are applied to battery cells and lack satisfactory fidelity when extended to real-world electric vehicle (EV) battery packs. The challenges intensify for large-sized EV battery packs, where unpredictable operating profiles and low-quality data acquisition hinder precise capacity estimation. To fill the gap, this study introduces a novel data-driven battery pack capacity estimation method grounded in field data. The proposed approach begins by determining labeled capacity through an innovative combination of the inverse ampere-hour integral, open circuit voltage-based, and resistance-based correction methods. Then, multiple health features are extracted from incremental capacity curves, voltage curves, equivalent circuit model parameters, and operating temperature to thoroughly characterize battery aging behavior. A feature selection procedure is performed to determine the optimal feature set based on the Pearson correlation coefficient. Moreover, a convolutional neural network and bidirectional gated recurrent unit, enhanced by an attention mechanism, are employed to estimate the battery pack capacity in real-world EV applications. Finally, the proposed method is validated with a field dataset from two EVs, covering approximately 35,000 kilometers. The results demonstrate that the proposed method exhibits better estimation performance, with an error of less than 1.1%, compared to existing methods. This work shows great potential for accurate large-sized EV battery pack capacity estimation based on field data, which provides significant insights into reliable labeled capacity calculation, effective feature extraction, and machine learning-enabled health diagnosis.
Keywords: Electric vehicle; Lithium-ion battery pack; Capacity estimation; Machine learning; Field data
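A hedged sketch of the network family named above (1-D CNN, bidirectional GRU, attention pooling) regressing pack capacity from a sequence of health features; the layer sizes, sequence length and data are assumptions for illustration.

```python
# Sketch: CNN + bidirectional GRU + attention pooling for capacity regression
# (PyTorch; shapes and sizes are assumptions, data is random).
import torch
import torch.nn as nn

class CNNBiGRUAttention(nn.Module):
    def __init__(self, n_features=8, hidden=32):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.gru = nn.GRU(32, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)     # scores each time step
        self.head = nn.Linear(2 * hidden, 1)     # capacity in Ah

    def forward(self, x):                        # x: (batch, time, features)
        h = self.cnn(x.transpose(1, 2)).transpose(1, 2)   # (batch, time, 32)
        h, _ = self.gru(h)                                # (batch, time, 2*hidden)
        w = torch.softmax(self.attn(h), dim=1)            # attention weights over time
        context = (w * h).sum(dim=1)                      # weighted pooling
        return self.head(context).squeeze(-1)

# One optimisation step on random data, just to show the shapes line up.
model = CNNBiGRUAttention()
x = torch.randn(16, 50, 8)                       # 16 charging segments, 50 steps, 8 features
y = torch.rand(16) * 20 + 100                    # hypothetical pack capacities, Ah
loss = nn.functional.mse_loss(model(x), y)
loss.backward()
print("loss:", loss.item())
```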
Machine learning for carbonate formation drilling: Mud loss prediction using seismic attributes and mud loss records
19
Authors: Hui-Wen Pang, Han-Qing Wang, Yi-Tian Xiao, Yan Jin, Yun-Hu Lu, Yong-Dong Fan, Zhen Nie. Petroleum Science (SCIE, EI, CAS, CSCD), 2024(2): 1241-1256, 16 pages.
Due to the complexity and variability of carbonate formation leakage zones, lost circulation prediction and control is one of the major challenges of carbonate drilling. It raises well-control risks and production expenses. This research takes the H oilfield as an example, employs seismic attributes to analyze mud loss prediction, and produces a complete set of pre-drilling mud loss prediction solutions. Firstly, 16 seismic attributes are calculated based on the post-stack seismic data, and the mud loss rate per unit footage is specified. The sample set is constructed by extracting each attribute from the seismic traces surrounding 15 typical wells, with a ratio of 8:2 between the training set and the test set. With the calibration results for the mud loss rate per unit footage, the nonlinear mapping relationship between seismic attributes and the mud loss rate per unit footage is established using a mixture density network (MDN) model. Then, the influence of the number of sub-Gaussians and the uncertainty coefficient on the model's predictions is evaluated. Finally, the model is used in conjunction with downhole drilling conditions to assess the risk of mud loss in various layers and along the wellbore trajectory. The study demonstrates that the mean relative errors of the model for the training data and test data are 6.9% and 7.5%, respectively, and that R2 is 90% and 88% for the training data and test data, respectively. The accuracy and efficacy of mud loss prediction may be greatly enhanced by combining the 16 seismic attributes with the mud loss rate per unit footage and applying machine learning methods. The MDN-based mud loss prediction model can not only predict the mud loss rate but also objectively evaluate the prediction based on the quality of the data and the model.
Keywords: Lost circulation; Risk prediction; Machine learning; Seismic attributes; Mud loss records
A hybrid machine learning optimization algorithm for multivariable pore pressure prediction
20
Authors: Song Deng, Hao-Yu Pan, Hai-Ge Wang, Shou-Kun Xu, Xiao-Peng Yan, Chao-Wei Li, Ming-Guo Peng, Hao-Ping Peng, Lin Shi, Meng Cui, Fei Zhao. Petroleum Science (SCIE, EI, CAS, CSCD), 2024(1): 535-550, 16 pages.
Pore pressure is essential data in drilling design, and its accurate prediction is necessary to ensure drilling safety and improve drilling efficiency. Traditional methods for predicting pore pressure are limited when dealing with particular structures and lithologies. In this paper, a machine learning algorithm and the effective stress theorem are used to establish a transformation model between rock physical parameters and pore pressure. This study collects data from three wells: Well 1 had 881 data sets for model training, and Wells 2 and 3 had 538 and 464 data sets, respectively, for model testing. Support vector machine (SVM), random forest (RF), extreme gradient boosting (XGB), and multilayer perceptron (MLP) are selected as the machine learning algorithms for pore pressure modeling. In addition, this paper uses the grey wolf optimization (GWO) algorithm, particle swarm optimization (PSO) algorithm, sparrow search algorithm (SSA), and bat algorithm (BA) to establish hybrid machine learning optimization algorithms, and proposes an improved grey wolf optimization (IGWO) algorithm. The IGWO-MLP model obtained the minimum root mean square error (RMSE) using the 5-fold cross-validation method on the training data. For the pore pressure data in Well 2 and Well 3, the coefficients of determination (R^(2)) of SVM, RF, XGB, and MLP are 0.9930 and 0.9446, 0.9943 and 0.9472, 0.9945 and 0.9488, and 0.9949 and 0.9574, respectively. MLP achieves optimal performance on both training and test data, and the MLP model shows a high degree of generalization. This indicates that IGWO-MLP is an excellent predictor of pore pressure and can be used to predict pore pressure.
Keywords: Pore pressure; Grey wolf optimization; Multilayer perceptron; Effective stress; Machine learning
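A sketch of a grey wolf optimizer tuning an MLP regressor's hyperparameters for pore-pressure prediction; the search space, the synthetic well-log features, and the plain (non-improved) GWO update rule are assumptions for illustration, not the paper's IGWO variant.

```python
# Sketch: plain grey wolf optimization over MLP hyperparameters
# (hidden size, L2 penalty, initial learning rate), fitness = CV RMSE.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(10)
X = rng.normal(size=(600, 5))                 # e.g. standardized density, sonic, resistivity logs
y = 20 + 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(0, 0.5, 600)   # hypothetical pore pressure, MPa

bounds = np.array([[8, 128], [1e-5, 1e-2], [1e-4, 1e-2]])      # hidden units, alpha, learning rate

def fitness(p):
    model = MLPRegressor(hidden_layer_sizes=(int(p[0]),), alpha=p[1],
                         learning_rate_init=p[2], max_iter=300, random_state=0)
    mse = -cross_val_score(model, X, y, cv=3, scoring="neg_mean_squared_error").mean()
    return np.sqrt(mse)                       # RMSE, lower is better

n_wolves, iters = 6, 10
wolves = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_wolves, 3))
for t in range(iters):
    scores = np.array([fitness(w) for w in wolves])
    alpha, beta, delta = wolves[np.argsort(scores)[:3]]        # three leading wolves
    a = 2 - 2 * t / iters                                      # control parameter, 2 -> 0
    for i in range(n_wolves):
        new = np.zeros(3)
        for leader in (alpha, beta, delta):
            A = a * (2 * rng.random(3) - 1)
            C = 2 * rng.random(3)
            new += leader - A * np.abs(C * leader - wolves[i])
        wolves[i] = np.clip(new / 3, bounds[:, 0], bounds[:, 1])
print("best RMSE:", min(fitness(w) for w in wolves))
```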