Metal-halide hybrid perovskite materials are excellent candidates for solar cells and photoelectric devices. In recent years, machine learning (ML) techniques have developed rapidly in many fields and provided ideas for material discovery and design. ML can be applied to discover new materials quickly and effectively, with significant savings in resources and time compared with traditional experiments and density functional theory (DFT) calculations. In this review, we present the application of ML in perovskites and briefly review recent works in the field of ML-assisted perovskite design. Firstly, the advantages of perovskites in solar cells and the merits of ML applied to perovskites are discussed. Secondly, the workflow of ML in perovskite design and some basic ML algorithms are introduced. Thirdly, the applications of ML in predicting various properties of perovskite materials and devices are reviewed. Finally, we propose some prospects for the future development of this field. The rapid development of ML technology will largely promote the progress of materials science, and ML will become an increasingly popular method for predicting the target properties of materials and devices.
Perovskite solar cells (PSCs) have developed tremendously over the past decade. However, the key factors influencing the power conversion efficiency (PCE) of PSCs remain incompletely understood, due to the complexity and coupling of their structural and compositional parameters. In this research, we demonstrate an effective approach to optimizing PSC performance via machine learning (ML). To address the challenges posed by limited samples, we propose a feature mask (FM) method, which augments training samples through feature transformation rather than synthetic data. Using this approach, a squeeze-and-excitation residual network (SEResNet) model achieves a root-mean-square error (RMSE) of 0.833% and a Pearson's correlation coefficient (r) of 0.980. Furthermore, we employ the permutation importance (PI) algorithm to investigate the key features for PCE. Subsequently, we predict PCE through high-throughput screenings, in which we study the relationship between PCE and chemical composition. After that, we conduct experiments to validate the consistency between the results predicted by ML and the experimental results. In this work, ML demonstrates the capability to predict device performance, extract key parameters from complex systems, and accelerate the transition from laboratory findings to commercial applications.
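The permutation importance idea mentioned above can be sketched from scratch: shuffle one feature column at a time and measure how much the model's error grows. The data below is a synthetic stand-in, not the paper's device records, and the random forest is only an illustrative surrogate model.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
# Synthetic stand-in: the target depends strongly on x0, weakly on x1, not on x2.
X = rng.normal(size=(300, 3))
y = 3.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=300)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
base_mse = mean_squared_error(y, model.predict(X))

def permutation_importance(model, X, y, n_repeats=5, seed=0):
    """Mean increase in MSE when one feature column is shuffled."""
    rng = np.random.default_rng(seed)
    scores = []
    for j in range(X.shape[1]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])  # destroy the information in column j only
            drops.append(mean_squared_error(y, model.predict(Xp)) - base_mse)
        scores.append(np.mean(drops))
    return np.array(scores)

imp = permutation_importance(model, X, y)
```

On this toy problem the shuffled-x0 error rises the most, reproducing the ranking behavior PI is used for in the abstract.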
Video streaming applications have grown considerably in recent years. As a result, video streaming has become one of the most significant contributors to global internet traffic. According to recent studies, the telecommunications industry loses millions of dollars due to poor video Quality of Experience (QoE) for users. Among the standard proposals for standardizing the quality of video streaming over internet service providers (ISPs) is the Mean Opinion Score (MOS). However, accurately determining QoE via MOS is subjective and laborious, and it varies depending on the user. A fully automated data analytics framework is required to reduce the inter-operator variability characteristic of QoE assessment. This work addresses this concern by suggesting a novel hybrid XGBStackQoE analytical model using a two-level layering technique. Level one combines multiple Machine Learning (ML) models via a layer-one Hybrid XGBStackQoE model. The individual ML models at level one are trained on the entire training data set. The level-two Hybrid XGBStackQoE model is fitted using the outputs (meta-features) of the layer-one ML models. The proposed model outperformed the conventional models, with an accuracy improvement of 4 to 5 percent over the current traditional models. The proposed framework could significantly improve video QoE accuracy.
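A two-level stacked ensemble of the kind described can be sketched with scikit-learn's `StackingClassifier`: level-one base learners produce out-of-fold predictions (meta-features), and a level-two meta-learner is fitted on them. The QoE data here is synthetic, and gradient boosting stands in for XGBoost to keep the sketch dependency-free.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (StackingClassifier, RandomForestClassifier,
                              GradientBoostingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for QoE records (features -> good/poor quality class).
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

# Level one: diverse base learners. Level two: a meta-learner fitted on
# their cross-validated predictions, i.e., the meta-features.
stack = StackingClassifier(
    estimators=[("rf", RandomForestClassifier(random_state=0)),
                ("gb", GradientBoostingClassifier(random_state=0))],
    final_estimator=LogisticRegression(),
    cv=5,
)
acc = stack.fit(Xtr, ytr).score(Xte, yte)
```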
Software Defined Network (SDN) and Network Function Virtualization (NFV) technology promise several benefits to network operators, including reduced maintenance costs, increased network operational performance, a simplified network lifecycle, and easier policy management. Network vulnerabilities try to modify services provided by Network Function Virtualization MANagement and Orchestration (NFV MANO), and malicious attacks in different scenarios disrupt the NFV Orchestrator (NFVO) and Virtualized Infrastructure Manager (VIM) lifecycle management of network services or individual Virtualized Network Functions (VNFs). This paper proposes an anomaly detection mechanism that monitors threats in NFV MANO and responds promptly and adaptively, implementing and handling security functions to enhance the quality of experience for end users. An anomaly detector investigates the identified risks and provides secure network services. It enables virtual network security functions and identifies anomalies in Kubernetes (a cloud-based platform). For training and testing the proposed approach, an intrusion-containing dataset is used that holds multiple malicious activities such as Smurf, Neptune, Teardrop, Pod, Land, and IPsweep, categorized as Probing (Prob), Denial of Service (DoS), User to Root (U2R), and Remote to User (R2L) attacks. The anomaly detector is built with the capabilities of Machine Learning (ML), making use of supervised learning techniques such as Logistic Regression (LR), Support Vector Machine (SVM), Random Forest (RF), Naïve Bayes (NB), and Extreme Gradient Boosting (XGBoost). The proposed framework has been evaluated by deploying the identified ML algorithms on a Jupyter notebook in Kubeflow to simulate Kubernetes for validation purposes. The RF classifier showed better outcomes (99.90% accuracy) than the other classifiers in detecting anomalies/intrusions in the containerized environment.
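The random-forest classification step can be sketched as follows; the data is a synthetic stand-in for flow records, not the intrusion dataset used in the paper, with the attack categories collapsed into one "anomaly" class.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for flow records labelled normal vs attack
# (DoS, Probe, U2R, R2L collapsed into a single anomaly class here).
X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.8, 0.2], random_state=1)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=1)

clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(Xtr, ytr)
acc = accuracy_score(yte, clf.predict(Xte))
```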
Customer churn poses a significant challenge for the banking and finance industry in the United States, directly affecting profitability and market share. This study conducts a comprehensive comparative analysis of machine learning models for customer churn prediction, focusing on the U.S. context. The research evaluates the performance of logistic regression, random forest, and neural networks using industry-specific datasets, considering the economic impact and practical implications of the findings. The exploratory data analysis reveals unique patterns and trends in the U.S. banking and finance industry, such as the age distribution of customers and the prevalence of dormant accounts. The study incorporates macroeconomic factors to capture the potential influence of external conditions on customer churn behavior. The findings highlight the importance of leveraging advanced machine learning techniques and comprehensive customer data to develop effective churn prevention strategies in the U.S. context. By accurately predicting customer churn, financial institutions can proactively identify at-risk customers, implement targeted retention strategies, and optimize resource allocation. The study discusses the limitations and potential future improvements, serving as a roadmap for researchers and practitioners to further advance the field of customer churn prediction in the evolving landscape of the U.S. banking and finance industry.
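A churn model along these lines can be sketched with logistic regression; the customer features and the churn-generating rule below are hypothetical, chosen only to illustrate the workflow (dormant accounts and older customers are assumed to churn more often, echoing the patterns mentioned above).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 800
age = rng.normal(45, 12, n)
balance = rng.normal(50_000, 20_000, n)
dormant = rng.integers(0, 2, n)          # 1 = dormant account (hypothetical)
# Hypothetical rule: dormant, older customers churn more often.
logit = -3.0 + 2.0 * dormant + 0.04 * (age - 45)
churn = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([age, balance, dormant])
model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, churn)
acc = model.score(X, churn)
```

The fitted coefficient on the dormant flag comes out positive, matching the rule used to generate the data.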
Machine learning (ML) is a type of artificial intelligence that assists computers in the acquisition of knowledge through data analysis, thus creating machines that can complete tasks otherwise requiring human intelligence. Among its various applications, it has proven groundbreaking in healthcare as well, both in clinical practice and research. In this editorial, we succinctly introduce ML applications and present a study featured in the latest issue of the World Journal of Clinical Cases. The authors of this study conducted an analysis using both multiple linear regression (MLR) and ML methods to investigate the significant factors that may impact the estimated glomerular filtration rate in healthy women with and without non-alcoholic fatty liver disease (NAFLD). Their results implicated age as the most important determining factor in both groups, followed by lactate dehydrogenase, uric acid, forced expiratory volume in one second, and albumin. In addition, for the NAFLD(-) group, the 5th and 6th most important impact factors were thyroid-stimulating hormone and systolic blood pressure, as compared to plasma calcium and body fat for the NAFLD(+) group. However, the study's distinctive contribution lies in its adoption of ML methodologies, showcasing their superiority over traditional statistical approaches (herein MLR), thereby highlighting the potential of ML to serve as an invaluable advanced adjunct tool in clinical practice and research.
Parallel kinematics machine (PKM) is advantageous over serial machine tools in processing complex-surface products. A manufacturing service system for PKM is developed to provide complex-surface machining services for potential geographically-dispersed manufacturing enterprises. In order to easily integrate with external systems, Web services are used to encapsulate the post-processing functions of PKM legacy systems, including compilation, workspace calculation, interference calibration, and kinematics transformation. A service-oriented architecture (SOA) is proposed for cooperative work between the PKM system and its clients. The workflow and the function modules of this manufacturing service system are presented. An example shows that, as a result of SOA and loose coupling, such a Web service-based manufacturing service system is easier to integrate and interoperate with its clients. Meanwhile, the system decreases manufacturing cost and improves efficiency compared with its former distributed counterpart.
This work investigates the online scheduling problem on two parallel and identical machines with a new feature: service requests from various customers are entitled to different grade of service (GoS) levels, so each job and machine is labelled with a GoS level, and a job can be processed by a particular machine only when its GoS level is no less than that of the machine. The goal is to minimize the makespan. For the non-preemptive version, we propose an optimal online algorithm with competitive ratio 5/3. For the preemptive version, we propose an optimal online algorithm with competitive ratio 3/2.
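The GoS eligibility constraint can be illustrated with a simple greedy baseline; this is not the paper's optimal 5/3-competitive algorithm (which is not reproduced here), only a sketch of how eligibility restricts assignment on two machines.

```python
def greedy_gos_makespan(jobs):
    """Greedy list scheduling on two machines under GoS constraints.

    jobs: list of (processing_time, gos_level) with gos_level in {1, 2}.
    Machine 1 has GoS level 1 and accepts all jobs; machine 2 has level 2
    and accepts only level-2 jobs, since a job needs gos_level >= machine level.
    Each arriving job goes to the least-loaded eligible machine.
    """
    load = [0.0, 0.0]  # current load of machine 1 and machine 2
    for p, g in jobs:
        if g == 1:
            load[0] += p                      # only machine 1 is eligible
        else:
            i = 0 if load[0] <= load[1] else 1
            load[i] += p
        # (An optimal online algorithm balances loads more carefully to
        #  reach competitive ratio 5/3; greedy is only a baseline.)
    return max(load)

jobs = [(2, 2), (3, 2), (4, 1), (1, 2)]
```

On this instance the level-1 job is forced onto machine 1, giving a makespan of 6.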
With the rising demand for data access, network service providers face the challenge of growing capital and operating costs while at the same time enhancing network capacity and meeting the increased demand for access. To increase the efficacy of the Software Defined Network (SDN) and Network Function Virtualization (NFV) framework, we need to eradicate network security configuration errors that may create vulnerabilities affecting overall efficiency, reduce network performance, and increase maintenance cost. The existing frameworks lack security, and computer systems face a number of abnormalities, which prompts the need for recognition and mitigation methods to keep the system in an operational state proactively. The fundamental concept behind SDN-NFV is the shift from specific resource execution to a programming-based structure. This research is about the combination of SDN and NFV for rational decision making to control and monitor traffic in the virtualized environment. The combination is often seen as an extra burden in terms of resource usage in a heterogeneous network environment, but it also provides solutions for critical problems, especially regarding massive network traffic. Attacks have been expanding step by step; therefore, it is hard to recognize and protect against them by conventional methods. To overcome these issues, there must be an autonomous system to recognize and characterize any abnormal conduct in the network traffic. Only four types of assaults, including HTTP Flood, UDP Flood, Smurf Flood, and SiDDoS Flood, are considered in the identified dataset, to optimize the stability of the SDN-NFV environment and security management, through several machine learning based characterization techniques such as Support Vector Machine (SVM), K-Nearest Neighbors (KNN), Logistic Regression (LR), and Isolation Forest (IF). Python is used for simulation purposes, including several valuable utilities such as the mine package and the open-source Python ML libraries Scikit-learn, NumPy, SciPy, and Matplotlib. A few flood assaults and Structured Query Language (SQL) injection anomalies are validated and effectively identified through the anticipated procedure. The classification results are promising and show that overall accuracy lies between 87% and 95% for the SVM, LR, KNN, and IF classifiers in the scrutiny of traffic, whether the network traffic is normal or anomalous, in the SDN-NFV environment.
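Of the listed techniques, Isolation Forest is the purely unsupervised one: it is fitted on normal traffic only and flags outliers. A minimal sketch on synthetic traffic features (the feature names and magnitudes are hypothetical, not taken from the paper's dataset):

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
# Normal traffic: a tight cluster of (packet_rate, byte_rate) features.
normal = rng.normal(loc=[100, 500], scale=[5, 20], size=(500, 2))
# Flood-like traffic: far higher rates, well outside the normal cluster.
flood = rng.normal(loc=[1000, 9000], scale=[50, 200], size=(20, 2))

iso = IsolationForest(contamination=0.05, random_state=7).fit(normal)
pred_flood = iso.predict(flood)    # -1 = anomaly, +1 = normal
pred_norm = iso.predict(normal)
```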
The performance of metal halide perovskite solar cells (PSCs) highly relies on the experimental parameters, including the fabrication processes and the compositions of the perovskites; tremendous experimental work has been done to optimize these factors. However, predicting the device performance of PSCs from the fabrication parameters before experiments is still challenging. Herein, we bridge this gap by machine learning (ML) based on a dataset including 1072 devices from peer-reviewed publications. The optimized ML model accurately predicts the PCE from the experimental parameters with a root mean square error of 1.28% and a Pearson coefficient r of 0.768. Moreover, the factors governing the device performance are ranked by SHapley Additive exPlanations (SHAP), among which the A-site cation is crucial to obtaining highly efficient PSCs. Experiments and density functional theory calculations are employed to validate and help explain the results predicted by the ML model. Our work reveals the feasibility of ML in predicting device performance from the experimental parameters before experiments, which enables reverse experimental design toward highly efficient PSCs.
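The evaluation metrics reported above (RMSE and Pearson r) can be sketched as follows. The data is synthetic, and impurity-based feature importance stands in for SHAP, which the paper actually uses; column 0 is an invented stand-in for a dominant factor such as the encoded A-site cation.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
# Hypothetical encoded fabrication parameters; column 0 plays the role
# of a dominant factor (e.g., an encoded A-site cation choice).
X = rng.normal(size=(400, 5))
pce = 15 + 4.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400)

model = RandomForestRegressor(n_estimators=200, random_state=3).fit(X, pce)
pred = model.predict(X)
rmse = float(np.sqrt(np.mean((pred - pce) ** 2)))      # root mean square error
r = float(np.corrcoef(pred, pce)[0, 1])                # Pearson coefficient
ranking = np.argsort(model.feature_importances_)[::-1]  # stand-in for SHAP
```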
The cam mechanism is one of the most popular devices for generating irregular motions and is widely used in automatic equipment, such as textile machines, internal combustion engines, and other automatic devices. In order to obtain a positive motion from the follower using a rotating cam, its shape should be correctly designed and manufactured. The development of an adequate CAD/CAM system for a cam profile CNC grinding machine is necessary to manufacture high-precision cams. The purpose of this study is the development of a CAD/CAM system and a profile measuring device for a CNC grinding machine to obtain an optimal grinding speed with constant surface roughness. Three types of disk cams were manufactured using the proposed algorithm and procedures to verify the effectiveness of the developed CAD/CAM system.
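The profile-generation step of such a CAD/CAM system can be illustrated for a simple harmonic displacement law; this is a generic kinematics sketch (base radius, lift, and motion law are invented), not the paper's algorithm.

```python
import math

def cam_profile(base_radius, lift, n_points=360):
    """Radial profile of a disk cam for a simple harmonic follower motion:
    rise over the first half revolution, return over the second."""
    points = []
    for k in range(n_points):
        theta = 2 * math.pi * k / n_points
        # Simple harmonic displacement: 0 at theta=0, `lift` at theta=pi.
        s = lift * (1 - math.cos(theta)) / 2
        r = base_radius + s
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

pts = cam_profile(base_radius=30.0, lift=10.0)
```

A CAM stage would then convert such coordinate lists into grinding-wheel paths.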
The exponential growth of mobile applications and services in recent years has challenged the existing network infrastructures. Consequently, the arrival of multiple management solutions to cope with this explosion along the end-to-end network chain has increased the complexity of the coordinated orchestration of the different segments composing the whole infrastructure. The Zero-touch Network and Service Management (ZSM) concept has recently emerged to automatically orchestrate and manage network resources while assuring the Quality of Experience (QoE) demanded by users. Machine Learning (ML) is one of the key enabling technologies that many ZSM frameworks are adopting to bring intelligent decision making to the network management system. This paper presents a comprehensive survey of the state-of-the-art application of ML-based techniques to improve ZSM performance. To this end, the main related standardization activities and the aligned international projects and research efforts are examined in depth. From this dissection, the skyrocketing growth of the ZSM paradigm can be observed. Concretely, different standardization bodies have already designed reference architectures to set the foundations of novel automatic network management functions and resource orchestration. Aligned with these advances, diverse ML techniques are currently being exploited to build further ZSM developments in different aspects, including multi-tenancy management, traffic monitoring, and architecture coordination, among others. However, challenges such as the complexity, scalability, and security of ML mechanisms are also identified, and future research guidelines are provided to accomplish a firm development of the ZSM ecosystem.
One of the common transportation systems in Korea is calling taxis through online applications, which is more convenient for passengers and drivers in the modern era. However, a passenger's taxi request can be rejected based on the driver's location and distance. Therefore, there is a need to specify the driver's acceptance or rejection of the received request. The security of this system is another core concern, protecting transaction information and the safety of passengers and drivers. In this study, the origins and destinations on Jeju Island, South Korea, were captured from T-map and processed with machine learning decision tree and XGBoost techniques. The blockchain framework is implemented on the Hyperledger Fabric platform. The experimental results reflect socio-economic features, and cross-validation was performed. Trip distance is another factor; total trips at midnight are considerably shorter. This process demonstrates the successful matching of ride-hailing taxi services in terms of distance, trip requests, and safety across the whole city.
With the continuous development of science and technology, electronic devices have begun to enter all aspects of human life, becoming increasingly closely related to human life. Users have higher quality requirements for electronic devices. Electronic device testing has gradually become an irreplaceable engineering process in modern manufacturing enterprises to guarantee the quality of products while preventing inferior products from entering the market. Considering the large output of electronic devices, improving the testing efficiency while reducing the testing cost has become an urgent problem to be solved. This study investigates the electronic device testing machine allocation problem (EDTMAP), aiming to improve the production of electronic devices and reduce the scheduling distance among testing machines through reasonable machine allocation. First, a mathematical model was formulated for the EDTMAP to maximize production and minimize the scheduling distance among testing machines. Second, we developed a discrete multi-objective artificial bee colony (DMOABC) algorithm to solve the EDTMAP. A crossover operator and a local search operator were designed to improve the exploration and exploitation of the algorithm, respectively. Numerical experiments were conducted to evaluate the performance of the proposed algorithm. The experimental results demonstrate the superiority of the proposed algorithm compared with the non-dominated sorting genetic algorithm II (NSGA-II) and strength Pareto evolutionary algorithm 2 (SPEA2). Finally, the mathematical model and DMOABC algorithm were applied to a real-world factory that tests radio-frequency modules. The results verify that our method can significantly improve production and reduce the scheduling distance among testing machines.
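The non-domination test that underlies multi-objective comparisons like NSGA-II, SPEA2, and DMOABC can be sketched directly; the candidate (production, distance) pairs below are hypothetical.

```python
def pareto_front(solutions):
    """Return the non-dominated (production, distance) pairs, where
    production is maximized and scheduling distance is minimized."""
    def dominates(a, b):
        # a dominates b if it is no worse in both objectives and differs.
        return a[0] >= b[0] and a[1] <= b[1] and a != b
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions)]

candidates = [(100, 30), (120, 45), (90, 25), (120, 40), (80, 50)]
front = pareto_front(candidates)
```

Here (120, 45) is dominated by (120, 40) and (80, 50) by (100, 30), so three trade-off solutions remain.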
In order to incorporate smart elements into distribution networks at ITELCA laboratories in Bogotá, Colombia, a Machine-to-Machine-based solution has been developed. This solution aids in the process of low-cost electrical fault location, which contributes to improving quality of service, particularly by shortening interruption time spans in mid-voltage grids. The implementation makes use of a GENEKO modem that exploits its digital inputs together with full coverage of certain required auxiliary services so as to generate proper detection signals whenever failure currents occur, which allows incorporating the latest failure detection technology into the system.
BACKGROUND: Intensive care unit-acquired weakness (ICU-AW) is a common complication that significantly impacts the patient's recovery process, even leading to adverse outcomes. Currently, there is a lack of effective preventive measures. AIM: To identify significant risk factors for ICU-AW through iterative machine learning techniques and offer recommendations for its prevention and treatment. METHODS: Patients were categorized into ICU-AW and non-ICU-AW groups on the 14th day post-ICU admission. Relevant data from the initial 14 days of ICU stay, such as age, comorbidities, sedative dosage, vasopressor dosage, duration of mechanical ventilation, length of ICU stay, and rehabilitation therapy, were gathered. The relationships between these variables and ICU-AW were examined. Utilizing iterative machine learning techniques, a multilayer perceptron neural network model was developed, and its predictive performance for ICU-AW was assessed using the receiver operating characteristic curve. RESULTS: Within the ICU-AW group, age, duration of mechanical ventilation, lorazepam dosage, adrenaline dosage, and length of ICU stay were significantly higher than in the non-ICU-AW group. Additionally, the ratios of sepsis, multiple organ dysfunction syndrome, hypoalbuminemia, acute heart failure, respiratory failure, acute kidney injury, anemia, stress-related gastrointestinal bleeding, shock, hypertension, coronary artery disease, malignant tumors, and rehabilitation therapy were significantly higher in the ICU-AW group. The most influential factors contributing to ICU-AW were identified as the length of ICU stay (100.0%) and the duration of mechanical ventilation (54.9%). The neural network model predicted ICU-AW with an area under the curve of 0.941, sensitivity of 92.2%, and specificity of 82.7%. CONCLUSION: The main factors influencing ICU-AW are the length of ICU stay and the duration of mechanical ventilation. A primary preventive strategy, when feasible, involves minimizing both ICU stay and mechanical ventilation duration.
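A multilayer perceptron evaluated by ROC analysis, as described, can be sketched with scikit-learn on synthetic stand-in data; no patient data is reproduced here, and the feature layout is hypothetical.

```python
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for ICU records (e.g., length of stay, ventilation
# duration, sedative dose) labelled ICU-AW vs non-ICU-AW.
X, y = make_classification(n_samples=600, n_features=8, n_informative=4,
                           weights=[0.7, 0.3], random_state=5)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=5)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                  random_state=5))
clf.fit(Xtr, ytr)
auc = roc_auc_score(yte, clf.predict_proba(Xte)[:, 1])  # area under ROC curve
```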
Stroke is a leading cause of disability and mortality worldwide, necessitating the development of advanced technologies to improve its diagnosis, treatment, and patient outcomes. In recent years, machine learning techniques have emerged as promising tools in stroke medicine, enabling efficient analysis of large-scale datasets and facilitating personalized and precision medicine approaches. This review provides a comprehensive overview of machine learning's applications, challenges, and future directions in stroke medicine. Recently introduced machine learning algorithms have been extensively employed in all fields of stroke medicine. Machine learning models have demonstrated remarkable accuracy in imaging analysis, diagnosing stroke subtypes, risk stratification, guiding medical treatment, and predicting patient prognosis. Despite the tremendous potential of machine learning in stroke medicine, several challenges must be addressed. These include the need for standardized and interoperable data collection, robust model validation and generalization, and the ethical considerations surrounding privacy and bias. In addition, integrating machine learning models into clinical workflows and establishing regulatory frameworks are critical for ensuring their widespread adoption and impact in routine stroke care. Machine learning promises to revolutionize stroke medicine by enabling precise diagnosis, tailored treatment selection, and improved prognostication. Continued research and collaboration among clinicians, researchers, and technologists are essential for overcoming challenges and realizing the full potential of machine learning in stroke care, ultimately leading to enhanced patient outcomes and quality of life. This review aims to summarize the current implications of machine learning in stroke diagnosis, treatment, and prognostic evaluation, and to explore the future perspectives these techniques can provide in combating this disabling disease.
This work constructed a machine learning (ML) model to predict the atmospheric corrosion rate of low-alloy steels (LAS). The material properties of LAS, environmental factors, and exposure time were used as the input, while the corrosion rate was used as the output. Six different ML algorithms were used to construct the proposed model. Through optimization and filtering, the eXtreme Gradient Boosting (XGBoost) model exhibited good corrosion rate prediction accuracy. The features of material properties were then transformed into atomic and physical features using the proposed property transformation approach, and the dominant descriptors that affected the corrosion rate were filtered using the recursive feature elimination (RFE) and XGBoost methods. The established ML models exhibited better prediction performance and generalization ability via the property transformation descriptors. In addition, the SHapley Additive exPlanations (SHAP) method was applied to analyze the relationship between the descriptors and the corrosion rate. The results showed that the property transformation model could effectively help with analyzing the corrosion behavior, thereby significantly improving the generalization ability of corrosion rate prediction models.
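The RFE step can be sketched with scikit-learn: the estimator is fitted repeatedly, and the least important descriptor is dropped each round until the requested number remains. Gradient boosting stands in for XGBoost here, and the synthetic data replaces the corrosion records.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.feature_selection import RFE

# Synthetic stand-in for corrosion records: 10 candidate descriptors,
# of which only 4 carry signal.
X, y = make_regression(n_samples=300, n_features=10, n_informative=4,
                       noise=5.0, random_state=2)

selector = RFE(GradientBoostingRegressor(random_state=2),
               n_features_to_select=4).fit(X, y)
kept = [i for i, keep in enumerate(selector.support_) if keep]
```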
Magnesium (Mg) alloys have shown great prospects as both structural and biomedical materials, while poor corrosion resistance limits their further application. In this work, to avoid time-consuming and laborious experimental trials, a high-throughput computational strategy based on first-principles calculations is designed for screening corrosion-resistant binary Mg alloys with intermetallics, from both the thermodynamic and kinetic perspectives. The stable binary Mg intermetallics with a low equilibrium potential difference with respect to the Mg matrix are first identified. Then, the hydrogen adsorption energies on the surfaces of these Mg intermetallics are calculated, and the corrosion exchange current density is further calculated by a hydrogen evolution reaction (HER) kinetic model. Several intermetallics, e.g., Y3Mg, Y2Mg, and La5Mg, are identified to be promising intermetallics which might effectively hinder the cathodic HER. Furthermore, machine learning (ML) models are developed to predict Mg intermetallics with proper hydrogen adsorption energy, employing the work function (Wf) and the weighted first ionization energy (WFIE). The generalization of the ML models is tested on five new binary Mg intermetallics with an average root mean square error (RMSE) of 0.11 eV. This study not only predicts some promising binary Mg intermetallics which may suppress galvanic corrosion, but also provides a high-throughput screening strategy and ML models for the design of corrosion-resistant alloys, which can be extended to ternary Mg alloys or other alloy systems.
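A regression from the two named descriptors (Wf and WFIE) to hydrogen adsorption energy can be sketched on invented data; the functional form and value ranges below are hypothetical, not the paper's DFT results, and a random forest stands in for whatever ML model the authors used.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
# Hypothetical descriptors per intermetallic surface: work function Wf (eV)
# and weighted first ionization energy WFIE (eV). The target is a made-up
# hydrogen adsorption energy with a smooth dependence on both.
Wf = rng.uniform(2.5, 5.0, 200)
WFIE = rng.uniform(5.0, 9.0, 200)
E_ads = (-0.6 + 0.3 * (Wf - 3.5) - 0.1 * (WFIE - 7.0)
         + rng.normal(0, 0.05, 200))

X = np.column_stack([Wf, WFIE])
# Train on the first 150 surfaces, test generalization on the held-out 50.
model = RandomForestRegressor(n_estimators=200,
                              random_state=4).fit(X[:150], E_ads[:150])
pred = model.predict(X[150:])
rmse = float(np.sqrt(np.mean((pred - E_ads[150:]) ** 2)))
```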
The high-throughput prediction of the thermodynamic phase behavior of active pharmaceutical ingredients (APIs) with pharmaceutically relevant excipients remains a major scientific challenge in the screening of pharmaceutical formulations. In this work, a machine-learning model was developed that efficiently predicts the solubility of APIs in polymers by learning the phase equilibrium principle and using a few molecular descriptors. Under the few-shot learning framework, thermodynamic theory (perturbed-chain statistical associating fluid theory) was used for data augmentation, and computational chemistry was applied for molecular descriptor screening. The results showed that the developed machine-learning model can predict the API-polymer phase diagram accurately, broaden the solubility data of APIs in polymers, and successfully reproduce the relationship between API solubility and the interaction mechanisms between API and polymer, providing efficient guidance for the development of pharmaceutical formulations.
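The theory-based data-augmentation idea can be sketched with a toy solubility model standing in for PC-SAFT: a few noisy "experimental" points are combined with theory-generated pseudo-data before fitting. Everything below (the van 't Hoff-style model, temperatures, noise level) is invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)

def theory_solubility(T):
    """Stand-in thermodynamic model (not PC-SAFT): log-solubility vs 1/T."""
    return 2.0 - 1500.0 / T

# Few-shot "experimental" points, noisy and sparse.
T_exp = np.array([298.0, 323.0, 348.0])
y_exp = theory_solubility(T_exp) + rng.normal(0, 0.02, 3)

# Augment with theory-generated pseudo-data over a wider temperature range.
T_aug = np.linspace(280, 400, 50)
y_aug = theory_solubility(T_aug)

X = (1.0 / np.concatenate([T_exp, T_aug])).reshape(-1, 1)
y = np.concatenate([y_exp, y_aug])
model = LinearRegression().fit(X, y)
pred_380 = float(model.predict([[1 / 380.0]])[0])
```

The augmented fit extrapolates to temperatures the sparse experimental set never covered, which is the point of the augmentation step.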
Funding: Funded by the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDA17040506); the National Natural Science Foundation of China (62005148, 12004235); the Open Competition Mechanism to Select the Best Candidates Project of the Jinzhong Science and Technology Bureau (J202101); the DNL Cooperation Fund, CAS (DNL180311); and the 111 Project (B14041).
Abstract: Metal-halide hybrid perovskite materials are excellent candidates for solar cells and photoelectric devices. In recent years, machine learning (ML) techniques have developed rapidly in many fields and provided ideas for material discovery and design. ML can be applied to discover new materials quickly and effectively, with significant savings in resources and time compared with traditional experiments and density functional theory (DFT) calculations. In this review, we present the application of ML in perovskites and briefly review recent work in the field of ML-assisted perovskite design. Firstly, the advantages of perovskites in solar cells and the merits of ML applied to perovskites are discussed. Secondly, the workflow of ML in perovskite design and some basic ML algorithms are introduced. Thirdly, the applications of ML in predicting various properties of perovskite materials and devices are reviewed. Finally, we propose some prospects for the future development of this field. The rapid development of ML technology will greatly advance materials science, and ML will become an increasingly popular method for predicting the target properties of materials and devices.
Funding: Supported by the National Key Research and Development Program (2022YFF0609504); the National Natural Science Foundation of China (61974126, 51902273, 62005230, 62001405); and the Natural Science Foundation of Fujian Province of China (No. 2021J06009).
Abstract: Perovskite solar cells (PSCs) have developed tremendously over the past decade. However, the key factors influencing the power conversion efficiency (PCE) of PSCs remain incompletely understood, due to the complexity and coupling of structural and compositional parameters. In this research, we demonstrate an effective approach to optimizing PSC performance via machine learning (ML). To address the challenges posed by limited samples, we propose a feature mask (FM) method, which augments training samples through feature transformation rather than synthetic data. Using this approach, a squeeze-and-excitation residual network (SEResNet) model achieves a root-mean-square error (RMSE) of 0.833% and a Pearson's correlation coefficient (r) of 0.980. Furthermore, we employ the permutation importance (PI) algorithm to identify the key features for PCE. Subsequently, we predict PCE through high-throughput screening, in which we study the relationship between PCE and chemical composition. We then conduct experiments to validate the consistency between the ML predictions and the experimental results. In this work, ML demonstrates the capability to predict device performance, extract key parameters from complex systems, and accelerate the transition from laboratory findings to commercial applications.
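The permutation importance (PI) idea used above can be sketched in a few lines: a feature's importance is the increase in RMSE when its values are randomly shuffled while everything else is left intact. The toy data and the stand-in "model" below are illustrative assumptions, not the paper's SEResNet or dataset.

```python
import math
import random

random.seed(0)

# Hypothetical toy data: the target depends strongly on feature 0
# and only weakly on feature 1.
n = 200
X = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(n)]
y = [3.0 * x1 + 0.3 * x2 + random.gauss(0, 0.1) for x1, x2 in X]

def predict(row):
    # Stand-in "trained model": here simply the true generating rule.
    return 3.0 * row[0] + 0.3 * row[1]

def rmse(data, target):
    return math.sqrt(sum((predict(r) - t) ** 2
                         for r, t in zip(data, target)) / len(target))

def permutation_importance(data, target, col):
    base = rmse(data, target)
    shuffled = [row[:] for row in data]          # copy, leave data intact
    column = [row[col] for row in shuffled]
    random.shuffle(column)                       # break the feature-target link
    for row, v in zip(shuffled, column):
        row[col] = v
    return rmse(shuffled, target) - base         # RMSE increase = importance

importances = [permutation_importance(X, y, c) for c in range(2)]
print(importances)  # feature 0 should dominate
```

Shuffling feature 0 destroys most of the predictive signal, so its RMSE increase is large; feature 1's is small.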
Abstract: Video streaming applications have grown considerably in recent years and have become one of the most significant contributors to global internet traffic. According to recent studies, the telecommunications industry loses millions of dollars due to poor video Quality of Experience (QoE) for users. Among the standard proposals for quantifying the quality of video streaming over internet service providers (ISPs) is the Mean Opinion Score (MOS). However, determining QoE accurately via MOS is subjective and laborious, and it varies from user to user. A fully automated data analytics framework is required to reduce the inter-operator variability characteristic of QoE assessment. This work addresses this concern by proposing a novel hybrid XGBStackQoE analytical model using a two-level layering technique. Level one combines multiple Machine Learning (ML) models via the level-one Hybrid XGBStackQoE model; the individual ML models at level one are trained on the entire training data set. The level-two Hybrid XGBStackQoE model is fitted using the outputs (meta-features) of the level-one ML models. The proposed model outperformed the conventional models, with an accuracy improvement of 4 to 5 percent. The proposed framework could significantly improve video QoE accuracy.
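The two-level layering can be illustrated with a minimal stacking sketch: two weak level-one regressors are fitted on the full training set, and a level-two model is then fitted on their outputs (the meta-features). The data and the model choices below are illustrative assumptions, not the XGBStackQoE components.

```python
import random

random.seed(1)

def linfit(xs, ys):
    # Ordinary least-squares fit: y ~ slope * x + intercept.
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    var = sum((a - mx) ** 2 for a in xs)
    slope = cov / var
    return slope, my - slope * mx

# Hypothetical data: the target depends on two features.
x1 = [random.uniform(0, 10) for _ in range(100)]
x2 = [random.uniform(0, 10) for _ in range(100)]
y = [2 * a + b + random.gauss(0, 0.5) for a, b in zip(x1, x2)]

# Level one: two single-feature models, each trained on the full set.
s1, i1 = linfit(x1, y)
s2, i2 = linfit(x2, y)
p1 = [s1 * a + i1 for a in x1]
p2 = [s2 * b + i2 for b in x2]

# Level two: a meta-model fitted on the combined level-one outputs
# (here the average of the two base predictions is the meta-feature).
meta = [(a + b) / 2 for a, b in zip(p1, p2)]
sm, im = linfit(meta, y)
stacked = [sm * m + im for m in meta]

def rmse(pred):
    return (sum((a - b) ** 2 for a, b in zip(pred, y)) / len(y)) ** 0.5

print(rmse(p1), rmse(p2), rmse(stacked))
```

Each base model sees only one feature, so each carries a large error; the level-two model recombines their outputs and recovers nearly the full signal, which is the point of stacking.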
Funding: This work was funded by the Deanship of Scientific Research at Jouf University under Grant Number DSR2022-RG-0102.
Abstract: Software Defined Network (SDN) and Network Function Virtualization (NFV) technology offer network operators several benefits, including reduced maintenance costs, increased network operational performance, a simplified network lifecycle, and easier policy management. Network vulnerabilities can subvert the services provided by Network Function Virtualization MANagement and Orchestration (NFV MANO), and malicious attacks in different scenarios disrupt NFV Orchestrator (NFVO) and Virtualized Infrastructure Manager (VIM) lifecycle management of network services or individual Virtualized Network Functions (VNFs). This paper proposes an anomaly detection mechanism that monitors threats in NFV MANO and responds promptly and adaptively, implementing and handling security functions in order to enhance the quality of experience for end users. An anomaly detector investigates the identified risks and provides secure network services; it enables virtual network security functions and identifies anomalies in Kubernetes (a cloud-based platform). For training and testing the proposed approach, an intrusion dataset is used that holds multiple malicious activities such as Smurf, Neptune, Teardrop, Pod, Land, and IPsweep, categorized as Probing (Prob), Denial of Service (DoS), User to Root (U2R), and Remote to User (R2L) attacks. The anomaly detector is built with supervised Machine Learning (ML) techniques: Logistic Regression (LR), Support Vector Machine (SVM), Random Forest (RF), Naïve Bayes (NB), and Extreme Gradient Boosting (XGBoost). The proposed framework has been evaluated by deploying the identified ML algorithms in a Jupyter notebook in Kubeflow to simulate Kubernetes for validation purposes. The RF classifier showed better outcomes (99.90% accuracy) than the other classifiers in detecting anomalies/intrusions in the containerized environment.
Abstract: Customer churn poses a significant challenge for the banking and finance industry in the United States, directly affecting profitability and market share. This study conducts a comprehensive comparative analysis of machine learning models for customer churn prediction, focusing on the U.S. context. The research evaluates the performance of logistic regression, random forest, and neural networks using industry-specific datasets, considering the economic impact and practical implications of the findings. The exploratory data analysis reveals unique patterns and trends in the U.S. banking and finance industry, such as the age distribution of customers and the prevalence of dormant accounts. The study incorporates macroeconomic factors to capture the potential influence of external conditions on customer churn behavior. The findings highlight the importance of leveraging advanced machine learning techniques and comprehensive customer data to develop effective churn prevention strategies in the U.S. context. By accurately predicting customer churn, financial institutions can proactively identify at-risk customers, implement targeted retention strategies, and optimize resource allocation. The study discusses the limitations and potential future improvements, serving as a roadmap for researchers and practitioners to further advance the field of customer churn prediction in the evolving landscape of the U.S. banking and finance industry.
Abstract: Machine learning (ML) is a type of artificial intelligence that assists computers in the acquisition of knowledge through data analysis, thus creating machines that can complete tasks otherwise requiring human intelligence. Among its various applications, it has proven groundbreaking in healthcare, both in clinical practice and research. In this editorial, we succinctly introduce ML applications and present a study featured in the latest issue of the World Journal of Clinical Cases. The authors of this study conducted an analysis using both multiple linear regression (MLR) and ML methods to investigate the significant factors that may impact the estimated glomerular filtration rate in healthy women with and without non-alcoholic fatty liver disease (NAFLD). Their results implicated age as the most important determining factor in both groups, followed by lactate dehydrogenase, uric acid, forced expiratory volume in one second, and albumin. In addition, for the NAFLD− group, the 5th and 6th most important factors were thyroid-stimulating hormone and systolic blood pressure, as compared with plasma calcium and body fat for the NAFLD+ group. However, the study's distinctive contribution lies in its adoption of ML methodologies, showcasing their superiority over traditional statistical approaches (herein MLR) and thereby highlighting the potential of ML as an invaluable advanced adjunct tool in clinical practice and research.
Abstract: Parallel kinematics machine (PKM) tools are advantageous over serial machine tools in machining complex-surface products. A manufacturing service system for PKM is developed to provide complex-surface machining services to potential geographically dispersed manufacturing enterprises. In order to ease integration with external systems, Web services are used to encapsulate post-processing functions of PKM legacy systems, including compilation, workspace calculation, interference calibration, and kinematics transformation. A service-oriented architecture (SOA) is proposed for the cooperative work between the PKM system and its client. The workflow and the function modules of this manufacturing service system are presented. An example shows that, as a result of SOA and loose coupling, such a Web-service-based manufacturing service system is easier to integrate and interoperate with its client. Meanwhile, the system decreases manufacturing cost and improves efficiency compared with its predecessor distributed system.
Funding: Project supported by the National Natural Science Foundation of China (No. 10271110) and the Teaching and Research Award Program for Outstanding Young Teachers in Higher Education Institutions of MOE, China.
Abstract: This work investigates the online scheduling problem on two parallel and identical machines with a new feature: service requests from various customers are entitled to different grade of service (GoS) levels, so each job and each machine is labelled with a GoS level, and a job can be processed by a particular machine only when its GoS level is no less than that of the machine. The goal is to minimize the makespan. For the non-preemptive version, we propose an optimal online algorithm with competitive ratio 5/3. For the preemptive version, we propose an optimal online algorithm with competitive ratio 3/2.
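The eligibility constraint can be made concrete with a small sketch: machine 0 carries GoS level 1 (it may run any job), machine 1 carries level 2 (it may run only level-2 jobs, since a job needs GoS ≥ machine level). The greedy rule below is an illustrative baseline for this setting, not the paper's 5/3-competitive algorithm, and the job list is a hypothetical example.

```python
def greedy_gos_schedule(jobs):
    """Greedy list scheduling on two machines with GoS eligibility.

    Machine 0 has GoS level 1, machine 1 has GoS level 2.  A job
    (p, g) with processing time p may run on a machine only if
    g >= machine level: g = 1 jobs are forced onto machine 0,
    g = 2 jobs may run on either machine.  Illustrative greedy
    baseline, not the paper's 5/3-competitive algorithm.
    """
    load = [0, 0]
    for p, g in jobs:
        if g == 1:
            load[0] += p                             # only eligible machine
        else:
            k = 0 if load[0] <= load[1] else 1       # least-loaded machine
            load[k] += p
    return max(load)

# Greedy spreads the two (2, 2) jobs across both machines, then the
# grade-1 job (3, 1) must join machine 0: makespan 5, versus optimum 4
# (machine 0: (3, 1) + (1, 2); machine 1: (2, 2) + (2, 2)).
jobs = [(2, 2), (2, 2), (3, 1), (1, 2)]
print(greedy_gos_schedule(jobs))
```

The 5/4 gap on this instance shows why naive greedy is not optimal online: flexible level-2 jobs placed on machine 0 can later collide with forced level-1 jobs.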
Abstract: With the rising demand for data access, network service providers face the challenge of growing capital and operating costs while simultaneously enhancing network capacity and meeting the increased demand for access. To increase the efficacy of the Software Defined Network (SDN) and Network Function Virtualization (NFV) framework, we need to eradicate network security configuration errors that may create vulnerabilities, affect overall efficiency, reduce network performance, and increase maintenance cost. Existing frameworks lack security, and computer systems face abnormalities, which prompts the need for recognition and mitigation methods that proactively keep the system in an operational state. The fundamental concept behind SDN-NFV is the shift from specific resource execution to a programming-based structure. This research centers on the combination of SDN and NFV for rational decision making to control and monitor traffic in the virtualized environment. The combination is often seen as an extra burden in terms of resource usage in a heterogeneous network environment, but it also provides a solution to critical problems, especially massive network traffic issues. Attacks have been expanding step by step, making them hard to recognize and protect against by conventional methods. To overcome these issues, there must be an autonomous system to recognize and characterize any abnormal conduct in the network traffic. Only four types of assaults, namely HTTP Flood, UDP Flood, Smurf Flood, and SiDDoS Flood, are considered in the identified dataset, to optimize the stability of the SDN-NFV environment and security management, through several machine-learning-based characterization techniques: Support Vector Machine (SVM), K-Nearest Neighbors (KNN), Logistic Regression (LR), and Isolation Forest (IF). Python is used for simulation purposes, with several valuable utilities such as the mine package and the open-source Python ML libraries Scikit-learn, NumPy, SciPy, and Matplotlib. A few Flood assaults and Structured Query Language (SQL) injection anomalies are validated and effectively identified through the anticipated procedure. The classification results are promising and show that overall accuracy lies between 87% and 95% for the SVM, LR, KNN, and IF classifiers in determining whether network traffic is normal or anomalous in the SDN-NFV environment.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 62075006); the National Key Research and Development Program of China (Grant No. 2021YFB3600403); and the Natural Science Talents Foundation (Grant No. KSRC22001532).
Abstract: The performance of metal halide perovskite solar cells (PSCs) relies heavily on the experimental parameters, including the fabrication processes and the compositions of the perovskites, and tremendous experimental work has been done to optimize these factors. However, predicting the device performance of PSCs from the fabrication parameters before experiments is still challenging. Herein, we bridge this gap by machine learning (ML) based on a dataset of 1072 devices from peer-reviewed publications. The optimized ML model accurately predicts the PCE from the experimental parameters with a root mean square error of 1.28% and a Pearson coefficient r of 0.768. Moreover, the factors governing the device performance are ranked by SHapley Additive exPlanations (SHAP), among which the A-site cation is crucial to obtaining highly efficient PSCs. Experiments and density functional theory calculations are employed to validate and help explain the predictions of the ML model. Our work demonstrates the feasibility of ML in predicting device performance from experimental parameters before experiments, enabling reverse experimental design toward highly efficient PSCs.
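The two figures of merit quoted in these abstracts (RMSE and Pearson coefficient r) are simple to compute directly. The predicted/measured PCE pairs below are hypothetical placeholders, not values from the paper's dataset.

```python
from math import sqrt

def rmse(pred, true):
    # Root mean square error between predictions and measurements.
    return sqrt(sum((p - t) ** 2 for p, t in zip(pred, true)) / len(true))

def pearson_r(pred, true):
    # Pearson correlation: covariance normalized by both standard deviations.
    n = len(true)
    mp, mt = sum(pred) / n, sum(true) / n
    cov = sum((p - mp) * (t - mt) for p, t in zip(pred, true))
    sp = sqrt(sum((p - mp) ** 2 for p in pred))
    st = sqrt(sum((t - mt) ** 2 for t in true))
    return cov / (sp * st)

# Hypothetical predicted vs. measured PCE values (%):
pred = [18.2, 20.1, 15.4, 21.0, 17.3]
true = [17.5, 20.8, 16.0, 21.9, 16.8]
print(round(rmse(pred, true), 3), round(pearson_r(pred, true), 3))
```

RMSE is in the same units as PCE (percentage points here), while r is dimensionless, which is why papers report both.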
Funding: Project (RTI04-01-03) supported by the Regional Technology Innovation Program of the Ministry of Knowledge Economy (MKE), Korea.
Abstract: The cam mechanism is one of the most popular devices for generating irregular motion and is widely used in automatic equipment, such as textile machines, internal combustion engines, and other automatic devices. In order to obtain positive motion of the follower from a rotating cam, the cam's shape should be correctly designed and manufactured. The development of an adequate CAD/CAM system for a cam profile CNC grinding machine is necessary to manufacture high-precision cams. The purpose of this study is the development of a CAD/CAM system and a profile measuring device for a CNC grinding machine to obtain an optimal grinding speed with constant surface roughness. Three types of disk cams were manufactured using the proposed algorithm and procedures to verify the effectiveness of the developed CAD/CAM system.
Funding: This work has been supported by Fundación Séneca-Agencia de Ciencia y Tecnología de la Región de Murcia under FPI Grant 21429/FPI/20, co-funded by Odin Solutions S.L., Región de Murcia (Spain); the Spanish Ministry of Science, Innovation and Universities, under the projects ONOFRE 3 (Grant No. PID2020-112675RB-C44) and 5GHuerta (Grant No. EQC2019-006364-P), both with ERDF funds; and the European Commission, under the INSPIRE-5Gplus (Grant No. 871808) project.
Abstract: The exponential growth of mobile applications and services in recent years has challenged existing network infrastructures. Consequently, the arrival of multiple management solutions to cope with this explosion along the end-to-end network chain has increased the complexity of the coordinated orchestration of the different segments composing the whole infrastructure. The Zero-touch Network and Service Management (ZSM) concept has recently emerged to automatically orchestrate and manage network resources while assuring the Quality of Experience (QoE) demanded by users. Machine Learning (ML) is one of the key enabling technologies that many ZSM frameworks are adopting to bring intelligent decision making to the network management system. This paper presents a comprehensive survey of the state-of-the-art application of ML-based techniques to improve ZSM performance. To this end, the main related standardization activities and the aligned international projects and research efforts are deeply examined. From this dissection, the skyrocketing growth of the ZSM paradigm can be observed. Concretely, different standardization bodies have already designed reference architectures to set the foundations of novel automatic network management functions and resource orchestration. Aligned with these advances, diverse ML techniques are currently being exploited to build further ZSM developments in different aspects, including multi-tenancy management, traffic monitoring, and architecture coordination, among others. However, challenges such as the complexity, scalability, and security of ML mechanisms are also identified, and future research guidelines are provided to accomplish a firm development of the ZSM ecosystem.
Funding: This research was financially supported by the Ministry of Small and Medium-sized Enterprises (SMEs) and Startups (MSS), Korea, under the "Regional Specialized Industry Development Program (R&D, S3091627)" supervised by the Korea Institute for Advancement of Technology (KIAT).
Abstract: One of the common transportation systems in Korea is calling taxis through online applications, which is more convenient for passengers and drivers in the modern era. However, a passenger's taxi request can be rejected depending on the driver's location and distance, so there is a need to model the driver's acceptance or rejection of a received request. The security of this system is another core concern, to safeguard transaction information and the safety of passengers and drivers. In this study, trip origins and destinations on Jeju Island, South Korea were captured from T-map and processed with machine learning decision tree and XGBoost techniques. The blockchain framework is implemented on the Hyperledger Fabric platform. The experimental results reveal socio-economic features, and cross-validation was performed. Distance is another factor in taxi trips, and trips at midnight are notably shorter in total. This process demonstrates successful matching for ride-hailing taxi services in terms of distance, trip requests, and safety, based on city-wide measurements.
Funding: National Key R&D Program of China (Grant No. 2019YFB1704600); National Natural Science Foundation of China (Grant Nos. 51825502, 51775216); Program for HUST Academic Frontier Youth Team of China (Grant No. 2017QYTD04).
Abstract: With the continuous development of science and technology, electronic devices have entered all aspects of human life and become increasingly closely tied to it, and users have higher quality requirements for electronic devices. Electronic device testing has gradually become an irreplaceable engineering process in modern manufacturing enterprises to guarantee product quality while preventing inferior products from entering the market. Considering the large output of electronic devices, improving testing efficiency while reducing testing cost has become an urgent problem. This study investigates the electronic device testing machine allocation problem (EDTMAP), aiming to improve the production of electronic devices and reduce the scheduling distance among testing machines through reasonable machine allocation. First, a mathematical model was formulated for the EDTMAP to maximize production and minimize the scheduling distance among testing machines. Second, we developed a discrete multi-objective artificial bee colony (DMOABC) algorithm to solve the EDTMAP. A crossover operator and a local search operator were designed to improve the exploration and exploitation of the algorithm, respectively. Numerical experiments were conducted to evaluate the performance of the proposed algorithm. The experimental results demonstrate the superiority of the proposed algorithm over the non-dominated sorting genetic algorithm II (NSGA-II) and the strength Pareto evolutionary algorithm 2 (SPEA2). Finally, the mathematical model and DMOABC algorithm were applied to a real-world factory that tests radio-frequency modules. The results verify that our method can significantly improve production and reduce the scheduling distance among testing machines.
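Comparing multi-objective algorithms such as DMOABC, NSGA-II, and SPEA2 rests on Pareto dominance between objective vectors, here (production to maximize, scheduling distance to minimize). A minimal sketch of extracting the non-dominated front, with hypothetical objective values:

```python
def dominates(a, b):
    """a dominates b when a is no worse in both objectives and strictly
    better in at least one (maximize production, minimize distance)."""
    prod_a, dist_a = a
    prod_b, dist_b = b
    return (prod_a >= prod_b and dist_a <= dist_b) and \
           (prod_a > prod_b or dist_a < dist_b)

def pareto_front(solutions):
    # Keep every solution not dominated by any other.
    return [s for s in solutions
            if not any(dominates(t, s) for t in solutions if t != s)]

# Hypothetical (production, scheduling distance) pairs:
sols = [(90, 12), (95, 15), (85, 10), (95, 11), (80, 20)]
print(pareto_front(sols))
```

Here (95, 11) dominates (95, 15), (90, 12), and (80, 20), while (85, 10) survives because nothing matches its distance without losing production, so the front keeps both trade-offs.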
Abstract: In order to incorporate smart elements into distribution networks at ITELCA laboratories in Bogotá, Colombia, a Machine-to-Machine-based solution has been developed. This solution aids the process of low-cost electrical fault location, which contributes to improving quality of service, particularly by shortening interruption time spans in mid-voltage grids. The implementation makes use of a GENEKO modem, exploiting its digital inputs together with full coverage of certain required auxiliary services so as to generate proper detection signals whenever failure currents occur, which allows incorporating the latest failure detection technology into the system.
Funding: Supported by the Science and Technology Support Program of Qiandongnan Prefecture, No. Qiandongnan Sci-Tech Support [2021]12; Guizhou Province High-Level Innovative Talent Training Program, No. Qiannan Thousand Talents [2022]201701.
Abstract: BACKGROUND: Intensive care unit-acquired weakness (ICU-AW) is a common complication that significantly impacts the patient's recovery process and can even lead to adverse outcomes, and there is currently a lack of effective preventive measures. AIM: To identify significant risk factors for ICU-AW through iterative machine learning techniques and offer recommendations for its prevention and treatment. METHODS: Patients were categorized into ICU-AW and non-ICU-AW groups on the 14th day post-ICU admission. Relevant data from the initial 14 days of ICU stay, such as age, comorbidities, sedative dosage, vasopressor dosage, duration of mechanical ventilation, length of ICU stay, and rehabilitation therapy, were gathered, and the relationships between these variables and ICU-AW were examined. Using iterative machine learning techniques, a multilayer perceptron neural network model was developed, and its predictive performance for ICU-AW was assessed using the receiver operating characteristic curve. RESULTS: In the ICU-AW group, age, duration of mechanical ventilation, lorazepam dosage, adrenaline dosage, and length of ICU stay were significantly higher than in the non-ICU-AW group. Additionally, the proportions of sepsis, multiple organ dysfunction syndrome, hypoalbuminemia, acute heart failure, respiratory failure, acute kidney injury, anemia, stress-related gastrointestinal bleeding, shock, hypertension, coronary artery disease, malignant tumors, and rehabilitation therapy were significantly higher in the ICU-AW group. The most influential factors contributing to ICU-AW were identified as the length of ICU stay (100.0%) and the duration of mechanical ventilation (54.9%). The neural network model predicted ICU-AW with an area under the curve of 0.941, sensitivity of 92.2%, and specificity of 82.7%. CONCLUSION: The main factors influencing ICU-AW are the length of ICU stay and the duration of mechanical ventilation. A primary preventive strategy, when feasible, involves minimizing both the ICU stay and the duration of mechanical ventilation.
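The area under the receiver operating characteristic curve reported above can be computed directly from predicted risk scores via the rank (Mann-Whitney) formulation, without tracing the curve itself. The patient scores below are hypothetical, not the study's data.

```python
def roc_auc(scores, labels):
    """AUC = probability that a randomly chosen positive case scores
    higher than a randomly chosen negative case (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical model risk scores for ICU-AW (1) vs. non-ICU-AW (0) patients:
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2]
labels = [1,   1,   0,   1,   0,   0,   0]
print(roc_auc(scores, labels))
```

An AUC of 0.941, as reported, means the model ranks a random ICU-AW patient above a random non-ICU-AW patient about 94% of the time.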
Abstract: Stroke is a leading cause of disability and mortality worldwide, necessitating the development of advanced technologies to improve its diagnosis, treatment, and patient outcomes. In recent years, machine learning techniques have emerged as promising tools in stroke medicine, enabling efficient analysis of large-scale datasets and facilitating personalized and precision medicine approaches. This abstract provides a comprehensive overview of machine learning's applications, challenges, and future directions in stroke medicine. Recently introduced machine learning algorithms have been extensively employed in all fields of stroke medicine, and machine learning models have demonstrated remarkable accuracy in imaging analysis, diagnosing stroke subtypes, risk stratification, guiding medical treatment, and predicting patient prognosis. Despite the tremendous potential of machine learning in stroke medicine, several challenges must be addressed. These include the need for standardized and interoperable data collection, robust model validation and generalization, and the ethical considerations surrounding privacy and bias. In addition, integrating machine learning models into clinical workflows and establishing regulatory frameworks are critical for ensuring their widespread adoption and impact in routine stroke care. Machine learning promises to revolutionize stroke medicine by enabling precise diagnosis, tailored treatment selection, and improved prognostication. Continued research and collaboration among clinicians, researchers, and technologists are essential for overcoming challenges and realizing the full potential of machine learning in stroke care, ultimately leading to enhanced patient outcomes and quality of life. This review aims to summarize the current implications of machine learning in stroke diagnosis, treatment, and prognostic evaluation, and to explore the future perspectives these techniques can provide in combating this disabling disease.
Funding: Supported by the National Key R&D Program of China (No. 2021YFB3701705).
Abstract: This work constructed a machine learning (ML) model to predict the atmospheric corrosion rate of low-alloy steels (LAS), using the material properties of LAS, environmental factors, and exposure time as the input and the corrosion rate as the output. Six different ML algorithms were used to construct the proposed model. Through optimization and filtering, the eXtreme Gradient Boosting (XGBoost) model exhibited good corrosion rate prediction accuracy. The features of the material properties were then transformed into atomic and physical features using the proposed property transformation approach, and the dominant descriptors affecting the corrosion rate were filtered using recursive feature elimination (RFE) together with XGBoost. The established ML models exhibited better prediction performance and generalization ability with the property transformation descriptors. In addition, the SHapley Additive exPlanations (SHAP) method was applied to analyze the relationship between the descriptors and the corrosion rate. The results showed that the property transformation model could effectively help analyze the corrosion behavior, thereby significantly improving the generalization ability of corrosion rate prediction models.
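Recursive feature elimination works by repeatedly scoring the remaining descriptors and dropping the least important one until the desired count is reached. The sketch below substitutes a simple |Pearson r|-with-target importance for the XGBoost-based importance used in the paper; the descriptor names and values are hypothetical.

```python
from statistics import mean

def abs_corr(x, y):
    # |Pearson correlation| between a feature column and the target.
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return abs(cov / (sx * sy))

def rfe(features, y, keep):
    """Drop the least 'important' descriptor until `keep` remain.
    Importance here is |Pearson r| with the target -- a stand-in for
    the XGBoost gain used in the paper."""
    feats = dict(features)
    while len(feats) > keep:
        weakest = min(feats, key=lambda k: abs_corr(feats[k], y))
        del feats[weakest]
    return sorted(feats)

# Hypothetical descriptors vs. corrosion rate:
y = [1.0, 2.1, 2.9, 4.2, 5.0]
features = {
    "Cl_deposition": [0.9, 2.0, 3.1, 4.0, 5.1],    # strongly related
    "exposure_time": [1.2, 1.9, 3.3, 3.8, 5.2],    # strongly related
    "noise_descriptor": [3.0, 1.0, 4.0, 1.0, 5.0], # weakly related
}
print(rfe(features, y, keep=2))
```

In a real pipeline the model would be refitted after each elimination so importances reflect the surviving feature set, which is what distinguishes RFE from single-pass filtering.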
Funding: Financially supported by the National Key Research and Development Program of China (No. 2016YFB0701202, No. 2017YFB0701500 and No. 2020YFB1505901); the National Natural Science Foundation of China (General Program No. 51474149, 52072240); the Shanghai Science and Technology Committee (No. 18511109300); the Science and Technology Commission of the CMC (2019JCJQZD27300); University of Michigan and Shanghai Jiao Tong University joint funding, China (AE604401); and the Science and Technology Commission of Shanghai Municipality (No. 18511109302).
Abstract: Magnesium (Mg) alloys have shown great promise as both structural and biomedical materials, but poor corrosion resistance limits their further application. In this work, to avoid time-consuming and laborious experimental trials, a high-throughput computational strategy based on first-principles calculations is designed for screening corrosion-resistant binary Mg alloys with intermetallics, from both the thermodynamic and kinetic perspectives. The stable binary Mg intermetallics with a low equilibrium potential difference with respect to the Mg matrix are first identified. Then, the hydrogen adsorption energies on the surfaces of these Mg intermetallics are calculated, and the corrosion exchange current density is further calculated by a hydrogen evolution reaction (HER) kinetic model. Several intermetallics, e.g. Y_(3)Mg, Y_(2)Mg and La_(5)Mg, are identified as promising intermetallics that might effectively hinder the cathodic HER. Furthermore, machine learning (ML) models are developed to predict Mg intermetallics with proper hydrogen adsorption energy, employing the work function (W_(f)) and the weighted first ionization energy (WFIE). The generalization of the ML models is tested on five new binary Mg intermetallics, with an average root mean square error (RMSE) of 0.11 eV. This study not only predicts some promising binary Mg intermetallics which may suppress galvanic corrosion, but also provides a high-throughput screening strategy and ML models for the design of corrosion-resistant alloys, which can be extended to ternary Mg alloys or other alloy systems.
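The two-stage screening described above can be sketched as successive filters: a thermodynamic cut on the equilibrium potential difference versus the Mg matrix, then a kinetic cut on the hydrogen adsorption energy. All candidate values and thresholds below are hypothetical placeholders (including the assumption that a more positive adsorption energy implies sluggish HER), not the paper's DFT results.

```python
# Hypothetical candidates:
#   (potential difference vs. Mg [V], hydrogen adsorption energy [eV])
candidates = {
    "Y3Mg":  (0.05, 0.45),
    "Y2Mg":  (0.08, 0.50),
    "La5Mg": (0.10, 0.40),
    "AlMg":  (0.40, -0.20),
    "ZnMg":  (0.30, 0.05),
}

DV_MAX = 0.15  # thermodynamic cut: small galvanic driving force
EH_MIN = 0.30  # kinetic cut: weak H adsorption, assumed to slow HER

# Stage 1: thermodynamic filter on potential difference.
stage1 = {name: v for name, v in candidates.items() if v[0] <= DV_MAX}
# Stage 2: kinetic filter on hydrogen adsorption energy.
stage2 = sorted(name for name, v in stage1.items() if v[1] >= EH_MIN)
print(stage2)
```

The value of the funnel is that the cheap thermodynamic filter prunes the candidate pool before the more expensive surface-adsorption calculations are needed.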
Funding: Supported by the National Natural Science Foundation of China (22278070, 21978047, 21776046).
Abstract: The high-throughput prediction of the thermodynamic phase behavior of active pharmaceutical ingredients (APIs) with pharmaceutically relevant excipients remains a major scientific challenge in the screening of pharmaceutical formulations. In this work, the developed machine-learning model efficiently predicts the solubility of APIs in polymers by learning the phase equilibrium principle and using a few molecular descriptors. Under the few-shot learning framework, thermodynamic theory (perturbed-chain statistical associating fluid theory) was used for data augmentation, and computational chemistry was applied for molecular descriptor screening. The results showed that the developed machine-learning model can predict the API-polymer phase diagram accurately, broaden the solubility data of APIs in polymers, and successfully reproduce the relationship between API solubility and the interaction mechanisms between API and polymer, providing efficient guidance for the development of pharmaceutical formulations.