Journal Articles
6,384 articles found
1. Scheduling an Energy-Aware Parallel Machine System with Deteriorating and Learning Effects Considering Multiple Optimization Objectives and Stochastic Processing Time
Authors: Lei Wang, Yuxin Qi. Computer Modeling in Engineering & Sciences (SCIE, EI), 2023, No. 4, pp. 325-339 (15 pages)
Abstract: Currently, energy conservation draws wide attention in industrial manufacturing systems. In recent years, many studies have aimed at saving energy in the manufacturing process, and scheduling is regarded as an effective approach. This paper puts forward a multi-objective stochastic parallel machine scheduling problem that considers deteriorating and learning effects, in which the real processing time of jobs is calculated from their processing speed and normal processing time. To describe this problem mathematically, a multi-objective stochastic programming model aiming to minimize makespan and energy consumption is formulated. Furthermore, we develop a multi-objective multi-verse optimization algorithm combined with a stochastic simulation method: the multi-verse optimization searches the huge solution domain for favorable solutions, while the stochastic simulation method assesses them. Comparison experiments on test problems verify that the developed approach copes with the considered problem better than two classic multi-objective evolutionary algorithms.
Keywords: energy consumption optimization; parallel machine scheduling; multi-objective optimization; deteriorating and learning effects; stochastic simulation
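The stochastic-simulation evaluation step this abstract describes can be sketched as a Monte Carlo estimate of a schedule's expected makespan. Everything below is illustrative: the fixed job-to-machine assignment, the normal distribution of processing times, and the parameter names are assumptions, and the paper's deterioration/learning model is not reproduced.

```python
import random

def expected_makespan(assignment, mean_times, sigma, n_machines, n_samples=200, seed=0):
    """Estimate expected makespan by sampling uncertain processing times.

    assignment[j] = machine index of job j; each sampled processing time is
    drawn as max(0, Normal(mean_times[j], sigma)).
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        loads = [0.0] * n_machines
        for job, machine in enumerate(assignment):
            loads[machine] += max(0.0, rng.gauss(mean_times[job], sigma))
        total += max(loads)  # makespan of this sampled scenario
    return total / n_samples
```

In a metaheuristic such as multi-verse optimization, a function like this would serve as the fitness evaluator for each candidate schedule.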
2. Recent Developments of Modulation and Control for High-Power Current-Source-Converters Fed Electric Machine Systems (Cited by 3)
Authors: Pengcheng Liu, Zheng Wang, Sanmin Wei, Yuwen Bo, Shaoning Pu. CES Transactions on Electrical Machines and Systems (CSCD), 2020, No. 3, pp. 215-226 (12 pages)
Abstract: Pulse-width-modulated (PWM) current-source converter (CSC) fed electric machine systems can be considered a type of high-reliability energy conversion system, since they work with a long-life DC-link inductor and offer high fault-tolerant capability against short-circuit faults. Besides, they provide motor-friendly waveforms and four-quadrant operation, so they are suitable for high-power applications such as fans, pumps, compressors and wind power generation. The purpose of this paper is to comprehensively review recent developments in key modulation and control technologies for high-power (HP) PWM-CSC fed electric machine systems, including reduction of low-order current harmonics, suppression of inductor-capacitor (LC) resonance, mitigation of common-mode voltage (CMV), and control of modular PWM-CSC fed systems. In particular, recent work on overlapping effects during commutation, LC resonance suppression under fault-tolerant operation, and collaboration of modular PWM-CSCs is described. Both theoretical analysis and selected simulation and experimental results are presented. Finally, a brief discussion of future trends for HP CSC fed electric machine systems is given.
Keywords: current source converter (CSC); high power (HP) applications; electric machine system; inductor-capacitor (LC) resonance; low-order current harmonics; common-mode voltage (CMV); modulation; control
3. Modeling and Control of Hybrid Machine Systems—a Five-bar Mechanism Case (Cited by 13)
Authors: Hongnian Yu. International Journal of Automation and Computing (EI), 2006, No. 3, pp. 235-243 (9 pages)
Abstract: A hybrid machine (HM), as a typical mechatronic device, is a useful tool for generating smooth motion. It combines the motions of a large constant-speed motor with a small servo motor by means of a mechanical linkage mechanism, in order to provide a powerful programmable drive system. To achieve design objectives, a control system is required, and to design a better control system and analyze the performance of an HM, a dynamic model is necessary. This paper first develops a dynamic model of an HM with a five-bar mechanism using a Lagrangian formulation. Then, several important properties that are very useful in system analysis and control system design are presented. Based on the developed dynamic model, two control approaches, computed torque and combined computed torque and sliding mode control, are adopted to control the HM system. Simulation results demonstrate the control performance and limitations of each approach.
Keywords: hybrid machine (HM); Lagrangian systems; dynamics; computed torque control; sliding mode control
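The computed-torque approach named in this abstract admits a compact sketch. The block below is a generic feedback-linearizing controller for a rigid mechanism of the form M(q)q̈ + C(q,q̇)q̇ + G(q) = τ, not the paper's five-bar model; the dynamics callables M, C, G and the gain matrices Kp, Kd are placeholders the reader must supply.

```python
import numpy as np

def computed_torque(q, dq, q_d, dq_d, ddq_d, M, C, G, Kp, Kd):
    """Computed-torque law: tau = M(q)(ddq_d + Kd*de + Kp*e) + C(q,dq)dq + G(q).

    e and de are position and velocity tracking errors; the inner term is the
    stabilized reference acceleration that linearizes the closed loop.
    """
    e, de = q_d - q, dq_d - dq
    v = ddq_d + Kd @ de + Kp @ e
    return M(q) @ v + C(q, dq) @ dq + G(q)
```

With exact model knowledge this reduces the tracking error dynamics to a linear second-order system shaped by Kp and Kd, which is why the paper pairs it with sliding mode control to handle model mismatch.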
4. DXF File Identification with C# for CNC Engraving Machine System (Cited by 1)
Authors: Huibin Yang, Juan Yan. Intelligent Control and Automation, 2015, No. 1, pp. 20-28 (9 pages)
Abstract: This paper studies the key technology of open CNC engraving machines: DXF file identification. A graphic information extraction method is proposed, by which the graphic information in a DXF file can be identified and transformed into the bottom motion controller's code, so that the engraving machine can achieve trajectory tracking. An open CNC engraving machine system is then developed in C#. Finally, the method is validated on a three-axis motion experiment platform. The results show that the method can efficiently identify graphic information in DXF files, including lines, circles and arcs, and that the CNC engraving machine can be controlled well.
Keywords: DXF; CNC; engraving machine; Galil; C#
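As a rough illustration of the identification step: an ASCII DXF file stores data as alternating group-code/value line pairs, where code 0 starts an entity and codes 10/20 and 11/21 carry a LINE's start and end coordinates. The minimal Python sketch below (the authors' implementation is in C#) extracts only LINE entities and ignores circles, arcs, layers, and coordinate transforms.

```python
def parse_dxf_lines(text):
    """Extract LINE entities from ASCII DXF content.

    Returns a list of dicts with keys x1, y1, x2, y2 (start and end points).
    """
    raw = [ln.strip() for ln in text.splitlines()]
    pairs = list(zip(raw[::2], raw[1::2]))  # (group_code, value) pairs
    lines, current = [], None
    for code, value in pairs:
        if code == "0":  # group code 0 begins a new entity
            if current is not None:
                lines.append(current)
            current = {"x1": 0.0, "y1": 0.0, "x2": 0.0, "y2": 0.0} if value == "LINE" else None
        elif current is not None:
            key = {"10": "x1", "20": "y1", "11": "x2", "21": "y2"}.get(code)
            if key:
                current[key] = float(value)
    if current is not None:
        lines.append(current)
    return lines
```

Each extracted segment would then be translated into the motion controller's own command format for trajectory tracking.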
5. A Multilevel Design Method of Large-scale Machine System Oriented Network Environment
Authors: LI Shuiping, HE Jianjun (School of Mechanical & Electronical Engineering, Wuhan University of Technology, Wuhan 430070, China). 武汉理工大学学报 (Journal of Wuhan University of Technology) (CAS, CSCD, PKU Core), 2006, No. S2, pp. 565-569 (5 pages)
Abstract: The design of a large-scale machine system is a very complex problem. Such design problems usually have many design variables and constraints, so they are difficult to solve rapidly and efficiently by conventional methods. In this paper, a new multilevel design method oriented to a network environment is proposed, which maps the design problem of a large-scale machine system into a hypergraph with a degree of linking strength (DLS) between vertices. By decomposing the hypergraph, this method divides the complex design problem into small, simple subproblems that can be solved concurrently in a network.
Keywords: design; large-scale machine system; degree of linking strength
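The decomposition idea can be sketched simply if the hypergraph is reduced to pairwise links: drop every link whose degree of linking strength (DLS) falls below a threshold and treat each remaining connected component as an independent subproblem. The threshold rule and union-find formulation below are illustrative assumptions, not the paper's algorithm.

```python
def decompose(n, edges, threshold):
    """Partition n design variables into subproblems.

    edges: (u, v, dls) triples; links with dls below `threshold` are cut,
    and the connected components of what remains become the subproblems.
    """
    parent = list(range(n))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for u, v, dls in edges:
        if dls >= threshold:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())
```

Each returned group of variables could then be dispatched to a different node on the network and solved concurrently, as the abstract suggests.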
6. Coordinated Power System Stabilizers Design of a Nine-machine System
Authors: Yao-Nan Yu, Qing-Hua Li (Department of Electrical Engineering, The University of British Columbia, Canada). Electricity, 1992, No. 3, pp. 32-38 (7 pages)
Abstract: In our earlier paper, power system stabilizers (PSSs) were designed for a nine-machine system, a new pole-placement technique was developed for the design, and participation factors were used to decide how many stabilizers were required and where they should be placed. With each machine represented by a low-order linear model, there was some reservation about the results. In this paper, extensive transient simulations are performed with each machine represented by a high-order nonlinear model. Coherent groups are found, and a weighted speed deviation index (SDI) is defined to find the most unstable machines in the system. PSSs are designed after deciding the PSS number and sites. Transient simulations are then carried out again for the closed-loop system, and a system stability index (SSI) is used to evaluate its stability. It is found that three PSSs are sufficient to ensure the stability of the nine-machine system.
Keywords: reservation; decide; placement; machines; unstable; participation; earlier; behave; algebraic; again
7. Use of Machine Learning Models for the Prognostication of Liver Transplantation: A Systematic Review (Cited by 1)
Authors: Gidion Chongo, Jonathan Soldera. World Journal of Transplantation, 2024, No. 1, pp. 164-188 (25 pages)
Abstract: BACKGROUND: Liver transplantation (LT) is a life-saving intervention for patients with end-stage liver disease. However, the equitable allocation of scarce donor organs remains a formidable challenge, and prognostic tools are pivotal in identifying the most suitable transplant candidates. Traditionally, scoring systems like the model for end-stage liver disease have been instrumental in this process. Nevertheless, the landscape of prognostication is being transformed by the integration of machine learning (ML) and artificial intelligence models. AIM: To assess the utility of ML models in prognostication for LT, comparing their performance and reliability to established traditional scoring systems. METHODS: Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, we conducted a thorough and standardized literature search of the PubMed/MEDLINE database. Our search imposed no restrictions on publication year, age, or gender. Exclusion criteria encompassed non-English studies, review articles, case reports, conference papers, studies with missing data, or those exhibiting evident methodological flaws. RESULTS: Our search yielded a total of 64 articles, with 23 meeting the inclusion criteria. Among the selected studies, 60.8% originated from the United States and China combined, and only one pediatric study met the criteria. Notably, 91% of the studies were published within the past five years. ML models consistently demonstrated satisfactory to excellent area under the receiver operating characteristic curve values (ranging from 0.6 to 1) across all studies, surpassing the performance of traditional scoring systems. Random forest exhibited superior predictive capability for 90-day mortality following LT, sepsis, and acute kidney injury (AKI). In contrast, gradient boosting excelled in predicting the risk of graft-versus-host disease, pneumonia, and AKI. CONCLUSION: This study underscores the potential of ML models in guiding decisions related to allograft allocation and LT, marking a significant evolution in the field of prognostication.
Keywords: liver transplantation; machine learning models; prognostication; allograft allocation; artificial intelligence
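The review's headline comparison, an ML model's AUROC against a traditional score, can be illustrated on synthetic data. Everything below is hypothetical: the features, the labels, and the "traditional score" (taken to be a single feature used directly as a risk ranking) are fabricated for demonstration and have no relation to the reviewed studies' data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))  # six synthetic pre-transplant features
# Outcome driven mainly by the first two features plus noise
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=500) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# AUROC of the model vs. a single-feature "traditional score" baseline
auc_rf = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
auc_score = roc_auc_score(y_te, X_te[:, 0])
```

Because `roc_auc_score` accepts any continuous ranking, the same call scores both the model's predicted probabilities and a raw clinical score, which is how such comparisons are typically made.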
8. Social Media-Based Surveillance Systems for Health Informatics Using Machine and Deep Learning Techniques: A Comprehensive Review and Open Challenges
Authors: Samina Amin, Muhammad Ali Zeb, Hani Alshahrani, Mohammed Hamdi, Mohammad Alsulami, Asadullah Shaikh. Computer Modeling in Engineering & Sciences (SCIE, EI), 2024, No. 5, pp. 1167-1202 (36 pages)
Abstract: Social media (SM) based surveillance systems, combined with machine learning (ML) and deep learning (DL) techniques, have shown potential for early detection of epidemic outbreaks. This review discusses the current state of SM-based surveillance methods for early epidemic outbreaks and the role of ML and DL in enhancing their performance. Every year, SM generates a large amount of data related to epidemic outbreaks, particularly Twitter data. This paper outlines the theme of SM analysis for tracking health-related issues and detecting epidemic outbreaks, along with the ML and DL techniques that have been configured for outbreak detection. DL has emerged as a promising ML technique that adapts multiple layers of representations or features of the data and yields state-of-the-art extrapolation results. In recent years, following their success in many other application domains, both ML and DL have become popular in SM analysis. This paper provides an overview of epidemic outbreaks in SM and a comprehensive analysis of ML and DL approaches and their existing applications in SM analysis. Finally, the review offers suggestions, ideas, and proposals, and highlights the ongoing challenges in early outbreak detection that still need to be addressed.
Keywords: social media; epidemic; machine learning; deep learning; health informatics; pandemic
9. Computing Large Deviation Prefactors of Stochastic Dynamical Systems Based on Machine Learning
Authors: 李扬, 袁胜兰, 陆凌宏志, 刘先斌. Chinese Physics B (SCIE, EI, CAS, CSCD), 2024, No. 4, pp. 364-373 (10 pages)
Abstract: We present a large deviation theory that characterizes the exponential estimate for rare events in stochastic dynamical systems in the limit of weak noise. We consider a next-to-leading-order approximation for more accurate calculation of the mean exit time by computing large deviation prefactors with the aid of machine learning. More specifically, we design a neural network framework to compute the quasipotential, most probable paths, and prefactors based on the orthogonal decomposition of a vector field. We corroborate the higher effectiveness and accuracy of our algorithm on two toy models. Numerical experiments demonstrate its powerful functionality in exploring the internal mechanism of rare events triggered by weak random fluctuations.
Keywords: machine learning; large deviation prefactors; stochastic dynamical systems; rare events
10. A Systematic Literature Review of Machine Learning and Deep Learning Approaches for Spectral Image Classification in Agricultural Applications Using Aerial Photography
Authors: Usman Khan, Muhammad Khalid Khan, Muhammad Ayub Latif, Muhammad Naveed, Muhammad Mansoor Alam, Salman A. Khan, Mazliham Mohd Su’ud. Computers, Materials & Continua (SCIE, EI), 2024, No. 3, pp. 2967-3000 (34 pages)
Abstract: Recently, there has been a notable surge of interest in scientific research regarding spectral images. The potential of these images to revolutionize the digital photography industry, as in aerial photography through Unmanned Aerial Vehicles (UAVs), has captured considerable attention. One encouraging aspect is their combination with machine learning and deep learning algorithms, which have demonstrated remarkable outcomes in image classification. As a result of this powerful amalgamation, the adoption of spectral images has grown exponentially across various domains, with agriculture being one of the prominent beneficiaries. This paper presents an extensive survey of multispectral and hyperspectral images, focusing on their applications to classification challenges in diverse agricultural areas, including plants, grains, fruits, and vegetables. By meticulously examining primary studies, we delve into the specific agricultural domains where multispectral and hyperspectral images have found practical use. Additionally, our attention is directed towards machine learning techniques for effectively classifying hyperspectral images within the agricultural context. The findings reveal that deep learning and support vector machines have emerged as the most widely employed methods for hyperspectral image classification in agriculture. Nevertheless, we also shed light on the various issues and limitations of working with spectral images. This comprehensive analysis aims to provide valuable insights into the current state of spectral imaging in agriculture and its potential for future advancements.
Keywords: machine learning; deep learning; unmanned aerial vehicles; multi-spectral images; image recognition; object detection; hyperspectral images; aerial photography
11. Smart Energy Management System Using Machine Learning
Authors: Ali Sheraz Akram, Sagheer Abbas, Muhammad Adnan Khan, Atifa Athar, Taher M. Ghazal, Hussam Al Hamadi. Computers, Materials & Continua (SCIE, EI), 2024, No. 1, pp. 959-973 (15 pages)
Abstract: Energy management is an inspiring domain in the development of renewable energy sources. However, the growth of decentralized energy production brings increased complexity for power grid managers, demanding more quality and reliability to regulate electricity flows and less imbalance between electricity production and demand. The major objectives of an energy management system are to achieve optimum energy procurement and utilization throughout the organization, minimize energy costs without affecting production, and minimize environmental effects. Modern energy management is an essential and complex subject because of excessive consumption in residential buildings, which necessitates energy optimization alongside increased user comfort. To address energy management, many researchers have developed frameworks that seek a balance between user comfort and energy consumption, but the problem has not been fully solved because of its difficulty. An inclusive Intelligent Energy Management System (IEMS) aims to provide overall energy efficiency: increased power generation, greater flexibility, more renewable generation, improved energy consumption, reduced carbon dioxide emissions, improved stability, and reduced energy costs. Machine Learning (ML) is an emerging approach that may help predict energy efficiency with the assistance of the Internet of Energy (IoE) network, which plays a vital role in the energy sector by collecting effective data and usage information for smart resource management. In this research work, an IEMS is proposed for Smart Cities (SC) using ML to better resolve the energy management problem. The proposed system minimized energy consumption and provided better outcomes than previous approaches, with 92.11% accuracy and a 7.89% miss rate.
Keywords: intelligent energy management system; smart cities; machine learning
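The reported figures (92.11% accuracy, 7.89% miss rate) sum to 100%, which suggests the miss rate here is simply the complement of accuracy. Assuming those definitions (the abstract does not give the exact formulas), the metrics reduce to:

```python
def accuracy_and_miss_rate(y_true, y_pred):
    """Assumed definitions: accuracy = correct / total, miss rate = 1 - accuracy."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    acc = correct / len(y_true)
    return acc, 1.0 - acc
```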
12. Machine Learning and Human-Machine Trust in Healthcare: A Systematic Survey
Authors: Han Lin, Jiatong Han, Pingping Wu, Jiangyan Wang, Juan Tu, Hao Tang, Liuning Zhu. CAAI Transactions on Intelligence Technology (SCIE, EI), 2024, No. 2, pp. 286-302 (17 pages)
Abstract: As human-machine interaction (HMI) in healthcare continues to evolve, the issue of trust in healthcare HMI has been raised and explored. It is critical for the development and safety of healthcare that humans have proper trust in medical machines. Intelligent machines that apply machine learning (ML) technologies continue to penetrate deeper into the medical environment, which also places higher demands on intelligent healthcare. For machines to play their role in healthcare HMI more effectively and for human-machine cooperation to be more harmonious, good human-machine trust (HMT) in healthcare must be built. This article provides a systematic overview of prominent research on ML and HMT in healthcare. In addition, it explores and analyses ML and three important factors that influence HMT in healthcare, and then proposes an HMT model for healthcare. Finally, general trends are summarised and issues to address in future research on HMT in healthcare are identified.
Keywords: human-machine interaction; machine learning; trust
13. An Improved Enterprise Resource Planning System Using Machine Learning Techniques
Authors: Ahmed Youssri Zakaria, Elsayed Abdelbadea, Atef Raslan, Tarek Ali, Mervat Gheith, Al-Sayed Khater, Essam A. Amin. Journal of Software Engineering and Applications, 2024, No. 5, pp. 203-213 (11 pages)
Abstract: Traditional Enterprise Resource Planning (ERP) systems built on relational databases take weeks to deliver insights that advanced analytics can provide instantly. Advanced analytics examine the past and the future and capture information about the present, giving companies the most accurate information for making the best decisions. Integrating machine learning (ML) into financial ERP systems offers several benefits, including increased accuracy, efficiency, and cost savings. ERP systems are also crucial in overseeing different aspects of Human Capital Management (HCM) in organizations. Staff performance draws the interest of management: in particular, guaranteeing that the right employees are assigned to suitable tasks at the right time, training and qualifying them, building evaluation systems to follow up on their performance, and attempting to retain potential talent. Predicting employee salaries correctly is likewise necessary for the efficient distribution of resources, retaining talent, and ensuring the success of the organization as a whole. Conventional ERP salary forecasting typically uses static reports that only show the system's current state, without analyzing employee data or providing recommendations. We designed and implemented a prototype that applies ML algorithms to Oracle EBS data to enhance employee evaluation using real-time data taken directly from the ERP system. Based on accuracy measurements, the Random Forest algorithm performed best, offering an accuracy of 90% on the balanced dataset.
Keywords: ERP; HCM; machine learning; employee performance; Pythonista; Pythoneer
14. A Systematic Machine Learning Method for Reservoir Identification and Production Prediction (Cited by 1)
Authors: Wei Liu, Zhangxin Chen, Yuan Hu, Liuyang Xu. Petroleum Science (SCIE, EI, CAS, CSCD), 2023, No. 1, pp. 295-308 (14 pages)
Abstract: Reservoir identification and production prediction are two of the most important tasks in petroleum exploration and development. Machine learning (ML) methods have been used for petroleum-related studies, but not for production prediction based on reservoir identification. Production forecasting studies are typically based on overall reservoir thickness and lack accuracy when reservoirs contain a water or dry layer without oil production. In this paper, a systematic ML method was developed using classification models for reservoir identification and regression models, based on the identification results, for production prediction. For reservoir identification, seven optimized ML methods were used: four typical single ML methods and three ensemble ML methods. These methods classify the reservoir into five types of layers: water, dry, and three levels of oil (I, II and III oil layers). The validation and test results suggest the three ensemble methods perform better than the four single ML methods in reservoir identification, with XGBoost producing the model with the highest accuracy, up to 99%. The effective thickness of the I and II oil layers determined during reservoir identification was then fed into the production prediction models. Effective thickness accounts for the distribution of water and oil, resulting in more reasonable production predictions than those based on overall reservoir thickness. To validate the superiority of the ML methods, reference models using overall reservoir thickness were built for comparison. The models based on effective thickness outperformed the reference models in every evaluation metric, with prediction accuracy 10% higher than that of the reference models. Without the personal error or data distortion of traditional methods, this system enables rapid analysis while reducing the time required to resolve reservoir classification and production prediction challenges. The ML models using the effective thickness obtained from reservoir identification predicted oil production more accurately than previous studies that use overall reservoir thickness.
Keywords: reservoir identification; production prediction; machine learning; ensemble method
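The two-stage idea (classify each layer, sum the thickness of layers predicted as oil to get "effective thickness", then regress production on it) can be sketched end to end. The data below are synthetic, scikit-learn's `GradientBoostingClassifier` stands in for the paper's XGBoost, and the well/layer sizes and production model are invented for illustration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_layers = 300
X = rng.normal(size=(n_layers, 4))            # synthetic log features per layer
is_oil = (X[:, 0] > 0).astype(int)            # 1 = oil layer, 0 = water/dry
thickness = rng.uniform(0.5, 3.0, size=n_layers)

# Stage 1: classify layers (gradient boosting as an XGBoost stand-in)
clf = GradientBoostingClassifier(random_state=0).fit(X, is_oil)
pred_oil = clf.predict(X)

# Effective thickness per well: count only layers predicted as oil
# (30 hypothetical wells of 10 layers each)
eff = (thickness * pred_oil).reshape(30, 10).sum(axis=1)
true_eff = (thickness * is_oil).reshape(30, 10).sum(axis=1)
production = 12.0 * true_eff + rng.normal(scale=2.0, size=30)  # toy ground truth

# Stage 2: regress production on effective thickness
reg = LinearRegression().fit(eff.reshape(-1, 1), production)
r2 = reg.score(eff.reshape(-1, 1), production)
```

The reference models the paper compares against would instead regress production on total thickness per well, diluting the signal with water and dry layers.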
15. Machine Learning Accelerated Real-Time Model Predictive Control for Power Systems (Cited by 1)
Authors: Ramij Raja Hossain, Ratnesh Kumar. IEEE/CAA Journal of Automatica Sinica (SCIE, EI, CSCD), 2023, No. 4, pp. 916-930 (15 pages)
Abstract: This paper presents a machine-learning-based speedup strategy for real-time implementation of model predictive control (MPC) in emergency voltage stabilization of power systems. Despite success in various applications, real-time implementation of MPC in power systems has not been successful because the online control computation time required for large, complex systems far exceeds the decision time available in practice. This long-standing problem is addressed here by developing a novel MPC-based framework that i) computes an optimal strategy for nominal loads offline and adapts it to real-time scenarios by successive online control corrections at each control instant using the latest measurements, and ii) employs a machine-learning-based approach for predicting the voltage trajectory and its sensitivity to control inputs, thereby accelerating the overall control computation many times over. Additionally, a realistic control coordination scheme among static var compensators (SVC), load shedding (LS), and load tap changers (LTC) is presented that incorporates the practical delayed actions of the LTCs. The performance of the proposed scheme is validated on the IEEE 9-bus and 39-bus systems, with ±20% variations in nominal loading conditions together with contingencies. We show that our methodology speeds up the online computation 20-fold, bringing it down to a practically feasible value (a fraction of a second) and making MPC real-time and feasible for power system control for the first time.
Keywords: machine learning; model predictive control (MPC); neural network; perturbation control; voltage stabilization
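The successive-correction idea in point i) can be sketched as a single update step: adapt the offline nominal control using a (learned) sensitivity of the predicted voltage to the control input. The linear pseudo-inverse update below is an illustrative assumption, not the authors' exact scheme, and the sensitivity matrix would come from the trained predictor rather than being known in closed form.

```python
import numpy as np

def corrected_control(u_nominal, v_pred, v_ref, sensitivity, gain=1.0):
    """One online correction: u = u_nom + gain * S^+ (v_ref - v_pred).

    sensitivity S maps a control perturbation to the resulting change in the
    predicted bus voltages; its pseudo-inverse gives the correction direction.
    """
    S_pinv = np.linalg.pinv(sensitivity)
    return u_nominal + gain * S_pinv @ (v_ref - v_pred)
```

Because the expensive part (predicting v and S) is delegated to a learned model, each control instant reduces to a cheap linear-algebra update, which is the source of the claimed speedup.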
16. Investigation of Nonlinear PI Multi-loop Control Strategy for Aircraft HVDC Generator System with Wound Rotor Synchronous Machine (Cited by 1)
Authors: Zhaoyang Qu, Zhuoran Zhang, Jincai Li, Heng Shi. CES Transactions on Electrical Machines and Systems (CSCD), 2023, No. 1, pp. 92-99 (8 pages)
Abstract: To enhance the transient performance of an aircraft high-voltage DC (HVDC) generation system with a wound rotor synchronous machine (WRSM) over a wide speed range, a nonlinear PI multi-loop control strategy is proposed in this paper. Traditional voltage control hardly meets the dynamic performance requirements of the HVDC generation system over a wide speed range, so nonlinear PI parameter adjustment, load current feedback, and speed feedback are added to the voltage and excitation current double-loop control. The transfer function of the HVDC generation system is derived, and the relationship between speed, load current, and PI parameters is obtained. The PI parameters corresponding to the load at a given speed are used to shorten the settling time when the load suddenly changes. The dynamic responses during transients are analyzed experimentally, and the results show that a WRSM HVDC generator system with this method has better dynamic performance.
Keywords: aircraft HVDC generation system; more electric aircraft; transient performance; wound rotor synchronous machine
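One simple reading of "nonlinear PI parameter adjustment" with load and speed feedback is gain scheduling: selecting the PI gains from the measured operating point. The nearest-neighbour lookup and table values below are illustrative assumptions, not the paper's tuning rule, which is derived from the system's transfer function.

```python
def scheduled_pi_gains(speed, load_current, table):
    """Pick (kp, ki) for the closest tabulated operating point.

    table maps (speed, load_current) tuples to (kp, ki) pairs; closeness is
    measured with a crude L1 distance for illustration.
    """
    key = min(table, key=lambda k: abs(k[0] - speed) + abs(k[1] - load_current))
    return table[key]
```

A practical controller would interpolate between table entries (or evaluate the analytic gain formulas) rather than snap to the nearest one, but the structure is the same: gains become functions of speed and load.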
17. Significant Risk Factors for Intensive Care Unit-acquired Weakness: A Processing Strategy Based on Repeated Machine Learning (Cited by 2)
Authors: Ling Wang, Deng-Yan Long. World Journal of Clinical Cases (SCIE), 2024, No. 7, pp. 1235-1242 (8 pages)
Abstract: BACKGROUND: Intensive care unit-acquired weakness (ICU-AW) is a common complication that significantly impacts a patient's recovery and can even lead to adverse outcomes. Currently, there is a lack of effective preventive measures. AIM: To identify significant risk factors for ICU-AW through iterative machine learning techniques and offer recommendations for its prevention and treatment. METHODS: Patients were categorized into ICU-AW and non-ICU-AW groups on the 14th day post-ICU admission. Relevant data from the initial 14 days of ICU stay, such as age, comorbidities, sedative dosage, vasopressor dosage, duration of mechanical ventilation, length of ICU stay, and rehabilitation therapy, were gathered, and the relationships between these variables and ICU-AW were examined. Utilizing iterative machine learning techniques, a multilayer perceptron neural network model was developed, and its predictive performance for ICU-AW was assessed using the receiver operating characteristic curve. RESULTS: Within the ICU-AW group, age, duration of mechanical ventilation, lorazepam dosage, adrenaline dosage, and length of ICU stay were significantly higher than in the non-ICU-AW group. Additionally, the ratios of sepsis, multiple organ dysfunction syndrome, hypoalbuminemia, acute heart failure, respiratory failure, acute kidney injury, anemia, stress-related gastrointestinal bleeding, shock, hypertension, coronary artery disease, malignant tumors, and rehabilitation therapy were significantly higher in the ICU-AW group. The most influential factors contributing to ICU-AW were the length of ICU stay (100.0%) and the duration of mechanical ventilation (54.9%). The neural network model predicted ICU-AW with an area under the curve of 0.941, sensitivity of 92.2%, and specificity of 82.7%. CONCLUSION: The main factors influencing ICU-AW are the length of ICU stay and the duration of mechanical ventilation. A primary preventive strategy, when feasible, involves minimizing both.
Keywords: Intensive care unit-acquired weakness; Risk factors; Machine learning; Prevention strategies
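The multilayer-perceptron-plus-ROC workflow described above can be sketched as follows. This is an illustrative example on synthetic data: the feature set, sample size, and the functional form linking the predictors to the outcome are invented stand-ins for the study's 14-day ICU variables, not its actual data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 600
# Synthetic stand-ins for the study's predictors
icu_stay = rng.gamma(2.0, 7.0, n)      # length of ICU stay, days
vent_hours = rng.gamma(2.0, 48.0, n)   # mechanical ventilation, hours
age = rng.normal(62, 12, n)            # years
X = np.column_stack([icu_stay, vent_hours, age])

# Outcome driven mostly by ICU stay and ventilation time, loosely
# mirroring the study's reported importance ranking (assumed form).
logit = 0.08 * icu_stay + 0.01 * vent_hours + 0.02 * (age - 62) - 2.5
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Scale inputs, then fit a small MLP; assess with ROC AUC on hold-out data
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0),
)
model.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"hold-out ROC AUC: {auc:.3f}")
```

On real clinical data, the same pipeline would simply swap the synthetic arrays for the collected 14-day variables; the scaling step matters because MLPs are sensitive to feature magnitudes.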
Machine learning applications in stroke medicine: advancements, challenges, and future prospectives (Cited: 1)
18
Authors: Mario Daidone, Sergio Ferrantelli, Antonino Tuttolomondo 《Neural Regeneration Research》 SCIE CAS CSCD 2024, No. 4, pp. 769-773 (5 pages)
Stroke is a leading cause of disability and mortality worldwide, necessitating the development of advanced technologies to improve its diagnosis, treatment, and patient outcomes. In recent years, machine learning techniques have emerged as promising tools in stroke medicine, enabling efficient analysis of large-scale datasets and facilitating personalized and precision medicine approaches. This abstract provides a comprehensive overview of machine learning's applications, challenges, and future directions in stroke medicine. Recently introduced machine learning algorithms have been extensively employed in all fields of stroke medicine. Machine learning models have demonstrated remarkable accuracy in imaging analysis, diagnosing stroke subtypes, risk stratification, guiding medical treatment, and predicting patient prognosis. Despite the tremendous potential of machine learning in stroke medicine, several challenges must be addressed. These include the need for standardized and interoperable data collection, robust model validation and generalization, and the ethical considerations surrounding privacy and bias. In addition, integrating machine learning models into clinical workflows and establishing regulatory frameworks are critical for ensuring their widespread adoption and impact in routine stroke care. Machine learning promises to revolutionize stroke medicine by enabling precise diagnosis, tailored treatment selection, and improved prognostication. Continued research and collaboration among clinicians, researchers, and technologists are essential for overcoming challenges and realizing the full potential of machine learning in stroke care, ultimately leading to enhanced patient outcomes and quality of life. This review aims to summarize the current implications of machine learning in stroke diagnosis, treatment, and prognostic evaluation, and to explore the future perspectives these techniques can provide in combating this disabling disease.
Keywords: Cerebrovascular disease; Deep learning; Machine learning; Reinforcement learning; Stroke; Stroke therapy; Supervised learning; Unsupervised learning
Assessment of compressive strength of jet grouting by machine learning (Cited: 1)
19
Authors: Esteban Diaz, Edgar Leonardo Salamanca-Medina, Roberto Tomas 《Journal of Rock Mechanics and Geotechnical Engineering》 SCIE CSCD 2024, No. 1, pp. 102-111 (10 pages)
Jet grouting is one of the most popular soil improvement techniques, but its design usually involves great uncertainties that can lead to economic cost overruns in construction projects. The high dispersion in the properties of the improved material leads designers to assume a conservative, arbitrary, and unjustified strength, which is sometimes only checked against the results of test fields. This paper presents an approach for predicting the uniaxial compressive strength (UCS) of jet grouting columns, based on the analysis of several machine learning algorithms applied to a database of 854 results collected mainly from different research papers. The selected machine learning model (extremely randomized trees) relates the soil type and various parameters of the technique to the value of the compressive strength. Despite the complex mechanism that surrounds the jet grouting process, evidenced by the high dispersion and low correlation of the variables studied, the trained model predicts the values of compressive strength with a significant improvement over existing works. Consequently, this work proposes for the first time a reliable and easily applicable approach for estimating the compressive strength of jet grouting columns.
Keywords: Jet grouting; Ground improvement; Compressive strength; Machine learning
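An extremely-randomized-trees regressor of the kind selected in this study can be sketched as follows. The feature names (soil class, injection pressure, water/cement ratio), their coding, and the noisy relation generating the UCS values are illustrative assumptions, not the paper's 854-record database.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 854  # same size as the paper's database; content is synthetic
soil_class = rng.integers(0, 3, n)   # 0=sand, 1=silt, 2=clay (assumed coding)
pressure = rng.uniform(20, 50, n)    # grout injection pressure, MPa (assumed range)
wc_ratio = rng.uniform(0.6, 1.2, n)  # water/cement ratio

# Assumed noisy relation: coarser soils and higher pressure raise UCS,
# a higher water/cement ratio lowers it.
ucs = (8.0 - 2.0 * soil_class + 0.15 * pressure - 4.0 * wc_ratio
       + rng.normal(0, 1.0, n))

X = np.column_stack([soil_class, pressure, wc_ratio])
X_tr, X_te, y_tr, y_te = train_test_split(X, ucs, random_state=1)

# Extremely randomized trees: like a random forest, but split thresholds
# are drawn at random, which further reduces variance on noisy data
model = ExtraTreesRegressor(n_estimators=300, random_state=1)
model.fit(X_tr, y_tr)
r2 = r2_score(y_te, model.predict(X_te))
print(f"hold-out R^2: {r2:.3f}")
```

The randomized split thresholds are what distinguish this estimator from a standard random forest, and they suit the high-dispersion, low-correlation data the paper describes.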
A machine learning model to predict efficacy of neoadjuvant therapy in breast cancer based on dynamic changes in systemic immunity
20
Authors: Yusong Wang, Mozhi Wang, Keda Yu, Shouping Xu, Pengfei Qiu, Zhidong Lyu, Mingke Cui, Qiang Zhang, Yingying Xu 《Cancer Biology & Medicine》 SCIE CAS CSCD 2023, No. 3, pp. 218-228 (11 pages)
Objective: Neoadjuvant therapy (NAT) has been widely implemented as an essential treatment to improve therapeutic efficacy in patients with locally advanced cancer, reducing tumor burden and prolonging survival, particularly for human epidermal growth factor receptor 2-positive and triple-negative breast cancer. The role of peripheral immune components in predicting therapeutic responses has received limited attention. Herein we determined the relationship between dynamic changes in peripheral immune indices and therapeutic responses during NAT administration. Methods: Peripheral immune index data were collected from 134 patients before and after NAT. Logistic regression and machine learning algorithms were applied to the feature selection and model construction processes, respectively. Results: Peripheral immune status with a greater number of CD3+ T cells before and after NAT, and a greater number of CD8+ T cells, fewer CD4+ T cells, and fewer NK cells after NAT was significantly related to a pathological complete response (P < 0.05). The ratio of post-NAT to pre-NAT NK cell counts was negatively correlated with the response to NAT (HR = 0.13, P = 0.008). Based on the results of logistic regression, 14 reliable features (P < 0.05) were selected to construct the machine learning model. The random forest model exhibited the best power to predict the efficacy of NAT among 10 machine learning approaches (AUC = 0.733). Conclusions: Statistically significant relationships between several specific immune indices and the efficacy of NAT were revealed. A random forest model based on dynamic changes in peripheral immune indices showed robust performance in predicting NAT efficacy.
Keywords: Breast cancer; Neoadjuvant therapy; Peripheral blood lymphocytes; Machine learning; Prediction model
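A minimal sketch of the two-stage pipeline above: a logistic-regression screening step followed by a random forest classifier evaluated by AUC. The synthetic immune-index features are invented, and the L1-penalised logistic screening stands in for the paper's significance-based (P < 0.05) feature selection; the study itself screened real pre-/post-NAT immune indices.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import SelectFromModel
from sklearn.ensemble import RandomForestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n, p = 134, 20  # same cohort size as the study; features are synthetic
X = rng.normal(size=(n, p))  # stand-ins for pre-/post-NAT immune indices
# Assumed signal: a few indices drive the pathological complete response
logit = 1.2 * X[:, 0] - 1.0 * X[:, 1] + 0.8 * X[:, 2]
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=2, stratify=y)

# Stage 1: screen features with an L1-penalised logistic regression
# (a proxy for the paper's significance-based selection)
screen = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", C=0.5))
# Stage 2: random forest on the retained features, assessed by AUC
clf = make_pipeline(StandardScaler(), screen,
                    RandomForestClassifier(n_estimators=300, random_state=2))
clf.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"hold-out AUC: {auc:.3f}")
```

With only 134 patients, the screening stage matters: shrinking 20 candidate indices to a handful keeps the forest from overfitting the small cohort.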