Journal Literature
221,132 articles found
1. Path Planning Algorithm for Mobile Robots Based on Improved Q-Learning
Authors: 王立勇, 王弘轩, 苏清华, 王绅同, 张鹏博. 《电子测量技术》 (Electronic Measurement Technology), PKU Core, 2024, Issue 9, pp. 85-92.
As mobile robots are deployed ever more widely in production and daily life, their path planning must combine speed with adaptability to the environment. To address the problems of existing reinforcement-learning-based path planning — getting trapped in local optima and repeatedly searching the same region early in exploration, and a low convergence rate with slow convergence late in exploration — this study proposes an improved Q-Learning algorithm. The algorithm improves the Q-matrix initialization so that early exploration is goal-directed and collisions are reduced; improves the Q-matrix update rule so that updates are forward-looking, avoiding repeated exploration of a small region; and improves the random exploration strategy so that environmental information is fully exploited in early iterations while later iterations move toward the goal. Simulation results on different grid maps show that, building on Q-Learning, these improvements shorten the explored path length, reduce oscillation, and accelerate convergence, yielding higher computational efficiency.
Keywords: path planning; reinforcement learning; mobile robot; Q-learning algorithm; ε-decreasing strategy
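The baseline this paper improves on is tabular Q-Learning with an ε-decreasing exploration policy. Below is a minimal sketch on a grid map; the grid size, reward values, and hyperparameters are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 10                                        # 10x10 grid world (assumed size)
GOAL = (N - 1, N - 1)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right

Q = np.zeros((N, N, len(ACTIONS)))
alpha, gamma = 0.1, 0.95                      # learning rate, discount factor

def step(state, a):
    """Apply an action, clipping to the map; reward +100 at goal, -1 per move."""
    r, c = state
    dr, dc = ACTIONS[a]
    nxt = (min(max(r + dr, 0), N - 1), min(max(c + dc, 0), N - 1))
    return nxt, (100.0 if nxt == GOAL else -1.0), nxt == GOAL

for episode in range(500):
    eps = max(0.05, 1.0 - episode / 400)      # epsilon decreases as training proceeds
    s = (0, 0)
    for _ in range(500):                      # step cap keeps every episode finite
        a = int(rng.integers(4)) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r, done = step(s, a)
        # Standard Q-Learning temporal-difference update
        Q[s][a] += alpha * (r + gamma * np.max(Q[s2]) - Q[s][a])
        s = s2
        if done:
            break
```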
2. Exploration and Application of M-learning Combined with CBL in Standardized Residency Training in Gastroenterology
Authors: 洪静, 程中华, 余金玲, 王韶英, 嵇贝纳, 冯珍. 《中国卫生产业》 (China Health Industry), 2024, Issue 2, pp. 203-205.
Objective: To explore the effect of a mobile learning platform (M-learning, ML) combined with case-based learning (CBL) in standardized residency training in gastroenterology. Methods: Eighty physicians who participated in standardized residency training in the Department of Gastroenterology of Xuhui District Central Hospital, Shanghai, from January 2021 to January 2023 were enrolled and divided by random number table into a study group and a control group of 40 each. The control group received traditional lecture-based teaching; the study group received M-learning combined with CBL. Theory examination scores, practical skill examination scores, and learning satisfaction were compared between the two groups. Results: The study group scored higher than the control group on both the theory and practical skill examinations, and the differences were statistically significant (both P < 0.05); the study group's learning satisfaction was also significantly higher (P < 0.05). Conclusion: Applying M-learning combined with CBL to standardized residency training in gastroenterology not only improves physicians' theory and practical skill examination scores but also effectively increases their learning satisfaction.
Keywords: M-learning; CBL; gastroenterology; standardized residency training
3. Research on Aircraft Taxiing Path Planning Based on Q-Learning
Authors: 王兴隆, 王睿峰. 《中国民航大学学报》 (Journal of Civil Aviation University of China), CAS, 2024, Issue 3, pp. 28-33.
To address the low accuracy of traditional aircraft taxiing path planning algorithms and their inability to plan according to overall surface operations, a Q-Learning-based path planning method is proposed. By analyzing the airport flight-area network structure model and the reinforcement learning simulation environment, state and action spaces are defined, and a reward function is set according to path compliance and rationality, with the path rationality evaluation value defined as the reciprocal of the product of taxiing path length and the average flight-area taxiing time. Finally, the influence of action-selection policy parameters on the path planning model is analyzed. Results show that, compared with the A* and Floyd algorithms, Q-Learning-based planning achieves the shortest taxiing distance while avoiding relatively busy areas, giving a high path rationality evaluation value.
Keywords: taxiing path planning; airport flight area; reinforcement learning; Q-learning
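The rationality score has a simple closed form: the reciprocal of (taxiing path length × average flight-area taxiing time). A minimal sketch follows; the route lengths and times are made-up numbers for illustration, since the abstract defines the score but reports no inputs.

```python
def path_rationality(path_length_m: float, avg_taxi_time_s: float) -> float:
    """Higher is better: short paths through less congested areas score higher."""
    return 1.0 / (path_length_m * avg_taxi_time_s)

# Compare two hypothetical candidate taxi routes.
print(path_rationality(2500.0, 180.0))   # shorter route through a busier area
print(path_rationality(2700.0, 150.0))   # slightly longer, less busy route
```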
4. Machine learning applications in stroke medicine: advancements, challenges, and future prospectives (Cited: 3)
Authors: Mario Daidone, Sergio Ferrantelli, Antonino Tuttolomondo. Neural Regeneration Research (SCIE, CAS, CSCD), 2024, Issue 4, pp. 769-773.
Stroke is a leading cause of disability and mortality worldwide, necessitating the development of advanced technologies to improve its diagnosis, treatment, and patient outcomes. In recent years, machine learning techniques have emerged as promising tools in stroke medicine, enabling efficient analysis of large-scale datasets and facilitating personalized and precision medicine approaches. This abstract provides a comprehensive overview of machine learning's applications, challenges, and future directions in stroke medicine. Recently introduced machine learning algorithms have been extensively employed in all the fields of stroke medicine. Machine learning models have demonstrated remarkable accuracy in imaging analysis, diagnosing stroke subtypes, risk stratifications, guiding medical treatment, and predicting patient prognosis. Despite the tremendous potential of machine learning in stroke medicine, several challenges must be addressed. These include the need for standardized and interoperable data collection, robust model validation and generalization, and the ethical considerations surrounding privacy and bias. In addition, integrating machine learning models into clinical workflows and establishing regulatory frameworks are critical for ensuring their widespread adoption and impact in routine stroke care. Machine learning promises to revolutionize stroke medicine by enabling precise diagnosis, tailored treatment selection, and improved prognostication. Continued research and collaboration among clinicians, researchers, and technologists are essential for overcoming challenges and realizing the full potential of machine learning in stroke care, ultimately leading to enhanced patient outcomes and quality of life. This review aims to summarize all the current implications of machine learning in stroke diagnosis, treatment, and prognostic evaluation. At the same time, another purpose of this paper is to explore all the future perspectives these techniques can provide in combating this disabling disease.
Keywords: cerebrovascular disease; deep learning; machine learning; reinforcement learning; stroke; stroke therapy; supervised learning; unsupervised learning
5. Research on an Improved Q-Learning Path Planning Algorithm
Authors: 宋丽君, 周紫瑜, 李云龙, 侯佳杰, 何星. 《小型微型计算机系统》 (Journal of Chinese Computer Systems), CSCD, PKU Core, 2024, Issue 4, pp. 823-829.
To address Q-Learning's low learning efficiency, slow convergence, and poor path planning in environments with dynamic obstacles, this paper proposes an improved Q-Learning path planning algorithm for mobile robots. The algorithm introduces an exploration factor based on abrupt changes in probability to balance exploration and exploitation and speed up learning; designs a deep learning factor in the update function to guarantee the exploration probability; and integrates a genetic algorithm to avoid local path optima while exploring the optimal number of iteration steps stage by stage, reducing repeated exploration of dynamic maps. Finally, the key nodes of the output optimal path are extracted and smoothed with a Bezier curve, further ensuring path smoothness and feasibility. Maps were built with the grid method; comparative experiments show that the improved algorithm substantially outperforms the traditional algorithm in both iteration count and path quality and handles path planning on dynamic maps well, further verifying the effectiveness and practicality of the proposed method.
Keywords: mobile robot; path planning; Q-learning algorithm; smoothing; dynamic obstacle avoidance
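One concrete post-processing step the abstract names is smoothing the path's key nodes with a Bezier curve. A minimal sketch using de Casteljau evaluation follows; the control points are hypothetical key nodes, not data from the paper.

```python
import numpy as np

def bezier(points: np.ndarray, n_samples: int = 50) -> np.ndarray:
    """Evaluate a Bezier curve defined by control points via de Casteljau."""
    curve = []
    for t in np.linspace(0.0, 1.0, n_samples):
        pts = points.astype(float).copy()
        while len(pts) > 1:                  # repeated linear interpolation
            pts = (1 - t) * pts[:-1] + t * pts[1:]
        curve.append(pts[0])
    return np.array(curve)

# Hypothetical key nodes extracted from a grid-map path.
key_nodes = np.array([[0, 0], [2, 5], [5, 6], [9, 9]])
smooth_path = bezier(key_nodes)
```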
6. A Q-learning-Based Adaptive Link State Routing Protocol
Authors: 吴麒, 左琳立, 丁建, 邢智童, 夏士超. 《重庆邮电大学学报(自然科学版)》 (Journal of Chongqing University of Posts and Telecommunications, Natural Science Edition), CSCD, PKU Core, 2024, Issue 5, pp. 945-953.
To address the diverse mission requirements, complex electromagnetic environment, and high node mobility faced by large-scale UAV ad hoc networks, and taking full account of the high-speed movement of UAV nodes, a multi-point relay (MPR) selection method is designed based on UAV topology stability and link communication capacity. To shorten network route update time and increase the stability and reliability of UAV ad hoc routing strategies, a Q-learning based adaptive link state routing protocol (QALSR) is proposed. Simulation results show that the proposed algorithm outperforms existing proactive routing protocols.
Keywords: UAV ad hoc network; routing protocol; reinforcement learning; adaptive
7. Significant risk factors for intensive care unit-acquired weakness: A processing strategy based on repeated machine learning (Cited: 10)
Authors: Ling Wang, Deng-Yan Long. World Journal of Clinical Cases (SCIE), 2024, Issue 7, pp. 1235-1242.
BACKGROUND: Intensive care unit-acquired weakness (ICU-AW) is a common complication that significantly impacts the patient's recovery process, even leading to adverse outcomes. Currently, there is a lack of effective preventive measures. AIM: To identify significant risk factors for ICU-AW through iterative machine learning techniques and offer recommendations for its prevention and treatment. METHODS: Patients were categorized into ICU-AW and non-ICU-AW groups on the 14th day post-ICU admission. Relevant data from the initial 14 days of ICU stay, such as age, comorbidities, sedative dosage, vasopressor dosage, duration of mechanical ventilation, length of ICU stay, and rehabilitation therapy, were gathered. The relationships between these variables and ICU-AW were examined. Utilizing iterative machine learning techniques, a multilayer perceptron neural network model was developed, and its predictive performance for ICU-AW was assessed using the receiver operating characteristic curve. RESULTS: Within the ICU-AW group, age, duration of mechanical ventilation, lorazepam dosage, adrenaline dosage, and length of ICU stay were significantly higher than in the non-ICU-AW group. Additionally, sepsis, multiple organ dysfunction syndrome, hypoalbuminemia, acute heart failure, respiratory failure, acute kidney injury, anemia, stress-related gastrointestinal bleeding, shock, hypertension, coronary artery disease, malignant tumors, and rehabilitation therapy ratios were significantly higher in the ICU-AW group, demonstrating statistical significance. The most influential factors contributing to ICU-AW were identified as the length of ICU stay (100.0%) and the duration of mechanical ventilation (54.9%). The neural network model predicted ICU-AW with an area under the curve of 0.941, sensitivity of 92.2%, and specificity of 82.7%. CONCLUSION: The main factors influencing ICU-AW are the length of ICU stay and the duration of mechanical ventilation. A primary preventive strategy, when feasible, involves minimizing both ICU stay and mechanical ventilation duration.
Keywords: intensive care unit-acquired weakness; risk factors; machine learning; prevention; strategies
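The modeling pipeline described here, a multilayer perceptron classifier evaluated by the receiver operating characteristic curve, maps onto standard tooling. A minimal sketch on synthetic data; the feature set, class balance, and network size are assumptions for illustration only.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the 14-day ICU variables; 30% positive class assumed.
X, y = make_classification(n_samples=1000, n_features=20, weights=[0.7, 0.3],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```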
8. Machine learning for predicting the outcome of terminal ballistics events (Cited: 1)
Authors: Shannon Ryan, Neeraj Mohan Sushma, Arun Kumar AV, Julian Berk, Tahrima Hashem, Santu Rana, Svetha Venkatesh. Defence Technology (防务技术) (SCIE, EI, CAS, CSCD), 2024, Issue 1, pp. 14-26.
Machine learning (ML) is well suited for the prediction of high-complexity, high-dimensional problems such as those encountered in terminal ballistics. We evaluate the performance of four popular ML-based regression models, extreme gradient boosting (XGBoost), artificial neural network (ANN), support vector regression (SVR), and Gaussian process regression (GP), on two common terminal ballistics problems: (a) predicting the V50 ballistic limit of monolithic metallic armour impacted by small and medium calibre projectiles and fragments, and (b) predicting the depth to which a projectile will penetrate a target of semi-infinite thickness. To achieve this we utilise two datasets, each consisting of approximately 1000 samples, collated from public release sources. We demonstrate that all four model types provide similarly excellent agreement when interpolating within the training data and diverge when extrapolating outside this range. Although extrapolation is not advisable for ML-based regression models, for applications such as lethality/survivability analysis, such capability is required. To circumvent this, we implement expert knowledge and physics-based models via enforced monotonicity, as a Gaussian prior mean, and through a modified loss function. The physics-informed models demonstrate improved performance over both classical physics-based models and the basic ML regression models, providing an ability to accurately fit experimental data when it is available and then revert to the physics-based model when not. The resulting models demonstrate high levels of predictive accuracy over a very wide range of projectile types, target materials and thicknesses, and impact conditions significantly more diverse than that achievable from any existing analytical approach. Compared with numerical analysis tools such as finite element solvers, the ML models run orders of magnitude faster. We provide some general guidelines throughout for the development, application, and reporting of ML models in terminal ballistics problems.
Keywords: machine learning; artificial intelligence; physics-informed machine learning; terminal ballistics; armour
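One of the physics-informing devices mentioned, enforced monotonicity, is directly expressible in gradient-boosting libraries. A minimal sketch on synthetic data; the feature names, data, and constraint signs are illustrative assumptions, not the paper's setup.

```python
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
velocity = rng.uniform(200, 2000, 500)     # impact velocity, m/s (synthetic)
thickness = rng.uniform(5, 50, 500)        # target thickness, mm (synthetic)
depth = 0.02 * velocity - 0.1 * thickness + rng.normal(0, 2, 500)

X = np.column_stack([velocity, thickness])
model = xgb.XGBRegressor(
    n_estimators=200,
    # Penetration depth assumed non-decreasing in velocity, non-increasing
    # in target thickness.
    monotone_constraints="(1,-1)",
)
model.fit(X, depth)
```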
9. A credibility-aware swarm-federated deep learning framework in internet of vehicles (Cited: 1)
Authors: Zhe Wang, Xinhang Li, Tianhao Wu, Chen Xu, Lin Zhang. Digital Communications and Networks (SCIE, CSCD), 2024, Issue 1, pp. 150-157.
Although Federated Deep Learning (FDL) enables distributed machine learning in the Internet of Vehicles (IoV), it requires multiple clients to upload model parameters, thus still incurring unavoidable communication overhead and data privacy risks. The recently proposed Swarm Learning (SL) provides a decentralized machine learning approach for unit edge computing and blockchain-based coordination. A Swarm-Federated Deep Learning framework in the IoV system (IoV-SFDL) that integrates SL into the FDL framework is proposed in this paper. The IoV-SFDL organizes vehicles to generate local SL models with adjacent vehicles based on the blockchain-empowered SL, then aggregates the global FDL model among different SL groups with a credibility weights prediction algorithm. Extensive experimental results show that, compared with baseline frameworks, the proposed IoV-SFDL framework reduces client-to-server communication overhead by 16.72%, while model performance improves by about 5.02% for the same training iterations.
Keywords: swarm learning; federated deep learning; Internet of Vehicles; privacy; efficiency
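The aggregation step, averaging group models under predicted credibility weights, can be sketched in a few lines. The weights, parameter shapes, and client count below are invented for illustration; the paper's credibility-prediction algorithm itself is not reproduced.

```python
import numpy as np

def weighted_aggregate(client_params, credibility):
    """Average each parameter tensor across clients, weighted by credibility."""
    w = np.asarray(credibility, dtype=float)
    w = w / w.sum()                              # normalize the weights
    return [sum(wi * p for wi, p in zip(w, layer))
            for layer in zip(*client_params)]    # iterate layer by layer

# Three clients, each holding (weight matrix, bias vector) parameters.
clients = [[np.random.rand(4, 2), np.random.rand(2)] for _ in range(3)]
global_params = weighted_aggregate(clients, credibility=[0.9, 0.6, 0.8])
```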
10. Artificial Intelligence Meets Flexible Sensors: Emerging Smart Flexible Sensing Systems Driven by Machine Learning and Artificial Synapses (Cited: 2)
Authors: Tianming Sun, Bin Feng, Jinpeng Huo, Yu Xiao, Wengan Wang, Jin Peng, Zehua Li, Chengjie Du, Wenxian Wang, Guisheng Zou, Lei Liu. Nano-Micro Letters (SCIE, EI, CAS, CSCD), 2024, Issue 1, pp. 235-273.
The recent wave of the artificial intelligence (AI) revolution has aroused unprecedented interest in the intelligentization of human society. As an essential component that bridges the physical world and digital signals, flexible sensors are evolving from a single sensing element to a smarter system, which is capable of highly efficient acquisition, analysis, and even perception of vast, multifaceted data. While challenging from a manual perspective, the development of intelligent flexible sensing has been remarkably facilitated owing to the rapid advances of brain-inspired AI innovations from both the algorithm (machine learning) and the framework (artificial synapses) level. This review presents the recent progress of the emerging AI-driven, intelligent flexible sensing systems. The basic concepts of machine learning and artificial synapses are introduced. The new enabling features induced by the fusion of AI and flexible sensing are comprehensively reviewed, which significantly advance applications such as flexible sensory systems, soft/humanoid robotics, and human activity monitoring. As two of the most profound innovations in the twenty-first century, the deep incorporation of flexible sensing and AI technology holds tremendous potential for creating a smarter world for human beings.
Keywords: flexible electronics; wearable electronics; neuromorphic; memristor; deep learning
11. Assessments of Data-Driven Deep Learning Models on One-Month Predictions of Pan-Arctic Sea Ice Thickness (Cited: 1)
Authors: Chentao Song, Jiang Zhu, Xichen Li. Advances in Atmospheric Sciences (SCIE, CAS, CSCD), 2024, Issue 7, pp. 1379-1390.
In recent years, deep learning methods have gradually been applied to prediction tasks related to Arctic sea ice concentration, but relatively little research has been conducted for larger spatial and temporal scales, mainly due to the limited time coverage of observations and reanalysis data. Meanwhile, deep learning predictions of sea ice thickness (SIT) have yet to receive ample attention. In this study, two data-driven deep learning (DL) models are built based on the ConvLSTM and fully convolutional U-net (FC-Unet) algorithms, trained using CMIP6 historical simulations for transfer learning and fine-tuned using reanalysis/observations. These models enable monthly predictions of Arctic SIT without considering the complex physical processes involved. Through comprehensive assessments of prediction skills by season and region, the results suggest that using a broader set of CMIP6 data for transfer learning, as well as incorporating multiple climate variables as predictors, contribute to better prediction results, although both DL models can effectively predict the spatiotemporal features of SIT anomalies. Regarding the predicted SIT anomalies of the FC-Unet model, the spatial correlations with reanalysis reach an average level of 89% over all months, while the temporal anomaly correlation coefficients are close to unity in most cases. The models also demonstrate robust performance in predicting SIT and sea ice extent (SIE) during extreme events. The effectiveness and reliability of the proposed deep transfer learning models in predicting Arctic SIT can facilitate more accurate pan-Arctic predictions, aiding climate change research and real-time business applications.
Keywords: Arctic sea ice thickness; deep learning; spatiotemporal sequence prediction; transfer learning
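A ConvLSTM-based spatiotemporal predictor of the kind described (a stack of monthly fields in, a next-month field out) can be assembled from standard layers. A minimal sketch; the grid size, 12-month input window, and three predictor variables are illustrative assumptions, not the paper's configuration.

```python
import tensorflow as tf

model = tf.keras.Sequential([
    # 12 monthly time steps, 64x64 spatial grid, 3 climate-variable channels.
    tf.keras.layers.Input(shape=(12, 64, 64, 3)),
    tf.keras.layers.ConvLSTM2D(16, kernel_size=3, padding="same",
                               return_sequences=False),
    # Project the last hidden state to a next-month SIT anomaly map.
    tf.keras.layers.Conv2D(1, kernel_size=1),
])
model.compile(optimizer="adam", loss="mse")
```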
12. An Improved Q-learning Bee Colony Algorithm for the Permutation Flow Shop Scheduling Problem
Authors: 杜利珍, 宣自风, 唐家琦, 王鑫涛. 《组合机床与自动化加工技术》 (Modular Machine Tool & Automatic Manufacturing Technique), PKU Core, 2024, Issue 10, pp. 175-180.
For the permutation flow shop scheduling problem, an artificial bee colony algorithm based on an improved Q-learning algorithm is proposed. The algorithm designs an improved reward function to serve as the environment of the bee colony algorithm and uses the quality of the reward to decide the next generation's search strategy; Q-learning then intelligently selects the number of dimensions along which a food source is updated, and the encoding is updated according to the selected dimensionality, improving convergence speed and accuracy. Finally, permutation flow shop instances of different sizes are used to verify the algorithm's performance; computations on standard instances and comparisons with other algorithms demonstrate its accuracy.
Keywords: Q-learning algorithm; artificial bee colony algorithm; permutation flow shop scheduling
13. High-throughput calculations combining machine learning to investigate the corrosion properties of binary Mg alloys (Cited: 3)
Authors: Yaowei Wang, Tian Xie, Qingli Tang, Mingxu Wang, Tao Ying, Hong Zhu, Xiaoqin Zeng. Journal of Magnesium and Alloys (SCIE, EI, CAS, CSCD), 2024, Issue 4, pp. 1406-1418.
Magnesium (Mg) alloys have shown great prospects as both structural and biomedical materials, while poor corrosion resistance limits their further application. In this work, to avoid time-consuming and laborious experimental trials, a high-throughput computational strategy based on first-principles calculations is designed for screening corrosion-resistant binary Mg alloys with intermetallics, from both the thermodynamic and kinetic perspectives. The stable binary Mg intermetallics with low equilibrium potential difference with respect to the Mg matrix are first identified. Then, the hydrogen adsorption energies on the surfaces of these Mg intermetallics are calculated, and the corrosion exchange current density is further calculated by a hydrogen evolution reaction (HER) kinetic model. Several intermetallics, e.g., Y3Mg, Y2Mg and La5Mg, are identified to be promising intermetallics which might effectively hinder the cathodic HER. Furthermore, machine learning (ML) models are developed to predict Mg intermetallics with proper hydrogen adsorption energy employing work function (Wf) and weighted first ionization energy (WFIE). The generalization of the ML models is tested on five new binary Mg intermetallics with an average root mean square error (RMSE) of 0.11 eV. This study not only predicts some promising binary Mg intermetallics which may suppress the galvanic corrosion, but also provides a high-throughput screening strategy and ML models for the design of corrosion-resistant alloys, which can be extended to ternary Mg alloys or other alloy systems.
Keywords: Mg intermetallics; corrosion property; high-throughput; density functional theory; machine learning
14. Low-Cost Federated Broad Learning for Privacy-Preserved Knowledge Sharing in the RIS-Aided Internet of Vehicles (Cited: 1)
Authors: Xiaoming Yuan, Jiahui Chen, Ning Zhang, Qiang (John) Ye, Changle Li, Chunsheng Zhu, Xuemin Sherman Shen. Engineering (SCIE, EI, CAS, CSCD), 2024, Issue 2, pp. 178-189.
High-efficiency and low-cost knowledge sharing can improve the decision-making ability of autonomous vehicles by mining knowledge from the Internet of Vehicles (IoV). However, it is challenging to ensure high efficiency of local data learning models while preventing privacy leakage in a high-mobility environment. In order to protect data privacy and improve data learning efficiency in knowledge sharing, we propose an asynchronous federated broad learning (FBL) framework that integrates broad learning (BL) into federated learning (FL). In FBL, we design a broad fully connected model (BFCM) as a local model for training client data. To enhance the wireless channel quality for knowledge sharing and reduce the communication and computation cost of participating clients, we construct a joint resource allocation and reconfigurable intelligent surface (RIS) configuration optimization framework for FBL. The problem is decoupled into two convex subproblems. Aiming to improve the resource scheduling efficiency in FBL, a double Davidon–Fletcher–Powell (DDFP) algorithm is presented to solve the time slot allocation and RIS configuration problem. Based on the results of resource scheduling, we design a reward-allocation algorithm based on federated incentive learning (FIL) in FBL to compensate clients for their costs. The simulation results show that the proposed FBL framework achieves better performance than the comparison models in terms of efficiency, accuracy, and cost for knowledge sharing in the IoV.
Keywords: knowledge sharing; Internet of Vehicles; federated learning; broad learning; reconfigurable intelligent surfaces; resource allocation
15. Prediction model for corrosion rate of low-alloy steels under atmospheric conditions using machine learning algorithms (Cited: 3)
Authors: Jingou Kuang, Zhilin Long. International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2024, Issue 2, pp. 337-350.
This work constructed a machine learning (ML) model to predict the atmospheric corrosion rate of low-alloy steels (LAS). The material properties of LAS, environmental factors, and exposure time were used as the input, while the corrosion rate was the output. Six different ML algorithms were used to construct the proposed model. Through optimization and filtering, the eXtreme gradient boosting (XGBoost) model exhibited good corrosion rate prediction accuracy. The features of material properties were then transformed into atomic and physical features using the proposed property transformation approach, and the dominant descriptors that affected the corrosion rate were filtered using the recursive feature elimination (RFE) as well as XGBoost methods. The established ML models exhibited better prediction performance and generalization ability via property transformation descriptors. In addition, the SHapley Additive exPlanations (SHAP) method was applied to analyze the relationship between the descriptors and the corrosion rate. The results showed that the property transformation model could effectively help with analyzing the corrosion behavior, thereby significantly improving the generalization ability of corrosion rate prediction models.
Keywords: machine learning; low-alloy steel; atmospheric corrosion prediction; corrosion rate; feature fusion
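The descriptor-screening step, recursive feature elimination (RFE) wrapped around a boosted-tree regressor, follows a standard pattern. A minimal sketch on synthetic data; the feature counts are assumptions, and a plain gradient-boosting regressor stands in for the paper's tuned XGBoost model.

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.feature_selection import RFE

# Synthetic stand-in for atomic/physical descriptors and corrosion rates.
X, y = make_regression(n_samples=300, n_features=15, n_informative=5,
                       random_state=0)

# Recursively drop the weakest descriptors until 5 remain (assumed target count).
selector = RFE(GradientBoostingRegressor(random_state=0), n_features_to_select=5)
selector.fit(X, y)
print("kept feature indices:", [i for i, k in enumerate(selector.support_) if k])
```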
16. Deep learning-based inpainting of saturation artifacts in optical coherence tomography images (Cited: 2)
Authors: Muyun Hu, Zhuoqun Yuan, Di Yang, Jingzhu Zhao, Yanmei Liang. Journal of Innovative Optical Health Sciences (SCIE, EI, CSCD), 2024, Issue 3, pp. 1-10.
Limited by the dynamic range of the detector, saturation artifacts usually occur in optical coherence tomography (OCT) imaging of highly scattering media. Existing methods struggle to remove saturation artifacts and restore texture completely in OCT images. We propose a deep learning-based inpainting method for saturation artifacts in this paper. The generation mechanism of saturation artifacts was analyzed, and experimental and simulated datasets were built based on this mechanism. Enhanced super-resolution generative adversarial networks were trained on the clear–saturated phantom image pairs. The successfully reconstructed results for experimental zebrafish and thyroid OCT images proved the method's feasibility, strong generalization, and robustness.
Keywords: optical coherence tomography; saturation artifacts; deep learning; image inpainting
17. IDS-INT: Intrusion detection system using transformer-based transfer learning for imbalanced network traffic (Cited: 3)
Authors: Farhan Ullah, Shamsher Ullah, Gautam Srivastava, Jerry Chun-Wei Lin. Digital Communications and Networks (SCIE, CSCD), 2024, Issue 1, pp. 190-204.
A network intrusion detection system is critical for cyber security against illegitimate attacks. In terms of feature perspectives, network traffic may include a variety of elements such as attack reference, attack type, a subcategory of attack, host information, malicious scripts, etc. In terms of network perspectives, network traffic may contain an imbalanced number of harmful attacks when compared to normal traffic. It is challenging to identify a specific attack due to complex features and data imbalance issues. To address these issues, this paper proposes an Intrusion Detection System using transformer-based transfer learning for Imbalanced Network Traffic (IDS-INT). IDS-INT uses transformer-based transfer learning to learn feature interactions in both network feature representation and imbalanced data. First, detailed information about each type of attack is gathered from network interaction descriptions, which include network nodes, attack type, reference, host information, etc. Second, the transformer-based transfer learning approach is developed to learn detailed feature representation using their semantic anchors. Third, the Synthetic Minority Oversampling Technique (SMOTE) is implemented to balance abnormal traffic and detect minority attacks. Fourth, the Convolutional Neural Network (CNN) model is designed to extract deep features from the balanced network traffic. Finally, the hybrid approach of the CNN-Long Short-Term Memory (CNN-LSTM) model is developed to detect different types of attacks from the deep features. Detailed experiments are conducted to test the proposed approach using three standard datasets, i.e., UNSW-NB15, CIC-IDS2017, and NSL-KDD. An explainable AI approach is implemented to interpret the proposed method and develop a trustable model.
Keywords: network intrusion detection; transfer learning; feature extraction; imbalanced data; explainable AI; cybersecurity
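The class-balancing step (SMOTE applied before the downstream classifier) is easy to demonstrate in isolation. A minimal sketch on synthetic traffic-like data; the imbalance ratio is an assumption, and the transformer and CNN-LSTM stages are not reproduced here.

```python
from collections import Counter
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification

# Synthetic stand-in for traffic features with a 5% minority attack class.
X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)
print("before:", Counter(y))

# SMOTE synthesizes minority samples by interpolating between neighbours.
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X, y)
print("after: ", Counter(y_bal))
```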
18. Stress-assisted corrosion mechanism of 3Ni steel by using gradient boosting decision tree machine learning method (Cited: 2)
Authors: Xiaojia Yang, Jinghuan Jia, Qing Li, Renzheng Zhu, Jike Yang, Zhiyong Liu, Xuequn Cheng, Xiaogang Li. International Journal of Minerals, Metallurgy and Materials (SCIE, EI, CAS, CSCD), 2024, Issue 6, pp. 1311-1321.
Traditional 3Ni weathering steel cannot completely meet the requirements of offshore engineering development, so the design of novel 3Ni steels with microalloying elements such as Mn or Nb added for strength enhancement has become a trend. The stress-assisted corrosion behavior of a novel designed high-strength 3Ni steel was investigated in the current study using the corrosion big data method. Information on the corrosion process was recorded using the galvanic corrosion current monitoring method. The gradient boosting decision tree (GBDT) machine learning method was used to mine the corrosion mechanism, and the importance of the structural factors was investigated. Field exposure tests were conducted to verify the calculated results of the GBDT method. Results indicated that the GBDT method can be effectively used to study the influence of structural factors on the corrosion process of 3Ni steel. Different mechanisms for the addition of Mn and Cu to the stress-assisted corrosion of 3Ni steel suggested that Mn and Cu have no obvious effect on the corrosion rate of non-stressed 3Ni steel during the early stage of corrosion. When the corrosion reached a stable state, an increase in Mn content increased the corrosion rate of 3Ni steel, while Cu reduced this rate. In the presence of stress, an increase in Mn content and the addition of Cu can inhibit the corrosion process. The corrosion law of outdoor-exposed 3Ni steel is consistent with the law derived from corrosion big data technology, verifying the reliability of the big data evaluation method and the data prediction model selection.
Keywords: weathering steel; stress-assisted corrosion; gradient boosting decision tree; machine learning
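Ranking factor importance with a GBDT, as the abstract describes for structural factors such as Mn and Cu content, follows a standard pattern. A minimal sketch; the feature names, synthetic target, and coefficients are invented for illustration and do not reproduce the paper's data or findings.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
features = ["Mn", "Cu", "stress", "time"]   # hypothetical structural factors
X = rng.random((400, 4))
# Synthetic corrosion-rate target with arbitrary factor effects plus noise.
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] - 0.2 * X[:, 2] + 0.1 * rng.random(400)

gbdt = GradientBoostingRegressor(random_state=0).fit(X, y)
for name, imp in zip(features, gbdt.feature_importances_):
    print(f"{name}: {imp:.2f}")             # relative influence of each factor
```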
19. Machine learning applications on lunar meteorite minerals: From classification to mechanical properties prediction (Cited: 1)
Authors: Eloy Peña-Asensio, Josep M. Trigo-Rodríguez, Jordi Sort, Jordi Ibáñez-Insa, Albert Rimola. International Journal of Mining Science and Technology (SCIE, EI, CAS, CSCD), 2024, Issue 9, pp. 1283-1292.
Amid the scarcity of lunar meteorites and the imperative to preserve their scientific value, nondestructive testing methods are essential. This translates into the application of microscale rock mechanics experiments and scanning electron microscopy for surface composition analysis. This study explores the application of machine learning algorithms in predicting the mineralogical and mechanical properties of the DHOFAR 1084, JAH 838, and NWA 11444 lunar meteorites based solely on their atomic percentage compositions. Leveraging a prior-data fitted network model, we achieved near-perfect classification scores for meteorites, mineral groups, and individual minerals. The regressor models, notably the KNeighbor model, provided an outstanding estimate of the mechanical properties (previously measured by nanoindentation tests) such as hardness, reduced Young's modulus, and elastic recovery. Further considerations of the nature and physical properties of the minerals forming these meteorites, including porosity, crystal orientation, or shock degree, are essential for refining predictions. Our findings underscore the potential of machine learning in enhancing mineral identification and mechanical property estimation in lunar exploration, paving the way for new advancements and quick assessments in extraterrestrial mineral mining, processing, and research.
Keywords: meteorites; Moon; mineralogy; machine learning; mechanical properties
20. Machine learning with active pharmaceutical ingredient/polymer interaction mechanism: Prediction for complex phase behaviors of pharmaceuticals and formulations (Cited: 2)
Authors: Kai Ge, Yiping Huang, Yuanhui Ji. Chinese Journal of Chemical Engineering (SCIE, EI, CAS, CSCD), 2024, Issue 2, pp. 263-272.
The high-throughput prediction of the thermodynamic phase behavior of active pharmaceutical ingredients (APIs) with pharmaceutically relevant excipients remains a major scientific challenge in the screening of pharmaceutical formulations. In this work, a developed machine-learning model efficiently predicts the solubility of APIs in polymers by learning the phase equilibrium principle and using a few molecular descriptors. Under the few-shot learning framework, thermodynamic theory (perturbed-chain statistical associating fluid theory) was used for data augmentation, and computational chemistry was applied for the screening of molecular descriptors. The results showed that the developed machine-learning model can predict the API-polymer phase diagram accurately, broaden the solubility data of APIs in polymers, and successfully reproduce the relationship between API solubility and the interaction mechanisms between API and polymer, which provides efficient guidance for the development of pharmaceutical formulations.
Keywords: multi-task machine learning; density functional theory; hydrogen bond interaction; miscibility; solubility