Effective path planning is crucial for mobile robots to quickly reach rescue destinations and complete rescue tasks in post-disaster scenarios. In this study, we investigated the post-disaster rescue path planning problem and modeled it as a variant of the traveling salesman problem (TSP) with life-strength constraints. To address this problem, we proposed an improved iterated greedy (IIG) algorithm. First, a push-forward insertion heuristic (PFIH) strategy was employed to generate a high-quality initial solution. Second, a greedy-based insertion strategy was designed and used in the destruction-construction stage to increase the algorithm's exploration ability. Furthermore, three problem-specific swap operators were developed to improve the algorithm's exploitation ability. Additionally, an improved simulated annealing (SA) strategy was used as an acceptance criterion to effectively prevent the algorithm from falling into local optima. To verify the effectiveness of the proposed algorithm, the Solomon dataset was extended to generate 27 instances for simulation. Finally, the proposed IIG was compared with five state-of-the-art algorithms. The parameter analysis was conducted using the design of experiments (DOE) Taguchi method, and the effectiveness of each component was verified one by one. Simulation results indicate that IIG outperforms the compared algorithms in terms of the number of rescued survivors and convergence speed, proving the effectiveness of the proposed algorithm.
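As a point of reference for the destruction-construction loop described above, the following minimal Python sketch shows a generic iterated greedy iteration with an SA-style acceptance rule. It is not the paper's IIG: the PFIH initialization, life-strength constraints, and problem-specific swap operators are omitted, and the symmetric distance matrix, destruction size `d`, and temperature schedule are illustrative assumptions.

```python
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour under a symmetric distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def greedy_insert(partial, removed, dist):
    """Construction: reinsert each removed city at the position of minimum tour length."""
    tour = list(partial)
    for city in removed:
        best_pos, best_cost = 0, float("inf")
        for pos in range(len(tour) + 1):
            cand = tour[:pos] + [city] + tour[pos:]
            cost = tour_length(cand, dist)
            if cost < best_cost:
                best_pos, best_cost = pos, cost
        tour.insert(best_pos, city)
    return tour

def iterated_greedy(dist, n_iter=200, d=3, temp=1.0, cooling=0.95):
    """Destruction-construction loop with a simulated-annealing acceptance criterion."""
    n = len(dist)
    current = list(range(n))
    random.shuffle(current)
    best = current[:]
    for _ in range(n_iter):
        removed = random.sample(current, d)                 # destruction
        partial = [c for c in current if c not in removed]
        candidate = greedy_insert(partial, removed, dist)   # construction
        delta = tour_length(candidate, dist) - tour_length(current, dist)
        if delta < 0 or random.random() < math.exp(-delta / max(temp, 1e-9)):
            current = candidate                             # SA-style acceptance
        if tour_length(current, dist) < tour_length(best, dist):
            best = current[:]
        temp *= cooling                                     # geometric cooling schedule
    return best
```

The acceptance rule occasionally keeps a worse tour with probability exp(-delta/temp), which is the mechanism that prevents the greedy reconstruction from stalling in a local optimum.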
By considering the eigenratio of the Laplacian matrix as the synchronizability measure, this paper presents an efficient method to enhance the synchronizability of undirected and unweighted networks via rewiring. The rewiring method combines tabu search with a local greedy algorithm so that an effective search of solutions can be achieved. As demonstrated in the simulation results, the proposed approach outperforms existing methods for a large variety of initial networks, both in terms of speed and quality of solutions.
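For illustration, the following sketch computes the Laplacian eigenratio and performs one brute-force greedy rewiring step over all single edge swaps; the tabu memory used in the paper is omitted, and the network is assumed to be small, undirected, unweighted, and connected.

```python
import numpy as np

def eigenratio(adj):
    """Synchronizability measure: lambda_N / lambda_2 of the graph Laplacian (smaller is better).
    Assumes a symmetric 0/1 adjacency matrix of a connected graph."""
    lap = np.diag(adj.sum(axis=1)) - adj
    eig = np.sort(np.linalg.eigvalsh(lap))
    return eig[-1] / eig[1]

def greedy_rewire_step(adj):
    """Try every rewiring that removes one existing edge and adds one absent edge,
    and keep the rewiring that most reduces the eigenratio (brute force, small graphs only)."""
    n = adj.shape[0]
    best, best_r = adj.copy(), eigenratio(adj)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n) if adj[i, j]]
    non_edges = [(i, j) for i in range(n) for j in range(i + 1, n) if not adj[i, j]]
    for a, b in edges:
        for c, d in non_edges:
            cand = adj.copy()
            cand[a, b] = cand[b, a] = 0
            cand[c, d] = cand[d, c] = 1
            eig = np.sort(np.linalg.eigvalsh(np.diag(cand.sum(axis=1)) - cand))
            if eig[1] < 1e-9:          # skip rewirings that disconnect the graph
                continue
            r = eig[-1] / eig[1]
            if r < best_r:
                best, best_r = cand, r
    return best, best_r
```

Repeating such steps monotonically lowers the eigenratio; the tabu list in the paper additionally allows non-improving moves to escape local minima.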
With the wide application of automated guided vehicles (AGVs) in large-scale outdoor scenarios with complex terrain, the collaborative work of a large number of AGVs has become the main trend. An effective multi-agent path finding (MAPF) algorithm is urgently needed to ensure the efficiency and realizability of the whole system. The complex terrain of outdoor scenarios is fully considered by using different passage-cost values to quantify different terrain types. The objective of the MAPF problem is to minimize the passage cost, while the Manhattan distance of the paths and the passage time are also evaluated for a comprehensive comparison. The pre-path-planning and real-time-conflict based greedy (PRG) algorithm is proposed as the solution. Simulations are conducted, and the proposed PRG algorithm is compared with the waiting-stop A* and conflict-based search (CBS) algorithms. Results show that the PRG algorithm outperforms the waiting-stop A* algorithm in all three performance indicators, and it is more applicable than the CBS algorithm when a large number of AGVs work collaboratively with frequent collisions.
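To illustrate how terrain-dependent passage costs enter the path evaluation, the sketch below runs a plain Dijkstra search on a 4-connected cost grid and reports the path, its passage cost, and the Manhattan distance between the endpoints. This is a single-agent illustration only, not the PRG algorithm (no conflict handling); the grid layout, 4-connectivity, and cost values are assumptions.

```python
import heapq

def cheapest_path(cost, start, goal):
    """Minimum passage-cost path on a 4-connected grid; cost[r][c] is the terrain passage
    cost of entering cell (r, c). Assumes the goal is reachable from the start."""
    rows, cols = len(cost), len(cost[0])
    dist, prev = {start: 0}, {}
    heap = [(0, start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)], prev[(nr, nc)] = nd, (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    path.reverse()
    manhattan = abs(goal[0] - start[0]) + abs(goal[1] - start[1])
    return path, dist[goal], manhattan

# Example: 0 = road, 3 = grass, 9 = rough terrain (illustrative passage costs).
grid = [[0, 3, 9], [0, 3, 9], [0, 0, 0]]
print(cheapest_path(grid, (0, 0), (0, 2)))
```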
The Gannet Optimization Algorithm (GOA) and the Whale Optimization Algorithm (WOA) demonstrate strong performance; however, there remains room for improvement in convergence and practical applications. This study introduces a hybrid optimization algorithm, named the adaptive inertia weight whale optimization algorithm and gannet optimization algorithm (AIWGOA), which addresses challenges in enhancing handwritten documents. The hybrid strategy integrates the strengths of both algorithms, significantly enhancing their capabilities, while the adaptive parameter strategy reduces the need for manual parameter setting. By amalgamating the hybrid strategy and the parameter-adaptive approach, the Gannet Optimization Algorithm was refined to yield the AIWGOA. In a performance analysis on the CEC2013 benchmark, the AIWGOA demonstrates notable advantages across various metrics. Subsequently, an evaluation index was employed to assess the enhanced handwritten documents and images, affirming the superior practical applicability of the AIWGOA compared with other algorithms.
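The abstract does not spell out the adaptive inertia weight rule; as a hedged point of reference, a commonly used iteration-dependent inertia weight is sketched below, with the bounds `w_max` and `w_min` chosen arbitrarily rather than taken from the paper.

```python
def adaptive_inertia(t, t_max, w_max=0.9, w_min=0.4):
    """Inertia weight decreasing over iterations: large values early in the run favor
    exploration, small values late in the run favor exploitation."""
    return w_max - (w_max - w_min) * t / t_max

# Example: weight at the start, midpoint, and end of a 100-iteration run.
print([round(adaptive_inertia(t, 100), 2) for t in (0, 50, 100)])  # [0.9, 0.65, 0.4]
```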
A Rapidly-exploring Random Tree (RRT) autonomous detection algorithm based on a multi-guide-node deflection strategy and the Karto Simultaneous Localization and Mapping (SLAM) algorithm was proposed to solve two problems of the traditional RRT algorithm in mobile-robot autonomous detection: low efficiency in detecting frontier boundary points and drift distortion during map building. First, an RRT global frontier boundary point detection algorithm based on the multi-guide-node deflection strategy was put forward, which introduces a reference value of the guide nodes' deflection probability into the random sampling function so that the global search tree can detect frontier boundary points toward the guide nodes according to random probability. After that, a new autonomous detection algorithm for mobile robots was proposed by combining the graph-optimization-based Karto SLAM algorithm with the improved RRT algorithm. A simulation platform based on Gazebo was built. The simulation results show that, compared with the traditional RRT algorithm, the proposed RRT autonomous detection algorithm can effectively reduce the detection time and the length of the detection trajectory while maintaining a high average detection coverage, completing the autonomous detection and mapping task more efficiently. Finally, with the help of a ROS-based mobile robot experimental platform, the performance of the proposed algorithm was verified in real environments with different obstacles. The experimental results show that, in actual environments with both simple and complex obstacles, the proposed RRT autonomous detection algorithm is superior to the traditional RRT autonomous detection algorithm in detection time, detection trajectory length, and average coverage, thus improving the efficiency and accuracy of autonomous detection.
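The deflection idea in the sampling function can be illustrated with the following hedged sketch: with some probability the RRT sample is drawn near a randomly chosen guide node instead of uniformly over the map. The probability `p_guide`, the Gaussian spread `sigma`, and the rectangular bounds are illustrative assumptions, not the paper's reference values.

```python
import random

def biased_sample(bounds, guide_nodes, p_guide=0.3, sigma=0.5):
    """Sample a point for RRT expansion: with probability p_guide, sample near a randomly
    chosen guide node (deflection toward the frontier); otherwise sample uniformly."""
    (xmin, xmax), (ymin, ymax) = bounds
    if guide_nodes and random.random() < p_guide:
        gx, gy = random.choice(guide_nodes)
        x = min(max(random.gauss(gx, sigma), xmin), xmax)  # clamp to the map bounds
        y = min(max(random.gauss(gy, sigma), ymin), ymax)
    else:
        x = random.uniform(xmin, xmax)
        y = random.uniform(ymin, ymax)
    return x, y

# Example: guide nodes near two unexplored corners of a 10 m x 10 m map.
print(biased_sample(((0, 10), (0, 10)), [(9.0, 9.0), (0.5, 9.5)]))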
Cloud computing technology is utilized to achieve resource utilization of remote virtual computers and to provide consumers with rapid and accurate massive-data services. It relies on on-demand resource provisioning, but the constraints of rapid turnaround time, minimal execution cost, high resource utilization, and limited makespan turn the Load Balancing (LB)-based Task Scheduling (TS) problem into an NP-hard optimization issue. In this paper, the Hybrid Prairie Dog and Beluga Whale Optimization Algorithm (HPDBWOA) is proposed for precise mapping of tasks to virtual machines, with the objective of addressing the dynamic nature of the cloud environment. This capability of HPDBWOA helps decrease SLA violations and makespan with optimal resource management. It is modeled as a scheduling strategy that utilizes the merits of PDOA and BWOA to attain reactive decision-making when assigning tasks to virtual resources while taking their priorities into account. It addresses the problem of premature convergence with well-balanced exploration and exploitation to attain the necessary Quality of Service (QoS) for minimizing the waiting time incurred during the TS process. It further balances exploration and exploitation rates to reduce the makespan during task allocation with complete awareness of the VM state. The results of the proposed HPDBWOA confirmed a 32.18% reduction in energy utilization and a 28.94% reduction in cost compared with the approaches used for investigation. The statistical investigation of the proposed HPDBWOA conducted using ANOVA confirmed its efficacy over the benchmarked systems in terms of throughput, system time, and response time.
Energy efficiency is the prime concern in Wireless Sensor Networks (WSNs), as maximized energy consumption essentially limits the energy stability and network lifetime. Clustering is the significant approach essential for minimizing unnecessary transmission energy consumption while sustaining network lifetime. This clustering process is identified as a Non-deterministic Polynomial (NP)-hard optimization problem, which has a high probability of being solved through metaheuristic algorithms. The adoption of a hybrid metaheuristic algorithm concentrates on identifying optimal or near-optimal solutions, which aids energy stability during Cluster Head (CH) selection. In this paper, the Hybrid Seagull and Whale Optimization Algorithm-based Dynamic Clustering Protocol (HSWOA-DCP) is proposed, combining the exploitation benefits of WOA and the exploration merits of SEOA for optimal CH selection to maintain energy stability with prolonged network lifetime. HSWOA-DCP adopts a modified version of the SEagull Optimization Algorithm (SEOA) to handle the problems of premature convergence and computational accuracy that are most likely to arise during CH selection. The inclusion of SEOA into WOA improves the global searching capability during CH selection and prevents the worst-fitness nodes from being selected as CH, since the spiral attacking behavior of SEOA is similar to the bubble-net characteristics of WOA. This CH selection integrates the spiral attacking principles of SEOA and the contraction-surrounding mechanism of WOA to improve computational accuracy and prevent frequent election processes. It also includes a Lévy flight strategy in SEOA to avoid premature convergence and attain a better trade-off between the rates of exploration and exploitation in a more effective manner. The simulation results of the proposed HSWOA-DCP confirmed better network survivability rate, network residual energy, and overall network throughput compared with the competitive CH selection schemes under different numbers of data transmission rounds. The statistical analysis of the proposed HSWOA-DCP scheme also confirmed its energy stability with respect to an ANOVA test.
In the cloud environment, ensuring a high level of data security is in high demand. Data planning storage optimization is part of the overall security process in the cloud environment. It enables data security by avoiding the risks of data loss and data overlapping. The development of data flow scheduling approaches in the cloud environment that take security parameters into account is insufficient. In our work, we propose a data scheduling model for the cloud environment. The model is made up of three parts that together help dispatch user data flows to the appropriate cloud VMs. The first component is the collector agent, which must periodically collect information on the state of the network links. The second is the monitoring agent, which must then analyze, classify, and make a decision on the state of the link and finally transmit this information to the scheduler. The third is the scheduler, which must consider the previous information to transfer user data, including fair distribution and reliable paths. It should be noted that each part of the proposed model requires the development of its own algorithms. In this article, we are interested in the development of data transfer algorithms, including fairness of distribution with consideration of a stable link state. These algorithms are based on the grouping of transmitted files and an iterative method. The proposed algorithms show the performance needed to obtain an approximate solution to the studied problem, which is an NP-hard (non-polynomial) problem. The experimental results show that the best algorithm is the half-grouped minimum excluding (HME), with a percentage of 91.3%, an average deviation of 0.042, and an execution time of 0.001 s.
This paper presents a new approach to improving image steganography using the Ant Colony Optimization (ACO) algorithm. Image steganography, a technique for embedding hidden information in digital photographs, should ideally achieve the dual purposes of maximum data hiding and maintenance of the integrity of the cover media so that it is least suspect. Contemporary methods of steganography are at best a compromise between these two. In this paper, we present our approach, entitled Ant Colony Optimization (ACO)-Least Significant Bit (LSB), which attempts to optimize the capacity of steganographic embedding. The approach uses a grayscale cover image to hide the confidential data with an additional bit pair per byte for both integrity verification and the file checksum of the secret data. It encodes confidential information into four pairs of bits and embeds it within uncompressed grayscale images. The ACO algorithm uses adaptive exploration to select pixels, maximizing the data embedding capacity while minimizing the degradation of visual quality. Pheromone evaporation is applied across iterations to avoid stagnation in solution refinement, and the pheromone levels are updated to reinforce successful pixel choices. Experimental results obtained with the ACO-LSB method show that it clearly improves image steganography capabilities, providing an increase of up to 30% in embedding capacity compared with traditional approaches; the average Peak Signal-to-Noise Ratio (PSNR) is 40.5 dB with a Structural Similarity Index (SSIM) of 0.98. The approach also demonstrates very high resistance to detection, cutting the detection rate by 20%. Implemented in MATLAB R2023a, the model was tested on one thousand publicly available grayscale images, providing robust evidence of its effectiveness.
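The paper's implementation is in MATLAB; the Python sketch below only illustrates the two mechanisms named above, pheromone evaporation/reinforcement and LSB embedding of bit pairs into a grayscale byte. The evaporation rate `rho` and the quality-proportional reinforcement are assumptions, not values taken from the paper.

```python
import numpy as np

def update_pheromone(pheromone, chosen_pixels, quality, rho=0.1):
    """Evaporate pheromone everywhere, then reinforce the pixels chosen in this iteration
    in proportion to the solution quality (e.g., a PSNR-based score)."""
    pheromone *= (1.0 - rho)                 # evaporation avoids stagnation
    for r, c in chosen_pixels:
        pheromone[r, c] += quality           # reinforce successful pixel choices
    return pheromone

def embed_bits_lsb(pixel_value, bits):
    """Embed len(bits) secret bits into the least significant bits of one grayscale byte."""
    k = len(bits)                            # assumes a non-empty bit list
    cleared = (pixel_value >> k) << k        # clear the k lowest bits of the cover byte
    payload = int("".join(str(b) for b in bits), 2)
    return cleared | payload

# Example: embed the bit pair (1, 0) into a pixel of value 200 -> 202.
print(embed_bits_lsb(200, [1, 0]))
pher = update_pheromone(np.ones((4, 4)), [(0, 0), (2, 3)], quality=0.5)
```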
In Wireless Sensor Networks (WSNs), the clustering process is widely utilized to increase the lifespan with sustained energy stability during data transmission. Several clustering protocols have been devised to extend the network lifetime, but most of them fail to handle the problems of fixed clustering, static rounds, and inadequate Cluster Head (CH) selection criteria, which consume more energy. In this paper, a Stochastic Ranking Improved Teaching-Learning and Adaptive Grasshopper Optimization Algorithm (SRITL-AGOA)-based clustering scheme is proposed for energy stabilization and extending the network lifespan. SRITL-AGOA selects the CH depending on the weightage of factors such as node mobility degree, neighbour density, distance to the sink, single-hop or multi-hop communication, and Residual Energy (RE), which directly influence the energy consumption of sensor nodes. Specifically, the Grasshopper Optimization Algorithm (GOA) is improved through a tangent-based nonlinear strategy to enhance its global optimization ability. On the other hand, stochastic ranking and violation constraint handling strategies are embedded into the Teaching-Learning-based Optimization Algorithm (TLOA) to improve its exploitation tendencies. Then, the SR- and VCH-improved TLOA is embedded into the exploitation phase of AGOA to select a better CH by maintaining a better balance between exploration and exploitation. Simulation results confirmed that the proposed SRITL-AGOA improved throughput by 21.86%, network stability by 18.94%, and load balancing by 16.14%, with energy depletion minimized by 19.21%, compared to the competitive CH selection approaches.
This study proposes a hybridization of two efficient algorithms: the Multi-objective Ant Lion Optimizer Algorithm (MOALO), a multi-objective enhanced version of the Ant Lion Optimizer Algorithm (ALO), and the Genetic Algorithm (GA). The MOALO version is employed to address problems containing many objectives, and an archive is employed for retaining the non-dominated solutions. The uniqueness of the hybrid is that GA operators such as mutation and crossover are applied in the archive to update the solutions, and those solutions then go through the MOALO process. A hybrid of these algorithms is employed for the first time to solve multi-objective problems. The hybrid algorithm overcomes the limitation of ALO of getting caught in local optima and the requirement of more computational effort for GA to converge. To evaluate the hybridized algorithm's performance, a set of constrained and unconstrained test problems and engineering design problems was employed and compared with five well-known computational algorithms: MOALO, the Multi-objective Crystal Structure Algorithm (MOCryStAl), Multi-objective Particle Swarm Optimization (MOPSO), the Multi-objective Multiverse Optimization Algorithm (MOMVO), and the Multi-objective Salp Swarm Algorithm (MSSA). The outcomes of five performance metrics are statistically analyzed, and the most efficient Pareto front comparison has been obtained. The proposed hybrid surpasses MOALO based on the results of hypervolume (HV), Spread, and Spacing, so the primary objective of developing this hybrid approach has been achieved successfully. The proposed approach demonstrates superior performance on the test functions, showcasing robust convergence and comprehensive coverage that surpass other existing algorithms.
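A minimal sketch of the GA variation step applied to archive members is given below; the decision vectors are assumed to be real-valued and bounded, the crossover and mutation rates are illustrative, and the subsequent non-dominance filtering and MOALO update described above are omitted.

```python
import random

def crossover(parent_a, parent_b, rate=0.9):
    """Uniform crossover on real-valued decision vectors."""
    if random.random() > rate:
        return parent_a[:]
    return [a if random.random() < 0.5 else b for a, b in zip(parent_a, parent_b)]

def mutate(solution, bounds, rate=0.1):
    """Reset each gene to a random value within its bounds with a small probability."""
    return [random.uniform(lo, hi) if random.random() < rate else x
            for x, (lo, hi) in zip(solution, bounds)]

def vary_archive(archive, bounds):
    """Apply GA variation to archive members to produce candidate solutions for the MOALO step.
    Assumes the archive holds at least two non-dominated solutions."""
    children = []
    for _ in range(len(archive)):
        a, b = random.sample(archive, 2)
        children.append(mutate(crossover(a, b), bounds))
    return children

# Example: two archived 3-variable solutions, each variable bounded in [0, 1].
print(vary_archive([[0.2, 0.8, 0.5], [0.9, 0.1, 0.4]], [(0, 1)] * 3))
```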
One of the most dangerous safety hazards in underground coal mines is roof falls during retreat mining. Roof falls may cause life-threatening and non-fatal injuries to miners and impede mining and transportation operations. As a result, a reliable roof fall prediction model is essential to tackle such challenges. Different parameters that substantially impact roof falls are ill-defined and intangible, making this an uncertain and challenging research issue. The National Institute for Occupational Safety and Health assembled a national database of roof performance from 37 coal mines to explore the factors contributing to roof falls. The data acquired for the 37 mines are limited due to several restrictions, which increases the likelihood of incompleteness. Fuzzy logic is a technique for coping with ambiguity, incompleteness, and uncertainty. Therefore, in this paper a fuzzy inference method is presented, which employs a genetic algorithm to create fuzzy rules based on 109 records of roof fall data and pattern search to refine the membership functions of the parameters. The performance of the deployed model is evaluated using statistical measures such as the root-mean-square error, mean absolute error, and coefficient of determination (R²). Based on these criteria, the suggested model outperforms the existing models in precisely predicting roof fall rates using fewer fuzzy rules.
Traditional laboratory tests for measuring rock uniaxial compressive strength (UCS) are tedious and time-consuming. There is a pressing need for more effective methods to determine rock UCS, especially in deep mining environments under high in-situ stress. Thus, this study aims to develop an advanced model for predicting the UCS of rock material in deep mining environments by combining three boosting-based machine learning methods with four optimization algorithms. For this purpose, the Lead-Zinc mine in Southwest China is considered as the case study. Rock density, P-wave velocity, and the point load strength index are used as input variables, and UCS is regarded as the output. Subsequently, twelve hybrid predictive models are obtained. Root mean square error (RMSE), mean absolute error (MAE), coefficient of determination (R²), and the proportion of samples with an absolute percentage error below 20% (A-20) are selected as the evaluation metrics. Experimental results showed that the hybrid model consisting of the extreme gradient boosting method and the artificial bee colony algorithm (XGBoost-ABC) achieved satisfactory results on the training dataset and exhibited the best generalization performance on the testing dataset. The values of R², A-20, RMSE, and MAE on the training dataset are 0.98, 1.0, 3.11 MPa, and 2.23 MPa, respectively. The highest values of R² and A-20 (0.93 and 0.96), and the smallest RMSE and MAE values of 4.78 MPa and 3.76 MPa, are observed on the testing dataset. The proposed hybrid model can be considered a reliable and effective method for predicting rock UCS in deep mines.
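The four evaluation metrics can be computed as in the following sketch, where A-20 is taken as the share of samples whose absolute percentage error is below 20% (measured values are assumed to be nonzero).

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """RMSE, MAE, R2, and the A-20 index for a set of UCS predictions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    mae = np.mean(np.abs(y_true - y_pred))
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    a20 = np.mean(np.abs(y_pred - y_true) / np.abs(y_true) <= 0.20)  # within +/-20%
    return {"RMSE": rmse, "MAE": mae, "R2": r2, "A-20": a20}

# Example with illustrative UCS values in MPa.
print(regression_metrics([80, 95, 110, 70], [78, 99, 104, 75]))
```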
With the development of vehicles towards intelligence and connectivity, vehicular data is diversifying and growing dramatically. A task allocation model and algorithm for heterogeneous Intelligent Connected Vehicle (ICV) applications are proposed for a dispersed computing network composed of heterogeneous task vehicles and Network Computing Points (NCPs). Considering the amount of task data and the idle resources of NCPs, a computing resource scheduling model for NCPs is established. Taking the heterogeneous task execution delay threshold as a constraint, the optimization problem is described as maximizing the utilization of the computing resources of the NCPs. The problem is proven to be NP-hard by reduction to the 0-1 knapsack problem. A many-to-many matching algorithm based on resource preferences is proposed. The algorithm first establishes mutual preference lists based on the adaptability between the task requirements and the resources provided by the NCPs. This enables un-schedulable NCPs to be filtered out in the initial stage of matching, reducing the dimension of the solution space. To solve the matching problem between ICVs and NCPs, a new many-to-many matching algorithm is proposed to obtain a unique and stable optimal matching result. The simulation results demonstrate that the proposed scheme can improve the resource utilization of NCPs by an average of 9.6% compared to the reference scheme, and the total performance can be improved by up to 15.9%.
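A simplified sketch of the preference-list construction and a capacity-aware greedy assignment is shown below. It filters out zero-fitness (un-schedulable) pairs as described, but it is a plain greedy allocation, not the paper's stable many-to-many matching algorithm, and the fitness scores and capacities are placeholder assumptions.

```python
def build_preferences(tasks, ncps, fit):
    """Mutual preference lists ranked by fitness; zero-fitness pairs are filtered out,
    removing un-schedulable NCPs before matching."""
    task_pref = {t: sorted((n for n in ncps if fit[t][n] > 0),
                           key=lambda n: fit[t][n], reverse=True) for t in tasks}
    ncp_pref = {n: sorted((t for t in tasks if fit[t][n] > 0),
                          key=lambda t: fit[t][n], reverse=True) for n in ncps}
    return task_pref, ncp_pref

def greedy_match(task_pref, capacity):
    """Assign each task to its most-preferred NCP that still has free capacity."""
    load = {n: 0 for n in capacity}
    match = {}
    for task, prefs in task_pref.items():
        for ncp in prefs:
            if load[ncp] < capacity[ncp]:
                match[task] = ncp
                load[ncp] += 1
                break
    return match

# Example: two tasks, two NCPs, illustrative fitness scores and unit capacities.
fit = {"t1": {"n1": 0.9, "n2": 0.4}, "t2": {"n1": 0.7, "n2": 0.0}}
task_pref, _ = build_preferences(["t1", "t2"], ["n1", "n2"], fit)
print(greedy_match(task_pref, {"n1": 1, "n2": 1}))
```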
Mathematical physics equations are often utilized to describe physical phenomena in various fields of science and engineering. One such equation is the Fourier equation, which is a commonly used and effective method for evaluating the effectiveness of temperature control measures for mass concrete. One important temperature control measure for mass concrete is the use of cooling water pipes. However, the mismatch of grids between large-scale concrete models and small-scale cooling pipe models can result in a significant waste of calculation time when using the finite element method. Moreover, the temperature of the water in the cooling pipe needs to be calculated iteratively during the thermal transfer process. The substructure method can effectively solve this problem, and it has been validated by scholars. Abaqus/Python secondary development technology provides engineers with enough flexibility to combine the substructure method with an iteration algorithm, which enables parametric modeling and calculation for cooling water pipes. This paper proposes such a method, which iterates the water pipe boundary and establishes a water pipe unit substructure to numerically simulate the concrete temperature field containing a cooling water pipe. To verify the feasibility and accuracy of the proposed method, two classic numerical examples were analyzed. The results showed that this method has good applicability to cooling pipe calculations. When the value of the iteration parameter α is 0.4, the boundary temperature of the cooling water pipes can meet the accuracy requirements after 4-5 iterations, effectively improving the computational efficiency. Overall, this approach provides a useful tool for engineers to analyze temperature control measures for mass concrete, such as cooling water pipes, accurately and efficiently using Abaqus/Python secondary development.
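The abstract does not state the exact form of the boundary iteration; one plausible reading is an under-relaxation update with weighting parameter α, sketched below in plain Python, where `solve_field` is a placeholder standing in for the substructure-based finite element solve.

```python
def relax_boundary(t_old, t_computed, alpha=0.4):
    """Under-relaxation of the pipe-boundary temperatures between iterations:
    t_new = (1 - alpha) * t_old + alpha * t_computed."""
    return [(1.0 - alpha) * to + alpha * tc for to, tc in zip(t_old, t_computed)]

def iterate_boundary(solve_field, t_guess, alpha=0.4, tol=0.01, max_iter=20):
    """Fixed-point loop: solve the concrete temperature field with the current boundary
    temperatures, relax the update, and stop when successive iterates agree within tol."""
    t = list(t_guess)
    for _ in range(max_iter):
        t_new = relax_boundary(t, solve_field(t), alpha)
        if max(abs(a - b) for a, b in zip(t_new, t)) < tol:
            return t_new
        t = t_new
    return t

# Example with a toy solver: each boundary node relaxes toward 20 degrees C.
toy_solver = lambda t: [0.5 * x + 10.0 for x in t]
print(iterate_boundary(toy_solver, [30.0, 28.0, 26.0]))
```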
In the rapidly evolving landscape of today's digital economy, Financial Technology (Fintech) emerges as a transformative force, propelled by the dynamic synergy between Artificial Intelligence (AI) and algorithmic trading. Our in-depth investigation delves into the intricacies of merging Multi-Agent Reinforcement Learning (MARL) and Explainable AI (XAI) within Fintech, aiming to refine algorithmic trading strategies. Through meticulous examination, we uncover the nuanced interactions of AI-driven agents as they collaborate and compete within the financial realm, employing sophisticated deep learning techniques to enhance the clarity and adaptability of trading decisions. These AI-infused Fintech platforms harness collective intelligence to unearth trends, mitigate risks, and provide tailored financial guidance, benefiting individuals and enterprises navigating the digital landscape. Our research holds the potential to revolutionize finance, opening doors to fresh avenues for investment and asset management in the digital age. Additionally, our statistical evaluation yields encouraging results, with metrics such as Accuracy = 0.85, Precision = 0.88, and F1 score = 0.86, reaffirming the efficacy of our approach within Fintech and emphasizing its reliability and innovative prowess.
Recent developments in computer vision have presented novel opportunities to tackle complex healthcare issues, particularly in the field of lung disease diagnosis. One promising avenue involves the use of chest X-rays, which are commonly utilized in radiology. To fully exploit their potential, researchers have suggested utilizing deep learning methods to construct computer-aided diagnostic systems. However, constructing and compressing these systems presents a significant challenge, as it relies heavily on the expertise of data scientists. To tackle this issue, we propose an automated approach that utilizes an evolutionary algorithm (EA) to optimize the design and compression of a convolutional neural network (CNN) for X-ray image classification. Our approach accurately classifies radiography images and detects potential chest abnormalities and infections, including COVID-19. Furthermore, our approach incorporates transfer learning, where a CNN model pre-trained on a vast dataset of chest X-ray images is fine-tuned for the specific task of detecting COVID-19. This method can help reduce the amount of labeled data required for the task and enhance the overall performance of the model. We have validated our method via a series of experiments against state-of-the-art architectures.
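As a hedged illustration of the transfer-learning step, the PyTorch sketch below freezes a pretrained backbone and replaces its classification head. An ImageNet-pretrained ResNet-18 stands in for the chest X-ray pretrained CNN described above, and the class count and learning rate are assumptions.

```python
import torch
import torch.nn as nn
from torchvision import models

def build_finetune_model(num_classes=2):
    """Transfer learning sketch: start from a pretrained backbone, freeze its feature
    extractor, and replace the classification head for the COVID-19 detection task."""
    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    for param in model.parameters():
        param.requires_grad = False                       # keep pretrained features fixed
    model.fc = nn.Linear(model.fc.in_features, num_classes)  # new trainable head
    return model

# Only the new head's parameters are passed to the optimizer.
model = build_finetune_model()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```

Freezing the backbone is what reduces the amount of labeled data needed: only the small head is learned from the task-specific dataset.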
Real-time intelligent lithology identification while drilling is vital to realizing downhole closed-loop drilling. The complex and changeable geological environment encountered while drilling makes lithology identification face many challenges. This paper studies the problems of difficult feature-information extraction, low precision of thin-layer identification, and limited model applicability in intelligent lithology identification. The authors seek to improve the comprehensive performance of the lithology identification model from three aspects: data feature extraction, class balance, and model design. A new real-time intelligent lithology identification model based on a dynamic felling strategy weighted random forest algorithm (DFW-RF) is proposed. According to the feature selection results, gamma ray and 2 MHz phase resistivity are the logging-while-drilling (LWD) parameters that most significantly influence lithology identification. The comprehensive performance of the DFW-RF lithology identification model has been verified in applications to 3 wells in different areas. By comparing the prediction results of five typical lithology identification algorithms, the DFW-RF model achieves a higher lithology identification accuracy rate and F1 score. This model improves the identification accuracy of thin-layer lithology and is effective and feasible in different geological environments. The DFW-RF model plays a truly efficient role in the real-time intelligent identification of lithologic information in closed-loop drilling and has greater applicability, which makes it worthy of wide use in logging interpretation.
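The dynamic felling and weighting strategy of DFW-RF is not specified in the abstract; as a generic stand-in for the class-balance aspect, the sketch below trains a class-weighted random forest on the two LWD features named above, using synthetic placeholder data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

# X holds the two LWD features named in the abstract (gamma ray, 2 MHz phase resistivity);
# y holds lithology class labels. Both arrays are synthetic placeholders.
rng = np.random.default_rng(0)
X = rng.random((200, 2))
y = rng.integers(0, 3, size=200)

# class_weight="balanced" is a generic way to counter class imbalance between thick and
# thin lithology layers; it is not the paper's dynamic felling / weighting strategy.
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
clf.fit(X, y)
print(f1_score(y, clf.predict(X), average="macro"))
```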
Background: The nasal alar defect in Asians remains a challenging issue, as do clear classification and algorithm guidance, despite numerous previously described surgical techniques. The aim of this study is to propose a surgical algorithm that addresses the appropriate surgical procedures for different types of nasal alar defects in Asian patients. Methods: A retrospective case note review was conducted on 32 patients with nasal alar defects who underwent reconstruction between 2008 and 2022. Based on careful analysis and our clinical experience, we proposed a classification system for nasal alar defects and presented a reconstructive algorithm. Patient data, including age, sex, diagnosis, surgical options, and complications, were assessed. The extent of surgical scar formation was evaluated using standard photography based on a 4-grade scar scale. Results: Among the 32 patients, there were 20 males and 12 females with nasal alar defects. The predominant cause of trauma in China was industrial factors. The majority of alar defects were type Ⅰ, comprising 18 cases (56.2%), with type ⅠC the most common subtype (n=8, 25%); there were 5 cases (15.6%) of type Ⅱ defects, 7 (21.9%) of type Ⅲ defects, and 2 (6.3%) of type Ⅳ defects. The most common surgical option was the auricular composite graft (n=8, 25%), followed by the bilobed flap (n=6, 18.8%), free auricular composite flap (n=4, 12.5%), and primary closure (n=3, 9.4%). Satisfactory improvements were observed postoperatively. Conclusion: Factors contributing to the classification were analyzed and defined, providing a framework for the proposed classification system. The reconstructive algorithm offers surgeons appropriate procedures for treating nasal alar defects in Asians.
Boosting algorithms have been widely utilized in the development of landslide susceptibility mapping (LSM) studies. However, these algorithms possess distinct computational strategies and hyperparameters, making it challenging to propose an ideal LSM model. To investigate the impact of different boosting algorithms and hyperparameter optimization algorithms on LSM, this study constructed a geospatial database comprising 12 conditioning factors, such as elevation, stratum, and annual average rainfall. The XGBoost (XGB), LightGBM (LGBM), and CatBoost (CB) algorithms were employed to construct the LSM model. Furthermore, the Bayesian optimization (BO), particle swarm optimization (PSO), and Hyperband optimization (HO) algorithms were applied to optimize the LSM model. The boosting algorithms exhibited varying performance, with CB demonstrating the highest precision, followed by LGBM, and XGB showing poorer precision. Additionally, the hyperparameter optimization algorithms displayed different performance, with HO outperforming PSO and BO showing poorer performance. The HO-CB model achieved the highest precision, with an accuracy of 0.764, an F1-score of 0.777, an area under the curve (AUC) value of 0.837 for the training set, and an AUC value of 0.863 for the test set. The model was interpreted using SHapley Additive exPlanations (SHAP), revealing that slope, curvature, topographic wetness index (TWI), degree of relief, and elevation significantly influenced landslides in the study area. This study offers a scientific reference for LSM and disaster prevention research. It examines the utilization of various boosting algorithms and hyperparameter optimization algorithms in Wanzhou District and proposes the HO-CB-SHAP framework as an effective approach to accurately forecast landslide disasters and interpret LSM models. However, limitations exist concerning the generalizability of the model and the data processing, which require further exploration in subsequent studies.
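Hyperband builds on successive halving; the sketch below shows one bracket of that core loop with placeholder `sample_config` and `evaluate` functions, and it is not tied to the paper's specific CatBoost hyperparameter space or budgets.

```python
import random

def successive_halving(sample_config, evaluate, n=27, r_min=1, eta=3):
    """One Hyperband-style bracket: start with n random configurations, evaluate each with
    a small budget, keep the best 1/eta fraction, multiply the budget by eta, and repeat
    until one configuration remains. Higher validation scores are assumed to be better."""
    configs = [sample_config() for _ in range(n)]
    budget = r_min
    while len(configs) > 1:
        scored = sorted(((evaluate(cfg, budget), cfg) for cfg in configs),
                        key=lambda s: s[0], reverse=True)
        configs = [cfg for _, cfg in scored[: max(1, len(configs) // eta)]]
        budget *= eta
    return configs[0]

# Toy example: "configurations" are learning rates, and the placeholder evaluate function
# scores them with noise that shrinks as the budget grows.
sample = lambda: {"learning_rate": random.uniform(0.01, 0.3)}
score = lambda cfg, b: -abs(cfg["learning_rate"] - 0.1) + random.gauss(0, 0.05 / b)
print(successive_halving(sample, score))
```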
基金supported by the Opening Fund of Shandong Provincial Key Laboratory of Network based Intelligent Computing,the National Natural Science Foundation of China(52205529,61803192)the Natural Science Foundation of Shandong Province(ZR2021QE195)+1 种基金the Youth Innovation Team Program of Shandong Higher Education Institution(2023KJ206)the Guangyue Youth Scholar Innovation Talent Program support received from Liaocheng University(LCUGYTD2022-03).
文摘Effective path planning is crucial for mobile robots to quickly reach rescue destination and complete rescue tasks in a post-disaster scenario.In this study,we investigated the post-disaster rescue path planning problem and modeled this problem as a variant of the travel salesman problem(TSP)with life-strength constraints.To address this problem,we proposed an improved iterated greedy(IIG)algorithm.First,a push-forward insertion heuristic(PFIH)strategy was employed to generate a high-quality initial solution.Second,a greedy-based insertion strategy was designed and used in the destruction-construction stage to increase the algorithm’s exploration ability.Furthermore,three problem-specific swap operators were developed to improve the algorithm’s exploitation ability.Additionally,an improved simulated annealing(SA)strategy was used as an acceptance criterion to effectively prevent the algorithm from falling into local optima.To verify the effectiveness of the proposed algorithm,the Solomon dataset was extended to generate 27 instances for simulation.Finally,the proposed IIG was compared with five state-of-the-art algorithms.The parameter analysiswas conducted using the design of experiments(DOE)Taguchi method,and the effectiveness analysis of each component has been verified one by one.Simulation results indicate that IIGoutperforms the compared algorithms in terms of the number of rescue survivors and convergence speed,proving the effectiveness of the proposed algorithm.
基金Project supported by the grant from City University of Hong Kong (Grant No. 7008105)
文摘By considering the eigenratio of the Laplacian matrix as the synchronizability measure, this paper presents an efficient method to enhance the synchronizability of undirected and unweighted networks via rewiring. The rewiring method combines the use of tabu search and a local greedy algorithm so that an effective search of solutions can be achieved. As demonstrated in the simulation results, the performance of the proposed approach outperforms the existing methods for a large variety of initial networks, both in terms of speed and quality of solutions.
基金Supported by the National Key Research and Development Program of China(No.2020YFC1807904).
文摘With the wide application of automated guided vehicles(AGVs) in large scale outdoor scenarios with complex terrain,the collaborative work of a large number of AGVs becomes the main trend.The effective multi-agent path finding(MAPF) algorithm is urgently needed to ensure the efficiency and realizability of the whole system. The complex terrain of outdoor scenarios is fully considered by using different values of passage cost to quantify different terrain types. The objective of the MAPF problem is to minimize the cost of passage while the Manhattan distance of paths and the time of passage are also evaluated for a comprehensive comparison. The pre-path-planning and real-time-conflict based greedy(PRG) algorithm is proposed as the solution. Simulation is conducted and the proposed PRG algorithm is compared with waiting-stop A^(*) and conflict based search(CBS) algorithms. Results show that the PRG algorithm outperforms the waiting-stop A^(*) algorithm in all three performance indicators,and it is more applicable than the CBS algorithm when a large number of AGVs are working collaboratively with frequent collisions.
文摘The Gannet Optimization Algorithm (GOA) and the Whale Optimization Algorithm (WOA) demonstrate strong performance;however, there remains room for improvement in convergence and practical applications. This study introduces a hybrid optimization algorithm, named the adaptive inertia weight whale optimization algorithm and gannet optimization algorithm (AIWGOA), which addresses challenges in enhancing handwritten documents. The hybrid strategy integrates the strengths of both algorithms, significantly enhancing their capabilities, whereas the adaptive parameter strategy mitigates the need for manual parameter setting. By amalgamating the hybrid strategy and parameter-adaptive approach, the Gannet Optimization Algorithm was refined to yield the AIWGOA. Through a performance analysis of the CEC2013 benchmark, the AIWGOA demonstrates notable advantages across various metrics. Subsequently, an evaluation index was employed to assess the enhanced handwritten documents and images, affirming the superior practical application of the AIWGOA compared with other algorithms.
基金This research was funded by National Natural Science Foundation of China(No.62063006)Guangxi Science and Technology Major Program(No.2022AA05002)+2 种基金Key Laboratory of AI and Information Processing(Hechi University),Education Department of Guangxi Zhuang Autonomous Region(No.2022GXZDSY003)Guangxi Key Laboratory of Spatial Information and Geomatics(Guilin University of Technology)(No.21-238-21-16)Innovation Project of Guangxi Graduate Education(No.YCSW2023352).
文摘A Rapid-exploration Random Tree(RRT)autonomous detection algorithm based on the multi-guide-node deflection strategy and Karto Simultaneous Localization and Mapping(SLAM)algorithm was proposed to solve the problems of low efficiency of detecting frontier boundary points and drift distortion in the process of map building in the traditional RRT algorithm in the autonomous detection strategy of mobile robot.Firstly,an RRT global frontier boundary point detection algorithm based on the multi-guide-node deflection strategy was put forward,which introduces the reference value of guide nodes’deflection probability into the random sampling function so that the global search tree can detect frontier boundary points towards the guide nodes according to random probability.After that,a new autonomous detection algorithm for mobile robots was proposed by combining the graph optimization-based Karto SLAM algorithm with the previously improved RRT algorithm.The algorithm simulation platform based on the Gazebo platform was built.The simulation results show that compared with the traditional RRT algorithm,the proposed RRT autonomous detection algorithm can effectively reduce the time of autonomous detection,plan the length of detection trajectory under the condition of high average detection coverage,and complete the task of autonomous detection mapping more efficiently.Finally,with the help of the ROS-based mobile robot experimental platform,the performance of the proposed algorithm was verified in the real environment of different obstacles.The experimental results show that in the actual environment of simple and complex obstacles,the proposed RRT autonomous detection algorithm was superior to the traditional RRT autonomous detection algorithm in the time of detection,length of detection trajectory,and average coverage,thus improving the efficiency and accuracy of autonomous detection.
文摘The cloud computing technology is utilized for achieving resource utilization of remotebased virtual computer to facilitate the consumers with rapid and accurate massive data services.It utilizes on-demand resource provisioning,but the necessitated constraints of rapid turnaround time,minimal execution cost,high rate of resource utilization and limited makespan transforms the Load Balancing(LB)process-based Task Scheduling(TS)problem into an NP-hard optimization issue.In this paper,Hybrid Prairie Dog and Beluga Whale Optimization Algorithm(HPDBWOA)is propounded for precise mapping of tasks to virtual machines with the due objective of addressing the dynamic nature of cloud environment.This capability of HPDBWOA helps in decreasing the SLA violations and Makespan with optimal resource management.It is modelled as a scheduling strategy which utilizes the merits of PDOA and BWOA for attaining reactive decisions making with respect to the process of assigning the tasks to virtual resources by considering their priorities into account.It addresses the problem of pre-convergence with wellbalanced exploration and exploitation to attain necessitated Quality of Service(QoS)for minimizing the waiting time incurred during TS process.It further balanced exploration and exploitation rates for reducing the makespan during the task allocation with complete awareness of VM state.The results of the proposed HPDBWOA confirmed minimized energy utilization of 32.18% and reduced cost of 28.94% better than approaches used for investigation.The statistical investigation of the proposed HPDBWOA conducted using ANOVA confirmed its efficacy over the benchmarked systems in terms of throughput,system,and response time.
文摘Energy efficiency is the prime concern in Wireless Sensor Networks(WSNs) as maximized energy consumption without essentially limits the energy stability and network lifetime. Clustering is the significant approach essential for minimizing unnecessary transmission energy consumption with sustained network lifetime. This clustering process is identified as the Non-deterministic Polynomial(NP)-hard optimization problems which has the maximized probability of being solved through metaheuristic algorithms.This adoption of hybrid metaheuristic algorithm concentrates on the identification of the optimal or nearoptimal solutions which aids in better energy stability during Cluster Head(CH) selection. In this paper,Hybrid Seagull and Whale Optimization Algorithmbased Dynamic Clustering Protocol(HSWOA-DCP)is proposed with the exploitation benefits of WOA and exploration merits of SEOA to optimal CH selection for maintaining energy stability with prolonged network lifetime. This HSWOA-DCP adopted the modified version of SEagull Optimization Algorithm(SEOA) to handle the problem of premature convergence and computational accuracy which is maximally possible during CH selection. The inclusion of SEOA into WOA improved the global searching capability during the selection of CH and prevents worst fitness nodes from being selected as CH, since the spiral attacking behavior of SEOA is similar to the bubble-net characteristics of WOA. This CH selection integrates the spiral attacking principles of SEOA and contraction surrounding mechanism of WOA for improving computation accuracy to prevent frequent election process. It also included the strategy of levy flight strategy into SEOA for potentially avoiding premature convergence to attain better trade-off between the rate of exploration and exploitation in a more effective manner. The simulation results of the proposed HSWOADCP confirmed better network survivability rate, network residual energy and network overall throughput on par with the competitive CH selection schemes under different number of data transmission rounds.The statistical analysis of the proposed HSWOA-DCP scheme also confirmed its energy stability with respect to ANOVA test.
基金the deputyship for Research&Innovation,Ministry of Education in Saudi Arabia for funding this research work through the Project Number(IFP-2022-34).
文摘In the cloud environment,ensuring a high level of data security is in high demand.Data planning storage optimization is part of the whole security process in the cloud environment.It enables data security by avoiding the risk of data loss and data overlapping.The development of data flow scheduling approaches in the cloud environment taking security parameters into account is insufficient.In our work,we propose a data scheduling model for the cloud environment.Themodel is made up of three parts that together help dispatch user data flow to the appropriate cloudVMs.The first component is the Collector Agent whichmust periodically collect information on the state of the network links.The second one is the monitoring agent which must then analyze,classify,and make a decision on the state of the link and finally transmit this information to the scheduler.The third one is the scheduler who must consider previous information to transfer user data,including fair distribution and reliable paths.It should be noted that each part of the proposedmodel requires the development of its algorithms.In this article,we are interested in the development of data transfer algorithms,including fairness distribution with the consideration of a stable link state.These algorithms are based on the grouping of transmitted files and the iterative method.The proposed algorithms showthe performances to obtain an approximate solution to the studied problem which is an NP-hard(Non-Polynomial solution)problem.The experimental results show that the best algorithm is the half-grouped minimum excluding(HME),with a percentage of 91.3%,an average deviation of 0.042,and an execution time of 0.001 s.
文摘This advanced paper presents a new approach to improving image steganography using the Ant Colony Optimization(ACO)algorithm.Image steganography,a technique of embedding hidden information in digital photographs,should ideally achieve the dual purposes of maximum data hiding and maintenance of the integrity of the cover media so that it is least suspect.The contemporary methods of steganography are at best a compromise between these two.In this paper,we present our approach,entitled Ant Colony Optimization(ACO)-Least Significant Bit(LSB),which attempts to optimize the capacity in steganographic embedding.The approach makes use of a grayscale cover image to hide the confidential data with an additional bit pair per byte,both for integrity verification and the file checksumof the secret data.This approach encodes confidential information into four pairs of bits and embeds it within uncompressed grayscale images.The ACO algorithm uses adaptive exploration to select some pixels,maximizing the capacity of data embedding whileminimizing the degradation of visual quality.Pheromone evaporation is introduced through iterations to avoid stagnation in solution refinement.The levels of pheromone are modified to reinforce successful pixel choices.Experimental results obtained through the ACO-LSB method reveal that it clearly improves image steganography capabilities by providing an increase of up to 30%in the embedding capacity compared with traditional approaches;the average Peak Signal to Noise Ratio(PSNR)is 40.5 dB with a Structural Index Similarity(SSIM)of 0.98.The approach also demonstrates very high resistance to detection,cutting down the rate by 20%.Implemented in MATLAB R2023a,the model was tested against one thousand publicly available grayscale images,thus providing robust evidence of its effectiveness.
文摘In Wireless Sensor Networks(WSNs),Clustering process is widely utilized for increasing the lifespan with sustained energy stability during data transmission.Several clustering protocols were devised for extending network lifetime,but most of them failed in handling the problem of fixed clustering,static rounds,and inadequate Cluster Head(CH)selection criteria which consumes more energy.In this paper,Stochastic Ranking Improved Teaching-Learning and Adaptive Grasshopper Optimization Algorithm(SRITL-AGOA)-based Clustering Scheme for energy stabilization and extending network lifespan.This SRITL-AGOA selected CH depending on the weightage of factors such as node mobility degree,neighbour's density distance to sink,single-hop or multihop communication and Residual Energy(RE)that directly influences the energy consumption of sensor nodes.In specific,Grasshopper Optimization Algorithm(GOA)is improved through tangent-based nonlinear strategy for enhancing the ability of global optimization.On the other hand,stochastic ranking and violation constraint handling strategies are embedded into Teaching-Learning-based Optimization Algorithm(TLOA)for improving its exploitation tendencies.Then,SR and VCH improved TLOA is embedded into the exploitation phase of AGOA for selecting better CH by maintaining better balance amid exploration and exploitation.Simulation results confirmed that the proposed SRITL-AGOA improved throughput by 21.86%,network stability by 18.94%,load balancing by 16.14%with minimized energy depletion by19.21%,compared to the competitive CH selection approaches.
基金supported by the National Research Foundation of Korea(NRF)Grant funded by the Korea government(MSIT)(No.RS-2023-00218176)the Soonchunhyang University Research Fund.
文摘This study proposes a hybridization of two efficient algorithm’s Multi-objective Ant Lion Optimizer Algorithm(MOALO)which is a multi-objective enhanced version of the Ant Lion Optimizer Algorithm(ALO)and the Genetic Algorithm(GA).MOALO version has been employed to address those problems containing many objectives and an archive has been employed for retaining the non-dominated solutions.The uniqueness of the hybrid is that the operators like mutation and crossover of GA are employed in the archive to update the solutions and later those solutions go through the process of MOALO.A first-time hybrid of these algorithms is employed to solve multi-objective problems.The hybrid algorithm overcomes the limitation of ALO of getting caught in the local optimum and the requirement of more computational effort to converge GA.To evaluate the hybridized algorithm’s performance,a set of constrained,unconstrained test problems and engineering design problems were employed and compared with five well-known computational algorithms-MOALO,Multi-objective Crystal Structure Algorithm(MOCryStAl),Multi-objective Particle Swarm Optimization(MOPSO),Multi-objective Multiverse Optimization Algorithm(MOMVO),Multi-objective Salp Swarm Algorithm(MSSA).The outcomes of five performance metrics are statistically analyzed and the most efficient Pareto fronts comparison has been obtained.The proposed hybrid surpasses MOALO based on the results of hypervolume(HV),Spread,and Spacing.So primary objective of developing this hybrid approach has been achieved successfully.The proposed approach demonstrates superior performance on the test functions,showcasing robust convergence and comprehensive coverage that surpasses other existing algorithms.
文摘One of the most dangerous safety hazard in underground coal mines is roof falls during retreat mining.Roof falls may cause life-threatening and non-fatal injuries to miners and impede mining and transportation operations.As a result,a reliable roof fall prediction model is essential to tackle such challenges.Different parameters that substantially impact roof falls are ill-defined and intangible,making this an uncertain and challenging research issue.The National Institute for Occupational Safety and Health assembled a national database of roof performance from 37 coal mines to explore the factors contributing to roof falls.Data acquired for 37 mines is limited due to several restrictions,which increased the likelihood of incompleteness.Fuzzy logic is a technique for coping with ambiguity,incompleteness,and uncertainty.Therefore,In this paper,the fuzzy inference method is presented,which employs a genetic algorithm to create fuzzy rules based on 109 records of roof fall data and pattern search to refine the membership functions of parameters.The performance of the deployed model is evaluated using statistical measures such as the Root-Mean-Square Error,Mean-Absolute-Error,and coefficient of determination(R_(2)).Based on these criteria,the suggested model outperforms the existing models to precisely predict roof fall rates using fewer fuzzy rules.
基金supported by the National Natural Science Foundation of China(Grant No.52374153).
文摘Traditional laboratory tests for measuring rock uniaxial compressive strength(UCS)are tedious and timeconsuming.There is a pressing need for more effective methods to determine rock UCS,especially in deep mining environments under high in-situ stress.Thus,this study aims to develop an advanced model for predicting the UCS of rockmaterial in deepmining environments by combining three boosting-basedmachine learning methods with four optimization algorithms.For this purpose,the Lead-Zinc mine in Southwest China is considered as the case study.Rock density,P-wave velocity,and point load strength index are used as input variables,and UCS is regarded as the output.Subsequently,twelve hybrid predictive models are obtained.Root mean square error(RMSE),mean absolute error(MAE),coefficient of determination(R2),and the proportion of the mean absolute percentage error less than 20%(A-20)are selected as the evaluation metrics.Experimental results showed that the hybridmodel consisting of the extreme gradient boostingmethod and the artificial bee colony algorithm(XGBoost-ABC)achieved satisfactory results on the training dataset and exhibited the best generalization performance on the testing dataset.The values of R2,A-20,RMSE,and MAE on the training dataset are 0.98,1.0,3.11 MPa,and 2.23MPa,respectively.The highest values of R2 and A-20(0.93 and 0.96),and the smallest RMSE and MAE values of 4.78 MPa and 3.76MPa,are observed on the testing dataset.The proposed hybrid model can be considered a reliable and effective method for predicting rock UCS in deep mines.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 62072031), the Applied Basic Research Foundation of Yunnan Province (Grant No. 2019FD071), and the Yunnan Scientific Research Foundation Project (Grant No. 2019J0187).
Abstract: With the development of vehicles towards intelligence and connectivity, vehicular data is diversifying and growing dramatically. A task allocation model and algorithm for heterogeneous Intelligent Connected Vehicle (ICV) applications are proposed for a dispersed computing network composed of heterogeneous task vehicles and Network Computing Points (NCPs). Considering the amount of task data and the idle resources of NCPs, a computing resource scheduling model for NCPs is established. With the heterogeneous task execution delay threshold as a constraint, the optimization problem is formulated as maximizing the utilization of the NCPs' computing resources. The problem is proven to be NP-hard by reduction from the 0-1 knapsack problem. A many-to-many matching algorithm based on resource preferences is proposed. The algorithm first establishes mutual preference lists based on how well the task requirements fit the resources provided by the NCPs. This filters out unschedulable NCPs in the initial stage of matching, reducing the dimension of the solution space. To solve the matching problem between ICVs and NCPs, a new many-to-many matching algorithm is proposed that obtains a unique and stable optimal matching result. The simulation results demonstrate that the proposed scheme improves the resource utilization of NCPs by an average of 9.6% compared with the reference scheme, and the total performance can be improved by up to 15.9%.
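As a rough illustration of the preference-list matching idea, the following is a minimal sketch of a capacity-constrained deferred-acceptance procedure; it is a simplified many-to-one variant with illustrative names, scores, and capacities, not the paper's full many-to-many scheme or utility model.

```python
# Hedged sketch: capacity-constrained deferred-acceptance matching between
# tasks (ICVs) and computing points (NCPs). All inputs are toy values.
def many_to_many_match(task_prefs, ncp_prefs, ncp_capacity):
    """task_prefs: {task: [ncp, ...]} ordered best-first (unschedulable NCPs
    already filtered out); ncp_prefs: {ncp: {task: score}}; higher is better."""
    assigned = {ncp: [] for ncp in ncp_capacity}
    next_choice = {t: 0 for t in task_prefs}
    free = [t for t in task_prefs if task_prefs[t]]

    while free:
        task = free.pop()
        if next_choice[task] >= len(task_prefs[task]):
            continue                                   # task exhausted its list
        ncp = task_prefs[task][next_choice[task]]
        next_choice[task] += 1
        assigned[ncp].append(task)
        if len(assigned[ncp]) > ncp_capacity[ncp]:
            # NCP keeps its most preferred tasks up to capacity, rejects the worst.
            assigned[ncp].sort(key=lambda t: ncp_prefs[ncp][t], reverse=True)
            free.append(assigned[ncp].pop())
    return assigned

# Toy example: three tasks, two NCPs with capacities 1 and 2.
task_prefs = {"t1": ["n1", "n2"], "t2": ["n1", "n2"], "t3": ["n2", "n1"]}
ncp_prefs = {"n1": {"t1": 0.9, "t2": 0.5, "t3": 0.4},
             "n2": {"t1": 0.3, "t2": 0.8, "t3": 0.7}}
print(many_to_many_match(task_prefs, ncp_prefs, {"n1": 1, "n2": 2}))
```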
Abstract: Mathematical physics equations are often used to describe physical phenomena in various fields of science and engineering. One such equation is the Fourier equation, a commonly used and effective tool for evaluating the effectiveness of temperature control measures for mass concrete. One important temperature control measure for mass concrete is the use of cooling water pipes. However, the grid mismatch between large-scale concrete models and small-scale cooling pipe models can waste significant calculation time when the finite element method is used. Moreover, the temperature of the water in the cooling pipe must be calculated iteratively during the thermal transfer process. The substructure method can effectively solve this problem and has been validated by other scholars. Abaqus/Python secondary development gives engineers enough flexibility to combine the substructure method with an iterative algorithm, enabling parametric modeling and calculation of cooling water pipes. This paper proposes such a method, which iterates on the water pipe boundary and establishes a water pipe unit substructure to numerically simulate the temperature field of concrete containing a cooling water pipe. To verify the feasibility and accuracy of the proposed method, two classic numerical examples were analyzed. The results show that the method is well suited to cooling pipe calculations. When the iteration parameter α is set to 0.4, the boundary temperature of the cooling water pipes meets the accuracy requirements after 4-5 iterations, effectively improving computational efficiency. Overall, this approach provides a useful tool for engineers to analyze temperature control measures for mass concrete, such as cooling water pipes, accurately and efficiently using Abaqus/Python secondary development.
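One plausible reading of the iteration parameter α is as an under-relaxation factor on the pipe boundary temperature; the sketch below shows such a loop with a toy stand-in for the Abaqus substructure solve. The stand-in function, tolerance, and starting values are illustrative assumptions, not the paper's scheme or data.

```python
# Hedged sketch: a generic under-relaxation loop for the pipe boundary
# temperature, of the kind the iteration parameter alpha suggests.
# solve_pipe_boundary() replaces the actual Abaqus substructure solve.
def solve_pipe_boundary(t_boundary):
    """Stand-in for one substructure/FE solve returning an updated boundary
    temperature (here a toy fixed-point map whose solution is 20.0 degC)."""
    return 20.0 + 0.5 * (t_boundary - 20.0)

def relax_boundary(t0, alpha=0.4, tol=1e-3, max_iter=50):
    t_old = t0
    for k in range(1, max_iter + 1):
        t_solved = solve_pipe_boundary(t_old)
        t_new = (1.0 - alpha) * t_old + alpha * t_solved   # under-relaxation
        if abs(t_new - t_old) < tol:
            return t_new, k
        t_old = t_new
    return t_old, max_iter

temperature, iterations = relax_boundary(t0=35.0, alpha=0.4)
print(f"converged to {temperature:.3f} degC after {iterations} iterations")
```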
Funding: This project was funded by the Deanship of Scientific Research (DSR) at King Abdulaziz University, Jeddah, under Grant No. IFPIP-1127-611-1443; the authors therefore acknowledge with thanks the DSR's technical and financial support.
Abstract: In the rapidly evolving landscape of today's digital economy, Financial Technology (Fintech) emerges as a transformative force, propelled by the dynamic synergy between Artificial Intelligence (AI) and algorithmic trading. Our in-depth investigation examines the intricacies of merging Multi-Agent Reinforcement Learning (MARL) and Explainable AI (XAI) within Fintech, aiming to refine algorithmic trading strategies. We uncover the nuanced interactions of AI-driven agents as they collaborate and compete within the financial realm, employing sophisticated deep learning techniques to enhance the clarity and adaptability of trading decisions. These AI-infused Fintech platforms harness collective intelligence to uncover trends, mitigate risks, and provide tailored financial guidance, benefiting both individuals and enterprises navigating the digital landscape. Our research holds the potential to open fresh avenues for investment and asset management in the digital age. Additionally, our statistical evaluation yields encouraging results, with metrics such as Accuracy = 0.85, Precision = 0.88, and F1 Score = 0.86, reaffirming the efficacy of our approach within Fintech and emphasizing its reliability.
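For completeness, the quoted classification metrics can be computed as in the short sketch below, assuming scikit-learn and dummy trading-signal labels; the MARL/XAI pipeline itself is not shown.

```python
# Hedged sketch: the classification metrics quoted in the abstract, computed
# with scikit-learn on dummy labels. The label arrays are illustrative only.
from sklearn.metrics import accuracy_score, precision_score, f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]   # e.g. 1 = profitable trade signal
y_pred = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]

print("Accuracy :", accuracy_score(y_true, y_pred))
print("Precision:", precision_score(y_true, y_pred))
print("F1 Score :", f1_score(y_true, y_pred))
```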
Funding: Supported via funding from Prince Sattam bin Abdulaziz University, Project Number PSAU/2023/R/1444.
Abstract: Recent developments in computer vision have presented novel opportunities to tackle complex healthcare issues, particularly in the field of lung disease diagnosis. One promising avenue involves the use of chest X-rays, which are commonly employed in radiology. To fully exploit their potential, researchers have suggested using deep learning methods to construct computer-aided diagnostic systems. However, constructing and compressing these systems presents a significant challenge, as it relies heavily on the expertise of data scientists. To tackle this issue, we propose an automated approach that uses an evolutionary algorithm (EA) to optimize the design and compression of a convolutional neural network (CNN) for X-ray image classification. Our approach accurately classifies radiography images and detects potential chest abnormalities and infections, including COVID-19. Furthermore, our approach incorporates transfer learning: a CNN model pre-trained on a vast dataset of chest X-ray images is fine-tuned for the specific task of detecting COVID-19. This reduces the amount of labeled data required for the task and enhances the overall performance of the model. We have validated our method through a series of experiments against state-of-the-art architectures.
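A minimal sketch of the transfer-learning step is given below, assuming PyTorch/torchvision and an ImageNet-pre-trained ResNet-18 as a stand-in backbone; the EA-designed architecture, the X-ray pre-training, and the dataset loader are not reproduced, and `train_loader` is an assumed placeholder.

```python
# Hedged sketch: fine-tuning a pre-trained CNN for a binary COVID-19 task.
# The backbone is a torchvision ResNet-18 stand-in, not the paper's EA-designed
# model; `train_loader` is assumed to yield (image, label) batches.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():                 # freeze the pre-trained backbone
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)    # new head: COVID-19 vs. normal

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def fine_tune(train_loader, epochs=3):
    """Train only the new classification head on the labeled X-ray batches."""
    model.train()
    for _ in range(epochs):
        for images, labels in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()
```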
Funding: Financially supported by the National Natural Science Foundation of China (No. 52174001), the National Natural Science Foundation of China (No. 52004064), the Hainan Province Science and Technology Special Fund "Research on Real-time Intelligent Sensing Technology for Closed-loop Drilling of Oil and Gas Reservoirs in Deepwater Drilling" (ZDYF2023GXJS012), and the Heilongjiang Provincial Government and Daqing Oilfield's first batch of scientific and technological key projects, "Research on the Construction Technology of Gulong Shale Oil Big Data Analysis System" (DQYT-2022-JS-750).
Abstract: Real-time intelligent lithology identification while drilling is vital to realizing downhole closed-loop drilling. The complex and changeable geological environment encountered while drilling makes lithology identification challenging. This paper addresses the difficulty of extracting feature information, the low precision of thin-layer identification, and the limited applicability of models in intelligent lithology identification. The authors improve the comprehensive performance of the lithology identification model from three aspects: data feature extraction, class balance, and model design. A new real-time intelligent lithology identification model based on a dynamic felling strategy weighted random forest algorithm (DFW-RF) is proposed. According to the feature selection results, gamma ray and 2 MHz phase resistivity are the logging-while-drilling (LWD) parameters that most significantly influence lithology identification. The comprehensive performance of the DFW-RF lithology identification model has been verified through application to three wells in different areas. Compared with the prediction results of five typical lithology identification algorithms, the DFW-RF model achieves a higher identification accuracy rate and F1 score. The model improves the identification accuracy of thin-layer lithology and is effective and feasible in different geological environments. The DFW-RF model plays a truly efficient role in the real-time intelligent identification of lithologic information in closed-loop drilling, has broad applicability, and is worth being widely used in logging interpretation.
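As a baseline stand-in for DFW-RF, the sketch below fits a class-weighted random forest on the two highlighted LWD features using scikit-learn and synthetic data; the dynamic felling strategy weighting itself is not reproduced, and the labels and feature distributions are illustrative assumptions.

```python
# Hedged sketch: a class-weighted random forest on the two LWD features the
# abstract highlights (gamma ray, 2 MHz phase resistivity). Baseline stand-in
# only; the DFW-RF weighting is not reproduced and the data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 600
gamma_ray = rng.normal(80, 25, n)                 # API units (synthetic)
resistivity = rng.lognormal(1.5, 0.6, n)          # ohm-m (synthetic)
X = np.column_stack([gamma_ray, resistivity])
y = (gamma_ray > 95).astype(int)                  # toy "shale vs. sand" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             random_state=0)      # class weighting for imbalance
clf.fit(X_tr, y_tr)
print("F1 score:", f1_score(y_te, clf.predict(X_te)))
```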
Abstract: Background: The nasal alar defect in Asians remains a challenging issue, as do clear classification and algorithmic guidance, despite the numerous surgical techniques described previously. The aim of this study is to propose a surgical algorithm that matches the appropriate surgical procedure to each type of nasal alar defect in Asian patients. Methods: A retrospective case note review was conducted on 32 patients with nasal alar defects who underwent reconstruction between 2008 and 2022. Based on careful analysis and our clinical experience, we proposed a classification system for nasal alar defects and a corresponding reconstructive algorithm. Patient data, including age, sex, diagnosis, surgical options, and complications, were assessed. The extent of surgical scar formation was evaluated using standard photography based on a 4-grade scar scale. Results: Among the 32 patients with nasal alar defects, 20 were male and 12 were female. The predominant cause of traumatic defects in China was industrial injury. The majority of alar defects were type Ⅰ, comprising 18 cases (56.2%), with type ⅠC the most common subtype (n=8, 25%); there were 5 cases (15.6%) of type Ⅱ defects, 7 (21.9%) of type Ⅲ defects, and 2 (6.3%) of type Ⅳ defects. The most common surgical option was the auricular composite graft (n=8, 25%), followed by the bilobed flap (n=6, 18.8%), the free auricular composite flap (n=4, 12.5%), and primary closure (n=3, 9.4%). Satisfactory improvements were observed postoperatively. Conclusion: The factors contributing to the classification were analyzed and defined, providing a framework for the proposed classification system. The reconstructive algorithm offers surgeons appropriate procedures for treating nasal alar defects in Asians.
Funding: Funded by the Natural Science Foundation of Chongqing (Grant No. CSTB2022NSCQ-MSX0594) and the Humanities and Social Sciences Research Project of the Ministry of Education (Grant No. 16YJCZH061).
Abstract: Boosting algorithms have been widely used in landslide susceptibility mapping (LSM) studies. However, these algorithms employ distinct computational strategies and hyperparameters, making it difficult to propose an ideal LSM model. To investigate the impact of different boosting algorithms and hyperparameter optimization algorithms on LSM, this study constructed a geospatial database comprising 12 conditioning factors, such as elevation, stratum, and annual average rainfall. The XGBoost (XGB), LightGBM (LGBM), and CatBoost (CB) algorithms were employed to construct the LSM model, and the Bayesian optimization (BO), particle swarm optimization (PSO), and Hyperband optimization (HO) algorithms were applied to optimize it. The boosting algorithms exhibited varying performance, with CB demonstrating the highest precision, followed by LGBM, and XGB showing the lowest precision. The hyperparameter optimization algorithms also performed differently, with HO outperforming PSO and BO performing worst. The HO-CB model achieved the highest precision, with an accuracy of 0.764, an F1-score of 0.777, an area under the curve (AUC) value of 0.837 on the training set, and an AUC value of 0.863 on the test set. The model was interpreted using SHapley Additive exPlanations (SHAP), revealing that slope, curvature, topographic wetness index (TWI), degree of relief, and elevation significantly influenced landslides in the study area. This study examines the use of various boosting and hyperparameter optimization algorithms in Wanzhou District and proposes the HO-CB-SHAP framework as an effective approach for accurately forecasting landslide disasters and interpreting LSM models, offering a scientific reference for LSM and disaster prevention research. However, limitations remain concerning the generalizability of the model and the data processing, which require further exploration in subsequent studies.
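A minimal sketch of the CB + SHAP portion of this framework is shown below, assuming the catboost and shap packages; the Hyperband search and the real 12-factor geospatial database are not reproduced, and the factor names, labels, and hyperparameters are illustrative assumptions.

```python
# Hedged sketch: a CatBoost LSM classifier interpreted with SHAP. The Hyperband
# tuning step is omitted; the synthetic factors and labels are illustrative only.
import numpy as np
import pandas as pd
import shap
from catboost import CatBoostClassifier

rng = np.random.default_rng(7)
factors = ["elevation", "slope", "curvature", "TWI", "relief", "rainfall"]
X = pd.DataFrame(rng.normal(size=(500, len(factors))), columns=factors)
y = (X["slope"] + 0.5 * X["TWI"] + rng.normal(0, 0.5, 500) > 0).astype(int)

model = CatBoostClassifier(iterations=300, depth=4, learning_rate=0.05,
                           verbose=False)
model.fit(X, y)

explainer = shap.TreeExplainer(model)            # tree-based SHAP attribution
shap_values = explainer.shap_values(X)
mean_abs = np.abs(shap_values).mean(axis=0)      # global importance per factor
print(dict(zip(factors, np.round(mean_abs, 3))))
```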