Maximum power point tracking (MPPT) is one of the topics that has been studied extensively in recent years. Traditional or soft computing methods are used for MPPT. Since soft computing approaches are more effective than traditional approaches, studies on MPPT have shifted in this direction. This study aims to compare the performance of seven meta-heuristic training algorithms for neuro-fuzzy training in MPPT. The meta-heuristic training algorithms used are particle swarm optimization (PSO), harmony search (HS), cuckoo search (CS), the artificial bee colony (ABC) algorithm, the bee algorithm (BA), differential evolution (DE), and the flower pollination algorithm (FPA). The antecedent and conclusion parameters of the neuro-fuzzy system are determined by these algorithms. The data of a 250 W photovoltaic (PV) panel are used in the applications. For effective MPPT, different neuro-fuzzy structures, membership functions, and control parameter values are evaluated in detail. The training algorithms are compared in terms of solution quality and convergence speed, and their strengths and weaknesses are revealed. It is seen that the type and number of membership functions, the colony size, and the number of generations affect the solution quality and convergence speed of the training algorithms. As a result, the CS and ABC algorithms are observed to be more effective than the other algorithms in terms of solution quality and convergence for this problem.
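The abstract does not give the per-algorithm update rules. As one hedged illustration, a minimal particle swarm optimization (PSO) loop for tuning a parameter vector against a training loss might look like the sketch below; the quadratic loss and the `target` vector are toy stand-ins for the neuro-fuzzy training error, not the paper's actual model.

```python
import numpy as np

def pso(loss, dim, n_particles=30, iters=200, seed=0):
    """Minimal global-best PSO: particles track personal bests and the swarm best."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))   # positions = candidate parameter vectors
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([loss(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5                    # inertia and acceleration coefficients
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        f = np.array([loss(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy stand-in for a neuro-fuzzy training loss: squared distance to a known target.
target = np.array([1.0, -2.0, 0.5])
best, best_f = pso(lambda p: float(np.sum((p - target) ** 2)), dim=3)
```

In the paper's setting, the position vector would hold the antecedent (membership function) and conclusion parameters, and the loss would be the tracking error on the PV data.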
The non-invasive evaluation of the heart through electrocardiography (ECG) has played a key role in detecting heart disease. The analysis of ECG signals requires years of learning and experience to interpret and extract useful information from them. Thus, a computerized system is needed to classify ECG signals effectively and with more accurate results. Abnormal heart rhythms are called arrhythmias and can cause sudden cardiac death. In this work, a Computerized Abnormal Heart Rhythms Detection (CAHRD) system is developed using ECG signals. It consists of four stages: preprocessing, feature extraction, feature optimization, and classification. First, the Pan-Tompkins algorithm is employed in the preprocessing stage to detect the envelope of the Q, R and S waves. It uses a recursive filter to eliminate muscle noise, T-wave interference, and baseline wander. As the analysis of the ECG signal in the spatial domain does not provide a complete description of the signal, the feature extraction stage uses frequency contents obtained from multiple wavelet filters (bi-orthogonal, Symlet, and Daubechies) at different resolution levels. Then, Black Widow Optimization (BWO) is applied to optimize the hybrid wavelet features in the feature optimization stage. Finally, a kernel-based Support Vector Machine (SVM) is employed to classify heartbeats into five classes. In the SVM, Radial Basis Function (RBF), polynomial, and linear kernels are used. A total of about 15,000 ECG signals are obtained from the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database for performance evaluation of the proposed CAHRD system. Results show that the proposed CAHRD system is a powerful tool for ECG analysis. It correctly classifies five classes of heartbeats with 99.91% accuracy using an RBF kernel with 2nd-level wavelet coefficients. The CAHRD system achieves an improvement of about 6% over random projections with the ensemble SVM approach and about 2% over morphological and ECG-segment-based features with the RBF classifier.
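To make the multi-resolution feature extraction concrete, the sketch below computes a two-level discrete wavelet decomposition and concatenates the detail coefficients. It uses the Haar filter pair only because it is short enough to write inline; the paper's actual filters are bi-orthogonal, Symlet, and Daubechies wavelets.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficients; signal length must be even."""
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def wavelet_features(signal, levels=2):
    """Concatenate detail coefficients from successive decomposition levels,
    as a sketch of multi-resolution features for a heartbeat segment."""
    feats = []
    approx = np.asarray(signal, dtype=float)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats.append(detail)
    return np.concatenate(feats)
```

For a locally constant signal the detail coefficients vanish, so the features respond to the transient morphology (QRS complexes) rather than the baseline level.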
Damage identification of offshore floating wind turbines from vibration/dynamic signals is an important and relatively new research field in Structural Health Monitoring (SHM). In this paper, a new damage identification method is proposed based on meta-heuristic algorithms using the dynamic response of a TLP (Tension-Leg Platform) floating wind turbine structure. Genetic Algorithms (GA), the Artificial Immune System (AIS), Particle Swarm Optimization (PSO), and the Artificial Bee Colony (ABC) algorithm are chosen for minimizing the objective function, defined specifically for the damage identification purpose. In addition to studying the capability of the mentioned algorithms in correctly identifying the damage, the effect of the response type on the identification results is studied. The results of the proposed damage identification are also investigated considering possible uncertainties of the structure. Finally, to evaluate the proposed method in realistic conditions, a 1/100-scale experimental setup of a TLP Floating Wind Turbine (TLPFWT) is built in the laboratory and the proposed damage identification method is applied to the scaled turbine.
Blasting is well known as an effective method for fragmenting or moving rock in open-pit mines. To evaluate the quality of blasting, the size distribution of the fragmented rock is used as a critical criterion in blasting operations. A high percentage of oversized rocks generated by blasting operations can lead to economic and environmental damage. Therefore, this study proposed four novel intelligent models to predict the size of rock distribution in mine blasting in order to optimize blasting parameters as well as the efficiency of blasting operations in open mines. Accordingly, a nature-inspired algorithm (the firefly algorithm, FFA) and different machine learning algorithms (gradient boosting machine (GBM), support vector machine (SVM), Gaussian process (GP), and artificial neural network (ANN)) were combined for this aim, abbreviated as FFA-GBM, FFA-SVM, FFA-GP, and FFA-ANN, respectively. Subsequently, the predicted results from the abovementioned models were compared with each other using three statistical indicators (mean absolute error, root-mean-squared error, and correlation coefficient) and the color intensity method. For developing and simulating the size of rock in blasting operations, 136 blasting events with their images were collected and analyzed by the Split-Desktop software. Of these, 111 events were randomly selected for the development and optimization of the models, and the remaining 25 blasting events were applied to confirm the accuracy of the proposed models. Herein, blast design parameters were regarded as input variables to predict the size of rock in blasting operations. Finally, the obtained results revealed that the FFA is a robust optimization algorithm for estimating rock fragmentation in bench blasting. Among the models developed in this study, FFA-GBM provided the highest accuracy in predicting the size of fragmented rocks. The other techniques (FFA-SVM, FFA-GP, and FFA-ANN) yielded lower computational stability and efficiency. Hence, the FFA-GBM model can be used as a powerful and precise soft computing tool applicable to practical engineering cases aiming to improve the quality of blasting and rock fragmentation.
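The abstract does not detail the firefly search itself. As a hedged illustration, a minimal standard-form firefly algorithm (FFA) for continuous minimization is sketched below; the sphere function stands in for the hyper-parameter tuning objective that FFA would optimize inside the hybrid models.

```python
import numpy as np

def firefly(loss, dim, n=25, iters=150, seed=1):
    """Minimal firefly algorithm: dimmer fireflies move toward brighter
    (lower-loss) ones with distance-decaying attractiveness, plus a
    damped random walk."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim))
    f = np.array([loss(p) for p in x])
    beta0, gamma, alpha = 1.0, 0.5, 0.2
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if f[j] < f[i]:                       # j is brighter: i moves toward j
                    r2 = np.sum((x[i] - x[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)  # attractiveness decays with distance
                    x[i] += beta * (x[j] - x[i]) + alpha * (rng.random(dim) - 0.5)
                    f[i] = loss(x[i])
        alpha *= 0.97  # damp the random walk over time
    best = f.argmin()
    return x[best], f[best]

best, best_f = firefly(lambda p: float(np.sum(p ** 2)), dim=2)
```

In an FFA-GBM-style hybrid, `loss` would be the cross-validated prediction error of the machine learning model as a function of its hyper-parameters.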
Cloud computing infrastructure has been evolving as a cost-effective platform for providing computational resources in the form of high-performance computing as a service (HPCaaS) to users for executing HPC applications. However, the broader use of Cloud services and the rapid increase in the size and capacity of Cloud data centers bring a remarkable rise in energy consumption, leading to a significant rise in system provider expenses and carbon emissions. Besides this, users have become more demanding in terms of Quality-of-Service (QoS) expectations such as execution time, budget cost, utilization, and makespan. This situation calls for the design of a task scheduling policy which ensures efficient task sequencing and allocation of computing resources to tasks to meet the trade-off between QoS promises and service provider requirements. Moreover, task scheduling in the Cloud is a well-known NP-hard problem. Motivated by these concerns, this paper introduces and implements a QoS-aware energy-efficient scheduling policy, called CSPSO, for scheduling tasks in Cloud systems to reduce the energy consumption of cloud resources and minimize the makespan of the workload. The proposed multi-objective CSPSO policy hybridizes the search qualities of two robust metaheuristics, cuckoo search (CS) and particle swarm optimization (PSO), to overcome the slow convergence and lack of diversity of the standard CS algorithm. A fitness-aware resource allocation (FARA) heuristic was developed and used by the proposed policy to allocate resources to tasks efficiently. A velocity update mechanism for cuckoo individuals is designed and incorporated in the proposed CSPSO policy. Further, the proposed scheduling policy has been implemented in the CloudSim simulator and tested with real supercomputing workload traces. The comparative analysis validated that the proposed scheduling policy produces efficient schedules with better performance than other well-known heuristic and meta-heuristic scheduling policies.
One of the most common kinds of cancer is breast cancer. Its early detection may help lower overall mortality rates. In this paper, we propose a novel approach for detecting and classifying breast cancer regions in thermal images. The proposed approach starts with preprocessing the input images and segmenting the significant regions of interest. In addition, to properly train the machine learning models, data augmentation is applied to increase the number of segmented regions using various scaling ratios. To extract the relevant features from the breast cancer cases, a set of deep neural networks (VGGNet, ResNet-50, AlexNet, and GoogLeNet) is employed. The resulting set of features is processed using the binary dipper throated algorithm to select the most effective features that can realize high classification accuracy. The selected features are used to train a neural network to finally classify the thermal images of breast cancer. To achieve accurate classification, the parameters of the employed neural network are optimized using the continuous dipper throated optimization algorithm. Experimental results show the effectiveness of the proposed approach in classifying breast cancer cases compared to other recent approaches in the literature. Moreover, several experiments were conducted to compare the performance of the proposed approach with the other approaches, and their results emphasized its superiority.
This paper presents a new bi-criteria mixed-integer programming model for scheduling cells and pieces within each cell in a cellular manufacturing system. The objective of this model is to minimize the makespan and intercell movements simultaneously, while considering sequence-dependent cell setup times. In cellular manufacturing systems design and planning, three main steps must be considered, namely cell formation (i.e., piece families and machine grouping), inter- and intra-cell layouts, and scheduling. Because the cellular manufacturing scheduling problem is NP-hard, a genetic algorithm, an efficient meta-heuristic method, is proposed to solve it. Finally, a number of test problems are solved to show the efficiency of the proposed genetic algorithm, and the computational results are compared with those obtained by an optimization tool.
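The paper's full bi-criteria model is not reproduced in the abstract. As a hedged, reduced sketch, the tiny genetic algorithm below sequences jobs on a single machine to minimize total sequence-dependent setup time only (one of the two criteria); the crossover and mutation operators are standard permutation operators, not necessarily the paper's.

```python
import random

def total_setup(seq, setup):
    """Sum of sequence-dependent setup times along a processing order."""
    return sum(setup[a][b] for a, b in zip(seq, seq[1:]))

def ga_sequence(setup, pop_size=40, gens=120, seed=3):
    """Tiny GA over permutations: elitism, order-style crossover, swap mutation."""
    rng = random.Random(seed)
    n = len(setup)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda s: total_setup(s, setup))
        next_pop = pop[:10]                        # elitism: carry the 10 best forward
        while len(next_pop) < pop_size:
            p1, p2 = rng.sample(pop[:20], 2)       # tournament-free truncation selection
            cut = rng.randrange(1, n)              # order crossover: prefix of p1,
            child = p1[:cut] + [g for g in p2 if g not in p1[:cut]]  # rest in p2's order
            i, j = rng.sample(range(n), 2)         # swap mutation
            child[i], child[j] = child[j], child[i]
            next_pop.append(child)
        pop = next_pop
    best = min(pop, key=lambda s: total_setup(s, setup))
    return best, total_setup(best, setup)
```

Extending this toward the paper's model would mean adding intercell-movement terms to the fitness and encoding the cell sequence alongside the piece sequence.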
Cloud computing has emerged as a new style of computing in distributed environments. Efficient and dependable workflow scheduling is crucial for achieving high performance and integrating with enterprise systems. As an effective security services aggregation methodology, Trust Workflow Technology (TWT) has been used to construct composite services. However, in the cloud environment, the existing closed network services are maintained and operated by third-party organizations or enterprises. Therefore, service-oriented trust strategies must be considered in workflow scheduling. TWFS-related algorithms consist of trust policies and strategies to overcome the threats to the application with heuristic workflow scheduling. As the main contribution of this work, trust-based meta-heuristic workflow scheduling (TMWS) is proposed. The TMWS algorithm improves the efficiency and reliability of operation in the cloud system, and the results show that the TMWS approach is effective and feasible.
Small parasitic hemipteran insects known as bedbugs (Cimicidae) feed on the blood of warm-blooded mammals. The most famous member of this family is Cimex lectularius, the common bedbug. The current paper proposes a novel swarm intelligence optimization algorithm called the Bedbug Meta-Heuristic Algorithm (BMHA). The primary inspiration for the bedbug algorithm comes from the static and dynamic swarming behaviors of bedbugs in nature. The two main stages of optimization algorithms, exploration and exploitation, are designed by modeling bedbug social interaction in the search for food. The proposed algorithm is benchmarked qualitatively and quantitatively using many test functions, including CEC2019. The results of evaluating BMHA prove that this algorithm can improve the initial random population for a given optimization problem to converge towards the global optimum and provide highly competitive results compared to other well-known optimization algorithms. The results also prove the new algorithm's performance in solving real optimization problems in unknown search spaces. To demonstrate this, the proposed algorithm has been used to select features of fake news in a semi-supervised manner, and the results show its good performance in solving such problems.
The uncertainty inherent in power load forecasts represents a major factor in the mismatches between supply and demand in renewables-rich electricity networks, which consequently increases energy bills and curtailed generation. As the transition to a power grid founded on the so-called grid-of-grids becomes more evident, so does the need for distributed control algorithms capable of handling computationally challenging problems in the energy sector. In this light, the consensus-based distributed algorithm has recently been shown to provide an effective platform for solving the complex energy management problem in microgrids. More specifically, in a microgrid context, the consensus-based distributed algorithm requires reliable information exchange with customers to achieve convergence. However, packet losses remain an important issue, which can potentially result in the failure of the overall system. In this setting, this paper introduces a novel method to effectively characterize such packet losses during information exchange between the customers and the microgrid operator, whilst solving the microgrid scheduling optimization problem for a multi-agent-based microgrid. More specifically, the proposed framework leverages the virulence optimization algorithm and the earthworm optimization algorithm to optimally shift energy consumption during peak periods to lower-priced off-peak hours. The effectiveness of the proposed method in minimizing the overall active power mismatches in the presence of packet losses has also been demonstrated by benchmarking the results against the business-as-usual iterative scheduling algorithm. Also, the robustness of the overall meta-heuristic- and multi-agent-based method in producing optimal results is confirmed by comparing the results obtained with several well-established meta-heuristic optimization algorithms, including binary particle swarm optimization, the genetic algorithm, and cuckoo search optimization.
While solving unimodal function problems, conventional meta-heuristic algorithms often suffer from low accuracy and slow convergence. Therefore, in this paper, a novel meta-heuristic optimization algorithm named proton-electron swarm (PES) is proposed based on physical rules. The algorithm simulates the physical phenomenon of like charges repelling each other while opposite charges attract, as with protons and electrons, and establishes a mathematical model to realize the optimization process. By balancing global exploration and local exploitation, the algorithm achieves high accuracy and avoids falling into local optima when solving the target problem. To evaluate its effectiveness, 23 classical benchmark functions were selected for comparative experiments. Experimental results show that, compared with the contrast algorithms, the proposed algorithm not only obtains higher accuracy and convergence speed on unimodal function problems, but also maintains strong optimization ability on multimodal function problems.
Optimized road maintenance planning seeks solutions that minimize the life-cycle cost of a road network and concurrently maximize pavement condition. Aiming at proposing an optimal set of road maintenance solutions, robust meta-heuristic algorithms are used in this research. Two main optimization techniques are applied: single-objective and multi-objective optimization. Genetic algorithms (GA), particle swarm optimization (PSO), and a combination of the two (GAPSO) are used as single-objective techniques, while the non-dominated sorting genetic algorithm II (NSGAII) and multi-objective particle swarm optimization (MOPSO), which are suitable for solving computationally complex large-size optimization problems, are applied and compared as multi-objective techniques. A real case study from the rural transportation network of Iran is employed to illustrate the sufficiency of the optimal algorithm. The optimization model is formulated so that a cost-effective maintenance strategy is reached while preserving the performance level of the road network at a desirable level; the objective functions are therefore pavement performance maximization and maintenance cost minimization. It is concluded that the multi-objective algorithms, NSGAII and MOPSO, performed better than the single-objective algorithms due to their capability to balance both objectives, and that between the multi-objective algorithms, NSGAII provides the optimum solution for road maintenance planning.
The distinction and precise identification of tumor nodules are crucial for timely lung cancer diagnosis and planning of intervention. This research work addresses the major issues pertaining to the field of medical image processing while focusing on lung cancer Computed Tomography (CT) images. In this context, the paper proposes an improved lung cancer segmentation technique based on the strengths of nature-inspired approaches. The better resolution of CT is exploited to distinguish healthy subjects from those who have lung cancer. In this process, the visual challenges of K-means are addressed by integrating four nature-inspired swarm intelligence techniques. The techniques experimented with in this paper are K-means with Artificial Bee Colony (ABC), K-means with Cuckoo Search Algorithm (CSA), K-means with Particle Swarm Optimization (PSO), and K-means with Firefly Algorithm (FFA). Testing and evaluation are performed on the Early Lung Cancer Action Program (ELCAP) database. The simulation analysis is performed on lung cancer image sets against the metrics precision, sensitivity, specificity, f-measure, accuracy, Matthews Correlation Coefficient (MCC), Jaccard, and Dice. The detailed evaluation shows that K-means with the Cuckoo Search Algorithm (CSA) significantly improved the quality of lung cancer segmentation in comparison to the other optimization approaches utilized for lung cancer images. The results exhibit that the proposed approach (K-means with CSA) achieves precision, sensitivity, and F-measure of 0.942, 0.964, and 0.953, respectively, and an average accuracy of 93%. The experimental results prove that K-means with ABC, K-means with PSO, K-means with FFA, and K-means with CSA achieved improvements of 10.8%, 13.38%, 13.93%, and 15.7%, respectively, in the accuracy measure compared to K-means segmentation for lung cancer images. Further, it is highlighted that the proposed K-means with CSA achieved a significant improvement in accuracy and hence can be utilized by researchers for improved segmentation of medical image datasets for identifying the targeted region of interest.
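The baseline that the four swarm hybrids improve on is plain K-means. The sketch below is standard Lloyd's iteration in numpy with random centroid initialization, which is the step the swarm methods (ABC, CSA, PSO, FFA) would replace with an optimized initialization; the two-blob data is a synthetic stand-in for image pixel features.

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Plain Lloyd's K-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its cluster."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for c in range(k):
            if np.any(labels == c):               # guard against empty clusters
                centers[c] = points[labels == c].mean(axis=0)
    return labels, centers

# Two well-separated blobs: recovered centers should sit near (0,0) and (10,10).
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(10, 0.5, (50, 2))])
labels, centers = kmeans(pts, k=2)
```

Because Lloyd's iteration only converges to a local optimum of the within-cluster variance, the quality depends on the initial centroids, which is exactly where a swarm optimizer can help.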
Software needs modifications and requires revisions regularly. Owing to these revisions, retesting the software becomes essential to ensure that the enhancements made have not affected its bug-free functioning. The time and cost incurred in this process need to be reduced by test case selection and prioritization. Many nature-inspired techniques have been applied in this area. African Buffalo Optimization is one such approach, applied here to regression test selection and prioritization. This paper explains and proves the applicability of the African Buffalo Optimization approach to test case selection and prioritization. The proposed algorithm converges in polynomial time (O(n^(2))). The empirical evaluation of African Buffalo Optimization for test case prioritization is performed on a sample data set with multiple iterations. A 62.5% drop in size and a 48.57% drop in the runtime of the original test suite were recorded. The obtained results are compared with Ant Colony Optimization. The comparative analysis indicates that African Buffalo Optimization and Ant Colony Optimization exhibit similar fault detection capabilities (80%), and both reduce the overall execution time and size of the resultant test suite. The results and analysis hence advocate and encourage the use of African Buffalo Optimization for test case selection and prioritization.
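The abstract reports fault-detection rates but does not name its evaluation metric. The usual yardstick for comparing prioritized test orders is APFD (Average Percentage of Faults Detected), shown below as an illustrative sketch; the test names and fault sets are hypothetical.

```python
def apfd(order, faults_detected_by):
    """APFD = 1 - (sum of first-detection positions) / (n * m) + 1 / (2n),
    where n is the number of test cases and m the number of faults.
    order: test cases in prioritized execution order.
    faults_detected_by: maps each test case to the set of faults it exposes."""
    all_faults = set().union(*faults_detected_by.values())
    n, m = len(order), len(all_faults)
    first_pos = {}
    for pos, tc in enumerate(order, start=1):
        for fault in faults_detected_by.get(tc, ()):
            first_pos.setdefault(fault, pos)      # record earliest detection only
    return 1 - sum(first_pos[f] for f in all_faults) / (n * m) + 1 / (2 * n)
```

A higher APFD means faults are exposed earlier in the run; an order that front-loads the fault-revealing tests scores higher than one that defers them.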
Many complex optimization problems in the real world can easily fall into local optimality and fail to find the optimal solution, so new techniques and methods are needed to solve such challenges. Metaheuristic algorithms have received a lot of attention in recent years because of their efficient performance and simple structure. The Sine Cosine Algorithm (SCA) is a recent metaheuristic algorithm based on the two trigonometric functions sine and cosine. However, like other metaheuristic algorithms, SCA has slow convergence and may stagnate in sub-optimal regions. In this study, an enhanced version of SCA named RDSCA is suggested that depends on two techniques: random spare/replacement and a double adaptive weight. The first technique is employed to speed up convergence, whereas the second enhances exploratory search capability. To evaluate RDSCA, 30 functions from CEC 2017 and 4 real-world engineering problems are used. Moreover, the nonparametric Wilcoxon signed-rank test is carried out at the 5% significance level to evaluate the significance of the differences between RDSCA and 5 other variants of SCA. The results show that RDSCA is competitive with other metaheuristic algorithms.
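For readers unfamiliar with the baseline, the standard SCA position update moves each agent around the best-so-far solution P using sine/cosine terms whose amplitude decays linearly over the run, shifting the search from exploration to exploitation. The sketch below implements that standard rule (not the RDSCA enhancements) on a toy sphere objective.

```python
import numpy as np

def sca(loss, dim, n=30, iters=300, a=2.0, seed=2):
    """Minimal standard Sine Cosine Algorithm for continuous minimization."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim))
    f = np.array([loss(p) for p in x])
    best = x[f.argmin()].copy()
    best_f = f.min()
    for t in range(iters):
        r1 = a - t * (a / iters)                  # amplitude decays linearly from a to 0
        r2 = rng.uniform(0, 2 * np.pi, (n, dim))  # oscillation phase
        r3 = rng.uniform(0, 2, (n, dim))          # random weight on the best solution
        r4 = rng.random((n, dim))                 # per-dimension sine/cosine switch
        step = np.where(r4 < 0.5,
                        r1 * np.sin(r2) * np.abs(r3 * best - x),
                        r1 * np.cos(r2) * np.abs(r3 * best - x))
        x = x + step
        f = np.array([loss(p) for p in x])
        if f.min() < best_f:
            best, best_f = x[f.argmin()].copy(), f.min()
    return best, best_f

best, best_f = sca(lambda p: float(np.sum(p ** 2)), dim=2)
```

RDSCA's contributions would slot into this loop: random spare/replacement alters how stagnant agents are refreshed, and the double adaptive weight modifies the step scaling beyond the single linear r1 schedule.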
In order to overcome the deficiencies of current methods for predicting the productivity of shale gas horizontal wells after fracturing, a new approach is proposed in this study. The new model stems from the combination of several techniques, namely artificial neural networks (ANN), particle swarm optimization (PSO), the Imperialist Competitive Algorithm (ICA), and Ant Colony Optimization (ACO). These are implemented using the geological and engineering parameters collected from 317 wells. The results show that the optimum PSO-ANN model has high accuracy, obtaining an R2 of 0.847 on the testing set. The partial dependence plots (PDP) indicate that liquid consumption intensity and the proportion of quartz sand are the two most sensitive factors affecting the model's performance.
Rapid population growth poses a crucial problem for the early detection of diseases in medical research. Among all cancers, breast cancer is considered the second most severe. Consequently, an exponential rise in deaths caused by breast cancer is expected due to rapid population growth and the lack of resources required for performing medical diagnoses. Utilizing recent advances in machine learning could help medical staff diagnose diseases, as these methods offer effective, reliable, and rapid responses, which could help decrease the risk of death. In this paper, we propose a new algorithm for feature selection based on a hybrid of two powerful, recently emerged optimizers, namely the guided whale and dipper throated optimizers. The proposed algorithm is evaluated using four publicly available breast cancer datasets. The evaluation results show the effectiveness of the proposed approach from the accuracy and speed perspectives. To prove the superiority of the proposed algorithm, a set of competing feature selection algorithms was incorporated into the conducted experiments. In addition, a group of statistical analyses was conducted to emphasize the superiority and stability of the proposed algorithm. The best-achieved average breast cancer prediction accuracy based on the proposed algorithm is 99.453%, achieved in an average time of 3.6725 s, the best result among all the competing approaches utilized in the experiments.
Autism Spectrum Disorder (ASD) is a complicated neurodevelopmental disorder that is often identified in toddlers. Microarray data is used as a diagnostic tool to identify the genetics of the disorder. However, microarray data is large and high-volume, and consequently suffers from the curse of dimensionality. In microarray data, the sample size and the variance of the gene expression lead to overfitting and misclassification. Identifying the autism gene (feature) subset from microarray data is an important and challenging research area that has to be efficiently addressed to improve gene feature selection and classification. To overcome these challenges, a novel Intelligent Hybrid Ensemble Gene Selection (IHEGS) model is proposed in this paper. The proposed model integrates the intelligence of different feature selection techniques over data partitions. In this model, the initial gene selection is carried out by data perturbation, and the final autism gene subset is obtained by functional perturbation, which reduces the dimensionality problem in microarray data. The functional perturbation module employs three meta-heuristic swarm intelligence-based techniques for gene selection. The obtained gene subset is validated by a Deep Neural Network (DNN) model. The proposed model is implemented in Python with six National Center for Biotechnology Information (NCBI) gene expression datasets. A comparative study with other existing state-of-the-art systems shows that the proposed model provides stable results in terms of feature selection and classification accuracy.
文摘It is one of the topics that have been studied extensively on maximum power point tracking(MPPT)recently.Traditional or soft computing methods are used for MPPT.Since soft computing approaches are more effective than traditional approaches,studies on MPPT have shifted in this direction.This study aims comparison of performance of seven meta-heuristic training algorithms in the neuro-fuzzy training for MPPT.The meta-heuristic training algorithms used are particle swarm optimization(PSO),harmony search(HS),cuckoo search(CS),artificial bee colony(ABC)algorithm,bee algorithm(BA),differential evolution(DE)and flower pollination algorithm(FPA).The antecedent and conclusion parameters of neuro-fuzzy are determined by these algorithms.The data of a 250 W photovoltaic(PV)is used in the applications.For effective MPPT,different neuro-fuzzy structures,different membership functions and different control parameter values are evaluated in detail.Related training algorithms are compared in terms of solution quality and convergence speed.The strengths and weaknesses of these algorithms are revealed.It is seen that the type and number of membership function,colony size,number of generations affect the solution quality and convergence speed of the training algorithms.As a result,it has been observed that CS and ABC algorithm are more effective than other algorithms in terms of solution quality and convergence in solving the related problem.
Abstract: The non-invasive evaluation of the heart through electrocardiography (ECG) has played a key role in detecting heart disease. Analyzing ECG signals requires years of learning and experience to interpret and extract useful information, so a computerized system is needed to classify ECG signals effectively and accurately. Abnormal heart rhythms, called arrhythmias, can cause sudden cardiac death. In this work, a Computerized Abnormal Heart Rhythms Detection (CAHRD) system is developed using ECG signals. It consists of four stages: preprocessing, feature extraction, feature optimization, and classification. First, the Pan–Tompkins algorithm is employed in the preprocessing stage to detect the envelope of the Q, R, and S waves; it uses a recursive filter to eliminate muscle noise, T-wave interference, and baseline wander. Since analysis of the ECG signal in the spatial domain does not provide a complete description of the signal, the feature extraction stage uses frequency content obtained from multiple wavelet filters (biorthogonal, Symlet, and Daubechies) at different resolution levels. Then, Black Widow Optimization (BWO) is applied to optimize the hybrid wavelet features in the feature optimization stage. Finally, a kernel-based support vector machine (SVM) is employed to classify heartbeats into five classes, using radial basis function (RBF), polynomial, and linear kernels. A total of about 15,000 ECG signals from the Massachusetts Institute of Technology–Beth Israel Hospital (MIT-BIH) arrhythmia database are used to evaluate the proposed CAHRD system. Results show that the system is a powerful tool for ECG analysis: it correctly classifies the five heartbeat classes with 99.91% accuracy using an RBF kernel with second-level wavelet coefficients. The CAHRD system achieves an improvement of about 6% over random projections with the ensemble SVM approach and about 2% over morphological and ECG-segment-based features with the RBF classifier.
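The preprocessing stage above follows the classic Pan–Tompkins chain: after filtering, the signal is differentiated, squared, and passed through a moving-window integrator so that QRS complexes stand out as energy envelopes. A simplified pure-Python sketch of those three steps (omitting the band-pass filter, with a toy impulse standing in for an R peak) might look like:

```python
def five_point_derivative(sig):
    # Emphasizes the steep slopes of the QRS complex.
    return [(2 * sig[i + 2] + sig[i + 1] - sig[i - 1] - 2 * sig[i - 2]) / 8.0
            for i in range(2, len(sig) - 2)]

def moving_window_integral(sig, width):
    # Smooths the squared slopes into one energy envelope per beat.
    return [sum(sig[max(0, i - width + 1):i + 1]) / width for i in range(len(sig))]

# Toy trace: flat baseline with a single sharp "R peak" at sample 25.
ecg = [0.0] * 50
ecg[25] = 1.0

deriv = five_point_derivative(ecg)
squared = [d * d for d in deriv]          # squaring makes all slopes positive
envelope = moving_window_integral(squared, width=5)
peak = max(range(len(envelope)), key=lambda i: envelope[i])
```

On real ECG data a threshold on `envelope` locates the QRS complexes; the wavelet features described next are then extracted around those locations.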
Abstract: Damage identification of offshore floating wind turbines from vibration/dynamic signals is an important new research field in Structural Health Monitoring (SHM). In this paper, a new damage identification method is proposed based on meta-heuristic algorithms using the dynamic response of a TLP (tension-leg platform) floating wind turbine structure. Genetic algorithms (GA), the artificial immune system (AIS), particle swarm optimization (PSO), and the artificial bee colony (ABC) algorithm are chosen to minimize an objective function defined for the damage identification purpose. In addition to studying the capability of these algorithms to identify damage correctly, the effect of the response type on the identification results is studied, and the results are investigated under possible structural uncertainties. Finally, to evaluate the proposed method under realistic conditions, a 1/100-scale experimental setup of a TLP floating wind turbine (TLPFWT) is built in the laboratory and the proposed damage identification method is applied to the scaled turbine.
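In this kind of damage identification, candidate damage vectors (for example, per-member stiffness reduction factors) are scored by how closely the simulated response matches the measured one, and the meta-heuristic minimizes that mismatch. A toy real-coded GA sketch of the idea follows; the identity "response model" and the synthetic target vector are stand-ins for a real TLP structural model:

```python
import random

def ga_minimize(objective, dim, bounds=(0.0, 1.0), pop=30, gens=80, sigma=0.05):
    """Tiny real-coded GA: truncation selection, arithmetic crossover, Gaussian mutation."""
    lo, hi = bounds
    P = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=objective)
        elite = P[:pop // 2]                      # keep the better half
        children = []
        while len(elite) + len(children) < pop:
            a, b = random.sample(elite, 2)
            child = [(x + y) / 2.0 for x, y in zip(a, b)]            # arithmetic crossover
            child = [min(hi, max(lo, g + random.gauss(0.0, sigma)))  # Gaussian mutation
                     for g in child]
            children.append(child)
        P = elite + children
    best = min(P, key=objective)
    return best, objective(best)

# "Measured" response comes from a known damage vector; the objective is the mismatch.
true_damage = [0.2, 0.0, 0.5]
mismatch = lambda x: sum((a - b) ** 2 for a, b in zip(x, true_damage))
found, err = ga_minimize(mismatch, dim=3)
```

In the paper's setting, `mismatch` would run a structural simulation of the TLP and compare the predicted dynamic response to the measured one rather than comparing vectors directly.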
Funding: Supported by the Center for Mining, Electro-Mechanical Research of Hanoi University of Mining and Geology (HUMG), Hanoi, Vietnam; financially supported by the Hunan Provincial Department of Education General Project (19C1744), the Hunan Province Science Foundation for Youth Scholars of China (2018JJ3510), and the Innovation-Driven Project of Central South University (2020CX040).
Abstract: Blasting is well known as an effective method for fragmenting or moving rock in open-pit mines. The size distribution of the fragmented rock is a critical criterion for evaluating blasting quality, since a high percentage of oversized rocks generated by blasting can lead to economic and environmental damage. Therefore, this study proposed four novel intelligent models to predict the size of rock distribution in mine blasting in order to optimize blasting parameters and the efficiency of blasting operations in open-pit mines. Accordingly, a nature-inspired algorithm (the firefly algorithm, FFA) was combined with different machine learning algorithms (gradient boosting machine (GBM), support vector machine (SVM), Gaussian process (GP), and artificial neural network (ANN)), abbreviated as FFA-GBM, FFA-SVM, FFA-GP, and FFA-ANN, respectively. The predicted results of these models were compared using three statistical indicators (mean absolute error, root-mean-squared error, and correlation coefficient) and a color intensity method. To develop and validate the models, 136 blasting events with their images were collected and analyzed with the Split-Desktop software. Of these, 111 events were randomly selected for model development and optimization, and the remaining 25 blasting events were used to confirm the accuracy of the proposed models. Blast design parameters were regarded as input variables for predicting fragment size. The results revealed that the FFA is a robust optimization algorithm for estimating rock fragmentation in bench blasting. Among the developed models, FFA-GBM provided the highest accuracy in predicting the size of fragmented rocks; the other techniques (FFA-SVM, FFA-GP, and FFA-ANN) yielded lower computational stability and efficiency. Hence, the FFA-GBM model can be used as a powerful and precise soft computing tool in practical engineering cases to improve the quality of blasting and rock fragmentation.
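The FFA component above moves each candidate solution toward brighter (better) ones with an attractiveness that decays with distance, plus a damped random walk. A minimal Python sketch of that mechanism follows; the sphere stand-in objective, bounds, and constants are illustrative assumptions, not the paper's configuration:

```python
import math
import random

def firefly_minimize(objective, dim, bounds, n=15, iters=100,
                     beta0=1.0, gamma=0.01, alpha=0.2):
    """Minimal firefly algorithm (illustrative constants, elitist by construction)."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    F = [objective(x) for x in X]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if F[j] < F[i]:                    # firefly j is "brighter": i moves toward it
                    r2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
                    beta = beta0 * math.exp(-gamma * r2)   # attractiveness decays with distance
                    X[i] = [min(hi, max(lo, xi + beta * (xj - xi)
                                        + alpha * (random.random() - 0.5)))
                            for xi, xj in zip(X[i], X[j])]
                    F[i] = objective(X[i])
        alpha *= 0.97                              # damp the random walk over time
    b = min(range(n), key=lambda k: F[k])
    return X[b], F[b]

# Stand-in objective: squared distance to a known optimum.
target = [0.5, -0.5]
sphere = lambda x: sum((a - b) ** 2 for a, b in zip(x, target))
best, err = firefly_minimize(sphere, dim=2, bounds=(-2.0, 2.0))
```

In the paper's hybrids, the vector being optimized would be the hyperparameters of the GBM/SVM/GP/ANN learner, scored by prediction error on the fragmentation data, rather than this toy target.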
Abstract: Cloud computing infrastructure has evolved into a cost-effective platform for providing computational resources in the form of high-performance computing as a service (HPCaaS) for executing HPC applications. However, the broader use of Cloud services and the rapid growth in the size and capacity of Cloud data centers bring a remarkable rise in energy consumption, leading to a significant increase in provider expenses and in carbon emissions. Besides this, users have become more demanding in their quality-of-service (QoS) expectations regarding execution time, budget cost, utilization, and makespan. This situation calls for a task scheduling policy that ensures efficient task sequencing and allocation of computing resources to meet the trade-off between QoS promises and service provider requirements. Moreover, task scheduling in the Cloud is a well-known NP-hard problem. Motivated by these concerns, this paper introduces and implements a QoS-aware, energy-efficient scheduling policy, called CSPSO, for scheduling tasks in Cloud systems to reduce the energy consumption of cloud resources and minimize the makespan of the workload. The proposed multi-objective CSPSO policy hybridizes the search qualities of two robust meta-heuristics, cuckoo search (CS) and particle swarm optimization (PSO), to overcome the slow convergence and lack of diversity of the standard CS algorithm. A fitness-aware resource allocation (FARA) heuristic was developed and used by the proposed policy to allocate resources to tasks efficiently, and a velocity update mechanism for cuckoo individuals was designed and incorporated into the CSPSO policy. The proposed scheduling policy was implemented in the CloudSim simulator and tested with real supercomputing workload traces. The comparative analysis validated that it can produce efficient schedules with better performance than other well-known heuristic and meta-heuristic scheduling policies.
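The makespan objective being minimized above can be illustrated with a simple greedy earliest-finish-time allocator; this is a stand-in for the paper's FARA heuristic, and the task lengths and VM speeds are invented numbers:

```python
def schedule_tasks(task_lengths, vm_speeds):
    """Greedy earliest-finish-time allocation (illustrative stand-in for FARA)."""
    ready = [0.0] * len(vm_speeds)        # time at which each VM becomes free
    assignment = []
    for length in task_lengths:
        # Finish time if this task ran on VM j (runtime = length / speed).
        finish = [ready[j] + length / vm_speeds[j] for j in range(len(vm_speeds))]
        j = min(range(len(vm_speeds)), key=lambda k: finish[k])
        ready[j] = finish[j]
        assignment.append(j)
    return assignment, max(ready)          # makespan = latest VM finish time

tasks = [40, 10, 30, 20, 50]              # task lengths (e.g., millions of instructions)
speeds = [1.0, 2.0]                       # VM processing speeds (e.g., MIPS)
plan, makespan = schedule_tasks(tasks, speeds)
```

A CSPSO-style search would instead evolve whole assignment vectors, scoring each candidate by makespan and energy consumption, but the finish-time bookkeeping is the same.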
Abstract: Breast cancer is one of the most common kinds of cancer, and its early detection may help lower its mortality rate. In this paper, we propose a novel approach for detecting and classifying breast cancer regions in thermal images. The proposed approach starts by preprocessing the input images and segmenting the significant regions of interest. In addition, to properly train the machine learning models, data augmentation is applied to increase the number of segmented regions using various scaling ratios. To extract the relevant features from the breast cancer cases, a set of deep neural networks (VGGNet, ResNet-50, AlexNet, and GoogLeNet) is employed. The resulting set of features is processed using the binary dipper throated algorithm to select the most effective features for high classification accuracy. The selected features are used to train a neural network that finally classifies the thermal images, and the parameters of this network are optimized using the continuous dipper throated optimization algorithm. Experimental results show the effectiveness of the proposed approach in classifying breast cancer cases compared to other recent approaches in the literature, and several comparative experiments further emphasized its superiority.
Abstract: This paper presents a new bi-criteria mixed-integer programming model for scheduling cells, and pieces within each cell, in a cellular manufacturing system. The objective of this model is to minimize the makespan and intercell movements simultaneously, while considering sequence-dependent cell setup times. In cellular manufacturing system design and planning, three main steps must be considered: cell formation (i.e., piece families and machine grouping), inter- and intra-cell layout, and scheduling. Because the cellular manufacturing scheduling problem is NP-hard, a genetic algorithm, an efficient meta-heuristic method, is proposed to solve it. Finally, a number of test problems are solved to show the efficiency of the proposed genetic algorithm, and the computational results are compared with those obtained using an optimization tool.
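The makespan criterion with sequence-dependent setup times, one of the two objectives the model minimizes, can be computed for a candidate sequence as follows. This is a single-cell, single-machine simplification, and the processing and setup times are invented for illustration:

```python
from itertools import permutations

def makespan(sequence, proc_time, setup):
    """Completion time of a piece sequence on one machine,
    where setup[a][b] is the changeover time when piece b follows piece a."""
    t, prev = 0.0, None
    for p in sequence:
        if prev is not None:
            t += setup[prev][p]       # sequence-dependent setup
        t += proc_time[p]             # processing time
        prev = p
    return t

proc = [3.0, 2.0, 4.0]                # processing time per piece
setup = [[0.0, 1.0, 2.0],             # setup matrix: row = previous piece
         [1.0, 0.0, 1.0],
         [2.0, 1.0, 0.0]]

# Exhaustive search is fine for 3 pieces; a GA takes over when n! explodes.
best_seq = min(permutations(range(3)), key=lambda s: makespan(s, proc, setup))
```

The paper's GA would evaluate candidate schedules with this kind of makespan computation, combined with a count of intercell movements, as its bi-criteria fitness.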
Abstract: Cloud computing has emerged as a new style of computing in distributed environments. Efficient and dependable workflow scheduling is crucial for achieving high performance and integration with enterprise systems. As an effective security-services aggregation methodology, Trust Workflow Technology (TWT) has been used to construct composite services. However, in a cloud environment, the existing closed network services are maintained and operated by third-party organizations or enterprises; therefore, service-oriented trust strategies must be considered in workflow scheduling. TWFS-related algorithms consist of trust policies and strategies to overcome threats to the application with heuristic workflow scheduling. As the main contribution of this work, a trust-based meta-heuristic workflow scheduling (TMWS) algorithm is proposed. The TMWS algorithm improves the efficiency and reliability of operation in the cloud system, and the results show that the TMWS approach is effective and feasible.
Abstract: Bedbugs (Cimicidae) are small parasitic hemipteran insects that feed on the blood of warm-blooded mammals; the most famous member of this family is Cimex lectularius, the common bedbug. This paper proposes a novel swarm intelligence optimization algorithm called the Bedbug Meta-Heuristic Algorithm (BMHA). The primary inspiration for the algorithm comes from the static and dynamic swarming behaviors of bedbugs in nature. The two main stages of optimization algorithms, exploration and exploitation, are designed by modeling bedbug social interaction in the search for food. The proposed algorithm is benchmarked qualitatively and quantitatively on many test functions, including CEC2019. The evaluation results prove that BMHA can improve an initial random population for a given optimization problem to converge toward the global optimum and provides highly competitive results compared to other well-known optimization algorithms. The results also demonstrate the new algorithm's performance in solving real optimization problems in unknown search spaces. To this end, the proposed algorithm has been used to select features of fake news in a semi-supervised manner, and the results show its good performance in solving such problems.
Abstract: The uncertainty inherent in power load forecasts is a major factor in the mismatches between supply and demand in renewables-rich electricity networks, which consequently increase energy bills and curtailed generation. As the transition to a power grid founded on the so-called grid-of-grids becomes more evident, so does the need for distributed control algorithms capable of handling computationally challenging problems in the energy sector. In this light, the consensus-based distributed algorithm has recently been shown to provide an effective platform for solving the complex energy management problem in microgrids. More specifically, in a microgrid context, the consensus-based distributed algorithm requires reliable information exchange with customers to achieve convergence. However, packet losses remain an important issue, which can potentially result in the failure of the overall system. In this setting, this paper introduces a novel method to effectively characterize such packet losses during information exchange between the customers and the microgrid operator, whilst solving the microgrid scheduling optimization problem for a multi-agent-based microgrid. More specifically, the proposed framework leverages the virulence optimization algorithm and the earthworm optimization algorithm to optimally shift energy consumption from peak periods to lower-priced off-peak hours. The effectiveness of the proposed method in minimizing the overall active power mismatches in the presence of packet losses has also been demonstrated by benchmarking the results against the business-as-usual iterative scheduling algorithm.
Also, the robustness of the overall meta-heuristic- and multi-agent-based method in producing optimal results is confirmed based on comparing the results obtained by several well-established meta-heuristic optimization algorithms, including the binary particle swarm optimization, the genetic algorithm, and the cuckoo search optimization.
Funding: This work was supported by the National Natural Science Foundation of China (61872126 and 11601129).
Abstract: When solving unimodal function problems, conventional meta-heuristic algorithms often suffer from low accuracy and slow convergence. Therefore, this paper proposes a novel meta-heuristic optimization algorithm based on physical rules, named proton-electron swarm (PES). The algorithm simulates the physical phenomenon of like charges repelling each other and opposite charges attracting, as with protons and electrons, and establishes a mathematical model to realize the optimization process. By balancing global exploration and local exploitation, the algorithm achieves high accuracy and avoids falling into local optima when solving the target problem. To evaluate its effectiveness, 23 classical benchmark functions were selected for comparative experiments. Experimental results show that, compared with the contrast algorithms, the proposed algorithm not only obtains higher accuracy and faster convergence on unimodal function problems, but also maintains strong optimization ability on multimodal function problems.
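The attract/repel idea behind PES can be sketched as a swarm in which every agent is pulled toward the current best position (opposite charges attract) and pushed away from a random like-charged neighbour (like charges repel). This is a loose illustration of that idea only, not the authors' published update equations; all coefficients and the sphere objective are invented:

```python
import random

def charge_swarm_minimize(objective, dim, bounds, n=20, iters=120,
                          attract=0.6, repel=0.1):
    """Illustrative attract/repel swarm (NOT the exact PES model)."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    F = [objective(x) for x in X]
    for _ in range(iters):
        g = min(range(n), key=lambda i: F[i])       # "proton": current best position
        for i in range(n):
            if i == g:
                continue                             # best agent stays put (elitism)
            j = random.randrange(n)                  # a random like-charged neighbour
            X[i] = [min(hi, max(lo,
                      xi + attract * random.random() * (X[g][d] - xi)   # opposite attracts
                      - repel * random.random() * (X[j][d] - xi)))      # like repels
                    for d, xi in enumerate(X[i])]
            F[i] = objective(X[i])
    g = min(range(n), key=lambda i: F[i])
    return X[g], F[g]

# Stand-in unimodal objective: squared distance to a known optimum.
target = [1.0, 1.0]
sphere = lambda x: sum((a - b) ** 2 for a, b in zip(x, target))
best, err = charge_swarm_minimize(sphere, dim=2, bounds=(-3.0, 3.0))
```

Balancing the `attract` and `repel` coefficients is the same exploration/exploitation trade-off the abstract describes.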
Abstract: Optimized road maintenance planning seeks solutions that minimize the life-cycle cost of a road network while maximizing pavement condition. Aiming at proposing an optimal set of road maintenance solutions, robust meta-heuristic algorithms are used in this research. Two main optimization techniques are applied: single-objective and multi-objective optimization. Genetic algorithms (GA), particle swarm optimization (PSO), and a combination of the two (GAPSO) are used as single-objective techniques, while the non-dominated sorting genetic algorithm II (NSGA-II) and multi-objective particle swarm optimization (MOPSO), which are suitable for computationally complex, large-size optimization problems, are applied and compared as multi-objective techniques. A real case study from the rural transportation network of Iran is employed to illustrate the efficiency of the optimal algorithm. The optimization model is formulated so that a cost-effective maintenance strategy is reached while preserving the performance level of the road network at a desirable level, so the objective functions are pavement performance maximization and maintenance cost minimization. It is concluded that the multi-objective algorithms (NSGA-II and MOPSO) performed better than the single-objective algorithms due to their capability to balance both objectives, and among the multi-objective algorithms NSGA-II provides the optimal solution for road maintenance planning.
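The multi-objective comparison above rests on Pareto dominance: one maintenance plan beats another only if it is at least as good on both objectives and strictly better on one. A small sketch follows, with both objectives minimized and "roughness" standing in for inverted pavement condition; the plan numbers are purely illustrative:

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    # Keep every point that no other point dominates (NSGA-II's first front).
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# (cost, roughness) for candidate maintenance plans.
plans = [(10, 5), (8, 7), (12, 4), (9, 6), (11, 5)]
front = pareto_front(plans)
```

NSGA-II repeatedly extracts such fronts to rank a population, which is how it balances the cost and performance objectives the study compares.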
Funding: The Researchers Supporting Project (RSP2023R395), King Saud University, Riyadh, Saudi Arabia.
Abstract: The distinction and precise identification of tumor nodules are crucial for timely lung cancer diagnosis and intervention planning. This research work addresses major issues in medical image processing, focusing on lung cancer computed tomography (CT) images. In this context, the paper proposes an improved lung cancer segmentation technique based on the strengths of nature-inspired approaches. The high resolution of CT is exploited to distinguish healthy subjects from those with lung cancer. In this process, the visual challenges of K-means are addressed through the integration of four nature-inspired swarm intelligence techniques: K-means with Artificial Bee Colony (ABC), K-means with Cuckoo Search Algorithm (CSA), K-means with Particle Swarm Optimization (PSO), and K-means with Firefly Algorithm (FFA). Testing and evaluation are performed on the Early Lung Cancer Action Program (ELCAP) database. The simulation analysis uses lung cancer image sets evaluated against the metrics precision, sensitivity, specificity, F-measure, accuracy, Matthews correlation coefficient (MCC), Jaccard, and Dice. The detailed evaluation shows that K-means with CSA significantly improved the quality of lung cancer segmentation in comparison to the other optimization approaches. The proposed approach (K-means with CSA) achieves precision, sensitivity, and F-measure of 0.942, 0.964, and 0.953, respectively, and an average accuracy of 93%. The experimental results show that K-means with ABC, K-means with PSO, K-means with FFA, and K-means with CSA achieved improvements of 10.8%, 13.38%, 13.93%, and 15.7%, respectively, in the accuracy measure compared to plain K-means segmentation of lung cancer images. It is further highlighted that the proposed K-means with CSA achieved a significant improvement in accuracy and hence can be utilized by researchers for improved segmentation of medical image datasets and identification of the targeted region of interest.
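The baseline being improved in all four hybrids is K-means itself, whose result depends heavily on the initial centroids, which is exactly what the swarm methods optimize. A minimal 1-D K-means on pixel intensities (toy intensity values, not ELCAP data):

```python
import random

def kmeans_1d(values, k, iters=50):
    """Plain 1-D k-means on pixel intensities. The swarm hybrids in the paper
    improve on this by searching for better starting centroids."""
    centroids = random.sample(values, k)           # naive random initialization
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            j = min(range(k), key=lambda c: abs(v - centroids[c]))
            clusters[j].append(v)                  # assign to nearest centroid
        centroids = [sum(c) / len(c) if c else centroids[j]
                     for j, c in enumerate(clusters)]   # recompute means
    return sorted(centroids)

# Two well-separated intensity groups: "background" vs "nodule-like" pixels.
pixels = [10, 11, 12, 100, 101, 102]
c = kmeans_1d(pixels, k=2)
```

On real CT slices the clusters would be intensity regions of the image; CSA (and the other swarms) search the centroid space so the segmentation does not get stuck with a poor random start.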
Funding: This research is funded by the Deanship of Scientific Research at Umm Al-Qura University, Grant Code: 22UQU4281755DSR02.
Abstract: Software needs modifications and requires regular revisions. Owing to these revisions, retesting becomes essential to ensure that the enhancements have not affected its bug-free functioning. The time and cost incurred in this process need to be reduced through test case selection and prioritization. Many nature-inspired techniques have been applied in this area; African Buffalo Optimization is one such approach, applied here to regression test selection and prioritization. This paper explains and proves the applicability of African Buffalo Optimization to test case selection and prioritization; the proposed algorithm converges in polynomial time (O(n²)). An empirical evaluation of applying African Buffalo Optimization to test case prioritization is performed on a sample data set over multiple iterations, recording a 62.5% drop in size and a 48.57% drop in the runtime of the original test suite. The obtained results are compared with Ant Colony Optimization. The comparative analysis indicates that African Buffalo Optimization and Ant Colony Optimization exhibit similar fault detection capability (80%) and both reduce the overall execution time and size of the resultant test suite. The results and analysis hence advocate and encourage the use of African Buffalo Optimization in test case selection and prioritization.
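Test case prioritization of this kind is usually driven by fault or coverage information; a classic baseline is the greedy "additional coverage" ordering, which a meta-heuristic like African Buffalo Optimization then tries to beat on suite size and runtime. A sketch follows; the suite and its fault coverage are invented:

```python
def prioritize(test_coverage):
    """Greedy additional-coverage ordering: repeatedly pick the test that covers
    the most not-yet-covered faults (ties broken by test name)."""
    remaining = dict(test_coverage)
    covered, order = set(), []
    while remaining:
        best = max(sorted(remaining), key=lambda t: len(remaining[t] - covered))
        order.append(best)
        covered |= remaining.pop(best)
    return order

# Map each test case to the set of faults it detects.
suite = {"t1": {1, 2}, "t2": {2, 3, 4}, "t3": {5}, "t4": {1}}
order = prioritize(suite)
```

Selection then simply truncates the ordering once all faults are covered, which is where the reported drops in suite size and runtime come from.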
Funding: Supported in part by the Hangzhou Science and Technology Development Plan Project (Grant No. 20191203B30).
Abstract: Many complex real-world optimization problems easily fall into local optima, so new techniques and methods are needed to solve such challenges. Meta-heuristic algorithms have received a lot of attention in recent years because of their efficient performance and simple structure. The Sine Cosine Algorithm (SCA) is a recent meta-heuristic algorithm based on the two trigonometric functions sine and cosine. However, like other meta-heuristic algorithms, SCA has slow convergence and may stagnate in sub-optimal regions. In this study, an enhanced version of SCA named RDSCA is suggested that depends on two techniques: random spare/replacement and a double adaptive weight. The first technique is employed to speed up convergence, whereas the second is used to enhance exploratory search capability. To evaluate RDSCA, 30 functions from CEC 2017 and four real-world engineering problems are used. Moreover, a non-parametric Wilcoxon signed-rank test is carried out at the 5% significance level to evaluate the significance of the differences between RDSCA and five other SCA variants. The results show that RDSCA is competitive with other meta-heuristic algorithms.
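SCA's core update moves each agent around the best-so-far destination using sine/cosine oscillations whose amplitude r1 shrinks over the run, which drives the exploration-to-exploitation transition RDSCA tries to improve. A compact sketch of the standard SCA follows (sphere stand-in objective; RDSCA's random spare and adaptive weights are not modeled here):

```python
import math
import random

def sca_minimize(objective, dim, bounds, n=20, iters=150, a=2.0):
    """Standard Sine Cosine Algorithm sketch (Mirjalili-style update rule)."""
    lo, hi = bounds
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    best = min(X, key=objective)[:]
    best_f = objective(best)
    for t in range(iters):
        r1 = a - t * a / iters                       # shrinks: exploration -> exploitation
        for i in range(n):
            for d in range(dim):
                r2 = 2 * math.pi * random.random()   # oscillation phase
                r3 = 2 * random.random()             # destination weight
                r4 = random.random()                 # sine-or-cosine switch
                step = r1 * (math.sin(r2) if r4 < 0.5 else math.cos(r2))
                X[i][d] = min(hi, max(lo,
                              X[i][d] + step * abs(r3 * best[d] - X[i][d])))
            f = objective(X[i])
            if f < best_f:
                best, best_f = X[i][:], f            # elitist best-so-far
    return best, best_f

# Stand-in unimodal objective: squared distance to the origin.
sphere = lambda x: sum(v * v for v in x)
best, best_f = sca_minimize(sphere, dim=2, bounds=(-5.0, 5.0))
```

RDSCA's two additions would act here by replacing stagnant agents (random spare) and by modulating the step weights adaptively (double adaptive weight).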
Funding: This study was financially supported by China United Coalbed Methane Corporation, Ltd. (ZZGSSALFGR2021-581); Bin Li received the grant.
Abstract: To overcome the deficiencies of current methods for predicting the post-fracturing productivity of shale gas horizontal wells, a new approach is proposed in this study. The new model combines several techniques, namely an artificial neural network (ANN) with particle swarm optimization (PSO), the imperialist competitive algorithm (ICA), and ant colony optimization (ACO). These are implemented using geological and engineering parameters collected from 317 wells. The results show that the optimal PSO-ANN model has high accuracy, obtaining an R² of 0.847 on the test set. Partial dependence plots (PDP) indicate that liquid consumption intensity and the proportion of quartz sand are the two most sensitive factors affecting the model's performance.
Funding: Princess Nourah bint Abdulrahman University Researchers Supporting Project Number (PNURSP2022R104), Princess Nourah bint Abdulrahman University, Riyadh, Saudi Arabia.
Abstract: Rapid population growth poses a crucial problem for the early detection of diseases in medical research. Among all cancers, breast cancer is considered the second most severe. Consequently, an exponential rise in deaths from breast cancer is expected due to rapid population growth and the lack of resources required for medical diagnoses. Utilizing recent advances in machine learning could help medical staff diagnose diseases, as these methods offer effective, reliable, and rapid responses that could help decrease the risk of death. In this paper, we propose a new feature selection algorithm based on a hybrid of two powerful, recently emerged optimizers, namely the guided whale and dipper throated optimizers. The proposed algorithm is evaluated using four publicly available breast cancer datasets. The evaluation results show the effectiveness of the proposed approach in both accuracy and speed. To prove the superiority of the proposed algorithm, a set of competing feature selection algorithms was incorporated into the experiments, and a group of statistical analyses was conducted to emphasize its superiority and stability. The best breast cancer prediction accuracy achieved by the proposed algorithm averages 99.453%, obtained in an average time of 3.6725 s, the best result among all the competing approaches.
Abstract: Autism Spectrum Disorder (ASD) is a complicated neurodevelopmental disorder that is often identified in toddlers. Microarray data are used as a diagnostic tool to identify the genetics of the disorder. However, microarray data are large and high-dimensional, and consequently suffer from the problem of dimensionality: the sample size and the variance of gene expression can lead to overfitting and misclassification. Identifying the autism gene (feature) subset from microarray data is an important and challenging research area that must be addressed efficiently to improve gene feature selection and classification. To overcome these challenges, a novel Intelligent Hybrid Ensemble Gene Selection (IHEGS) model is proposed in this paper. The proposed model integrates the intelligence of different feature selection techniques over data partitions. In this model, the initial gene selection is carried out by data perturbation, and the final autism gene subset is obtained by functional perturbation, which reduces the dimensionality problem in microarray data. The functional perturbation module employs three meta-heuristic swarm intelligence-based techniques for gene selection. The obtained gene subset is validated by a deep neural network (DNN) model. The proposed model is implemented in Python with six National Center for Biotechnology Information (NCBI) gene expression datasets. A comparative study with other existing state-of-the-art systems shows that the proposed model provides stable results in terms of feature selection and classification accuracy.
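Wrapper feature selection of the kind used in this and the preceding studies typically scores a binary gene mask by a weighted sum of classification error and subset size; the optimizer (swarm-based here) then searches the mask space for the minimum. A sketch with a toy error model standing in for the DNN validation (the weight alpha and the error model are illustrative assumptions):

```python
def subset_fitness(mask, error_fn, alpha=0.99):
    """Wrapper fitness: weighted sum of classification error and fraction of
    features kept. Lower is better; empty subsets are worst-case."""
    n_selected = sum(mask)
    if n_selected == 0:
        return 1.0
    ratio = n_selected / len(mask)
    return alpha * error_fn(mask) + (1 - alpha) * ratio

def toy_error(mask):
    # Stand-in error model: features 0 and 2 are informative; the rest add noise.
    informative = {0, 2}
    hits = sum(1 for i in informative if mask[i])
    noise = sum(1 for i, m in enumerate(mask) if m and i not in informative)
    return max(0.0, 0.5 - 0.25 * hits + 0.02 * noise)

# Exhaustive search over all 5-feature masks; a swarm takes over for thousands of genes.
best_mask = min(
    ([int(b) for b in format(m, "05b")] for m in range(1, 32)),
    key=lambda mask: subset_fitness(mask, toy_error),
)
```

For microarray data with thousands of genes, exhaustive enumeration is impossible, which is why the IHEGS model hands this fitness landscape to its three swarm intelligence techniques.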