Hybridizing metaheuristic algorithms involves synergistically combining different optimization techniques to effectively address complex and challenging optimization problems. This approach aims to leverage the strengths of multiple algorithms, enhancing solution quality, convergence speed, and robustness, thereby offering a more versatile and efficient means of solving intricate real-world optimization tasks. In this paper, we introduce a hybrid algorithm that amalgamates three distinct metaheuristics: the Beluga Whale Optimization (BWO), the Honey Badger Algorithm (HBA), and the Jellyfish Search (JS) optimizer. The proposed hybrid algorithm is referred to as BHJO. Through this fusion, the BHJO algorithm aims to leverage the strengths of each optimizer. Before this hybridization, we thoroughly examined the exploration and exploitation capabilities of the BWO, HBA, and JS metaheuristics, as well as their ability to strike a balance between exploration and exploitation. This analysis allowed us to identify the pros and cons of each algorithm, enabling us to combine them in a novel hybrid approach that capitalizes on their respective strengths for enhanced optimization performance. In addition, the BHJO algorithm incorporates Opposition-Based Learning (OBL), leveraging its diverse exploration, accelerated convergence, and improved solution quality to enhance the overall performance and effectiveness of the hybrid algorithm. Moreover, the performance of the BHJO algorithm was evaluated across a range of both unconstrained and constrained optimization problems, providing a comprehensive assessment of its efficacy and applicability in diverse problem domains. Similarly, the BHJO algorithm was subjected to a comparative analysis with several renowned algorithms, with mean and standard deviation values used as evaluation metrics. This rigorous comparison aimed to assess the performance of the BHJO algorithm against its counterparts, shedding light on its effectiveness and reliability in solving optimization problems. Finally, the obtained numerical statistics underwent rigorous analysis using the Friedman test followed by Dunn's post hoc test. The resulting values revealed the BHJO algorithm's competitiveness in tackling intricate optimization problems, affirming its capability to deliver favorable outcomes in challenging scenarios.
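The Opposition-Based Learning step mentioned above can be sketched in a few lines: for a candidate x within bounds [lb, ub], its opposite point is lb + ub - x, and the better of the pair (by fitness) survives. A minimal illustration (the sphere fitness function and the bounds are illustrative assumptions, not taken from the paper):

```python
def sphere(x):
    # Toy fitness: sum of squares (minimization).
    return sum(v * v for v in x)

def obl_step(population, lb, ub, fitness):
    """For each candidate, form its opposite point and keep the better of the pair."""
    improved = []
    for x in population:
        opposite = [lb + ub - v for v in x]
        improved.append(min(x, opposite, key=fitness))
    return improved

population = [[4.0, 3.0], [1.0, 1.0]]
better = obl_step(population, 0.0, 5.0, sphere)
```

With bounds [0, 5], the opposite of [4, 3] is [1, 2], which has a lower sphere value and therefore replaces it, while [1, 1] already beats its opposite [4, 4] and is kept.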
Geopolymer concrete emerges as a promising avenue for sustainable development and offers an effective solution to environmental problems. Its attributes as a non-toxic, low-carbon, and economical substitute for conventional cement concrete, coupled with its elevated compressive strength and reduced shrinkage properties, position it as a pivotal material for diverse applications spanning from architectural structures to transportation infrastructure. In this context, this study sets out to use machine learning (ML) algorithms to increase the accuracy and interpretability of predicting the compressive strength of geopolymer concrete in the civil engineering field. To achieve this goal, a new approach using convolutional neural networks (CNNs) has been adopted. The study focuses on creating a comprehensive dataset consisting of compositional and strength parameters of 162 geopolymer concrete mixes, all containing Class F fly ash. The selection of optimal input parameters is guided by two distinct criteria. The first criterion leverages insights garnered from previous research on the influence of individual features on compressive strength. The second criterion scrutinizes the impact of these features within the model's predictive framework. Key to enhancing the CNN model's performance is the careful determination of the optimal hyperparameters. Through a systematic trial-and-error process, the study ascertains the ideal number of epochs for data division and the optimal value of k for k-fold cross-validation, a technique vital to the model's robustness. The model's predictive prowess is rigorously assessed via a suite of performance metrics and comprehensive score analyses. Furthermore, the model's adaptability is gauged by integrating a secondary dataset into its predictive framework, facilitating a comparative evaluation against conventional prediction methods. To unravel the intricacies of the CNN model's learning trajectory, a loss plot is deployed to elucidate its learning rate. To maximize the dataset's potential, bivariate plots unveil nuanced trends and interactions among variables, consistent with earlier research. The findings show that the CNN model accurately estimated the compressive strength of geopolymer concrete, and that the prediction accuracy is promising enough to guide the development of innovative geopolymer concrete formulations, reinforcing the material's role as an eco-conscious and robust construction choice. The outcomes not only underscore the significance of leveraging technology for sustainable construction practices but also pave the way for innovation and efficiency in the field of civil engineering.
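The k-fold cross-validation that the study tunes alongside the epoch count partitions the data into k folds and rotates the held-out fold. A minimal pure-Python sketch (the toy dataset and the stand-in mean predictor are illustrative assumptions; the paper's model is a CNN):

```python
def kfold_indices(n, k):
    """Split indices 0..n-1 into k contiguous folds (earlier folds absorb the remainder)."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(y, k):
    """Evaluate a mean predictor with k-fold CV; returns per-fold mean absolute error."""
    folds = kfold_indices(len(y), k)
    errors = []
    for held_out in folds:
        train = [y[i] for i in range(len(y)) if i not in held_out]
        prediction = sum(train) / len(train)
        errors.append(sum(abs(y[i] - prediction) for i in held_out) / len(held_out))
    return errors

scores = cross_validate([10.0, 12.0, 11.0, 13.0, 9.0, 12.0], k=3)
```

Every sample is held out exactly once, so the k error scores together estimate how the model generalizes; the paper selects k itself by trial and error.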
Pupil dynamics are important characteristics for face spoofing detection. The face recognition system is one of the most widely used biometrics for authenticating individual identity. The main threats to a facial recognition system are presentation attacks such as print attacks, 3D mask attacks, and replay attacks. The proposed model uses pupil characteristics for liveness detection during the authentication process. The pupillary light reflex is an involuntary reaction controlling the pupil's diameter at different light intensities. The proposed framework consists of a two-phase methodology. In the first phase, the pupil's diameter is calculated by applying a light stimulus to one eye of the subject and measuring the constriction of the pupil size in both eyes across different video frames. This measurement is converted into a feature space using parameters defined by the Kohn and Clynes model. A Support Vector Machine is used to classify subjects as legitimate when the diameter change is normal (i.e., the eye is alive) or illegitimate when there is no change or an abnormal oscillation of pupil behavior due to a printed photograph, video, or 3D mask of the subject in front of the camera. In the second phase, we perform facial recognition. Scale-invariant feature transform (SIFT) is used to extract features from the facial images, each feature being a 128-dimensional vector. These features are scale-, rotation-, and orientation-invariant and are used for recognizing facial images. A brute-force matching algorithm is used to match features between two images, with a threshold of 0.08 for good matches. To analyze the performance of the framework, we tested our model on two face anti-spoofing datasets, the Replay-Attack and CASIA-SURF datasets, chosen because each sample contains videos of the subjects in three modalities (RGB, IR, Depth). The CASIA-SURF datasets showed an 89.9% Equal Error Rate, while the Replay-Attack datasets showed a 92.1% Equal Error Rate.
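Brute-force descriptor matching with a distance threshold, as used in the second phase, can be sketched as follows. The 0.08 threshold is taken from the abstract; the synthetic 4-dimensional descriptors stand in for SIFT's 128-dimensional vectors and are purely illustrative:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def brute_force_match(desc_a, desc_b, threshold=0.08):
    """For each descriptor in desc_a, find its nearest neighbour in desc_b;
    keep the pair only if the distance falls below the threshold."""
    matches = []
    for i, a in enumerate(desc_a):
        j, dist = min(((j, euclidean(a, b)) for j, b in enumerate(desc_b)),
                      key=lambda t: t[1])
        if dist < threshold:
            matches.append((i, j, dist))
    return matches

img1 = [[0.1, 0.2, 0.3, 0.4], [0.9, 0.8, 0.7, 0.6]]
img2 = [[0.1, 0.2, 0.31, 0.4], [0.5, 0.5, 0.5, 0.5]]
good = brute_force_match(img1, img2)
```

Only the first descriptor pair is close enough to count as a good match; the second descriptor's nearest neighbour lies well beyond the threshold and is discarded.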
The intuitive fuzzy set has found important applications in decision-making and machine learning. To enrich and utilize the intuitive fuzzy set, this study designed and developed a deep neural network-based glaucoma eye detection method using fuzzy difference equations in the domain where the retinal images converge. Retinal image detections are categorized as normal eye recognition, suspected glaucomatous eye recognition, and glaucomatous eye recognition. Fuzzy degrees associated with weighted values are calculated to determine the level of concentration between the fuzzy partition and the retinal images. The proposed model was used to diagnose glaucoma from retinal images, utilizing a Convolutional Neural Network (CNN) and deep learning to identify the fuzzy weighted regularization between images. This methodology was used to clarify the input images and make them adequate for the glaucoma detection process. The objective of this study was to propose a novel approach to the early diagnosis of glaucoma using a Fuzzy Expert System (FES) and fuzzy difference equations (FDE). The intensities of the different regions in the images and their respective peak levels were determined. Once the peak regions were identified, the recurrence relationships among those peaks were measured. Image partitioning was performed according to the varying degrees of similar and dissimilar concentrations in the image. Similar and dissimilar concentration levels and spatial frequency generated a threshold image from the combined fuzzy matrix and FDE. This distinguished between normal and abnormal eye conditions, thus detecting patients with glaucomatous eyes.
Leukemia is a form of cancer of the blood or bone marrow. A person with leukemia has an expansion of white blood cells (WBCs). It primarily affects children and rarely affects adults. Treatment depends on the type of leukemia and the extent to which the cancer has spread throughout the body. Identifying leukemia at an initial stage is vital to providing timely patient care. Medical image-analysis approaches grant safer, quicker, and less costly solutions while avoiding the difficulties of invasive processes. Computer vision (CV)-based and image-processing techniques can be readily generalized and can eradicate human error. Many researchers have implemented computer-aided diagnostic methods and machine learning (ML) for laboratory image analysis, hoping to overcome the limitations of late leukemia detection and determine its subgroups. This study establishes a Marine Predators Algorithm with Deep Learning Leukemia Cancer Classification (MPADL-LCC) algorithm on medical images. The projected MPADL-LCC system uses a bilateral filtering (BF) technique to pre-process medical images. The MPADL-LCC system uses Faster SqueezeNet with the Marine Predators Algorithm (MPA) as a hyperparameter optimizer for feature extraction. Lastly, a denoising autoencoder (DAE) methodology is executed to accurately detect and classify leukemia cancer. The hyperparameter tuning process using MPA helps enhance leukemia cancer classification performance. Simulation results are compared with other recent approaches across various measurements, and the MPADL-LCC algorithm exhibits the best results among them.
Machine learning (ML) has taken the world by storm with its prevalent applications in automating ordinary tasks and extracting valuable insights throughout scientific research and design. ML is a large area within artificial intelligence (AI) that focuses on obtaining valuable information from data, which explains why ML has often been related to statistics and data science. An advanced metaheuristic optimization algorithm is proposed in this work for the optimization problem of antenna architecture design. The algorithm is designed as a hybrid of the Sine Cosine Algorithm (SCA) and the Grey Wolf Optimizer (GWO) and is used to train a neural network-based Multilayer Perceptron (MLP). The proposed optimization algorithm is a practical, versatile, and trustworthy platform for recognizing the design parameters of a double T-shaped monopole antenna in an optimal way. The proposed algorithm is further evaluated through comparative and statistical analysis using various curves in addition to ANOVA and t-tests, offering superiority and stability validation of the predicted results to verify the procedures' accuracy.
Metamaterial antennas are a special class of antennas that use metamaterials to enhance their performance. Antenna size affects the quality factor and the radiation loss of the antenna. Metamaterial antennas can overcome the bandwidth limitation of small antennas. Machine learning (ML) models have recently been applied to predict antenna parameters; ML can serve as an alternative to the trial-and-error process of finding proper parameters for the simulated antenna. The accuracy of the prediction depends mainly on the selected model. Ensemble models combine two or more base models to produce a better, enhanced model. In this paper, a weighted-average ensemble model is proposed to predict the bandwidth of a metamaterial antenna. Two base models are used, namely a Multilayer Perceptron (MLP) and Support Vector Machines (SVM). To calculate the weights for each model, an optimization algorithm is used to find the optimal weights of the ensemble: the Dynamic Group-Based Cooperative Optimizer (DGCO) is employed to search for the optimal weights of the base models. The proposed model is compared with three base models and the simple-average ensemble model. The results show that the proposed model outperforms the other models and can predict antenna bandwidth efficiently.
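A weighted-average ensemble combines the base models' predictions with per-model weights. A minimal sketch (the weights are fixed by hand here for illustration; in the paper they are found by the DGCO optimizer, and the prediction values are invented):

```python
def weighted_ensemble(predictions, weights):
    """Combine per-model prediction lists (equal length) using normalized weights."""
    total = sum(weights)
    norm = [w / total for w in weights]
    return [sum(w * p[i] for w, p in zip(norm, predictions))
            for i in range(len(predictions[0]))]

mlp_pred = [2.0, 4.0, 6.0]   # stand-in MLP bandwidth predictions
svm_pred = [4.0, 4.0, 2.0]   # stand-in SVM bandwidth predictions
combined = weighted_ensemble([mlp_pred, svm_pred], weights=[3.0, 1.0])
```

Normalizing [3, 1] to [0.75, 0.25] pulls the combined prediction three-quarters of the way toward the MLP; an optimizer's job is to pick that split so the combined error on a validation set is minimized.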
Graphs are used in various disciplines such as telecommunications, biological networks, and social networks. In large-scale networks, it is challenging to detect communities by learning the distinct properties of the graph. As deep learning has made contributions in a variety of domains, we try to use deep learning techniques to mine knowledge from large-scale graph networks. In this paper, we aim to provide a strategy for detecting communities using deep autoencoders and to obtain generic neural attention on graphs. The advantages of neural attention are widely seen in NLP and computer vision, and it has low computational complexity for large-scale graphs. The contributions of the paper are summarized as follows. Firstly, a transformer is utilized to downsample the first-order proximities of the graph into a latent space, which can capture the structural properties and eventually assist in detecting the communities. Secondly, a fine-tuning task is conducted by carefully tuning variant hyperparameters, applied to multiple social networks (Facebook and Twitch). Furthermore, the objective function (cross-entropy) is tuned with L0 regularization. Lastly, the reconstructed model forms communities that present the relationships between the groups. The proposed robust model provides good generalization and is applicable to obtaining not only the community structures in social networks but also node classification. The proposed graph-transformer shows advanced performance on social networks, with average NMIs of 0.67±0.04, 0.198±0.02, 0.228±0.02, and 0.68±0.03 on the Wikipedia crocodiles, Github Developers, Twitch England, and Facebook Page-Page networks, respectively.
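The NMI scores reported above can be computed from a contingency table of the two labelings. A minimal pure-Python sketch (arithmetic-mean normalization is assumed here; the paper does not state which normalization variant it uses):

```python
import math
from collections import Counter

def nmi(labels_a, labels_b):
    """Normalized mutual information between two labelings of the same nodes."""
    n = len(labels_a)
    pa, pb = Counter(labels_a), Counter(labels_b)
    joint = Counter(zip(labels_a, labels_b))
    mi = sum((c / n) * math.log((c / n) / ((pa[a] / n) * (pb[b] / n)))
             for (a, b), c in joint.items())
    ha = -sum((c / n) * math.log(c / n) for c in pa.values())
    hb = -sum((c / n) * math.log(c / n) for c in pb.values())
    return 0.0 if ha + hb == 0 else 2 * mi / (ha + hb)

score = nmi([0, 0, 1, 1], [1, 1, 0, 0])  # identical partitions up to relabeling
```

NMI is invariant to label permutations (the example above scores 1.0), which is why it is the standard yardstick for comparing detected communities against ground truth.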
Determining the optimum location of facilities is critical in many fields, particularly in healthcare. This study proposes the application of a suitable location model for field hospitals during the novel coronavirus 2019 (COVID-19) pandemic. The used model is the most appropriate among the three most common location models utilized to solve healthcare problems (the set covering model, the maximal covering model, and the P-median model). The proposed nonlinear binary constrained model is a slight modification of the maximal covering model with a set of nonlinear constraints. The model is used to determine the optimum locations of field hospitals for COVID-19 risk reduction. The designed mathematical model and the solution method are used to deploy field hospitals in eight governorates in Upper Egypt. In this case study, a discrete binary gaining-sharing knowledge-based optimization (DBGSK) algorithm is proposed. The DBGSK algorithm is based on how humans acquire and share knowledge throughout their lives, and it mainly depends on two stages, junior and senior binary stages. These two stages enable DBGSK to explore and exploit the search space efficiently and effectively, and thus it can solve problems in binary space.
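The maximal covering model underlying the modified formulation can be illustrated on a toy instance: choose p facility sites so that the total demand within coverage range is maximized. A brute-force enumeration sketch for a tiny instance (all demand values and coverage sets are invented for illustration; real instances need a solver or a metaheuristic such as DBGSK):

```python
from itertools import combinations

def maximal_covering(demand, coverage, p):
    """Enumerate all p-subsets of candidate sites and return the subset covering
    the most demand. coverage[j] is the set of demand points site j can serve."""
    sites = range(len(coverage))
    best_sites, best_value = None, -1
    for chosen in combinations(sites, p):
        covered = set().union(*(coverage[j] for j in chosen))
        value = sum(demand[i] for i in covered)
        if value > best_value:
            best_sites, best_value = chosen, value
    return best_sites, best_value

demand = [10, 20, 30, 40]            # population at each demand point
coverage = [{0, 1}, {1, 2}, {2, 3}]  # demand points reachable from each site
best = maximal_covering(demand, coverage, p=2)
```

Enumeration is exact but explodes combinatorially, which is precisely why the paper turns to a binary metaheuristic for the real eight-governorate instance.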
Commercial airline companies continuously seek strategies for minimizing fuel costs on their flight routes, as acquiring jet fuel represents a significant part of the operating and managing expenses of airline activities. A nonlinear mixed binary mathematical programming model for the airline fuel task is presented to minimize the total cost of refueling over an entire flight route. The model is enhanced to include possible discounts in fuel prices, handled either by adding dummy variables and some restrictive constraints or by fitting a suitable distribution function that relates prices to purchased quantities. The obtained fuel plan specifies exactly the amounts of fuel, in gallons, to be purchased from each airport under a tankering strategy while minimizing the pertinent cost of the whole flight route. The relation between the amount of extra fuel burnt under the tankering strategy and the total flight time is also considered. A case study is introduced for a certain flight rotation on a domestic US air transport route, and the mathematical model including stepped discounted fuel prices is formulated. The problem has a stochastic nature, as the total flight time is a random variable; this stochastic treatment is realistic and more appropriate than the deterministic case. The stochastic character of the problem is simulated by introducing a suitable probability distribution for the flight time duration and generating a sufficient number of runs to mimic the probabilistic real situation. Many similar real application problems are modelled as nonlinear mixed binary ones that are difficult to handle by exact methods; therefore, metaheuristic approaches are widely used for such optimization tasks. In this paper, a gaining-sharing knowledge-based procedure is used to handle the mathematical model. The algorithm is based on the process of gaining and sharing knowledge throughout the human lifetime. The generated simulation runs of the example are solved using the proposed algorithm, and the resulting output distributions for the optimum fuel amounts purchased from each airport and for the total cost are obtained.
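The stochastic flight time can be simulated by sampling durations from an assumed distribution and evaluating the fuel cost per run. A minimal Monte Carlo sketch (the triangular distribution, burn model, and cost figures are illustrative assumptions, not the paper's data):

```python
import random

def simulate_fuel_cost(runs, base_gallons, burn_per_hour, price, seed=0):
    """Sample flight durations and return the per-run fuel costs."""
    rng = random.Random(seed)
    costs = []
    for _ in range(runs):
        hours = rng.triangular(1.5, 3.0, 2.0)   # low, high, mode (assumed)
        gallons = base_gallons + burn_per_hour * hours
        costs.append(gallons * price)
    return costs

costs = simulate_fuel_cost(runs=1000, base_gallons=500.0,
                           burn_per_hour=800.0, price=3.2)
mean_cost = sum(costs) / len(costs)
```

Each run yields one cost realization; solving the refueling model once per run, as the paper does, turns the optimum purchase plan and total cost into output distributions rather than single numbers.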
COVID-19 is a growing problem worldwide with a high mortality rate. As a result, the World Health Organization (WHO) declared it a pandemic. In order to limit the spread of the disease, a fast and accurate diagnosis is required. A reverse transcription polymerase chain reaction (RT-PCR) test is often used to detect the disease. However, since this test is time-consuming, a chest computed tomography (CT) scan or plain chest X-ray (CXR) is sometimes indicated. The value of automated diagnosis is that it saves time and money by minimizing human effort. Our research makes three significant contributions. First, it uses an essential fine-tuning methodology to test the behavior and efficiency of a variety of vision models, ranging from Inception to Neural Architecture Search (NAS) networks. Second, the behavior of these models is visually analyzed by plotting class activation maps (CAMs) for individual networks and assessing classification efficiency with AUC-ROC curves. Finally, stacked ensemble techniques are used to provide greater generalization by combining the fine-tuned models into six ensemble neural networks. Using stacked ensembles, the generalization of the models improved; furthermore, the ensemble model created by combining all of the fine-tuned networks obtained a state-of-the-art COVID-19 detection accuracy of 99.17%. The precision and recall rates were 99.99% and 89.79%, respectively, highlighting the robustness of stacked ensembles. According to the experimental results, the proposed ensemble approach performed well in the classification of COVID-19 lesions on CXR.
Recent years have witnessed a great deal of interest in artificial intelligence (AI) tools in the area of optimization. AI has developed a large number of tools to solve the most difficult search-and-optimization problems in computer science and operations research; indeed, metaheuristic-based algorithms are a sub-field of AI. This study presents the use of a metaheuristic algorithm, the water cycle algorithm (WCA), in the transportation problem. A stochastic transportation problem is considered in which the supply and demand parameters are random variables that follow the Weibull distribution. Since the parameters are stochastic, the corresponding constraints are probabilistic; they are converted into deterministic constraints using the stochastic programming approach. In this study, we propose evolutionary algorithms to handle the difficulties of complex high-dimensional optimization problems. WCA is inspired by the water cycle process, i.e., how streams and rivers flow toward the sea (the optimal solution). WCA is applied to the stochastic transportation problem, and the obtained results are compared with those of a new metaheuristic optimization algorithm, namely the neural network algorithm, which is inspired by the biological nervous system. It is concluded that WCA presents better results when compared with the neural network algorithm.
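Converting a probabilistic constraint with Weibull-distributed supply into a deterministic one uses the Weibull quantile function: requiring P(supply >= shipped) >= beta means the shipped amount must not exceed the (1 - beta) quantile of the supply distribution. A sketch (the shape/scale values and the 95% reliability level are invented for illustration):

```python
import math

def weibull_quantile(p, shape, scale):
    """Inverse CDF of the Weibull distribution: scale * (-ln(1-p))^(1/shape)."""
    return scale * (-math.log(1.0 - p)) ** (1.0 / shape)

def deterministic_supply_bound(beta, shape, scale):
    """Largest shipment x with P(supply >= x) >= beta, i.e. the (1-beta) quantile."""
    return weibull_quantile(1.0 - beta, shape, scale)

# With 95% reliability, at most this much can be shipped from the source:
bound = deterministic_supply_bound(beta=0.95, shape=2.0, scale=100.0)
```

Once each probabilistic supply and demand constraint is replaced by its quantile bound, the problem becomes an ordinary deterministic transportation problem that WCA can search directly.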
Efficient decision-making remains an open challenge in the research community, and many researchers are working to improve accuracy through the use of various computational techniques. In this context, the fuzzification and defuzzification processes can be very useful. Defuzzification is an effective process for obtaining a single number from the output of a fuzzy set. With defuzzification as its centerpiece, this paper analyzes and seeks to understand the effect of different types of vehicles according to their performance. The multi-criteria decision-making (MCDM) process under uncertainty and defuzzification is discussed using the center of area (COA), or centroid, method. Further, to find the best solution, the Hurwicz criterion is applied to the defuzzified data. A new decision-making technique is proposed using the Hurwicz criterion for triangular and trapezoidal fuzzy numbers. The proposed technique considers all types of decision-makers' perspectives, such as optimistic, neutral, and pessimistic, which is crucial in solving decision-making problems. A simple case study is used to demonstrate and discuss the centroid method and the Hurwicz criterion for measuring risk attitudes among decision-makers. The significance of the proposed defuzzification method is demonstrated by comparing it to previous defuzzification procedures and through its application.
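The two building blocks above compose naturally: COA defuzzification collapses each fuzzy payoff to a crisp number, and the Hurwicz criterion then blends each alternative's best and worst crisp outcomes with an optimism coefficient alpha. A minimal sketch (the triangular payoffs, the alternative names, and alpha = 0.6 are invented for illustration):

```python
def centroid_triangular(a, b, c):
    """Centroid (COA) defuzzification of a triangular fuzzy number (a, b, c)."""
    return (a + b + c) / 3.0

def hurwicz(values, alpha):
    """Hurwicz criterion: alpha * best outcome + (1 - alpha) * worst outcome."""
    return alpha * max(values) + (1.0 - alpha) * min(values)

# Crisp payoffs of two alternatives under two states, defuzzified by COA:
payoffs = {"vehicle_A": [centroid_triangular(2, 4, 6), centroid_triangular(3, 5, 7)],
           "vehicle_B": [centroid_triangular(1, 5, 9), centroid_triangular(0, 2, 4)]}
best = max(payoffs, key=lambda k: hurwicz(payoffs[k], alpha=0.6))
```

Setting alpha = 1 recovers the purely optimistic (maximax) decision maker, alpha = 0 the pessimistic (maximin) one, and intermediate values the neutral attitudes the paper accounts for.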
Unmanned Aerial Vehicles (UAVs) provide a reliable and energy-efficient solution for data collection from Narrowband Internet of Things (NB-IoT) devices. However, optimizing the UAV's deployment, including the locations of its stop points, is necessary to minimize the energy consumption of both the UAV and the NB-IoT devices and to conduct the data collection efficiently. In this regard, this paper proposes the Gaining-Sharing Knowledge (GSK) algorithm for optimizing the UAV's deployment. In GSK, the UAV's stop points in three-dimensional space are encapsulated into a single individual with a fixed length representing an entire deployment. The superiority of using GSK for the tackled problem is verified by simulation in seven scenarios; it provides significant results in all seven compared with four other optimization algorithms previously applied to the same problem. In addition, NB-IoT is proposed as the wireless communication technology between the UAV and the IoT devices.
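Encapsulating all stop points into a single fixed-length individual amounts to flattening the 3-D coordinates into one vector the optimizer can manipulate, then unflattening it to evaluate a deployment. A minimal encode/decode sketch (the coordinate values are illustrative; the paper's fitness evaluation is not reproduced here):

```python
def encode(stop_points):
    """Flatten a list of (x, y, z) stop points into one fixed-length individual."""
    return [coord for point in stop_points for coord in point]

def decode(individual):
    """Recover the (x, y, z) stop points from a flat individual."""
    return [tuple(individual[i:i + 3]) for i in range(0, len(individual), 3)]

stops = [(10.0, 20.0, 50.0), (30.0, 5.0, 60.0)]
individual = encode(stops)   # one flat vector of length 3 * number_of_stops
```

With this representation, a whole deployment is a single point in a 3n-dimensional search space, so standard continuous GSK updates apply without any problem-specific operators.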
This paper presents a novel application of metaheuristic algorithms for solving stochastic programming problems using a recently developed gaining-sharing knowledge based optimization (GSK) algorithm. The algorithm is based on human behavior in which people gain and share their knowledge with others. Different types of stochastic fractional programming problems are considered in this study. The augmented Lagrangian method (ALM) is used to handle these constrained optimization problems by converting them into unconstrained optimization problems. Three examples from the literature are considered and transformed into their deterministic form using the chance-constrained technique. The transformed problems are solved using the GSK algorithm and the results are compared with eight other state-of-the-art metaheuristic algorithms. The obtained results are also compared with the optimal global solution and the results quoted in the literature. To investigate the performance of the GSK algorithm on a real-world problem, a solid stochastic fixed charge transportation problem is examined, in which the parameters of the problem are considered as random variables. The obtained results show that the GSK algorithm outperforms other algorithms in terms of convergence, robustness, computational time, and quality of obtained solutions.
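The gaining-sharing idea, several of these abstracts rely on, can be caricatured in a few lines: each individual moves using differences between better- and worse-ranked peers, and an update is accepted only if it improves fitness. The sketch below is a deliberately simplified, junior-style single-stage update on the sphere function; it is an illustration of the knowledge-sharing mechanism, not the full two-stage GSK of the papers:

```python
import random

def sphere(x):
    return sum(v * v for v in x)

def gsk_simplified(fitness, dim, pop_size=20, iters=200, kf=0.5, seed=1):
    """Highly simplified gaining-sharing loop with greedy acceptance."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(iters):
        pop.sort(key=fitness)                    # rank the population
        new_pop = [pop[0]]                       # keep the current best
        for i in range(1, pop_size):
            better = pop[rng.randrange(0, i)]    # someone ranked above i
            worse = pop[rng.randrange(i, pop_size)]
            candidate = [x + kf * (b - w) + kf * (pop[0][d] - x)
                         for d, (x, b, w) in enumerate(zip(pop[i], better, worse))]
            new_pop.append(min(pop[i], candidate, key=fitness))  # greedy acceptance
        pop = new_pop
    return min(pop, key=fitness)

best = gsk_simplified(sphere, dim=3)
```

Greedy acceptance makes the best fitness monotonically non-increasing, so the loop can only improve on the best random starting point; the real algorithm adds a senior stage and adaptive dimension-wise mixing on top of this skeleton.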
The optimum delivery of safeguarding substances is a major part of supply chain management and a crucial issue in mitigation against the outbreak of pandemics. A problem arises for a decision maker who wants to optimally choose a subset of candidate consumers to maximize the distributed quantities of the needed safeguarding substances within a specific time period. A nonlinear binary mathematical programming model for the problem is formulated. The decision variables are binary ones that represent whether to choose a specific consumer, and design constraints are formulated to keep track of the chosen route. To better illustrate the problem, objective, and problem constraints, a real application case study is presented. The case study involves the optimum delivery of safeguarding substances to several hospitals in the Al-Gharbia Governorate in Egypt. The hospitals are selected to represent the consumers of safeguarding substances, as they are the first crucial frontline for mitigation against a pandemic outbreak. A distribution truck is used to distribute the substances from the main store to the hospitals in specified required quantities during a given working shift. The objective function is formulated in order to maximize the total amount of delivered quantities during the specified time period. The case study is solved using a novel Discrete Binary Gaining-Sharing Knowledge-based Optimization algorithm (DBGSK), which involves two main stages: discrete binary junior and senior gaining and sharing stages. DBGSK has the ability of finding the solutions of the introduced problem, and the obtained results demonstrate robustness and convergence toward the optimal solutions.
Funding: Researchers Supporting Program at King Saud University (RSPD2024R809).
Abstract: Hybridizing metaheuristic algorithms involves synergistically combining different optimization techniques to effectively address complex and challenging optimization problems. This approach aims to leverage the strengths of multiple algorithms, enhancing solution quality, convergence speed, and robustness, thereby offering a more versatile and efficient means of solving intricate real-world optimization tasks. In this paper, we introduce a hybrid algorithm that amalgamates three distinct metaheuristics: the Beluga Whale Optimization (BWO), the Honey Badger Algorithm (HBA), and the Jellyfish Search (JS) optimizer. The proposed hybrid algorithm is referred to as BHJO. Through this fusion, the BHJO algorithm aims to leverage the strengths of each optimizer. Before this hybridization, we thoroughly examined the exploration and exploitation capabilities of the BWO, HBA, and JS metaheuristics, as well as their ability to strike a balance between the two. This analysis allowed us to identify the pros and cons of each algorithm, enabling us to combine them in a novel hybrid approach that capitalizes on their respective strengths for enhanced optimization performance. In addition, the BHJO algorithm incorporates Opposition-Based Learning (OBL), leveraging its diverse exploration, accelerated convergence, and improved solution quality to enhance the overall performance and effectiveness of the hybrid algorithm. Moreover, the performance of the BHJO algorithm was evaluated across a range of both unconstrained and constrained optimization problems, providing a comprehensive assessment of its efficacy and applicability in diverse problem domains. The BHJO algorithm was also subjected to a comparative analysis with several renowned algorithms, with mean and standard deviation values used as evaluation metrics. This rigorous comparison aimed to assess the performance of the BHJO algorithm relative to its counterparts, shedding light on its effectiveness and reliability in solving optimization problems. Finally, the obtained numerical statistics underwent rigorous analysis using the Friedman test with post hoc Dunn's test. The resulting values revealed the BHJO algorithm's competitiveness in tackling intricate optimization problems, affirming its capability to deliver favorable outcomes in challenging scenarios.
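The Opposition-Based Learning step mentioned above can be illustrated with a minimal sketch. The abstract does not specify which OBL variant BHJO uses, so the function names and the point-versus-opposite selection rule below are assumptions reflecting the most common formulation, where the opposite of x in [lb, ub] is lb + ub − x:

```python
import random

def opposite(x, lb, ub):
    """Opposite point of x within bounds [lb, ub], element-wise: lb + ub - x."""
    return [l + u - xi for xi, l, u in zip(x, lb, ub)]

def obl_init(pop_size, lb, ub, fitness):
    """Initialize a random population and keep the fitter member of each
    (point, opposite) pair -- the standard OBL initialization idea."""
    pop = [[random.uniform(l, u) for l, u in zip(lb, ub)]
           for _ in range(pop_size)]
    refined = []
    for x in pop:
        xo = opposite(x, lb, ub)
        refined.append(min(x, xo, key=fitness))  # keep the better of the pair
    return refined
```

The same opposite-point construction can also be applied during iterations to diversify the search, which is one way OBL accelerates convergence.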
Funding: Funded by the Researchers Supporting Program at King Saud University (RSPD2023R809).
Abstract: Geopolymer concrete emerges as a promising avenue for sustainable development and offers an effective solution to environmental problems. Its attributes as a non-toxic, low-carbon, and economical substitute for conventional cement concrete, coupled with its elevated compressive strength and reduced shrinkage, position it as a pivotal material for diverse applications spanning from architectural structures to transportation infrastructure. In this context, this study sets out to use machine learning (ML) algorithms to increase the accuracy and interpretability of predicting the compressive strength of geopolymer concrete in the civil engineering field. To achieve this goal, a new approach using convolutional neural networks (CNNs) has been adopted. This study focuses on creating a comprehensive dataset consisting of compositional and strength parameters of 162 geopolymer concrete mixes, all containing Class F fly ash. The selection of optimal input parameters is guided by two distinct criteria. The first criterion leverages insights garnered from previous research on the influence of individual features on compressive strength. The second criterion scrutinizes the impact of these features within the model's predictive framework. Key to enhancing the CNN model's performance is the careful determination of the optimal hyperparameters. Through a systematic trial-and-error process, the study ascertains the ideal number of epochs for data division and the optimal value of k for k-fold cross-validation, a technique vital to the model's robustness. The model's predictive power is rigorously assessed via a suite of performance metrics and comprehensive score analyses. Furthermore, the model's adaptability is gauged by integrating a secondary dataset into its predictive framework, facilitating a comparative evaluation against conventional prediction methods. To unravel the intricacies of the CNN model's learning trajectory, a loss plot is deployed to elucidate its learning rate, and bivariate plots unveil nuanced trends and interactions among variables, reinforcing consistency with earlier research. The findings show that the CNN model accurately predicts the compressive strength of geopolymer concrete, with prediction accuracy promising enough to guide the development of new geopolymer concrete mixes. The outcomes not only underscore the significance of leveraging technology for sustainable construction practices but also pave the way for innovation and efficiency in the field of civil engineering, reinforcing geopolymer concrete's role as an eco-conscious and robust construction material.
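The k-fold cross-validation technique the study relies on can be sketched in a few lines. The split below is a generic illustration, not the study's exact data-division procedure (the abstract does not state whether folds were shuffled or stratified):

```python
def kfold_indices(n, k):
    """Split indices 0..n-1 into k roughly equal contiguous folds."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def kfold_splits(n, k):
    """Yield (train_idx, test_idx) pairs: each fold is held out once
    while the remaining k-1 folds form the training set."""
    folds = kfold_indices(n, k)
    for i, test in enumerate(folds):
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test
```

Averaging the metric over the k held-out folds is what gives the robustness the abstract refers to: every sample is used for validation exactly once.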
Funding: Funded by the Researchers Supporting Program at King Saud University (RSPD2023R809).
Abstract: Pupil dynamics are important characteristics for face spoofing detection. The face recognition system is one of the most widely used biometrics for authenticating individual identity. The main threats to a facial recognition system are presentation attacks such as print attacks, 3D mask attacks, and replay attacks. The proposed model uses pupil characteristics for liveness detection during the authentication process. The pupillary light reflex is an involuntary reaction controlling the pupil's diameter at different light intensities. The proposed framework consists of a two-phase methodology. In the first phase, the pupil's diameter is calculated by applying a stimulus (light) to one eye of the subject and measuring the constriction of the pupil size in both eyes across different video frames. This measurement is converted into a feature space using the parameters defined by the Kohn and Clynes model. A Support Vector Machine is used to classify subjects as legitimate when the diameter change is normal (i.e., the eye is alive) or illegitimate when there is no change or there are abnormal oscillations in pupil behavior due to a printed photograph, video, or 3D mask of the subject in front of the camera. In the second phase, we perform the facial recognition process. The scale-invariant feature transform (SIFT) is used to extract features from the facial images, each feature being a 128-dimensional vector. These features are scale, rotation, and orientation invariant and are used for recognizing facial images. A brute-force matching algorithm is used to match features between two different images, with a threshold value of 0.08 for good matches. To analyze the performance of the framework, we tested our model on two face anti-spoofing datasets, the Replay Attack and CASIA-SURF datasets, chosen because each sample contains videos of the subjects in three modalities (RGB, IR, Depth). The CASIA-SURF dataset showed an 89.9% Equal Error Rate, while the Replay Attack dataset showed a 92.1% Equal Error Rate.
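The brute-force matching step described above can be sketched generically. The 0.08 threshold is taken from the abstract, but how it is applied (raw Euclidean distance on normalized descriptors, as assumed here, versus a Lowe-style ratio test) is an assumption, and the function names are illustrative:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def brute_force_match(desc_a, desc_b, threshold=0.08):
    """For each descriptor in desc_a, find its nearest neighbour in desc_b
    and keep the pair only if the distance falls below the threshold."""
    matches = []
    for i, da in enumerate(desc_a):
        best_d, best_j = min((euclidean(da, db), j)
                             for j, db in enumerate(desc_b))
        if best_d < threshold:
            matches.append((i, best_j, best_d))
    return matches
```

In the full pipeline the descriptors would be 128-dimensional SIFT vectors; two-dimensional vectors are used here only to keep the sketch readable.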
Funding: The authors thank King Saud University, Riyadh, Saudi Arabia, for funding the publication of this research through the Researchers Supporting Program (RSPD2023R809).
Abstract: The intuitionistic fuzzy set has found important applications in decision-making and machine learning. To enrich and utilize it, this study designed and developed a deep neural network-based glaucoma eye detection method using fuzzy difference equations in the domain where the retinal images converge. Retinal image detections are categorized as normal eye recognition, suspected glaucomatous eye recognition, and glaucomatous eye recognition. Fuzzy degrees associated with weighted values are calculated to determine the level of concentration between the fuzzy partition and the retinal images. The proposed model diagnoses glaucoma from retinal images by utilizing a Convolutional Neural Network (CNN) and deep learning to identify the fuzzy weighted regularization between images. This methodology is used to clarify the input images and make them adequate for the glaucoma detection process. The objective of this study was to propose a novel approach to the early diagnosis of glaucoma using a Fuzzy Expert System (FES) and fuzzy difference equations (FDE). The intensities of the different regions in the images and their respective peak levels were determined. Once the peak regions were identified, the recurrence relationships among those peaks were measured. Image partitioning was performed according to the varying degrees of similar and dissimilar concentrations in the image. Similar and dissimilar concentration levels and spatial frequency generated a threshold image from the combined fuzzy matrix and FDE. This distinguished between normal and abnormal eye conditions, thus detecting patients with glaucomatous eyes.
Funding: Funded by the Researchers Supporting Program at King Saud University (RSPD2024R809).
Abstract: Leukemia is a form of cancer of the blood or bone marrow. A person with leukemia has an expansion of white blood cells (WBCs). It primarily affects children and rarely affects adults. Treatment depends on the type of leukemia and the extent to which the cancer has spread throughout the body. Identifying leukemia at an initial stage is vital to providing timely patient care. Medical image-analysis-based approaches offer safer, quicker, and less costly solutions while avoiding the difficulties of invasive procedures. Computer vision (CV)-based and image-processing techniques are straightforward to generalize and can eradicate human error. Many researchers have implemented computer-aided diagnostic methods and machine learning (ML) for laboratory image analysis, hoping to overcome the limitations of late leukemia detection and to determine its subgroups. This study establishes a Marine Predators Algorithm with Deep Learning Leukemia Cancer Classification (MPADL-LCC) algorithm on medical images. The projected MPADL-LCC system uses a bilateral filtering (BF) technique to pre-process medical images. It uses Faster SqueezeNet for feature extraction, with the Marine Predators Algorithm (MPA) as a hyperparameter optimizer. Lastly, a denoising autoencoder (DAE) methodology is executed to accurately detect and classify leukemia cancer. The hyperparameter tuning process using MPA helps enhance leukemia cancer classification performance. Simulation results are compared with other recent approaches across various measurements, and the MPADL-LCC algorithm exhibits the best results.
Abstract: Machine learning (ML) has taken the world by storm with its prevalent applications in automating ordinary tasks and extracting valuable insights throughout scientific research and design. ML is a broad area within artificial intelligence (AI) that focuses on obtaining valuable information from data, which explains why ML has often been linked to statistics and data science. An advanced meta-heuristic optimization algorithm is proposed in this work for the antenna architecture design optimization problem. The algorithm is designed as a hybrid of the Sine Cosine Algorithm (SCA) and the Grey Wolf Optimizer (GWO), used to train a neural network-based Multilayer Perceptron (MLP). The proposed optimization algorithm is a practical, versatile, and trustworthy platform for recognizing the design parameters of a double T-shaped monopole antenna in an optimal way. The proposed algorithm is likewise supported by a comparative and statistical analysis using different curves in addition to ANOVA and t-tests, offering a superiority and stability evaluation of the predicted results to verify the procedures' accuracy.
Abstract: Metamaterial antennas are a special class of antennas that use metamaterials to enhance their performance. Antenna size affects the quality factor and the radiation loss of the antenna. Metamaterial antennas can overcome the bandwidth limitation of small antennas. Machine learning (ML) models have recently been applied to predict antenna parameters, offering an alternative to the trial-and-error process of finding proper parameters for the simulated antenna. The accuracy of the prediction depends mainly on the selected model. Ensemble models combine two or more base models to produce a better, enhanced model. In this paper, a weighted average ensemble model is proposed to predict the bandwidth of a metamaterial antenna. Two base models are used, namely a Multilayer Perceptron (MLP) and Support Vector Machines (SVM). To calculate the weight of each model, an optimization algorithm is used to find the optimal weights of the ensemble: the Dynamic Group-Based Cooperative Optimizer (DGCO) is employed to search for the optimal weights of the base models. The proposed model is compared with three base models and the average ensemble model. The results show that the proposed model outperforms the other models and can predict antenna bandwidth efficiently.
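The weighted average ensemble described above reduces to a simple computation once the weights are fixed; the optimizer's job is to pick the weights that minimize an error objective such as MSE. The sketch below shows only that combination step and the objective, not the DGCO search itself; the function names are illustrative:

```python
def weighted_ensemble(predictions, weights):
    """Weighted average of base-model predictions.
    predictions: one list of predicted values per base model."""
    total = sum(weights)
    norm = [w / total for w in weights]  # normalize weights to sum to 1
    n = len(predictions[0])
    return [sum(w * preds[i] for w, preds in zip(norm, predictions))
            for i in range(n)]

def ensemble_mse(weights, predictions, targets):
    """Objective an optimizer (e.g., DGCO) would minimize over the weights."""
    combined = weighted_ensemble(predictions, weights)
    return sum((c - t) ** 2 for c, t in zip(combined, targets)) / len(targets)
```

Setting all weights equal recovers the plain average ensemble used as a baseline in the comparison; the optimized weights generalize it.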
Funding: The research is funded by the Researchers Supporting Project at King Saud University (Project #RSP-2021/305).
Abstract: Graphs are used in various disciplines such as telecommunications, biological networks, and social networks. In large-scale networks, it is challenging to detect communities by learning the distinct properties of the graph. As deep learning has made contributions in a variety of domains, we try to use deep learning techniques to mine knowledge from large-scale graph networks. In this paper, we aim to provide a strategy for detecting communities using deep autoencoders and to obtain generic neural attention to graphs. The advantages of neural attention are widely seen in the fields of NLP and computer vision, and it has low computational complexity for large-scale graphs. The contributions of the paper are summarized as follows. Firstly, a transformer is utilized to downsample the first-order proximities of the graph into a latent space, which captures the structural properties and eventually assists in detecting the communities. Secondly, a fine-tuning task is conducted by tuning variant hyperparameters cautiously, which is applied to multiple social networks (Facebook and Twitch). Furthermore, the objective function (cross-entropy) is tuned with L0 regularization. Lastly, the reconstructed model forms communities that present the relationships between the groups. The proposed robust model provides good generalization and is applicable to obtaining not only the community structures in social networks but also node classification. The proposed graph-transformer shows advanced performance on social networks, with average NMIs of 0.67±0.04, 0.198±0.02, 0.228±0.02, and 0.68±0.03 on the Wikipedia crocodiles, Github Developers, Twitch England, and Facebook Page-Page networks, respectively.
Funding: Funded by the Deanship of Scientific Research, King Saud University, through the Vice Deanship of Scientific Research.
Abstract: Determining the optimum location of facilities is critical in many fields, particularly in healthcare. This study proposes the application of a suitable location model for field hospitals during the novel coronavirus 2019 (COVID-19) pandemic. The used model is the most appropriate among the three most common location models utilized to solve healthcare problems (the set covering model, the maximal covering model, and the P-median model). The proposed nonlinear binary constrained model is a slight modification of the maximal covering model with a set of nonlinear constraints. The model is used to determine the optimum locations of field hospitals for COVID-19 risk reduction. The designed mathematical model and the solution method are used to deploy field hospitals in eight governorates in Upper Egypt. For this case study, a discrete binary gaining-sharing knowledge-based optimization (DBGSK) algorithm is proposed. The DBGSK algorithm is based on how humans acquire and share knowledge throughout their lives. It mainly depends on two binary stages, junior and senior, which enable DBGSK to explore and exploit the search space efficiently and effectively, and thus solve problems in binary space.
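The core of the maximal covering model the study modifies can be stated compactly: choose p facility sites to maximize the total demand weight covered by at least one chosen site. A brute-force sketch for tiny instances is shown below (the paper's actual model adds nonlinear constraints and is solved by DBGSK rather than enumeration; the names and data layout here are assumptions for illustration):

```python
from itertools import combinations

def maximal_covering(demand, coverage, p):
    """Brute-force maximal covering: pick p candidate sites maximizing
    covered demand.
    demand[i]   -- weight of demand point i
    coverage[j] -- set of demand points covered by candidate site j
    """
    best_sites, best_value = None, -1
    for sites in combinations(range(len(coverage)), p):
        covered = set().union(*(coverage[j] for j in sites))
        value = sum(demand[i] for i in covered)
        if value > best_value:
            best_sites, best_value = sites, value
    return best_sites, best_value
```

Enumeration is only feasible for toy instances; for realistic instance sizes the combinatorial explosion is exactly what motivates a metaheuristic such as DBGSK.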
Funding: The research is funded by the Deanship of Scientific Research at King Saud University, research group number RG-1436-040.
Abstract: Commercial airline companies continuously seek strategies for minimizing the fuel costs of their flight routes, as acquiring jet fuel represents a significant part of the operating and managing expenses of airline activities. A nonlinear mixed binary mathematical programming model for the airline fuel task is presented to minimize the total cost of refueling over an entire flight route. The model is enhanced to include possible discounts in fuel prices, which are handled by adding dummy variables and some restrictive constraints, or by fitting a suitable distribution function that relates prices to purchased quantities. The obtained fuel plan specifies exactly the amounts of fuel in gallons to be purchased from each airport, considering the tankering strategy, while minimizing the pertinent cost of the whole flight route. The relation between the amount of extra fuel burnt under the tankering strategy and the total flight time is also considered. A case study is introduced for a certain flight rotation on a domestic US air transport route, and the mathematical model including stepped discounted fuel prices is formulated. The problem has a stochastic nature, as the total flight time is a random variable; this stochastic treatment is realistic and more appropriate than the deterministic case. The stochastic nature of the problem is simulated by introducing a suitable probability distribution for the flight time duration and generating a sufficient number of runs to mimic the probabilistic real situation. Many similar real application problems are modeled as nonlinear mixed binary problems that are difficult to handle by exact methods; therefore, metaheuristic approaches are widely used for such optimization tasks. In this paper, a gaining-sharing knowledge-based procedure is used to handle the mathematical model. The algorithm is based on the process of gaining and sharing knowledge throughout the human lifetime. The generated simulation runs of the example are solved using the proposed algorithm, and the resulting distribution outputs for the optimum purchased fuel amounts from each airport and for the total cost are obtained.
Funding: The research is funded by the Researchers Supporting Project at King Saud University (Project #RSP-2021/305).
Abstract: COVID-19 is a growing problem worldwide with a high mortality rate. As a result, the World Health Organization (WHO) declared it a pandemic. To limit the spread of the disease, a fast and accurate diagnosis is required. A reverse transcription polymerase chain reaction (RT-PCR) test is often used to detect the disease. However, since this test is time-consuming, a chest computed tomography (CT) scan or plain chest X-ray (CXR) is sometimes indicated. The value of automated diagnosis is that it saves time and money by minimizing human effort. Our research makes three significant contributions. Its initial purpose is to use an essential fine-tuning methodology to test the behavior and efficiency of a variety of vision models, ranging from Inception to Neural Architecture Search (NAS) networks. Second, the behavior of these models is visually analyzed by plotting class activation maps (CAMs) for individual networks and assessing classification efficiency with AUC-ROC curves. Finally, stacked ensemble techniques are used to provide greater generalization by combining the fine-tuned models into six ensemble neural networks. Using stacked ensembles, the generalization of the models improved. Furthermore, the ensemble model created by combining all of the fine-tuned networks obtained a state-of-the-art COVID-19 detection accuracy of 99.17%. The precision and recall rates were 99.99% and 89.79%, respectively, highlighting the robustness of stacked ensembles. According to the experimental results, the proposed ensemble approach performed well in the classification of COVID-19 lesions on CXR.
Funding: This work was funded by the Deanship of Scientific Research at King Saud University through research group number RG-1436-040.
Abstract: Recent years have witnessed a great deal of interest in artificial intelligence (AI) tools in the area of optimization. AI has developed a large number of tools to solve the most difficult search-and-optimization problems in computer science and operations research; indeed, metaheuristic-based algorithms are a sub-field of AI. This study presents the use of a metaheuristic algorithm, the water cycle algorithm (WCA), for the transportation problem. A stochastic transportation problem is considered in which the supply and demand parameters are random variables that follow the Weibull distribution. Since the parameters are stochastic, the corresponding constraints are probabilistic; they are converted into deterministic constraints using the stochastic programming approach. In this study, we propose evolutionary algorithms to handle the difficulties of complex high-dimensional optimization problems. WCA is inspired by the water cycle process, in which streams and rivers flow toward the sea (the optimal solution). WCA is applied to the stochastic transportation problem, and the obtained results are compared with those of a new metaheuristic optimization algorithm, the neural network algorithm, which is inspired by the biological nervous system. It is concluded that WCA presents better results than the neural network algorithm.
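The conversion of a probabilistic constraint into a deterministic one typically replaces the random parameter with a quantile of its distribution. As a hedged illustration only (the paper's exact chance-constraint formulation is not given in the abstract, and the direction of the inequality depends on whether the constraint concerns supply or demand), a supply-side constraint P(a >= x) >= p becomes x <= F^{-1}(1 - p), computable in closed form for the Weibull distribution:

```python
import math

def weibull_quantile(q, shape, scale):
    """Inverse CDF of the Weibull distribution:
    F^{-1}(q) = scale * (-ln(1 - q))^(1/shape)."""
    return scale * (-math.log(1.0 - q)) ** (1.0 / shape)

def deterministic_supply_bound(p, shape, scale):
    """P(a >= x) >= p  <=>  x <= F^{-1}(1 - p): the probabilistic supply
    constraint becomes a deterministic upper bound on the shipped quantity."""
    return weibull_quantile(1.0 - p, shape, scale)
```

Once every probabilistic constraint has been replaced by such a bound, the transportation problem is deterministic and WCA can be applied directly.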
Funding: The Research Center for Advanced Materials Science (RCAMS) at King Khalid University, Saudi Arabia, funded this work under Grant Number RCAMS/KKU/019-20.
Abstract: Efficient decision-making remains an open challenge in the research community, and many researchers are working to improve accuracy through the use of various computational techniques. In this context, the fuzzification and defuzzification processes can be very useful. Defuzzification is an effective process for obtaining a single number from the output of a fuzzy set. With defuzzification as its central focus, this paper analyzes the effect of different types of vehicles according to their performance. The multi-criteria decision-making (MCDM) process under uncertainty and defuzzification is discussed using the center of area (COA), or centroid, method. Further, to find the best solution, the Hurwicz criterion is applied to the defuzzified data. A new decision-making technique is proposed using the Hurwicz criterion for triangular and trapezoidal fuzzy numbers. The proposed technique considers all types of decision-makers' perspectives, such as optimistic, neutral, and pessimistic, which is crucial in solving decision-making problems. A simple case study is used to demonstrate and discuss the centroid method and the Hurwicz criterion for measuring risk attitudes among decision-makers. The significance of the proposed defuzzification method is demonstrated by comparing it to previous defuzzification procedures in its application.
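The two building blocks named in the abstract have standard closed forms, sketched below. These are the textbook centroid (COA) formulas for triangular and trapezoidal fuzzy numbers and the classical Hurwicz blend; the paper's proposed technique combines them in ways the abstract does not detail, so this is only the common baseline:

```python
def centroid_triangular(a, b, c):
    """Centroid (COA) defuzzification of a triangular fuzzy number (a, b, c)."""
    return (a + b + c) / 3.0

def centroid_trapezoidal(a, b, c, d):
    """Centroid defuzzification of a trapezoidal fuzzy number (a, b, c, d),
    with support [a, d] and plateau [b, c]."""
    num = (d ** 2 + c ** 2 + c * d) - (a ** 2 + b ** 2 + a * b)
    den = 3.0 * ((c + d) - (a + b))
    return num / den

def hurwicz(values, alpha):
    """Hurwicz criterion: alpha-weighted blend of the best and worst outcomes.
    alpha = 1 is fully optimistic, alpha = 0 fully pessimistic."""
    return alpha * max(values) + (1.0 - alpha) * min(values)
```

For example, defuzzifying each alternative's fuzzy payoffs with a centroid and then ranking the crisp values with hurwicz(..., alpha) captures the optimistic/neutral/pessimistic perspectives via the choice of alpha.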
Abstract: Unmanned Aerial Vehicles (UAVs) provide a reliable and energy-efficient solution for data collection from Narrowband Internet of Things (NB-IoT) devices. However, optimizing the UAV's deployment, including the locations of its stop points, is necessary to minimize the energy consumption of both the UAV and the NB-IoT devices and to conduct the data collection efficiently. In this regard, this paper proposes the Gaining-Sharing Knowledge (GSK) algorithm for optimizing the UAV's deployment. In GSK, the UAV's stop points in three-dimensional space are encapsulated into a single fixed-length individual representing an entire deployment. The superiority of using GSK on the tackled problem is verified by simulation in seven scenarios, where it provides significant results compared with four other optimization algorithms previously used on the same problem. In addition, NB-IoT is proposed as the wireless communication technology between the UAV and the IoT devices.
Funding: The research is funded by the Researchers Supporting Program at King Saud University (Project #RSP-2021/305).
Abstract: This paper presents a novel application of metaheuristic algorithms for solving stochastic programming problems using the recently developed gaining-sharing knowledge based optimization (GSK) algorithm. The algorithm is based on human behavior, in which people gain and share their knowledge with others. Different types of stochastic fractional programming problems are considered in this study. The augmented Lagrangian method (ALM) is used to handle these constrained optimization problems by converting them into unconstrained optimization problems. Three examples from the literature are considered and transformed into their deterministic form using the chance-constrained technique. The transformed problems are solved using the GSK algorithm, and the results are compared with eight other state-of-the-art metaheuristic algorithms. The obtained results are also compared with the optimal global solution and the results quoted in the literature. To investigate the performance of the GSK algorithm on a real-world problem, a solid stochastic fixed charge transportation problem is examined, in which the parameters of the problem are considered as random variables. The obtained results show that the GSK algorithm outperforms the other algorithms in terms of convergence, robustness, computational time, and quality of the obtained solutions.
Funding: Funded by the Deanship of Scientific Research, King Saud University, through the Vice Deanship of Scientific Research.
Abstract: The optimum delivery of safeguarding substances is a major part of supply chain management and a crucial issue in mitigation against the outbreak of pandemics. A problem arises for a decision maker who wants to optimally choose a subset of candidate consumers to maximize the distributed quantities of the needed safeguarding substances within a specific time period. A nonlinear binary mathematical programming model for the problem is formulated. The decision variables are binary ones that represent whether to choose a specific consumer, and design constraints are formulated to keep track of the chosen route. To better illustrate the problem, objective, and constraints, a real application case study is presented. The case study involves the optimum delivery of safeguarding substances to several hospitals in the Al-Gharbia Governorate in Egypt. The hospitals are selected to represent the consumers of safeguarding substances, as they are the first crucial frontline for mitigation against a pandemic outbreak. A distribution truck is used to distribute the substances from the main store to the hospitals in specified required quantities during a given working shift. The objective function is formulated to maximize the total amount of delivered quantities during the specified time period. The case study is solved using a novel Discrete Binary Gaining-Sharing Knowledge-based Optimization algorithm (DBGSK), which involves two main stages: discrete binary junior and senior gaining and sharing stages. DBGSK is able to find solutions to the introduced problem, and the obtained results demonstrate robustness and convergence toward the optimal solutions.