Abstract: Skin cancer segmentation is a critical task in a clinical decision support system for skin cancer detection. The proposed enhanced cuckoo search based optimization model is used to evaluate several metrics in the skin cancer picture segmentation process. Because time and resources are always limited, the proposed enhanced cuckoo search optimization algorithm is one of the most effective strategies for dealing with global optimization difficulties. One of the most significant requirements is to design optimal solutions to optimize their use. No single technique can solve all optimization problems. The proposed enhanced cuckoo search optimization method shows constructive precision for skin cancer across all image segmentation in computerized diagnosis. The accuracy of the proposed enhanced cuckoo search based optimization for melanoma shows a 23% to 29% improvement over other optimization algorithms. The overall sensitivity and specificity attained by the proposed system are 99.56% and 99.73%, respectively. The proposed method outperforms other conventional methods, offering an accuracy of 99.26%. The proposed enhanced optimization technique achieved 98.75% and 98.96% for the Dice and Jaccard coefficients, respectively. The model trained using the suggested measure outperforms those trained using the conventional method in the segmentation of skin cancer image data.
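The Dice and Jaccard coefficients reported in the abstract are standard overlap measures for binary segmentation masks. A minimal sketch of how they are typically computed (the function name and toy masks are illustrative, not from the paper):

```python
import numpy as np

def dice_jaccard(pred: np.ndarray, truth: np.ndarray):
    """Compute Dice and Jaccard coefficients for binary segmentation masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    dice = 2.0 * inter / (pred.sum() + truth.sum())
    jaccard = inter / np.logical_or(pred, truth).sum()
    return dice, jaccard

pred  = np.array([[1, 1, 0], [0, 1, 0]])   # predicted mask
truth = np.array([[1, 0, 0], [0, 1, 1]])   # ground-truth mask
d, j = dice_jaccard(pred, truth)
# intersection = 2, union = 4: dice = 2*2/(3+3) = 0.666..., jaccard = 2/4 = 0.5
```

Dice weights the intersection twice, so it is always at least as large as Jaccard on the same pair of masks, which matches the ordering of the 98.75%/98.96% figures only if they are reported in the opposite order; the code above makes the relationship explicit.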
Abstract: In this paper, a new class of three-term memory gradient methods with a non-monotone line search technique for unconstrained optimization is presented. Global convergence properties of the new methods are discussed. Combining the quasi-Newton method with the new method, the former is modified to have the global convergence property. Numerical results show that the new algorithm is efficient.
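The paper's exact line search rule is not given in the abstract; a common non-monotone variant (Grippo–Lampariello–Lucidi style) accepts a step when the new function value is below the maximum over recent iterates rather than the current one. A minimal sketch under that assumption:

```python
import numpy as np

def nonmonotone_armijo(f, grad, x, d, history, delta=1e-4, beta=0.5, max_iter=50):
    """Backtracking with a non-monotone Armijo rule: accept alpha when
    f(x + alpha*d) <= max(recent f values) + delta * alpha * grad(x) @ d."""
    fmax = max(history)          # reference value over the last few iterates
    slope = grad(x) @ d          # directional derivative; negative for descent d
    alpha = 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fmax + delta * alpha * slope:
            return alpha
        alpha *= beta            # shrink the step and retry
    return alpha

# Minimize f(x) = x1^2 + 10*x2^2 along the steepest-descent direction.
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
g = lambda x: np.array([2 * x[0], 20 * x[1]])
x0 = np.array([1.0, 1.0])
d0 = -g(x0)
alpha = nonmonotone_armijo(f, g, x0, d0, history=[f(x0)])
assert f(x0 + alpha * d0) < f(x0)   # the accepted step decreases f
```

With a history longer than one entry the rule tolerates occasional increases in f, which is what distinguishes it from the classical monotone Armijo condition.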
Abstract: The genetic algorithm has been widely used in many fields as a simple, robust global search and optimization method. In this paper, a new genetic algorithm based on a niche technique and a local search method is presented, motivated by the inadequacies of the simple genetic algorithm. To prove the adaptability and validity of the improved genetic algorithm, optimization problems for multimodal functions with equal peaks, unequal peaks, and complicated peak distributions are discussed. The simulation results show that, compared to other niching methods, this improved genetic algorithm has clear advantages in several respects, such as convergence speed, solution accuracy, and global optimization ability.
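The abstract does not specify which niche technique is used; fitness sharing is one standard choice and illustrates the idea: an individual's fitness is divided by a niche count so that crowded peaks are penalized and several peaks can be held simultaneously. A sketch under that assumption (1-D genomes, illustrative parameters):

```python
import numpy as np

def shared_fitness(pop: np.ndarray, fitness: np.ndarray,
                   sigma: float = 0.5, alpha: float = 1.0) -> np.ndarray:
    """Fitness sharing: divide each individual's fitness by its niche count,
    penalizing crowded regions so multiple peaks can be maintained."""
    dist = np.abs(pop[:, None] - pop[None, :])           # pairwise distances
    share = np.where(dist < sigma, 1.0 - (dist / sigma) ** alpha, 0.0)
    niche_count = share.sum(axis=1)                      # self-distance 0 contributes 1
    return fitness / niche_count

pop = np.array([0.00, 0.05, 0.10, 3.00])   # three crowded points, one isolated point
fit = np.ones(4)
sf = shared_fitness(pop, fit)
# The isolated individual keeps its full fitness; the crowded ones are penalized.
assert sf[3] == 1.0 and all(sf[i] < 1.0 for i in range(3))
```

Selection then operates on the shared fitness, which is what keeps subpopulations alive on equal and unequal peaks alike.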
Abstract: This paper presents an optimal allocation procedure for a hybrid wind energy and proton exchange membrane fuel cell (WE/PEMFC) system to improve the operating performance of the electrical distribution system (EDS). Egypt has an excellent wind regime, with wind speeds of about 10 m/s in many areas. The disadvantage of wind energy is its seasonal variation, so if wind power is to supply a significant portion of the demand, either backup power or an electrical energy storage (EES) system is needed to ensure that loads are supplied reliably. The hybrid WE/PEMFC system is therefore designed to completely supply a part of the Egyptian distribution system, in an attempt to isolate it from the grid. The optimal allocation of the hybrid units is obtained in order to enhance their benefits in the distribution networks. The critical buses at which the hybrid WE/PEMFC system should be installed are chosen using sensitivity analysis. Then, the binary crow search algorithm (BCSA), discrete Jaya algorithm (DJA), and binary particle swarm optimization (BPSO) techniques are proposed to determine the optimal operation of the power system using single- and multi-objective functions (SOF/MOF), and the results of the three optimization techniques are compared with each other. Three sensitivity factors are employed in this paper: the voltage sensitivity factor (VSF), active losses sensitivity factor (ALSF), and reactive losses sensitivity factor (RLSF). The effects of the sensitivity factors (SFs) on the SOF/MOF are studied. Improving the voltage profile and minimizing the active and reactive power losses of the EDS are considered as objective functions. The backward/forward sweep (BFS) method is used for the load flow calculations. The system load demand is predicted up to the year 2022 for Mersi-Matrouh City as a part of the Egyptian distribution network, and the design of the hybrid WE/PEMFC system is applied. The PEMFC system is designed using simplified mathematical expressions. The economics of operating both the WE and PEMFC systems are also presented. The results prove the capability of the proposed procedure to find the optimal allocation of the hybrid WE/PEMFC system that improves the system voltage profile and minimizes both the active and reactive power losses of the EDS of Mersi-Matrouh City.
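Of the three optimizers compared, binary PSO is the most widely documented; the textbook formulation (Kennedy and Eberhart's sigmoid transfer, not necessarily the paper's exact variant) updates real-valued velocities and then flips bits probabilistically. A minimal sketch, with the particle/bus dimensions purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def bpso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, vmax=4.0):
    """One iteration of standard binary PSO: update velocities, then map them
    through a sigmoid to set the probability of each bit being 1."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    v = np.clip(v, -vmax, vmax)                  # keep the sigmoid out of saturation
    prob = 1.0 / (1.0 + np.exp(-v))              # P(bit = 1)
    x = (rng.random(x.shape) < prob).astype(int)
    return x, v

x = np.zeros((5, 8), dtype=int)                  # 5 particles, 8 candidate buses
v = np.zeros((5, 8))
pbest = x.copy()                                 # personal bests (placeholder)
gbest = np.ones(8, dtype=int)                    # global best (placeholder)
x, v = bpso_step(x, v, pbest, gbest)
assert set(np.unique(x)) <= {0, 1}               # positions stay binary
```

In an allocation problem each bit would mark whether a hybrid unit is installed at a candidate bus, with the fitness evaluated by a load-flow routine such as the backward/forward sweep.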
Funding: Funding was provided by the China Scholarship Council (Nos. 202008440524 and 202006370006), the Distinguished Youth Science Foundation of Hunan Province of China (No. 2022JJ10073), the Innovation-Driven Project of Central South University (No. 2020CX040), and the Shenzhen Science and Technology Plan (No. JCYJ20190808123013260).
Abstract: Concrete is the most commonly used construction material. However, its production leads to high carbon dioxide (CO2) emissions and energy consumption. Therefore, developing waste-substitutable concrete components is necessary. Improving the sustainability and greenness of concrete is the focus of this research. To this end, 899 data points were collected from existing studies, with cement, slag, fly ash, superplasticizer, coarse aggregate, and fine aggregate considered as potential influential factors. The complex relationship between the influential factors and concrete compressive strength makes the prediction and estimation of compressive strength difficult. Instead of the traditional compressive strength test, this study combines five novel metaheuristic algorithms with extreme gradient boosting (XGB) to predict the compressive strength of green concrete based on fly ash and blast furnace slag. The intelligent prediction models were assessed using the root mean square error (RMSE), coefficient of determination (R^2), mean absolute error (MAE), and variance accounted for (VAF). The results indicated that the squirrel search algorithm-extreme gradient boosting (SSA-XGB) model yielded the best overall prediction performance, with R^2 values of 0.9930 and 0.9576, VAF values of 99.30 and 95.79, MAE values of 0.52 and 2.50, and RMSE values of 1.34 and 3.31 for the training and testing sets, respectively. The remaining prediction methods also yield promising results. Therefore, the developed hybrid XGB model can be introduced as an accurate and fast technique for the performance prediction of green concrete. Finally, the developed SSA-XGB model captures the effects of all the input factors on the compressive strength. The ability of the model to predict the performance of concrete with unknown proportions can play a significant role in accelerating the development and application of sustainable concrete and furthering a sustainable economy.
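The four assessment metrics named in the abstract have standard definitions; a small self-contained sketch (toy strength values, not the paper's data) makes them concrete:

```python
import numpy as np

def regression_metrics(y_true, y_pred):
    """RMSE, R^2, MAE, and VAF (%) as used to score the strength predictors."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_true - y_pred
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    r2 = 1.0 - np.sum(err ** 2) / np.sum((y_true - y_true.mean()) ** 2)
    vaf = 100.0 * (1.0 - np.var(err) / np.var(y_true))   # variance accounted for
    return rmse, r2, mae, vaf

# Illustrative compressive strengths (MPa) and predictions
y_true = [30.0, 45.0, 52.0, 61.0]
y_pred = [31.0, 44.0, 53.0, 60.0]
rmse, r2, mae, vaf = regression_metrics(y_true, y_pred)
assert rmse == 1.0 and mae == 1.0 and r2 > 0.98
```

R^2 and VAF are closely related: VAF ignores any constant bias in the errors (it uses the error variance), so a model with a systematic offset can score higher on VAF than on R^2.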
Abstract: This paper presents a search technique for a lost target. The lost target is a random walker on one of two intersected real lines, and the purpose is to detect the target as fast as possible. Four searchers start from the point of intersection and follow the so-called quasi-coordinated search plan. The expected value of the first meeting time between one of the searchers and the target is investigated, and we show the existence of an optimal search strategy that minimizes this first meeting time.
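The quasi-coordinated plan itself is not described in the abstract, but the setting can be illustrated with a heavily simplified Monte Carlo model: the target random-walks on one of the two lines while four unit-speed searchers sweep the four rays outward from the intersection. All modeling choices below (discrete steps, the detection rule, the start range) are assumptions for illustration only:

```python
import random

random.seed(7)

def first_meeting_time(start_range=5, max_steps=10_000):
    """Toy model, not the paper's exact plan: the target does a unit random walk
    on its line; detection is declared once the outward sweep on the target's
    side has reached at least as far from the origin as the target."""
    # By symmetry, which of the two lines the target is on does not change
    # the toy dynamics, so only its coordinate is tracked.
    pos = random.randint(-start_range, start_range)
    for t in range(1, max_steps + 1):
        pos += random.choice((-1, 1))
        if abs(pos) <= t:          # the sweep at distance t has passed the target
            return t
    return max_steps

mean_t = sum(first_meeting_time() for _ in range(2000)) / 2000
assert 1 <= mean_t <= 50           # meetings happen quickly for nearby starts
```

Because the gap |pos| - t can never grow in this model and shrinks by 2 with probability about one half each step, the empirical first meeting time stays small, which is the qualitative behavior the paper quantifies exactly.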
Abstract: The maximum satisfiability problem (MAX-SAT) refers to the task of finding a variable assignment that satisfies the maximum number of clauses (or the maximum total weight of satisfied clauses) in a Boolean formula. Most local search algorithms, including tabu search, rely on the 1-flip neighbourhood structure. In this work, we introduce a tabu search algorithm that makes use of the multilevel paradigm for solving MAX-SAT problems. The multilevel paradigm refers to the process of dividing large and difficult problems into smaller ones, which are hopefully much easier to solve, and then working backward towards the solution of the original problem, using the solution from one level as the starting solution at the next level. This process views the search as a multilevel process operating in a coarse-to-fine strategy, evolving from a k-flip to a 1-flip neighbourhood-based structure. Experimental results comparing the multilevel tabu search against its single-level variant are presented.
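The 1-flip neighbourhood the abstract refers to is the set of assignments reachable by flipping a single variable. A minimal unweighted sketch of scoring an assignment and scanning that neighbourhood (the tiny formula is illustrative, and tabu bookkeeping is omitted):

```python
# A CNF formula as a list of clauses; each literal is +v (variable v) or -v (negated).
formula = [[1, -2], [-1, 3], [2, 3], [-3]]

def satisfied(assign, formula):
    """Number of clauses satisfied by `assign` (assign[v] is True/False)."""
    return sum(any((lit > 0) == assign[abs(lit)] for lit in clause)
               for clause in formula)

def best_one_flip(assign, formula):
    """Scan the 1-flip neighbourhood and take the best improving flip, if any."""
    best_var, best_score = None, satisfied(assign, formula)
    for v in assign:
        assign[v] = not assign[v]          # tentatively flip v
        score = satisfied(assign, formula)
        if score > best_score:
            best_var, best_score = v, score
        assign[v] = not assign[v]          # undo the tentative flip
    if best_var is not None:
        assign[best_var] = not assign[best_var]
    return assign, best_score

assign = {1: True, 2: False, 3: False}     # satisfies 2 of the 4 clauses
assign, score = best_one_flip(assign, formula)
assert score == 3                          # one flip reaches the optimum here
```

The multilevel paradigm coarsens the formula so that one move at a coarse level corresponds to a k-flip at the original level, then refines back down to 1-flip moves like the scan above.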
Abstract: With the flood of information on the Web, it has become increasingly necessary for users to utilize automated tools to find, extract, filter, and evaluate the desired information and to support knowledge discovery. In this research, we present a preliminary discussion of using the dominant meaning technique to improve the Google Image search engine. The Google search engine analyzes the text on the page adjacent to an image, the image caption, and dozens of other factors to determine the image content. To improve the results, we set out to build a dominant meaning classification model. This paper investigates the influence of using this model to retrieve images more effectively, through sequential procedures that formulate a suitable query. To build this model, a dataset specific to an application domain was collected; the K-means algorithm was used to cluster the dataset into K clusters, and the dominant meaning technique was used to construct a hierarchy model of these clusters. This hierarchy model is then used to reformulate a new query. We performed experiments on Google and validated the effectiveness of the proposed approach, which improves precision, recall, and F1-measure by 57%, 70%, and 61%, respectively.
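The clustering step the abstract describes is plain K-means. A minimal sketch on illustrative 2-D "term embedding" points (deterministic initial centroids are passed in explicitly so the toy run is reproducible; the paper's feature representation is not specified here):

```python
import numpy as np

def kmeans(X, init_idx, iters=100):
    """Plain K-means: assign each point to its nearest centroid, then move each
    centroid to the mean of its points, until the centroids stop changing."""
    centroids = X[list(init_idx)].astype(float)
    k = len(init_idx)
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # distance from every point to every centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Two well-separated 2-D groups standing in for term vectors
X = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [5.2, 5.0]])
labels, _ = kmeans(X, init_idx=(0, 2))
assert list(labels) == [0, 0, 1, 1]
```

In the paper's pipeline, the dominant meaning technique then organizes such clusters into a hierarchy whose labels are appended to the user's query during reformulation.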
Abstract: The paper's aim is to forecast data with variations in time series data so as to obtain the best forecasting model. When researchers forecast data with variations in time series data (i.e., secular trends, cyclical variations, seasonal effects, and stochastic variations), they believe the best forecasting model is the one that realistically considers the underlying causal factors in a situational relationship and therefore has the best "track record" in generating data. The paper's models can be adjusted for variations in a related time series that possesses a great deal of randomness, to improve the accuracy of financial forecasts. Naïve forecasting models are based on an extrapolation of past values into the future. These models may be adjusted for seasonal, secular, and cyclical trends in the related data. When a data series possesses a great deal of randomness, smoothing techniques, such as moving averages and exponential smoothing, may improve the accuracy of financial forecasts. But neither naïve models nor smoothing techniques are capable of identifying major future changes in the direction of a situational data series. Nonlinear techniques, like direct and sequential search approaches, can be used to overcome those shortcomings. The methodology we have used is based on inferential analysis. To build models that identify major future changes in the direction of a situational data series, comparative model building is applied. Hereby, the paper suggests using some of the nonlinear techniques, like direct and sequential search approaches, to reduce the technical shortcomings. The final result of the paper is to manipulate, prepare, and integrate heuristic nonlinear search methods for calculating adjustment factors that produce the best forecast data.
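The two smoothing techniques the abstract names have simple closed forms; a minimal sketch on illustrative sales figures (the data and parameter choices are not from the paper):

```python
def moving_average(series, window):
    """Simple moving average: mean of the last `window` observations."""
    return [sum(series[i - window + 1 : i + 1]) / window
            for i in range(window - 1, len(series))]

def exp_smoothing(series, alpha):
    """Single exponential smoothing: s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""
    s = [series[0]]
    for x in series[1:]:
        s.append(alpha * x + (1 - alpha) * s[-1])
    return s

sales = [10.0, 12.0, 11.0, 13.0, 12.0, 14.0]
ma = moving_average(sales, 3)      # [11.0, 12.0, 12.0, 13.0]
sm = exp_smoothing(sales, 0.5)     # [10.0, 11.0, 11.0, 12.0, 12.0, 13.0]
```

Both smoothers lag behind turning points in the series, which is exactly the shortcoming the abstract says motivates the nonlinear (direct and sequential) search approaches for fitting adjustment factors.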
Funding: This work was supported in part by the 973 Program (2014CB340300), the National Natural Science Foundation of China (Grant No. 61322207), and the Fundamental Research Funds for the Central Universities.
Abstract: On one hand, compared with traditional relational and XML models, graphs have more expressive power and are widely used today. On the other hand, various applications of social computing trigger the pressing need for a new search paradigm. In this article, we argue that big graph search is the one filling this gap. We first introduce the application of graph search in various scenarios. We then formalize the graph search problem and give an analysis of graph search from an evolutionary point of view, followed by evidence from both industry and academia. After that, we analyze the difficulties and challenges of big graph search. Finally, we present three classes of techniques towards big graph search: query techniques, data techniques, and distributed computing techniques.