In recent decades,fog computing has played a vital role in executing parallel computational tasks,specifically,scientific workflow tasks.In cloud data centers,fog computing takes more time to run workflow applications...In recent decades,fog computing has played a vital role in executing parallel computational tasks,specifically,scientific workflow tasks.In cloud data centers,fog computing takes more time to run workflow applications.Therefore,it is essential to develop effective models for Virtual Machine(VM)allocation and task scheduling in fog computing environments.Effective task scheduling,VM migration,and allocation,altogether optimize the use of computational resources across different fog nodes.This process ensures that the tasks are executed with minimal energy consumption,which reduces the chances of resource bottlenecks.In this manuscript,the proposed framework comprises two phases:(i)effective task scheduling using a fractional selectivity approach and(ii)VM allocation by proposing an algorithm by the name of Fitness Sharing Chaotic Particle Swarm Optimization(FSCPSO).The proposed FSCPSO algorithm integrates the concepts of chaos theory and fitness sharing that effectively balance both global exploration and local exploitation.This balance enables the use of a wide range of solutions that leads to minimal total cost and makespan,in comparison to other traditional optimization algorithms.The FSCPSO algorithm’s performance is analyzed using six evaluation measures namely,Load Balancing Level(LBL),Average Resource Utilization(ARU),total cost,makespan,energy consumption,and response time.In relation to the conventional optimization algorithms,the FSCPSO algorithm achieves a higher LBL of 39.12%,ARU of 58.15%,a minimal total cost of 1175,and a makespan of 85.87 ms,particularly when evaluated for 50 tasks.展开更多
The diversity of data sources resulted in seeking effective manipulation and dissemination.The challenge that arises from the increasing dimensionality has a negative effect on the computation performance,efficiency,a...The diversity of data sources resulted in seeking effective manipulation and dissemination.The challenge that arises from the increasing dimensionality has a negative effect on the computation performance,efficiency,and stability of computing.One of the most successful optimization algorithms is Particle Swarm Optimization(PSO)which has proved its effectiveness in exploring the highest influencing features in the search space based on its fast convergence and the ability to utilize a small set of parameters in the search task.This research proposes an effective enhancement of PSO that tackles the challenge of randomness search which directly enhances PSO performance.On the other hand,this research proposes a generic intelligent framework for early prediction of orders delay and eliminate orders backlogs which could be considered as an efficient potential solution for raising the supply chain performance.The proposed adapted algorithm has been applied to a supply chain dataset which minimized the features set from twenty-one features to ten significant features.To confirm the proposed algorithm results,the updated data has been examined by eight of the well-known classification algorithms which reached a minimum accuracy percentage equal to 94.3%for random forest and a maximum of 99.0 for Naïve Bayes.Moreover,the proposed algorithm adaptation has been compared with other proposed adaptations of PSO from the literature over different datasets.The proposed PSO adaptation reached a higher accuracy compared with the literature ranging from 97.8 to 99.36 which also proved the advancement of the current research.展开更多
Driven piles are used in many geological environments as a practical and convenient structural component.Hence,the determination of the drivability of piles is actually of great importance in complex geotechnical appl...Driven piles are used in many geological environments as a practical and convenient structural component.Hence,the determination of the drivability of piles is actually of great importance in complex geotechnical applications.Conventional methods of predicting pile drivability often rely on simplified physicalmodels or empirical formulas,whichmay lack accuracy or applicability in complex geological conditions.Therefore,this study presents a practical machine learning approach,namely a Random Forest(RF)optimized by Bayesian Optimization(BO)and Particle Swarm Optimization(PSO),which not only enhances prediction accuracy but also better adapts to varying geological environments to predict the drivability parameters of piles(i.e.,maximumcompressive stress,maximum tensile stress,and blow per foot).In addition,support vector regression,extreme gradient boosting,k nearest neighbor,and decision tree are also used and applied for comparison purposes.In order to train and test these models,among the 4072 datasets collected with 17model inputs,3258 datasets were randomly selected for training,and the remaining 814 datasets were used for model testing.Lastly,the results of these models were compared and evaluated using two performance indices,i.e.,the root mean square error(RMSE)and the coefficient of determination(R2).The results indicate that the optimized RF model achieved lower RMSE than other prediction models in predicting the three parameters,specifically 0.044,0.438,and 0.146;and higher R^(2) values than other implemented techniques,specifically 0.966,0.884,and 0.977.In addition,the sensitivity and uncertainty of the optimized RF model were analyzed using Sobol sensitivity analysis and Monte Carlo(MC)simulation.It can be concluded that the optimized RF model could be used to predict the performance of the pile,and it may provide a useful reference for solving some problems under similar engineering conditions.展开更多
The influence maximization problem aims to select a small set of influential nodes, termed a seed set, to maximize their influence coverage in social networks. Although the methods that are based on a greedy strategy ...The influence maximization problem aims to select a small set of influential nodes, termed a seed set, to maximize their influence coverage in social networks. Although the methods that are based on a greedy strategy can obtain good accuracy, they come at the cost of enormous computational time, and are therefore not applicable to practical scenarios in large-scale networks. In addition, the centrality heuristic algorithms that are based on network topology can be completed in relatively less time. However, they tend to fail to achieve satisfactory results because of drawbacks such as overlapped influence spread. In this work, we propose a discrete two-stage metaheuristic optimization combining quantum-behaved particle swarm optimization with Lévy flight to identify a set of the most influential spreaders. According to the framework,first, the particles in the population are tasked to conduct an exploration in the global solution space to eventually converge to an acceptable solution through the crossover and replacement operations. Second, the Lévy flight mechanism is used to perform a wandering walk on the optimal candidate solution in the population to exploit the potentially unidentified influential nodes in the network. Experiments on six real-world social networks show that the proposed algorithm achieves more satisfactory results when compared to other well-known algorithms.展开更多
Drone logistics is a novel method of distribution that will become prevalent.The advantageous location of the logistics hub enables quicker customer deliveries and lower fuel consumption,resulting in cost savings for ...Drone logistics is a novel method of distribution that will become prevalent.The advantageous location of the logistics hub enables quicker customer deliveries and lower fuel consumption,resulting in cost savings for the company’s transportation operations.Logistics firms must discern the ideal location for establishing a logistics hub,which is challenging due to the simplicity of existing models and the intricate delivery factors.To simulate the drone logistics environment,this study presents a new mathematical model.The model not only retains the aspects of the current models,but also considers the degree of transportation difficulty from the logistics hub to the village,the capacity of drones for transportation,and the distribution of logistics hub locations.Moreover,this paper proposes an improved particle swarm optimization(PSO)algorithm which is a diversity-based hybrid PSO(DHPSO)algorithm to solve this model.In DHPSO,the Gaussian random walk can enhance global search in the model space,while the bubble-net attacking strategy can speed convergence.Besides,Archimedes spiral strategy is employed to overcome the local optima trap in the model and improve the exploitation of the algorithm.DHPSO maintains a balance between exploration and exploitation while better defining the distribution of logistics hub locations Numerical experiments show that the newly proposed model always achieves better locations than the current model.Comparing DHPSO with other state-of-the-art intelligent algorithms,the efficiency of the scheme can be improved by 42.58%.This means that logistics companies can reduce distribution costs and consumers can enjoy a more enjoyable shopping experience by using DHPSO’s location selection.All the results show the location of the drone logistics hub is solved by DHPSO effectively.展开更多
The selection of important factors in machine learning-based susceptibility assessments is crucial to obtain reliable susceptibility results.In this study,metaheuristic optimization and feature selection techniques we...The selection of important factors in machine learning-based susceptibility assessments is crucial to obtain reliable susceptibility results.In this study,metaheuristic optimization and feature selection techniques were applied to identify the most important input parameters for mapping debris flow susceptibility in the southern mountain area of Chengde City in Hebei Province,China,by using machine learning algorithms.In total,133 historical debris flow records and 16 related factors were selected.The support vector machine(SVM)was first used as the base classifier,and then a hybrid model was introduced by a two-step process.First,the particle swarm optimization(PSO)algorithm was employed to select the SVM model hyperparameters.Second,two feature selection algorithms,namely principal component analysis(PCA)and PSO,were integrated into the PSO-based SVM model,which generated the PCA-PSO-SVM and FS-PSO-SVM models,respectively.Three statistical metrics(accuracy,recall,and specificity)and the area under the receiver operating characteristic curve(AUC)were employed to evaluate and validate the performance of the models.The results indicated that the feature selection-based models exhibited the best performance,followed by the PSO-based SVM and SVM models.Moreover,the performance of the FS-PSO-SVM model was better than that of the PCA-PSO-SVM model,showing the highest AUC,accuracy,recall,and specificity values in both the training and testing processes.It was found that the selection of optimal features is crucial to improving the reliability of debris flow susceptibility assessment results.Moreover,the PSO algorithm was found to be not only an effective tool for hyperparameter optimization,but also a useful feature selection algorithm to improve prediction accuracies of debris flow susceptibility by using machine learning algorithms.The high and very high debris flow susceptibility zone appropriately covers 38.01%of the study area,where debris flow may occur under intensive human activities and heavy rainfall events.展开更多
In the process of identifying parameters for a permanent magnet synchronous motor,the particle swarm optimization method is prone to being stuck in local optima in the later stages of iteration,resulting in low parame...In the process of identifying parameters for a permanent magnet synchronous motor,the particle swarm optimization method is prone to being stuck in local optima in the later stages of iteration,resulting in low parameter accuracy.This work proposes a fuzzy particle swarm optimization approach based on the transformation function and the filled function.This approach addresses the topic of particle swarmoptimization in parameter identification from two perspectives.Firstly,the algorithm uses a transformation function to change the form of the fitness function without changing the position of the extreme point of the fitness function,making the extreme point of the fitness function more prominent and improving the algorithm’s search ability while reducing the algorithm’s computational burden.Secondly,on the basis of themulti-loop fuzzy control systembased onmultiplemembership functions,it is merged with the filled function to improve the algorithm’s capacity to skip out of the local optimal solution.This approach can be used to identify the parameters of permanent magnet synchronous motors by sampling only the stator current,voltage,and speed data.The simulation results show that the method can effectively identify the electrical parameters of a permanent magnet synchronous motor,and it has superior global convergence performance and robustness.展开更多
Task scheduling plays a key role in effectively managing and allocating computing resources to meet various computing tasks in a cloud computing environment.Short execution time and low load imbalance may be the chall...Task scheduling plays a key role in effectively managing and allocating computing resources to meet various computing tasks in a cloud computing environment.Short execution time and low load imbalance may be the challenges for some algorithms in resource scheduling scenarios.In this work,the Hierarchical Particle Swarm Optimization-Evolutionary Artificial Bee Colony Algorithm(HPSO-EABC)has been proposed,which hybrids our presented Evolutionary Artificial Bee Colony(EABC),and Hierarchical Particle Swarm Optimization(HPSO)algorithm.The HPSO-EABC algorithm incorporates both the advantages of the HPSO and the EABC algorithm.Comprehensive testing including evaluations of algorithm convergence speed,resource execution time,load balancing,and operational costs has been done.The results indicate that the EABC algorithm exhibits greater parallelism compared to the Artificial Bee Colony algorithm.Compared with the Particle Swarm Optimization algorithm,the HPSO algorithmnot only improves the global search capability but also effectively mitigates getting stuck in local optima.As a result,the hybrid HPSO-EABC algorithm demonstrates significant improvements in terms of stability and convergence speed.Moreover,it exhibits enhanced resource scheduling performance in both homogeneous and heterogeneous environments,effectively reducing execution time and cost,which also is verified by the ablation experimental.展开更多
Numerous wireless networks have emerged that can be used for short communication ranges where the infrastructure-based networks may fail because of their installation and cost.One of them is a sensor network with embe...Numerous wireless networks have emerged that can be used for short communication ranges where the infrastructure-based networks may fail because of their installation and cost.One of them is a sensor network with embedded sensors working as the primary nodes,termed Wireless Sensor Networks(WSNs),in which numerous sensors are connected to at least one Base Station(BS).These sensors gather information from the environment and transmit it to a BS or gathering location.WSNs have several challenges,including throughput,energy usage,and network lifetime concerns.Different strategies have been applied to get over these restrictions.Clustering may,therefore,be thought of as the best way to solve such issues.Consequently,it is crucial to analyze effective Cluster Head(CH)selection to maximize efficiency throughput,extend the network lifetime,and minimize energy consumption.This paper proposed an Accelerated Particle Swarm Optimization(APSO)algorithm based on the Low Energy Adaptive Clustering Hierarchy(LEACH),Neighboring Based Energy Efficient Routing(NBEER),Cooperative Energy Efficient Routing(CEER),and Cooperative Relay Neighboring Based Energy Efficient Routing(CR-NBEER)techniques.With the help of APSO in the implementation of the WSN,the main methodology of this article has taken place.The simulation findings in this study demonstrated that the suggested approach uses less energy,with respective energy consumption ranges of 0.1441 to 0.013 for 5 CH,1.003 to 0.0521 for 10 CH,and 0.1734 to 0.0911 for 15 CH.The sending packets ratio was also raised for all three CH selection scenarios,increasing from 659 to 1730.The number of dead nodes likewise dropped for the given combination,falling between 71 and 66.The network lifetime was deemed to have risen based on the results found.A hybrid with a few valuable parameters can further improve the suggested APSO-based protocol.Similar to underwater,WSN can make use of the proposed protocol.The overall results have been evaluated and compared with the existing approaches of sensor networks.展开更多
The escalating deployment of distributed power sources and random loads in DC distribution networks hasamplified the potential consequences of faults if left uncontrolled. To expedite the process of achieving an optim...The escalating deployment of distributed power sources and random loads in DC distribution networks hasamplified the potential consequences of faults if left uncontrolled. To expedite the process of achieving an optimalconfiguration of measurement points, this paper presents an optimal configuration scheme for fault locationmeasurement points in DC distribution networks based on an improved particle swarm optimization algorithm.Initially, a measurement point distribution optimization model is formulated, leveraging compressive sensing.The model aims to achieve the minimum number of measurement points while attaining the best compressivesensing reconstruction effect. It incorporates constraints from the compressive sensing algorithm and networkwide viewability. Subsequently, the traditional particle swarm algorithm is enhanced by utilizing the Haltonsequence for population initialization, generating uniformly distributed individuals. This enhancement reducesindividual search blindness and overlap probability, thereby promoting population diversity. Furthermore, anadaptive t-distribution perturbation strategy is introduced during the particle update process to enhance the globalsearch capability and search speed. The established model for the optimal configuration of measurement points issolved, and the results demonstrate the efficacy and practicality of the proposed method. The optimal configurationreduces the number of measurement points, enhances localization accuracy, and improves the convergence speedof the algorithm. These findings validate the effectiveness and utility of the proposed approach.展开更多
In many Eastern and Western countries,falling birth rates have led to the gradual aging of society.Older adults are often left alone at home or live in a long-term care center,which results in them being susceptible t...In many Eastern and Western countries,falling birth rates have led to the gradual aging of society.Older adults are often left alone at home or live in a long-term care center,which results in them being susceptible to unsafe events(such as falls)that can have disastrous consequences.However,automatically detecting falls fromvideo data is challenging,and automatic fall detection methods usually require large volumes of training data,which can be difficult to acquire.To address this problem,video kinematic data can be used as training data,thereby avoiding the requirement of creating a large fall data set.This study integrated an improved particle swarm optimization method into a double interactively recurrent fuzzy cerebellar model articulation controller model to develop a costeffective and accurate fall detection system.First,it obtained an optical flow(OF)trajectory diagram from image sequences by using the OF method,and it solved problems related to focal length and object offset by employing the discrete Fourier transform(DFT)algorithm.Second,this study developed the D-IRFCMAC model,which combines spatial and temporal(recurrent)information.Third,it designed an IPSO(Improved Particle Swarm Optimization)algorithm that effectively strengthens the exploratory capabilities of the proposed D-IRFCMAC(Double-Interactively Recurrent Fuzzy Cerebellar Model Articulation Controller)model in the global search space.The proposed approach outperforms existing state-of-the-art methods in terms of action recognition accuracy on the UR-Fall,UP-Fall,and PRECIS HAR data sets.The UCF11 dataset had an average accuracy of 93.13%,whereas the UCF101 dataset had an average accuracy of 92.19%.The UR-Fall dataset had an accuracy of 100%,the UP-Fall dataset had an accuracy of 99.25%,and the PRECIS HAR dataset had an accuracy of 99.07%.展开更多
This paper introduces a novel variant of particle swarm optimization that leverages local displacements through attractors for addressing multiobjective optimization problems. The method incorporates a square root dis...This paper introduces a novel variant of particle swarm optimization that leverages local displacements through attractors for addressing multiobjective optimization problems. The method incorporates a square root distance mechanism into the external archives to enhance the diversity. We evaluate the performance of the proposed approach on a set of constrained and unconstrained multiobjective test functions, establishing a benchmark for comparison. In order to gauge its effectiveness relative to established techniques, we conduct a comprehensive comparison with well-known approaches such as SMPSO, NSGA2 and SPEA2. The numerical results demonstrate that our method not only achieves efficiency but also exhibits competitiveness when compared to evolutionary algorithms. Particularly noteworthy is its superior performance in terms of convergence and diversification, surpassing the capabilities of its predecessors.展开更多
Analyzing big data, especially medical data, helps to provide good health care to patients and face the risks of death. The COVID-19 pandemic has had a significant impact on public health worldwide, emphasizing the ne...Analyzing big data, especially medical data, helps to provide good health care to patients and face the risks of death. The COVID-19 pandemic has had a significant impact on public health worldwide, emphasizing the need for effective risk prediction models. Machine learning (ML) techniques have shown promise in analyzing complex data patterns and predicting disease outcomes. The accuracy of these techniques is greatly affected by changing their parameters. Hyperparameter optimization plays a crucial role in improving model performance. In this work, the Particle Swarm Optimization (PSO) algorithm was used to effectively search the hyperparameter space and improve the predictive power of the machine learning models by identifying the optimal hyperparameters that can provide the highest accuracy. A dataset with a variety of clinical and epidemiological characteristics linked to COVID-19 cases was used in this study. Various machine learning models, including Random Forests, Decision Trees, Support Vector Machines, and Neural Networks, were utilized to capture the complex relationships present in the data. To evaluate the predictive performance of the models, the accuracy metric was employed. The experimental findings showed that the suggested method of estimating COVID-19 risk is effective. When compared to baseline models, the optimized machine learning models performed better and produced better results.展开更多
At present Bayesian Networks(BN)are being used widely for demonstrating uncertain knowledge in many disciplines,including biology,computer science,risk analysis,service quality analysis,and business.But they suffer fr...At present Bayesian Networks(BN)are being used widely for demonstrating uncertain knowledge in many disciplines,including biology,computer science,risk analysis,service quality analysis,and business.But they suffer from the problem that when the nodes and edges increase,the structure learning difficulty increases and algorithms become inefficient.To solve this problem,heuristic optimization algorithms are used,which tend to find a near-optimal answer rather than an exact one,with particle swarm optimization(PSO)being one of them.PSO is a swarm intelligence-based algorithm having basic inspiration from flocks of birds(how they search for food).PSO is employed widely because it is easier to code,converges quickly,and can be parallelized easily.We use a recently proposed version of PSO called generalized particle swarm optimization(GEPSO)to learn bayesian network structure.We construct an initial directed acyclic graph(DAG)by using the max-min parent’s children(MMPC)algorithm and cross relative average entropy.ThisDAGis used to create a population for theGEPSO optimization procedure.Moreover,we propose a velocity update procedure to increase the efficiency of the algorithmic search process.Results of the experiments show that as the complexity of the dataset increases,our algorithm Bayesian network generalized particle swarm optimization(BN-GEPSO)outperforms the PSO algorithm in terms of the Bayesian information criterion(BIC)score.展开更多
The problems associated with vibrations of viaducts and low-frequency structural noise radiation caused by train excitation continue to increase in importance.A new floating-slab track vibration isolator-non-obstructi...The problems associated with vibrations of viaducts and low-frequency structural noise radiation caused by train excitation continue to increase in importance.A new floating-slab track vibration isolator-non-obstructive particle damping-phononic crystal vibration isolator is proposed herein,which uses the particle damping vibration absorption technology and bandgap vibration control theory.The vibration reduction performance of the NOPD-PCVI was analyzed from the perspective of vibration control.The paper explores the structure-borne noise reduction performance of the NOPD-PCVIs installed on different bridge structures under varying service conditions encountered in practical engineering applications.The load transferred to the bridge is obtained from a coupled train-FST-bridge analytical model considering the different structural parameters of bridges.The vibration responses are obtained using the finite element method,while the structural noise radiation is simulated using the frequency-domain boundary element method.Using the particle swarm optimization algorithm,the parameters of the NOPD-PCVI are optimized so that its frequency bandgap matches the dominant bridge structural noise frequency range.The noise reduction performance of the NOPD-PCVIs is compared to the steel-spring isolation under different service conditions.展开更多
Background:To solve the cluster analysis better,we propose a new method based on the chaotic particle swarm optimization(CPSO)algorithm.Methods:In order to enhance the performance in clustering,we propose a novel meth...Background:To solve the cluster analysis better,we propose a new method based on the chaotic particle swarm optimization(CPSO)algorithm.Methods:In order to enhance the performance in clustering,we propose a novel method based on CPSO.We first evaluate the clustering performance of this model using the variance ratio criterion(VRC)as the evaluation metric.The effectiveness of the CPSO algorithm is compared with that of the traditional particle swarm optimization(PSO)algorithm.The CPSO aims to improve the VRC value while avoiding local optimal solutions.The simulated dataset is set at three levels of overlapping:non-overlapping,partial overlapping,and severe overlapping.Finally,we compare CPSO with two other methods.Results:By observing the comparative results,our proposed CPSO method performs outstandingly.In the conditions of non-overlapping,partial overlapping,and severe overlapping,our method has the best VRC values of 1683.2,620.5,and 275.6,respectively.The mean VRC values in these three cases are 1683.2,617.8,and 222.6.Conclusion:The CPSO performed better than other methods for cluster analysis problems.CPSO is effective for cluster analysis.展开更多
Wind energy has been widely applied in power generation to alleviate climate problems.The wind turbine layout of a wind farm is a primary factor of impacting power conversion efficiency due to the wake effect that red...Wind energy has been widely applied in power generation to alleviate climate problems.The wind turbine layout of a wind farm is a primary factor of impacting power conversion efficiency due to the wake effect that reduces the power outputs of wind turbines located in downstream.Wind farm layout optimization(WFLO)aims to reduce the wake effect for maximizing the power outputs of the wind farm.Nevertheless,the wake effect among wind turbines increases significantly as the number of wind turbines increases in the wind farm,which severely affect power conversion efficiency.Conventional heuristic algorithms suffer from issues of low solution quality and local optimum for large-scale WFLO under complex wind scenarios.Thus,a chaotic local search-based genetic learning particle swarm optimizer(CGPSO)is proposed to optimize large-scale WFLO problems.CGPSO is tested on four larger-scale wind farms under four complex wind scenarios and compares with eight state-of-the-art algorithms.The experiment results indicate that CGPSO significantly outperforms its competitors in terms of performance,stability,and robustness.To be specific,a success and failure memories-based selection is proposed to choose a chaotic map for chaotic search local.It improves the solution quality.The parameter and search pattern of chaotic local search are also analyzed for WFLO problems.展开更多
Gasoline blending scheduling optimization can bring significant economic and efficient benefits to refineries.However,the optimization model is complex and difficult to build,which is a typical mixed integer nonlinear...Gasoline blending scheduling optimization can bring significant economic and efficient benefits to refineries.However,the optimization model is complex and difficult to build,which is a typical mixed integer nonlinear programming(MINLP)problem.Considering the large scale of the MINLP model,in order to improve the efficiency of the solution,the mixed integer linear programming-nonlinear programming(MILP-NLP)strategy is used to solve the problem.This paper uses the linear blending rules plus the blending effect correction to build the gasoline blending model,and a relaxed MILP model is constructed on this basis.The particle swarm optimization algorithm with niche technology(NPSO)is proposed to optimize the solution,and the high-precision soft-sensor method is used to calculate the deviation of gasoline attributes,the blending effect is dynamically corrected to ensure the accuracy of the blending effect and optimization results,thus forming a prediction-verification-reprediction closed-loop scheduling optimization strategy suitable for engineering applications.The optimization result of the MILP model provides a good initial point.By fixing the integer variables to the MILPoptimal value,the approximate MINLP optimal solution can be obtained through a NLP solution.The above solution strategy has been successfully applied to the actual gasoline production case of a refinery(3.5 million tons per year),and the results show that the strategy is effective and feasible.The optimization results based on the closed-loop scheduling optimization strategy have higher reliability.Compared with the standard particle swarm optimization algorithm,NPSO algorithm improves the optimization ability and efficiency to a certain extent,effectively reduces the blending cost while ensuring the convergence speed.展开更多
Cascade refrigeration system(CRS)can meet a wider range of refrigeration temperature requirements and is more energy efficient than single-refrigerant refrigeration system,making it more widely used in low-temperature...Cascade refrigeration system(CRS)can meet a wider range of refrigeration temperature requirements and is more energy efficient than single-refrigerant refrigeration system,making it more widely used in low-temperature industry processes.The synthesis of a CRS with simultaneous consideration of heat integration between refrigerant and process streams is challenging but promising for significant cost saving and reduction of carbon emission.This study presented a stochastic optimization method for the synthesis of CRS.An MINLP model was formulated based on the superstructure developed for the CRS,and an optimization framework was proposed,where simulated annealing algorithm was used to evolve the numbers of pressure/temperature levels for all sub-refrigeration systems,and particle swarm optimization algorithm was employed to optimize the continuous variables.The effectiveness of the proposed methodology was verified by a case study of CRS optimization in an ethylene plant with 21.89%the total annual cost saving.展开更多
In airborne gamma ray spectrum processing,different analysis methods,technical requirements,analysis models,and calculation methods need to be established.To meet the engineering practice requirements of airborne gamm...In airborne gamma ray spectrum processing,different analysis methods,technical requirements,analysis models,and calculation methods need to be established.To meet the engineering practice requirements of airborne gamma-ray measurements and improve computational efficiency,an improved shuffled frog leaping algorithm-particle swarm optimization convolutional neural network(SFLA-PSO CNN)for large-sample quantitative analysis of airborne gamma-ray spectra is proposed herein.This method was used to train the weight of the neural network,optimize the structure of the network,delete redundant connections,and enable the neural network to acquire the capability of quantitative spectrum processing.In full-spectrum data processing,this method can perform the functions of energy spectrum peak searching and peak area calculations.After network training,the mean SNR and RMSE of the spectral lines were 31.27 and 2.75,respectively,satisfying the demand for noise reduction.To test the processing ability of the algorithm in large samples of airborne gamma spectra,this study considered the measured data from the Saihangaobi survey area as an example to conduct data spectral analysis.The results show that calculation of the single-peak area takes only 0.13~0.15 ms,and the average relative errors of the peak area in the U,Th,and K spectra are 3.11,9.50,and 6.18%,indicating the high processing efficiency and accuracy of this algorithm.The performance of the model can be further improved by optimizing related parameters,but it can already meet the requirements of practical engineering measurement.This study provides a new idea for the full-spectrum processing of airborne gamma rays.展开更多
基金This work was supported in part by the National Science and Technology Council of Taiwan,under Contract NSTC 112-2410-H-324-001-MY2.
文摘In recent decades,fog computing has played a vital role in executing parallel computational tasks,specifically,scientific workflow tasks.In cloud data centers,fog computing takes more time to run workflow applications.Therefore,it is essential to develop effective models for Virtual Machine(VM)allocation and task scheduling in fog computing environments.Effective task scheduling,VM migration,and allocation,altogether optimize the use of computational resources across different fog nodes.This process ensures that the tasks are executed with minimal energy consumption,which reduces the chances of resource bottlenecks.In this manuscript,the proposed framework comprises two phases:(i)effective task scheduling using a fractional selectivity approach and(ii)VM allocation by proposing an algorithm by the name of Fitness Sharing Chaotic Particle Swarm Optimization(FSCPSO).The proposed FSCPSO algorithm integrates the concepts of chaos theory and fitness sharing that effectively balance both global exploration and local exploitation.This balance enables the use of a wide range of solutions that leads to minimal total cost and makespan,in comparison to other traditional optimization algorithms.The FSCPSO algorithm’s performance is analyzed using six evaluation measures namely,Load Balancing Level(LBL),Average Resource Utilization(ARU),total cost,makespan,energy consumption,and response time.In relation to the conventional optimization algorithms,the FSCPSO algorithm achieves a higher LBL of 39.12%,ARU of 58.15%,a minimal total cost of 1175,and a makespan of 85.87 ms,particularly when evaluated for 50 tasks.
基金funded by the University of Jeddah,Jeddah,Saudi Arabia,under Grant No.(UJ-23-DR-26)。
文摘The diversity of data sources resulted in seeking effective manipulation and dissemination.The challenge that arises from the increasing dimensionality has a negative effect on the computation performance,efficiency,and stability of computing.One of the most successful optimization algorithms is Particle Swarm Optimization(PSO)which has proved its effectiveness in exploring the highest influencing features in the search space based on its fast convergence and the ability to utilize a small set of parameters in the search task.This research proposes an effective enhancement of PSO that tackles the challenge of randomness search which directly enhances PSO performance.On the other hand,this research proposes a generic intelligent framework for early prediction of orders delay and eliminate orders backlogs which could be considered as an efficient potential solution for raising the supply chain performance.The proposed adapted algorithm has been applied to a supply chain dataset which minimized the features set from twenty-one features to ten significant features.To confirm the proposed algorithm results,the updated data has been examined by eight of the well-known classification algorithms which reached a minimum accuracy percentage equal to 94.3%for random forest and a maximum of 99.0 for Naïve Bayes.Moreover,the proposed algorithm adaptation has been compared with other proposed adaptations of PSO from the literature over different datasets.The proposed PSO adaptation reached a higher accuracy compared with the literature ranging from 97.8 to 99.36 which also proved the advancement of the current research.
基金supported by the National Science Foundation of China(42107183).
文摘Driven piles are used in many geological environments as a practical and convenient structural component.Hence,the determination of the drivability of piles is actually of great importance in complex geotechnical applications.Conventional methods of predicting pile drivability often rely on simplified physicalmodels or empirical formulas,whichmay lack accuracy or applicability in complex geological conditions.Therefore,this study presents a practical machine learning approach,namely a Random Forest(RF)optimized by Bayesian Optimization(BO)and Particle Swarm Optimization(PSO),which not only enhances prediction accuracy but also better adapts to varying geological environments to predict the drivability parameters of piles(i.e.,maximumcompressive stress,maximum tensile stress,and blow per foot).In addition,support vector regression,extreme gradient boosting,k nearest neighbor,and decision tree are also used and applied for comparison purposes.In order to train and test these models,among the 4072 datasets collected with 17model inputs,3258 datasets were randomly selected for training,and the remaining 814 datasets were used for model testing.Lastly,the results of these models were compared and evaluated using two performance indices,i.e.,the root mean square error(RMSE)and the coefficient of determination(R2).The results indicate that the optimized RF model achieved lower RMSE than other prediction models in predicting the three parameters,specifically 0.044,0.438,and 0.146;and higher R^(2) values than other implemented techniques,specifically 0.966,0.884,and 0.977.In addition,the sensitivity and uncertainty of the optimized RF model were analyzed using Sobol sensitivity analysis and Monte Carlo(MC)simulation.It can be concluded that the optimized RF model could be used to predict the performance of the pile,and it may provide a useful reference for solving some problems under similar engineering conditions.
基金Project supported by the Zhejiang Provincial Natural Science Foundation (Grant No.LQ20F020011)the Gansu Provincial Foundation for Distinguished Young Scholars (Grant No.23JRRA766)+1 种基金the National Natural Science Foundation of China (Grant No.62162040)the National Key Research and Development Program of China (Grant No.2020YFB1713600)。
文摘The influence maximization problem aims to select a small set of influential nodes, termed a seed set, to maximize their influence coverage in social networks. Although the methods that are based on a greedy strategy can obtain good accuracy, they come at the cost of enormous computational time, and are therefore not applicable to practical scenarios in large-scale networks. In addition, the centrality heuristic algorithms that are based on network topology can be completed in relatively less time. However, they tend to fail to achieve satisfactory results because of drawbacks such as overlapped influence spread. In this work, we propose a discrete two-stage metaheuristic optimization combining quantum-behaved particle swarm optimization with Lévy flight to identify a set of the most influential spreaders. According to the framework,first, the particles in the population are tasked to conduct an exploration in the global solution space to eventually converge to an acceptable solution through the crossover and replacement operations. Second, the Lévy flight mechanism is used to perform a wandering walk on the optimal candidate solution in the population to exploit the potentially unidentified influential nodes in the network. Experiments on six real-world social networks show that the proposed algorithm achieves more satisfactory results when compared to other well-known algorithms.
基金supported by the NationalNatural Science Foundation of China(No.61866023).
文摘Drone logistics is a novel method of distribution that will become prevalent.The advantageous location of the logistics hub enables quicker customer deliveries and lower fuel consumption,resulting in cost savings for the company’s transportation operations.Logistics firms must discern the ideal location for establishing a logistics hub,which is challenging due to the simplicity of existing models and the intricate delivery factors.To simulate the drone logistics environment,this study presents a new mathematical model.The model not only retains the aspects of the current models,but also considers the degree of transportation difficulty from the logistics hub to the village,the capacity of drones for transportation,and the distribution of logistics hub locations.Moreover,this paper proposes an improved particle swarm optimization(PSO)algorithm which is a diversity-based hybrid PSO(DHPSO)algorithm to solve this model.In DHPSO,the Gaussian random walk can enhance global search in the model space,while the bubble-net attacking strategy can speed convergence.Besides,Archimedes spiral strategy is employed to overcome the local optima trap in the model and improve the exploitation of the algorithm.DHPSO maintains a balance between exploration and exploitation while better defining the distribution of logistics hub locations Numerical experiments show that the newly proposed model always achieves better locations than the current model.Comparing DHPSO with other state-of-the-art intelligent algorithms,the efficiency of the scheme can be improved by 42.58%.This means that logistics companies can reduce distribution costs and consumers can enjoy a more enjoyable shopping experience by using DHPSO’s location selection.All the results show the location of the drone logistics hub is solved by DHPSO effectively.
基金supported by the Second Tibetan Plateau Scientific Expedition and Research Program(Grant no.2019QZKK0904)Natural Science Foundation of Hebei Province(Grant no.D2022403032)S&T Program of Hebei(Grant no.E2021403001).
文摘The selection of important factors in machine learning-based susceptibility assessments is crucial to obtain reliable susceptibility results.In this study,metaheuristic optimization and feature selection techniques were applied to identify the most important input parameters for mapping debris flow susceptibility in the southern mountain area of Chengde City in Hebei Province,China,by using machine learning algorithms.In total,133 historical debris flow records and 16 related factors were selected.The support vector machine(SVM)was first used as the base classifier,and then a hybrid model was introduced by a two-step process.First,the particle swarm optimization(PSO)algorithm was employed to select the SVM model hyperparameters.Second,two feature selection algorithms,namely principal component analysis(PCA)and PSO,were integrated into the PSO-based SVM model,which generated the PCA-PSO-SVM and FS-PSO-SVM models,respectively.Three statistical metrics(accuracy,recall,and specificity)and the area under the receiver operating characteristic curve(AUC)were employed to evaluate and validate the performance of the models.The results indicated that the feature selection-based models exhibited the best performance,followed by the PSO-based SVM and SVM models.Moreover,the performance of the FS-PSO-SVM model was better than that of the PCA-PSO-SVM model,showing the highest AUC,accuracy,recall,and specificity values in both the training and testing processes.It was found that the selection of optimal features is crucial to improving the reliability of debris flow susceptibility assessment results.Moreover,the PSO algorithm was found to be not only an effective tool for hyperparameter optimization,but also a useful feature selection algorithm to improve prediction accuracies of debris flow susceptibility by using machine learning algorithms.The high and very high debris flow susceptibility zone appropriately covers 38.01%of the study area,where debris flow may occur under intensive human activities and heavy rainfall events.
基金the Natural Science Foundation of China under Grant 52077027in part by the Liaoning Province Science and Technology Major Project No.2020JH1/10100020.
文摘In the process of identifying parameters for a permanent magnet synchronous motor,the particle swarm optimization method is prone to being stuck in local optima in the later stages of iteration,resulting in low parameter accuracy.This work proposes a fuzzy particle swarm optimization approach based on the transformation function and the filled function.This approach addresses the topic of particle swarmoptimization in parameter identification from two perspectives.Firstly,the algorithm uses a transformation function to change the form of the fitness function without changing the position of the extreme point of the fitness function,making the extreme point of the fitness function more prominent and improving the algorithm’s search ability while reducing the algorithm’s computational burden.Secondly,on the basis of themulti-loop fuzzy control systembased onmultiplemembership functions,it is merged with the filled function to improve the algorithm’s capacity to skip out of the local optimal solution.This approach can be used to identify the parameters of permanent magnet synchronous motors by sampling only the stator current,voltage,and speed data.The simulation results show that the method can effectively identify the electrical parameters of a permanent magnet synchronous motor,and it has superior global convergence performance and robustness.
基金jointly supported by the Jiangsu Postgraduate Research and Practice Innovation Project under Grant KYCX22_1030,SJCX22_0283 and SJCX23_0293the NUPTSF under Grant NY220201.
文摘Task scheduling plays a key role in effectively managing and allocating computing resources to meet various computing tasks in a cloud computing environment.Short execution time and low load imbalance may be the challenges for some algorithms in resource scheduling scenarios.In this work,the Hierarchical Particle Swarm Optimization-Evolutionary Artificial Bee Colony Algorithm(HPSO-EABC)has been proposed,which hybrids our presented Evolutionary Artificial Bee Colony(EABC),and Hierarchical Particle Swarm Optimization(HPSO)algorithm.The HPSO-EABC algorithm incorporates both the advantages of the HPSO and the EABC algorithm.Comprehensive testing including evaluations of algorithm convergence speed,resource execution time,load balancing,and operational costs has been done.The results indicate that the EABC algorithm exhibits greater parallelism compared to the Artificial Bee Colony algorithm.Compared with the Particle Swarm Optimization algorithm,the HPSO algorithmnot only improves the global search capability but also effectively mitigates getting stuck in local optima.As a result,the hybrid HPSO-EABC algorithm demonstrates significant improvements in terms of stability and convergence speed.Moreover,it exhibits enhanced resource scheduling performance in both homogeneous and heterogeneous environments,effectively reducing execution time and cost,which also is verified by the ablation experimental.
文摘Numerous wireless networks have emerged that can be used for short communication ranges where the infrastructure-based networks may fail because of their installation and cost.One of them is a sensor network with embedded sensors working as the primary nodes,termed Wireless Sensor Networks(WSNs),in which numerous sensors are connected to at least one Base Station(BS).These sensors gather information from the environment and transmit it to a BS or gathering location.WSNs have several challenges,including throughput,energy usage,and network lifetime concerns.Different strategies have been applied to get over these restrictions.Clustering may,therefore,be thought of as the best way to solve such issues.Consequently,it is crucial to analyze effective Cluster Head(CH)selection to maximize efficiency throughput,extend the network lifetime,and minimize energy consumption.This paper proposed an Accelerated Particle Swarm Optimization(APSO)algorithm based on the Low Energy Adaptive Clustering Hierarchy(LEACH),Neighboring Based Energy Efficient Routing(NBEER),Cooperative Energy Efficient Routing(CEER),and Cooperative Relay Neighboring Based Energy Efficient Routing(CR-NBEER)techniques.With the help of APSO in the implementation of the WSN,the main methodology of this article has taken place.The simulation findings in this study demonstrated that the suggested approach uses less energy,with respective energy consumption ranges of 0.1441 to 0.013 for 5 CH,1.003 to 0.0521 for 10 CH,and 0.1734 to 0.0911 for 15 CH.The sending packets ratio was also raised for all three CH selection scenarios,increasing from 659 to 1730.The number of dead nodes likewise dropped for the given combination,falling between 71 and 66.The network lifetime was deemed to have risen based on the results found.A hybrid with a few valuable parameters can further improve the suggested APSO-based protocol.Similar to underwater,WSN can make use of the proposed protocol.The overall results have been evaluated and compared with the existing approaches of sensor networks.
基金the National Natural Science Foundation of China(52177074).
文摘The escalating deployment of distributed power sources and random loads in DC distribution networks hasamplified the potential consequences of faults if left uncontrolled. To expedite the process of achieving an optimalconfiguration of measurement points, this paper presents an optimal configuration scheme for fault locationmeasurement points in DC distribution networks based on an improved particle swarm optimization algorithm.Initially, a measurement point distribution optimization model is formulated, leveraging compressive sensing.The model aims to achieve the minimum number of measurement points while attaining the best compressivesensing reconstruction effect. It incorporates constraints from the compressive sensing algorithm and networkwide viewability. Subsequently, the traditional particle swarm algorithm is enhanced by utilizing the Haltonsequence for population initialization, generating uniformly distributed individuals. This enhancement reducesindividual search blindness and overlap probability, thereby promoting population diversity. Furthermore, anadaptive t-distribution perturbation strategy is introduced during the particle update process to enhance the globalsearch capability and search speed. The established model for the optimal configuration of measurement points issolved, and the results demonstrate the efficacy and practicality of the proposed method. The optimal configurationreduces the number of measurement points, enhances localization accuracy, and improves the convergence speedof the algorithm. These findings validate the effectiveness and utility of the proposed approach.
基金supported by the National Science and Technology Council under grants NSTC 112-2221-E-320-002the Buddhist Tzu Chi Medical Foundation in Taiwan under Grant TCMMP 112-02-02.
文摘In many Eastern and Western countries,falling birth rates have led to the gradual aging of society.Older adults are often left alone at home or live in a long-term care center,which results in them being susceptible to unsafe events(such as falls)that can have disastrous consequences.However,automatically detecting falls fromvideo data is challenging,and automatic fall detection methods usually require large volumes of training data,which can be difficult to acquire.To address this problem,video kinematic data can be used as training data,thereby avoiding the requirement of creating a large fall data set.This study integrated an improved particle swarm optimization method into a double interactively recurrent fuzzy cerebellar model articulation controller model to develop a costeffective and accurate fall detection system.First,it obtained an optical flow(OF)trajectory diagram from image sequences by using the OF method,and it solved problems related to focal length and object offset by employing the discrete Fourier transform(DFT)algorithm.Second,this study developed the D-IRFCMAC model,which combines spatial and temporal(recurrent)information.Third,it designed an IPSO(Improved Particle Swarm Optimization)algorithm that effectively strengthens the exploratory capabilities of the proposed D-IRFCMAC(Double-Interactively Recurrent Fuzzy Cerebellar Model Articulation Controller)model in the global search space.The proposed approach outperforms existing state-of-the-art methods in terms of action recognition accuracy on the UR-Fall,UP-Fall,and PRECIS HAR data sets.The UCF11 dataset had an average accuracy of 93.13%,whereas the UCF101 dataset had an average accuracy of 92.19%.The UR-Fall dataset had an accuracy of 100%,the UP-Fall dataset had an accuracy of 99.25%,and the PRECIS HAR dataset had an accuracy of 99.07%.
文摘This paper introduces a novel variant of particle swarm optimization that leverages local displacements through attractors for addressing multiobjective optimization problems. The method incorporates a square root distance mechanism into the external archives to enhance the diversity. We evaluate the performance of the proposed approach on a set of constrained and unconstrained multiobjective test functions, establishing a benchmark for comparison. In order to gauge its effectiveness relative to established techniques, we conduct a comprehensive comparison with well-known approaches such as SMPSO, NSGA2 and SPEA2. The numerical results demonstrate that our method not only achieves efficiency but also exhibits competitiveness when compared to evolutionary algorithms. Particularly noteworthy is its superior performance in terms of convergence and diversification, surpassing the capabilities of its predecessors.
文摘Analyzing big data, especially medical data, helps to provide good health care to patients and face the risks of death. The COVID-19 pandemic has had a significant impact on public health worldwide, emphasizing the need for effective risk prediction models. Machine learning (ML) techniques have shown promise in analyzing complex data patterns and predicting disease outcomes. The accuracy of these techniques is greatly affected by changing their parameters. Hyperparameter optimization plays a crucial role in improving model performance. In this work, the Particle Swarm Optimization (PSO) algorithm was used to effectively search the hyperparameter space and improve the predictive power of the machine learning models by identifying the optimal hyperparameters that can provide the highest accuracy. A dataset with a variety of clinical and epidemiological characteristics linked to COVID-19 cases was used in this study. Various machine learning models, including Random Forests, Decision Trees, Support Vector Machines, and Neural Networks, were utilized to capture the complex relationships present in the data. To evaluate the predictive performance of the models, the accuracy metric was employed. The experimental findings showed that the suggested method of estimating COVID-19 risk is effective. When compared to baseline models, the optimized machine learning models performed better and produced better results.
基金The authors extended their appreciation to the Deanship of Scientific Research at King Khalid University for funding this work through the Large Groups Project under grant number RGP.2/132/43。
文摘At present Bayesian Networks(BN)are being used widely for demonstrating uncertain knowledge in many disciplines,including biology,computer science,risk analysis,service quality analysis,and business.But they suffer from the problem that when the nodes and edges increase,the structure learning difficulty increases and algorithms become inefficient.To solve this problem,heuristic optimization algorithms are used,which tend to find a near-optimal answer rather than an exact one,with particle swarm optimization(PSO)being one of them.PSO is a swarm intelligence-based algorithm having basic inspiration from flocks of birds(how they search for food).PSO is employed widely because it is easier to code,converges quickly,and can be parallelized easily.We use a recently proposed version of PSO called generalized particle swarm optimization(GEPSO)to learn bayesian network structure.We construct an initial directed acyclic graph(DAG)by using the max-min parent’s children(MMPC)algorithm and cross relative average entropy.ThisDAGis used to create a population for theGEPSO optimization procedure.Moreover,we propose a velocity update procedure to increase the efficiency of the algorithmic search process.Results of the experiments show that as the complexity of the dataset increases,our algorithm Bayesian network generalized particle swarm optimization(BN-GEPSO)outperforms the PSO algorithm in terms of the Bayesian information criterion(BIC)score.
基金Project(51978585)supported by the National Natural Science Foundation,ChinaProject(2022YFB2603404)supported by the National Key Research and Development Program,China+1 种基金Project(U1734207)supported by the High-speed Rail Joint Fund Key Projects of Basic Research,ChinaProject(2023NSFSC1975)supported by the Sichuan Nature and Science Foundation Innovation Research Group Project,China。
文摘The problems associated with vibrations of viaducts and low-frequency structural noise radiation caused by train excitation continue to increase in importance.A new floating-slab track vibration isolator-non-obstructive particle damping-phononic crystal vibration isolator is proposed herein,which uses the particle damping vibration absorption technology and bandgap vibration control theory.The vibration reduction performance of the NOPD-PCVI was analyzed from the perspective of vibration control.The paper explores the structure-borne noise reduction performance of the NOPD-PCVIs installed on different bridge structures under varying service conditions encountered in practical engineering applications.The load transferred to the bridge is obtained from a coupled train-FST-bridge analytical model considering the different structural parameters of bridges.The vibration responses are obtained using the finite element method,while the structural noise radiation is simulated using the frequency-domain boundary element method.Using the particle swarm optimization algorithm,the parameters of the NOPD-PCVI are optimized so that its frequency bandgap matches the dominant bridge structural noise frequency range.The noise reduction performance of the NOPD-PCVIs is compared to the steel-spring isolation under different service conditions.
文摘Background:To solve the cluster analysis better,we propose a new method based on the chaotic particle swarm optimization(CPSO)algorithm.Methods:In order to enhance the performance in clustering,we propose a novel method based on CPSO.We first evaluate the clustering performance of this model using the variance ratio criterion(VRC)as the evaluation metric.The effectiveness of the CPSO algorithm is compared with that of the traditional particle swarm optimization(PSO)algorithm.The CPSO aims to improve the VRC value while avoiding local optimal solutions.The simulated dataset is set at three levels of overlapping:non-overlapping,partial overlapping,and severe overlapping.Finally,we compare CPSO with two other methods.Results:By observing the comparative results,our proposed CPSO method performs outstandingly.In the conditions of non-overlapping,partial overlapping,and severe overlapping,our method has the best VRC values of 1683.2,620.5,and 275.6,respectively.The mean VRC values in these three cases are 1683.2,617.8,and 222.6.Conclusion:The CPSO performed better than other methods for cluster analysis problems.CPSO is effective for cluster analysis.
基金partially supported by the Japan Society for the Promotion of Science(JSPS)KAKENHI(JP22H03643)Japan Science and Technology Agency(JST)Support for Pioneering Research Initiated by the Next Generation(SPRING)(JPMJSP2145)JST through the Establishment of University Fellowships towards the Creation of Science Technology Innovation(JPMJFS2115)。
文摘Wind energy has been widely applied in power generation to alleviate climate problems.The wind turbine layout of a wind farm is a primary factor of impacting power conversion efficiency due to the wake effect that reduces the power outputs of wind turbines located in downstream.Wind farm layout optimization(WFLO)aims to reduce the wake effect for maximizing the power outputs of the wind farm.Nevertheless,the wake effect among wind turbines increases significantly as the number of wind turbines increases in the wind farm,which severely affect power conversion efficiency.Conventional heuristic algorithms suffer from issues of low solution quality and local optimum for large-scale WFLO under complex wind scenarios.Thus,a chaotic local search-based genetic learning particle swarm optimizer(CGPSO)is proposed to optimize large-scale WFLO problems.CGPSO is tested on four larger-scale wind farms under four complex wind scenarios and compares with eight state-of-the-art algorithms.The experiment results indicate that CGPSO significantly outperforms its competitors in terms of performance,stability,and robustness.To be specific,a success and failure memories-based selection is proposed to choose a chaotic map for chaotic search local.It improves the solution quality.The parameter and search pattern of chaotic local search are also analyzed for WFLO problems.
基金supported by National Natural Science Foundation of China(Basic Science Center Program:61988101)Shanghai Committee of Science and Technology(22DZ1101500)+1 种基金the National Natural Science Foundation of China(61973124,62073142)Fundamental Research Funds for the Central Universities。
文摘Gasoline blending scheduling optimization can bring significant economic and efficient benefits to refineries.However,the optimization model is complex and difficult to build,which is a typical mixed integer nonlinear programming(MINLP)problem.Considering the large scale of the MINLP model,in order to improve the efficiency of the solution,the mixed integer linear programming-nonlinear programming(MILP-NLP)strategy is used to solve the problem.This paper uses the linear blending rules plus the blending effect correction to build the gasoline blending model,and a relaxed MILP model is constructed on this basis.The particle swarm optimization algorithm with niche technology(NPSO)is proposed to optimize the solution,and the high-precision soft-sensor method is used to calculate the deviation of gasoline attributes,the blending effect is dynamically corrected to ensure the accuracy of the blending effect and optimization results,thus forming a prediction-verification-reprediction closed-loop scheduling optimization strategy suitable for engineering applications.The optimization result of the MILP model provides a good initial point.By fixing the integer variables to the MILPoptimal value,the approximate MINLP optimal solution can be obtained through a NLP solution.The above solution strategy has been successfully applied to the actual gasoline production case of a refinery(3.5 million tons per year),and the results show that the strategy is effective and feasible.The optimization results based on the closed-loop scheduling optimization strategy have higher reliability.Compared with the standard particle swarm optimization algorithm,NPSO algorithm improves the optimization ability and efficiency to a certain extent,effectively reduces the blending cost while ensuring the convergence speed.
基金supported by the National Natural Science Foundation of China(21978203)the Natural Science Foundation of Tianjin City(19JCYBJC20300)。
文摘Cascade refrigeration system(CRS)can meet a wider range of refrigeration temperature requirements and is more energy efficient than single-refrigerant refrigeration system,making it more widely used in low-temperature industry processes.The synthesis of a CRS with simultaneous consideration of heat integration between refrigerant and process streams is challenging but promising for significant cost saving and reduction of carbon emission.This study presented a stochastic optimization method for the synthesis of CRS.An MINLP model was formulated based on the superstructure developed for the CRS,and an optimization framework was proposed,where simulated annealing algorithm was used to evolve the numbers of pressure/temperature levels for all sub-refrigeration systems,and particle swarm optimization algorithm was employed to optimize the continuous variables.The effectiveness of the proposed methodology was verified by a case study of CRS optimization in an ethylene plant with 21.89%the total annual cost saving.
基金the National Natural Science Foundation of China(No.42127807)Natural Science Foundation of Sichuan Province(Nos.23NSFSCC0116 and 2022NSFSC12333)the Nuclear Energy Development Project(No.[2021]-88).
文摘In airborne gamma ray spectrum processing,different analysis methods,technical requirements,analysis models,and calculation methods need to be established.To meet the engineering practice requirements of airborne gamma-ray measurements and improve computational efficiency,an improved shuffled frog leaping algorithm-particle swarm optimization convolutional neural network(SFLA-PSO CNN)for large-sample quantitative analysis of airborne gamma-ray spectra is proposed herein.This method was used to train the weight of the neural network,optimize the structure of the network,delete redundant connections,and enable the neural network to acquire the capability of quantitative spectrum processing.In full-spectrum data processing,this method can perform the functions of energy spectrum peak searching and peak area calculations.After network training,the mean SNR and RMSE of the spectral lines were 31.27 and 2.75,respectively,satisfying the demand for noise reduction.To test the processing ability of the algorithm in large samples of airborne gamma spectra,this study considered the measured data from the Saihangaobi survey area as an example to conduct data spectral analysis.The results show that calculation of the single-peak area takes only 0.13~0.15 ms,and the average relative errors of the peak area in the U,Th,and K spectra are 3.11,9.50,and 6.18%,indicating the high processing efficiency and accuracy of this algorithm.The performance of the model can be further improved by optimizing related parameters,but it can already meet the requirements of practical engineering measurement.This study provides a new idea for the full-spectrum processing of airborne gamma rays.