The flow shop scheduling problem is important for the manufacturing industry. Effective flow shop scheduling can bring great benefits to the industry. However, there is little research on Distributed Hybrid Flow Shop Problems (DHFSP) solved by learning-assisted meta-heuristics. This work addresses a DHFSP with the objective of minimizing the maximum completion time (makespan). First, a mathematical model is developed for the concerned DHFSP. Second, four Q-learning-assisted meta-heuristics, i.e., the genetic algorithm (GA), artificial bee colony algorithm (ABC), particle swarm optimization (PSO), and differential evolution (DE), are proposed. According to the nature of the DHFSP, six local search operations are designed for finding high-quality solutions in local space. Instead of random selection, Q-learning assists the meta-heuristics in choosing appropriate local search operations during iterations. Finally, based on 60 cases, comprehensive numerical experiments are conducted to assess the effectiveness of the proposed algorithms. The experimental results and discussions prove that using Q-learning to select appropriate local search operations is more effective than the random strategy. To verify the competitiveness of the Q-learning-assisted meta-heuristics, they are compared with the improved iterated greedy algorithm (IIG), which also solves the DHFSP. The Friedman test is executed on the results of the five algorithms. It is concluded that the four Q-learning-assisted meta-heuristics outperform IIG, and that the Q-learning-assisted PSO shows the best competitiveness.
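The operator-selection mechanism described above can be sketched as follows. This is a hypothetical minimal example, not the paper's implementation: with a single state, the Q-update collapses to Q &lt;- Q + alpha * (reward - Q), and the reward would in practice be the makespan improvement achieved by the chosen local search operator; the simulated rewards here are illustrative.

```python
import random

class OperatorSelector:
    """Q-learning-style selection among six local-search operators."""

    def __init__(self, n_ops=6, alpha=0.1, epsilon=0.2):
        self.q = [0.0] * n_ops  # one Q-value per local-search operator
        self.alpha, self.epsilon = alpha, epsilon

    def choose(self):
        # epsilon-greedy: mostly exploit the best operator, sometimes explore
        if random.random() < self.epsilon:
            return random.randrange(len(self.q))
        return max(range(len(self.q)), key=lambda i: self.q[i])

    def update(self, op, reward):
        # single-state Q-update: move Q toward the observed reward
        self.q[op] += self.alpha * (reward - self.q[op])

random.seed(0)
sel = OperatorSelector()
for _ in range(500):
    op = sel.choose()
    # pretend operator 2 tends to produce much larger makespan reductions
    reward = 1.0 if op == 2 else 0.1
    sel.update(op, reward)
print(max(range(6), key=lambda i: sel.q[i]))  # operator the agent learned to prefer
```

Over the iterations, the Q-value of the consistently rewarding operator rises above the others, so the greedy choice gravitates toward it while epsilon keeps a small amount of exploration alive.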
Determination of the Shear Bond Strength (SBS) at the interlayer of double-layer asphalt concrete is crucial in flexible pavement structures. This study used three Machine Learning (ML) models, K-Nearest Neighbors (KNN), Extra Trees (ET), and Light Gradient Boosting Machine (LGBM), to predict SBS from easily determinable input parameters. The Grid Search technique was employed for hyper-parameter tuning of the ML models, and cross-validation and learning-curve analysis were used in training them. The models were built on a database of 240 experimental results and three input variables: temperature, normal pressure, and tack coat rate. Model validation was performed using three statistical criteria: the coefficient of determination (R2), the Root Mean Square Error (RMSE), and the Mean Absolute Error (MAE). Additionally, SHAP (Shapley Additive Explanation) analysis was used to validate the importance of the input variables in the prediction of SBS. Results show that these models accurately predict SBS, with LGBM providing outstanding performance. SHAP analysis for LGBM indicates that temperature is the most influential factor on SBS. Consequently, the proposed ML models can quickly and accurately predict SBS between two layers of asphalt concrete, serving practical applications in flexible pavement structure design.
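To illustrate the simplest of the three models, here is a from-scratch sketch of KNN regression over the same three input variables. The data rows are synthetic and purely illustrative, not the 240-sample experimental database used in the study.

```python
import math

def knn_predict(X_train, y_train, x, k=3):
    """Predict a target value as the mean of the k nearest training rows."""
    # rank training points by Euclidean distance to the query point
    dists = sorted((math.dist(xi, x), yi) for xi, yi in zip(X_train, y_train))
    # average the targets of the k nearest neighbours
    return sum(yi for _, yi in dists[:k]) / k

# toy rows: (temperature degC, normal pressure MPa, tack coat rate kg/m2)
X = [(10, 0.2, 0.3), (20, 0.2, 0.3), (30, 0.2, 0.3), (40, 0.2, 0.3)]
y = [1.8, 1.4, 1.0, 0.6]  # SBS drops as temperature rises
print(knn_predict(X, y, (25, 0.2, 0.3), k=2))  # -> 1.2 (mean of the two nearest rows)
```

Grid search over such a model would simply evaluate a range of `k` values under cross-validation and keep the one with the lowest validation error.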
Damage identification of offshore floating wind turbines from vibration/dynamic signals is one of the important new research fields in Structural Health Monitoring (SHM). In this paper, a new damage identification method is proposed based on meta-heuristic algorithms using the dynamic response of a TLP (Tension-Leg Platform) floating wind turbine structure. Genetic Algorithms (GA), the Artificial Immune System (AIS), Particle Swarm Optimization (PSO), and the Artificial Bee Colony (ABC) algorithm are chosen to minimize the objective function, defined appropriately for the damage identification purpose. In addition to studying the capability of the mentioned algorithms to correctly identify damage, the effect of the response type on the identification results is studied. The results of the proposed damage identification are also investigated while considering possible uncertainties of the structure. Finally, to evaluate the proposed method under realistic conditions, a 1/100-scale experimental setup of a TLP Floating Wind Turbine (TLPFWT) is built in the laboratory and the proposed damage identification method is applied to the scaled turbine.
A new secured database management system architecture using intrusion detection systems (IDS) is proposed in this paper for organizations with no previous role mapping for users. A simple representation of Structured Query Language queries is proposed to easily permit the use of the clustering algorithm employed. A new clustering algorithm that uses tabu search with adaptive memory is applied to database log files to create users' profiles. Then, queries issued by each user are checked against the related user profile using a classifier to determine whether or not each query is malicious. The IDS stops query execution or reports the threat to the responsible person if the query is malicious. A simple classifier based on the Euclidean distance is used: the issued query is transformed into the proposed simple representation, and the Euclidean distance between the profile's cluster centers and the issued query is calculated. A synthetic data set modelling normal user access behavior in relation to the database is used for the experimental evaluations. The false negative (FN) and false positive (FP) rates are used to compare the proposed algorithm with other methods. The experimental results indicate that the proposed method yields very small FN and FP rates.
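The distance-based check at the heart of the classifier can be sketched as follows. This is a hedged illustration: the query feature encoding, the cluster centers, and the threshold below are invented for the example, not the paper's actual representation.

```python
import math

def is_malicious(query_vec, profile_centers, threshold):
    """Flag a query as malicious when it is far from every profile center."""
    nearest = min(math.dist(query_vec, c) for c in profile_centers)
    return nearest > threshold

# profile learned from the user's log: two cluster centers of query features
centers = [(1.0, 0.0, 2.0), (4.0, 1.0, 0.0)]
print(is_malicious((1.1, 0.1, 2.0), centers, threshold=1.5))  # -> False (normal)
print(is_malicious((9.0, 5.0, 7.0), centers, threshold=1.5))  # -> True (anomalous)
```

A query close to one of the user's habitual clusters passes; a query unlike anything in the profile exceeds the threshold and triggers the IDS response.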
Blasting is well known as an effective method for fragmenting or moving rock in open-pit mines. To evaluate the quality of blasting, the size distribution of rock is used as a critical criterion in blasting operations. A high percentage of oversized rocks generated by blasting can lead to economic and environmental damage. Therefore, this study proposed four novel intelligent models to predict the size of rock distribution in mine blasting, in order to optimize the blasting parameters as well as the efficiency of blasting operations in open-pit mines. Accordingly, a nature-inspired algorithm (the firefly algorithm, FFA) was combined with four machine learning algorithms (gradient boosting machine (GBM), support vector machine (SVM), Gaussian process (GP), and artificial neural network (ANN)), abbreviated as FFA-GBM, FFA-SVM, FFA-GP, and FFA-ANN, respectively. The predictions of these models were compared with each other using three statistical indicators (mean absolute error, root-mean-squared error, and correlation coefficient) and the color intensity method. To develop and simulate the size of rock in blasting operations, 136 blasting events with their images were collected and analyzed with the Split-Desktop software. Of these, 111 events were randomly selected for the development and optimization of the models, and the remaining 25 blasting events were used to confirm the accuracy of the proposed models. Blast design parameters were regarded as input variables for predicting the size of rock in blasting operations. The obtained results revealed that the FFA is a robust optimization algorithm for estimating rock fragmentation in bench blasting. Among the models developed in this study, FFA-GBM provided the highest accuracy in predicting the size of fragmented rocks. The other techniques (FFA-SVM, FFA-GP, and FFA-ANN) yielded lower computational stability and efficiency. Hence, the FFA-GBM model can be used as a powerful and precise soft computing tool in practical engineering cases, aiming to improve the quality of blasting and rock fragmentation.
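A minimal firefly algorithm, of the kind the FFA-GBM hybrid relies on for tuning, can be sketched on a toy 2-D function. All parameter values here are common textbook defaults, not the study's settings, and the objective is illustrative.

```python
import math
import random

def firefly_minimize(f, dim=2, n=25, iters=150, beta0=1.0, gamma=0.05, alpha=0.2):
    """Toy firefly algorithm: brighter (lower-cost) fireflies attract the rest."""
    random.seed(1)
    pop = [[random.uniform(-2, 2) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        fit = [f(x) for x in pop]
        for i in range(n):
            for j in range(n):
                if fit[j] < fit[i]:  # firefly i is attracted to brighter firefly j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)  # attraction fades with distance
                    pop[i] = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    fit[i] = f(pop[i])
        alpha *= 0.97  # shrink the random step so the swarm settles
    return min(pop, key=f)

best = firefly_minimize(lambda x: sum(v * v for v in x))
print(best)  # a point near the optimum (0, 0)
```

In the hybrid models, the coordinates of each firefly would encode candidate hyper-parameters of the GBM (or SVM, GP, ANN), and `f` would be the validation error of the model trained with them.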
Cloud computing infrastructure has been evolving as a cost-effective platform for providing computational resources in the form of high-performance computing as a service (HPCaaS) to users executing HPC applications. However, the broader use of Cloud services and the rapid increase in the size and capacity of Cloud data centers bring a remarkable rise in energy consumption, leading to a significant rise in system provider expenses and in carbon emissions. Besides this, users have become more demanding in their Quality-of-Service (QoS) expectations regarding execution time, budget cost, utilization, and makespan. This situation calls for a task scheduling policy that ensures efficient task sequencing and allocation of computing resources to tasks, meeting the trade-off between QoS promises and service provider requirements. Moreover, task scheduling in the Cloud is a well-known NP-hard problem. Motivated by these concerns, this paper introduces and implements a QoS-aware, energy-efficient scheduling policy, called CSPSO, for scheduling tasks in Cloud systems to reduce the energy consumption of cloud resources and minimize the makespan of the workload. The proposed multi-objective CSPSO policy hybridizes the search qualities of two robust meta-heuristics, cuckoo search (CS) and particle swarm optimization (PSO), to overcome the slow convergence and lack of diversity of the standard CS algorithm. A fitness-aware resource allocation (FARA) heuristic was developed and used by the proposed policy to allocate resources to tasks efficiently. A velocity update mechanism for cuckoo individuals is designed and incorporated in the proposed CSPSO policy. Further, the proposed scheduling policy has been implemented in the CloudSim simulator and tested with real supercomputing workload traces. The comparative analysis validated that the proposed scheduling policy produces efficient schedules with better performance than other well-known heuristic and meta-heuristic scheduling policies.
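The makespan objective such policies minimize, and a simple fitness-aware allocation baseline, can be sketched as follows. The task lengths and VM speeds are illustrative, and this greedy earliest-finish heuristic stands in for the paper's FARA heuristic rather than reproducing it; the real policy also weighs energy consumption.

```python
def makespan(assignment, task_lengths, vm_speeds):
    """Makespan of a task-to-VM mapping: finish time of the busiest VM."""
    loads = [0.0] * len(vm_speeds)
    for task, vm in enumerate(assignment):
        loads[vm] += task_lengths[task] / vm_speeds[vm]
    return max(loads)

def greedy_schedule(task_lengths, vm_speeds):
    """Assign each task (longest first) to the VM where it finishes soonest."""
    loads = [0.0] * len(vm_speeds)
    plan = []
    for t in sorted(range(len(task_lengths)), key=lambda i: -task_lengths[i]):
        vm = min(range(len(vm_speeds)),
                 key=lambda v: loads[v] + task_lengths[t] / vm_speeds[v])
        loads[vm] += task_lengths[t] / vm_speeds[vm]
        plan.append((t, vm))
    return [vm for _, vm in sorted(plan)]  # re-order by task index

tasks = [400, 200, 600, 100]  # task lengths in million instructions
speeds = [100, 200]           # VM speeds in MIPS
plan = greedy_schedule(tasks, speeds)
print(plan, makespan(plan, tasks, speeds))
```

A meta-heuristic like CSPSO explores the space of such assignments, using `makespan` (together with an energy term) as the fitness it minimizes.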
Maximum power point tracking (MPPT) is one of the topics that has been studied extensively in recent years. Both traditional and soft computing methods are used for MPPT; since soft computing approaches are more effective than traditional ones, studies on MPPT have shifted in this direction. This study compares the performance of seven meta-heuristic training algorithms in neuro-fuzzy training for MPPT: particle swarm optimization (PSO), harmony search (HS), cuckoo search (CS), the artificial bee colony (ABC) algorithm, the bee algorithm (BA), differential evolution (DE), and the flower pollination algorithm (FPA). The antecedent and consequent parameters of the neuro-fuzzy system are determined by these algorithms. The data of a 250 W photovoltaic (PV) panel is used in the applications. For effective MPPT, different neuro-fuzzy structures, membership functions, and control parameter values are evaluated in detail. The training algorithms are compared in terms of solution quality and convergence speed, and their strengths and weaknesses are revealed. It is observed that the type and number of membership functions, the colony size, and the number of generations affect the solution quality and convergence speed of the training algorithms. As a result, the CS and ABC algorithms proved more effective than the other algorithms in terms of solution quality and convergence for this problem.
Breast cancer is one of the most common kinds of cancer, and its early detection may help lower its overall mortality rate. In this paper, we propose a novel approach for detecting and classifying breast cancer regions in thermal images. The proposed approach starts by preprocessing the input images and segmenting the significant regions of interest. In addition, to properly train the machine learning models, data augmentation is applied to increase the number of segmented regions using various scaling ratios. To extract the relevant features from the breast cancer cases, a set of deep neural networks (VGGNet, ResNet-50, AlexNet, and GoogLeNet) is employed. The resulting set of features is processed using the binary dipper throated algorithm to select the most effective features, enabling high classification accuracy. The selected features are used to train a neural network that finally classifies the thermal images. To achieve accurate classification, the parameters of the employed neural network are optimized using the continuous dipper throated optimization algorithm. Experimental results show the effectiveness of the proposed approach in classifying breast cancer cases compared to other recent approaches in the literature; several additional comparative experiments further emphasized its superiority.
The non-invasive evaluation of the heart through ElectroCardioGraphy (ECG) has played a key role in detecting heart disease. The analysis of ECG signals requires years of learning and experience to interpret and extract useful information from them; thus, a computerized system is needed to classify ECG signals effectively and accurately. Abnormal heart rhythms, called arrhythmias, can cause sudden cardiac death. In this work, a Computerized Abnormal Heart Rhythms Detection (CAHRD) system is developed using ECG signals. It consists of four stages: preprocessing, feature extraction, feature optimization, and classification. First, the Pan-Tompkins algorithm is employed in the preprocessing stage to detect the envelope of the Q, R and S waves; it uses a recursive filter to eliminate muscle noise, T-wave interference and baseline wander. As analysis of the ECG signal in the spatial domain does not provide a complete description of the signal, the feature extraction stage uses frequency contents obtained from multiple wavelet filters (bi-orthogonal, Symlet and Daubechies) at different resolution levels. Then, Black Widow Optimization (BWO) is applied to optimize the hybrid wavelet features in the feature optimization stage. Finally, a kernel-based Support Vector Machine (SVM) is employed to classify heartbeats into five classes; Radial Basis Function (RBF), polynomial and linear kernels are used. A total of about 15,000 ECG signals from the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database are used for the performance evaluation of the proposed CAHRD system. Results show that the CAHRD system is a powerful tool for ECG analysis: it correctly classifies five classes of heartbeats with 99.91% accuracy using an RBF kernel with 2nd-level wavelet coefficients. The CAHRD system achieves an improvement of about 6% over random projections with the ensemble SVM approach, and about 2% over morphological and ECG-segment-based features with the RBF classifier.
Most neural network architectures are based on human experience, which requires a long and tedious trial-and-error process. Neural architecture search (NAS) attempts to discover effective architectures without human intervention. Evolutionary algorithms (EAs) for NAS can find better solutions than human-designed architectures by exploring a large space of possible architectures, and multi-objective EAs for NAS can efficiently discover optimal neural architectures that meet various performance criteria. Furthermore, hardware-accelerated NAS methods can improve the efficiency of NAS. While existing reviews have mainly focused on different strategies for performing NAS, few studies have explored the use of EAs for NAS. In this paper, we summarize and explore the use of EAs for NAS, as well as large-scale multi-objective optimization strategies and hardware-accelerated NAS methods. NAS performs well in healthcare applications such as medical image analysis, disease diagnosis classification, and health monitoring, and EAs for NAS can automate the search process and optimize multiple objectives simultaneously in a given healthcare task. Deep neural networks have been used successfully in healthcare, but they lack interpretability; medical data is also highly sensitive, and privacy leaks are frequently reported in the healthcare industry. To address search efficiency and privacy protection in healthcare, we propose an interpretable neuroevolution framework based on federated learning. We also point out future research directions for evolutionary NAS. Overall, for researchers who want to use EAs to optimize neural networks in healthcare, we analyze the advantages and disadvantages of doing so to provide detailed guidance, and propose an interpretable privacy-preserving framework for healthcare applications.
Improving integrated battlefield situational awareness in complex environments involving dynamic factors such as restricted communications and electromagnetic interference (EMI) has become a contentious research problem. In certain mission environments, owing to the impact of many interference sources on real-time communication, or to mission requirements such as the need to follow communication regulations, the mission stages form a dynamic combination of several communication-available and communication-unavailable stages, and data interaction between unmanned aerial vehicles (UAVs) can only be performed in specific communication-available stages. Traditional cooperative search algorithms cannot handle such situations well. To solve this problem, this study constructed a distributed model predictive control (DMPC) architecture for collaborative control of UAVs and used the Voronoi diagram generation method to re-plan the search areas of all UAVs in real time, avoiding repeated searches of the same area and UAV collisions while improving the search efficiency and safety factor. An attention-mechanism ant colony optimization (AACO) algorithm is proposed for UAV search-control decision planning. The search strategy is adaptively updated by introducing an attention mechanism over regular instruction information, a priori information, and emergent mission information, satisfying different search expectations to the maximum extent. Simulation results show that the proposed algorithm achieves better search performance than traditional algorithms in restricted-communication scenarios.
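The Voronoi partitioning step can be sketched on a discretized search area: each grid cell is assigned to the UAV whose position is nearest, so no two UAVs search the same cell. The UAV positions and grid size below are illustrative, not from the paper's scenarios.

```python
import math

def voronoi_partition(uav_positions, width, height):
    """Assign every cell of a width x height grid to its nearest UAV."""
    regions = {i: [] for i in range(len(uav_positions))}
    for x in range(width):
        for y in range(height):
            owner = min(range(len(uav_positions)),
                        key=lambda i: math.dist(uav_positions[i], (x, y)))
            regions[owner].append((x, y))
    return regions

uavs = [(0.0, 0.0), (9.0, 9.0)]
regions = voronoi_partition(uavs, 10, 10)
print(len(regions[0]), len(regions[1]))  # cells per UAV; together they cover all 100
```

Re-running the partition whenever UAV positions change is what keeps the search regions disjoint in real time.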
This paper presents a new bi-criteria mixed-integer programming model for scheduling cells, and parts within each cell, in a cellular manufacturing system. The objective of this model is to minimize the makespan and intercell movements simultaneously, while considering sequence-dependent cell setup times. In the design and planning of cellular manufacturing systems, three main steps must be considered: cell formation (i.e., part families and machine grouping), inter- and intra-cell layouts, and scheduling. Because the cellular manufacturing scheduling problem is NP-hard, a genetic algorithm, an efficient meta-heuristic method, is proposed to solve it. Finally, a number of test problems are solved to show the efficiency of the proposed genetic algorithm, and the computational results are compared with those obtained by an optimization tool.
Aiming at the practical application of Unmanned Underwater Vehicles (UUVs) in underwater combat, this paper proposes a battlefield ambush scenario for UUVs that considers ocean currents. First, by establishing mathematical models of the ocean current environment, target movement, and sonar detection, probability calculation methods are given for a single UUV searching for a target and for multiple UUVs cooperatively searching for a target. Then, based on the Hybrid Quantum-behaved Particle Swarm Optimization (HQPSO) algorithm, the path with the highest target-search probability is found. Finally, through simulation, the influence of different UUV and target parameters on the target-search probability is analyzed, the minimum number of UUVs that need to be deployed to complete the ambush task is demonstrated, and the optimal search path scheme is obtained. The method proposed in this paper provides a theoretical basis for the practical application of UUVs in future combat.
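One way the single-UUV and cooperative search probabilities relate can be sketched as follows. This assumes independent sensors, so the chance that at least one UUV detects the target is 1 - product(1 - p_i); the per-UUV detection probabilities are illustrative, not derived from the paper's sonar model.

```python
def cooperative_detection(probabilities):
    """Probability that at least one of several independent searchers detects."""
    miss = 1.0
    for p in probabilities:
        miss *= 1.0 - p  # target evades all UUVs only if it evades each one
    return 1.0 - miss

single = cooperative_detection([0.4])
trio = cooperative_detection([0.4, 0.4, 0.4])
print(single, round(trio, 3))  # three UUVs: 1 - 0.6^3 = 0.784
```

This diminishing-returns curve is what drives the question of the minimum number of UUVs needed to reach a required ambush success probability.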
Cloud computing has emerged as a new style of computing in distributed environments. Efficient and dependable workflow scheduling is crucial for achieving high performance and integrating with enterprise systems. As an effective security-services aggregation methodology, Trust Workflow Technology (TWT) has been used to construct composite services. However, in the cloud environment, the existing closed network services are maintained and operated by third-party organizations or enterprises; therefore, service-oriented trust strategies must be considered in workflow scheduling. TWFS-related algorithms consist of trust policies and strategies to overcome threats to the application with heuristic workflow scheduling. As the contribution of this work, a trust-based meta-heuristic workflow scheduling (TMWS) algorithm is proposed. The TMWS algorithm improves the efficiency and reliability of operation in the cloud system, and the results show that the TMWS approach is effective and feasible.
In differentiable architecture search methods, a more efficient search space design can significantly improve the performance of the searched architecture; this requires carefully defining search spaces of different complexity over the various operations. Meanwhile, rationalizing the search strategy used to explore the well-defined search space further improves the speed and efficiency of architecture search. With this in mind, we propose a faster and more efficient differentiable architecture search method, AllegroNAS. First, we introduce a more efficient search space enriched by two redefined convolution modules. Second, we utilize a more efficient architectural-parameter regularization method, mitigating the overfitting problem during the search process and reducing the error introduced by gradient approximation. We also introduce a natural-exponential cosine annealing method to make the learning rate of the network training process more suitable for the search procedure. Moreover, group convolution and data augmentation are employed to reduce the computational cost. Finally, through extensive experiments on several public datasets, we demonstrate that our method can more swiftly find better-performing neural network architectures in a more efficient search space, validating the effectiveness of our approach.
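A learning-rate schedule of this flavor can be sketched as follows. The exact formula of the paper's natural-exponential cosine annealing is not given in the abstract, so this is an assumption: standard cosine annealing multiplied by a natural-exponential decay factor exp(-k * t / T); the rates and decay constant are illustrative.

```python
import math

def nat_exp_cosine_lr(t, total, lr_max=0.025, lr_min=0.001, k=1.0):
    """Cosine annealing from lr_max to lr_min, damped by exp(-k * t / total)."""
    cosine = lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * t / total))
    return cosine * math.exp(-k * t / total)

schedule = [nat_exp_cosine_lr(t, 50) for t in range(51)]
print(round(schedule[0], 4), round(schedule[-1], 6))  # starts at lr_max, ends below lr_min
```

Compared with plain cosine annealing, the exponential factor pushes the late-search learning rate lower, which is one plausible way to stabilize the final phase of an architecture search.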
Contract Bridge, a four-player imperfect-information game, comprises two phases: bidding and playing. While computer programs excel at playing, bidding is the more challenging aspect, owing to the need to exchange information with one's partner while interfering with the opponents' communication. In this work, we introduce a Bridge bidding agent that combines supervised learning, deep reinforcement learning via self-play, and a test-time search approach. Our experiments demonstrate that our agent outperforms WBridge5, a highly regarded computer Bridge program that has won multiple world championships, by 0.98 IMPs (international match points) per deal over 10,000 deals, with a much more cost-effective approach. This performance significantly surpasses the previous state of the art (0.85 IMPs per deal); note that 0.1 IMPs per deal is a significant improvement in Bridge bidding.
Electronic medical records (EMR) facilitate the sharing of medical data, but existing sharing schemes suffer from privacy leakage and inefficiency. This article proposes a lightweight, searchable, and controllable EMR sharing scheme that employs a large attribute domain and a linear secret sharing structure (LSSS); the computational overhead of encryption and decryption reaches a lightweight constant level, and the scheme supports keyword search and policy hiding, improving the efficiency of medical data sharing. Dynamic accumulator technology enables data owners to flexibly authorize or revoke data visitors' access rights, achieving controllability of the data. Meanwhile, the data is re-encrypted with Intel Software Guard Extensions (SGX) technology to resist offline dictionary-guessing attacks. In addition, blockchain technology is used to achieve credible accountability for abnormal behaviors in the sharing process. The experiments show clear advantages of the scheme in encryption and decryption computation overhead and in storage overhead, and the security and controllability of the sharing process are proved theoretically, providing a feasible solution for the safe and efficient sharing of EMR.
Funding: partially supported by the Guangdong Basic and Applied Basic Research Foundation (2023A1515011531); the National Natural Science Foundation of China under Grant 62173356; the Science and Technology Development Fund (FDCT), Macao SAR, under Grant 0019/2021/A; the Zhuhai Industry-University-Research Project with Hong Kong and Macao under Grant ZH22017002210014PWC; and the Key Technologies for Scheduling and Optimization of Complex Distributed Manufacturing Systems (22JR10KA007).
Funding: supported by the University of Transport Technology under grant number DTTD2022-12.
文摘Determination of Shear Bond strength(SBS)at interlayer of double-layer asphalt concrete is crucial in flexible pavement structures.The study used three Machine Learning(ML)models,including K-Nearest Neighbors(KNN),Extra Trees(ET),and Light Gradient Boosting Machine(LGBM),to predict SBS based on easily determinable input parameters.Also,the Grid Search technique was employed for hyper-parameter tuning of the ML models,and cross-validation and learning curve analysis were used for training the models.The models were built on a database of 240 experimental results and three input variables:temperature,normal pressure,and tack coat rate.Model validation was performed using three statistical criteria:the coefficient of determination(R2),the Root Mean Square Error(RMSE),and the mean absolute error(MAE).Additionally,SHAP analysis was also used to validate the importance of the input variables in the prediction of the SBS.Results show that these models accurately predict SBS,with LGBM providing outstanding performance.SHAP(Shapley Additive explanation)analysis for LGBM indicates that temperature is the most influential factor on SBS.Consequently,the proposed ML models can quickly and accurately predict SBS between two layers of asphalt concrete,serving practical applications in flexible pavement structure design.
文摘Damage identification of the offshore floating wind turbine by vibration/dynamic signals is one of the important and new research fields in the Structural Health Monitoring(SHM). In this paper a new damage identification method is proposed based on meta-heuristic algorithms using the dynamic response of the TLP(Tension-Leg Platform) floating wind turbine structure. The Genetic Algorithms(GA), Artificial Immune System(AIS), Particle Swarm Optimization(PSO), and Artificial Bee Colony(ABC) are chosen for minimizing the object function, defined properly for damage identification purpose. In addition to studying the capability of mentioned algorithms in correctly identifying the damage, the effect of the response type on the results of identification is studied. Also, the results of proposed damage identification are investigated with considering possible uncertainties of the structure. Finally, for evaluating the proposed method in real condition, a 1/100 scaled experimental setup of TLP Floating Wind Turbine(TLPFWT) is provided in a laboratory scale and the proposed damage identification method is applied to the scaled turbine.
Abstract: A new secured database management system architecture using intrusion detection systems (IDS) is proposed in this paper for organizations with no previous role mapping for users. A simple representation of Structured Query Language queries is proposed so the clustering algorithm can be applied easily. A new clustering algorithm that uses a tabu search with adaptive memory is applied to database log files to create users' profiles. Then, queries issued by each user are checked against the related user profile using a classifier to determine whether each query is malicious. The IDS stops query execution or reports the threat to the responsible person if the query is malicious. A simple classifier based on the Euclidean distance is used: each issued query is transformed into the proposed simple representation, and the Euclidean distance between the query and the cluster centers of the user's profile is calculated. A synthetic data set is used for our experimental evaluations, modelling normal user access behavior in relation to the database. The false negative (FN) and false positive (FP) rates are used to compare our proposed algorithm with other methods. The experimental results indicate that our proposed method yields very small FN and FP rates.
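The distance-based check can be sketched as follows. The feature encoding of a query and the distance threshold are illustrative assumptions; the point is only that a query far from every cluster center of the issuing user's profile is flagged.

```python
# Sketch of the Euclidean-distance classifier: flag a query as malicious
# if it is farther than a threshold from all profile cluster centers.
# Encoding and threshold are hypothetical.
import numpy as np

def is_malicious(query_vec, profile_centers, threshold=1.0):
    dists = np.linalg.norm(profile_centers - query_vec, axis=1)
    return bool(dists.min() > threshold)

centers = np.array([[1.0, 0.0, 2.0], [0.0, 1.0, 1.0]])  # one user's profile
normal_query = np.array([0.9, 0.1, 2.1])  # close to a known access pattern
odd_query = np.array([5.0, 5.0, 5.0])     # unlike anything in the profile
```

In the proposed architecture, a `True` result would trigger blocking the query or reporting it.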
Funding: Supported by the Center for Mining, Electro-Mechanical Research of Hanoi University of Mining and Geology (HUMG), Hanoi, Vietnam; the Hunan Provincial Department of Education General Project (19C1744); the Hunan Province Science Foundation for Youth Scholars of China (2018JJ3510); and the Innovation-Driven Project of Central South University (2020CX040).
Abstract: Blasting is well known as an effective method for fragmenting or moving rock in open-pit mines. To evaluate blasting quality, the size distribution of the fragmented rock is used as a critical criterion in blasting operations; a high percentage of oversized rocks generated by blasting can lead to economic and environmental damage. Therefore, this study proposed four novel intelligent models to predict the size of rock distribution in mine blasting in order to optimize blasting parameters and the efficiency of blasting operations in open mines. Accordingly, a nature-inspired algorithm (the firefly algorithm, FFA) and different machine learning algorithms (gradient boosting machine (GBM), support vector machine (SVM), Gaussian process (GP), and artificial neural network (ANN)) were combined for this aim, abbreviated as FFA-GBM, FFA-SVM, FFA-GP, and FFA-ANN, respectively. Subsequently, the predictions of these models were compared with each other using three statistical indicators (mean absolute error, root-mean-squared error, and correlation coefficient) and the color intensity method. For developing and simulating the size of rock in blasting operations, 136 blasting events with their images were collected and analyzed with the Split-Desktop software. Of these, 111 events were randomly selected for the development and optimization of the models, and the remaining 25 blasting events were used to confirm the accuracy of the proposed models. Herein, blast design parameters were regarded as input variables to predict the size of rock in blasting operations. Finally, the obtained results revealed that the FFA is a robust optimization algorithm for estimating rock fragmentation in bench blasting. Among the models developed in this study, FFA-GBM provided the highest accuracy in predicting the size of fragmented rocks, whereas the other techniques (FFA-SVM, FFA-GP, and FFA-ANN) yielded lower computational stability and efficiency. Hence, the FFA-GBM model can be used as a powerful and precise soft computing tool in practical engineering cases aiming to improve the quality of blasting and rock fragmentation.
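The FFA component that drives the four hybrid models follows the standard firefly update (attractiveness decaying exponentially with squared distance). Below is a minimal hedged sketch minimizing a toy objective as a stand-in for the model prediction error being tuned; all parameter values are illustrative.

```python
# Standard firefly algorithm (FFA) sketch: each firefly moves toward
# brighter (lower-objective) ones with distance-damped attractiveness
# plus a decaying random walk. Objective here is a toy stand-in.
import numpy as np

def firefly(objective, dim, n=20, iters=100,
            alpha=0.2, beta0=1.0, gamma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(-2, 2, (n, dim))
    light = np.apply_along_axis(objective, 1, X)  # lower = brighter
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if light[j] < light[i]:  # move i toward brighter j
                    r2 = np.sum((X[i] - X[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    X[i] += beta * (X[j] - X[i]) + alpha * (rng.random(dim) - 0.5)
                    light[i] = objective(X[i])
        alpha *= 0.97  # gradually damp the random walk
    best = int(np.argmin(light))
    return X[best], float(light[best])

best_x, best_f = firefly(lambda x: float(np.sum(x ** 2)), dim=2)
```

In the paper's hybrids, `objective` would score a candidate hyper-parameter vector by the resulting GBM/SVM/GP/ANN prediction error.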
Abstract: Cloud computing infrastructure has evolved into a cost-effective platform for providing computational resources in the form of high-performance computing as a service (HPCaaS) to users executing HPC applications. However, the broader use of Cloud services and the rapid growth in the size and capacity of Cloud data centers bring a remarkable rise in energy consumption, leading to a significant increase in system provider expenses and in carbon emissions. Besides this, users have become more demanding in their Quality-of-Service (QoS) expectations in terms of execution time, budget cost, utilization, and makespan. This situation calls for a task scheduling policy that ensures efficient task sequencing and allocation of computing resources to tasks, meeting the trade-off between QoS promises and service provider requirements. Moreover, task scheduling in the Cloud is a well-known NP-Hard problem. Motivated by these concerns, this paper introduces and implements a QoS-aware energy-efficient scheduling policy, called CSPSO, for scheduling tasks in Cloud systems to reduce the energy consumption of cloud resources and minimize the makespan of the workload. The proposed multi-objective CSPSO policy hybridizes the search qualities of two robust meta-heuristics, cuckoo search (CS) and particle swarm optimization (PSO), to overcome the slow convergence and lack of diversity of the standard CS algorithm. A fitness-aware resource allocation (FARA) heuristic was developed and used by the proposed policy to allocate resources to tasks efficiently, and a velocity update mechanism for cuckoo individuals is designed and incorporated into the proposed CSPSO policy. Further, the proposed scheduling policy has been implemented in the CloudSim simulator and tested with real supercomputing workload traces. The comparative analysis validated that the proposed scheduling policy produces efficient schedules with better performance than other well-known heuristic and meta-heuristic scheduling policies.
Abstract: Maximum power point tracking (MPPT) has been studied extensively in recent years. Both traditional and soft computing methods are used for MPPT; since soft computing approaches are more effective than traditional ones, studies on MPPT have shifted in this direction. This study compares the performance of seven meta-heuristic training algorithms for neuro-fuzzy training in MPPT. The meta-heuristic training algorithms used are particle swarm optimization (PSO), harmony search (HS), cuckoo search (CS), the artificial bee colony (ABC) algorithm, the bee algorithm (BA), differential evolution (DE), and the flower pollination algorithm (FPA). The antecedent and conclusion parameters of the neuro-fuzzy system are determined by these algorithms. Data from a 250 W photovoltaic (PV) panel is used in the applications. For effective MPPT, different neuro-fuzzy structures, membership functions, and control parameter values are evaluated in detail. The training algorithms are compared in terms of solution quality and convergence speed, and their strengths and weaknesses are revealed. The type and number of membership functions, the colony size, and the number of generations are seen to affect the solution quality and convergence speed of the training algorithms. As a result, the CS and ABC algorithms are observed to be more effective than the other algorithms in terms of solution quality and convergence on this problem.
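To make "antecedent and conclusion parameters" concrete: in a Sugeno-style neuro-fuzzy system, the antecedents are membership-function parameters (e.g., Gaussian centers and widths) and the conclusions are linear coefficients, and these are exactly what the meta-heuristics tune. The rule base and values below are toy illustrations, not from the paper.

```python
# Toy Sugeno-style neuro-fuzzy evaluation: Gaussian memberships give rule
# firing strengths that weight linear conclusions. A meta-heuristic would
# tune (c, sigma, a, b) per rule. All values here are hypothetical.
import math

def gaussmf(x, c, sigma):
    return math.exp(-((x - c) ** 2) / (2 * sigma ** 2))

def sugeno_output(x, params):
    num = den = 0.0
    for (c, sigma, a, b) in params:  # one tuple per fuzzy rule
        w = gaussmf(x, c, sigma)     # antecedent: firing strength
        num += w * (a * x + b)       # conclusion: linear consequent
        den += w
    return num / den

# Two rules over one normalized input (e.g., PV voltage).
params = [(0.2, 0.1, 1.0, 0.0), (0.8, 0.1, -1.0, 1.0)]
y = sugeno_output(0.2, params)
```

A candidate solution for PSO, ABC, etc. is simply the flattened list of these per-rule tuples, scored by tracking error on the PV data.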
Abstract: Breast cancer is one of the most common kinds of cancer, and its early detection may help lower overall mortality rates. In this paper, we propose a novel approach for detecting and classifying breast cancer regions in thermal images. The proposed approach starts with preprocessing the input images and segmenting the significant regions of interest. In addition, to properly train the machine learning models, data augmentation is applied to increase the number of segmented regions using various scaling ratios. To extract the relevant features from the breast cancer cases, a set of deep neural networks (VGGNet, ResNet-50, AlexNet, and GoogLeNet) is employed. The resulting set of features is processed using the binary dipper throated algorithm to select the most effective features for achieving high classification accuracy. The selected features are used to train a neural network that finally classifies the thermal images of breast cancer. To achieve accurate classification, the parameters of the employed neural network are optimized using the continuous dipper throated optimization algorithm. Experimental results show the effectiveness of the proposed approach in classifying breast cancer cases compared to other recent approaches in the literature. Moreover, several experiments were conducted to compare the performance of the proposed approach with the other approaches, and their results emphasized its superiority.
Abstract: The non-invasive evaluation of the heart through ElectroCardioGraphy (ECG) has played a key role in detecting heart disease. The analysis of ECG signals requires years of learning and experience to interpret and extract useful information; thus, a computerized system is needed to classify ECG signals effectively and with more accurate results. Abnormal heart rhythms, called arrhythmias, can cause sudden cardiac death. In this work, a Computerized Abnormal Heart Rhythms Detection (CAHRD) system is developed using ECG signals. It consists of four stages: preprocessing, feature extraction, feature optimization, and classification. First, the Pan-Tompkins algorithm is employed to detect the envelope of the Q, R and S waves in the preprocessing stage; it uses a recursive filter to eliminate muscle noise, T-wave interference, and baseline wander. As the analysis of the ECG signal in the spatial domain does not provide a complete description of the signal, the feature extraction stage uses frequency contents obtained from multiple wavelet filters (bi-orthogonal, Symlet, and Daubechies) at different resolution levels. Then, Black Widow Optimization (BWO) is applied to optimize the hybrid wavelet features in the feature optimization stage. Finally, a kernel-based Support Vector Machine (SVM) is employed to classify heartbeats into five classes; Radial Basis Function (RBF), polynomial, and linear kernels are used. A total of ∼15000 ECG signals from the Massachusetts Institute of Technology-Beth Israel Hospital (MIT-BIH) arrhythmia database are used for performance evaluation of the proposed CAHRD system. Results show that the CAHRD system is a powerful tool for ECG analysis: it correctly classifies five classes of heartbeats with 99.91% accuracy using an RBF kernel with 2nd-level wavelet coefficients. The CAHRD system achieves an improvement of ∼6% over random projections with the ensemble SVM approach and ∼2% over morphological and ECG segment based features with the RBF classifier.
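The final classification stage (five heartbeat classes, RBF-kernel SVM) can be sketched with scikit-learn. The feature vectors below are synthetic stand-ins; the real system would feed BWO-selected wavelet coefficients from MIT-BIH records.

```python
# Sketch of the RBF-SVM heartbeat classifier on synthetic "wavelet-style"
# feature vectors for five classes. Data and dimensions are assumptions.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
classes, per_class, dim = 5, 100, 8
# Well-separated Gaussian clusters stand in for per-class feature vectors.
X = np.vstack([rng.normal(loc=3.0 * c, scale=0.5, size=(per_class, dim))
               for c in range(classes)])
y = np.repeat(np.arange(classes), per_class)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

Swapping `kernel="poly"` or `kernel="linear"` reproduces the paper's kernel comparison on the same features.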
Funding: Supported in part by the National Natural Science Foundation of China (NSFC) under Grant No. 61976242; in part by the Natural Science Fund of Hebei Province for Distinguished Young Scholars under Grant No. F2021202010; in part by the Fundamental Scientific Research Funds for Interdisciplinary Team of Hebei University of Technology under Grant No. JBKYTD2002; by the Science and Technology Project of Hebei Education Department under Grant No. JZX2023007; and by the 2022 Interdisciplinary Postgraduate Training Program of Hebei University of Technology under Grant No. HEBUT-YXKJC-2022122.
Abstract: Most neural network architectures are based on human experience, which requires a long and tedious trial-and-error process. Neural architecture search (NAS) attempts to discover effective architectures without human intervention. Evolutionary algorithms (EAs) for NAS can find better solutions than human-designed architectures by exploring a large search space of possible architectures. Using multiobjective EAs for NAS, optimal neural architectures that meet various performance criteria can be explored and discovered efficiently; furthermore, hardware-accelerated NAS methods can improve the efficiency of the search. While existing reviews have mainly focused on different strategies for completing NAS, few studies have explored the use of EAs for NAS. In this paper, we summarize and explore the use of EAs for NAS, as well as large-scale multiobjective optimization strategies and hardware-accelerated NAS methods. NAS performs well in healthcare applications such as medical image analysis, disease diagnosis classification, and health monitoring, and EAs for NAS can automate the search process and optimize multiple objectives simultaneously in a given healthcare task. However, deep neural networks, although used successfully in healthcare, lack interpretability, and medical data is highly sensitive, with privacy leaks frequently reported in the healthcare industry. To address these problems, we propose an interpretable neuroevolution framework for healthcare based on federated learning that tackles search efficiency and privacy protection. We also point out future research directions for evolutionary NAS. Overall, for researchers who want to use EAs to optimize NNs in healthcare, we analyze the advantages and disadvantages of doing so to provide detailed guidance, and propose an interpretable privacy-preserving framework for healthcare applications.
Funding: Supported by the National Natural Science Foundation of China (Grant No. 62076204); the Seed Foundation of Innovation and Creation for Graduate Students in Northwestern Polytechnical University (Grant No. CX2020019); and in part by the China Postdoctoral Science Foundation (Grant No. 2021M700337).
Abstract: Improving integrated battlefield situational awareness in complex environments involving dynamic factors such as restricted communications and electromagnetic interference (EMI) has become a challenging research problem. In certain mission environments, due to the impact of many interference sources on real-time communication, or due to mission requirements such as communication regulations, the mission stages form a dynamic combination of several communication-available and communication-unavailable stages, and data interaction between unmanned aerial vehicles (UAVs) can only be performed in specific communication-available stages. Traditional cooperative search algorithms cannot handle such situations well. To solve this problem, this study constructed a distributed model predictive control (DMPC) architecture for collaborative control of UAVs and used a Voronoi diagram generation method to re-plan the search areas of all UAVs in real time, avoiding repeated search areas and UAV collisions while improving search efficiency and safety. An attention mechanism ant-colony optimization (AACO) algorithm is proposed for UAV search-control decision planning. The search strategy is adaptively updated by introducing an attention mechanism for regular instruction information, a priori information, and emergent mission information, satisfying different search expectations to the maximum extent. Simulation results show that the proposed algorithm achieves better search performance than traditional algorithms in restricted-communication scenarios.
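The Voronoi re-partitioning step admits a compact discrete sketch: assign every grid cell of the search area to its nearest UAV, which by construction yields non-overlapping regions. UAV positions and grid size below are illustrative.

```python
# Discrete Voronoi partition of a search grid among UAVs: each cell goes
# to the nearest UAV, so regions never overlap. Positions are assumed.
import numpy as np

def voronoi_partition(uav_positions, grid_w, grid_h):
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]
    cells = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    # distance from every cell to every UAV; assign each to its nearest
    d = np.linalg.norm(cells[:, None, :] - uav_positions[None, :, :], axis=2)
    return d.argmin(axis=1).reshape(grid_h, grid_w)

uavs = np.array([[2.0, 2.0], [17.0, 2.0], [10.0, 15.0]])  # (x, y) per UAV
regions = voronoi_partition(uavs, grid_w=20, grid_h=20)   # regions[y, x]
```

In the DMPC loop, this partition would be recomputed at each communication-available stage as UAV positions change.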
Abstract: This paper presents a new bi-criteria mixed-integer programming model for scheduling cells and pieces within each cell in a cellular manufacturing system. The objective of this model is to minimize the makespan and intercell movements simultaneously, while considering sequence-dependent cell setup times. In cellular manufacturing system design and planning, three main steps must be considered: cell formation (i.e., piece families and machine grouping), inter- and intra-cell layouts, and scheduling. Because the cellular manufacturing scheduling problem is NP-Hard, a genetic algorithm is proposed as an efficient meta-heuristic to solve it. Finally, a number of test problems are solved to show the efficiency of the proposed genetic algorithm, and the computational results are compared with those obtained by an optimization tool.
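A permutation GA of the kind proposed here can be sketched on a plain flow-shop makespan objective. This is a hedged toy: sequence-dependent setup times and intercell movements are omitted, the processing times are invented, and the operators (order crossover, swap mutation, truncation selection) are common defaults rather than the paper's exact design.

```python
# Toy permutation GA minimizing flow-shop makespan. Processing times,
# operators, and GA parameters are illustrative assumptions.
import numpy as np

def makespan(seq, proc):  # proc[job, machine]
    n_m = proc.shape[1]
    end = np.zeros(n_m)
    for j in seq:
        for m in range(n_m):
            # job waits for this machine and for its own previous operation
            start = end[m] if m == 0 else max(end[m], end[m - 1])
            end[m] = start + proc[j, m]
    return float(end[-1])

def ga_schedule(proc, pop_size=40, gens=100, seed=0):
    rng = np.random.default_rng(seed)
    n = proc.shape[0]
    pop = [rng.permutation(n) for _ in range(pop_size)]
    for _ in range(gens):
        fit = np.array([makespan(p, proc) for p in pop])
        order = np.argsort(fit)
        pop = [pop[i] for i in order[: pop_size // 2]]  # keep best half
        children = []
        while len(pop) + len(children) < pop_size:
            a, b = rng.choice(len(pop), 2, replace=False)
            cut = int(rng.integers(1, n))
            head = pop[a][:cut]
            tail = [j for j in pop[b] if j not in head]  # order crossover
            child = np.concatenate([head, tail])
            i, k = rng.choice(n, 2, replace=False)
            child[i], child[k] = child[k], child[i]      # swap mutation
            children.append(child)
        pop += children
    fit = np.array([makespan(p, proc) for p in pop])
    best = int(np.argmin(fit))
    return pop[best], float(fit[best])

proc = np.array([[3, 2], [2, 4], [4, 1], [1, 3]])  # 4 jobs x 2 machines
seq, ms = ga_schedule(proc)
```

For this 4x2 instance, Johnson's rule gives an optimal makespan of 11, which the GA should reach easily.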
Abstract: Aiming at the practical application of Unmanned Underwater Vehicles (UUVs) in underwater combat, this paper proposes a battlefield ambush scenario for UUVs that considers ocean currents. First, by establishing mathematical models of the ocean current environment, target movement, and sonar detection, probability calculation methods are given for a single UUV searching for a target and for multiple UUVs cooperatively searching for a target. Then, based on the Hybrid Quantum-behaved Particle Swarm Optimization (HQPSO) algorithm, the path with the highest target search probability is found. Finally, through simulation, the influence of different UUV and target parameters on the target search probability is analyzed, the minimum number of UUVs needed to complete the ambush task is demonstrated, and the optimal search path scheme is obtained. The method proposed in this paper provides a theoretical basis for the practical application of UUVs in future combat.
Abstract: Cloud computing has emerged as a new style of computing in distributed environments. Efficient and dependable workflow scheduling is crucial for achieving high performance and integrating with enterprise systems. As an effective security-services aggregation methodology, Trust Workflow Technology (TWT) has been used to construct composite services. However, in the cloud environment, the existing closed network services are maintained and operated by third-party organizations or enterprises; therefore, service-oriented trust strategies must be considered in workflow scheduling. TWFS-related algorithms consist of trust policies and strategies to overcome application threats with heuristic workflow scheduling. As the main contribution of this work, trust-based meta-heuristic workflow scheduling (TMWS) is proposed. The TMWS algorithm improves the efficiency and reliability of operations in the cloud system, and the results show that the TMWS approach is effective and feasible.
Funding: This work was supported in part by the National Natural Science Foundation of China under Grant 61305001 and the Natural Science Foundation of Heilongjiang Province of China under Grant F201222.
Abstract: In differentiable architecture search methods, a more efficient search space design can significantly improve the performance of the searched architecture, which requires carefully defining search spaces of different complexity for various operations; rationalizing the search strategies that explore such a well-defined search space further improves the speed and efficiency of the architecture search. With this in mind, we propose a faster and more efficient differentiable architecture search method, AllegroNAS. First, we introduce a more efficient search space enriched by two redefined convolution modules. Second, we utilize a more efficient architectural parameter regularization method, mitigating overfitting during the search process and reducing the error introduced by gradient approximation. Meanwhile, we introduce a natural exponential cosine annealing method to make the learning rate of the neural network training process more suitable for the search procedure. Moreover, group convolution and data augmentation are employed to reduce the computational cost. Finally, through extensive experiments on several public datasets, we demonstrate that our method can more swiftly search for better-performing neural network architectures in a more efficient search space, validating the effectiveness of our approach.
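The abstract does not give the exact formula for "natural exponential cosine annealing"; as a hedged illustration, one plausible form combines the standard cosine annealing schedule with a natural-exponential decay envelope. The formula and constants below are assumptions, not necessarily the authors' definition.

```python
# Assumed-form schedule: cosine annealing multiplied by an exponential
# decay envelope, interpolating from lr_max down to lr_min.
import math

def exp_cosine_annealing(step, total_steps, lr_max=0.025, lr_min=0.001, decay=3.0):
    cos_term = 0.5 * (1 + math.cos(math.pi * step / total_steps))  # 1 -> 0
    envelope = math.exp(-decay * step / total_steps)               # 1 -> e^-decay
    return lr_min + (lr_max - lr_min) * cos_term * envelope

lrs = [exp_cosine_annealing(s, 100) for s in range(101)]
```

Relative to plain cosine annealing, the envelope front-loads the decay, spending more of the search at small learning rates.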
Abstract: Contract Bridge, a four-player imperfect-information game, comprises two phases: bidding and playing. While computer programs excel at playing, bidding remains challenging due to the need to exchange information with one's partner while interfering with the opponents' communication. In this work, we introduce a Bridge bidding agent that combines supervised learning, deep reinforcement learning via self-play, and a test-time search approach. Our experiments demonstrate that our agent outperforms WBridge5, a highly regarded computer Bridge program that has won multiple world championships, by 0.98 IMPs (international match points) per deal over 10000 deals, with a much more cost-effective approach. This performance significantly surpasses the previous state of the art (0.85 IMPs per deal); note that 0.1 IMPs per deal is considered a significant improvement in Bridge bidding.
Funding: Supported by the Natural Science Foundation of Hebei Province under Grant Number F2021201052.
Abstract: Electronic medical records (EMR) facilitate the sharing of medical data, but existing sharing schemes suffer from privacy leakage and inefficiency. This article proposes a lightweight, searchable, and controllable EMR sharing scheme that employs a large attribute domain and a linear secret sharing structure (LSSS); the computational overhead of encryption and decryption reaches a lightweight constant level, and the scheme supports keyword search and policy hiding, improving the efficiency of medical data sharing. Dynamic accumulator technology enables data owners to flexibly authorize or revoke data visitors' access rights, achieving controllability of the data. Meanwhile, the data is re-encrypted with Intel Software Guard Extensions (SGX) technology to resist offline dictionary guessing attacks. In addition, blockchain technology is utilized to achieve credible accountability for abnormal behaviors in the sharing process. Experiments show the scheme's clear advantages in encryption/decryption computation overhead and storage overhead, and the security and controllability of the sharing process are proven theoretically, providing a feasible solution for the safe and efficient sharing of EMR.