Artificial rabbits optimization (ARO) is a recently proposed biology-based optimization algorithm inspired by the detour foraging and random hiding behavior of rabbits in nature. However, for solving optimization problems, the ARO algorithm shows slow convergence speed and can fall into local minima. To overcome these drawbacks, this paper proposes chaotic opposition-based learning ARO (COARO), an improved version of the ARO algorithm that incorporates opposition-based learning (OBL) and chaotic local search (CLS) techniques. By adding OBL to ARO, the convergence speed of the algorithm increases and it explores the search space better. The chaotic maps in CLS provide rapid convergence by scanning the search space efficiently, owing to their ergodicity and non-repetitive properties. The proposed COARO algorithm has been tested on thirty-three distinct benchmark functions, and the outcomes have been compared with the most recent optimization algorithms. Additionally, the COARO algorithm's problem-solving capabilities have been evaluated on six different engineering design problems and compared with various other algorithms. This study also introduces a binary variant of the continuous COARO algorithm, named BCOARO, whose performance was evaluated on the breast cancer dataset and compared with different feature selection algorithms. According to the findings obtained for real applications, the proposed BCOARO outperforms alternative algorithms in terms of accuracy and fitness value. Extensive experiments show that the COARO and BCOARO algorithms achieve promising results compared to other metaheuristic algorithms.
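A minimal sketch of the opposition-based learning step described above: for each candidate x in [lb, ub], OBL evaluates the opposite point lb + ub − x and keeps whichever solution has better fitness. This is generic OBL, not the authors' exact COARO update rule, and the sphere objective below is only a placeholder.

```python
import numpy as np

def opposition_based_step(population, fitness_fn, lb, ub):
    """Evaluate the opposite of each candidate and keep the better one (minimization)."""
    opposite = lb + ub - population                     # element-wise opposite points
    fit_pop = np.apply_along_axis(fitness_fn, 1, population)
    fit_opp = np.apply_along_axis(fitness_fn, 1, opposite)
    keep_opposite = fit_opp < fit_pop                   # smaller fitness is better
    return np.where(keep_opposite[:, None], opposite, population)

# Example: one OBL pass on a small random population with a sphere objective
rng = np.random.default_rng(0)
lb, ub = -5.0, 5.0
pop = rng.uniform(lb, ub, size=(10, 3))
pop = opposition_based_step(pop, lambda x: float(np.sum(x ** 2)), lb, ub)
print(pop.shape)
```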
Contract Bridge, a four-player imperfect-information game, comprises two phases: bidding and playing. While computer programs excel at playing, bidding presents a challenging aspect due to the need to exchange information with one's partner while opponents interfere with that communication. In this work, we introduce a Bridge bidding agent that combines supervised learning, deep reinforcement learning via self-play, and a test-time search approach. Our experiments demonstrate that our agent outperforms WBridge5, a highly regarded computer Bridge program that has won multiple world championships, by 0.98 IMPs (international match points) per deal over 10,000 deals, with a much more cost-effective approach. This performance significantly surpasses the previous state of the art (0.85 IMPs per deal); note that an improvement of 0.1 IMPs per deal is considered significant in Bridge bidding.
With the rapid advancement of quantum computing, hybrid quantum–classical machine learning has shown numerous potential applications at the current stage, with expectations of being achievable in the noisy intermediate-scale quantum (NISQ) era. Quantum reinforcement learning, as an indispensable line of study, has recently demonstrated its ability to solve standard benchmark environments with formally provable theoretical advantages over classical counterparts. However, despite the progress of quantum processors and the emergence of quantum computing clouds, implementing quantum reinforcement learning algorithms that utilize parameterized quantum circuits (PQCs) on NISQ devices remains infrequent. In this work, we take the first step towards executing benchmark quantum reinforcement learning problems on real devices equipped with at most 136 qubits on the BAQIS Quafu quantum computing cloud. The experimental results demonstrate that the policy agents can successfully accomplish objectives under modified conditions in both the training and inference phases. Moreover, we design hardware-efficient PQC architectures in the quantum model using a multi-objective evolutionary algorithm and develop a learning algorithm that is adaptable to quantum devices. We hope that Quafu-RL can serve as a guiding example of how to realize machine learning tasks by taking advantage of quantum computers on a quantum cloud platform.
The pipeline isolation plugging robot (PIPR) is an important tool in pipeline maintenance operations. During the plugging process, the flow field induces violent vibration, which can cause serious damage to the pipeline and the PIPR. In this paper, we propose a dynamic regulating strategy to reduce the plugging-induced vibration by regulating the spoiler angle and plugging velocity. First, dynamic plugging simulations and experiments are performed to study the flow field changes during dynamic plugging, and the pressure difference is proposed to evaluate the degree of flow field vibration. Second, mathematical models relating the pressure difference to plugging states and spoiler angles are established based on an extreme learning machine (ELM) optimized by an improved sparrow search algorithm (ISSA). Finally, a modified Q-learning algorithm based on simulated annealing is applied to determine the optimal strategy for the spoiler angle and plugging velocity in real time. The results show that the proposed method reduces the plugging-induced vibration by 19.9% and 32.7% on average compared with the single-regulating methods. This study can effectively ensure the stability of the plugging process.
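To illustrate the idea of combining Q-learning with simulated annealing, the sketch below uses a temperature-decayed Boltzmann action selection inside a tabular Q-learning loop. It is a generic sketch only: `env_step` is a hypothetical environment function, and the paper's actual state and action definitions (spoiler angle, plugging velocity) are not reproduced here.

```python
import math
import random

def sa_q_learning(env_step, n_states, n_actions, episodes=200,
                  alpha=0.1, gamma=0.9, t0=1.0, cooling=0.995):
    """Tabular Q-learning with simulated-annealing-style (Boltzmann) exploration.

    env_step(state, action) is assumed to return (next_state, reward, done).
    """
    q = [[0.0] * n_actions for _ in range(n_states)]
    temperature = t0
    for _ in range(episodes):
        state, done = 0, False
        while not done:
            # Boltzmann selection: high temperature -> more exploration
            q_max = max(q[state])
            weights = [math.exp((q[state][a] - q_max) / max(temperature, 1e-6))
                       for a in range(n_actions)]
            action = random.choices(range(n_actions), weights=weights)[0]
            next_state, reward, done = env_step(state, action)
            best_next = max(q[next_state])
            q[state][action] += alpha * (reward + gamma * best_next - q[state][action])
            state = next_state
        temperature *= cooling        # annealing schedule shifts toward exploitation
    return q
```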
With the increasing demand for electrical services, wind farm layout optimization has become one of the biggest challenges we have to deal with. Despite the promising performance of heuristic algorithms on the route network design problem, the expressive capability and search performance of such algorithms on multi-objective problems remain unexplored. In this paper, the wind farm layout optimization problem is defined. Then, a multi-objective algorithm based on a Graph Neural Network (GNN) and the Variable Neighborhood Search (VNS) algorithm is proposed. The GNN provides the base representations for the subsequent search algorithm, so that the expressiveness and search accuracy of the algorithm can be improved. The multi-objective VNS algorithm is obtained by combining VNS with a multi-objective optimization algorithm to handle multiple objectives. The proposed algorithm is applied to an 18-node simulation example to evaluate the feasibility and practicality of the developed optimization strategy. The experiment on the simulation example shows that the proposed algorithm yields a reduction of 6.1% in Point of Common Coupling (PCC) over the current state-of-the-art algorithm, which means the proposed algorithm designs a layout that improves the quality of the power supply by 6.1% at the same cost. The ablation experiments show that the proposed algorithm improves power quality by more than 8.6% and 7.8% compared to the original VNS algorithm and the multi-objective VNS algorithm, respectively.
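For readers unfamiliar with VNS, a minimal single-objective skeleton is sketched below; the paper's multi-objective variant and its GNN-based representation are not shown, and the `neighborhoods` functions are placeholders.

```python
def variable_neighborhood_search(init, neighborhoods, cost, max_iter=100):
    """Generic VNS: try progressively larger neighborhoods, restart on improvement.

    `neighborhoods` is a list of functions, each returning a random neighbor
    of a solution under a progressively stronger perturbation.
    """
    best, best_cost = init, cost(init)
    for _ in range(max_iter):
        k = 0
        while k < len(neighborhoods):
            candidate = neighborhoods[k](best)     # shaking in neighborhood k
            candidate_cost = cost(candidate)
            if candidate_cost < best_cost:
                best, best_cost = candidate, candidate_cost
                k = 0                              # improvement: restart from first neighborhood
            else:
                k += 1                             # no improvement: widen the neighborhood
    return best, best_cost
```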
Precise and timely prediction of crop yields is crucial for food security and the development of agricultural policies. However, crop yield is influenced by multiple factors within complex growth environments. Previous research has paid relatively little attention to the interference of environmental factors and drought on the growth of winter wheat. Therefore, there is an urgent need for more effective methods to explore the inherent relationship between these factors and crop yield, making precise yield prediction increasingly important. This study used four types of indicators, including meteorological, crop growth status, environmental, and drought indices, from October 2003 to June 2019 in Henan Province as the basic data for predicting winter wheat yield. Using the sparrow search algorithm combined with random forest (SSA-RF) under different input indicators, the accuracy of winter wheat yield estimation was calculated. The estimation accuracy of SSA-RF was compared with partial least squares regression (PLSR), extreme gradient boosting (XGBoost), and random forest (RF) models. Finally, the optimal yield estimation method was used to predict winter wheat yield in three typical years. The findings are as follows: 1) SSA-RF demonstrates superior performance in estimating winter wheat yield compared to the other algorithms; the best yield estimation is achieved by combining all four types of indicators with SSA-RF (R² = 0.805, RRMSE = 9.9%). 2) Crop growth status and environmental indicators play significant roles in wheat yield estimation, accounting for 46% and 22% of the yield importance among all indicators, respectively. 3) Selecting indicators from October to April of the following year yielded the highest accuracy in winter wheat yield estimation, with an R² of 0.826 and an RMSE of 9.0%; yield estimates can therefore be completed two months before the winter wheat harvest in June. 4) The prediction performance is slightly affected by severe drought: compared with a severe drought year (2011, R² = 0.680) and a normal year (2017, R² = 0.790), the SSA-RF model has higher prediction accuracy for a wet year (2018, R² = 0.820). This study provides an innovative approach for remote sensing estimation of winter wheat yield.
Recently, human healthcare based on body sensor data has gained considerable interest across a wide variety of human–computer communication and pattern analysis research, owing to real-time applications such as smart healthcare systems. Although there are various ways of using distributed sensors to monitor people's behavior and vital signs, physical human action recognition (HAR) through body sensors gives useful information about the lifestyle and functionality of an individual. This article concentrates on the design of an Improved Transient Search Optimization with Machine Learning based Behavior Recognition (ITSOML-BR) technique using body sensor data. The presented ITSOML-BR technique collects data from different body sensors, namely electrocardiography (ECG), accelerometer, and magnetometer. In addition, the ITSOML-BR technique extracts features such as variance, mean, skewness, and standard deviation. Moreover, the presented ITSOML-BR technique executes a micro neural network (MNN), which can be employed for long-term healthcare monitoring and classification. Furthermore, the parameters related to the MNN model are optimally selected via the ITSO algorithm. The experimental result analysis of the ITSOML-BR technique is performed on the MHEALTH dataset. The comprehensive comparison study reported a higher result for the ITSOML-BR approach over other existing approaches, with a maximum accuracy of 99.60%.
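The statistical features named in the abstract (mean, variance, skewness, standard deviation) can be computed per sensor channel over a sliding window, as in the sketch below. The window length and channel layout are assumptions, not the paper's exact preprocessing.

```python
import numpy as np
from scipy.stats import skew

def window_features(window):
    """Per-channel mean, variance, skewness, and standard deviation
    for one (samples, channels) sensor window."""
    return np.concatenate([
        window.mean(axis=0),
        window.var(axis=0),
        skew(window, axis=0),
        window.std(axis=0),
    ])

# Example on a dummy 128-sample, 9-channel window (e.g., ECG + accelerometer + magnetometer)
features = window_features(np.random.randn(128, 9))
print(features.shape)  # (36,) -> 4 statistics x 9 channels
```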
Wind energy has been widely applied in power generation to alleviate climate problems. The wind turbine layout of a wind farm is a primary factor affecting power conversion efficiency, due to the wake effect that reduces the power outputs of wind turbines located downstream. Wind farm layout optimization (WFLO) aims to reduce the wake effect in order to maximize the power output of the wind farm. Nevertheless, the wake effect among wind turbines increases significantly as the number of wind turbines in the wind farm grows, which severely affects power conversion efficiency. Conventional heuristic algorithms suffer from low solution quality and local optima for large-scale WFLO under complex wind scenarios. Thus, a chaotic local search-based genetic learning particle swarm optimizer (CGPSO) is proposed to optimize large-scale WFLO problems. CGPSO is tested on four large-scale wind farms under four complex wind scenarios and compared with eight state-of-the-art algorithms. The experimental results indicate that CGPSO significantly outperforms its competitors in terms of performance, stability, and robustness. Specifically, a selection mechanism based on success and failure memories is proposed to choose a chaotic map for the chaotic local search, which improves solution quality. The parameters and search pattern of the chaotic local search are also analyzed for WFLO problems.
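A chaotic local search around the current global best can be sketched with a single logistic map, as below. CGPSO additionally selects among several chaotic maps using success and failure memories, which is omitted here; the search radius and objective are placeholders.

```python
import numpy as np

def chaotic_local_search(gbest, fitness_fn, lb, ub, radius=0.1, iters=50, z0=0.7):
    """Logistic-map chaotic local search around the incumbent best (minimization)."""
    best, best_fit = gbest.copy(), fitness_fn(gbest)
    z = np.full_like(gbest, z0)
    for _ in range(iters):
        z = 4.0 * z * (1.0 - z)                         # logistic map, values stay in (0, 1)
        candidate = gbest + radius * (ub - lb) * (2.0 * z - 1.0)
        candidate = np.clip(candidate, lb, ub)
        fit = fitness_fn(candidate)
        if fit < best_fit:
            best, best_fit = candidate, fit
    return best, best_fit

# Example with a sphere objective as a stand-in for the wind-farm power model
best, val = chaotic_local_search(np.array([1.0, -2.0]), lambda x: float(np.sum(x ** 2)), -5.0, 5.0)
print(best, val)
```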
As a complex, hot problem in the financial field, stock trend forecasting uses a large amount of data and many related indicators; hence it is difficult to obtain sustainable and effective results by relying on empirical analysis alone. Researchers in the field of machine learning have shown that random forest can form better judgements on this kind of problem and plays an auxiliary role in predicting stock trends. This study uses the historical trading data of four listed companies in the US stock market, and its purpose is to improve the performance of the random forest model in medium- and long-term stock trend prediction. The study applies the exponential smoothing method to process the initial data, calculates the relevant technical indicators as the candidate features, and proposes the D-RF-RS method to optimize the random forest. Because random forest is an ensemble learning model closely related to decision trees, the D-RF-RS method uses a decision tree to screen the importance of features and obtains the effective strong feature set of the model as input. Then, the parameter combination of the model is optimized through random parameter search. The experimental results show that the average accuracy of the random forest is increased by 0.17 after this optimization process, which is 0.18 higher than the average accuracy of the light gradient boosting machine model. Combined with the performance of the ROC curve and the Precision–Recall curve, the stability of the model is also guaranteed, which further demonstrates the advantages of random forest in medium- and long-term trend prediction of the stock market.
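The two stages of D-RF-RS (decision-tree feature screening followed by random parameter search) can be sketched with standard scikit-learn components. The synthetic data, importance cutoff, and parameter grid below are assumptions standing in for the paper's engineered technical indicators and settings.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

# Synthetic stand-in for the technical-indicator feature matrix
X, y = make_classification(n_samples=500, n_features=30, n_informative=10, random_state=0)

# Stage 1: screen features by decision-tree importance (the "D" in D-RF-RS)
tree = DecisionTreeClassifier(random_state=0).fit(X, y)
strong = np.argsort(tree.feature_importances_)[-15:]     # keep the strongest 15 (assumed cutoff)
X_strong = X[:, strong]

# Stage 2: random search over random-forest parameters (the "RS" in D-RF-RS)
param_dist = {"n_estimators": [100, 200, 400],
              "max_depth": [5, 10, None],
              "min_samples_leaf": [1, 2, 5]}
search = RandomizedSearchCV(RandomForestClassifier(random_state=0),
                            param_dist, n_iter=10, cv=5, random_state=0)
search.fit(X_strong, y)
print(search.best_params_, round(search.best_score_, 3))
```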
The exponential increase in data over the past few years, particularly in images, has led to more complex content, since visual representation has become the new norm. E-commerce and similar platforms maintain large image catalogues of their products. In image databases, searching for and retrieving similar images is still a challenge, even though several image retrieval techniques have been proposed over the past decade. Most of these techniques work well when querying general image databases. However, they often fail in domain-specific image databases, especially for datasets with low intraclass variance. This paper proposes a domain-specific image similarity search engine based on a fused deep learning network. The network comprises an improved object localization module, a classification module to narrow down search options, and finally a feature extraction and similarity calculation module. The network features both an offline stage for indexing the dataset and an online stage for querying. The dataset used to evaluate the performance of the proposed network is a custom domain-specific dataset related to cosmetics packaging gathered from various online platforms. The proposed method addresses the intraclass variance problem with more precise object localization and the introduction of top-result reranking based on object contours. Finally, quantitative and qualitative experimental results are presented, showing improved image similarity search performance.
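The offline indexing and online querying stages reduce, at their core, to similarity ranking over feature vectors; a minimal cosine-similarity sketch is shown below. The deep network that produces the embeddings (localization, classification, feature extraction) and the contour-based reranking step are not reproduced here.

```python
import numpy as np

def build_index(embeddings):
    """Offline stage: L2-normalize catalogue embeddings so a dot product equals cosine similarity."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    return embeddings / np.clip(norms, 1e-12, None)

def query(index, query_vec, top_k=5):
    """Online stage: rank catalogue images by cosine similarity to the query embedding."""
    q = query_vec / max(np.linalg.norm(query_vec), 1e-12)
    scores = index @ q
    order = np.argsort(-scores)[:top_k]
    return order, scores[order]

# Dummy 1000-image catalogue with 512-D embeddings
idx = build_index(np.random.randn(1000, 512))
print(query(idx, np.random.randn(512), top_k=3))
```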
Presently, smart cities play a vital role in enhancing the quality of life of human beings in several ways, such as online shopping, e-learning, e-healthcare, etc. Despite the benefits of advanced technologies, issues also arise from the transformation of the physical world into the digital world, particularly in online social networks (OSN). Cyberbullying (CB) is a major problem in OSN that needs to be addressed through automated natural language processing (NLP) and machine learning (ML) approaches. This article devises a novel search and rescue optimization with machine learning enabled cybersecurity model for online social networks, named SRO-MLCOSN. The presented SRO-MLCOSN model focuses on the identification of CB that occurs on social networking sites. The SRO-MLCOSN model initially employs the GloVe technique for the word embedding process. Besides, a multiclass-weighted kernel extreme learning machine (M-WKELM) model is utilized for effectual identification and categorization of CB. Finally, the Search and Rescue Optimization (SRO) algorithm is exploited to fine-tune the parameters involved in the M-WKELM model. The experimental validation of the SRO-MLCOSN model on a benchmark dataset reported significant outcomes over the other approaches, with precision, recall, and F1-score of 96.24%, 98.71%, and 97.46%, respectively.
Autism Spectrum Disorder (ASD) refers to a neuro-disorder in which an individual experiences long-lasting effects on communication and interaction with others. Advanced information technology that employs artificial intelligence (AI) models has assisted in the early identification of ASD through pattern detection. Recent advances in AI models assist in the automated identification and classification of ASD, which helps to reduce the severity of the disease. This study introduces an automated ASD classification using owl search algorithm with machine learning (ASDC-OSAML) model. The proposed ASDC-OSAML model mainly focuses on the identification and classification of ASD. To attain this, the presented ASDC-OSAML model follows a min-max normalization approach as a pre-processing stage. Next, the owl search algorithm (OSA)-based feature selection (OSA-FS) model is used to derive feature subsets. Then, the beetle swarm antenna search (BSAS) algorithm with the Iterative Dichotomiser 3 (ID3) classification method is applied for ASD detection and classification. The design of the BSAS algorithm helps to determine the parameter values of the ID3 classifier. The performance analysis of the ASDC-OSAML model is performed on a benchmark dataset. An extensive comparison study highlighted the supremacy of the ASDC-OSAML model over recent state-of-the-art approaches.
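The pre-processing and classification stages can be illustrated as below: min-max normalization followed by an entropy-criterion decision tree standing in for ID3 (scikit-learn has no exact ID3 implementation). The synthetic data replaces the ASD screening dataset, and the OSA feature selection and BSAS parameter tuning stages are not shown.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for an ASD screening dataset
X, y = make_classification(n_samples=400, n_features=20, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1)

scaler = MinMaxScaler().fit(X_tr)                         # min-max normalization step
X_tr_n, X_te_n = scaler.transform(X_tr), scaler.transform(X_te)

# Entropy-criterion tree as an ID3-like stand-in classifier
clf = DecisionTreeClassifier(criterion="entropy", random_state=1).fit(X_tr_n, y_tr)
print("test accuracy:", round(clf.score(X_te_n, y_te), 3))
```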
Currently, the second most devastating form of cancer in people, particularly in women, is Breast Cancer (BC). In the healthcare industry, Machine Learning (ML) is commonly employed in fatal disease prediction. Because breast cancer has a favourable prognosis at an early stage, a model is created using the Wisconsin Diagnostic Breast Cancer (WDBC) dataset. The overarching aim of this model is to compare the effectiveness of five well-known ML classifiers, namely Logistic Regression (LR), Decision Tree (DT), Random Forest (RF), K-Nearest Neighbor (KNN), and Naive Bayes (NB), against the conventional method. To improve on the conventional method, the main tactic we utilized was hyperparameter tuning with the grid search method, which improved accuracy, precision, recall, F1 score, and the AUC-ROC curve. With this hyperparameter-tuned model, the accuracy increased from 94.15% to 98.83%, whereas the accuracy of the conventional method increased from 93.56% to 97.08%. According to this investigation, KNN outperformed all other classifiers in terms of accuracy, achieving a score of 98.83%. In conclusion, our study shows that KNN works well with the hyperparameter tuning method. These analyses show that this prediction approach is useful for prognosticating women with breast cancer, with viable performance and more accurate findings compared to the conventional approach.
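A grid-search-tuned KNN on the WDBC dataset can be sketched directly with scikit-learn, which ships this dataset. The split, scaling choice, and parameter grid below are assumptions rather than the study's exact configuration.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)                 # WDBC dataset
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

pipe = make_pipeline(StandardScaler(), KNeighborsClassifier())
grid = {"kneighborsclassifier__n_neighbors": [3, 5, 7, 9, 11],
        "kneighborsclassifier__weights": ["uniform", "distance"]}
search = GridSearchCV(pipe, grid, cv=5, scoring="accuracy")  # grid search hyperparameter tuning
search.fit(X_tr, y_tr)
print(search.best_params_, round(search.score(X_te, y_te), 4))
```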
Word Sense Disambiguation has been a trending topic of research in Natural Language Processing and Machine Learning. Mining core features and performing text classification remain challenging tasks. Here, context features such as neighboring words (for example, adjectives) provide the evidence for classification using a machine learning approach. This paper presents text document classification, which has wide applications in information retrieval, using movie review datasets. Related applications include document indexing based on controlled vocabularies, word sense disambiguation, hierarchical categorization of web pages, spam detection, topic labeling, web search, and document summarization. A kernel support vector machine learning algorithm classifies the text, and feature extraction is performed by cuckoo search optimization. Positive and negative reviews from the movie dataset are used to obtain better classification accuracy. Experimental results focus on context mining, feature analysis, and classification. Compared with previous work, the proposed approach is designed to achieve more efficient results. The overall design is implemented with the MATLAB 2020a tool.
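As a minimal illustration of review classification with a kernel SVM, the sketch below uses TF-IDF features and a tiny placeholder corpus. It is not the paper's MATLAB pipeline: the adjective-based context features and the cuckoo search feature selection stage are omitted.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Tiny placeholder corpus standing in for the movie review dataset
reviews = ["a wonderful and moving film", "brilliant acting, great story",
           "a dull and tedious movie", "terrible plot and awful pacing"]
labels = [1, 1, 0, 0]                                   # 1 = positive, 0 = negative

model = make_pipeline(TfidfVectorizer(), SVC(kernel="rbf"))   # kernel SVM text classifier
model.fit(reviews, labels)
print(model.predict(["an awful, boring story", "a great film"]))
```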
Because of everyone's involvement in social networks, social networks are full of massive multimedia data, and events are released and disseminated through social networks in the form of multi-modal and multi-attribute heterogeneous data. There has been extensive research on social network search. Considering the spatio-temporal features of messages and the social relationships among users, we summarize an overall social network search framework from the perspective of semantics, based on existing research. For social network search, the acquisition and representation of spatio-temporal data is the basis, the semantic analysis and modeling of social network cross-media big data is an important component, deep semantic learning of social networks is the key research field, and the indexing and ranking mechanism is the indispensable part. This paper reviews the current studies in these fields, and then the main challenges of social network search are given. Finally, we give an outlook on the prospects and future work of social network search.
The Internet of Things (IoT) provides better solutions in various fields, namely healthcare, smart transportation, the home, etc. Recognizing Denial of Service (DoS) outbreaks in IoT platforms is significant for certifying the accessibility and integrity of IoT systems. Deep learning (DL) models excel at detecting complex, non-linear relationships, allowing them to effectively discern slight deviations from normal IoT activities that may signal a DoS outbreak. The uninterrupted observation and real-time detection capabilities of DL contribute to accurate and rapid detection, permitting proactive mitigation actions to be executed and hence securing the IoT network's safety and functionality. Accordingly, this study presents a pigeon-inspired optimization with a DL-based attack detection and classification (PIODL-ADC) approach in an IoT environment. The PIODL-ADC approach implements a hyperparameter-tuned DL method for Distributed Denial-of-Service (DDoS) attack detection on an IoT platform. Initially, the PIODL-ADC model utilizes Z-score normalization to scale the input data into a uniform format. To handle the complex and adaptive behaviors of IoT, the PIODL-ADC model employs the pigeon-inspired optimization (PIO) method for feature selection to detect the most relevant features, considerably enhancing recognition accuracy. Also, the Elman Recurrent Neural Network (ERNN) model is utilized to recognize and classify DDoS attacks. Moreover, reptile search algorithm (RSA)-based hyperparameter tuning is employed to improve the precision and robustness of the ERNN method. A series of experimental validations is made to ensure the accomplishment of the PIODL-ADC method. The experimental outcomes exhibit that the PIODL-ADC method performs better than existing models, with a maximum accuracy of 99.81%.
Cloud computing environments, characterized by dynamic scaling, distributed architectures, and complex workloads, are increasingly targeted by malicious actors. These threats encompass unauthorized access, data breaches, denial-of-service attacks, and evolving malware variants. Traditional security solutions often struggle with the dynamic nature of cloud environments, highlighting the need for robust Adaptive Cloud Intrusion Detection Systems (CIDS). Existing adaptive CIDS solutions, while offering improved detection capabilities, often face limitations such as reliance on approximations for change point detection, hindering their precision in identifying anomalies. This can lead to missed attacks or an abundance of false alarms, impacting overall security effectiveness. To address these challenges, we propose ACIDS-PELT, an Adaptive Cloud Intrusion Detection System based on the Pruned Exact Linear Time (PELT) algorithm. This novel adaptive CIDS framework leverages the PELT algorithm and a Support Vector Machine (SVM) for enhanced accuracy and efficiency. ACIDS-PELT comprises four key components: (1) Feature Selection: a hybrid harmony search algorithm with a symmetrical uncertainty filter (HSO-SU) identifies the most relevant features that effectively differentiate between normal and anomalous network traffic in the cloud environment. (2) Surveillance: the PELT algorithm detects change points within the network traffic data, enabling the identification of anomalies and potential security threats with improved precision compared to existing approaches. (3) Training Set: labeled network traffic data forms the training set used to train the SVM classifier to distinguish between normal and anomalous behaviour patterns. (4) Testing Set: the testing set evaluates ACIDS-PELT's performance by measuring its accuracy, precision, and recall in detecting security threats within the cloud environment. We evaluate the performance of ACIDS-PELT using the NSL-KDD benchmark dataset. The results demonstrate that ACIDS-PELT outperforms existing cloud intrusion detection techniques in terms of accuracy, precision, and recall. This superiority stems from ACIDS-PELT's ability to overcome the limitations associated with approximation and imprecision in change point detection, while offering a more accurate and precise approach to detecting security threats in dynamic cloud environments.
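The surveillance and classification components can be sketched with the `ruptures` library's PELT implementation and a scikit-learn SVM. The penalty value, the synthetic traffic signal, and the dummy labeled features below are assumptions; the HSO-SU feature selection stage and NSL-KDD preprocessing are omitted.

```python
import numpy as np
import ruptures as rpt                       # PELT implementation (pip install ruptures)
from sklearn.svm import SVC

# Surveillance step: PELT finds change points in a traffic-statistic stream
signal = np.concatenate([np.random.normal(0, 1, 300),
                         np.random.normal(4, 1, 100)])    # simulated anomaly burst
change_points = rpt.Pelt(model="rbf").fit(signal.reshape(-1, 1)).predict(pen=10)
print("detected change points:", change_points)

# Classification step: an SVM labels flagged traffic as normal/anomalous,
# trained here on dummy feature vectors in place of real labeled traffic
X_train = np.random.randn(200, 5)
y_train = (X_train[:, 0] > 0).astype(int)
clf = SVC(kernel="rbf").fit(X_train, y_train)
print(clf.predict(np.random.randn(3, 5)))
```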