In the continuous development of the modern highway and bridge engineering industry, the reasonable selection and design of mega highway bridges is crucial. On this basis, this paper takes an actual bridge project as an example and analyses the overall selection design of such highway bridges, covering a basic overview of the project, the basic principles for selecting the structure of a mega highway bridge project, and the corresponding design strategies, in order to provide a scientific reference for selection design.
In classification problems, datasets often contain a large number of features, but not all of them are relevant for accurate classification. In fact, irrelevant features may even hinder classification accuracy. Feature selection aims to alleviate this issue by minimizing the number of features in the selected subset while simultaneously minimizing the classification error rate. Single-objective optimization approaches employ an evaluation function designed as an aggregate function with a parameter, but the results obtained depend on the value of that parameter. To eliminate the parameter's influence, the problem can be reformulated as a multi-objective optimization problem. The Whale Optimization Algorithm (WOA) is widely used in optimization problems because of its simplicity and easy implementation. In this paper, we propose a multi-strategy assisted multi-objective WOA (MSMOWOA) to address feature selection. To enhance the algorithm's search ability, we integrate multiple strategies, such as Levy flight, the Grey Wolf Optimizer, and adaptive mutation, into it. Additionally, we utilize an external repository to store non-dominated solution sets, and grid technology is used to maintain diversity. Results on fourteen University of California Irvine (UCI) datasets demonstrate that our proposed method effectively removes redundant features and improves classification performance. The source code can be accessed from the website: https://github.com/zc0315/MSMOWOA.
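The two minimized objectives (subset size and error rate) and the external repository described above can be sketched with a Pareto-dominance test and a non-dominated archive update. This is a minimal illustration of the general mechanism, not the paper's implementation; the grid-based diversity maintenance is omitted, and each solution is represented only by its objective pair.

```python
def dominates(a, b):
    """True when a dominates b: no worse in every objective, strictly better in
    at least one (both objectives minimized: feature count, error rate)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """External repository update: reject a dominated candidate, otherwise add
    it and drop every archived solution it dominates."""
    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]
```

A full implementation would carry the binary feature mask alongside the objective pair and bound the archive size with the grid technique.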
In this study, our aim is to address the problem of gene selection by proposing a hybrid bio-inspired evolutionary algorithm that combines Grey Wolf Optimization (GWO) with Harris Hawks Optimization (HHO) for feature selection. The motivation for utilizing GWO and HHO stems from their bio-inspired nature and their demonstrated success in optimization problems. We aim to leverage the strengths of these algorithms to enhance the effectiveness of feature selection in microarray-based cancer classification. We selected leave-one-out cross-validation (LOOCV) to evaluate the performance of two widely used classifiers, k-nearest neighbors (KNN) and support vector machine (SVM), on high-dimensional cancer microarray data. The proposed method is extensively tested on six publicly available cancer microarray datasets, and a comprehensive comparison with recently published methods is conducted. Our hybrid algorithm demonstrates its effectiveness in improving classification performance, surpassing alternative approaches in terms of precision. The outcomes confirm the capability of our method to substantially improve both the precision and efficiency of cancer classification, thereby advancing the development of more efficient treatment strategies. The proposed hybrid method offers a promising solution to the gene selection problem in microarray-based cancer classification. It improves the accuracy and efficiency of cancer diagnosis and treatment, and its superior performance compared to other methods highlights its potential applicability in real-world cancer classification tasks. By harnessing the complementary search mechanisms of GWO and HHO, we leverage their bio-inspired behavior to identify informative genes relevant to cancer diagnosis and treatment.
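The LOOCV protocol used above to score candidate gene subsets can be sketched as follows. A toy 1-nearest-neighbour classifier stands in for the paper's KNN/SVM evaluators, and the function name is illustrative.

```python
def loocv_1nn(X, y):
    """Leave-one-out accuracy of a 1-nearest-neighbour classifier: each sample
    is held out in turn and classified by its nearest remaining neighbour."""
    correct = 0
    for i in range(len(X)):
        nearest = min((j for j in range(len(X)) if j != i),
                      key=lambda j: sum((a - b) ** 2 for a, b in zip(X[i], X[j])))
        correct += y[nearest] == y[i]
    return correct / len(X)
```

In a wrapper feature-selection loop, `X` would be restricted to the columns picked by the current GWO/HHO candidate before calling the evaluator.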
Genomic selection (GS) has been widely used in livestock, where it has greatly accelerated the genetic progress of complex traits. Population size is one of the significant factors affecting prediction accuracy, but it is limited by the size of the purebred population. Compared to directly combining two uncorrelated purebred populations to extend the reference population, it may be more meaningful to incorporate correlated crossbreds into the reference population for genomic prediction. In this study, we simulated purebred offspring (PAS and PBS) and crossbred offspring (CAB) based on real genotype data of two base purebred populations (PA and PB) to evaluate the performance of genomic selection on purebreds while incorporating crossbred information. The results showed that selecting key crossbred individuals by maximizing the expected genetic relationship (REL) was better than the other methods (individuals closest to or farthest from the purebred population, CP/FP) in terms of prediction accuracy. Furthermore, the prediction accuracy of a reference population combining PA and CAB was significantly better than that based on PA alone, and was similar to that of combining PA and PAS. Moreover, the rank correlation between the multiple of the increased relationship (MIR) and the reliability improvement was 0.60-0.70. However, for individuals with low correlation (Cor(Pi, PA or B)), the reliability improvement was significantly lower than for other individuals. Our findings suggest that incorporating crossbreds into the purebred reference population can improve the performance of genomic prediction compared with using the purebred population only. The genetic relationship between the purebred and crossbred populations is a key factor determining the increased reliability when incorporating a crossbred population into genomic prediction on purebred individuals.
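A simplified version of picking crossbred individuals by their genetic relationship to the purebred reference might look like the following. Here `G` is assumed to be a precomputed genomic relationship matrix, and this greedy mean-relationship score is only a stand-in for the paper's expected-relationship (REL) criterion.

```python
def select_by_relationship(G, purebred, crossbred, k):
    """Pick the k crossbred individuals with the highest mean genomic
    relationship to the purebred reference set. G maps individual id ->
    {individual id -> relationship coefficient}."""
    def score(c):
        return sum(G[c][p] for p in purebred) / len(purebred)
    return sorted(crossbred, key=score, reverse=True)[:k]
```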
Federated learning is an important distributed model training technique in the Internet of Things (IoT), in which participant selection is a key component for improving training efficiency and model accuracy. This module enables a central server to select a subset of participants to perform model training based on data and device information. By doing so, selected participants are rewarded and actively perform model training, while participants that are detrimental to training efficiency and model accuracy are excluded. However, in practice, participants may suspect that the central server has miscalculated and thus not made the selection honestly. This lack of trustworthiness, which can demotivate participants, has received little attention. Another problem that has received little attention is the leakage of participants' private information during the selection process. We therefore propose a federated learning framework with auditable participant selection. It supports smart contracts in selecting a set of suitable participants based on their training loss without compromising privacy. Considering the possibility of malicious campaigning and impersonation of participants, the framework employs commitment schemes and zero-knowledge proofs to counteract these malicious behaviors. Finally, we analyze the security of the framework and conduct a series of experiments to demonstrate that the framework can effectively improve the efficiency of federated learning.
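A hash-based commitment scheme of the kind mentioned above can be sketched in a few lines. This shows only the generic commit/verify mechanics (hiding and binding rest on SHA-256), not the paper's smart-contract integration or its zero-knowledge proofs.

```python
import hashlib
import secrets

def commit(value: bytes):
    """Commit to a value (e.g. an encoded training loss) without revealing it;
    the committer later opens by publishing (value, nonce)."""
    nonce = secrets.token_bytes(16)
    return hashlib.sha256(nonce + value).hexdigest(), nonce

def verify(commitment: str, value: bytes, nonce: bytes) -> bool:
    """Check that (value, nonce) opens the earlier commitment."""
    return hashlib.sha256(nonce + value).hexdigest() == commitment
```

A participant would publish the commitment before selection and open it afterwards, letting anyone audit that the reported loss was not changed in between.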
Scallop culture is an important form of bottom-seeding marine ranching, which is of great significance for improving the current situation of fishery resources. However, there are some problems in the site-selection evaluation of marine ranching, such as an imperfect criteria system, complex structure, and untargeted criteria quantification. In addition, no site-selection evaluation method for bottom-seeding scallop culture areas is available. Therefore, we established a hierarchy structure model according to analytic hierarchy process (AHP) theory, in which the social, physical, chemical, and biological environments are used as main criteria, and marine functional zonation, water depth, current, water temperature, salinity, substrate type, water quality, sediment quality, red tide, phytoplankton, and zooplankton are used as sub-criteria, on which a multi-parameter evaluation system is set up. Meanwhile, the dualism method, assignment method, and membership function method were used to quantify the sub-criteria, and a quantitative evaluation for the entire criteria set was added, including the evaluation and analysis of two types of unsuitable environmental situations. By overall consideration of scallop yield, quality, and marine ranching construction objectives, the weights of the main criteria could be determined. Five suitability grades corresponding to the evaluation result were defined, and the Python language was used to create an evaluation system for efficient calculation and intuitive presentation of the evaluation outcome. Eight marine cases were simulated based on existing survey data, and the results prove that the method is feasible for evaluating and analyzing the site selection of bottom-seeding culture areas for scallops under various environmental situations. The proposed evaluation method can be extended to the site selection of bottom-seeding marine ranching in general. This study provides theoretical and methodological references for the site-selection evaluation of other types of marine ranching.
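The AHP weighting step can be illustrated with the row-geometric-mean approximation to the principal eigenvector of a pairwise-comparison matrix. The 3x3 matrix below is illustrative, not the study's actual criteria comparisons, and the consistency-ratio check is omitted.

```python
from math import prod

def ahp_weights(M):
    """Criterion weights from an AHP pairwise-comparison matrix M (M[i][j] is
    how strongly criterion i is preferred over j), using the row geometric
    mean as an approximation to the principal eigenvector."""
    n = len(M)
    gm = [prod(row) ** (1.0 / n) for row in M]  # geometric mean of each row
    total = sum(gm)
    return [g / total for g in gm]              # normalize to sum to 1
```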
Manganese superoxide dismutase (MnSOD) is an antioxidant that exists in mitochondria and can effectively remove superoxide anions there. In the dark, high-pressure, low-temperature deep-sea environment, MnSOD is essential for the survival of sea cucumbers. Six MnSODs were identified from the transcriptomes of deep-sea and shallow-sea sea cucumbers. To explore their environmental adaptation mechanism, we conducted environmental selection pressure analysis with the branch-site model of the PAML software. We obtained nine positive selection sites, two of which were significant (97F→H, 134K→V). 97F→H is located in a highly conserved characteristic sequence, and its polarity change might have a great impact on the function of MnSOD; 134K→V showed a change in piezophilic ability, which might help MnSOD adapt to the high hydrostatic pressure of the deep sea. To further study the effect of these two positive selection sites on MnSOD, we predicted the effects of the point mutations F97H and K134V in shallow-sea sea cucumber MnSOD using MAESTROweb and PyMOL. The results show that 97F→H and 134K→V might improve MnSOD's efficiency in scavenging superoxide anions and its ability to resist high hydrostatic pressure by moderately reducing its stability. These results indicate that the MnSODs of deep-sea sea cucumbers adapted to deep-sea environments through amino acid changes affecting polarity, piezophilic behavior, and local stability. This study reveals the correlation between MnSOD and extreme environments and will help improve our understanding of organisms' adaptation mechanisms in the deep sea.
This study employs a data-driven methodology that embeds the principle of dimensional invariance into an artificial neural network to automatically identify, from experimental measurements, the dominant dimensionless quantities in the penetration of rod projectiles into semi-infinite metal targets. The derived mathematical expressions of the dimensionless quantities are simplified by examining the exponent matrix and the coupling relationships between feature variables. As a physics-based dimension reduction methodology, this approach reduces high-dimensional parameter spaces to descriptions involving only a few physically interpretable dimensionless quantities for penetration cases. The relative importance of the various dimensionless feature variables to the penetration efficiencies for four impact conditions is then evaluated through feature selection engineering. The results indicate that the critical dimensionless feature variables selected by this synergistic method, without reference to complex theoretical equations or detailed knowledge of penetration mechanics, are in accordance with those reported in the references. Lastly, the determined dimensionless quantities can be efficiently applied to conduct semi-empirical analysis for specific penetration cases, and the reliability of the regression functions is validated.
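Dimensionless groups of the kind the network recovers can also be obtained classically from the null space of the dimensional exponent matrix (the Buckingham Pi construction). The sketch below uses exact rational arithmetic and a textbook variable set (density, velocity, length, viscosity), not the penetration variables from the study.

```python
from fractions import Fraction

def null_space(A):
    """Rational null-space basis of matrix A via Gauss-Jordan elimination.
    Each basis vector is a set of exponents defining one dimensionless group."""
    A = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(A), len(A[0])
    pivots, r = [], 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if A[i][c] != 0), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        A[r] = [x / A[r][c] for x in A[r]]
        for i in range(rows):
            if i != r and A[i][c] != 0:
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
        if r == rows:
            break
    basis = []
    for f in (c for c in range(cols) if c not in pivots):
        v = [Fraction(0)] * cols
        v[f] = Fraction(1)
        for i, p in enumerate(pivots):
            v[p] = -A[i][f]
        basis.append(v)
    return basis

# Dimension matrix: rows = M, L, T; columns = density, velocity, length, viscosity.
D = [[ 1,  0,  0,  1],
     [-3,  1,  1, -1],
     [ 0, -1,  0, -1]]
groups = null_space(D)  # one basis vector: exponents of a Reynolds-like group
```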
In recent years, deep learning-based signal recognition technology has gained attention and emerged as an important approach for safeguarding the electromagnetic environment. However, training deep learning-based classifiers on large signal datasets with redundant samples requires significant memory and incurs high costs. This paper proposes a support data-based core-set selection method (SD) for signal recognition, aiming to screen a representative subset that approximates the large signal dataset. Specifically, this subset can be identified by exploiting the label information during the early stages of model training, as some training samples are frequently labeled as supporting data. This support data is crucial for model training and can be found using a border sample selector. Simulation results demonstrate that the SD method minimizes the impact on model recognition performance while reducing the dataset size, and it outperforms five other state-of-the-art core-set selection methods when the fraction of training samples kept is less than or equal to 0.3 on the RML2016.04C dataset or 0.5 on the RML22 dataset. The SD method is particularly helpful for signal recognition tasks with limited memory and computing resources.
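The border-sample idea, keeping the samples that stay closest to the decision boundary during early training, can be caricatured as follows. Both `margin_history` (per-sample classifier margins recorded over early epochs) and the selection-by-smallest-mean-margin rule are illustrative assumptions, not the SD method itself.

```python
def select_coreset(margin_history, fraction):
    """Keep the given fraction of samples whose average margin over the
    recorded epochs is smallest, i.e. the samples nearest the boundary.
    margin_history: {sample_id: [margin_epoch1, margin_epoch2, ...]}."""
    mean_margin = {i: sum(m) / len(m) for i, m in margin_history.items()}
    k = max(1, int(fraction * len(margin_history)))
    return sorted(mean_margin, key=mean_margin.get)[:k]
```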
The selection of important factors in machine learning-based susceptibility assessments is crucial for obtaining reliable susceptibility results. In this study, metaheuristic optimization and feature selection techniques were applied to identify the most important input parameters for mapping debris flow susceptibility in the southern mountain area of Chengde City in Hebei Province, China, using machine learning algorithms. In total, 133 historical debris flow records and 16 related factors were selected. The support vector machine (SVM) was first used as the base classifier, and then a hybrid model was introduced through a two-step process. First, the particle swarm optimization (PSO) algorithm was employed to select the SVM model hyperparameters. Second, two feature selection algorithms, namely principal component analysis (PCA) and PSO, were integrated into the PSO-based SVM model, which generated the PCA-PSO-SVM and FS-PSO-SVM models, respectively. Three statistical metrics (accuracy, recall, and specificity) and the area under the receiver operating characteristic curve (AUC) were employed to evaluate and validate the performance of the models. The results indicated that the feature selection-based models exhibited the best performance, followed by the PSO-based SVM and SVM models. Moreover, the performance of the FS-PSO-SVM model was better than that of the PCA-PSO-SVM model, showing the highest AUC, accuracy, recall, and specificity values in both the training and testing processes. It was found that the selection of optimal features is crucial to improving the reliability of debris flow susceptibility assessment results. Moreover, the PSO algorithm was found to be not only an effective tool for hyperparameter optimization, but also a useful feature selection algorithm for improving the prediction accuracy of debris flow susceptibility with machine learning algorithms. The high and very high debris flow susceptibility zones cover approximately 38.01% of the study area, where debris flows may occur under intensive human activities and heavy rainfall events.
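The PSO hyperparameter-selection step can be sketched with a minimal particle swarm minimizing a validation-loss function. The quadratic toy objective used in the test stands in for SVM cross-validation error, and the coefficients (w, c1, c2) are conventional defaults rather than the study's settings.

```python
import random

def pso(f, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over box bounds; returns the best
    position found. Here f would be a validation-loss evaluator."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    gbest = pbest[min(range(n_particles), key=pbest_f.__getitem__)][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp to the search box after the velocity update
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < f(gbest):
                    gbest = pos[i][:]
    return gbest
```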
Speech emotion recognition (SER) uses acoustic analysis to find features for emotion recognition and examines variations in voice that are caused by emotions. The number of features acquired with acoustic analysis is extremely high, so we introduce a hybrid filter-wrapper feature selection algorithm based on an improved equilibrium optimizer for constructing an emotion recognition system. The proposed algorithm implements multi-objective emotion recognition with the minimum number of selected features and maximum accuracy. First, we use information gain and the Fisher Score to sort the features extracted from the signals. Then, we employ a multi-objective ranking method to evaluate these features and assign different importance to them. Features with high rankings have a large probability of being selected. Finally, we propose a repair strategy to address the problem of duplicate solutions in multi-objective feature selection, which can improve the diversity of solutions and avoid falling into local traps. Using random forest and K-nearest neighbor classifiers, four English speech emotion datasets are employed to test the proposed algorithm (MBEO) as well as other multi-objective emotion identification techniques. The results illustrate that it performs well in inverted generational distance, hypervolume, Pareto solutions, and execution time, and that MBEO is appropriate for high-dimensional English SER.
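The Fisher Score used in the filter stage above has a compact definition: a feature's between-class scatter divided by its within-class scatter, with larger values indicating a more discriminative feature. A minimal sketch:

```python
def fisher_score(feature, labels):
    """Fisher Score of a single feature: sum over classes of
    n_c * (class mean - overall mean)^2, divided by the total
    within-class squared deviation."""
    mean_all = sum(feature) / len(feature)
    num = den = 0.0
    for c in set(labels):
        vals = [f for f, l in zip(feature, labels) if l == c]
        mc = sum(vals) / len(vals)
        num += len(vals) * (mc - mean_all) ** 2
        den += sum((v - mc) ** 2 for v in vals)
    return num / den if den else float("inf")
```

Scoring each extracted acoustic feature this way yields the ranking that the wrapper stage then refines.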
Millimeter-wave transmission combined with Orbital Angular Momentum (OAM) has the advantages of reducing the loss of beam power and increasing system capacity. However, to realize these advantages, the antennas at the transmitter and receiver must be parallel and coaxial; otherwise, the accuracy of mode detection at the receiver can be seriously affected. In this paper, we design an OAM millimeter-wave communication system that overcomes this limitation. Specifically, the first contribution is that the power distribution between different OAM modes and the capacity of the system with different mode sets are analytically derived for performance analysis. The second contribution is a novel mode selection scheme proposed to reduce the total interference between different modes. Numerical results show that system performance is less affected by the offset when a mode set with smaller modes or larger intervals is selected.
Amid the landscape of Cloud Computing (CC), the Cloud Datacenter (DC) stands as a conglomerate of physical servers whose performance can be hindered by bottlenecks within the realm of proliferating CC services. A linchpin in CC's performance, the Cloud Service Broker (CSB), orchestrates DC selection. Failure to adroitly route user requests to suitable DCs transforms the CSB into a bottleneck, endangering service quality. To tackle this, deploying an efficient CSB policy becomes imperative, optimizing DC selection to meet stringent Quality-of-Service (QoS) demands. Amidst numerous CSB policies, their implementation grapples with challenges such as cost and availability. This article undertakes a holistic review of diverse CSB policies, concurrently surveying the predicaments confronted by current policies. The foremost objective is to pinpoint research gaps and remedies to invigorate future policy development. Additionally, it extensively clarifies the various DC selection methodologies employed in CC, enriching practitioners and researchers alike. Employing synthetic analysis, the article systematically assesses and compares myriad DC selection techniques. These analytical insights equip decision-makers with a pragmatic framework to discern the apt technique for their needs. In summation, this discourse underscores the paramount importance of adept CSB policies in DC selection and the imperative role of efficient CSB policies in optimizing CC performance. By emphasizing the significance of these policies and their modeling implications, the article contributes to both the general modeling discourse and its practical applications in the CC domain.
Ground hydraulic fracturing plays a crucial role in controlling the far-field hard roof, making it imperative to identify the most suitable target stratum for effective control. Physical experiments were conducted based on engineering properties to simulate the gradual collapse of the roof during longwall top coal caving (LTCC). A numerical model was established using the material point method (MPM) and a strain-softening damage constitutive model according to the structure of the physical model. Numerical simulations were conducted to analyze the LTCC process under ground hydraulic fracturing of different hard roofs. The results show that ground hydraulic fracturing releases the energy and stress of the target stratum, resulting in a substantial lag in the fracturing of the overburden before collapse occurs in the hydraulically fractured stratum. Ground hydraulic fracturing of a low hard roof reduces the lag effect of hydraulic fractures, dissipates the energy consumed by the fracture of the hard roof, and reduces the abutment stress. Therefore, it is advisable to prioritize the selection of the lower hard roof as the target stratum.
Grain protein content (GPC) is the key parameter for wheat grain nutritional quality. This study conducted a resampling GWAS analysis using 406 wheat accessions across eight environments and identified four previously reported GPC QTLs. An analysis of 87 landraces and 259 modern cultivars revealed the loss of superior GPC haplotypes, especially in Chinese cultivars. These haplotypes were preferentially adopted in different agroecological zones and had broad effects on wheat yield and agronomic traits. Most GPC QTLs did not significantly reduce yield, suggesting that high GPC can be achieved without a yield penalty. The results of this study provide a reference for future GPC breeding in wheat using the four identified QTLs.
The diversity of data sources has created a need for effective data manipulation and dissemination. The challenge arising from increasing dimensionality has a negative effect on computational performance, efficiency, and stability. One of the most successful optimization algorithms is Particle Swarm Optimization (PSO), which has proved its effectiveness in exploring the most influential features in the search space, based on its fast convergence and its ability to use a small set of parameters in the search task. This research proposes an effective enhancement of PSO that tackles the challenge of search randomness, which directly improves PSO performance. In addition, this research proposes a generic intelligent framework for the early prediction of order delays and the elimination of order backlogs, which can be considered an efficient potential solution for raising supply chain performance. The proposed adapted algorithm was applied to a supply chain dataset and reduced the feature set from twenty-one features to ten significant features. To confirm the proposed algorithm's results, the reduced data were examined with eight well-known classification algorithms, which reached a minimum accuracy of 94.3% for random forest and a maximum of 99.0% for Naïve Bayes. Moreover, the proposed algorithm adaptation was compared with other PSO adaptations proposed in the literature over different datasets. The proposed PSO adaptation reached higher accuracies, ranging from 97.8% to 99.36%, which further demonstrates the contribution of the current research.
Feature Selection (FS) is a key pre-processing step in pattern recognition and data mining tasks, which can effectively avoid the impact of irrelevant and redundant features on the performance of classification models. In recent years, meta-heuristic algorithms have been widely used in FS problems, so a Hybrid Binary Chaotic Salp Swarm Dung Beetle Optimization (HBCSSDBO) algorithm is proposed in this paper to improve the effect of FS. In this hybrid algorithm, the original continuous optimization algorithm is converted into binary form by the S-type transfer function and applied to the FS problem. Using the K-nearest neighbor (KNN) classifier, comparative experiments for FS are carried out between the proposed method and four advanced meta-heuristic algorithms on 16 UCI (University of California, Irvine) datasets. Seven evaluation metrics, such as average fitness, average prediction accuracy, and average running time, are chosen to judge and compare the algorithms. The selected datasets are also discussed by categorizing them into three groups: high, medium, and low dimensionality. Experimental results show that the HBCSSDBO feature selection method is able to obtain a good subset of features while maintaining high classification accuracy, and shows better optimization performance. In addition, the results of statistical tests confirm the significant validity of the method.
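An S-type transfer function of the kind mentioned above maps a continuous search-agent position to a bit-selection probability. The standard sigmoid is assumed here since the abstract does not specify which member of the S-shaped family HBCSSDBO uses.

```python
import math
import random

def s_transfer(x):
    """S-shaped (sigmoid) transfer function: continuous position -> probability
    that the corresponding feature bit is set to 1."""
    return 1.0 / (1.0 + math.exp(-x))

def binarize(position, rng=random):
    """Convert a continuous solution vector into a binary feature mask by
    sampling each bit with its transfer-function probability."""
    return [1 if rng.random() < s_transfer(x) else 0 for x in position]
```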
Federated learning enables data owners in the Internet of Things (IoT) to collaborate in training models without sharing private data, creating new business opportunities for building a data market. However, in practical operation, there are still some problems with federated learning applications. Blockchain has the characteristics of decentralization, distribution, and security. Blockchain-enabled federated learning further improves the security and performance of model training, while also expanding the application scope of federated learning. Blockchain also has natural financial attributes that help establish a federated learning data market. However, the data for federated learning tasks may be distributed across a large number of resource-constrained IoT devices, which have different computing, communication, and storage resources, and the data quality of each device may also vary. Therefore, how to effectively select the clients with the data required for a federated learning task is a research hotspot. In this paper, a two-stage client selection scheme for blockchain-enabled federated learning is proposed, which first selects clients that satisfy the federated learning task through attribute-based encryption, protecting the attribute privacy of clients. Then, blockchain nodes select some clients for local model aggregation using a proximal policy optimization algorithm. Experiments show that the model performance of our two-stage client selection scheme is higher than that of other client selection algorithms when some clients are offline and data quality is poor.
With the rapid development and application of energy harvesting technology, it has become a prominent research area due to its significant benefits in terms of green environmental protection, convenience, and high safety and efficiency. However, uneven energy collection and consumption among IoT devices at varying distances may lead to resource imbalance within energy harvesting networks, resulting in low energy transmission efficiency. To enhance the energy transmission efficiency of IoT devices in energy harvesting, this paper focuses on the use of collaborative communication, along with pricing-based incentive mechanisms and auction strategies. We propose a dynamic relay selection scheme, including a ladder pricing mechanism based on energy level and a Kuhn-Munkres algorithm based on auction theory with a negotiation mechanism, to encourage more IoT devices to participate in the collaboration process. Simulation results demonstrate that the proposed algorithm outperforms traditional algorithms in terms of improving the energy efficiency of the system.
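The two ingredients of the scheme, a ladder (tiered) price that rises as a relay's remaining energy falls, and an optimal one-to-one source-relay matching, can be caricatured as follows. The tier thresholds are invented for illustration, and a brute-force permutation search stands in for the Kuhn-Munkres algorithm (viable only for tiny instances; Kuhn-Munkres solves the same matching in polynomial time).

```python
from itertools import permutations

def ladder_price(energy_level, tiers=((0.3, 3.0), (0.6, 2.0), (1.0, 1.0))):
    """Tiered unit price: relays with less remaining energy (level in [0, 1])
    charge more per unit of forwarded energy. Thresholds are illustrative."""
    for threshold, price in tiers:
        if energy_level <= threshold:
            return price
    return tiers[-1][1]

def best_assignment(utility):
    """Utility-maximizing one-to-one matching of n sources to n relays,
    where utility[i][j] is the value of pairing source i with relay j."""
    n = len(utility)
    return max(permutations(range(n)),
               key=lambda perm: sum(utility[i][perm[i]] for i in range(n)))
```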
With the advancement of wireless network technology, vast amounts of traffic have been generated, and malicious traffic attacks that threaten the network environment are becoming increasingly sophisticated. While signature-based detection methods, static analysis, and dynamic analysis techniques have been previously explored for malicious traffic detection, they have limitations in identifying diversified malware traffic patterns. Recent research has focused on the application of machine learning to detect these patterns. However, applying machine learning to lightweight devices such as IoT devices is challenging because of the high computational demands and complexity involved in the learning process. In this study, we examined methods for effectively utilizing machine learning-based malicious traffic detection approaches on lightweight devices. We introduce the suboptimal feature selection model (SFSM), a feature selection technique designed to reduce complexity while maintaining the effectiveness of malicious traffic detection. Detection performance was evaluated on various traffic categories (benign, exploits, and generic) using the UNSW-NB15 dataset; SFSM sub-optimized the hyperparameters for feature selection and narrowed the search scope rather than encompassing all features. SFSM improved learning performance while minimizing complexity by treating feature selection and exhaustive search as two separate steps, a problem not considered in conventional models. Our experimental results showed that detection accuracy improved by approximately 20% compared to the random model, while the reduction in accuracy compared to the greedy model, which performs an exhaustive search over all features, was kept within 6%. Additionally, latency and complexity were reduced by approximately 96% and 99.78%, respectively, compared to the greedy model. This study demonstrates that malicious traffic can be effectively detected even in lightweight device environments. SFSM verified the possibility of detecting various attack traffic on lightweight devices.
Abstract: In the continuous development of the modern highway and bridge engineering industry, the reasonable selection and design of mega highway bridges is crucial. On this basis, this paper takes an actual bridge project as an example and analyses the overall selection design of such highway bridges, covering a basic overview of the project, the basic principles for selecting the structure of a mega highway bridge project, and the corresponding design strategies, in order to provide a scientific reference for such selection design.
Funding: Supported in part by the Natural Science Youth Foundation of Hebei Province under Grant F2019403207, in part by the PhD Research Startup Foundation of Hebei GEO University under Grant BQ2019055, in part by the Open Research Project of the Hubei Key Laboratory of Intelligent Geo-Information Processing under Grant KLIGIP-2021A06, in part by the Fundamental Research Funds for the Universities in Hebei Province under Grant QN202220, in part by the Science and Technology Research Project for Universities of Hebei under Grant ZD2020344, and in part by the Guangxi Natural Science Fund General Project under Grant 2021GXNSFAA075029.
Abstract: In classification problems, datasets often contain a large number of features, but not all of them are relevant for accurate classification; in fact, irrelevant features may even hinder classification accuracy. Feature selection aims to alleviate this issue by minimizing the number of features in the subset while simultaneously minimizing the classification error rate. Single-objective optimization approaches employ an evaluation function designed as an aggregate function with a parameter, but the results obtained depend on the value of that parameter. To eliminate the parameter's influence, the problem can be reformulated as a multi-objective optimization problem. The Whale Optimization Algorithm (WOA) is widely used in optimization problems because of its simplicity and easy implementation. In this paper, we propose a multi-strategy assisted multi-objective WOA (MSMOWOA) to address feature selection. To enhance the algorithm's search ability, we integrate multiple strategies into it, such as Lévy flight, the Grey Wolf Optimizer, and adaptive mutation. Additionally, we utilize an external repository to store the non-dominated solution set, and grid technology is used to maintain diversity. Results on fourteen University of California Irvine (UCI) datasets demonstrate that the proposed method effectively removes redundant features and improves classification performance. The source code can be accessed at: https://github.com/zc0315/MSMOWOA.
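The bi-objective formulation above (fewer features, lower error rate) rests on Pareto dominance, which also decides what enters the external repository. A minimal sketch of that filtering step, with hypothetical candidate scores, might look like this:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (both objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(solutions):
    """Keep only solutions not dominated by any other (the external repository)."""
    return [s for s in solutions if not any(dominates(t, s) for t in solutions)]

# Hypothetical scores: each candidate subset is rated by
# (number of selected features, classification error rate).
candidates = [(5, 0.12), (3, 0.15), (8, 0.10), (5, 0.20), (3, 0.15)]
front = non_dominated(candidates)  # (5, 0.20) is dominated by (5, 0.12)
```

Equal candidates never dominate each other, so duplicates survive this filter; MSMOWOA's grid technique would then handle diversity within the repository.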
Funding: The Deputyship for Research and Innovation, "Ministry of Education" in Saudi Arabia, for funding this research (IFKSUOR3-014-3).
Abstract: In this study, our aim is to address the problem of gene selection by proposing a hybrid bio-inspired evolutionary algorithm that combines Grey Wolf Optimization (GWO) with Harris Hawks Optimization (HHO) for feature selection. The motivation for utilizing GWO and HHO stems from their bio-inspired nature and their demonstrated success in optimization problems. We aim to leverage the strengths of these algorithms to enhance the effectiveness of feature selection in microarray-based cancer classification. We selected leave-one-out cross-validation (LOOCV) to evaluate the performance of two widely used classifiers, k-nearest neighbors (KNN) and support vector machine (SVM), on high-dimensional cancer microarray data. The proposed method is extensively tested on six publicly available cancer microarray datasets, and a comprehensive comparison with recently published methods is conducted. Our hybrid algorithm demonstrates its effectiveness in improving classification performance, surpassing alternative approaches in terms of precision. The outcomes confirm the capability of our method to substantially improve both the precision and efficiency of cancer classification, thereby advancing the development of more efficient treatment strategies. The proposed hybrid method offers a promising solution to the gene selection problem in microarray-based cancer classification. It improves the accuracy and efficiency of cancer diagnosis and treatment, and its superior performance compared to other methods highlights its potential applicability in real-world cancer classification tasks. By harnessing the complementary search mechanisms of GWO and HHO, we leverage their bio-inspired behavior to identify informative genes relevant to cancer diagnosis and treatment.
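LOOCV, as used above, trains on all samples but one and tests on the held-out sample, repeating over the whole dataset; it suits microarray studies because sample counts are tiny. A self-contained sketch with a 1-nearest-neighbour classifier on toy data (the study itself uses KNN and SVM on real microarray data):

```python
def one_nn_predict(train, query):
    """Return the label of the training point closest to query (squared Euclidean)."""
    return min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))[1]

def loocv_accuracy(data):
    """Leave-one-out: each sample is classified by a model fit on the rest."""
    hits = 0
    for i, (x, y) in enumerate(data):
        rest = data[:i] + data[i + 1:]
        hits += one_nn_predict(rest, x) == y
    return hits / len(data)

# Toy 1-D dataset with two well-separated classes.
toy = [((0.0,), 'A'), ((0.1,), 'A'), ((1.0,), 'B'), ((1.1,), 'B')]
acc = loocv_accuracy(toy)  # every held-out point is nearest to its own class
```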
Funding: Supported by the earmarked fund for the China Agriculture Research System (CARS-35) and the National Natural Science Foundation of China (32022078), and supported by the National Supercomputer Centre in Guangzhou.
Abstract: Genomic selection (GS) has been widely used in livestock, where it has greatly accelerated the genetic progress of complex traits. Population size is one of the significant factors affecting prediction accuracy, but it is limited by the purebred population. Compared with directly combining two uncorrelated purebred populations to extend the reference population, it may be more meaningful to incorporate correlated crossbreds into the reference population for genomic prediction. In this study, we simulated purebred offspring (PAS and PBS) and crossbred offspring (CAB) based on real genotype data of two base purebred populations (PA and PB) to evaluate the performance of genomic selection on purebreds while incorporating crossbred information. The results showed that selecting key crossbred individuals by maximizing the expected genetic relationship (REL) was better than the other methods (individuals closest to or farthest from the purebred population, CP/FP) in terms of prediction accuracy. Furthermore, the prediction accuracy of a reference population combining PA and CAB was significantly better than that based on PA only, and similar to that of combining PA and PAS. Moreover, the rank correlation between the multiple of the increased relationship (MIR) and the reliability improvement was 0.60-0.70, but for individuals with low correlation (Cor(Pi, PA or B)), the reliability improvement was significantly lower than for other individuals. Our findings suggest that incorporating crossbreds into the purebred reference population can improve genomic prediction compared with using the purebred population only. The genetic relationship between the purebred and crossbred populations is a key factor determining the reliability gain when incorporating a crossbred population into genomic prediction for purebred individuals.
Funding: Supported by the Key-Area Research and Development Program of Guangdong Province under Grant No. 2020B0101090004, the National Natural Science Foundation of China under Grant No. 62072215, the Guangzhou Basic Research Plan City-School Joint Funding Project under Grant No. 2024A03J0405, the Guangzhou Basic and Applied Basic Research Foundation under Grant No. 2024A04J3458, and the State Archives Administration Science and Technology Program Plan of China under Grant 2023-X-028.
Abstract: Federated learning is an important distributed model training technique in the Internet of Things (IoT), in which participant selection is a key component for improving training efficiency and model accuracy. This module enables a central server to select a subset of participants to perform model training based on data and device information. By doing so, selected participants are rewarded and actively perform model training, while participants that are detrimental to training efficiency and model accuracy are excluded. However, in practice, participants may suspect that the central server has miscalculated and thus not made the selection honestly. This lack of trustworthiness, which can demotivate participants, has received little attention, as has the leakage of participants' private information during the selection process. We therefore propose a federated learning framework with auditable participant selection. It supports smart contracts in selecting a set of suitable participants based on their training loss without compromising privacy. Considering the possibility of malicious campaigning and impersonation of participants, the framework employs commitment schemes and zero-knowledge proofs to counteract these malicious behaviors. Finally, we analyze the security of the framework and conduct a series of experiments to demonstrate that it can effectively improve the efficiency of federated learning.
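The commitment schemes mentioned above let a participant bind itself to a reported value (for example, a training loss) before selection and open it afterwards for auditing. The abstract does not specify the construction; a simple hash-based commitment is one common instantiation, sketched here purely for illustration:

```python
import hashlib
import secrets

def commit(value: bytes):
    """Commit to a value; the nonce keeps the commitment hiding until opened."""
    nonce = secrets.token_bytes(16)
    return hashlib.sha256(nonce + value).hexdigest(), nonce

def verify(commitment: str, value: bytes, nonce: bytes) -> bool:
    """Check that (value, nonce) opens the earlier commitment (binding)."""
    return hashlib.sha256(nonce + value).hexdigest() == commitment

# A participant commits to its reported loss before selection...
c, n = commit(b"loss=0.37")
# ...and later opens it; a different value will not verify.
ok = verify(c, b"loss=0.37", n)
bad = verify(c, b"loss=0.10", n)
```

Real deployments would likely pair such commitments with zero-knowledge proofs, as the paper describes, rather than revealing the raw loss.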
Funding: Supported by the Strategic Priority Research Program of the Chinese Academy of Sciences (No. XDB 42010203) and the National Natural Science Foundation of China (No. 42176090).
Abstract: Scallop culture is an important form of bottom-seeding marine ranching, which is of great significance for improving the current situation of fishery resources. However, there are problems in the site-selection evaluation of marine ranching, such as an imperfect criteria system, complex structure, and untargeted criteria quantification. In addition, no site-selection evaluation method for bottom-seeded scallop culture areas is available. Therefore, we established a hierarchy structure model according to analytic hierarchy process (AHP) theory, in which the social, physical, chemical, and biological environments are used as main criteria, and marine functional zonation, water depth, current, water temperature, salinity, substrate type, water quality, sediment quality, red tide, phytoplankton, and zooplankton are used as sub-criteria, on which a multi-parameter evaluation system is built. The dualism method, assignment method, and membership function method were used to quantify the sub-criteria, and a quantitative evaluation of the entire criteria set was added, including the evaluation and analysis of two types of unsuitable environmental situations. By overall consideration of scallop yield, quality, and marine ranching construction objectives, the weights of the main criteria could be determined. Five suitability grades corresponding to the evaluation result were defined, and the Python language was used to create an evaluation system for efficient calculation and intuitive presentation of the evaluation outcome. Eight marine cases were simulated based on existing survey data, and the results prove that the method is feasible for evaluating and analyzing the site selection of bottom-seeded scallop culture areas under various environmental situations. The proposed evaluation method can be extended to site selection for bottom-seeding marine ranching in general, and this study provides theoretical and methodological references for the site-selection evaluation of other types of marine ranching.
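In AHP, the criterion weights come from a reciprocal pairwise-comparison matrix; the row geometric-mean method is a standard approximation of the principal-eigenvector weights. A minimal sketch with a hypothetical 3x3 comparison of main criteria (the matrix values are illustrative, not taken from the study):

```python
import math

def ahp_weights(M):
    """Approximate AHP priority weights via the row geometric-mean method."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]  # geometric mean of each row
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical reciprocal comparison matrix: criterion 1 is moderately
# preferred over 2 and strongly preferred over 3.
M = [[1,    3,    5],
     [1/3,  1,    3],
     [1/5,  1/3,  1]]
w = ahp_weights(M)  # weights sum to 1, ordered by preference
```

A full AHP workflow would also compute the consistency ratio of M before trusting the weights.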
Funding: Supported by the Guangdong Province Basic and Applied Basic Research Fund Project (No. 2020A1515110826), the National Natural Science Foundation of China (No. 42006115), and the Major Scientific and Technological Projects of Hainan Province (No. ZDKJ2021036).
Abstract: Manganese superoxide dismutase (MnSOD) is an antioxidant that exists in mitochondria and can effectively remove superoxide anions there. In the dark, high-pressure, low-temperature deep-sea environment, MnSOD is essential for the survival of sea cucumbers. Six MnSODs were identified from the transcriptomes of deep-sea and shallow-sea sea cucumbers. To explore their environmental adaptation mechanism, we conducted environmental selection pressure analysis with the branch-site model of the PAML software. We obtained nine positive selection sites, two of which were significant (97F→H, 134K→V): 97F→H is located in a highly conserved characteristic sequence, and its polarity change might have a great impact on the function of MnSOD; 134K→V showed a change in piezophilic ability, which might help MnSOD adapt to the high hydrostatic pressure of the deep sea. To further study the effect of these two positive selection sites on MnSOD, we modeled the point mutations F97H and K134V in shallow-sea sea cucumber using MAESTROweb and PyMOL. The results show that 97F→H and 134K→V might improve MnSOD's efficiency in scavenging superoxide anions and its resistance to high hydrostatic pressure by moderately reducing its stability. These results indicate that the MnSODs of deep-sea sea cucumbers adapted to deep-sea environments through amino acid changes affecting polarity, piezophilic behavior, and local stability. This study reveals the correlation between MnSOD and extreme environments and will help improve our understanding of organisms' adaptation mechanisms in the deep sea.
基金supported by the National Natural Science Foundation of China(Grant Nos.12272257,12102292,12032006)the special fund for Science and Technology Innovation Teams of Shanxi Province(Nos.202204051002006).
Abstract: This study employs a data-driven methodology that embeds the principle of dimensional invariance into an artificial neural network to automatically identify, from experimental measurements, the dominant dimensionless quantities governing the penetration of rod projectiles into semi-infinite metal targets. The derived mathematical expressions of the dimensionless quantities are simplified by examining the exponent matrix and the coupling relationships between feature variables. As a physics-based dimension-reduction methodology, this approach reduces high-dimensional parameter spaces to descriptions involving only a few physically interpretable dimensionless quantities for penetration cases. The relative importance of the dimensionless feature variables to penetration efficiency under four impact conditions is then evaluated through feature selection. The results indicate that the critical dimensionless feature variables selected by this synergistic method, without recourse to complex theoretical equations or detailed knowledge of penetration mechanics, are in accordance with those reported in the reference. Finally, the determined dimensionless quantities can be efficiently applied in semi-empirical analysis of specific penetration cases, and the reliability of the regression functions is validated.
Funding: Supported by the National Natural Science Foundation of China (62371098), the Natural Science Foundation of Sichuan Province (2023NSFSC1422), the National Key Research and Development Program of China (2021YFB2900404), and the Fundamental Research Funds for the Central Universities of Southwest Minzu University (ZYN2022032).
Abstract: In recent years, deep learning-based signal recognition technology has gained attention and emerged as an important approach for safeguarding the electromagnetic environment. However, training deep learning-based classifiers on large signal datasets with redundant samples requires significant memory and high costs. This paper proposes a support data-based core-set selection method (SD) for signal recognition, aiming to screen a representative subset that approximates the large signal dataset. Specifically, this subset can be identified by using the label information from the early stages of model training, as some training samples are frequently labeled as supporting data. This support data is crucial for model training and can be found using a border sample selector. Simulation results demonstrate that the SD method minimizes the impact on model recognition performance while reducing the dataset size, and it outperforms five other state-of-the-art core-set selection methods when the fraction of training samples kept is at most 0.3 on the RML2016.04C dataset or 0.5 on the RML22 dataset. The SD method is particularly helpful for signal recognition tasks with limited memory and computing resources.
Funding: Supported by the Second Tibetan Plateau Scientific Expedition and Research Program (Grant No. 2019QZKK0904), the Natural Science Foundation of Hebei Province (Grant No. D2022403032), and the S&T Program of Hebei (Grant No. E2021403001).
Abstract: The selection of important factors in machine learning-based susceptibility assessments is crucial for obtaining reliable susceptibility results. In this study, metaheuristic optimization and feature selection techniques were applied to identify the most important input parameters for mapping debris flow susceptibility in the southern mountain area of Chengde City, Hebei Province, China, using machine learning algorithms. In total, 133 historical debris flow records and 16 related factors were selected. The support vector machine (SVM) was first used as the base classifier, and then a hybrid model was introduced through a two-step process. First, the particle swarm optimization (PSO) algorithm was employed to select the SVM model hyperparameters. Second, two feature selection algorithms, namely principal component analysis (PCA) and PSO, were integrated into the PSO-based SVM model, generating the PCA-PSO-SVM and FS-PSO-SVM models, respectively. Three statistical metrics (accuracy, recall, and specificity) and the area under the receiver operating characteristic curve (AUC) were employed to evaluate and validate the performance of the models. The results indicated that the feature selection-based models exhibited the best performance, followed by the PSO-based SVM and SVM models. Moreover, the FS-PSO-SVM model outperformed the PCA-PSO-SVM model, showing the highest AUC, accuracy, recall, and specificity values in both the training and testing processes. It was found that selecting optimal features is crucial to improving the reliability of debris flow susceptibility assessment results, and that the PSO algorithm is not only an effective tool for hyperparameter optimization but also a useful feature selection algorithm for improving the prediction accuracy of debris flow susceptibility with machine learning algorithms. The high and very high debris flow susceptibility zones cover approximately 38.01% of the study area, where debris flow may occur under intensive human activities and heavy rainfall events.
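The PSO step above searches for SVM hyperparameters by minimizing a validation-error objective. A minimal, self-contained PSO sketch follows; the quadratic stand-in objective and the search bounds are illustrative assumptions, not the study's actual cross-validation setup:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def pso_minimize(f, bounds, n_particles=12, iters=60, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over a box-bounded search space."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # clamp the new position to the search bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Stand-in objective: pretend cross-validation error minimized at C=1.0, gamma=0.1.
err = lambda p: (p[0] - 1.0) ** 2 + (p[1] - 0.1) ** 2
best, best_err = pso_minimize(err, [(0.01, 10.0), (0.001, 1.0)])
```

In the study itself, `f` would wrap SVM training plus validation scoring rather than a closed-form function.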
Abstract: Speech emotion recognition (SER) uses acoustic analysis to find features for emotion recognition and examines the variations in voice that are caused by emotions. The number of features acquired through acoustic analysis is extremely high, so we introduce a hybrid filter-wrapper feature selection algorithm based on an improved equilibrium optimizer for constructing an emotion recognition system. The proposed algorithm performs multi-objective emotion recognition with the minimum number of selected features and maximum accuracy. First, we use information gain and the Fisher score to sort the features extracted from the signals. Then, we employ a multi-objective ranking method to evaluate these features and assign different importance to them; features with high rankings have a large probability of being selected. Finally, we propose a repair strategy to address the problem of duplicate solutions in multi-objective feature selection, which can improve the diversity of solutions and avoid falling into local traps. Using random forest and K-nearest neighbor classifiers, four English speech emotion datasets are employed to test the proposed algorithm (MBEO) as well as other multi-objective emotion identification techniques. The results illustrate that it performs well in inverted generational distance, hypervolume, Pareto solutions, and execution time, and that MBEO is appropriate for high-dimensional English SER.
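A repair strategy like the one described must make duplicated feature subsets distinct without discarding them, so the population keeps its size but regains diversity. The paper's exact rule is not given here; one simple, hypothetical realisation flips random bits until every subset in the population is unique:

```python
import random

random.seed(1)  # fixed seed for reproducibility of the sketch

def repair_duplicates(population, n_features):
    """Flip a random bit in any repeated feature subset until all are unique."""
    seen, repaired = set(), []
    for sol in population:
        sol = tuple(sol)
        while sol in seen:
            i = random.randrange(n_features)
            sol = sol[:i] + (1 - sol[i],) + sol[i + 1:]  # toggle one feature bit
        seen.add(sol)
        repaired.append(sol)
    return repaired

# Hypothetical population of 0/1 feature masks with one duplicate.
pop = [(1, 0, 1, 0), (1, 0, 1, 0), (0, 1, 1, 0)]
unique_pop = repair_duplicates(pop, 4)
```

A guided variant could bias the flipped bit toward highly ranked features instead of choosing uniformly at random.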
Funding: Supported in part by the National Natural Science Foundation of China (62071255, 62171232, 61771257); the Major Projects of the Natural Science Foundation of the Jiangsu Higher Education Institutions (20KJA510009); the Open Research Fund of the Key Lab of Broadband Wireless Communication and Sensor Network Technology (Nanjing University of Posts and Telecommunications), Ministry of Education (JZNY201914); the Open Research Fund of the National and Local Joint Engineering Laboratory of RF Integration and Micro-Assembly Technology, Nanjing University of Posts and Telecommunications (KFJJ20170305); the Research Fund of Nanjing University of Posts and Telecommunications (NY218012); and the Henan Province Science and Technology Research Projects, High and New Technology (No. 182102210106).
Abstract: Millimeter-wave transmission combined with Orbital Angular Momentum (OAM) has the advantages of reducing the loss of beam power and increasing system capacity. However, to realize these advantages, the antennas at the transmitter and receiver must be parallel and coaxial; otherwise, the accuracy of mode detection at the receiver can be seriously degraded. In this paper, we design an OAM millimeter-wave communication system that overcomes this limitation. Specifically, the first contribution is that the power distribution between different OAM modes and the capacity of the system with different mode sets are analytically derived for performance analysis. The second contribution is a novel mode selection scheme proposed to reduce the total interference between different modes. Numerical results show that system performance is less affected by the offset when a mode set with smaller modes or larger intervals is selected.
Abstract: In the landscape of Cloud Computing (CC), the Cloud Datacenter (DC) is a conglomerate of physical servers whose performance can be hindered by bottlenecks as CC services proliferate. The Cloud Service Broker (CSB), a linchpin of CC performance, orchestrates DC selection; failure to route user requests to suitable DCs turns the CSB itself into a bottleneck and endangers service quality. To tackle this, deploying an efficient CSB policy that optimizes DC selection to meet stringent Quality-of-Service (QoS) demands becomes imperative. Among the many CSB policies, implementation grapples with challenges such as cost and availability. This article undertakes a holistic review of diverse CSB policies and surveys the predicaments confronting current policies. The foremost objective is to pinpoint research gaps and remedies to invigorate future policy development. Additionally, it clarifies the various DC selection methodologies employed in CC, benefiting practitioners and researchers alike. Employing synthetic analysis, the article systematically assesses and compares numerous DC selection techniques; these analytical insights equip decision-makers with a pragmatic framework for discerning the technique apt for their needs. In summation, this discourse underscores the paramount importance of adept CSB policies in DC selection and the imperative role of efficient CSB policies in optimizing CC performance. By emphasizing the significance of these policies and their modeling implications, the article contributes both to the general modeling discourse and to its practical applications in the CC domain.
Funding: The National Natural Science Foundation of China (No. 51974042) and the National Key Research and Development Program of China (No. 2023YFC3009005).
Abstract: Ground hydraulic fracturing plays a crucial role in controlling a far-field hard roof, making it imperative to identify the most suitable target stratum for effective control. Physical experiments based on engineering properties are conducted to simulate the gradual collapse of the roof during longwall top coal caving (LTCC). A numerical model is established using the material point method (MPM) and a strain-softening damage constitutive model according to the structure of the physical model. Numerical simulations then analyze the LTCC process when different hard roofs are chosen for ground hydraulic fracturing. The results show that ground hydraulic fracturing releases the energy and stress of the target stratum, so that fracturing of the overburden lags substantially before collapse occurs in the hydraulically fractured stratum. Ground hydraulic fracturing of a low hard roof reduces the lag effect of hydraulic fractures, dissipates the energy consumed by the fracture of the hard roof, and reduces the abutment stress. Therefore, it is advisable to prioritize the lower hard roof as the target stratum.
基金supported by the“Integration of Two Chains”Key Research and Development Projects of Shaanxi Province“Wheat Seed Industry Innovation Project”,Chinathe Key R&D of Yangling Seed Industry Innovation Center,China(Ylzy-xm-01)。
Abstract: The grain protein content (GPC) is the key parameter for wheat grain nutritional quality. This study conducted a resampling GWAS analysis using 406 wheat accessions across eight environments and identified four previously reported GPC QTLs. An analysis of 87 landraces and 259 modern cultivars revealed the loss of superior GPC haplotypes, especially in Chinese cultivars. These haplotypes were preferentially adopted in different agroecological zones and had broad effects on wheat yield and agronomic traits. Most GPC QTLs did not significantly reduce yield, suggesting that high GPC can be achieved without a yield penalty. The results of this study provide a reference for future GPC breeding in wheat using the four identified QTLs.
Funding: Funded by the University of Jeddah, Jeddah, Saudi Arabia, under Grant No. UJ-23-DR-26.
Abstract: The diversity of data sources has created a need for effective data manipulation and dissemination, and the resulting growth in dimensionality negatively affects computational performance, efficiency, and stability. One of the most successful optimization algorithms is Particle Swarm Optimization (PSO), which has proved effective at exploring the most influential features in the search space thanks to its fast convergence and its ability to work with a small set of parameters. This research proposes an effective enhancement of PSO that tackles the challenge of search randomness, directly improving PSO performance. It also proposes a generic intelligent framework for the early prediction of order delays and the elimination of order backlogs, a potential solution for raising supply chain performance. The adapted algorithm was applied to a supply chain dataset, reducing the feature set from twenty-one features to ten significant ones. To confirm the results, the updated data were examined with eight well-known classification algorithms, which reached a minimum accuracy of 94.3% for random forest and a maximum of 99.0% for Naïve Bayes. Moreover, the proposed PSO adaptation was compared with other PSO adaptations from the literature over different datasets and reached higher accuracies, ranging from 97.8% to 99.36%, further demonstrating the advancement of the current research.
Funding: This research was funded by the Fundamental Scientific Research Project of the Liaoning Provincial Department of Education, "Short-Term Electrical Load Forecasting Based on Feature Selection and Optimized LSTM with DBO" (JYTMS20230189), and the Research Support Plan for Introducing High-Level Talents to Shenyang Ligong University, "Application of Hybrid Grey Wolf Algorithm in Job Shop Scheduling Problem" (No. 1010147001131).
Abstract: Feature selection (FS) is a key pre-processing step in pattern recognition and data mining tasks that can effectively avoid the impact of irrelevant and redundant features on the performance of classification models. In recent years, meta-heuristic algorithms have been widely used for FS problems, so a Hybrid Binary Chaotic Salp Swarm Dung Beetle Optimization (HBCSSDBO) algorithm is proposed in this paper to improve FS. In this hybrid algorithm, the original continuous optimization algorithm is converted into binary form by an S-shaped transfer function and applied to the FS problem. Using the K-nearest neighbor (KNN) classifier, comparative FS experiments are carried out between the proposed method and four advanced meta-heuristic algorithms on 16 UCI (University of California, Irvine) datasets. Seven evaluation metrics, including average fitness, average prediction accuracy, and average running time, are chosen to judge and compare the algorithms. The selected datasets are also discussed by categorizing them into three groups: high, medium, and low dimensionality. Experimental results show that the HBCSSDBO feature selection method can obtain a good subset of features while maintaining high classification accuracy, showing better optimization performance. In addition, the results of statistical tests confirm the significant validity of the method.
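The S-shaped transfer function mentioned above maps each continuous position component to a probability in [0, 1], which then decides whether the corresponding feature bit is set. A minimal sketch (the sigmoid form and the threshold-by-random-draw rule are the common convention, assumed here):

```python
import math
import random

random.seed(2)  # fixed seed for reproducibility of the sketch

def s_transfer(x: float) -> float:
    """S-shaped (sigmoid) transfer function mapping a position/velocity to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

def binarize(position):
    """Convert a continuous position vector into a 0/1 feature mask."""
    return [1 if random.random() < s_transfer(x) else 0 for x in position]

# Large negative components almost never select a feature; large positive
# components almost always do.
mask = binarize([-6.0, 6.0, 0.0])
```

V-shaped transfer functions are a common alternative when the algorithm should flip bits rather than re-sample them.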
Abstract: Federated learning enables data owners in the Internet of Things (IoT) to collaborate in training models without sharing private data, creating new business opportunities for building a data market. However, in practical operation, some problems remain in federated learning applications. Blockchain has the characteristics of decentralization, distribution, and security: blockchain-enabled federated learning further improves the security and performance of model training while expanding the application scope of federated learning, and blockchain's natural financial attributes help establish a federated learning data market. However, the data for federated learning tasks may be distributed across a large number of resource-constrained IoT devices with different computing, communication, and storage resources, and the data quality of each device may also vary. Therefore, how to effectively select the clients holding the data required for a federated learning task is a research hotspot. In this paper, a two-stage client selection scheme for blockchain-enabled federated learning is proposed: it first selects clients that satisfy the federated learning task through attribute-based encryption, protecting the attribute privacy of clients, and then blockchain nodes select some clients for local model aggregation via a proximal policy optimization algorithm. Experiments show that the model performance of the two-stage client selection scheme is higher than that of other client selection algorithms when some clients are offline and the data quality is poor.
Funding: Funded by the Researchers Supporting Project Number RSPD2024R681, King Saud University, Riyadh, Saudi Arabia.
Abstract: With its rapid development and application, energy harvesting technology has become a prominent research area due to its significant benefits in green environmental protection, convenience, safety, and efficiency. However, uneven energy collection and consumption among IoT devices at varying distances may lead to resource imbalance within energy harvesting networks, resulting in low energy transmission efficiency. To enhance the energy transmission efficiency of IoT devices in energy harvesting, this paper focuses on collaborative communication together with pricing-based incentive mechanisms and auction strategies. We propose a dynamic relay selection scheme, including a ladder pricing mechanism based on energy level and a Kuhn-Munkres algorithm based on auction theory with a negotiation mechanism, to encourage more IoT devices to participate in the collaboration process. Simulation results demonstrate that the proposed algorithm outperforms traditional algorithms in improving the energy efficiency of the system.
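The Kuhn-Munkres (Hungarian) algorithm solves the optimal one-to-one matching that underlies relay selection. For intuition, a brute-force sketch over a small, hypothetical utility matrix finds the same optimum that Kuhn-Munkres would compute in polynomial time; the utilities below are illustrative, not from the paper:

```python
from itertools import permutations

def best_assignment(weight):
    """Exhaustively match sources to relays, maximizing total utility.

    A stand-in for the Kuhn-Munkres algorithm; fine for small square matrices,
    since it tries all n! one-to-one assignments.
    """
    n = len(weight)
    best_total, best_perm = float('-inf'), None
    for perm in permutations(range(n)):
        total = sum(weight[i][perm[i]] for i in range(n))
        if total > best_total:
            best_total, best_perm = total, perm
    return best_perm, best_total

# Hypothetical utilities: rows = source devices, columns = candidate relays.
util = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]
match, total = best_assignment(util)  # match[i] is the relay assigned to source i
```

For realistic network sizes one would use a polynomial-time solver (e.g., a Hungarian-algorithm implementation) rather than enumeration.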
Funding: Supported by the Korea Institute for Advancement of Technology (KIAT) grant funded by the Korean Government (MOTIE) (P0008703, The Competency Development Program for Industry Specialists) and by MSIT under the ICAN (ICT Challenge and Advanced Network of HRD) Program (No. IITP-2022-RS-2022-00156310), supervised by the Institute of Information & Communication Technology Planning and Evaluation (IITP).
Abstract: With the advancement of wireless network technology, vast amounts of traffic have been generated, and malicious traffic attacks that threaten the network environment are becoming increasingly sophisticated. While signature-based detection methods, static analysis, and dynamic analysis techniques have previously been explored for malicious traffic detection, they have limitations in identifying diversified malware traffic patterns. Recent research has focused on applying machine learning to detect these patterns. However, applying machine learning to lightweight devices such as IoT devices is challenging because of the high computational demands and complexity involved in the learning process. In this study, we examined methods for effectively utilizing machine learning-based malicious traffic detection on lightweight devices. We introduce the suboptimal feature selection model (SFSM), a feature selection technique designed to reduce complexity while maintaining the effectiveness of malicious traffic detection. Detection performance was evaluated on various traffic categories (benign, exploits, and generic) using the UNSW-NB15 dataset; SFSM sub-optimized the hyperparameters for feature selection and narrowed a search scope that would otherwise encompass all features. SFSM improved learning performance while minimizing complexity by treating feature selection and exhaustive search as two separate steps, a problem not considered in conventional models. Our experimental results showed that detection accuracy was improved by approximately 20% compared to the random model, while the reduction in accuracy compared to the greedy model, which performs an exhaustive search on all features, was kept within 6%. Additionally, latency and complexity were reduced by approximately 96% and 99.78%, respectively, compared to the greedy model. This study demonstrates that malicious traffic can be effectively detected even in lightweight device environments, and SFSM verified the possibility of detecting various attack traffic on lightweight devices.