It is one of the topics that have been studied extensively on maximum power point tracking(MPPT)recently.Traditional or soft computing methods are used for MPPT.Since soft computing approaches are more effective than ...It is one of the topics that have been studied extensively on maximum power point tracking(MPPT)recently.Traditional or soft computing methods are used for MPPT.Since soft computing approaches are more effective than traditional approaches,studies on MPPT have shifted in this direction.This study aims comparison of performance of seven meta-heuristic training algorithms in the neuro-fuzzy training for MPPT.The meta-heuristic training algorithms used are particle swarm optimization(PSO),harmony search(HS),cuckoo search(CS),artificial bee colony(ABC)algorithm,bee algorithm(BA),differential evolution(DE)and flower pollination algorithm(FPA).The antecedent and conclusion parameters of neuro-fuzzy are determined by these algorithms.The data of a 250 W photovoltaic(PV)is used in the applications.For effective MPPT,different neuro-fuzzy structures,different membership functions and different control parameter values are evaluated in detail.Related training algorithms are compared in terms of solution quality and convergence speed.The strengths and weaknesses of these algorithms are revealed.It is seen that the type and number of membership function,colony size,number of generations affect the solution quality and convergence speed of the training algorithms.As a result,it has been observed that CS and ABC algorithm are more effective than other algorithms in terms of solution quality and convergence in solving the related problem.展开更多
Damage identification of the offshore floating wind turbine by vibration/dynamic signals is one of the important and new research fields in the Structural Health Monitoring(SHM). In this paper a new damage identific...Damage identification of the offshore floating wind turbine by vibration/dynamic signals is one of the important and new research fields in the Structural Health Monitoring(SHM). In this paper a new damage identification method is proposed based on meta-heuristic algorithms using the dynamic response of the TLP(Tension-Leg Platform) floating wind turbine structure. The Genetic Algorithms(GA), Artificial Immune System(AIS), Particle Swarm Optimization(PSO), and Artificial Bee Colony(ABC) are chosen for minimizing the object function, defined properly for damage identification purpose. In addition to studying the capability of mentioned algorithms in correctly identifying the damage, the effect of the response type on the results of identification is studied. Also, the results of proposed damage identification are investigated with considering possible uncertainties of the structure. Finally, for evaluating the proposed method in real condition, a 1/100 scaled experimental setup of TLP Floating Wind Turbine(TLPFWT) is provided in a laboratory scale and the proposed damage identification method is applied to the scaled turbine.展开更多
Different performance levels may be obtained for sideway collapse evaluation of steel moment frames depending on the evaluation procedure used to handle uncertainties. In this article, the process of representing mode...Different performance levels may be obtained for sideway collapse evaluation of steel moment frames depending on the evaluation procedure used to handle uncertainties. In this article, the process of representing modelling uncertainties, record to record (RTR) variations and cognitive uncertainties for moment resisting steel frames of various heights is discussed in detail. RTR uncertainty is used by incremental dynamic analysis (IDA), modelling uncertainties are considered through backbone curves and hysteresis loops of component, and cognitive uncertainty is presented in three levels of material quality. IDA is used to evaluate RTR uncertainty based on strong ground motion records selected by the k-means algorithm, which is favoured over Monte Carlo selection due to its time saving appeal. Analytical equations of the Response Surface Method are obtained through IDA results by the Cuckoo algorithm, which predicts the mean and standard deviation of the collapse fragility curve. The Takagi-Sugeno-Kang model is used to represent material quality based on the response surface coefficients. Finally, collapse fragility curves with the various sources of uncertainties mentioned are derived through a large number of material quality values and meta variables inferred by the Takagi-Sugeno-Kang fuzzy model based on response surface method coefficients. It is concluded that a better risk management strategy in countries where material quality control is weak, is to account for cognitive uncertainties in fragility curves and the mean annual frequency.展开更多
Small parasitic Hemipteran insects known as bedbugs(Cimicidae)feed on warm-blooded mammal’s blood.The most famous member of this family is the Cimex lectularius or common bedbug.The current paper proposes a novel swa...Small parasitic Hemipteran insects known as bedbugs(Cimicidae)feed on warm-blooded mammal’s blood.The most famous member of this family is the Cimex lectularius or common bedbug.The current paper proposes a novel swarm intelligence optimization algorithm called the Bedbug Meta-Heuristic Algorithm(BMHA).The primary inspiration for the bedbug algorithm comes from the static and dynamic swarming behaviors of bedbugs in nature.The two main stages of optimization algorithms,exploration,and exploitation,are designed by modeling bedbug social interaction to search for food.The proposed algorithm is benchmarked qualitatively and quantitatively using many test functions including CEC2019.The results of evaluating BMHA prove that this algorithm can improve the initial random population for a given optimization problem to converge towards global optimization and provide highly competitive results compared to other well-known optimization algorithms.The results also prove the new algorithm's performance in solving real optimization problems in unknown search spaces.To achieve this,the proposed algorithm has been used to select the features of fake news in a semi-supervised manner,the results of which show the good performance of the proposed algorithm in solving problems.展开更多
The flow shop scheduling problem is important for the manufacturing industry.Effective flow shop scheduling can bring great benefits to the industry.However,there are few types of research on Distributed Hybrid Flow S...The flow shop scheduling problem is important for the manufacturing industry.Effective flow shop scheduling can bring great benefits to the industry.However,there are few types of research on Distributed Hybrid Flow Shop Problems(DHFSP)by learning assisted meta-heuristics.This work addresses a DHFSP with minimizing the maximum completion time(Makespan).First,a mathematical model is developed for the concerned DHFSP.Second,four Q-learning-assisted meta-heuristics,e.g.,genetic algorithm(GA),artificial bee colony algorithm(ABC),particle swarm optimization(PSO),and differential evolution(DE),are proposed.According to the nature of DHFSP,six local search operations are designed for finding high-quality solutions in local space.Instead of randomselection,Q-learning assists meta-heuristics in choosing the appropriate local search operations during iterations.Finally,based on 60 cases,comprehensive numerical experiments are conducted to assess the effectiveness of the proposed algorithms.The experimental results and discussions prove that using Q-learning to select appropriate local search operations is more effective than the random strategy.To verify the competitiveness of the Q-learning assistedmeta-heuristics,they are compared with the improved iterated greedy algorithm(IIG),which is also for solving DHFSP.The Friedman test is executed on the results by five algorithms.It is concluded that the performance of four Q-learning-assisted meta-heuristics are better than IIG,and the Q-learning-assisted PSO shows the best competitiveness.展开更多
Blasting is well-known as an effective method for fragmenting or moving rock in open-pit mines.To evaluate the quality of blasting,the size of rock distribution is used as a critical criterion in blasting operations.A...Blasting is well-known as an effective method for fragmenting or moving rock in open-pit mines.To evaluate the quality of blasting,the size of rock distribution is used as a critical criterion in blasting operations.A high percentage of oversized rocks generated by blasting operations can lead to economic and environmental damage.Therefore,this study proposed four novel intelligent models to predict the size of rock distribution in mine blasting in order to optimize blasting parameters,as well as the efficiency of blasting operation in open mines.Accordingly,a nature-inspired algorithm(i.e.,firefly algorithm-FFA)and different machine learning algorithms(i.e.,gradient boosting machine(GBM),support vector machine(SVM),Gaussian process(GP),and artificial neural network(ANN))were combined for this aim,abbreviated as FFA-GBM,FFA-SVM,FFA-GP,and FFA-ANN,respectively.Subsequently,predicted results from the abovementioned models were compared with each other using three statistical indicators(e.g.,mean absolute error,root-mean-squared error,and correlation coefficient)and color intensity method.For developing and simulating the size of rock in blasting operations,136 blasting events with their images were collected and analyzed by the Split-Desktop software.In which,111 events were randomly selected for the development and optimization of the models.Subsequently,the remaining 25 blasting events were applied to confirm the accuracy of the proposed models.Herein,blast design parameters were regarded as input variables to predict the size of rock in blasting operations.Finally,the obtained results revealed that the FFA is a robust optimization algorithm for estimating rock fragmentation in bench blasting.Among the models developed in this study,FFA-GBM provided the highest accuracy in predicting the size of fragmented rocks.The other techniques(i.e.,FFA-SVM,FFA-GP,and FFA-ANN)yielded lower computational stability and efficiency.Hence,the FFA-GBM model can be used as a powerful and precise soft computing tool that can be applied to practical engineering cases aiming to improve the quality of blasting and rock fragmentation.展开更多
Optimized road maintenance planning seeks for solutions that can minimize the life-cycle cost of a road network and concurrently maximize pavement condition. Aiming at pro- posing an optimal set of road maintenance so...Optimized road maintenance planning seeks for solutions that can minimize the life-cycle cost of a road network and concurrently maximize pavement condition. Aiming at pro- posing an optimal set of road maintenance solutions, robust meta-heuristic algorithms are used in research. Two main optimization techniques are applied including single-objective and multi-objective optimization. Genetic algorithms (GA), particle swarm optimization (PSO), and combination of genetic algorithm and particle swarm optimization (GAPSO) as single-objective techniques are used, while the non-domination sorting genetic algorithm II (NSGAII) and multi-objective particle swarm optimization (MOPSO) which are sufficient for solving computationally complex large-size optimization problems as multi-objective techniques are applied and compared. A real case study from the rural transportation network of Iran is employed to illustrate the sufficiency of the optimum algorithm. The formulation of the optimization model is carried out in such a way that a cost-effective maintenance strategy is reached by preserving the performance level of the road network at a desirable level. So, the objective functions are pavement performance maximization and maintenance cost minimization. It is concluded that multi-objective algorithms including non-domination sorting genetic algorithm II (NSGAII) and multi-objective particle swarm optimization performed better than the single objective algorithms due to the capability to balance between both objectives. And between multi-objective algorithms the NSGAII provides the optimum solution for the road maintenance planning.展开更多
The non-invasive evaluation of the heart through EectroCardioG-raphy(ECG)has played a key role in detecting heart disease.The analysis of ECG signals requires years of learning and experience to interpret and extract ...The non-invasive evaluation of the heart through EectroCardioG-raphy(ECG)has played a key role in detecting heart disease.The analysis of ECG signals requires years of learning and experience to interpret and extract useful information from them.Thus,a computerized system is needed to classify ECG signals with more accurate results effectively.Abnormal heart rhythms are called arrhythmias and cause sudden cardiac deaths.In this work,a Computerized Abnormal Heart Rhythms Detection(CAHRD)system is developed using ECG signals.It consists of four stages;preprocessing,feature extraction,feature optimization and classifier.At first,Pan and Tompkins algorithm is employed to detect the envelope of Q,R and S waves in the preprocessing stage.It uses a recursive filter to eliminate muscle noise,T-wave interference and baseline wander.As the analysis of ECG signal in the spatial domain does not provide a complete description of the signal,the feature extraction involves using frequency contents obtained from multiple wavelet filters;bi-orthogonal,Symlet and Daubechies at different resolution levels in the feature extraction stage.Then,Black Widow Optimization(BWO)is applied to optimize the hybrid wavelet features in the feature optimization stage.Finally,a kernel based Support Vector Machine(SVM)is employed to classify heartbeats into five classes.In SVM,Radial Basis Function(RBF),polynomial and linear kernels are used.A total of∼15000 ECG signals are obtained from the Massachusetts Institute of Technology-Beth Israel Hospital(MIT-BIH)arrhythmia database for performance evaluation of the proposed CAHRD system.Results show that the proposed CAHRD system proved to be a powerful tool for ECG analysis.It correctly classifies five classes of heartbeats with 99.91%accuracy using an RBF kernel with 2nd level wavelet coefficients.The CAHRD system achieves an improvement of∼6%over random projections with the ensemble SVM approach and∼2%over morphological and ECG segment based features with the RBF classifier.展开更多
The phenomenon of a target echo peak overlapping with the backscattered echo peak significantly undermines the detection range and precision of underwater laser fuzes.To overcome this issue,we propose a four-quadrant ...The phenomenon of a target echo peak overlapping with the backscattered echo peak significantly undermines the detection range and precision of underwater laser fuzes.To overcome this issue,we propose a four-quadrant dual-beam circumferential scanning laser fuze to distinguish various interference signals and provide more real-time data for the backscatter filtering algorithm.This enhances the algorithm loading capability of the fuze.In order to address the problem of insufficient filtering capacity in existing linear backscatter filtering algorithms,we develop a nonlinear backscattering adaptive filter based on the spline adaptive filter least mean square(SAF-LMS)algorithm.We also designed an algorithm pause module to retain the original trend of the target echo peak,improving the time discrimination accuracy and anti-interference capability of the fuze.Finally,experiments are conducted with varying signal-to-noise ratios of the original underwater target echo signals.The experimental results show that the average signal-to-noise ratio before and after filtering can be improved by more than31 d B,with an increase of up to 76%in extreme detection distance.展开更多
Compression index Ccis an essential parameter in geotechnical design for which the effectiveness of correlation is still a challenge.This paper suggests a novel modelling approach using machine learning(ML)technique.T...Compression index Ccis an essential parameter in geotechnical design for which the effectiveness of correlation is still a challenge.This paper suggests a novel modelling approach using machine learning(ML)technique.The performance of five commonly used machine learning(ML)algorithms,i.e.back-propagation neural network(BPNN),extreme learning machine(ELM),support vector machine(SVM),random forest(RF)and evolutionary polynomial regression(EPR)in predicting Cc is comprehensively investigated.A database with a total number of 311 datasets including three input variables,i.e.initial void ratio e0,liquid limit water content wL,plasticity index Ip,and one output variable Cc is first established.Genetic algorithm(GA)is used to optimize the hyper-parameters in five ML algorithms,and the average prediction error for the 10-fold cross-validation(CV)sets is set as thefitness function in the GA for enhancing the robustness of ML models.The results indicate that ML models outperform empirical prediction formulations with lower prediction error.RF yields the lowest error followed by BPNN,ELM,EPR and SVM.If the ranges of input variables in the database are large enough,BPNN and RF models are recommended to predict Cc.Furthermore,if the distribution of input variables is continuous,RF model is the best one.Otherwise,EPR model is recommended if the ranges of input variables are small.The predicted correlations between input and output variables using five ML models show great agreement with the physical explanation.展开更多
The structural optimization of wireless sensor networks is a critical issue because it impacts energy consumption and hence the network’s lifetime.Many studies have been conducted for homogeneous networks,but few hav...The structural optimization of wireless sensor networks is a critical issue because it impacts energy consumption and hence the network’s lifetime.Many studies have been conducted for homogeneous networks,but few have been performed for heterogeneouswireless sensor networks.This paper utilizes Rao algorithms to optimize the structure of heterogeneous wireless sensor networks according to node locations and their initial energies.The proposed algorithms lack algorithm-specific parameters and metaphorical connotations.The proposed algorithms examine the search space based on the relations of the population with the best,worst,and randomly assigned solutions.The proposed algorithms can be evaluated using any routing protocol,however,we have chosen the well-known routing protocols in the literature:Low Energy Adaptive Clustering Hierarchy(LEACH),Power-Efficient Gathering in Sensor Information Systems(PEAGSIS),Partitioned-based Energy-efficient LEACH(PE-LEACH),and the Power-Efficient Gathering in Sensor Information Systems Neural Network(PEAGSIS-NN)recent routing protocol.We compare our optimized method with the Jaya,the Particle Swarm Optimization-based Energy Efficient Clustering(PSO-EEC)protocol,and the hybrid Harmony Search Algorithm and PSO(HSA-PSO)algorithms.The efficiencies of our proposed algorithms are evaluated by conducting experiments in terms of the network lifetime(first dead node,half dead nodes,and last dead node),energy consumption,packets to cluster head,and packets to the base station.The experimental results were compared with those obtained using the Jaya optimization algorithm.The proposed algorithms exhibited the best performance.The proposed approach successfully prolongs the network lifetime by 71% for the PEAGSIS protocol,51% for the LEACH protocol,10% for the PE-LEACH protocol,and 73% for the PEGSIS-NN protocol;Moreover,it enhances other criteria such as energy conservation,fitness convergence,packets to cluster head,and packets to the base station.展开更多
Rapid development in Information Technology(IT)has allowed several novel application regions like large outdoor vehicular networks for Vehicle-to-Vehicle(V2V)transmission.Vehicular networks give a safe and more effect...Rapid development in Information Technology(IT)has allowed several novel application regions like large outdoor vehicular networks for Vehicle-to-Vehicle(V2V)transmission.Vehicular networks give a safe and more effective driving experience by presenting time-sensitive and location-aware data.The communication occurs directly between V2V and Base Station(BS)units such as the Road Side Unit(RSU),named as a Vehicle to Infrastructure(V2I).However,the frequent topology alterations in VANETs generate several problems with data transmission as the vehicle velocity differs with time.Therefore,the scheme of an effectual routing protocol for reliable and stable communications is significant.Current research demonstrates that clustering is an intelligent method for effectual routing in a mobile environment.Therefore,this article presents a Falcon Optimization Algorithm-based Energy Efficient Communication Protocol for Cluster-based Routing(FOA-EECPCR)technique in VANETS.The FOA-EECPCR technique intends to group the vehicles and determine the shortest route in the VANET.To accomplish this,the FOA-EECPCR technique initially clusters the vehicles using FOA with fitness functions comprising energy,distance,and trust level.For the routing process,the Sparrow Search Algorithm(SSA)is derived with a fitness function that encompasses two variables,namely,energy and distance.A series of experiments have been conducted to exhibit the enhanced performance of the FOA-EECPCR method.The experimental outcomes demonstrate the enhanced performance of the FOA-EECPCR approach over other current methods.展开更多
With the development of information technology,a large number of product quality data in the entire manufacturing process is accumulated,but it is not explored and used effectively.The traditional product quality pred...With the development of information technology,a large number of product quality data in the entire manufacturing process is accumulated,but it is not explored and used effectively.The traditional product quality prediction models have many disadvantages,such as high complexity and low accuracy.To overcome the above problems,we propose an optimized data equalization method to pre-process dataset and design a simple but effective product quality prediction model:radial basis function model optimized by the firefly algorithm with Levy flight mechanism(RBFFALM).First,the new data equalization method is introduced to pre-process the dataset,which reduces the dimension of the data,removes redundant features,and improves the data distribution.Then the RBFFALFM is used to predict product quality.Comprehensive expe riments conducted on real-world product quality datasets validate that the new model RBFFALFM combining with the new data pre-processing method outperforms other previous me thods on predicting product quality.展开更多
With the increase in ocean exploration activities and underwater development,the autonomous underwater vehicle(AUV)has been widely used as a type of underwater automation equipment in the detection of underwater envir...With the increase in ocean exploration activities and underwater development,the autonomous underwater vehicle(AUV)has been widely used as a type of underwater automation equipment in the detection of underwater environments.However,nowadays AUVs generally have drawbacks such as weak endurance,low intelligence,and poor detection ability.The research and implementation of path-planning methods are the premise of AUVs to achieve actual tasks.To improve the underwater operation ability of the AUV,this paper studies the typical problems of path-planning for the ant colony algorithm and the artificial potential field algorithm.In response to the limitations of a single algorithm,an optimization scheme is proposed to improve the artificial potential field ant colony(APF-AC)algorithm.Compared with traditional ant colony and comparative algorithms,the APF-AC reduced the path length by 1.57%and 0.63%(in the simple environment),8.92%and 3.46%(in the complex environment).The iteration time has been reduced by approximately 28.48%and 18.05%(in the simple environment),18.53%and 9.24%(in the complex environment).Finally,the improved APF-AC algorithm has been validated on the AUV platform,and the experiment is consistent with the simulation.Improved APF-AC algorithm can effectively reduce the underwater operation time and overall power consumption of the AUV,and shows a higher safety.展开更多
One of the important research issues in wireless sensor networks(WSNs)is the optimal layout designing for the deployment of sensor nodes.It directly affects the quality of monitoring,cost,and detection capability of W...One of the important research issues in wireless sensor networks(WSNs)is the optimal layout designing for the deployment of sensor nodes.It directly affects the quality of monitoring,cost,and detection capability of WSNs.Layout optimization is an NP-hard combinatorial problem,which requires optimization of multiple competing objectives like cost,coverage,connectivity,lifetime,load balancing,and energy consumption of sensor nodes.In the last decade,several meta-heuristic optimization techniques have been proposed to solve this problem,such as genetic algorithms(GA)and particle swarm optimization(PSO).However,these approaches either provided computationally expensive solutions or covered a limited number of objectives,which are combinations of area coverage,the number of sensor nodes,energy consumption,and lifetime.In this study,a meta-heuristic multi-objective firefly algorithm(MOFA)is presented to solve the layout optimization problem.Here,the main goal is to cover a number of objectives related to optimal layouts of homogeneous WSNs,which includes coverage,connectivity,lifetime,energy consumption and the number of sensor nodes.Simulation results showed that MOFA created optimal Pareto front of non-dominated solutions with better hyper-volumes and spread of solutions,in comparison to multi-objective genetic algorithms(IBEA,NSGA-II)and particle swarm optimizers(OMOPSO,SMOPSO).Therefore,MOFA can be used in real-time deployment applications of large-scale WSNs to enhance their detection capability and quality of monitoring.展开更多
Soil swelling-related disaster is considered as one of the most devastating geo-hazards in modern history.Hence,proper determination of a soil’s ability to expand is very vital for achieving a secure and safe ground ...Soil swelling-related disaster is considered as one of the most devastating geo-hazards in modern history.Hence,proper determination of a soil’s ability to expand is very vital for achieving a secure and safe ground for infrastructures.Accordingly,this study has provided a novel and intelligent approach that enables an improved estimation of swelling by using kernelised machines(Bayesian linear regression(BLR)&bayes point machine(BPM)support vector machine(SVM)and deep-support vector machine(D-SVM));(multiple linear regressor(REG),logistic regressor(LR)and artificial neural network(ANN)),tree-based algorithms such as decision forest(RDF)&boosted trees(BDT).Also,and for the first time,meta-heuristic classifiers incorporating the techniques of voting(VE)and stacking(SE)were utilised.Different independent scenarios of explanatory features’combination that influence soil behaviour in swelling were investigated.Preliminary results indicated BLR as possessing the highest amount of deviation from the predictor variable(the actual swell-strain).REG and BLR performed slightly better than ANN while the meta-heuristic learners(VE and SE)produced the best overall performance(greatest R2 value of 0.94 and RMSE of 0.06%exhibited by VE).CEC,plasticity index and moisture content were the features considered to have the highest level of importance.Kernelized binary classifiers(SVM,D-SVM and BPM)gave better accuracy(average accuracy and recall rate of 0.93 and 0.60)compared to ANN,LR and RDF.Sensitivity-driven diagnostic test indicated that the meta-heuristic models’best performance occurred when ML training was conducted using k-fold validation technique.Finally,it is recommended that the concepts developed herein be deployed during the preliminary phases of a geotechnical or geological site characterisation by using the best performing meta-heuristic models via their background coding resource.展开更多
Steganography is a technique for hiding secret messages while sending and receiving communications through a cover item.From ancient times to the present,the security of secret or vital information has always been a s...Steganography is a technique for hiding secret messages while sending and receiving communications through a cover item.From ancient times to the present,the security of secret or vital information has always been a significant problem.The development of secure communication methods that keep recipient-only data transmissions secret has always been an area of interest.Therefore,several approaches,including steganography,have been developed by researchers over time to enable safe data transit.In this review,we have discussed image steganography based on Discrete Cosine Transform(DCT)algorithm,etc.We have also discussed image steganography based on multiple hashing algorithms like the Rivest–Shamir–Adleman(RSA)method,the Blowfish technique,and the hash-least significant bit(LSB)approach.In this review,a novel method of hiding information in images has been developed with minimal variance in image bits,making our method secure and effective.A cryptography mechanism was also used in this strategy.Before encoding the data and embedding it into a carry image,this review verifies that it has been encrypted.Usually,embedded text in photos conveys crucial signals about the content.This review employs hash table encryption on the message before hiding it within the picture to provide a more secure method of data transport.If the message is ever intercepted by a third party,there are several ways to stop this operation.A second level of security process implementation involves encrypting and decrypting steganography images using different hashing algorithms.展开更多
In this study, we propose an algorithm selection method based on coupling strength for the partitioned analysis ofstructure-piezoelectric-circuit coupling, which includes two types of coupling or inverse and direct pi...In this study, we propose an algorithm selection method based on coupling strength for the partitioned analysis ofstructure-piezoelectric-circuit coupling, which includes two types of coupling or inverse and direct piezoelectriccoupling and direct piezoelectric and circuit coupling. In the proposed method, implicit and explicit formulationsare used for strong and weak coupling, respectively. Three feasible partitioned algorithms are generated, namely(1) a strongly coupled algorithm that uses a fully implicit formulation for both types of coupling, (2) a weaklycoupled algorithm that uses a fully explicit formulation for both types of coupling, and (3) a partially stronglycoupled and partially weakly coupled algorithm that uses an implicit formulation and an explicit formulation forthe two types of coupling, respectively.Numerical examples using a piezoelectric energy harvester,which is a typicalstructure-piezoelectric-circuit coupling problem, demonstrate that the proposed method selects the most costeffectivealgorithm.展开更多
The Gannet Optimization Algorithm (GOA) and the Whale Optimization Algorithm (WOA) demonstrate strong performance;however, there remains room for improvement in convergence and practical applications. This study intro...The Gannet Optimization Algorithm (GOA) and the Whale Optimization Algorithm (WOA) demonstrate strong performance;however, there remains room for improvement in convergence and practical applications. This study introduces a hybrid optimization algorithm, named the adaptive inertia weight whale optimization algorithm and gannet optimization algorithm (AIWGOA), which addresses challenges in enhancing handwritten documents. The hybrid strategy integrates the strengths of both algorithms, significantly enhancing their capabilities, whereas the adaptive parameter strategy mitigates the need for manual parameter setting. By amalgamating the hybrid strategy and parameter-adaptive approach, the Gannet Optimization Algorithm was refined to yield the AIWGOA. Through a performance analysis of the CEC2013 benchmark, the AIWGOA demonstrates notable advantages across various metrics. Subsequently, an evaluation index was employed to assess the enhanced handwritten documents and images, affirming the superior practical application of the AIWGOA compared with other algorithms.展开更多
Cornachia’s algorithm can be adapted to the case of the equation x2+dy2=nand even to the case of ax2+bxy+cy2=n. For the sake of completeness, we have given modalities without proofs (the proof in the case of the equa...Cornachia’s algorithm can be adapted to the case of the equation x2+dy2=nand even to the case of ax2+bxy+cy2=n. For the sake of completeness, we have given modalities without proofs (the proof in the case of the equation x2+y2=n). Starting from a quadratic form with two variables f(x,y)=ax2+bxy+cy2and n an integer. We have shown that a primitive positive solution (u,v)of the equation f(x,y)=nis admissible if it is obtained in the following way: we take α modulo n such that f(α,1)≡0modn, u is the first of the remainders of Euclid’s algorithm associated with n and α that is less than 4cn/| D |) (possibly α itself) and the equation f(x,y)=n. has an integer solution u in y. At the end of our work, it also appears that the Cornacchia algorithm is good for the form n=ax2+bxy+cy2if all the primitive positive integer solutions of the equation f(x,y)=nare admissible, i.e. computable by the algorithmic process.展开更多
文摘It is one of the topics that have been studied extensively on maximum power point tracking(MPPT)recently.Traditional or soft computing methods are used for MPPT.Since soft computing approaches are more effective than traditional approaches,studies on MPPT have shifted in this direction.This study aims comparison of performance of seven meta-heuristic training algorithms in the neuro-fuzzy training for MPPT.The meta-heuristic training algorithms used are particle swarm optimization(PSO),harmony search(HS),cuckoo search(CS),artificial bee colony(ABC)algorithm,bee algorithm(BA),differential evolution(DE)and flower pollination algorithm(FPA).The antecedent and conclusion parameters of neuro-fuzzy are determined by these algorithms.The data of a 250 W photovoltaic(PV)is used in the applications.For effective MPPT,different neuro-fuzzy structures,different membership functions and different control parameter values are evaluated in detail.Related training algorithms are compared in terms of solution quality and convergence speed.The strengths and weaknesses of these algorithms are revealed.It is seen that the type and number of membership function,colony size,number of generations affect the solution quality and convergence speed of the training algorithms.As a result,it has been observed that CS and ABC algorithm are more effective than other algorithms in terms of solution quality and convergence in solving the related problem.
文摘Damage identification of the offshore floating wind turbine by vibration/dynamic signals is one of the important and new research fields in the Structural Health Monitoring(SHM). In this paper a new damage identification method is proposed based on meta-heuristic algorithms using the dynamic response of the TLP(Tension-Leg Platform) floating wind turbine structure. The Genetic Algorithms(GA), Artificial Immune System(AIS), Particle Swarm Optimization(PSO), and Artificial Bee Colony(ABC) are chosen for minimizing the object function, defined properly for damage identification purpose. In addition to studying the capability of mentioned algorithms in correctly identifying the damage, the effect of the response type on the results of identification is studied. Also, the results of proposed damage identification are investigated with considering possible uncertainties of the structure. Finally, for evaluating the proposed method in real condition, a 1/100 scaled experimental setup of TLP Floating Wind Turbine(TLPFWT) is provided in a laboratory scale and the proposed damage identification method is applied to the scaled turbine.
文摘Different performance levels may be obtained for sideway collapse evaluation of steel moment frames depending on the evaluation procedure used to handle uncertainties. In this article, the process of representing modelling uncertainties, record to record (RTR) variations and cognitive uncertainties for moment resisting steel frames of various heights is discussed in detail. RTR uncertainty is used by incremental dynamic analysis (IDA), modelling uncertainties are considered through backbone curves and hysteresis loops of component, and cognitive uncertainty is presented in three levels of material quality. IDA is used to evaluate RTR uncertainty based on strong ground motion records selected by the k-means algorithm, which is favoured over Monte Carlo selection due to its time saving appeal. Analytical equations of the Response Surface Method are obtained through IDA results by the Cuckoo algorithm, which predicts the mean and standard deviation of the collapse fragility curve. The Takagi-Sugeno-Kang model is used to represent material quality based on the response surface coefficients. Finally, collapse fragility curves with the various sources of uncertainties mentioned are derived through a large number of material quality values and meta variables inferred by the Takagi-Sugeno-Kang fuzzy model based on response surface method coefficients. It is concluded that a better risk management strategy in countries where material quality control is weak, is to account for cognitive uncertainties in fragility curves and the mean annual frequency.
文摘Small parasitic Hemipteran insects known as bedbugs(Cimicidae)feed on warm-blooded mammal’s blood.The most famous member of this family is the Cimex lectularius or common bedbug.The current paper proposes a novel swarm intelligence optimization algorithm called the Bedbug Meta-Heuristic Algorithm(BMHA).The primary inspiration for the bedbug algorithm comes from the static and dynamic swarming behaviors of bedbugs in nature.The two main stages of optimization algorithms,exploration,and exploitation,are designed by modeling bedbug social interaction to search for food.The proposed algorithm is benchmarked qualitatively and quantitatively using many test functions including CEC2019.The results of evaluating BMHA prove that this algorithm can improve the initial random population for a given optimization problem to converge towards global optimization and provide highly competitive results compared to other well-known optimization algorithms.The results also prove the new algorithm's performance in solving real optimization problems in unknown search spaces.To achieve this,the proposed algorithm has been used to select the features of fake news in a semi-supervised manner,the results of which show the good performance of the proposed algorithm in solving problems.
基金partially supported by the Guangdong Basic and Applied Basic Research Foundation(2023A1515011531)the National Natural Science Foundation of China under Grant 62173356+2 种基金the Science and Technology Development Fund(FDCT),Macao SAR,under Grant 0019/2021/AZhuhai Industry-University-Research Project with Hongkong and Macao under Grant ZH22017002210014PWCthe Key Technologies for Scheduling and Optimization of Complex Distributed Manufacturing Systems(22JR10KA007).
文摘The flow shop scheduling problem is important for the manufacturing industry.Effective flow shop scheduling can bring great benefits to the industry.However,there are few types of research on Distributed Hybrid Flow Shop Problems(DHFSP)by learning assisted meta-heuristics.This work addresses a DHFSP with minimizing the maximum completion time(Makespan).First,a mathematical model is developed for the concerned DHFSP.Second,four Q-learning-assisted meta-heuristics,e.g.,genetic algorithm(GA),artificial bee colony algorithm(ABC),particle swarm optimization(PSO),and differential evolution(DE),are proposed.According to the nature of DHFSP,six local search operations are designed for finding high-quality solutions in local space.Instead of randomselection,Q-learning assists meta-heuristics in choosing the appropriate local search operations during iterations.Finally,based on 60 cases,comprehensive numerical experiments are conducted to assess the effectiveness of the proposed algorithms.The experimental results and discussions prove that using Q-learning to select appropriate local search operations is more effective than the random strategy.To verify the competitiveness of the Q-learning assistedmeta-heuristics,they are compared with the improved iterated greedy algorithm(IIG),which is also for solving DHFSP.The Friedman test is executed on the results by five algorithms.It is concluded that the performance of four Q-learning-assisted meta-heuristics are better than IIG,and the Q-learning-assisted PSO shows the best competitiveness.
基金supported by the Center for Mining,Electro-Mechanical research of Hanoi University of Mining and Geology(HUMG),Hanoi,Vietnamfinancially supported by the Hunan Provincial Department of Education General Project(19C1744)+1 种基金Hunan Province Science Foundation for Youth Scholars of China fund(2018JJ3510)the Innovation-Driven Project of Central South University(2020CX040)。
文摘Blasting is well-known as an effective method for fragmenting or moving rock in open-pit mines.To evaluate the quality of blasting,the size of rock distribution is used as a critical criterion in blasting operations.A high percentage of oversized rocks generated by blasting operations can lead to economic and environmental damage.Therefore,this study proposed four novel intelligent models to predict the size of rock distribution in mine blasting in order to optimize blasting parameters,as well as the efficiency of blasting operation in open mines.Accordingly,a nature-inspired algorithm(i.e.,firefly algorithm-FFA)and different machine learning algorithms(i.e.,gradient boosting machine(GBM),support vector machine(SVM),Gaussian process(GP),and artificial neural network(ANN))were combined for this aim,abbreviated as FFA-GBM,FFA-SVM,FFA-GP,and FFA-ANN,respectively.Subsequently,predicted results from the abovementioned models were compared with each other using three statistical indicators(e.g.,mean absolute error,root-mean-squared error,and correlation coefficient)and color intensity method.For developing and simulating the size of rock in blasting operations,136 blasting events with their images were collected and analyzed by the Split-Desktop software.In which,111 events were randomly selected for the development and optimization of the models.Subsequently,the remaining 25 blasting events were applied to confirm the accuracy of the proposed models.Herein,blast design parameters were regarded as input variables to predict the size of rock in blasting operations.Finally,the obtained results revealed that the FFA is a robust optimization algorithm for estimating rock fragmentation in bench blasting.Among the models developed in this study,FFA-GBM provided the highest accuracy in predicting the size of fragmented rocks.The other techniques(i.e.,FFA-SVM,FFA-GP,and FFA-ANN)yielded lower computational stability and efficiency.Hence,the FFA-GBM model can be used as a powerful and precise soft computing tool that can be applied to practical engineering cases aiming to improve the quality of blasting and rock fragmentation.
文摘Optimized road maintenance planning seeks for solutions that can minimize the life-cycle cost of a road network and concurrently maximize pavement condition. Aiming at pro- posing an optimal set of road maintenance solutions, robust meta-heuristic algorithms are used in research. Two main optimization techniques are applied including single-objective and multi-objective optimization. Genetic algorithms (GA), particle swarm optimization (PSO), and combination of genetic algorithm and particle swarm optimization (GAPSO) as single-objective techniques are used, while the non-domination sorting genetic algorithm II (NSGAII) and multi-objective particle swarm optimization (MOPSO) which are sufficient for solving computationally complex large-size optimization problems as multi-objective techniques are applied and compared. A real case study from the rural transportation network of Iran is employed to illustrate the sufficiency of the optimum algorithm. The formulation of the optimization model is carried out in such a way that a cost-effective maintenance strategy is reached by preserving the performance level of the road network at a desirable level. So, the objective functions are pavement performance maximization and maintenance cost minimization. It is concluded that multi-objective algorithms including non-domination sorting genetic algorithm II (NSGAII) and multi-objective particle swarm optimization performed better than the single objective algorithms due to the capability to balance between both objectives. And between multi-objective algorithms the NSGAII provides the optimum solution for the road maintenance planning.
文摘The non-invasive evaluation of the heart through EectroCardioG-raphy(ECG)has played a key role in detecting heart disease.The analysis of ECG signals requires years of learning and experience to interpret and extract useful information from them.Thus,a computerized system is needed to classify ECG signals with more accurate results effectively.Abnormal heart rhythms are called arrhythmias and cause sudden cardiac deaths.In this work,a Computerized Abnormal Heart Rhythms Detection(CAHRD)system is developed using ECG signals.It consists of four stages;preprocessing,feature extraction,feature optimization and classifier.At first,Pan and Tompkins algorithm is employed to detect the envelope of Q,R and S waves in the preprocessing stage.It uses a recursive filter to eliminate muscle noise,T-wave interference and baseline wander.As the analysis of ECG signal in the spatial domain does not provide a complete description of the signal,the feature extraction involves using frequency contents obtained from multiple wavelet filters;bi-orthogonal,Symlet and Daubechies at different resolution levels in the feature extraction stage.Then,Black Widow Optimization(BWO)is applied to optimize the hybrid wavelet features in the feature optimization stage.Finally,a kernel based Support Vector Machine(SVM)is employed to classify heartbeats into five classes.In SVM,Radial Basis Function(RBF),polynomial and linear kernels are used.A total of∼15000 ECG signals are obtained from the Massachusetts Institute of Technology-Beth Israel Hospital(MIT-BIH)arrhythmia database for performance evaluation of the proposed CAHRD system.Results show that the proposed CAHRD system proved to be a powerful tool for ECG analysis.It correctly classifies five classes of heartbeats with 99.91%accuracy using an RBF kernel with 2nd level wavelet coefficients.The CAHRD system achieves an improvement of∼6%over random projections with the ensemble SVM approach and∼2%over morphological and ECG segment based features with the RBF classifier.
基金supported by the 2021 Open Project Fund of Science and Technology on Electromechanical Dynamic Control Laboratory,grant number 212-C-J-F-QT-2022-0020China Postdoctoral Science Foundation,grant number 2021M701713+1 种基金Postgraduate Research&Practice Innovation Program of Jiangsu Province,grant number KYCX23_0511the Jiangsu Funding Program for Excellent Postdoctoral Talent,grant number 20220ZB245。
文摘The phenomenon of a target echo peak overlapping with the backscattered echo peak significantly undermines the detection range and precision of underwater laser fuzes.To overcome this issue,we propose a four-quadrant dual-beam circumferential scanning laser fuze to distinguish various interference signals and provide more real-time data for the backscatter filtering algorithm.This enhances the algorithm loading capability of the fuze.In order to address the problem of insufficient filtering capacity in existing linear backscatter filtering algorithms,we develop a nonlinear backscattering adaptive filter based on the spline adaptive filter least mean square(SAF-LMS)algorithm.We also designed an algorithm pause module to retain the original trend of the target echo peak,improving the time discrimination accuracy and anti-interference capability of the fuze.Finally,experiments are conducted with varying signal-to-noise ratios of the original underwater target echo signals.The experimental results show that the average signal-to-noise ratio before and after filtering can be improved by more than31 d B,with an increase of up to 76%in extreme detection distance.
基金financial support provided by the RIF project(Grant No.PolyU R5037-18F)from the Research Grants Council(RGC)of Hong Kong is gratefully acknowledged。
文摘Compression index Ccis an essential parameter in geotechnical design for which the effectiveness of correlation is still a challenge.This paper suggests a novel modelling approach using machine learning(ML)technique.The performance of five commonly used machine learning(ML)algorithms,i.e.back-propagation neural network(BPNN),extreme learning machine(ELM),support vector machine(SVM),random forest(RF)and evolutionary polynomial regression(EPR)in predicting Cc is comprehensively investigated.A database with a total number of 311 datasets including three input variables,i.e.initial void ratio e0,liquid limit water content wL,plasticity index Ip,and one output variable Cc is first established.Genetic algorithm(GA)is used to optimize the hyper-parameters in five ML algorithms,and the average prediction error for the 10-fold cross-validation(CV)sets is set as thefitness function in the GA for enhancing the robustness of ML models.The results indicate that ML models outperform empirical prediction formulations with lower prediction error.RF yields the lowest error followed by BPNN,ELM,EPR and SVM.If the ranges of input variables in the database are large enough,BPNN and RF models are recommended to predict Cc.Furthermore,if the distribution of input variables is continuous,RF model is the best one.Otherwise,EPR model is recommended if the ranges of input variables are small.The predicted correlations between input and output variables using five ML models show great agreement with the physical explanation.
文摘The structural optimization of wireless sensor networks is a critical issue because it impacts energy consumption and hence the network’s lifetime.Many studies have been conducted for homogeneous networks,but few have been performed for heterogeneouswireless sensor networks.This paper utilizes Rao algorithms to optimize the structure of heterogeneous wireless sensor networks according to node locations and their initial energies.The proposed algorithms lack algorithm-specific parameters and metaphorical connotations.The proposed algorithms examine the search space based on the relations of the population with the best,worst,and randomly assigned solutions.The proposed algorithms can be evaluated using any routing protocol,however,we have chosen the well-known routing protocols in the literature:Low Energy Adaptive Clustering Hierarchy(LEACH),Power-Efficient Gathering in Sensor Information Systems(PEAGSIS),Partitioned-based Energy-efficient LEACH(PE-LEACH),and the Power-Efficient Gathering in Sensor Information Systems Neural Network(PEAGSIS-NN)recent routing protocol.We compare our optimized method with the Jaya,the Particle Swarm Optimization-based Energy Efficient Clustering(PSO-EEC)protocol,and the hybrid Harmony Search Algorithm and PSO(HSA-PSO)algorithms.The efficiencies of our proposed algorithms are evaluated by conducting experiments in terms of the network lifetime(first dead node,half dead nodes,and last dead node),energy consumption,packets to cluster head,and packets to the base station.The experimental results were compared with those obtained using the Jaya optimization algorithm.The proposed algorithms exhibited the best performance.The proposed approach successfully prolongs the network lifetime by 71% for the PEAGSIS protocol,51% for the LEACH protocol,10% for the PE-LEACH protocol,and 73% for the PEGSIS-NN protocol;Moreover,it enhances other criteria such as energy conservation,fitness convergence,packets to cluster head,and packets to the base station.
文摘Rapid development in Information Technology(IT)has allowed several novel application regions like large outdoor vehicular networks for Vehicle-to-Vehicle(V2V)transmission.Vehicular networks give a safe and more effective driving experience by presenting time-sensitive and location-aware data.The communication occurs directly between V2V and Base Station(BS)units such as the Road Side Unit(RSU),named as a Vehicle to Infrastructure(V2I).However,the frequent topology alterations in VANETs generate several problems with data transmission as the vehicle velocity differs with time.Therefore,the scheme of an effectual routing protocol for reliable and stable communications is significant.Current research demonstrates that clustering is an intelligent method for effectual routing in a mobile environment.Therefore,this article presents a Falcon Optimization Algorithm-based Energy Efficient Communication Protocol for Cluster-based Routing(FOA-EECPCR)technique in VANETS.The FOA-EECPCR technique intends to group the vehicles and determine the shortest route in the VANET.To accomplish this,the FOA-EECPCR technique initially clusters the vehicles using FOA with fitness functions comprising energy,distance,and trust level.For the routing process,the Sparrow Search Algorithm(SSA)is derived with a fitness function that encompasses two variables,namely,energy and distance.A series of experiments have been conducted to exhibit the enhanced performance of the FOA-EECPCR method.The experimental outcomes demonstrate the enhanced performance of the FOA-EECPCR approach over other current methods.
基金supported by the National Science and Technology Innovation 2030 Next-Generation Artifical Intelligence Major Project(2018AAA0101801)the National Natural Science Foundation of China(72271188)。
文摘With the development of information technology,a large number of product quality data in the entire manufacturing process is accumulated,but it is not explored and used effectively.The traditional product quality prediction models have many disadvantages,such as high complexity and low accuracy.To overcome the above problems,we propose an optimized data equalization method to pre-process dataset and design a simple but effective product quality prediction model:radial basis function model optimized by the firefly algorithm with Levy flight mechanism(RBFFALM).First,the new data equalization method is introduced to pre-process the dataset,which reduces the dimension of the data,removes redundant features,and improves the data distribution.Then the RBFFALFM is used to predict product quality.Comprehensive expe riments conducted on real-world product quality datasets validate that the new model RBFFALFM combining with the new data pre-processing method outperforms other previous me thods on predicting product quality.
基金supported by Research Program supported by the National Natural Science Foundation of China(No.62201249)the Jiangsu Agricultural Science and Technology Innovation Fund(No.CX(21)1007)+2 种基金the Open Project of the Zhejiang Provincial Key Laboratory of Crop Harvesting Equipment and Technology(Nos.2021KY03,2021KY04)University-Industry Collaborative Education Program(No.201801166003)the Postgraduate Research&Practice Innovation Program of Jiangsu Province(No.SJCX22_1042).
文摘With the increase in ocean exploration activities and underwater development,the autonomous underwater vehicle(AUV)has been widely used as a type of underwater automation equipment in the detection of underwater environments.However,nowadays AUVs generally have drawbacks such as weak endurance,low intelligence,and poor detection ability.The research and implementation of path-planning methods are the premise of AUVs to achieve actual tasks.To improve the underwater operation ability of the AUV,this paper studies the typical problems of path-planning for the ant colony algorithm and the artificial potential field algorithm.In response to the limitations of a single algorithm,an optimization scheme is proposed to improve the artificial potential field ant colony(APF-AC)algorithm.Compared with traditional ant colony and comparative algorithms,the APF-AC reduced the path length by 1.57%and 0.63%(in the simple environment),8.92%and 3.46%(in the complex environment).The iteration time has been reduced by approximately 28.48%and 18.05%(in the simple environment),18.53%and 9.24%(in the complex environment).Finally,the improved APF-AC algorithm has been validated on the AUV platform,and the experiment is consistent with the simulation.Improved APF-AC algorithm can effectively reduce the underwater operation time and overall power consumption of the AUV,and shows a higher safety.
基金This research has been funded by the Deanship of Scientific Research at Imam Mohammad Ibn Saud Islamic University through Research Group No.RG-21-07-09.
文摘One of the important research issues in wireless sensor networks(WSNs)is the optimal layout designing for the deployment of sensor nodes.It directly affects the quality of monitoring,cost,and detection capability of WSNs.Layout optimization is an NP-hard combinatorial problem,which requires optimization of multiple competing objectives like cost,coverage,connectivity,lifetime,load balancing,and energy consumption of sensor nodes.In the last decade,several meta-heuristic optimization techniques have been proposed to solve this problem,such as genetic algorithms(GA)and particle swarm optimization(PSO).However,these approaches either provided computationally expensive solutions or covered a limited number of objectives,which are combinations of area coverage,the number of sensor nodes,energy consumption,and lifetime.In this study,a meta-heuristic multi-objective firefly algorithm(MOFA)is presented to solve the layout optimization problem.Here,the main goal is to cover a number of objectives related to optimal layouts of homogeneous WSNs,which includes coverage,connectivity,lifetime,energy consumption and the number of sensor nodes.Simulation results showed that MOFA created optimal Pareto front of non-dominated solutions with better hyper-volumes and spread of solutions,in comparison to multi-objective genetic algorithms(IBEA,NSGA-II)and particle swarm optimizers(OMOPSO,SMOPSO).Therefore,MOFA can be used in real-time deployment applications of large-scale WSNs to enhance their detection capability and quality of monitoring.
文摘Soil swelling-related disaster is considered as one of the most devastating geo-hazards in modern history.Hence,proper determination of a soil’s ability to expand is very vital for achieving a secure and safe ground for infrastructures.Accordingly,this study has provided a novel and intelligent approach that enables an improved estimation of swelling by using kernelised machines(Bayesian linear regression(BLR)&bayes point machine(BPM)support vector machine(SVM)and deep-support vector machine(D-SVM));(multiple linear regressor(REG),logistic regressor(LR)and artificial neural network(ANN)),tree-based algorithms such as decision forest(RDF)&boosted trees(BDT).Also,and for the first time,meta-heuristic classifiers incorporating the techniques of voting(VE)and stacking(SE)were utilised.Different independent scenarios of explanatory features’combination that influence soil behaviour in swelling were investigated.Preliminary results indicated BLR as possessing the highest amount of deviation from the predictor variable(the actual swell-strain).REG and BLR performed slightly better than ANN while the meta-heuristic learners(VE and SE)produced the best overall performance(greatest R2 value of 0.94 and RMSE of 0.06%exhibited by VE).CEC,plasticity index and moisture content were the features considered to have the highest level of importance.Kernelized binary classifiers(SVM,D-SVM and BPM)gave better accuracy(average accuracy and recall rate of 0.93 and 0.60)compared to ANN,LR and RDF.Sensitivity-driven diagnostic test indicated that the meta-heuristic models’best performance occurred when ML training was conducted using k-fold validation technique.Finally,it is recommended that the concepts developed herein be deployed during the preliminary phases of a geotechnical or geological site characterisation by using the best performing meta-heuristic models via their background coding resource.
文摘Steganography is a technique for hiding secret messages while sending and receiving communications through a cover item.From ancient times to the present,the security of secret or vital information has always been a significant problem.The development of secure communication methods that keep recipient-only data transmissions secret has always been an area of interest.Therefore,several approaches,including steganography,have been developed by researchers over time to enable safe data transit.In this review,we have discussed image steganography based on Discrete Cosine Transform(DCT)algorithm,etc.We have also discussed image steganography based on multiple hashing algorithms like the Rivest–Shamir–Adleman(RSA)method,the Blowfish technique,and the hash-least significant bit(LSB)approach.In this review,a novel method of hiding information in images has been developed with minimal variance in image bits,making our method secure and effective.A cryptography mechanism was also used in this strategy.Before encoding the data and embedding it into a carry image,this review verifies that it has been encrypted.Usually,embedded text in photos conveys crucial signals about the content.This review employs hash table encryption on the message before hiding it within the picture to provide a more secure method of data transport.If the message is ever intercepted by a third party,there are several ways to stop this operation.A second level of security process implementation involves encrypting and decrypting steganography images using different hashing algorithms.
基金the Japan Society for the Promotion of Science,KAKENHI Grant Nos.20H04199 and 23H00475.
文摘In this study, we propose an algorithm selection method based on coupling strength for the partitioned analysis ofstructure-piezoelectric-circuit coupling, which includes two types of coupling or inverse and direct piezoelectriccoupling and direct piezoelectric and circuit coupling. In the proposed method, implicit and explicit formulationsare used for strong and weak coupling, respectively. Three feasible partitioned algorithms are generated, namely(1) a strongly coupled algorithm that uses a fully implicit formulation for both types of coupling, (2) a weaklycoupled algorithm that uses a fully explicit formulation for both types of coupling, and (3) a partially stronglycoupled and partially weakly coupled algorithm that uses an implicit formulation and an explicit formulation forthe two types of coupling, respectively.Numerical examples using a piezoelectric energy harvester,which is a typicalstructure-piezoelectric-circuit coupling problem, demonstrate that the proposed method selects the most costeffectivealgorithm.
文摘The Gannet Optimization Algorithm (GOA) and the Whale Optimization Algorithm (WOA) demonstrate strong performance;however, there remains room for improvement in convergence and practical applications. This study introduces a hybrid optimization algorithm, named the adaptive inertia weight whale optimization algorithm and gannet optimization algorithm (AIWGOA), which addresses challenges in enhancing handwritten documents. The hybrid strategy integrates the strengths of both algorithms, significantly enhancing their capabilities, whereas the adaptive parameter strategy mitigates the need for manual parameter setting. By amalgamating the hybrid strategy and parameter-adaptive approach, the Gannet Optimization Algorithm was refined to yield the AIWGOA. Through a performance analysis of the CEC2013 benchmark, the AIWGOA demonstrates notable advantages across various metrics. Subsequently, an evaluation index was employed to assess the enhanced handwritten documents and images, affirming the superior practical application of the AIWGOA compared with other algorithms.
文摘Cornachia’s algorithm can be adapted to the case of the equation x2+dy2=nand even to the case of ax2+bxy+cy2=n. For the sake of completeness, we have given modalities without proofs (the proof in the case of the equation x2+y2=n). Starting from a quadratic form with two variables f(x,y)=ax2+bxy+cy2and n an integer. We have shown that a primitive positive solution (u,v)of the equation f(x,y)=nis admissible if it is obtained in the following way: we take α modulo n such that f(α,1)≡0modn, u is the first of the remainders of Euclid’s algorithm associated with n and α that is less than 4cn/| D |) (possibly α itself) and the equation f(x,y)=n. has an integer solution u in y. At the end of our work, it also appears that the Cornacchia algorithm is good for the form n=ax2+bxy+cy2if all the primitive positive integer solutions of the equation f(x,y)=nare admissible, i.e. computable by the algorithmic process.