In this study, our aim is to address the problem of gene selection by proposing a hybrid bio-inspired evolutionary algorithm that combines Grey Wolf Optimization (GWO) with Harris Hawks Optimization (HHO) for feature selection. The motivation for utilizing GWO and HHO stems from their bio-inspired nature and their demonstrated success in optimization problems. We aim to leverage the strengths of these algorithms to enhance the effectiveness of feature selection in microarray-based cancer classification. We selected leave-one-out cross-validation (LOOCV) to evaluate the performance of two widely used classifiers, k-nearest neighbors (KNN) and support vector machine (SVM), on high-dimensional cancer microarray data. The proposed method is extensively tested on six publicly available cancer microarray datasets, and a comprehensive comparison with recently published methods is conducted. Our hybrid algorithm demonstrates its effectiveness in improving classification performance, surpassing alternative approaches in terms of precision. The outcomes confirm the capability of our method to substantially improve both the precision and efficiency of cancer classification, thereby advancing the development of more efficient treatment strategies. The proposed hybrid method offers a promising solution to the gene selection problem in microarray-based cancer classification. It improves the accuracy and efficiency of cancer diagnosis and treatment, and its superior performance compared to other methods highlights its potential applicability in real-world cancer classification tasks. By harnessing the complementary search mechanisms of GWO and HHO, we leverage their bio-inspired behavior to identify informative genes relevant to cancer diagnosis and treatment.
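The abstract does not reproduce the hybrid update equations, so the sketch below shows only the wrapper fitness evaluation such a method typically relies on: LOOCV accuracy of a KNN classifier on the genes selected by a binary mask, with a small sparsity reward. The weighting alpha, the KNN setting, and the toy data are our assumptions, not the paper's.

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(mask, X, y, alpha=0.99):
    """Score a binary gene mask: LOOCV accuracy of KNN on the selected
    columns, lightly penalized by subset size (alpha is an illustrative
    assumption, not a value from the paper)."""
    if not mask.any():
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask], y, cv=LeaveOneOut()).mean()
    return alpha * acc + (1 - alpha) * (1 - mask.sum() / mask.size)

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 200))        # toy "microarray": 30 samples, 200 genes
y = rng.integers(0, 2, size=30)
mask = rng.random(200) < 0.1          # one candidate wolf/hawk position, binarized
print(fitness(mask, X, y))
```

Each wolf or hawk position would be binarized into such a mask and scored this way every iteration, which is why wrapper-based gene selection is expensive on microarray data.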
Neuromuscular diseases present profound challenges to individuals and healthcare systems worldwide, severely impacting motor functions. This research provides a comprehensive exploration of how artificial intelligence (AI) technology is revolutionizing rehabilitation for individuals with neuromuscular disorders. Through an extensive review, this paper elucidates a wide array of AI-driven interventions spanning robotic-assisted therapy, virtual reality rehabilitation, and intricately tailored machine learning algorithms. The aim is to delve into the nuanced applications of AI, unlocking its transformative potential in optimizing personalized treatment plans for those grappling with the complexities of neuromuscular diseases. By examining the multifaceted intersection of AI and rehabilitation, this paper not only contributes to our understanding of cutting-edge advancements but also envisions a future where technological innovations play a pivotal role in alleviating the challenges posed by neuromuscular diseases. From employing neural-fuzzy adaptive controllers for precise trajectory tracking amidst uncertainties to utilizing machine learning algorithms for recognizing patient motor intentions and adapting training accordingly, this research encompasses a holistic approach towards harnessing AI for enhanced rehabilitation outcomes. By embracing the synergy between AI and rehabilitation, we pave the way for a future where individuals with neuromuscular disorders can access tailored, effective, and technologically-driven interventions to improve their quality of life and functional independence.
Amid today’s rapid spread of digital technologies into all aspects of life, intended to enhance efficiency and productivity on the one hand and to ensure customer engagement on the other, personal data counterfeiting has become a major concern for businesses and end-users. One solution for ensuring data security is encryption, in which keys are central. There is therefore a need for a robust key-generation implementation that is effective, inexpensive, and non-invasive for protecting data and preventing data counterfeiting. In this paper, we use the theory of electromagnetic wave propagation to generate encryption keys.
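The abstract does not describe the propagation model itself; as a loose, hedged illustration of the general idea only — deriving a symmetric key by hashing quantized physical-layer measurements — the following sketch assumes the measurements have already been taken and quantized (the quantization scheme and sample values are hypothetical, not the paper's):

```python
import hashlib

def key_from_measurements(samples, nbits=256):
    """Hash quantized electromagnetic measurements (e.g., received signal
    strength readings) into an encryption key. This byte-quantization is
    an illustrative assumption, not the paper's method."""
    quantized = bytes(int(s) & 0xFF for s in samples)
    return hashlib.sha256(quantized).digest()[: nbits // 8]

rss_readings = [-41.2, -39.8, -44.5, -40.1, -38.7, -42.3]  # hypothetical dBm values
key = key_from_measurements(abs(r) * 10 for r in rss_readings)
print(key.hex())
```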
Compositional data, such as relative information, is a crucial aspect of machine learning and other related fields. It is typically recorded as closed data, i.e., data that sums to a constant, like 100%. The statistical linear model is the most widely used technique for identifying hidden relationships between underlying random variables of interest. However, data quality is a significant challenge in machine learning, especially when missing data is present. The linear regression model is a commonly used statistical modeling technique applied in various settings to find relationships between variables of interest. When estimating linear regression parameters, which are useful for things like future prediction and partial-effects analysis of independent variables, maximum likelihood estimation (MLE) is the method of choice. However, many datasets contain missing observations, which can lead to costly and time-consuming data recovery. To address this issue, the expectation-maximization (EM) algorithm has been suggested for situations involving missing data. The EM algorithm iteratively finds the best estimates of parameters in statistical models that depend on unobserved variables or data, via maximum likelihood or maximum a posteriori (MAP) estimation. Using the present estimate as input, the expectation (E) step constructs the expected log-likelihood function; finding the parameters that maximize this expected log-likelihood is the job of the maximization (M) step. This study examined how well the EM algorithm performed on a simulated compositional dataset with missing observations, using both robust least-squares and ordinary least-squares regression techniques. The efficacy of the EM algorithm was compared with two alternative imputation techniques, k-Nearest Neighbor (k-NN) and mean imputation, in terms of Aitchison distances and covariance.
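To make the E-step/M-step loop concrete, here is a minimal sketch of EM for simple linear regression when some covariate values are missing, treating (x, y) as bivariate normal. The toy data and parameter choices are ours (the paper works with compositional data), but the conditional-moment E-step and moment-matching M-step are the standard mechanics the abstract describes:

```python
import numpy as np

def em_bivariate(x, y, iters=50):
    """EM for a bivariate normal (x, y) with some x missing (NaN).
    E-step: replace each missing x_i by E[x_i | y_i] under the current
    parameters (and x_i^2 by its conditional second moment).
    M-step: re-estimate means and (co)variances from the completed
    sufficient statistics. Slope = cov(x, y) / var(x)."""
    miss = np.isnan(x)
    mx, my = np.nanmean(x), np.mean(y)            # crude starting values
    sxx, syy = np.nanvar(x), np.var(y)
    sxy = np.cov(x[~miss], y[~miss])[0, 1]
    for _ in range(iters):
        # E-step: conditional moments of missing x given observed y
        ex = np.where(miss, mx + sxy / syy * (y - my), x)
        ex2 = ex**2 + np.where(miss, sxx - sxy**2 / syy, 0.0)
        # M-step: update parameters from completed statistics
        mx = ex.mean()
        sxx = ex2.mean() - mx**2
        sxy = (ex * y).mean() - mx * my
    slope = sxy / sxx
    return slope, my - slope * mx                  # (slope, intercept)

rng = np.random.default_rng(1)
x = rng.normal(0, 1, 200)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, 200)
x[rng.random(200) < 0.3] = np.nan                  # 30% of x missing at random
print(em_bivariate(x, y))                          # should approach (2.0, 1.0)
```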
Steganography is a technique for hiding secret messages within a cover item while sending and receiving communications. From ancient times to the present, the security of secret or vital information has always been a significant problem, and the development of secure communication methods that keep data transmissions secret from everyone but the recipient has always been an area of interest. Therefore, several approaches, including steganography, have been developed by researchers over time to enable safe data transit. In this review, we discuss image steganography based on the Discrete Cosine Transform (DCT) algorithm, among other techniques. We also discuss image steganography based on multiple hashing algorithms, such as the Rivest–Shamir–Adleman (RSA) method, the Blowfish technique, and the hash-least significant bit (LSB) approach. In this review, a novel method of hiding information in images is developed with minimal variance in image bits, making the method secure and effective. A cryptography mechanism is also used in this strategy: before the data is encoded and embedded into a carrier image, it is verified to have been encrypted. Embedded text in photos usually conveys crucial signals about the content, so this review employs hash-table encryption on the message before hiding it within the picture, providing a more secure method of data transport and several ways to thwart an interception by a third party. A second level of security is implemented by encrypting and decrypting the steganography images using different hashing algorithms.
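The review covers DCT- and hash-based variants; as a baseline, here is a minimal sketch of the plain LSB mechanism alone, hiding message bytes in the lowest bit of each pixel. Encrypting the message first, as the review prescribes, is assumed to happen before `lsb_embed` is called; the 4-byte length header is our own framing convention, not from the review.

```python
import numpy as np

def lsb_embed(img, message: bytes):
    """Hide message bytes in the least significant bits of a flattened
    8-bit image array (prefixed by a 4-byte big-endian length header)."""
    payload = len(message).to_bytes(4, "big") + message
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = img.flatten()
    if bits.size > flat.size:
        raise ValueError("cover image too small")
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite LSBs
    return flat.reshape(img.shape)

def lsb_extract(img):
    """Read the length header, then recover that many payload bytes."""
    flat = img.flatten()
    n = int.from_bytes(np.packbits(flat[:32] & 1).tobytes(), "big")
    return np.packbits(flat[32 : 32 + 8 * n] & 1).tobytes()

cover = np.random.default_rng(2).integers(0, 256, (64, 64), dtype=np.uint8)
stego = lsb_embed(cover, b"meet at dawn")
print(lsb_extract(stego))                      # b'meet at dawn'
```

Because only the lowest bit of each pixel changes, the per-pixel variance introduced is at most 1, which is the "minimal variance in image bits" property the review emphasizes.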
Two new regularization algorithms for solving the first-kind Volterra integral equation, which describes the pressure-rate deconvolution problem in well-test data interpretation, are developed in this paper. The main features of the problem are the strongly nonuniform scale of the solution and large errors (up to 15%) in the input data. In both algorithms, the solution is represented as a decomposition over special basis functions that satisfy the given a priori information on the solution; this idea allows us to significantly improve the quality of the approximate solution and to simplify the minimization problem. The theoretical details of the algorithms, as well as the results of numerical experiments demonstrating their robustness, are presented.
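The paper's basis-function decomposition is not reproduced in the abstract; as context, the sketch below shows the classical Tikhonov baseline such algorithms refine: discretize g(t) = ∫₀ᵗ K(t, s) f(s) ds with the rectangle rule into a lower-triangular system and solve the regularized normal equations. The kernel, grid, and noise level are illustrative assumptions.

```python
import numpy as np

n, h, lam = 200, 0.01, 1e-4
t = h * np.arange(1, n + 1)
K = np.tril(np.exp(-(t[:, None] - t[None, :])))  # K(t,s) = exp(-(t-s)), s <= t
A = h * K                                        # rectangle-rule Volterra operator
f_true = np.sin(2 * np.pi * t)
g = A @ f_true + 0.05 * np.random.default_rng(3).normal(size=n)  # noisy data

# Tikhonov regularization: minimize ||A f - g||^2 + lam ||f||^2
f = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ g)
print(np.linalg.norm(f - f_true) / np.linalg.norm(f_true))  # relative error
```

Plain Tikhonov penalizes all components uniformly, which handles the 15% data errors but not the nonuniform solution scale; restricting f to well-chosen basis functions, as the paper proposes, is what addresses the latter.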
This paper presents a binary gravitational search algorithm (BGSA) applied to the problem of optimal allotment of DG sets and shunt capacitors in radial distribution systems. The problem is formulated as a nonlinear constrained single-objective optimization problem in which the total line loss (TLL) and the total voltage deviation (TVD) are minimized separately through optimal placement of DG units and shunt capacitors, subject to constraints that include limits on voltage and on the sizes of installed capacitors and DG. The BGSA is applied to the balanced IEEE 10-bus distribution network, and the results are compared with conventional binary particle swarm optimization.
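The paper applies BGSA to the 10-bus feeder model; the placement objective itself is power-flow specific, so the sketch below shows only the core BGSA loop on a toy bit-string objective. The mass, force, and |tanh| transfer-function updates follow the usual BGSA formulation; all parameter values are our assumptions.

```python
import numpy as np

def bgsa(fitness, dim, pop=20, iters=100, g0=100.0, seed=4):
    """Compact binary gravitational search (minimization). Agents are bit
    strings; velocities follow Newtonian attraction between agents weighted
    by fitness-derived masses; bits flip with probability |tanh(v)|."""
    rng = np.random.default_rng(seed)
    X = rng.integers(0, 2, (pop, dim)).astype(float)
    V = np.zeros((pop, dim))
    best_x, best_f = None, np.inf
    for t in range(iters):
        f = np.array([fitness(x) for x in X])
        if f.min() < best_f:
            best_f, best_x = f.min(), X[f.argmin()].copy()
        g = g0 * (1 - t / iters)                           # decaying constant
        m = (f - f.max()) / (f.min() - f.max() + 1e-12)    # masses (best=1)
        M = m / (m.sum() + 1e-12)
        acc = np.zeros_like(X)
        for i in range(pop):                               # pairwise attraction
            for j in range(pop):
                if i != j:
                    d = np.abs(X[i] - X[j]).sum() + 1e-12
                    acc[i] += rng.random() * g * M[j] * (X[j] - X[i]) / d
        V = rng.random((pop, dim)) * V + acc
        flip = rng.random((pop, dim)) < np.abs(np.tanh(V)) # transfer function
        X = np.where(flip, 1 - X, X)
    return best_x, best_f

# toy objective standing in for loss/deviation: minimize |sum(bits) - 7|
x, fx = bgsa(lambda b: abs(b.sum() - 7), dim=20)
print(x, fx)
```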
Numerous cryptographic algorithms (ElGamal, Rabin, RSA, NTRU, etc.) require multiple computations of modular multiplicative inverses. This paper describes and validates a new algorithm, called the Enhanced Euclid Algorithm, for the modular multiplicative inverse (MMI). Analysis of the proposed algorithm shows that it is more efficient than the Extended Euclid Algorithm (XEA). In addition, if an MMI does not exist, then it is not necessary to use the backtracking procedure in the proposed algorithm; this case requires fewer operations on every step (divisions, multiplications, additions, assignments, and push operations on the stack) than the XEA. Overall, the XEA uses more multiplications, additions, and assignments, and twice as many variables, as the proposed algorithm.
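For reference, here is the XEA baseline the paper improves on (the Enhanced variant's bookkeeping is not reproduced in the abstract, so it is not sketched here). The standard iteration keeps only the coefficient of a, returning None when gcd(a, m) ≠ 1 and no inverse exists:

```python
def mmi_xea(a, m):
    """Modular multiplicative inverse via the classical Extended Euclid
    Algorithm. Invariant: old_s * a ≡ old_r (mod m) at every step."""
    old_r, r = a % m, m
    old_s, s = 1, 0
    while r:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_s, s = s, old_s - q * s
    return old_s % m if old_r == 1 else None   # old_r is gcd(a, m)

print(mmi_xea(7, 26))   # 15, since 7 * 15 = 105 ≡ 1 (mod 26)
print(mmi_xea(4, 26))   # None: gcd(4, 26) = 2, so no inverse exists
```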
In this study, we propose an algorithm selection method based on coupling strength for the partitioned analysis of structure-piezoelectric-circuit coupling, which includes two types of coupling: inverse and direct piezoelectric coupling, and direct piezoelectric and circuit coupling. In the proposed method, implicit and explicit formulations are used for strong and weak coupling, respectively. Three feasible partitioned algorithms are generated, namely (1) a strongly coupled algorithm that uses a fully implicit formulation for both types of coupling, (2) a weakly coupled algorithm that uses a fully explicit formulation for both types of coupling, and (3) a partially strongly coupled and partially weakly coupled algorithm that uses an implicit formulation for one type of coupling and an explicit formulation for the other. Numerical examples using a piezoelectric energy harvester, a typical structure-piezoelectric-circuit coupling problem, demonstrate that the proposed method selects the most cost-effective algorithm.
The variable step-size least mean square (LMS) algorithm has always faced a contradiction between fast convergence speed and small steady-state error. To address it, a new algorithm based on a combination of logarithmic and sign functions in the step-size factor is proposed. It establishes a new update rule in which the step-size factor depends on the previous step-size factor and the error signal. This work presents an analysis from three aspects: theoretical analysis, theoretical verification, and specific experiments. The experimental results show that the proposed algorithm is superior to other variable step-size algorithms in convergence speed and steady-state error.
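The abstract does not spell out the exact step-size function, so the sketch below uses a generic variable step-size LMS for system identification in which the step grows with log(1 + α|e|) and is capped at μ_max — an illustrative stand-in for the paper's log/sign-based rule, with all parameter values our own:

```python
import numpy as np

def vss_lms(x, d, taps=8, mu_max=0.05, alpha=5.0):
    """Variable step-size LMS: large error -> larger step (fast convergence),
    small error -> smaller step (small steady-state error)."""
    w = np.zeros(taps)
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1 : n + 1][::-1]     # most recent sample first
        e = d[n] - w @ u                      # a priori error
        L = np.log1p(alpha * abs(e))
        mu = mu_max * L / (1 + L)             # bounded, error-dependent step
        w += mu * e * u                       # LMS weight update
    return w

rng = np.random.default_rng(5)
h = np.array([0.6, -0.3, 0.1, 0.05, 0.0, 0.0, 0.0, 0.0])  # unknown system
x = rng.normal(size=4000)
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.normal(size=len(x))
print(np.round(vss_lms(x, d), 3))             # should approach h
```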
Cloud computing technology is utilized to achieve resource utilization of remote virtual computers, facilitating consumers with rapid and accurate massive-data services. It relies on on-demand resource provisioning, but the necessitated constraints of rapid turnaround time, minimal execution cost, a high rate of resource utilization, and limited makespan transform the Load Balancing (LB) process-based Task Scheduling (TS) problem into an NP-hard optimization issue. In this paper, the Hybrid Prairie Dog and Beluga Whale Optimization Algorithm (HPDBWOA) is propounded for precise mapping of tasks to virtual machines, with the objective of addressing the dynamic nature of the cloud environment. This capability of HPDBWOA helps decrease SLA violations and makespan with optimal resource management. It is modelled as a scheduling strategy that utilizes the merits of PDOA and BWOA to attain reactive decision-making when assigning tasks to virtual resources, taking their priorities into account. It addresses the problem of premature convergence with well-balanced exploration and exploitation to attain the necessitated Quality of Service (QoS) for minimizing the waiting time incurred during the TS process. It further balances exploration and exploitation rates to reduce the makespan during task allocation with complete awareness of VM state. The results of the proposed HPDBWOA confirmed 32.18% lower energy utilization and 28.94% lower cost than the approaches used for comparison. The statistical investigation of the proposed HPDBWOA conducted using ANOVA confirmed its efficacy over the benchmarked systems in terms of throughput, system time, and response time.
In biology, signal transduction refers to a process by which a cell converts one kind of signal or stimulus into another. It involves ordered sequences of biochemical reactions inside the cell. These cascades of reactions are carried out by enzymes and activated by second messengers. Signal transduction pathways are complex in nature: each pathway is responsible for tuning one or more biological functions in the intracellular environment, and more than one pathway can interact to carry forward a single biological function. This behavior makes the pathways difficult to understand, so for the sake of simplicity they need to be partitioned into smaller modules and then analyzed. We took the VEGF signaling pathway, which is responsible for angiogenesis, for this kind of modularized study. Modules were obtained by applying the algorithm of Nayak and De (Nayak and De, 2007) for different complexity values. These sets of modules were compared among themselves to get the best set of modules at an optimal complexity value. The best set of modules was then compared with four different partitioning algorithms, namely Farhat’s (Farhat, 1998), Greedy (Chartrand and Oellermann, 1993), Kernighan–Lin’s (Kernighan and Lin, 1970), and Newman’s community-finding algorithm (Newman, 2006). These comparisons enabled us to decide which of the aforementioned algorithms was the best one for creating partitions from the human VEGF signaling pathway. The optimal complexity value at which the best set of modules was obtained was then used to derive modules for different species for a comparative study. Comparison among these modules sheds light on the trend of development of the VEGF signaling pathway across these species.
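Nayak and De's complexity-driven algorithm is not publicly packaged, so as a runnable stand-in the sketch below applies Newman-style modularity partitioning — one of the four comparison algorithms — via networkx. The edges are an illustrative miniature, not the actual human VEGF pathway topology used in the study.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy stand-in for a signaling network (illustrative edges only).
G = nx.Graph([("VEGFA", "VEGFR2"), ("VEGFR2", "PLCG1"), ("PLCG1", "PKC"),
              ("PKC", "RAF1"), ("RAF1", "MEK"), ("MEK", "ERK"),
              ("VEGFR2", "PI3K"), ("PI3K", "AKT"), ("AKT", "eNOS")])

# Greedy modularity maximization (Newman-style community finding):
# each community plays the role of one pathway "module".
for i, module in enumerate(greedy_modularity_communities(G)):
    print(f"module {i}: {sorted(module)}")
```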
This work proposes a novel approach for the multi-type optimal placement of flexible AC transmission system (FACTS) devices so as to optimize a multi-objective voltage stability problem. The study discusses a way of locating and setting thyristor-controlled series capacitors (TCSC) and static var compensators (SVC) using the multi-objective optimization approach named the strength Pareto multi-objective evolutionary algorithm (SPMOEA). Maximization of the static voltage stability margin (SVSM) and minimization of real power losses (RPL) and load voltage deviation (LVD) are taken as the three objective functions when optimally locating multi-type FACTS devices. The performance and effectiveness of the proposed approach have been validated by simulation results on the IEEE 30-bus and IEEE 118-bus test systems. The proposed approach is compared with the non-dominated sorting particle swarm optimization (NSPSO) algorithm. This comparison confirms the usefulness of the proposed multi-objective technique and makes it promising for solving combinatorial problems of FACTS device location and setting in large-scale power systems.
In this paper we consider a parallel algorithm that detects the maximizer of a unimodal function f(x) computable at every point on the unbounded interval (0, ∞). The algorithm consists of two modes: scanning and detecting. Search diagrams are introduced as a way to describe parallel searching algorithms on unbounded intervals. Dynamic programming equations, combined with a series of linear programming problems, describe the relations between results for every pair of successive evaluations of f in parallel. Properties of optimal search strategies are derived from these equations. The worst-case complexity analysis shows that if the maximizer is located on an a priori unknown interval (0, n−1], then it can be detected after c_p(n) = ⌈2 log_(⌈p/2⌉+1)(n + 1)⌉ − 1 parallel evaluations of f(x), where p is the number of processors.
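As a worked instance of the complexity bound, the helper below evaluates c_p(n), reading the corner brackets in the extracted text as ceilings (our reading of the garbled notation):

```python
import math

def c_p(n: int, p: int) -> int:
    """Worst-case number of parallel evaluation rounds:
    c_p(n) = ceil(2 * log_{ceil(p/2)+1}(n + 1)) - 1."""
    base = math.ceil(p / 2) + 1
    return math.ceil(2 * math.log(n + 1, base)) - 1

print(c_p(1000, 4))    # base 3:  12 rounds
print(c_p(1000, 16))   # base 9:   6 rounds -- more processors, fewer rounds
```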
CC (Cloud Computing) networks are distributed and dynamic, as signals appear/disappear or lose significance. MLTs (Machine Learning Techniques) are trained on datasets that are sometimes inadequate, in terms of samples, for inferring information. A dynamic strategy, DevMLOps (Development Machine Learning Operations), used for automatic selection and tuning of MLTs, results in significant performance differences. However, the scheme has many disadvantages, including the need for continuous training, more samples and training time for feature selection, and increased classification execution times. RFE (Recursive Feature Elimination) is computationally very expensive, as it traverses each feature without considering the correlations between them. This problem can be overcome by the use of wrappers, as they select better features by accounting for both test and train datasets. The aim of this paper is to use DevQLMLOps for automated tuning and selection based on orchestration and messaging between containers. The proposed AKFA (Adaptive Kernel Firefly Algorithm) selects features for CNM (Cloud Network Monitoring) operations. The AKFA methodology is demonstrated on the CNSD (Cloud Network Security Dataset), with satisfactory results in the performance metrics used: precision, recall, F-measure, and accuracy.
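To illustrate the RFE cost the paper criticizes — the estimator is refit after every elimination round — here is plain scikit-learn RFE on a synthetic dataset (the dataset and estimator are our illustrative choices, not the paper's setup):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

# 40 features, 5 informative; step=1 means one refit per eliminated
# feature, which is where RFE's compute cost comes from.
X, y = make_classification(n_samples=300, n_features=40, n_informative=5,
                           random_state=0)
rfe = RFE(LogisticRegression(max_iter=1000), n_features_to_select=5, step=1)
rfe.fit(X, y)
print([i for i, kept in enumerate(rfe.support_) if kept])  # surviving features
```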
AIM To examine the practice pattern in Kaiser Permanente Southern California (KPSC), i.e., gastroenterology (GI)/surgery referrals and endoscopic ultrasound (EUS), for pancreatic cystic neoplasms (PCNs) after the region-wide dissemination of the PCN management algorithm. METHODS A retrospective review was performed; patients given a PCN diagnosis between April 2012 and April 2015 (18 mo before and after the publication of the algorithm) in KPSC (an integrated health system with 15 hospitals and 202 medical offices in Southern California) were identified. RESULTS 2558 patients (1157 pre- and 1401 post-algorithm) received a new diagnosis of PCN in the study period. There was no difference in mean cyst size (pre- 19.1 mm vs post- 18.5 mm, P = 0.119). A smaller percentage of PCNs resulted in EUS after the implementation of the algorithm (pre- 45.5% vs post- 34.8%, P < 0.001). A smaller proportion of patients were referred for GI (pre- 65.2% vs post- 53.3%, P < 0.001) and surgery consultations (pre- 24.8% vs post- 16%, P < 0.001) for PCN after the implementation. There was no significant change in operations for PCNs. The cost of diagnostic care was reduced after the implementation by 24%, 18%, and 36% for EUS, GI, and surgery consultations, respectively, with a total cost saving of 24%. CONCLUSION In the current healthcare climate, there is an increased need to optimize resource utilization. Dissemination of an algorithm for PCN management in an integrated health system resulted in fewer EUS procedures and GI/surgery referrals, likely by aiding the physicians ordering imaging studies in decision making for the management of PCNs. This translated to cost savings of 24%, 18%, and 36% for EUS, GI, and surgical consultations, respectively, with a total diagnostic cost saving of 24%.
The flying foxes optimization (FFO) algorithm, a newly introduced metaheuristic, is inspired by the survival tactics of flying foxes in heat-wave environments. FFO preferentially selects the best-performing individuals, a tendency that causes newly generated solutions to remain closely tied to the current candidate optimum in the search area. To address this issue, this paper introduces an opposition-based learning search mechanism for the FFO algorithm (IFFO). First, niching techniques are introduced to improve the survival-list method, which not only focuses on the adaptability of individuals but also considers the population’s crowding degree to enhance the global search capability. Second, an opposition-based learning initialization strategy is used to perturb the initial population and elevate its quality. Finally, to verify the superiority of the improved search mechanism, IFFO, FFO, and cutting-edge metaheuristic algorithms are compared and analyzed on a set of test functions. The results show that, compared with the other algorithms, IFFO converges rapidly, produces precise results, and exhibits robust stability.
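The opposition-based initialization the abstract mentions is a standard device: for each random individual x in [lb, ub], evaluate its opposite lb + ub − x as well, and keep the better half of the combined set. A generic sketch (IFFO's surrounding niching machinery is not reproduced; the sphere objective is illustrative):

```python
import numpy as np

def obl_init(pop, dim, lb, ub, fitness, seed=6):
    """Opposition-based initialization: random points plus their opposites,
    keeping the best `pop` of the 2*pop candidates (minimization)."""
    rng = np.random.default_rng(seed)
    X = lb + (ub - lb) * rng.random((pop, dim))
    X_opp = lb + ub - X                          # opposite population
    both = np.vstack([X, X_opp])
    f = np.array([fitness(x) for x in both])
    return both[np.argsort(f)[:pop]]

sphere = lambda x: float(np.sum(x**2))
init = obl_init(pop=10, dim=5, lb=-10.0, ub=10.0, fitness=sphere)
print(init.shape, sphere(init[0]))               # best retained individual
```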
Ultra-high voltage (UHV) transmission lines are an important part of China’s power grid and are often surrounded by a complex electromagnetic environment. The ground total electric field is considered a main electromagnetic environment indicator of UHV transmission lines and is currently employed for reliable long-term operation of the power grid. Yet the accurate prediction of the ground total electric field remains a technical challenge. In this work, we collected total electric field data from the Ningdong–Zhejiang ±800 kV UHVDC transmission project, specifically the Ling Shao line, and performed an outlier analysis of the data. We show that the Local Outlier Factor (LOF) elimination algorithm has a small average difference and outperforms the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) and Isolation Forest elimination algorithms. Moreover, the Stacking algorithm is found to have superior prediction accuracy compared with a variety of similar prediction algorithms, including the traditional finite element method. The low prediction error of the Stacking algorithm highlights its superior ability to accurately forecast the ground total electric field of UHVDC transmission lines.
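The two-stage pipeline the abstract describes — LOF-based outlier elimination followed by stacked prediction — maps directly onto scikit-learn's `LocalOutlierFactor` and `StackingRegressor`. A minimal sketch on synthetic data (the features, base learners, and meta-model are our assumptions, not the project's):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.linear_model import Ridge
from sklearn.neighbors import KNeighborsRegressor, LocalOutlierFactor

rng = np.random.default_rng(7)
X = rng.normal(size=(500, 4))                        # stand-in predictors
y = X @ [1.5, -2.0, 0.7, 0.2] + 0.1 * rng.normal(size=500)
y[:10] += 8                                          # injected outliers

# Stage 1: LOF elimination -- drop rows the local-density model flags (-1).
keep = LocalOutlierFactor(n_neighbors=20).fit_predict(
    np.column_stack([X, y])) == 1
X, y = X[keep], y[keep]

# Stage 2: stacked prediction -- base learners blended by a meta-model.
stack = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(n_estimators=100, random_state=0)),
                ("knn", KNeighborsRegressor())],
    final_estimator=Ridge())
stack.fit(X[:-100], y[:-100])
print(stack.score(X[-100:], y[-100:]))               # R^2 on held-out rows
```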
Energy efficiency is the prime concern in Wireless Sensor Networks (WSNs), as unconstrained energy consumption essentially limits energy stability and network lifetime. Clustering is the significant approach essential for minimizing unnecessary transmission energy consumption while sustaining network lifetime. This clustering process is identified as a Non-deterministic Polynomial (NP)-hard optimization problem, which has a high probability of being solved through metaheuristic algorithms. The adoption of a hybrid metaheuristic algorithm concentrates on the identification of optimal or near-optimal solutions, which aids better energy stability during Cluster Head (CH) selection. In this paper, the Hybrid Seagull and Whale Optimization Algorithm-based Dynamic Clustering Protocol (HSWOA-DCP) is proposed, combining the exploitation benefits of WOA with the exploration merits of SEOA for optimal CH selection, maintaining energy stability with a prolonged network lifetime. HSWOA-DCP adopts a modified version of the SEagull Optimization Algorithm (SEOA) to handle the problems of premature convergence and computational accuracy that are most likely to arise during CH selection. The inclusion of SEOA into WOA improves the global searching capability during CH selection and prevents worst-fitness nodes from being selected as CHs, since the spiral attacking behavior of SEOA is similar to the bubble-net characteristics of WOA. This CH selection integrates the spiral attacking principles of SEOA and the contraction surrounding mechanism of WOA, improving computational accuracy to prevent frequent re-election. It also includes a Lévy flight strategy in SEOA to avoid premature convergence and attain a better trade-off between the rates of exploration and exploitation. The simulation results of the proposed HSWOA-DCP confirmed a better network survivability rate, network residual energy, and overall network throughput compared with competitive CH selection schemes under different numbers of data transmission rounds. The statistical analysis of the proposed HSWOA-DCP scheme also confirmed its energy stability with respect to an ANOVA test.
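The Lévy flight ingredient mentioned above is typically generated with Mantegna's algorithm, whose heavy-tailed steps supply the occasional long jumps that break premature convergence. A generic sketch (β = 1.5 is the customary choice; the protocol's surrounding CH-selection logic is not reproduced):

```python
import math
import numpy as np

def levy_step(dim, beta=1.5, rng=None):
    """Mantegna's algorithm for Lévy-flight steps: u / |v|^(1/beta),
    with u ~ N(0, sigma^2) and v ~ N(0, 1)."""
    if rng is None:
        rng = np.random.default_rng()
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))
             ) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

print(levy_step(5, rng=np.random.default_rng(8)))  # occasional large components
```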
This study examines the multicriteria scheduling problem on a single machine to minimize three criteria: maximum late work, denoted by V<sub>max</sub>; maximum tardiness, denoted by T<sub>max</sub>; and maximum earliness, denoted by E<sub>max</sub>. We propose several algorithms, based on the type of objective function to be optimized, for simultaneous minimization problems with and without weights and for hierarchical minimization problems. The proposed Algorithm (3) finds the set of efficient solutions for 1//F(V<sub>max</sub>, T<sub>max</sub>, E<sub>max</sub>) and 1//(V<sub>max</sub> + T<sub>max</sub> + E<sub>max</sub>). Local search heuristic methods (the Descent Method (DM), Simulated Annealing (SA), and the Genetic Algorithm (GA)) and the Tree Type Heuristics Method (TTHM) are applied to solve all suggested problems. Finally, the experimental results of Algorithm (3) are compared with the results of the Branch and Bound (BAB) method for optimal and Pareto-optimal solutions on smaller instance sizes, and with the local search heuristic methods on large instance sizes. These results confirm the efficiency of Algorithm (3) in a reasonable time.
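To fix ideas, the helper below evaluates the three criteria for one job sequence on a single machine, using the standard definitions — tardiness T_j = max(0, C_j − d_j), earliness E_j = max(0, d_j − C_j), and late work V_j = min(p_j, T_j) — which we assume match the paper's notation; the instance is illustrative:

```python
def criteria(sequence, p, d):
    """Evaluate a single-machine job sequence: returns (V_max, T_max, E_max)
    from completion times C_j accumulated along the sequence."""
    c = vmax = tmax = emax = 0
    for j in sequence:
        c += p[j]                          # completion time C_j
        t = max(0, c - d[j])               # tardiness
        vmax = max(vmax, min(p[j], t))     # late work is capped at p_j
        tmax = max(tmax, t)
        emax = max(emax, max(0, d[j] - c)) # earliness
    return vmax, tmax, emax

p = [4, 2, 6, 3]                           # processing times (illustrative)
d = [5, 7, 10, 14]                         # due dates
print(criteria([0, 1, 2, 3], p, d))        # (2, 2, 1)
```

An algorithm like the paper's Algorithm (3) would enumerate or search over sequences and retain those whose (V_max, T_max, E_max) vectors are not dominated.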