Nowadays, achieving secure communication and protecting sensitive data from unauthorized access over public networks are the main concerns for cloud servers. Hence, to secure both data and keys and so ensure secure data storage and access, our proposed work designs a novel Quantum Key Distribution (QKD) scheme relying upon a non-commutative encryption framework. It makes use of a novel QKD approach, which guarantees highly secure data transmission. Along with this, a shared secret is generated using Diffie-Hellman (DH) to certify secure key generation at reduced time complexity. Moreover, a non-commutative approach is used, which effectively allows users to store and access encrypted data in the cloud server. Also, to prevent data loss or corruption caused by insiders in the cloud, an Optimized Genetic Algorithm (OGA) is utilized, which effectively recovers missing data without loss. This is followed by the decryption process when requested by the user. Thus, our proposed framework ensures authentication and paves the way for secure data access, with enhanced performance and reduced complexity compared with prior works.
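For illustration, the sketch below shows how a Diffie-Hellman shared secret of the kind this abstract relies on can be derived and hashed into a session key; the toy prime, the party names, and the SHA-256 key-derivation step are illustrative assumptions, not parameters of the proposed framework.

```python
import hashlib
import secrets

# Toy public parameters, for illustration only: P must be a large safe prime
# (e.g., a standardized group) in any real deployment.
P = 2**127 - 1          # a known Mersenne prime, far too small for real use
G = 5

def dh_keypair():
    """Generate a private exponent and the corresponding public value g^x mod p."""
    private = secrets.randbelow(P - 2) + 1
    public = pow(G, private, P)
    return private, public

alice_priv, alice_pub = dh_keypair()
bob_priv, bob_pub = dh_keypair()

# Each party combines its private exponent with the other's public value;
# both arrive at the same shared secret without ever transmitting it.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret

# Hash the shared secret into a fixed-length symmetric session key.
session_key = hashlib.sha256(str(alice_secret).encode()).hexdigest()
print(session_key)
```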
Background: Popliteal cysts are common and present as asymptomatic lumps in the medial popliteal fossa. Some have complex internal characteristics such as septa and loose bodies. However, not all such lesions are popliteal cysts, and some can be aggressive. These lesions need to be differentiated by the absence of a communicating neck with the joint on ultrasound. The presence of Doppler flow in non-communicating cysts requires further evaluation on MRI prior to performing a biopsy. Using a case series, we propose a simple algorithmic approach that will help identify the malignant lesions and institute appropriate management. Case Presentation: Popliteal cyst: on ultrasound, a characteristic neck communicating with the knee joint. Synovial sarcoma: gadolinium enhancement, with areas of low-, iso- and hyper-intense signal relative to fat on T2. Synovial osteochondromatosis: non-mineralized type, T1 low/intermediate intensity, T2 high intensity; mineralized type, low intensity on T1 and T2. Thrombosed popliteal aneurysm: lamellated appearance with high/low signal intensity on T2. Myxoid liposarcoma: inhomogeneous appearance, homogeneous with gadolinium; usually requires a biopsy for diagnosis. Conclusion: Cystic lesions in the medial aspect of the popliteal fossa can be misdiagnosed. Our article reiterates the importance of the communicating neck that separates popliteal cysts from other mimics. We have proposed an algorithm to identify these mimics.
Steganography is a technique for hiding secret messages while sending and receiving communications through a cover item. From ancient times to the present, the security of secret or vital information has always been a significant problem, and the development of secure communication methods that keep data transmissions readable by the recipient only has always been an area of interest. Therefore, several approaches, including steganography, have been developed by researchers over time to enable safe data transit. In this review, we discuss image steganography based on the Discrete Cosine Transform (DCT) algorithm, among others. We also discuss image steganography based on multiple hashing and cryptographic algorithms such as the Rivest-Shamir-Adleman (RSA) method, the Blowfish technique, and the hash-least significant bit (LSB) approach. In this review, a novel method of hiding information in images is developed with minimal variance in image bits, making the method secure and effective. A cryptography mechanism is also used in this strategy: before the data is encoded and embedded into a carrier image, it is verified to have been encrypted. Usually, text embedded in photos conveys crucial signals about the content. This review employs hash-table encryption on the message before hiding it within the picture to provide a more secure method of data transport, so that if the message is ever intercepted by a third party, there are several ways to stop the attack. A second level of security is implemented by encrypting and decrypting the steganographic images using different hashing algorithms.
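As a minimal illustration of the hash-then-embed idea discussed above, the sketch below encrypts a message and writes its bits into the least significant bits of pixel bytes; the flat byte-array "cover image", the XOR keystream "cipher", and the fixed key are simplifying assumptions, not the review's exact scheme.

```python
import hashlib

def encrypt(message: bytes, key: bytes) -> bytes:
    """Toy stream 'cipher': XOR with a SHA-256-derived keystream (illustrative only)."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(message):
        stream += hashlib.sha256(stream).digest()
    return bytes(m ^ s for m, s in zip(message, stream))

def embed_lsb(pixels: bytearray, payload: bytes) -> bytearray:
    """Write each payload bit into the least significant bit of one pixel byte."""
    bits = [(byte >> i) & 1 for byte in payload for i in range(7, -1, -1)]
    if len(bits) > len(pixels):
        raise ValueError("cover image too small for payload")
    stego = bytearray(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit   # replace only the LSB
    return stego

def extract_lsb(pixels: bytearray, n_bytes: int) -> bytes:
    """Read n_bytes back out of the LSBs, MSB-first per byte."""
    out = bytearray()
    for i in range(n_bytes):
        byte = 0
        for j in range(8):
            byte = (byte << 1) | (pixels[i * 8 + j] & 1)
        out.append(byte)
    return bytes(out)

cover = bytearray(range(256)) * 4          # stand-in for flattened pixel data
secret = encrypt(b"meet at noon", b"shared-key")
stego = embed_lsb(cover, secret)
recovered = extract_lsb(stego, len(secret))
assert encrypt(recovered, b"shared-key") == b"meet at noon"   # XOR is its own inverse
```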
In recent decades, fog computing has played a vital role in executing parallel computational tasks, specifically scientific workflow tasks. Workflow applications take more time to run in cloud data centers, so it is essential to develop effective models for Virtual Machine (VM) allocation and task scheduling in fog computing environments. Effective task scheduling, VM migration, and allocation together optimize the use of computational resources across different fog nodes, ensuring that tasks are executed with minimal energy consumption and reducing the chances of resource bottlenecks. In this manuscript, the proposed framework comprises two phases: (i) effective task scheduling using a fractional selectivity approach and (ii) VM allocation using a proposed algorithm named Fitness Sharing Chaotic Particle Swarm Optimization (FSCPSO). The FSCPSO algorithm integrates the concepts of chaos theory and fitness sharing to effectively balance global exploration and local exploitation. This balance enables a wide range of solutions to be explored, leading to minimal total cost and makespan in comparison to other traditional optimization algorithms. The FSCPSO algorithm's performance is analyzed using six evaluation measures, namely Load Balancing Level (LBL), Average Resource Utilization (ARU), total cost, makespan, energy consumption, and response time. Compared with conventional optimization algorithms, FSCPSO achieves a higher LBL of 39.12%, an ARU of 58.15%, a minimal total cost of 1175, and a makespan of 85.87 ms, particularly when evaluated for 50 tasks.
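The following sketch is not the authors' FSCPSO (fitness sharing and the scheduling model are omitted); it only illustrates, under assumed constants and a toy sphere objective, how a logistic chaotic map can drive the inertia weight of an otherwise standard PSO.

```python
import random

def sphere(x):
    """Toy objective standing in for the cost/makespan model."""
    return sum(v * v for v in x)

def chaotic_pso(dim=10, swarm=30, iters=200, lo=-5.0, hi=5.0):
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_val = [sphere(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    z = 0.7                                   # logistic-map state driving the inertia weight
    for _ in range(iters):
        z = 4.0 * z * (1.0 - z)               # chaotic update in (0, 1)
        w = 0.4 + 0.5 * z                     # inertia weight wanders chaotically in [0.4, 0.9]
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + 2.0 * r1 * (pbest[i][d] - pos[i][d])
                             + 2.0 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = sphere(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

print(chaotic_pso()[1])   # best objective value found
```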
Large-scale optimization problems require suitable optimization techniques, and metaheuristic approaches are highly useful for solving difficult optimization problems in practice. The purpose of this research is to optimize a transportation system with the help of such approaches. We selected forest vehicle routing data as the case study to minimize the total cost and the distance of the forest transportation system. Matlab software was used to find the best solution for this case by applying three metaheuristic algorithms: the Genetic Algorithm (GA), Ant Colony Optimization (ACO), and the Extended Great Deluge (EGD). The results show that GA, compared to ACO and EGD, provides the best solution for the cost and the route length in our case study; EGD is the second-best approach, and ACO ranks last.
The continuously updated database of failures and censored data for numerous products has become large, and information on some covariates associated with the failure times is missing from the database. Because the dataset is large and has missing information, the analysis tasks become complicated and a long time is required to execute the programming code. In such situations, the divide and recombine (D&R) approach, which has practical computational performance for big data analysis, can be applied. In this study, the D&R approach was applied to analyze real field data of an automobile component with incomplete covariate information using the Weibull regression model. Model parameters were estimated using the expectation-maximization algorithm. The results of the data analysis and simulation demonstrate that the D&R approach is applicable to such datasets. Further, the percentiles and reliability functions of the distribution under different covariate conditions were estimated to evaluate the component's performance under these covariates. The findings of this study have managerial implications for design decisions, safety, and the reliability of automobile components.
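The EM-based Weibull regression with covariates is not reproduced here; the sketch below only illustrates the divide-and-recombine idea on simulated, fully observed failure times, fitting a two-parameter Weibull to each block with SciPy and averaging the block estimates (the data, block count, and unweighted averaging rule are assumptions).

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(0)
failures = weibull_min.rvs(c=1.8, scale=1000.0, size=100_000, random_state=rng)

def fit_block(block):
    """Fit a two-parameter Weibull (location fixed at 0) to one block of data."""
    shape, _, scale = weibull_min.fit(block, floc=0)
    return shape, scale

# Divide: split the large dataset into manageable blocks and fit each one.
blocks = np.array_split(failures, 20)
estimates = np.array([fit_block(b) for b in blocks])

# Recombine: average the per-block estimates (simple unweighted combination).
shape_hat, scale_hat = estimates.mean(axis=0)
print(f"shape ~ {shape_hat:.3f}, scale ~ {scale_hat:.1f}")

# A percentile of interest, e.g. B10 life (time by which 10% of units have failed).
b10 = weibull_min.ppf(0.10, c=shape_hat, scale=scale_hat)
print(f"B10 life ~ {b10:.1f}")
```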
In time series modeling, the residuals are often checked for white noise and normality. In practice, the commonly used tests are the Ljung-Box test, the McLeod-Li test, and the Lin-Mudholkar test. In this paper, we present a nonparametric approach for checking the residuals of time series models. This approach is based on the squared maximal correlation coefficient ρ*² between the residuals and time t. The basic idea is to use the bootstrap to form the null distribution of the statistic ρ*² under the null hypothesis H0: ρ*² = 0. For calculating ρ*², we propose a ρ algorithm analogous to the ACE procedure. A power study shows that this approach is more powerful than the Ljung-Box test. Some numerical results and two examples are also reported in this paper.
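A rough sketch of the bootstrap step described above, with the ordinary squared Pearson correlation standing in for the squared maximal correlation ρ*² (the ACE-style ρ algorithm is not reproduced); the white-noise residual series and the number of resamples are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def stat(resid, t):
    """Squared correlation between residuals and time -- a simple stand-in
    for the squared maximal correlation used in the paper."""
    return np.corrcoef(resid, t)[0, 1] ** 2

# Example residual series (white noise here, so H0 should not be rejected).
n = 200
t = np.arange(n)
resid = rng.normal(size=n)

observed = stat(resid, t)

# Bootstrap the null: resampling the residuals destroys any association
# with time, so the resampled statistics approximate the H0 distribution.
B = 2000
null = np.array([stat(rng.choice(resid, size=n, replace=True), t) for _ in range(B)])

p_value = (np.sum(null >= observed) + 1) / (B + 1)
print(f"observed statistic = {observed:.4f}, bootstrap p-value = {p_value:.3f}")
```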
The vehicle routing problem (VRP) is a typical discrete combinatorial optimization problem, and many models and algorithms have been proposed to solve the VRP and its variants. Although existing approaches have contributed significantly to the development of this field, they are either limited in problem size or need manual intervention in choosing parameters. To overcome these difficulties, many studies have considered learning-based optimization (LBO) algorithms for the VRP. This paper reviews recent advances in this field and divides the relevant approaches into end-to-end approaches and step-by-step approaches. We performed a statistical analysis of the reviewed articles from various aspects and designed three experiments to evaluate the performance of four representative LBO algorithms. Finally, we summarize the types of problems to which different LBO algorithms are applicable and suggest directions in which researchers can improve LBO algorithms.
The variational data assimilation scheme (VAR) is applied to investigating the advective effect and the evolution of the control variables in a time-splitting semi-Lagrangian framework. Two variational algorithms are used: one is the conjugate code method (the direct approach), and the other is the numerical backward integration of the analytical adjoint equation (the indirect approach). Theoretical derivation and sensitivity tests are conducted in order to verify the consistency, or lack thereof, of the two algorithms under the semi-Lagrangian framework. The sensitivity to perfect and imperfect initial conditions is also tested in both the direct and indirect approaches. Our research shows that the two algorithms are identical not only in theory but also in numerical calculation. Furthermore, the algorithms of the indirect approach are much more feasible and efficient than those of the direct one when both are employed in the semi-Lagrangian framework. Taking advantage of the semi-Lagrangian framework, one purpose of this paper is to illustrate that when the variational assimilation algorithm relies on backward integration as its computational method, the algorithm is greatly simplified. Such simplicity in the indirect approach should be meaningful for VAR design in a passive model. Indeed, if one can successfully split the diabatic and adiabatic processes, the algorithms presented in this paper might easily be used in a more general version of an atmospheric model.
In this paper, a direct probabilistic approach (DPA) is presented to formulate and solve moment equations for nonlinear systems excited by environmental loads that can be either stationary or nonstationary random processes. The proposed method has the advantage of obtaining the response's moments directly from the initial conditions and the statistical characteristics of the corresponding external excitations. First, the response's moment equations are derived directly from the DPA, which is completely independent of the Itô/filtering approach since no specific assumptions regarding the correlation structure of the excitation are made. By solving them under Gaussian closure, the response's moments can be obtained. Subsequently, a multiscale algorithm for the numerical solution of the moment equations is exploited to improve computational efficiency and avoid excessive wall-clock time. Finally, a comparison of the results with Monte Carlo (MC) simulation shows good agreement. Furthermore, the efficiency advantage of the multiscale algorithm is also demonstrated with an engineering example.
In this paper, we develop a modified accelerated stochastic simulation method for chemically reacting systems, called the "final all possible steps" (FAPS) method, which obtains reliable statistics of all species at any time during the time course with fewer simulation runs. Moreover, the FAPS method can be incorporated into leap methods, which makes the simulation of larger systems more efficient. Numerical results indicate that the proposed methods can be applied to a wide range of chemically reacting systems with a high level of precision and achieve a significant improvement in efficiency over existing methods.
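FAPS itself is not shown here; the sketch below is the standard Gillespie direct-method stochastic simulation algorithm that such accelerated schemes start from, run on an assumed two-reaction birth-death system.

```python
import math
import random

def gillespie(x0, rates, t_end):
    """Direct-method SSA for a birth-death system: birth at rate k1, death at rate k2*X."""
    k1, k2 = rates
    t, x = 0.0, x0
    history = [(t, x)]
    while t < t_end:
        a1, a2 = k1, k2 * x                 # reaction propensities
        a0 = a1 + a2
        if a0 == 0.0:
            break
        t += -math.log(1.0 - random.random()) / a0   # exponential waiting time
        x += 1 if random.random() * a0 < a1 else -1  # pick which reaction fired
        history.append((t, x))
    return history

trajectory = gillespie(x0=0, rates=(10.0, 0.5), t_end=20.0)
print(trajectory[-1])    # final (time, copy number); the mean settles near k1/k2 = 20
```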
Skeletal dysplasias are not uncommon entities and a radiologist is likely to encounter a suspected case of dysplasia in his practice. The correct and early diagnosis of dysplasia is important for management of complications and for future genetic counselling. While there is an exhaustive classification system on dysplasias, it is important to be familiar with the radiological features of common dysplasias. In this article, we enumerate a radiographic approach to skeletal dysplasias, describe the essential as well as differentiating features of common non-lethal skeletal dysplasias and conclude by presenting working algorithms to either definitively diagnose a particular dysplasia or suggest the most likely differential diagnoses to the referring clinician and thus direct further workup of the patient.
Software testing has been attracting a lot of attention for effective software development. In a model-driven approach, the Unified Modelling Language (UML) is a conceptual modelling approach for capturing obligations and other features of the system, and specialized tools interpret these models into other software artifacts such as code, test data, and documentation. The generation of test cases permits the appropriate test data to be determined that have the aptitude to ascertain the requirements. This paper focuses on optimizing the test data obtained from UML activity and state chart diagrams by using a Basic Genetic Algorithm (BGA). For generating the test cases, both diagrams were converted into their corresponding intermediate graphical forms, namely the Activity Diagram Graph (ADG) and the State Chart Diagram Graph (SCDG). Both graphs were then joined to create a single graph known as the Activity State Chart Diagram Graph (ASCDG). Next, the ASCDG was optimized using the BGA to generate the test data. A case study involving a withdrawal from the automated teller machine (ATM) of a bank was employed to demonstrate the approach, which successfully identified defects in various ATM functions such as messaging and operation.
In order to reduce the computation required for complex problems, a new surrogate-assisted estimation of distribution algorithm with a Gaussian process was proposed. Coevolution was used in dual populations that evolved in parallel. The search space was projected into multiple subspaces, each searched by a sub-population, while the whole space was exploited by the other population, which exchanges information with the sub-populations. In order to make the evolutionary course efficient, a multivariate Gaussian model and a Gaussian mixture model were used in the two populations separately to estimate the distribution of individuals and reproduce new generations. For the surrogate model, a Gaussian process, which predicts the variance of its predictions, was combined with the algorithm. The results on six benchmark functions show that the new algorithm performs better than other surrogate-model-based algorithms, and its computational complexity is only 10% of that of the original estimation of distribution algorithm.
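The dual-population coevolution is not reproduced; the sketch below only illustrates, on an assumed sphere objective, how a Gaussian-process surrogate (scikit-learn's GaussianProcessRegressor) can pre-screen offspring sampled from a Gaussian model so that only the most promising candidates receive expensive true evaluations.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

def expensive(x):
    """Stand-in for a costly objective (sphere function)."""
    return float(np.sum(x ** 2))

dim, pop, evaluated_per_gen = 5, 50, 10
X = rng.uniform(-5, 5, size=(pop, dim))                 # initial evaluated population
y = np.array([expensive(x) for x in X])

for gen in range(20):
    # Estimation-of-distribution step: fit a Gaussian to the best half, sample offspring.
    elite = X[np.argsort(y)][: pop // 2]
    mean, cov = elite.mean(axis=0), np.cov(elite.T) + 1e-6 * np.eye(dim)
    offspring = rng.multivariate_normal(mean, cov, size=pop)

    # Surrogate step: a GP trained on evaluated points predicts offspring quality,
    # and only the most promising offspring get true (expensive) evaluations.
    gp = GaussianProcessRegressor(kernel=RBF(), normalize_y=True).fit(X, y)
    pred, _ = gp.predict(offspring, return_std=True)
    chosen = offspring[np.argsort(pred)[:evaluated_per_gen]]
    X = np.vstack([X, chosen])
    y = np.concatenate([y, [expensive(x) for x in chosen]])

print("best value found:", y.min())
```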
Spam has turned into a big predicament these days due to the increase in the number of spam emails, as recipients regularly receive piles of them. Spam not only wastes users' time and bandwidth but also limits the storage space of the mailbox as well as the disk space. Thus, spam detection is a challenge for individuals and organizations alike. To advance spam email detection, this work proposes a new spam detection approach that uses the grasshopper optimization algorithm (GOA) to train a multilayer perceptron (MLP) classifier for categorizing emails as ham or spam. Hence, MLP and GOA produce an artificial neural network (ANN) model, referred to as GOAMLP. Two corpora, SpamBase and UK-2011 Web spam, are applied in this approach. Finally, the findings provide evidence that the proposed spam detection approach achieves a better level of spam detection than the state of the art.
Antiviral software systems (AVSs) have problems identifying polymorphic variants of viruses without explicit signatures for such variants. Alignment-based techniques from bioinformatics may provide a novel way to generate signatures from consensuses found in polymorphic variant code. We demonstrate how multiple sequence alignment supplemented with gap penalties leads to viral code signatures that generalize successfully to previously known polymorphic variants of the JS.Cassandra virus and previously unknown polymorphic variants of the W32.CTX/W32.Cholera and W32.Kitti viruses. The implication is that future smart AVSs may be able to generate effective signatures automatically from actual viral code by varying gap penalties to cover both known and unknown polymorphic variants.
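The actual viral byte sequences and multiple-alignment pipeline are not reproduced; the sketch below is a standard pairwise Needleman-Wunsch global alignment showing where the gap penalty enters, applied to two hypothetical code-like strings from which a consensus signature could be read off.

```python
def needleman_wunsch(a, b, match=2, mismatch=-1, gap=-2):
    """Global alignment with a linear gap penalty; returns the two aligned strings."""
    n, m = len(a), len(b)
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    # Trace back from the bottom-right corner to recover one optimal alignment.
    out_a, out_b, i, j = [], [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch):
            out_a.append(a[i - 1]); out_b.append(b[j - 1]); i -= 1; j -= 1
        elif i > 0 and score[i][j] == score[i - 1][j] + gap:
            out_a.append(a[i - 1]); out_b.append("-"); i -= 1
        else:
            out_a.append("-"); out_b.append(b[j - 1]); j -= 1
    return "".join(reversed(out_a)), "".join(reversed(out_b))

# Two hypothetical code fragments from polymorphic variants of the same routine.
print(needleman_wunsch("pushmoveaddret", "pushnopmoveret"))
```

Raising the magnitude of the gap penalty forces more mismatches instead of gaps, which is the knob the abstract refers to when it mentions varying gap penalties to cover known and unknown variants.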
One of the main problems in machine learning and data mining is to develop a basic model with few features in order to reduce the computational complexity of the algorithms involved in classification. In this paper, feature selection is treated as essential to the classification process: it minimizes computational time, decreases data size, and increases the precision and effectiveness of specific machine learning activities. Owing to their superiority over conventional optimization methods, several metaheuristics have been used to resolve feature selection (FS) issues, and hybrid metaheuristics help increase the search and convergence rate of the underlying algorithms. A hybrid selection algorithm combining two algorithms, the genetic algorithm (GA) and Particle Swarm Optimization (PSO), to enhance search capabilities is developed in this paper. The efficacy of our proposed method is illustrated in a series of simulation phases, using the UCI machine learning repository as the benchmark dataset.
The traveling salesman problem (TSP) is a classical optimization problem and belongs to the class of NP-hard problems. This paper presents a new method, a multi-agent approach based on a genetic algorithm and an ant colony system, to solve the TSP. Three kinds of agents with different functions were designed in the multi-agent architecture proposed in this paper. The first kind of agent is the ant colony optimization agent, whose function is to generate new solutions continuously. The second kind comprises the selection agent, crossover agent, and mutation agent, whose function is to optimize the current group of solutions. The third kind is the fast local searching agent, whose function is to optimize the best solution found since the beginning of the trial. Experimental results show that the proposed hybrid approach has good performance with respect to both the quality of solutions and the speed of computation.
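The full multi-agent architecture with GA operator agents is not reproduced; the sketch below shows only the role the ant colony optimization agent plays, constructing tours and updating pheromone on a small random TSP instance with assumed parameter values.

```python
import math
import random

random.seed(0)
N = 15
cities = [(random.random(), random.random()) for _ in range(N)]
dist = [[math.dist(p, q) or 1e-9 for q in cities] for p in cities]

def tour_length(tour):
    return sum(dist[tour[i]][tour[(i + 1) % N]] for i in range(N))

tau = [[1.0] * N for _ in range(N)]          # pheromone matrix
alpha, beta, rho, ants, iters = 1.0, 3.0, 0.5, 20, 100
best_tour, best_len = None, float("inf")

for _ in range(iters):
    tours = []
    for _ in range(ants):
        start = random.randrange(N)
        tour, unvisited = [start], set(range(N)) - {start}
        while unvisited:
            i = tour[-1]
            weights = [(j, (tau[i][j] ** alpha) * ((1.0 / dist[i][j]) ** beta)) for j in unvisited]
            total = sum(w for _, w in weights)
            r, acc = random.random() * total, 0.0
            for j, w in weights:                     # roulette-wheel city selection
                acc += w
                if acc >= r:
                    break
            tour.append(j)
            unvisited.remove(j)
        tours.append((tour_length(tour), tour))
    # Evaporate, then deposit pheromone proportional to tour quality.
    for i in range(N):
        for j in range(N):
            tau[i][j] *= (1.0 - rho)
    for length, tour in tours:
        for k in range(N):
            a, b = tour[k], tour[(k + 1) % N]
            tau[a][b] += 1.0 / length
            tau[b][a] += 1.0 / length
    iteration_best = min(tours)
    if iteration_best[0] < best_len:
        best_len, best_tour = iteration_best

print(f"best tour length: {best_len:.3f}")
```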
Low-density parity-check (LDPC) codes are very efficient for communicating reliably over a noisy channel. N. Sourlas [1] showed that LDPC codes, which revolutionized the coding domain and are used in many communication standards, can be mapped onto Ising spin systems. Besides, it has been shown that the belief propagation (BP) algorithm, the standard LDPC decoding algorithm, is equivalent to the Thouless-Anderson-Palmer (TAP) approach [2]. Unfortunately, no such study has been made for the other decoding algorithms. In this paper, we develop the Log-Likelihood Ratio Belief Propagation (LLR-BP) algorithm and its simplifications, the BP-based algorithm and the λ-min algorithm, within the TAP approach. We present the performance of these decoding algorithms using a statistical physics argument, i.e., we present the performance as a function of the magnetization.
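The statistical-physics analysis is not reproduced; the sketch below decodes a toy code with the min-sum simplification of LLR-BP (the "BP-based" flavour of check-node update), using an assumed (7,4) Hamming parity-check matrix and hand-picked channel LLRs rather than a true sparse LDPC code.

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code (rows = checks, cols = bits).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def min_sum_decode(llr_channel, H, max_iter=20):
    """Min-sum simplification of LLR-BP: sign product and minimum magnitude
    replace the exact tanh rule at the check nodes."""
    m, n = H.shape
    q = np.tile(llr_channel, (m, 1)) * H          # variable-to-check messages
    r = np.zeros((m, n))                          # check-to-variable messages
    for _ in range(max_iter):
        for c in range(m):
            vs = np.flatnonzero(H[c])
            for v in vs:
                others = vs[vs != v]
                r[c, v] = np.prod(np.sign(q[c, others])) * np.min(np.abs(q[c, others]))
        posterior = llr_channel + r.sum(axis=0)
        hard = (posterior < 0).astype(int)        # LLR < 0 decodes to bit 1
        if not np.any(H @ hard % 2):              # all parity checks satisfied
            return hard
        for c in range(m):                        # refresh extrinsic messages
            vs = np.flatnonzero(H[c])
            for v in vs:
                q[c, v] = posterior[v] - r[c, v]
    return hard

# All-zero codeword sent; two noisy channel LLRs lean weakly toward the wrong bit.
llr = np.array([2.5, -0.4, 3.1, 1.8, 2.2, -0.6, 2.9])
print(min_sum_decode(llr, H))   # expected: the all-zero codeword is recovered
```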