Funding: the National Natural Science Foundation of China (Grant Nos. 11804076 and 91961204); the Fundamental Research Funds for the Central Universities of China (No. B210202151); the Changzhou Science and Technology Plan (No. CZ520012712).
Abstract: Due to the coexistence of a huge number of structural isomers, global search for the ground-state structures of atomic clusters is a challenging problem. The difficulty also originates from the computational cost of ab initio methods for describing the potential energy surface. Recently, machine learning techniques have been widely utilized to accelerate materials discovery and molecular simulation. Compared to the commonly used artificial neural network, a graph network is naturally suited to clusters, where each atom has a flexible geometric environment. Herein we develop a cluster graph attention network (CGANet) that aggregates information from neighboring vertices and edges using an attention mechanism, and can precisely predict the binding energy and forces of silver clusters with a root mean square error of 5.4 meV/atom and a mean absolute error of 42.3 meV/Å, respectively. As a proof of concept, we have performed global optimization of medium-sized Agn clusters (n = 14–26) by combining CGANet with a genetic algorithm. The reported ground-state structures for n = 14–21 have been successfully reproduced, while entirely new lowest-energy structures are obtained for n = 22–26. Beyond describing the potential energy surface, CGANet is also applied to predict electronic properties of clusters, such as the HOMO energy and the HOMO-LUMO gap. With accuracy comparable to ab initio methods and acceleration by at least two orders of magnitude, CGANet holds great promise for the global search of lowest-energy structures of large clusters and the inverse design of functional clusters.
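The surrogate-plus-genetic-algorithm workflow described in the abstract can be sketched in miniature. CGANet itself is not reproduced here; a toy Lennard-Jones-style pair potential stands in for the learned energy model, and all function names and parameters below are illustrative assumptions, not the paper's implementation:

```python
import random

def surrogate_energy(coords):
    # Stand-in for a learned energy model: a Lennard-Jones pair
    # potential over 3D atomic positions, with pair minimum -1 at r = 1.
    e = 0.0
    for i in range(len(coords)):
        for j in range(i + 1, len(coords)):
            r2 = sum((a - b) ** 2 for a, b in zip(coords[i], coords[j]))
            r6 = max(r2, 0.4) ** 3          # clamp to avoid the singularity at r = 0
            e += 1.0 / r6 ** 2 - 2.0 / r6
    return e

def random_cluster(n, rng, spread=1.5):
    return [[rng.uniform(-spread, spread) for _ in range(3)] for _ in range(n)]

def mutate(coords, rng, sigma=0.1):
    # Small Gaussian displacement of every atom.
    return [[x + rng.gauss(0, sigma) for x in atom] for atom in coords]

def crossover(a, b, rng):
    # Exchange whole atoms between two parent clusters.
    return [atom[:] if rng.random() < 0.5 else other[:]
            for atom, other in zip(a, b)]

def ga_search(n_atoms=8, pop_size=20, generations=60, seed=0):
    rng = random.Random(seed)
    pop = [random_cluster(n_atoms, rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=surrogate_energy)
        elite = pop[: pop_size // 2]        # elitism: the best are never lost
        children = [mutate(crossover(rng.choice(elite), rng.choice(elite), rng), rng)
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=surrogate_energy)

best = ga_search()
```

Because the elite half of each generation is carried over unchanged, the best energy found is monotonically non-increasing over generations; in the real workflow, the cheap surrogate makes the many energy evaluations of such a search affordable.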
Abstract: Evolutionary computation is a kind of adaptive non-numerical computation method designed to simulate the evolution of nature. In this paper, evolutionary algorithm behavior is described in terms of the construction and evolution of sampling distributions over the space of candidate solutions. Iterative construction of the sampling distributions is based on the idea of the global random search of generational methods. Under this framework, proportional selection is characterized as a global search operator, and recombination is characterized as a search process that exploits similarities. It is shown that by properly constraining the search breadth of recombination operators, weak convergence of evolutionary algorithms to a global optimum can be ensured.
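Proportional selection, which the abstract characterizes as a global search operator, is commonly implemented as roulette-wheel sampling: each individual is drawn with probability proportional to its (non-negative) fitness. A minimal sketch, with names of our choosing:

```python
import random

def proportional_select(population, fitness, rng):
    """Roulette-wheel selection: pick an individual with probability
    proportional to its non-negative fitness value."""
    total = sum(fitness)
    pick = rng.uniform(0, total)
    acc = 0.0
    for ind, f in zip(population, fitness):
        acc += f
        if pick <= acc:
            return ind
    return population[-1]   # numerical fallback for rounding at the boundary

rng = random.Random(42)
pop = ["a", "b", "c"]
fit = [1.0, 1.0, 8.0]
draws = [proportional_select(pop, fit, rng) for _ in range(10000)]
```

Over many draws, "c" (fitness 8 of a total 10) is selected roughly 80% of the time, which is the global-sampling behavior the paper's analysis builds on.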
Funding: supported by the National Natural Science Foundation of China (60870004).
Abstract: A novel heuristic search algorithm called the seeker optimization algorithm (SOA) is proposed for real-parameter optimization. The proposed SOA is based on simulating the act of human searching. In the SOA, the search direction is based on empirical gradients obtained by evaluating the response to position changes, while the step length is based on uncertainty reasoning using a simple fuzzy rule. The effectiveness of the SOA is evaluated on a challenging set of typically complex functions in comparison with differential evolution (DE) and three modified particle swarm optimization (PSO) algorithms. The simulation results show that the performance of the SOA is superior or comparable to that of the other algorithms.
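A heavily simplified rendering of one seeker step may help fix the two ingredients: a direction from empirical gradients (probing coordinate-wise function responses) and a step length from a fuzzy membership value. The rank-based rule below is a toy stand-in for the paper's actual fuzzy reasoning, and all names are ours:

```python
def empirical_direction(f, x, delta=0.05):
    # Probe each coordinate; move opposite the empirical gradient sign.
    d = []
    for i in range(len(x)):
        probe = x[:]
        probe[i] += delta
        d.append(-1.0 if f(probe) > f(x) else 1.0)
    return d

def fuzzy_step(rank, pop_size, mu_max=0.95, mu_min=0.1, step_max=0.5):
    # Toy fuzzy rule: better-ranked seekers (small rank) get a high
    # membership value and take small, cautious steps; worse-ranked
    # seekers take larger, more exploratory steps.
    mu = mu_max - (mu_max - mu_min) * rank / max(pop_size - 1, 1)
    return step_max * (1.0 - mu)

def seeker_step(f, x, rank, pop_size):
    d = empirical_direction(f, x)
    alpha = fuzzy_step(rank, pop_size)
    return [xi + alpha * di for xi, di in zip(x, d)]

sphere = lambda v: sum(t * t for t in v)
x = [2.0, -3.0]
for _ in range(40):
    x = seeker_step(sphere, x, rank=0, pop_size=5)
```

Even this crude sign-only direction steadily reduces the sphere function, which illustrates why response-probing can substitute for analytic gradients on black-box objectives.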
Abstract: In this paper, the improvement of pure random search is studied. By taking information about the function to be minimized into consideration, the authors propose two stochastic global optimization algorithms. Numerical experiments with the new stochastic global optimization algorithms are presented for a class of test problems.
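For contrast, here is baseline pure random search next to one illustrative way of "taking information about the function into consideration": re-centering and shrinking the sampling box around the incumbent best point. The improved variant is our assumption for the sake of a runnable sketch, not necessarily either of the paper's two algorithms:

```python
import random

def pure_random_search(f, bounds, n_samples, rng):
    """Baseline: sample uniformly over the whole box, keep the best point."""
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

def localized_random_search(f, bounds, n_samples, rng, shrink=0.997):
    """Illustrative improvement: re-center the sampling box on each new
    incumbent and shrink its width, so samples concentrate where the
    function values observed so far are low."""
    centers = [(lo + hi) / 2 for lo, hi in bounds]
    widths = [(hi - lo) / 2 for lo, hi in bounds]
    best_x, best_f = centers[:], f(centers)
    for _ in range(n_samples):
        x = [rng.uniform(c - w, c + w) for c, w in zip(centers, widths)]
        fx = f(x)
        if fx < best_f:
            best_x, best_f, centers = x, fx, x[:]
        widths = [w * shrink for w in widths]
    return best_x, best_f

shifted_sphere = lambda v: sum((t - 1.0) ** 2 for t in v)
rng = random.Random(0)
bounds = [(-5.0, 5.0)] * 3
_, f_pure = pure_random_search(shifted_sphere, bounds, 2000, rng)
_, f_loc = localized_random_search(shifted_sphere, bounds, 2000, rng)
```

With the same sampling budget, the localized variant typically reaches much lower objective values on smooth test problems, at the cost of weaker global coverage if the box shrinks too fast.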
Funding: This work is supported by the National Natural Science Foundation of China.
Abstract: In this paper we consider the global convergence of any conjugate gradient method of the form d_1 = -g_1, d_{k+1} = -g_{k+1} + β_k d_k (k ≥ 1), with any β_k satisfying some conditions and with the strong Wolfe line search conditions. Under a convexity assumption on the objective function, we prove the descent property and the global convergence of this method.
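The iteration in the abstract can be made concrete. In the sketch below, β_k is the classical Fletcher-Reeves choice, and an exact line search on a convex quadratic stands in for the strong Wolfe search (an assumption made so the example stays short and exactly verifiable; on a quadratic the exact step satisfies the Wolfe conditions trivially):

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matvec(A, v):
    return [dot(row, v) for row in A]

def cg_minimize(A, b, x0, iters=10):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    using d_1 = -g_1, d_{k+1} = -g_{k+1} + beta_k d_k with the
    Fletcher-Reeves beta_k = |g_{k+1}|^2 / |g_k|^2."""
    x = x0[:]
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]   # gradient: A x - b
    d = [-gi for gi in g]                               # d_1 = -g_1
    for _ in range(iters):
        Ad = matvec(A, d)
        denom = dot(d, Ad)
        if denom <= 1e-15:
            break
        alpha = -dot(g, d) / denom                      # exact step on the quadratic
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = [gi - bi for gi, bi in zip(matvec(A, x), b)]
        if dot(g_new, g_new) < 1e-20:
            g = g_new
            break
        beta = dot(g_new, g_new) / dot(g, g)            # Fletcher-Reeves
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]
b = [1.0, 2.0]
x = cg_minimize(A, b, [0.0, 0.0])
```

On this 2-by-2 system the method terminates at the exact minimizer (the solution of A x = b) in two steps, as conjugate gradient theory predicts for an n-dimensional quadratic.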
Abstract: In this paper, we extend a descent algorithm without line search for solving unconstrained optimization problems. Under mild conditions, its global convergence is established. Further, we generalize the search direction to a more general form and obtain the global convergence of the corresponding algorithm. Numerical results illustrate that the new algorithm is effective.
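The simplest instance of descent without line search is gradient descent with a predetermined constant step size; the paper's direction family is more general, so treat this only as an illustration of the idea that a fixed step below 2/L (for an L-smooth function) already guarantees descent at every iteration:

```python
def fixed_step_descent(grad, x0, alpha, iters):
    """Gradient descent with a predetermined constant step size --
    no line search is performed at any iteration."""
    x = x0[:]
    for _ in range(iters):
        g = grad(x)
        x = [xi - alpha * gi for xi, gi in zip(x, g)]
    return x

# f(x) = x1^2 + 2*x2^2 is L-smooth with L = 4, so any alpha < 2/L = 0.5
# yields a descent step; alpha = 0.2 is safely inside that range.
grad = lambda v: [2 * v[0], 4 * v[1]]
x = fixed_step_descent(grad, [5.0, -5.0], alpha=0.2, iters=200)
```

The appeal, as in the abstract, is that each iteration costs one gradient evaluation and no extra function evaluations for step-size trials.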
Abstract: In this paper we propose a new family of curve search methods for unconstrained optimization problems, which are based on searching for a new iterate along a curve through the current iterate at each iteration, whereas line search methods find a new iterate on a line starting from the current iterate. The global convergence and linear convergence rate of these curve search methods are investigated under mild conditions. Numerical results show that some curve search methods are stable and effective in solving large-scale minimization problems.
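A toy instance of the curve-search idea (our construction, not the paper's specific family): instead of minimizing along the ray x_k - a*g_k, candidates are sampled along the parabola x(a) = x_k - a*g_k + a^2*m_k through the current iterate, where m_k is a momentum-like term taken from the previous accepted step:

```python
def curve_search_step(f, grad, x, prev_step,
                      alphas=(0.001, 0.01, 0.05, 0.1, 0.3)):
    """One curve-search iteration: evaluate f at a few points along a
    parabolic curve through x and keep the best; return the new point
    and the step actually taken (the next curve's bending term)."""
    g = grad(x)
    best_x, best_f = x, f(x)
    for a in alphas:
        cand = [xi - a * gi + a * a * mi
                for xi, gi, mi in zip(x, g, prev_step)]
        fc = f(cand)
        if fc < best_f:
            best_x, best_f = cand, fc
    step = [bi - xi for bi, xi in zip(best_x, x)]
    return best_x, step

sphere = lambda v: sum(t * t for t in v)
sphere_grad = lambda v: [2 * t for t in v]
x, m = [3.0, -4.0], [0.0, 0.0]
for _ in range(100):
    x, m = curve_search_step(sphere, sphere_grad, x, m)
```

Since the current point is always among the candidates retained, the objective value never increases, and on this smooth test function the iterates shrink rapidly toward the minimizer.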