Journal Articles
2 articles found
1. LaNets: Hybrid Lagrange Neural Networks for Solving Partial Differential Equations
Authors: Ying Li, Longxiang Xu, Fangjun Mei, Shihui Ying. Computer Modeling in Engineering & Sciences (SCIE, EI), 2023, Issue 1, pp. 657-672 (16 pages)
We propose new hybrid Lagrange neural networks, called LaNets, to predict the numerical solutions of partial differential equations. That is, we embed Lagrange interpolation and small-sample learning into deep neural network frameworks. Concretely, we first perform Lagrange interpolation in front of the deep feedforward neural network. The Lagrange basis functions have a neat structure and strong expressive ability, making them well suited as a preprocessing tool for pre-fitting and feature extraction. Second, we introduce small-sample learning into training, which helps guide the model to correct itself quickly. Taking advantage of the theoretical support of traditional numerical methods and the efficient allocation of modern machine learning, LaNets achieve higher predictive accuracy than the state of the art. The stability and accuracy of the proposed algorithm are demonstrated through a series of classical numerical examples, including the one-dimensional Burgers equation, one-dimensional carburizing diffusion equations, the two-dimensional Helmholtz equation, and the two-dimensional Burgers equation. Experimental results validate the robustness, effectiveness, and flexibility of the proposed algorithm.
Keywords: hybrid Lagrange neural networks; interpolation polynomials; deep learning; numerical simulation; partial differential equations
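The abstract describes placing Lagrange interpolation in front of a feedforward network as a feature-extraction step. The sketch below (not the authors' code; the node placement and usage are illustrative assumptions) shows how the Lagrange basis values at an input point could serve as such preprocessed features:

```python
def lagrange_basis(x, nodes):
    """Return the Lagrange basis values l_i(x) for the given interpolation nodes."""
    basis = []
    for i, xi in enumerate(nodes):
        li = 1.0
        for j, xj in enumerate(nodes):
            if j != i:
                li *= (x - xj) / (xi - xj)  # product form of the i-th basis polynomial
        basis.append(li)
    return basis

# Illustrative choice: five equally spaced nodes on [-1, 1].
nodes = [-1.0, -0.5, 0.0, 0.5, 1.0]

# These values would be fed into the network in place of (or alongside) the raw input x.
features = lagrange_basis(0.3, nodes)
```

Because the Lagrange basis forms a partition of unity, the feature vector always sums to 1, which gives the "neat structure" the abstract alludes to.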
2. Novel Method to Handle Inequality Constraints for Nonlinear Programming
Author: Huang Yuancan (黄远灿). Journal of Beijing Institute of Technology (EI, CAS), 2005, Issue 2, pp. 145-149 (5 pages)
By redefining the multiplier associated with each inequality constraint as a positive definite function of the originally defined multiplier, say u_i^2, i = 1, 2, ..., m, the nonnegativity constraints imposed on the inequality multipliers in the Karush-Kuhn-Tucker necessary conditions are removed. To construct the Lagrange neural network and the Lagrange multiplier method, it is then no longer necessary to convert inequality constraints into equality constraints via slack variables in order to reuse results developed for equality constraints; those results carry over with only minor modification. Using this technique, a new type of Lagrange neural network and a new type of Lagrange multiplier method are devised, both of which handle inequality constraints directly. Their stability and convergence are also analyzed rigorously.
Keywords: nonlinear programming; inequality constraint; Lagrange neural network; Lagrange multiplier method; convergence; stability
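The key idea above, writing the inequality multiplier as u_i^2 so it stays nonnegative without an explicit constraint, can be illustrated on a toy problem with a simple saddle-point gradient flow. This is a minimal sketch under assumed dynamics (descent in x, ascent in u), not the paper's network: minimize f(x) = x^2 subject to g(x) = 1 - x <= 0, whose constrained minimizer is x = 1.

```python
def solve(x=0.0, u=1.0, dt=0.005, steps=40000):
    """Euler-integrate the gradient flow of L(x, u) = x**2 + u**2 * (1 - x).

    The multiplier enters as u**2, so no projection onto u >= 0 is needed.
    """
    for _ in range(steps):
        g = 1.0 - x
        dx = -(2.0 * x - u * u)  # descent on x:  -dL/dx
        du = 2.0 * u * g         # ascent on u:   +dL/du
        x += dt * dx
        u += dt * du
    return x, u

x_star, u_star = solve()
# The flow settles near the constrained minimizer x = 1, with the effective
# multiplier u**2 approaching the KKT multiplier (here 2, since 2x - u**2 = 0 at x = 1).
```

At the equilibrium, the stationarity condition 2x - u^2 = 0 together with g(x) = 0 recovers exactly the KKT conditions of the original problem, which is why the squared reparametrization can replace the slack-variable construction.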