Multilayer perceptron neural network activated by adaptive Gaussian radial basis function and its application to predict lid-driven cavity flow (Cited: 2)
Authors: Qinghua Jiang, Lailai Zhu, Chang Shu, Vinothkumar Sekar. Acta Mechanica Sinica (SCIE, EI, CAS, CSCD), 2021, No. 12, pp. 1757-1772 (16 pages).
To improve the performance of multilayer perceptron (MLP) neural networks activated by conventional activation functions, this paper presents a new MLP activated by univariate Gaussian radial basis functions (RBFs) with adaptive centers and widths, composed of more than one hidden layer. In each hidden layer of the RBF-activated MLP network (MLP-RBF), the outputs of the preceding layer are first linearly transformed and then fed into the univariate Gaussian RBF, which exploits the highly nonlinear property of the RBF. Adaptive RBFs might address the issues of saturated outputs, low sensitivity, and vanishing gradients in MLPs activated by other prevailing nonlinear functions. Finally, we apply four MLP networks with the rectified linear unit (ReLU), sigmoid function, hyperbolic tangent function (tanh), and Gaussian RBF as the activation functions to approximate the one-dimensional (1D) sinusoidal function, the analytical solution of the viscous Burgers' equation, and the two-dimensional (2D) steady lid-driven cavity flows. Using the same network structure, MLP-RBF generally predicts more accurately and converges faster than the other three MLPs. An MLP-RBF with fewer hidden layers and/or neurons per layer can yield comparable or even higher approximation accuracy than other MLPs equipped with more layers or neurons.
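The hidden-layer construction described in the abstract can be sketched in code. The following is a minimal, illustrative sketch, not the authors' implementation: it assumes the univariate Gaussian RBF takes the common form exp(-(z - c)^2 / (2 w^2)) applied element-wise to the linearly transformed inputs, with a per-neuron center c and width w that would be trained alongside the weights; all names below are hypothetical.

```python
import numpy as np

def mlp_rbf_layer(x, W, b, centers, widths):
    """One hidden layer: linear transform of the preceding layer's output,
    followed by an element-wise Gaussian RBF with adaptive center/width."""
    z = W @ x + b                                             # linear transformation
    return np.exp(-(z - centers) ** 2 / (2.0 * widths ** 2))  # univariate Gaussian RBF

# Toy usage: a 4-neuron layer fed by a 3-dimensional input.
rng = np.random.default_rng(0)
x = rng.standard_normal(3)
W = rng.standard_normal((4, 3))
b = np.zeros(4)
centers = np.zeros(4)   # trainable in the paper's setting; fixed here for illustration
widths = np.ones(4)     # likewise trainable
print(mlp_rbf_layer(x, W, b, centers, widths))
```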
Keywords: Multilayer perceptron neural network; Activation function; Radial basis function; Numerical approximation