In this study we investigate neural network solutions to nonlinear differential equations of Riccati type. We employ a feed-forward multilayer perceptron neural network (MLPNN) but avoid the standard back-propagation algorithm for updating the intrinsic weights. Our objective is to minimize an error that is a function of the network parameters, i.e., the weights and biases. Once the weights of the neural network are obtained by our systematic procedure, we need not adjust all the parameters in the network, as postulated by many researchers before us, in order to achieve convergence. We only need to fine-tune the biases, which are constrained to lie in a given range, and convergence to a solution with an acceptable minimum error is achieved. This greatly reduces the computational complexity of the problem. We provide two illustrative ODE examples: the first is a Riccati-type differential equation to which the procedure is applied, giving perfect agreement with the exact solution; the second yields only an acceptable approximation to the exact solution. Our novel artificial neural network procedure clearly demonstrates the function-approximation capabilities of ANNs in the solution of nonlinear differential equations of Riccati type.
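To make the general idea concrete, the following is a minimal sketch, not the authors' exact procedure: a one-hidden-layer MLP is embedded in a trial solution that satisfies the initial condition by construction, and the squared ODE residual at a set of collocation points is minimized over the weights and biases using a SciPy quasi-Newton routine with numerical gradients, so no back-propagation is used. The test equation y'(t) = 1 - y(t)^2 with y(0) = 0 (exact solution y = tanh t), the network width, the collocation grid, and the optimizer choice are all illustrative assumptions.

```python
# Sketch only: MLP trial solution for a Riccati-type ODE, y' = 1 - y^2, y(0) = 0.
# The residual error is minimized over all weights and biases without back-propagation.
import numpy as np
from scipy.optimize import minimize

H = 10                          # hidden-layer width (assumed, for illustration)
t = np.linspace(0.0, 2.0, 41)   # collocation points on the solution interval
y0 = 0.0                        # initial condition y(0)

def unpack(p):
    """Split the flat parameter vector into input weights, biases, output weights."""
    return p[:H], p[H:2*H], p[2*H:3*H]

def net(tt, p):
    """One-hidden-layer MLP with tanh activation, scalar input and output."""
    w1, b1, w2 = unpack(p)
    z = np.tanh(np.outer(tt, w1) + b1)   # hidden activations, shape (len(tt), H)
    return z @ w2                        # linear output layer

def trial(tt, p):
    """Trial solution y0 + t * N(t), which satisfies y(0) = y0 exactly."""
    return y0 + tt * net(tt, p)

def residual_error(p, h=1e-4):
    """Sum of squared ODE residuals y' - (1 - y^2) at the collocation points.
    The derivative of the trial solution is taken by central differences."""
    y = trial(t, p)
    dy = (trial(t + h, p) - trial(t - h, p)) / (2.0 * h)
    r = dy - (1.0 - y**2)
    return np.sum(r**2)

rng = np.random.default_rng(0)
p0 = 0.1 * rng.standard_normal(3 * H)    # small random initial weights and biases
# Quasi-Newton search with finite-difference gradients; no back-propagation involved.
result = minimize(residual_error, p0, method="BFGS")

y_hat = trial(t, result.x)
print("max |y_hat - tanh(t)| =", np.max(np.abs(y_hat - np.tanh(t))))
```

In the paper's procedure the weights are fixed by a systematic step and only the biases, restricted to a given range, are fine-tuned; in this generic sketch all parameters are optimized jointly, which is the main place the two differ.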