Abstract
Several learning algorithms have been developed in the literature for training the radial basis function network (RBFN). In this paper, a new neural network called the Hanman Entropy Network (HEN) is derived from the RBFN using information set theory, which represents the possibilistic uncertainty in attribute/property values, termed information source values. The parameters of both the HEN and the RBFN are learned with a new learning algorithm called JAYA, which solves constrained and unconstrained optimization problems and requires no algorithm-specific parameters. The HEN is shown to outperform the RBFN on four datasets. The advantage of the HEN is that it can use both the information source values and their membership values in several ways, whereas the RBFN uses only the membership function values.
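For readers unfamiliar with JAYA, the sketch below illustrates the general JAYA update rule (each candidate solution moves toward the current best solution and away from the worst, with only random coefficients and no algorithm-specific tuning parameters). This is a minimal illustration of the generic algorithm, not the paper's exact network-training procedure; the function name, population size, and iteration count are illustrative assumptions.

```python
import numpy as np

def jaya_minimize(objective, bounds, pop_size=20, iterations=100, seed=0):
    """Minimal JAYA sketch: attraction to the best candidate, repulsion
    from the worst, with greedy acceptance of improved solutions."""
    rng = np.random.default_rng(seed)
    low, high = bounds
    dim = low.shape[0]
    pop = rng.uniform(low, high, size=(pop_size, dim))
    fitness = np.apply_along_axis(objective, 1, pop)

    for _ in range(iterations):
        best = pop[np.argmin(fitness)]
        worst = pop[np.argmax(fitness)]
        r1 = rng.random((pop_size, dim))
        r2 = rng.random((pop_size, dim))
        # Core JAYA update: move toward best, away from worst.
        candidates = pop + r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop))
        candidates = np.clip(candidates, low, high)
        cand_fitness = np.apply_along_axis(objective, 1, candidates)
        improved = cand_fitness < fitness  # keep only improving moves
        pop[improved] = candidates[improved]
        fitness[improved] = cand_fitness[improved]

    return pop[np.argmin(fitness)], fitness.min()

# Example: minimize the sphere function in 5 dimensions.
best_x, best_f = jaya_minimize(lambda x: np.sum(x**2),
                               (np.full(5, -10.0), np.full(5, 10.0)))
```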