Funding: This work was supported by the National Natural Science Foundation of China (Grant No. NSFC69805002), the Natural Science Foundation of Zhejiang Province, the Ministry of Education Grant for Excellent Youth, and the Ningbo Science Foundation for Youth (G
Abstract: Stochastic neural networks are usually built by introducing random fluctuations into the network. A natural alternative is to use stochastic connections rather than stochastic activation functions. We propose a new model in which each neuron has very simple functionality but all the connections are stochastic. It is shown that the stationary distribution of the network exists, is unique, and is approximately a Boltzmann-Gibbs distribution. The relationship between the model and Markov random fields is discussed. New techniques for implementing simulated annealing and Boltzmann learning are proposed. Simulation results on the graph bisection problem and on image recognition show that the network is powerful enough to solve real-world problems.
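The abstract does not give the network's update rule, energy function, or annealing schedule, so the following is only a minimal illustrative sketch of the general idea: simple threshold units communicating through unreliable (stochastic) connections, with annealing realized by gradually raising connection reliability, applied to a toy graph bisection instance. All names (stochastic_step, anneal, the reliability parameter p) and the balance-penalty encoding are assumptions made for illustration, not the authors' algorithm.

    import numpy as np

    # Illustrative sketch only (not the paper's algorithm): binary units with
    # stochastic connections. At each update, every incoming weight is
    # transmitted only with probability p, so the unit sees a random subsample
    # of its deterministic input and applies a plain threshold. "Annealing" is
    # modeled here by sweeping p from unreliable to reliable.

    rng = np.random.default_rng(0)

    def energy(W, s):
        """Hopfield/Boltzmann-style energy E(s) = -1/2 s^T W s."""
        return -0.5 * s @ W @ s

    def stochastic_step(W, s, p):
        """Update one randomly chosen unit through stochastic connections."""
        i = rng.integers(len(s))
        mask = rng.random(len(s)) < p          # each connection fires with prob. p
        field = (W[i] * mask) @ s              # input seen through surviving links
        s[i] = 1 if field >= 0 else -1         # simple deterministic threshold unit
        return s

    def anneal(W, steps=20000, p0=0.05, p1=0.95):
        """Crude annealing by sweeping connection reliability p (assumed scheme)."""
        s = rng.choice([-1, 1], size=W.shape[0])
        for t in range(steps):
            p = p0 + (p1 - p0) * t / steps     # noisy early, near-deterministic late
            s = stochastic_step(W, s, p)
        return s

    # Toy graph-bisection instance: adjacency terms reward putting connected
    # vertices on the same side; a weak global penalty (a common formulation,
    # assumed here) pushes the two sides toward equal size.
    n = 20
    A = (rng.random((n, n)) < 0.2).astype(float)
    A = np.triu(A, 1); A = A + A.T             # random undirected adjacency
    W = A - 0.1 * (1 - np.eye(n))              # cut reward vs. balance penalty

    s = anneal(W)
    cut = np.sum(A[s[:, None] != s[None, :]]) / 2
    print(f"sizes {np.sum(s == 1)}/{np.sum(s == -1)}, cut edges {cut:.0f}, "
          f"energy {energy(W, s):.2f}")

The sketch keeps the division of labor described in the abstract: the neuron itself is trivial (a threshold), and all randomness lives in the connections; how the paper actually relates this dynamics to a Boltzmann-Gibbs stationary distribution, or implements Boltzmann learning, is not reproduced here.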