Abstract
A deep belief network (DBN) is formed by stacking restricted Boltzmann machines (RBMs). A standard RBM accepts only binary input, which causes information loss. To address this problem, the visible-layer nodes are replaced with real-valued nodes carrying Gaussian noise, and the sigmoid nodes of the hidden layer are replaced with rectified linear nodes. Rectified linear units exhibit good sparsity and can markedly improve network performance. The DBN is trained bottom-up, layer by layer, to initialize the network parameters. Classification performance is compared with a traditional DBN and a BP neural network on natural image datasets. The experimental results show that the improved DBN achieves both higher average image classification accuracy and better time complexity.
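The approach described above amounts to an RBM with Gaussian (real-valued) visible units and noisy rectified linear hidden units, pre-trained greedily layer by layer and stacked into a DBN. The following is a minimal NumPy sketch of that idea; it is not the authors' code, and the class name GaussReluRBM, the helper pretrain_dbn, the learning rate, and the layer sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class GaussReluRBM:
    """RBM with Gaussian (real-valued) visible units and noisy rectified
    linear hidden units, trained by one-step contrastive divergence (CD-1)."""

    def __init__(self, n_visible, n_hidden, lr=1e-3):
        self.W = 0.01 * rng.standard_normal((n_hidden, n_visible))
        self.b = np.zeros(n_visible)   # visible biases
        self.c = np.zeros(n_hidden)    # hidden biases
        self.lr = lr

    def hidden_pre(self, v):
        # Pre-activation of the hidden layer for a batch of visible vectors.
        return v @ self.W.T + self.c

    def sample_hidden(self, v):
        # Noisy rectified linear units: max(0, x + N(0, sigmoid(x))).
        x = self.hidden_pre(v)
        sigma = np.sqrt(1.0 / (1.0 + np.exp(-x)))
        return np.maximum(0.0, x + sigma * rng.standard_normal(x.shape))

    def reconstruct_visible(self, h):
        # Gaussian visible units with unit variance (data assumed standardized):
        # use the mean of the conditional Gaussian as the reconstruction.
        return h @ self.W + self.b

    def cd1_step(self, v0):
        h0 = self.sample_hidden(v0)                  # positive phase
        v1 = self.reconstruct_visible(h0)            # negative phase (one step)
        h1 = np.maximum(0.0, self.hidden_pre(v1))    # mean activation, no noise
        n = v0.shape[0]
        self.W += self.lr * (h0.T @ v0 - h1.T @ v1) / n
        self.b += self.lr * (v0 - v1).mean(axis=0)
        self.c += self.lr * (h0 - h1).mean(axis=0)
        return float(np.mean((v0 - v1) ** 2))        # reconstruction error


def pretrain_dbn(data, layer_sizes, epochs=5, batch=64):
    # Greedy bottom-up pre-training: each RBM is trained on the activations
    # produced by the layer below, which initializes the DBN's parameters.
    rbms, x = [], data
    for n_hidden in layer_sizes:
        rbm = GaussReluRBM(x.shape[1], n_hidden)
        for _ in range(epochs):
            for i in range(0, len(x), batch):
                rbm.cd1_step(x[i:i + batch])
        x = np.maximum(0.0, rbm.hidden_pre(x))       # propagate features upward
        rbms.append(rbm)
    return rbms


# Toy usage on random, standardized "image" vectors (illustrative only).
X = rng.standard_normal((256, 64))
dbn_layers = pretrain_dbn(X, layer_sizes=[128, 64], epochs=2)
```

In this sketch the real-valued visible layer avoids binarizing the image data, while the rectified linear hidden units keep only a sparse subset of features active, which is the property the abstract credits for the performance gain.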
Source
《计算机应用与软件》
CSCD
2016, No. 9, pp. 221-223, 244 (4 pages)
Computer Applications and Software
Keywords
Deep belief network; Restricted Boltzmann machine; Rectified linear unit