Abstract
The rectified linear unit (ReLU) completely discards negative activations, which often carry useful information. Building on the parametric rectified linear unit (PReLU) and the exponential linear unit (ELU), this paper proposes a novel parametric activation function called the power linear unit (PoLU). PoLU applies a signed power non-linear transformation to negative activations, with the parameter of the power function learned adaptively during CNN training, while leaving positive activations unchanged, as ReLU does. PoLU can be implemented efficiently and flexibly integrated into various convolutional neural network architectures. Experimental results on the widely used CIFAR-10/100 benchmarks show that PoLU outperforms ReLU and its counterpart activation functions.
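To make the described behavior concrete, below is a minimal PyTorch-style sketch of such an activation. The exact formulation is not given in this abstract; the sketch assumes the form f(x) = x for x ≥ 0 and f(x) = -(-x)^a for x < 0 with a learnable exponent a, and the module name `PoLU` and the initial value of `a` are illustrative assumptions, not the authors' reference code.

```python
import torch
import torch.nn as nn

class PoLU(nn.Module):
    """Sketch of a power linear unit: identity for x >= 0, signed power -(-x)**a for x < 0.

    The exponent `a` is a learnable parameter updated by backpropagation,
    mirroring the abstract's description of an adaptively learned power function.
    Assumed reconstruction for illustration only.
    """

    def __init__(self, init_a: float = 1.0):
        super().__init__()
        # A single exponent shared across channels; a per-channel variant is also possible.
        self.a = nn.Parameter(torch.tensor(init_a))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Positive part is passed through unchanged (as in ReLU).
        pos = torch.relu(x)
        # Negative part: signed power -(-x)**a, applied only where x < 0.
        neg = -torch.pow(torch.relu(-x) + 1e-12, self.a)
        return torch.where(x >= 0, pos, neg)
```

Used as a drop-in replacement for ReLU, e.g. `nn.Sequential(nn.Conv2d(3, 64, 3), PoLU())`, the exponent is trained jointly with the convolution weights.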
Authors
Luo Xunhao; Li Peihua (Faculty of Electronic Information & Electrical Engineering, Dalian University of Technology, Dalian, Liaoning 116024, China)
Source
Application Research of Computers (《计算机应用研究》)
Indexed in CSCD and the Peking University core journal list (北大核心)
2019, No. 10, pp. 3145-3147, 3178 (4 pages in total)
Funding
National Natural Science Foundation of China (61471082)
Keywords
power linear unit (PoLU)
parametric activation function
convolutional neural network (CNN)