Abstract
In this paper, we introduce a novel approach to automatically regulating receptive fields in deep image parsing networks. Unlike previous work, which relied on manually selected dilated convolutional kernels to obtain better receptive fields, our approach inserts two affine transformation layers into the network's backbone that operate directly on feature maps. The new layers inflate or shrink the feature maps, thereby changing the receptive fields of the subsequent layers. Through end-to-end training, the whole framework is data-driven and requires no laborious manual intervention. The proposed method is generic across datasets and tasks. We have conducted extensive experiments on both general image parsing and face parsing as concrete examples, demonstrating that the learned regulation outperforms manually designed receptive fields.
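The abstract describes the mechanism only at a high level, so the following is a minimal sketch of what such a learnable feature-map "zoom" layer could look like, assuming a PyTorch backbone. The class name ZoomLayer, the single isotropic scale parameter, and the use of affine_grid/grid_sample are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ZoomLayer(nn.Module):
    """Rescales feature-map content by a learnable factor via a
    differentiable affine warp, so subsequent convolutions see an
    effectively larger or smaller receptive field.
    (Sketch based on the abstract; not the authors' released code.)"""

    def __init__(self, init_scale: float = 1.0):
        super().__init__()
        # Learnable zoom factor, optimized end-to-end with the backbone.
        self.scale = nn.Parameter(torch.tensor(float(init_scale)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n = x.size(0)
        # Build a 2x3 affine matrix for pure isotropic scaling.
        # grid_sample reads the input at the grid coordinates, so sampling
        # with 1/scale magnifies the content by `scale`.
        inv = 1.0 / self.scale
        zero = torch.zeros_like(inv)
        theta = torch.stack([
            torch.stack([inv, zero, zero]),
            torch.stack([zero, inv, zero]),
        ]).unsqueeze(0).expand(n, -1, -1)
        grid = F.affine_grid(theta, list(x.size()), align_corners=False)
        # Note: this sketch keeps the output spatial size fixed and zooms the
        # content; the paper's layer may instead change the map size itself.
        return F.grid_sample(x, grid, align_corners=False)

# Illustrative usage: zoom features before a convolutional block.
layer = ZoomLayer(init_scale=1.0)
feats = torch.randn(2, 64, 32, 32)
out = layer(feats)  # same shape; content magnified by the learned scale
```

A second such layer could later undo the scaling so the output resolution matches the parsing head, but whether the paper arranges its two affine layers this way is not stated in the abstract.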
Funding
Supported by the National Natural Science Foundation of China (Nos. U1536203, 61572493)
the Cutting Edge Technology Research Program of the Institute of Information Engineering, CAS (No. Y7Z0241102)
the Key Laboratory of Intelligent Perception and Systems for High-Dimensional Information of the Ministry of Education (No. Y6Z0021102)
Nanjing University of Science and Technology (No. JYB201702)