Abstract
This paper proposes a stable method for generating stereoscopic panoramic video in the omnidirectional stereo (ODS) format. Unlike traditional image stitching, which can only produce a monocular panorama, we adopt an optical flow-based blending method to create two panoramas for binocular vision. In addition, traditional seam-finding-based stitching methods tend to cause temporal flicker. We address this problem by constraining the optical flow field of each new frame with the optical flow field of its previous frame, so that the generated video is stable and free of temporal flicker. Our approach comprises four key operations. First, we adopt the ODS format, which is the basis of the stereoscopic panorama. Second, we perform effective exposure compensation so that the brightness of the two eyes' panoramas is consistent. Third, we employ an optical flow-based blending method to synthesize the final panorama effectively. Fourth, we use the previous frame's optical flow field to constrain the present frame's optical flow field and thereby obtain a stable video. The resulting videos deliver a pleasant and impressive stereoscopic viewing experience when watched in a virtual reality headset.
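The two central ideas of the abstract, flow-based blending in the overlap region and restricting the current frame's flow field with the previous frame's, can be sketched roughly as follows. This is a minimal illustration, not the paper's actual implementation: the Farneback flow estimator, the linear smoothing weight `alpha`, and the cross-fade parameter `t` are all assumptions introduced here for demonstration.

```python
import cv2
import numpy as np


def temporally_restricted_flow(prev_flow, img_a, img_b, alpha=0.7):
    """Estimate dense flow between two overlapping views and blend it with the
    previous frame's flow to suppress temporal flicker.

    `alpha` is a hypothetical smoothing weight, not a value from the paper.
    """
    gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(
        gray_a, gray_b, None, 0.5, 3, 15, 3, 5, 1.2, 0
    )
    if prev_flow is not None:
        # Restrict the new flow field with the previous frame's flow field.
        flow = alpha * prev_flow + (1.0 - alpha) * flow
    return flow.astype(np.float32)


def flow_blend(img_a, img_b, flow, t=0.5):
    """Warp img_a a fraction `t` of the way along the flow toward img_b,
    then cross-fade the two images to obtain a seamless blend."""
    h, w = flow.shape[:2]
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + t * flow[..., 0]).astype(np.float32)
    map_y = (grid_y + t * flow[..., 1]).astype(np.float32)
    warped_a = cv2.remap(img_a, map_x, map_y, cv2.INTER_LINEAR)
    return cv2.addWeighted(warped_a, 1.0 - t, img_b, t, 0.0)
```

In a per-frame loop, the flow returned for frame n would be passed back in as `prev_flow` for frame n + 1, which is what keeps consecutive blended panoramas temporally coherent.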
Authors
Zhuo Tan
Shenghao Zhang
Ronggang Wang
(School of Electronic and Computer Engineering, Peking University Shenzhen Graduate School, Shenzhen 518055, People's Republic of China)
Funding
This work was supported by the National Natural Science Foundation of China (61672063), the Shenzhen Peacock Plan, and Shenzhen Research Projects JCYJ20160506172227337 and GGFW2017041215130858.