
SG-NeRF: Sparse-Input Generalized Neural Radiance Fields for Novel View Synthesis

Abstract: Traditional neural radiance fields for rendering novel views require dense input images and per-scene optimization, which limits their practical applications. We propose SG-NeRF (Sparse-Input Generalized Neural Radiance Fields), a generalization method that infers scenes from input images and performs high-quality rendering without per-scene optimization. First, we construct an improved multi-view stereo structure based on convolutional attention and a multi-level fusion mechanism to obtain the geometric and appearance features of the scene from the sparse input images; these features are then aggregated by multi-head attention as the input of the neural radiance fields. This strategy of using neural radiance fields to decode scene features, instead of mapping positions and orientations, enables our method to perform cross-scene training and inference, allowing neural radiance fields to generalize to novel view synthesis on unseen scenes. We tested the generalization ability on the DTU dataset, and our PSNR (peak signal-to-noise ratio) improved by 3.14 over the baseline method under the same input conditions. In addition, if dense input views are available for a scene, the average PSNR can be improved by a further 1.04 through a short period of refinement training, yielding higher-quality renderings.
Authors: Kuo Xu (徐阔), Jie Li (李颉), Zhen-Qiang Li (李振强), Yang-Jie Cao (曹仰杰)
Affiliations: School of Cyber Science and Engineering, Zhengzhou University, Zhengzhou 450002, China; Intelligent Big Data System (iBDSys) Lab, Shanghai Jiao Tong University, Shanghai 200240, China; School of Computer and Artificial Intelligence, Zhengzhou University, Zhengzhou 450001, China
Source: Journal of Computer Science & Technology (SCIE, EI, CSCD), 2024, No. 4, pp. 785-797 (13 pages)
Funding: Supported by the Zhengzhou Collaborative Innovation Major Project under Grant No. 20XTZX06013 and the Henan Provincial Key Scientific Research Project of China under Grant No. 22A520042.
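
The abstract describes fusing per-view geometric and appearance features with multi-head attention and then letting the radiance field decode the fused scene feature rather than a raw position/direction encoding. The following is a minimal, hypothetical sketch of that aggregation-and-decoding step, not the authors' implementation; the module names, feature sizes, and the learnable-query pooling design are all assumptions made for illustration.

```python
# Hypothetical sketch (not the SG-NeRF code): per-view features sampled at each 3D
# point are pooled with multi-head attention, and the pooled feature is decoded into
# density and colour by a small MLP.

import torch
import torch.nn as nn


class FeatureAggregatingNeRF(nn.Module):
    def __init__(self, feat_dim: int = 64, num_heads: int = 4, hidden: int = 128):
        super().__init__()
        # Multi-head attention over the N source views observing each sample point.
        self.attn = nn.MultiheadAttention(feat_dim, num_heads, batch_first=True)
        # Learnable query that pools the per-view tokens into one scene feature.
        self.query = nn.Parameter(torch.randn(1, 1, feat_dim))
        # MLP decoding the aggregated feature into density and RGB.
        self.decoder = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),  # (sigma, r, g, b)
        )

    def forward(self, view_feats: torch.Tensor):
        # view_feats: (num_points, num_views, feat_dim), e.g. features gathered from
        # the multi-view stereo backbone at each sampled 3D point.
        q = self.query.expand(view_feats.shape[0], -1, -1)
        fused, _ = self.attn(q, view_feats, view_feats)   # (num_points, 1, feat_dim)
        out = self.decoder(fused.squeeze(1))               # (num_points, 4)
        sigma = torch.relu(out[:, :1])                     # non-negative density
        rgb = torch.sigmoid(out[:, 1:])                    # colours in [0, 1]
        return sigma, rgb


# Usage example: 1024 sample points, each observed in 3 sparse input views.
model = FeatureAggregatingNeRF()
sigma, rgb = model(torch.randn(1024, 3, 64))
print(sigma.shape, rgb.shape)  # torch.Size([1024, 1]) torch.Size([1024, 3])
```

Because the decoder conditions only on pooled image features rather than on scene-specific coordinate encodings, the same weights can in principle be applied to unseen scenes, which is the generalization property the abstract claims.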