
G-Buffer Based Deep Learning Anti-Aliasing Algorithm
(Original title: 基于G-Buffer的深度学习反走样算法)
Abstract: With the development of deep learning technology, researchers have in recent years proposed many deep-learning-based anti-aliasing algorithms to improve the quality of graphics rendering. However, existing deep learning anti-aliasing algorithms do not effectively exploit the advantages of graphics rendering. To address this problem, a G-Buffer based deep learning anti-aliasing algorithm is proposed that combines the G-Buffer information produced during the rendering process, the idea of temporal anti-aliasing, and a recurrent neural network. Experimental results show that the algorithm effectively removes aliasing in high-frequency regions such as geometric edges and achieves results comparable to super-sampling anti-aliasing.
Author: ZHANG Jing-yi (张靖仪), College of Computer Science, Sichuan University, Chengdu 610065
Source: Modern Computer (《现代计算机》), 2020, No. 36, pp. 73-76
Keywords: Aliasing; Temporal Anti-Aliasing; Recurrent Neural Network; G-Buffer
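
The abstract only names the ingredients (G-Buffer channels, the temporal anti-aliasing idea, and a recurrent neural network); the paper's actual network architecture is not given in this record. The following is a minimal illustrative sketch in PyTorch of how such inputs might be combined: aliased color, G-Buffer normals, depth and motion vectors, plus a motion-reprojected previous output, feed a simplified ConvGRU-style recurrent block. All layer sizes, channel layouts, and the normalized-coordinate motion convention are assumptions, not the paper's method.

```python
# Illustrative sketch only (not the paper's network): a convolutional recurrent
# anti-aliasing model that consumes G-Buffer channels and a temporally
# reprojected history frame. Channel counts and layer choices are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GBufferRecurrentAA(nn.Module):
    def __init__(self, hidden_channels=32):
        super().__init__()
        # Per-pixel inputs: aliased color (3), normal (3), depth (1),
        # motion vector (2), and the reprojected previous output (3).
        in_channels = 3 + 3 + 1 + 2 + 3
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, hidden_channels, 3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Simplified ConvGRU-style update (update gate only, no reset gate).
        self.update_gate = nn.Conv2d(hidden_channels * 2, hidden_channels, 3, padding=1)
        self.candidate = nn.Conv2d(hidden_channels * 2, hidden_channels, 3, padding=1)
        self.decoder = nn.Conv2d(hidden_channels, 3, 3, padding=1)

    def warp(self, prev_output, motion):
        # Backward-warp the previous frame with the motion vectors, mirroring
        # the history reuse in temporal anti-aliasing. Motion is assumed to be
        # expressed in normalized [-1, 1] screen coordinates.
        n, _, h, w = prev_output.shape
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h, device=prev_output.device),
            torch.linspace(-1, 1, w, device=prev_output.device),
            indexing="ij",
        )
        grid = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(n, -1, -1, -1)
        offset = motion.permute(0, 2, 3, 1)  # (N, H, W, 2)
        return F.grid_sample(prev_output, grid - offset, align_corners=True)

    def forward(self, color, normal, depth, motion, prev_output, hidden):
        history = self.warp(prev_output, motion)
        x = torch.cat([color, normal, depth, motion, history], dim=1)
        feat = self.encoder(x)
        z = torch.sigmoid(self.update_gate(torch.cat([feat, hidden], dim=1)))
        h_tilde = torch.tanh(self.candidate(torch.cat([feat, hidden], dim=1)))
        hidden = (1 - z) * hidden + z * h_tilde
        return self.decoder(hidden), hidden
```

In such a setup, the hidden state would be initialized to zeros and carried from frame to frame, with the previous anti-aliased output fed back as history; the recurrent state then plays roughly the role that the accumulated history buffer plays in conventional temporal anti-aliasing.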
