Funding: National Key Research and Development Project of China (2018YFB1004904) and National Natural Science Foundation of China (U1909204).
Abstract: Background: Real-time 3D rendering and interaction are important for virtual reality (VR) experimental education. Unfortunately, standard end-computing methods prohibitively escalate computational costs, so reducing or distributing these requirements needs urgent attention, especially in light of the COVID-19 pandemic. Methods: In this study, we design a cloud-to-end rendering and storage system for VR experimental education comprising two model types: background and interactive. The cloud server renders the background models and sends the results to an end terminal as a video stream. Interactive models are then lightweight-rendered and blended at the end terminal. An improved 3D warping and hole-filling algorithm is also proposed to improve image quality when the user's viewpoint changes. Results: We build three scenes to test image quality and network latency. The results show that our system renders 3D experimental education scenes with higher image quality and lower latency than other cloud rendering systems. Conclusions: Our study is the first to use cloud and lightweight rendering for VR experimental education. The results demonstrate that our system provides a good rendering experience without excessive computational cost.
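The abstract does not give implementation details, but the two per-frame steps it names, warping the decoded cloud-rendered background to the user's new viewpoint and blending it with the locally rendered interactive models, can be sketched with standard baseline techniques. The following is a minimal illustration only, not the paper's improved algorithm; all function names, parameters, the pinhole camera model, and the data layout are assumptions made for this sketch.

```python
# Minimal sketch of (1) depth-image-based 3D warping of the decoded cloud
# background with a placeholder hole fill, and (2) depth compositing with the
# locally rendered interactive layer. Illustrative assumptions throughout.
import numpy as np

def warp_frame(color, depth, K, pose_ref, pose_tgt):
    """Reproject a reference color/depth frame into a target viewpoint.

    color: (H, W, 3) uint8, depth: (H, W) metric depth,
    K: (3, 3) intrinsics, pose_*: (4, 4) camera-to-world matrices.
    Returns the warped image and a boolean mask of disocclusion holes.
    """
    H, W = depth.shape
    v, u = np.mgrid[0:H, 0:W]
    pix = np.stack([u, v, np.ones_like(u)], -1).reshape(-1, 3).T.astype(float)

    # Back-project reference pixels to camera space, then to world space.
    cam_ref = (np.linalg.inv(K) @ pix) * depth.reshape(1, -1)
    world = pose_ref[:3, :3] @ cam_ref + pose_ref[:3, 3:4]

    # Project the world points into the target camera.
    tgt = np.linalg.inv(pose_tgt)
    cam_tgt = tgt[:3, :3] @ world + tgt[:3, 3:4]
    z = cam_tgt[2]
    z_safe = np.where(z > 1e-6, z, 1e-6)          # avoid divide-by-zero
    proj = K @ cam_tgt
    u_t = np.round(proj[0] / z_safe).astype(int)
    v_t = np.round(proj[1] / z_safe).astype(int)

    out = np.zeros_like(color)
    zbuf = np.full((H, W), np.inf)
    valid = (z > 1e-6) & (u_t >= 0) & (u_t < W) & (v_t >= 0) & (v_t < H)
    src = color.reshape(-1, 3)
    for i in np.flatnonzero(valid):               # z-buffered splat (clear, not fast)
        if z[i] < zbuf[v_t[i], u_t[i]]:
            zbuf[v_t[i], u_t[i]] = z[i]
            out[v_t[i], u_t[i]] = src[i]
    return out, np.isinf(zbuf)

def fill_holes(img, holes):
    """Placeholder hole filling: copy the nearest valid pixel to the left."""
    out = img.copy()
    for y, x in zip(*np.nonzero(holes)):
        if x > 0:
            out[y, x] = out[y, x - 1]
    return out

def composite(background, bg_depth, interactive, int_depth):
    """Depth-composite the warped cloud background with the locally rendered
    interactive layer; the nearer fragment wins per pixel."""
    use_local = int_depth < bg_depth
    return np.where(use_local[..., None], interactive, background)
```

In a deployed system the per-pixel splat loop would run on the GPU and the placeholder left-neighbor fill would be replaced by the improved hole-filling step the paper proposes.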