The Camera Offset Space: Real-time Potentially Visible Set Computations for Streaming Rendering


IT / Software / Bioinformatics

Ref.-Nr.: 1200-5472-BC

The presented technology offers an enhanced streaming rendering pipeline that enables the display of high-fidelity, high-framerate, and high-resolution novel views in real time on thin, lightweight HMDs, making it especially suitable for VR experiences.

How does it work?

The key idea of this technology is a novel algorithm for Potentially Visible Set (PVS) creation: the camera offset space (COS), which, for a given point on the image plane, provides the information under which camera offsets the point is covered by a given polygon, e.g., a triangle. By constructing the COS for every pixel location of a given frame, a complete potentially visible set for any translation and rotation within a given view cell can be computed.
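
To make the idea concrete, the following C++ sketch approximates the COS of a single image-plane point by brute force: it samples camera offsets on a grid inside a cubic view cell and tests, with standard ray-triangle intersection, under which offsets the point's view ray is covered by a triangle. This is an illustrative sampled stand-in under our own assumptions (names, scene setup, sampling), not the analytic COS construction of the invention, which derives this coverage directly in the offset space.

#include <cstdio>
#include <vector>

struct Vec3 { double x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Triangle { Vec3 v0, v1, v2; };

// Standard Moeller-Trumbore ray-triangle intersection test.
static bool rayHitsTriangle(Vec3 orig, Vec3 dir, const Triangle& t) {
    const double eps = 1e-9;
    Vec3 e1 = sub(t.v1, t.v0), e2 = sub(t.v2, t.v0);
    Vec3 p = cross(dir, e2);
    double det = dot(e1, p);
    if (det > -eps && det < eps) return false;   // ray parallel to triangle plane
    double inv = 1.0 / det;
    Vec3 s = sub(orig, t.v0);
    double u = dot(s, p) * inv;
    if (u < 0.0 || u > 1.0) return false;
    Vec3 q = cross(s, e1);
    double v = dot(dir, q) * inv;
    if (v < 0.0 || u + v > 1.0) return false;
    return dot(e2, q) * inv > eps;               // hit must lie in front of the origin
}

// Sampled stand-in for the COS of one pixel: collect the camera offsets
// (translations inside a cubic view cell of half-size cellHalf) under which
// the pixel's view ray is covered by the triangle.
static std::vector<Vec3> coveredOffsets(Vec3 pixelDir, const Triangle& tri,
                                        double cellHalf, int n) {
    std::vector<Vec3> hits;
    double step = 2.0 * cellHalf / (n - 1);
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < n; ++j)
            for (int k = 0; k < n; ++k) {
                Vec3 o = {-cellHalf + i * step, -cellHalf + j * step,
                          -cellHalf + k * step};
                if (rayHitsTriangle(o, pixelDir, tri)) hits.push_back(o);
            }
    return hits;
}

int main() {
    Triangle tri{{-1, -1, 5}, {1, -1, 5}, {0, 1, 5}};  // triangle 5 units ahead
    Vec3 dir{0, 0, 1};                                 // pixel ray along +z
    std::vector<Vec3> hits = coveredOffsets(dir, tri, 0.5, 8);  // 8^3 = 512 samples
    std::printf("triangle covers the pixel under %zu of 512 sampled offsets\n",
                hits.size());
    return 0;
}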

The client’s load is greatly reduced by sending only the triangles that will potentially be needed for rendering, while discarding all other data entirely. Sending only the pre-shaded PVS geometry to the client results in a compact, lightweight package well suited for streaming, from which even a thin, untethered client device (e.g., a smartphone in an HMD) can render novel views at very high frame rates and resolutions.
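
Since this text does not specify the wire format, the following C++ sketch only illustrates the packaging step under an assumed data layout: pre-shaded triangles (vertex positions plus baked colors) whose IDs appear in the PVS are copied into a compact buffer for streaming; everything else is dropped.

#include <cstddef>
#include <cstdint>
#include <unordered_set>
#include <vector>

// Assumed pre-shaded layout: vertex position plus a baked RGB color, so the
// client can rasterize without re-shading.
struct ShadedVertex { float pos[3]; std::uint8_t rgb[3]; };
struct ShadedTriangle { ShadedVertex v[3]; };

// Copy only the triangles the PVS computation marked as potentially visible;
// all other geometry is discarded before transmission.
std::vector<ShadedTriangle> packPvs(const std::vector<ShadedTriangle>& scene,
                                    const std::unordered_set<std::uint32_t>& pvs) {
    std::vector<ShadedTriangle> packet;
    packet.reserve(pvs.size());
    for (std::size_t id = 0; id < scene.size(); ++id)
        if (pvs.count(static_cast<std::uint32_t>(id)) != 0)
            packet.push_back(scene[id]);
    return packet;
}

On the client side, the received packet can be rasterized directly, since shading is already baked into the vertices.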

Background

In real-time graphics, we face an ever-increasing demand for more processing power, as the industry aims to provide photo-realistic real-time 3D visuals, pushing current graphics processing units (GPUs) to the limit. The rising popularity of thin, lightweight, untethered devices, like head-mounted displays (HMDs) for virtual reality (VR) applications or smartphones and tablets for gaming, adds further constraints. These devices operate with limited resources (battery power, on-device memory, processing power) in comparison to workstation- and server-grade hardware.

To compensate for the low compute power of untethered HMDs, they can be paired with a remote server, e.g., a cloud instance or a Wi-Fi-connected desktop computer. However, Wi-Fi connections increase the rendering latency by up to 100 milliseconds, and wide-area connections by up to 200 milliseconds.

To deal with these problems, inventors at the Max Planck Institute for Informatics in Saarbrücken, Germany, have proposed a new approach to streaming rendering pipelines in which the 3D data is rendered on the client. It solves the question of how to obtain the potentially visible geometry, i.e., the geometry that may become visible on the client under any camera offset within a given range of supported offsets.

Technology

Here we present an enhanced streaming rendering pipeline (Fig. 1), enabling the display of high-fidelity, high-framerate, and high-resolution novel views on thin, lightweight HMDs in real time.

Fig. 1: Our enhanced streaming rendering pipeline.

A full PVS for a 360° cuboid view-cell region can furthermore be constructed by building the COS for four to six views around the camera position (Fig. 2).

Fig. 2: A bird's-eye view of a full 360° PVS computed for a region around the camera from four reference views (in the corners) in an outdoor scene. Geometry labeled as invisible is shown in magenta; the red dot marks the camera position in the scene. Each reference view contains a corner inset depicting its corresponding PVS, starting from the top right and going clockwise: North, East, South, and West. The final PVS is constructed by joining these four sets. Novel views rendered with the PVS for any view around the reference location are complete and hole-free.
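
Assuming each directional reference view yields its own set of potentially visible triangle IDs, joining them into the final PVS is a plain set union. A minimal C++ sketch, again with our own naming:

#include <cstdint>
#include <unordered_set>
#include <vector>

// Merge the per-reference-view PVS sets (e.g., North, East, South, West)
// into the full 360-degree PVS for the view cell.
std::unordered_set<std::uint32_t>
mergePvs(const std::vector<std::unordered_set<std::uint32_t>>& perView) {
    std::unordered_set<std::uint32_t> full;
    for (const std::unordered_set<std::uint32_t>& view : perView)
        full.insert(view.begin(), view.end());
    return full;
}

Any triangle visible from at least one of the reference views is thereby retained, which is what keeps novel views rendered inside the cell complete and hole-free.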

The invention was tested with an end-to-end prototype of the whole pipeline on four 3D scenes with various characteristics and use cases (960 configurations were tested in total). For high-refresh-rate smartphone displays, frame rates above 120 Hz are possible.

Patent Information

International patent application PCT/EP2017/078087, filed in November 2017.

Literature

[Hladky 2019:1] J. HLADKY, H.P. SEIDEL, M. STEINBERGER. The Camera Offset Space: Real-time Potentially Visible Set Computations for Streaming Rendering. ACM Trans. Graph., Vol. 38, No. 6, Article 231. November 2019. Proceedings of SIGGRAPH Asia 2019.

[Hladky 2019:2] J. HLADKY, H.P. SEIDEL, M. STEINBERGER. Tessellated Shading Streaming. Computer Graphics Forum 2019. Proceedings of Eurographics Symposium on Rendering 2019.


Contact person

Dr. Bernd Ctortecka, M. Phil.

Senior Patent & Licensing Manager

Graduate Physicist (Diplom-Physiker)

Phone: 089 / 29 09 19-20
E-mail: ctortecka@max-planck-innovation.de