= WebGL Extension Proposal =

Byungseon Shin, LG

== Motivations ==

* YouTube on TV is HTML5-based
* YouTube 360 does not play at a smooth 60 fps on embedded devices
* WebGL is used for 360-degree video:
** Must support a VideoElement source
** Should render smoothly (~60 fps)
* Most of the GPU resources are consumed by color space conversion (CSC)


== VR Technology ==

* 360 VR tech: spherical mapping of equirectangular textures
** Mapping a flat texture onto a 3D model (in this case, a sphere)
** Equirectangular format: a bit like a map projection
* A native OpenGL ES implementation ran at 60 fps: Video -> libvt -> { texture } -> sphere mapping (OpenGL) -> frame buffer
* Initially tried WebGL, but it was slow (~30 fps): Video -> libvt -> { texture } -> WebGL (YUV -> RGBA) -> sphere mapping (OpenGL) -> frame buffer
** There is an extra step of converting the libvt texture to RGBA
** In the 360 VR case, the final screen size is far smaller than the input, but the full frame must be converted to use WebGL
* Three bottlenecks:
** Color space conversion (YUV -> RGB): ~90%
** Texture copy inside the video decoder: 8-9%
** Compositing of the WebGL canvas: 1-2%
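As an aside on the spherical mapping above, a minimal sketch in plain Python (the function name is hypothetical; a real renderer would do this in a vertex/fragment shader) of how an equirectangular texture coordinate maps to a direction on the unit sphere:

```python
import math

def equirect_to_direction(u, v):
    """Map an equirectangular texture coordinate (u, v in [0, 1])
    to a unit direction vector (x, y, z) on the sphere.
    u spans longitude (-pi..pi), v spans latitude (-pi/2..pi/2)."""
    lon = (u - 0.5) * 2.0 * math.pi
    lat = (v - 0.5) * math.pi
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The center of the texture maps to the forward direction (0, 0, 1).
print(equirect_to_direction(0.5, 0.5))
```

This is the "map projection" analogy from the notes: equal steps in u and v cover equal steps in longitude and latitude, which is why the poles of the sphere are heavily oversampled in the source texture.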

* Currently, WebGL only supports TEXTURE_2D (RGBA) textures as input, not YUV
* The YUV -> RGB conversion is very slow
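To make the CSC cost concrete, here is a minimal per-pixel YUV -> RGB conversion sketched in plain Python, assuming full-range BT.601 coefficients (the function name and coefficient choice are assumptions; real pipelines run this per pixel in SIMD code or on the GPU):

```python
def yuv_to_rgb(y, u, v):
    """Convert one full-range BT.601 YUV pixel (0-255) to RGB (0-255).
    This multiply-add plus clamp runs once per pixel per frame, which
    is why CSC dominates when applied to an entire 4K video frame."""
    d = u - 128  # chroma offsets around the neutral point
    e = v - 128
    r = y + 1.402 * e
    g = y - 0.344136 * d - 0.714136 * e
    b = y + 1.772 * d
    clamp = lambda x: max(0, min(255, int(round(x))))
    return clamp(r), clamp(g), clamp(b)

# Gray pixel: chroma at the neutral point leaves luma unchanged.
print(yuv_to_rgb(128, 128, 128))  # -> (128, 128, 128)
```

Even as a single matrix multiply, doing this for every pixel of every frame is the work the proposed extension avoids by keeping the texture in YUV until sampling time.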

== Proposal ==

=== OES_EGL_image_external ===

* Add an OES_EGL_image_external extension to WebGL -- allows YUV textures to be handled without color space conversion
* Solution: currently, WebGL only supports TEXTURE_2D (RGBA); it should support TEXTURE_EXTERNAL_OES as well
* Expose OES_EGL_image_external functionality to WebGL (define a new texture target, TEXTURE_EXTERNAL_OES)
* Advantage over existing proposals: it focuses on extending WebGL's texture formats, making it easy to port and use
* Two issues pointed out in the WG:
** EGL image sync (syncing the EGL image with the video decoder)
** Audio sync (needs the exact timestamp of the frame currently being rendered)
** Also needs other per-frame metadata, such as the dimensions of the texture
* This approach avoids converting the entire video image from YUV to RGBA; only the region presented in the frame buffer needs to be converted
* The extension already exists in OpenGL ES; this is just a matter of exposing it through WebGL
* Some competing proposals address a similar problem, but are not as good a fit
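A rough back-of-the-envelope illustration of the saving from converting only the presented region rather than the whole frame (plain Python; the 4K source and 1080p output resolutions are assumptions for illustration, not figures from the discussion):

```python
# Assumed sizes: a 4K equirectangular source frame vs. a 1080p
# output viewport. Both numbers are illustrative assumptions.
src_w, src_h = 3840, 2160        # full equirectangular video frame
out_w, out_h = 1920, 1080        # pixels actually presented on screen

full_conversion = src_w * src_h  # per-pixel CSC over the whole frame
region_only = out_w * out_h      # CSC only for pixels that are shown

print(f"full frame:  {full_conversion:,} pixels per frame")
print(f"region only: {region_only:,} pixels per frame")
print(f"savings:     {full_conversion / region_only:.0f}x fewer conversions")
```

Under these assumed sizes the per-frame CSC work drops by 4x, before even counting the texture-copy overhead the extension also removes.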

== Challenges ==

* Audio sync is a key challenge to overcome

== Questions ==

* Why not just decode only the portion of the image that will be displayed?
** This might work if the viewport is fixed, but it won't have optimal performance when the viewport changes