April 2012 Canvas Canvass

Canvas canvass

dmazzoni opens the discussion and polls for discussion topics:

  1. Focusable fallback content
  2. Hit testing
  3. Async / off-the-main-thread rendering
  4. WebGL
  5. Accelerated canvas
  6. Paths and other canvas topics

Accessibility (A11y)

Dmazzoni jumps into (1) and (2).

Lots of discussion on the public HTML canvas list about how to do canvas a11y, and lots of talk about how to provide standards that devs can use. A11y means supporting any user who has a disability: labelling items for a screen reader, tracking the focused element and its location for screen magnifiers that display only part of the screen, and keyboard a11y for users with special input devices.

Although canvas was not meant for accessible interfaces, devs are starting to use it more and more, so there is a need to add a11y interfaces.

Use case: a web dev who uses canvas for an interface and wants it to be accessible.

First approach: add DOM elements within the canvas. Controversial, but IE and FF have this. Dmazzoni says that WK should do this too. The elements are not rendered, but you can tab to them, and you can then sync those elements with what is displayed in the canvas. This "fallback content" in the canvas would then be visible to a11y technology.
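
A minimal sketch of what this looks like from the page author's side (the element and drawing code here are illustrative, not from the discussion): the fallback element lives inside the canvas, is never painted, but stays focusable and exposed to assistive technology, and the script keeps it in sync with what the canvas draws.

  <canvas id="appCanvas" width="300" height="150">
    <!-- Fallback content: never painted, but focusable and exposed to a11y tech -->
    <button id="playButton">Play</button>
  </canvas>

  <script>
    var canvas = document.getElementById("appCanvas");
    var ctx = canvas.getContext("2d");
    var playButton = document.getElementById("playButton");

    // Draw the visual "button" in the canvas.
    ctx.fillRect(10, 10, 80, 30);

    // Keep the canvas display in sync with the fallback element, e.g. draw a
    // focus ring when the (invisible) fallback button receives keyboard focus.
    playButton.addEventListener("focus", function () {
      ctx.strokeRect(8, 8, 84, 34);
    });
  </script>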

How to implement this in WK? Something that is RenderReplaced and RenderBlock at the same time? What is the perf impact, and does it matter? Open questions.

Hit testing: Hixie proposes hit regions, says dmazzoni -- see the current HTML spec. You can associate paths with elements in the fallback canvas content; clicking within a region's path focuses its fallback content, for example. What do people think?
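
Roughly, the proposed shape (API names approximate; the spec was still in flux at the time): build a path, then register it as a hit region backed by an element in the fallback content, so clicks and a11y queries inside the region resolve to that element.

  var ctx = canvas.getContext("2d");

  // Path for the visual control...
  ctx.beginPath();
  ctx.rect(10, 10, 80, 30);
  ctx.fill();

  // ...registered as a hit region backed by a fallback element. Clicks in the
  // region route to / focus the control, and a11y tools can map the on-screen
  // area back to it.
  ctx.addHitRegion({ control: document.getElementById("playButton") });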

Maciej says we could get rid of the assumption that a focused element is associated with a rendered object. Maciej also brings up hit regions without associated DOM elements as a possible solution.

Question about whether apps could also use hit regions as a generic hit-testing optimization. Maciej confirms usefulness for general interactivity in canvas. Maciej asks: do you make "virtual targets" render objects for a11y convenience, or do we add non-render-object support to a11y code? Related to ARIA and hidden content. Maciej notes focusability is somewhat orthogonal to the traditional purpose of fallback content.

Maciej brings up another point: hit regions might not map 1-to-1 with fallback content, so perhaps you need a two-way mapping, from hit regions to fallback content and vice versa. The document-order relationship is also unclear. Feedback to the spec is needed.

Dino notes that he went through recent canvas changes and filed WK bugs for all features that we don't implement yet. No tracking bug though.

Maciej notes that this is all related to the 2D context, and that no one has thought about 3D a11y.

Kbr_google notes that he saw a SXSW presentation related to 3D hit testing. Also notes that the application is the only one that can definitively do 3D hit testing.

Maciej notes the app can do this already. Maciej says "focus is about where your typing will go, hit regions are about what is under your mouse." There is a need for both, but they are not necessarily the same.

Maciej summarizes: a11y is about 3 things:

  • Scan through some representation of what is on screen and expose that to the user, letting the user see some associated text
  • Magnify a particular region of the screen; requires knowledge of regions of interest
  • Translate from mouse position to objects in the a11y tree

Hit testing is only relevant to the third thing.

More 3D hit testing discussion. Summary: regions are expensive to make for changing content; a more expressive spec may be needed. Maciej thinks more design is needed and that we should move on.

Question about the relationship between the imperative feel of canvas and the declarative feel of SVG. Where are things going? Maciej says that a hit region could, and probably would, encompass multiple drawn things, and in that regard is different from SVG, which logically associates one region of many drawn things with one a11y object.

Similarity with client-side image maps. Dmazzoni thinks it's a fine generalization.

Path Objects

Hit regions are a step on the way to paths. Paths are a new thing in canvas -- a new API. Canvas "used to be wonky" in that there was only an implicit path; the new API exposes first-class path objects. New things: text to path, text on a path. Also new: transform additions, dash styles on strokes, ellipses and arcs, patterns. Also the ability to scale images with nearest-neighbor scaling, more text measurement functionality...
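
A quick illustrative sketch of a few of these (the first-class path object was proposed as "Path" and later shipped in browsers as Path2D; the other calls are the dash-style, ellipse, and image-smoothing additions mentioned above):

  var ctx = canvas.getContext("2d");

  // First-class path object instead of the single implicit path.
  var p = new Path2D();
  p.ellipse(150, 75, 60, 30, 0, 0, 2 * Math.PI);

  ctx.setLineDash([5, 3]);           // dash styles on strokes
  ctx.stroke(p);                     // stroke an explicit path object

  ctx.imageSmoothingEnabled = false; // nearest-neighbor image scaling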

Dmazzoni asks about implementation status. Consensus seems to be that these additions will probably go in. Maciej notes there was some pushback from the W3C, but that WK should probably go forward with the WHATWG spec.

Should we protect the hit testing implementation with a prefix or something? Maciej smiles enigmatically.

Proposal to use getContext("experimental"). Maciej thinks it's not so good because you can't feature-test.
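
The feature-testing concern, roughly: a blanket "experimental" context does not tell you which additions are actually there, whereas individual methods can be probed directly (addHitRegion used here only as an example):

  var ctx = canvas.getContext("2d");
  if (typeof ctx.addHitRegion === "function") {
    // hit regions available: register regions for clicks / a11y
  } else {
    // fall back to manual hit testing in the app
  }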

Reiteration of consensus within the room, and among all but one implementor, that these changes are good.

Accelerated canvas

Crickets.

What it is, yo? Accelerated implementations of graphics contexts. Accelerated canvases get promoted to composited elements, which fall through to compositing operations, which usually bottom out in OpenGL and textures.

That is the summary of what it is.

Asynchronous rendering

Trying to get web workers more involved in producing canvas graphics. How to do so? Production of pixels for the canvas. There are use cases where you want to use the browser's image decoding to decode a JPEG and produce pixels for the main thread.

Basically: what kinds of APIs can we have to enable more asynchrony? Canvas -> drawing surface -> worker -> back to main thread. Transferable canvas drawing surfaces? Would need to double buffer.
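
One way this can be approximated with what already exists (filenames and the fill step are hypothetical): a worker produces an RGBA pixel buffer and transfers it -- rather than copies it -- back to the main thread, which blits it into the canvas; the app itself would have to double buffer.

  // main thread
  var canvas = document.querySelector("canvas");
  var ctx = canvas.getContext("2d");
  var worker = new Worker("render-worker.js");   // hypothetical worker script

  worker.postMessage({ width: canvas.width, height: canvas.height });
  worker.onmessage = function (e) {
    var image = ctx.createImageData(canvas.width, canvas.height);
    image.data.set(new Uint8ClampedArray(e.data));
    ctx.putImageData(image, 0, 0);
  };

  // render-worker.js
  onmessage = function (e) {
    var buffer = new ArrayBuffer(e.data.width * e.data.height * 4);
    // ... fill buffer with RGBA pixels ...
    postMessage(buffer, [buffer]);   // transfer the buffer, don't copy it
  };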

WebGL

Kbr_google presents. OpenGL ES 2 brought to the web. Actually a 2D API that gives you access to the GPU. WebGL Water demo: no JS involved, everything happens on the GPU. This is on every mobile phone out there today.
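
For context, getting at the API looks like this (at the time the context name was commonly prefixed as "experimental-webgl"):

  var canvas = document.querySelector("canvas");
  var gl = canvas.getContext("webgl") ||
           canvas.getContext("experimental-webgl");

  // OpenGL ES 2-style calls from JS: clear the drawing buffer to a color.
  gl.clearColor(0.0, 0.0, 0.3, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT);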

Conformance testing: (search the WebGL wiki for conformance) -- a large set of conformance tests. Tests many corner cases, very extensive. All the GPU vendors are paying attention to this conformance suite. The graphics drivers are actually getting more correct and more robust in response to the web getting access to 3D hardware. Kbr_google is pretty happy about that. You should really be here. WebGL has huge market penetration.

Dino notes that it's a conformance test of the spec.

Oh snap, Chrome just crashed!

Dino continues, how useful is it as a test of the implementation? Kbr_google says it is good and getting better. Now showing aquarium demo.

Dino notes that WebKit WebGL bugfixes have seemed to migrate upstream to Khronos. Is this the right thing to do? Kbr_google says: yes. No assignment of anything is needed. Also notes there is a plan to move WebGL things from Khronos CVS to GitHub. The test suite is already public, under the MIT license.

Next demo: webgl-path-tracing. Ray casting within OpenGL ES 2.0 fragment shaders. Monte Carlo simulation! Wild.

Kbr_google is ebullient. Searches for Scott Drave's "videoriot". Some happiness about the state of WebGL stability relative to 6 months ago. Demonstrates uploading pixels from videos or other DOM elements to the GPU.

~fin~