|Version 14 (modified by firstname.lastname@example.org, 11 months ago) (diff)|
This is my scratch-pad for thoughts about QtWebKit for Qt 5. After spending plenty of time talking to different people, I'm slowly exceeding the capacity of my brain and it is time to digest, summarize and write down some content :)
Please feel free to edit right away, or to add comments between the paragraphs, prefixed with your name ("JoeDoe: I think this is all complete rubbish.")
QtWebKit for Qt 5
We have learned a lot about WebKit2 in the past months, as well as mobile web development. It is time to bring this experience to the main line of the Qt port of WebKit:
- QtWebKit for Qt 5 should be the foundation for great mobile web browsers.
Qt 5 will be an interesting foundation for QtWebKit. The changes in Qt 5 that affect QtWebKit the most are the move of QWidget and related classes into a separate library, as well as the introduction of Qt Quick 2 and the Qt Scene Graph as the primary graphics/UI API.
Consequently the interface to the browser should change:
- Our focus should be on making the WebKit2 port fast and stable enough for a web browser.
- A QML based UI component offers built-in handling for gestures, touch events and viewport meta tag.
- We should not spend time designing APIs that allow for feature combinations or software fallbacks that don't make sense. Features like WebGL or video should either work out of the box properly accelerated or be disabled.
- The QWidget/QGraphicsView based WebKit1 API moves into a separate shared library that links against QtWebKit.
Many different approaches to the various features have been discussed. We can record them here as a starting point for the implementation work:
WebKit2, Scene Graph, Tiling and Accelerated Compositing
The implementation currently chosen is the following:
- We're permanently in compositing mode.
- Layers are tiled, and the tiles are "managed" on the web process side.
- Layer information is serialized and sent across to the UI process (LayerTreeHost and WebGraphicsLayer).
- The UI process feeds the layers into the texture mapper (LayerTreeProxyHost).
- The OpenGL texture mapper implementation renders before/after the scene graph.
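The cross-process step above can be illustrated with a small toy model. Everything here (TileInfo, serializeTiles, deserializeTiles) is invented for the sketch and is not the real LayerTreeHost/WebGraphicsLayer API; it only shows the shape of the flow: the web process flattens per-tile layer state into a message, and the UI process reconstructs it for the texture mapper.

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical per-tile record, standing in for the serialized layer state.
struct TileInfo {
    uint32_t layerId;   // which composited layer this tile belongs to
    int32_t  x, y;      // tile position in layer coordinates
    uint32_t textureId; // backing texture handle on the UI-process side
};

// "Web process" side: flatten tile records into a byte buffer, standing in
// for the IPC message the layer tree host would send.
std::vector<uint8_t> serializeTiles(const std::vector<TileInfo>& tiles) {
    std::vector<uint8_t> buf(tiles.size() * sizeof(TileInfo));
    std::memcpy(buf.data(), tiles.data(), buf.size());
    return buf;
}

// "UI process" side: reconstruct the tile records; the real code would feed
// these into the texture mapper instead of returning them.
std::vector<TileInfo> deserializeTiles(const std::vector<uint8_t>& buf) {
    std::vector<TileInfo> tiles(buf.size() / sizeof(TileInfo));
    std::memcpy(tiles.data(), buf.data(), buf.size());
    return tiles;
}
```

The real implementation sends incremental updates rather than full snapshots, but the ownership split is the same: tiles live on the web process side, textures on the UI process side.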
Ideally, neither the main QtWebKit library nor the QtWebProcess will link against the Qt widgets library. We should investigate moving all QWidget dependencies (like the old plugin code) out of WebCore and into a secondary QtWebKitWidgets library that interfaces with WebCore through a platform-plugin-like mechanism and hooks up on top to implement QWebView/QGraphicsWebView. In such a design, WebCore would paint primarily with the raster paint engine and load all images into QImage objects instead of QPixmaps.
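A minimal sketch of what such a platform-plugin-like boundary could look like, assuming an abstract interface that WebCore calls and that the QtWebKitWidgets library implements at load time. All names here (WidgetBackend, widgetBackendFactory) are hypothetical:

```cpp
#include <memory>
#include <string>

// Abstract boundary: WebCore talks only to this interface, so it carries no
// link-time dependency on the Qt widgets library.
class WidgetBackend {
public:
    virtual ~WidgetBackend() = default;
    virtual std::string name() const = 0;
    virtual void showPlugin(int x, int y, int w, int h) = 0;
};

// Stand-in implementation; the real one, living in QtWebKitWidgets, would
// wrap QWidget/QGraphicsWidget for the legacy plugin code.
class StubWidgetBackend : public WidgetBackend {
public:
    std::string name() const override { return "stub"; }
    void showPlugin(int, int, int, int) override {}
};

// WebCore-side hook: a factory installed at runtime (e.g. when the widgets
// library is loaded) instead of a direct link-time dependency.
std::unique_ptr<WidgetBackend> (*widgetBackendFactory)() =
    [] { return std::unique_ptr<WidgetBackend>(new StubWidgetBackend()); };
```

The point of the factory indirection is that a pure QML/scene-graph deployment never has to load the widgets library at all.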
Another current dependency of WebCore is QStyle. It could be replaced with the default mobile theme, plus a few remaining QPainter calls for the parts where we currently use QWindowsStyle. The theming for QWebView/QGraphicsWebView could be delegated to a plugin loaded by WebCore when widgets and QStyle are needed.
Jocelyn: This is going to be painful. Besides the memory overhead, are there worthwhile benefits to decoupling the QWidget dependency? If possible I wish we could live with it until the need is felt.
For an efficient WebGL implementation we could either
a) Use OpenGL directly in the web process, render into an FBO and implement cross-process texture sharing for AC
b) Serialize the OpenGL commands and execute them on the UI process side
Some research is needed here, but option a) would have the additional benefit of also simplifying the video implementation due to the required texture sharing.
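Option b) can be sketched as a command buffer: the web process records GL calls instead of issuing them, and the UI process replays the buffer in its own context. The command set and encoding below are invented for illustration; the replay step just tallies what would be drawn rather than making real GL calls:

```cpp
#include <cstdint>
#include <vector>

// Tiny invented command set standing in for serialized GL calls.
enum class GLCmd : uint8_t { Clear, DrawArrays };

struct Command {
    GLCmd op;
    int32_t arg0, arg1; // e.g. first vertex / vertex count for DrawArrays
};

// "Web process" side: record commands instead of executing them.
struct CommandBuffer {
    std::vector<Command> commands;
    void record(GLCmd op, int32_t a0 = 0, int32_t a1 = 0) {
        commands.push_back({op, a0, a1});
    }
};

// "UI process" side: replay the buffer. A real implementation would translate
// each record into a GL call in the UI process context; here we only count
// the vertices that would be drawn.
int replay(const CommandBuffer& buf) {
    int verticesDrawn = 0;
    for (const Command& c : buf.commands)
        if (c.op == GLCmd::DrawArrays)
            verticesDrawn += c.arg1;
    return verticesDrawn;
}
```

Option a) avoids this serialization cost entirely, at the price of needing cross-process texture sharing, which is why it pairs well with the video work below.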
In order to render embedded videos efficiently and provide seamless transitions to fullscreen playback, we require from the platform the ability to render video into textures. Implementation bits that appear to be necessary:
- For each platform, determine the way of sharing textures across process boundaries (for example X11: X pixmaps)
- Add support for platform specific cross-process texture sharing to the WK2 AC implementation and the texture mapper.
- Change the corresponding media player back-end to render into such texture-backed layers.
- Implement support on the UI process side for establishing the UI for the temporary root layer of fullscreen elements (maybe a dedicated QWindow subclass that renders, with a dedicated GL texture mapper instance, the small tree of texture mapper nodes that represent the layers of the fullscreen element hierarchy).
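The first two steps above amount to a handle-to-texture mapping on the UI process side. A toy model, assuming an X11-style scheme where the platform handle (an X pixmap ID) itself crosses the process boundary; all names are illustrative:

```cpp
#include <cstdint>
#include <map>

// Stand-in for a platform-specific shareable handle (e.g. an X pixmap ID).
using PlatformHandle = uint64_t;

struct SharedTextureRegistry {
    std::map<PlatformHandle, uint32_t> boundTextures;
    uint32_t nextTexture = 1;

    // "Web process" side: export the video frame's backing store. On X11 the
    // pixmap ID itself can be sent across the process boundary.
    PlatformHandle exportFrame(PlatformHandle nativePixmap) {
        return nativePixmap;
    }

    // "UI process" side: bind the shared handle to a texture for the texture
    // mapper, reusing the texture if the same handle was already imported.
    uint32_t importFrame(PlatformHandle handle) {
        auto it = boundTextures.find(handle);
        if (it != boundTextures.end())
            return it->second;
        uint32_t tex = nextTexture++;
        boundTextures[handle] = tex;
        return tex;
    }
};
```

Reusing the bound texture for a handle that was already imported is what keeps per-frame video updates cheap: only the pixmap contents change, not the binding.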