Changeset 249205 in webkit


Timestamp:
Aug 28, 2019 10:04:32 AM
Author:
aboya@igalia.com
Message:

[MSE][GStreamer] WebKitMediaSrc rework
https://bugs.webkit.org/show_bug.cgi?id=199719

Reviewed by Xabier Rodriguez-Calvar.

LayoutTests/imported/w3c:

  • web-platform-tests/html/semantics/embedded-content/the-video-element/timeout_on_seek.py: Added.

(parse_range):
(main):

  • web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html: Added.
  • web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt: Added.
  • web-platform-tests/media-source/mediasource-buffered-seek-expected.txt: Added.
  • web-platform-tests/media-source/mediasource-buffered-seek.html: Added.

Source/WebCore:

This patch reworks the WebKitMediaSrc element and many of the player
private methods that interacted with it.

Compared with the old WebKitMediaSrc, seek handling in the new element is
massively simplified.

The new WebKitMediaSrc no longer relies on a bin or on appsrc, which gives it
greater control over its operation. This made it much easier to implement
features such as seeking before playback or single-stream flushing.
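
For illustration, here is a minimal sketch (not code from this patch) of how a
source with direct control over its pads can flush a single stream; the pad
name "src_0" is hypothetical, and the real logic lives in
webKitMediaSrcStreamFlushStart()/webKitMediaSrcStreamFlushStop():

    #include <gst/gst.h>

    static void flushSingleStream(GstElement* source)
    {
        GstPad* pad = gst_element_get_static_pad(source, "src_0");
        if (!pad)
            return;

        // Flush-start makes downstream drop queued data for this stream only;
        // flush-stop with reset_time = TRUE resets the running time so fresh
        // samples for the seek target can be enqueued afterwards.
        gst_pad_push_event(pad, gst_event_new_flush_start());
        gst_pad_push_event(pad, gst_event_new_flush_stop(TRUE));
        gst_object_unref(pad);
    }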

WebKitMediaSrc now emits stream-collection events so that the playbin3 track
handling in MediaPlayerPrivateGStreamer can be reused; playbin3 is now used
for MSE pipelines.
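
As a hedged sketch of that mechanism (identifiers such as "upstream-id" and
"video-1" are illustrative, not taken from the patch), a source announces its
streams roughly as follows, and handleSyncMessage()/updateTracks() react to
the resulting stream-collection message:

    #include <gst/gst.h>

    static void announceStreams(GstElement* source, GstPad* srcPad, GstCaps* videoCaps)
    {
        GstStreamCollection* collection = gst_stream_collection_new("upstream-id");
        GstStream* stream = gst_stream_new("video-1", videoCaps, GST_STREAM_TYPE_VIDEO, GST_STREAM_FLAG_SELECT);
        gst_stream_collection_add_stream(collection, stream);

        // The collection travels both as a sticky event on the pad and as a
        // bus message; MediaPlayerPrivateGStreamer rebuilds its track lists
        // when it sees the message.
        gst_pad_push_event(srcPad, gst_event_new_stream_collection(collection));
        gst_element_post_message(source, gst_message_new_stream_collection(GST_OBJECT(source), collection));
        gst_object_unref(collection);
    }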

Additional tests have been added to check some assumptions, and some bugs
that surfaced with the changes have been fixed, but no new features (such as
multi-track support) are implemented in this patch.

One instance of these bugs concerns resize events, which were previously
emitted when frames with different resolutions were appended. That wrong
behavior has not been preserved in the rework: resize events should be
emitted when frames are shown, not merely appended.
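
A simplified sketch of that render-time check (the patch implements it as
MediaPlayerPrivateGStreamerBase::doSamplesHaveDifferentNaturalSizes(), which
also accounts for pixel aspect ratio; this version compares raw dimensions
only):

    #include <gst/video/video-info.h>

    static bool sampleSizesDiffer(GstSample* previous, GstSample* current)
    {
        GstCaps* previousCaps = gst_sample_get_caps(previous);
        GstCaps* currentCaps = gst_sample_get_caps(current);
        if (previousCaps == currentCaps)
            return false; // Same caps object implies same size: the common case.

        GstVideoInfo a, b;
        if (!gst_video_info_from_caps(&a, previousCaps) || !gst_video_info_from_caps(&b, currentCaps))
            return false;
        return GST_VIDEO_INFO_WIDTH(&a) != GST_VIDEO_INFO_WIDTH(&b)
            || GST_VIDEO_INFO_HEIGHT(&a) != GST_VIDEO_INFO_HEIGHT(&b);
    }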

There are subtler bug fixes, such as ignoring PTS-less frames in
AppendPipeline::appsinkNewSample(). These frames are problematic for MSE, yet
they were somehow passing through the pipelines. Since the new WebKitMediaSrc
is stricter with its assertions, they now have to be filtered out.
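
Sketched as a plain appsink "new-sample" handler (simplified with respect to
AppendPipeline::appsinkNewSample(), which wraps the sample in a
MediaSampleGStreamer before handing it to the SourceBuffer), the filtering
looks like this:

    #include <gst/app/gstappsink.h>

    static GstFlowReturn onNewSample(GstAppSink* appsink, gpointer)
    {
        GstSample* sample = gst_app_sink_pull_sample(appsink);
        GstBuffer* buffer = gst_sample_get_buffer(sample);

        // Header-only frames (e.g. matroskademux emits some when demuxing
        // Vorbis) carry no PTS and are useless for MSE buffering: drop them.
        if (!GST_BUFFER_PTS_IS_VALID(buffer)) {
            gst_sample_unref(sample);
            return GST_FLOW_OK;
        }

        // ... hand the sample over to the SourceBuffer machinery here ...
        gst_sample_unref(sample);
        return GST_FLOW_OK;
    }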

This patch gets rid of !m_mseSeekCompleted assertion failures in tests and
potentially other hard-to-debug bugs in the previous seek algorithm.

This patch makes the following existing tests pass:

imported/w3c/web-platform-tests/media-source/mediasource-config-change-webm-a-bitrate.html
imported/w3c/web-platform-tests/media-source/mediasource-config-change-webm-v-framesize.html

New test: imported/w3c/web-platform-tests/media-source/mediasource-buffered-seek.html
New test: LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html (non-MSE related)

  • Headers.cmake:
  • platform/GStreamer.cmake:
  • platform/graphics/gstreamer/GRefPtrGStreamer.cpp:

(WTF::adoptGRef):
(WTF::refGPtr<GstMiniObject>):
(WTF::derefGPtr<GstMiniObject>):

  • platform/graphics/gstreamer/GRefPtrGStreamer.h:
  • platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:

(WebCore::MediaPlayerPrivateGStreamer::playbackPosition const):
(WebCore::MediaPlayerPrivateGStreamer::paused const):
(WebCore::MediaPlayerPrivateGStreamer::updateTracks):
(WebCore::MediaPlayerPrivateGStreamer::enableTrack):
(WebCore::MediaPlayerPrivateGStreamer::notifyPlayerOfVideo):
(WebCore::MediaPlayerPrivateGStreamer::sourceSetup):
(WebCore::MediaPlayerPrivateGStreamer::handleSyncMessage):
(WebCore::MediaPlayerPrivateGStreamer::createGSTPlayBin):

  • platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:

(WebCore::MediaPlayerPrivateGStreamer::invalidateCachedPosition):

  • platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:

(WebCore::MediaPlayerPrivateGStreamerBase::naturalSize const):
(WebCore::MediaPlayerPrivateGStreamerBase::naturalSizeFromCaps const):
(WebCore::MediaPlayerPrivateGStreamerBase::samplesHaveDifferentNaturalSize const):
(WebCore::MediaPlayerPrivateGStreamerBase::triggerRepaint):

  • platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:
  • platform/graphics/gstreamer/MediaSampleGStreamer.cpp:

(WebCore::MediaSampleGStreamer::MediaSampleGStreamer):

  • platform/graphics/gstreamer/mse/AppendPipeline.cpp:

(WebCore::AppendPipeline::appsinkNewSample):
(WebCore::AppendPipeline::connectDemuxerSrcPadToAppsink):

  • platform/graphics/gstreamer/mse/AppendPipeline.h:

(WebCore::AppendPipeline::appsinkCaps):
(WebCore::AppendPipeline::streamType):
(WebCore::AppendPipeline::demuxerSrcPadCaps):

  • platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp:

(WebCore::MediaPlayerPrivateGStreamerMSE::~MediaPlayerPrivateGStreamerMSE):
(WebCore::MediaPlayerPrivateGStreamerMSE::load):
(WebCore::MediaPlayerPrivateGStreamerMSE::play):
(WebCore::MediaPlayerPrivateGStreamerMSE::pause):
(WebCore::MediaPlayerPrivateGStreamerMSE::seek):
(WebCore::MediaPlayerPrivateGStreamerMSE::seekCompleted):
(WebCore::MediaPlayerPrivateGStreamerMSE::setReadyState):
(WebCore::MediaPlayerPrivateGStreamerMSE::sourceSetup):
(WebCore::MediaPlayerPrivateGStreamerMSE::updateStates):
(WebCore::MediaPlayerPrivateGStreamerMSE::didEnd):
(WebCore::MediaPlayerPrivateGStreamerMSE::unblockDurationChanges):
(WebCore::MediaPlayerPrivateGStreamerMSE::durationChanged):
(WebCore::MediaPlayerPrivateGStreamerMSE::trackDetected):

  • platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h:
  • platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp:

(WebCore::MediaSourceClientGStreamerMSE::addSourceBuffer):
(WebCore::MediaSourceClientGStreamerMSE::removedFromMediaSource):
(WebCore::MediaSourceClientGStreamerMSE::flush):
(WebCore::MediaSourceClientGStreamerMSE::enqueueSample):
(WebCore::MediaSourceClientGStreamerMSE::isReadyForMoreSamples):
(WebCore::MediaSourceClientGStreamerMSE::notifyClientWhenReadyForMoreSamples):
(WebCore::MediaSourceClientGStreamerMSE::allSamplesInTrackEnqueued):

  • platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h:
  • platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp:

(WebCore::MediaSourceGStreamer::markEndOfStream):
(WebCore::MediaSourceGStreamer::unmarkEndOfStream):
(WebCore::MediaSourceGStreamer::waitForSeekCompleted):

  • platform/graphics/gstreamer/mse/MediaSourceGStreamer.h:
  • platform/graphics/gstreamer/mse/PlaybackPipeline.cpp: Removed.
  • platform/graphics/gstreamer/mse/PlaybackPipeline.h: Removed.
  • platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp:

(WebCore::SourceBufferPrivateGStreamer::enqueueSample):
(WebCore::SourceBufferPrivateGStreamer::isReadyForMoreSamples):
(WebCore::SourceBufferPrivateGStreamer::notifyClientWhenReadyForMoreSamples):

  • platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h:
  • platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp:

(WebKitMediaSrcPrivate::streamByName):
(Stream::Stream):
(Stream::StreamingMembers::StreamingMembers):
(Stream::StreamingMembers::durationEnqueued const):
(findPipeline):
(webkit_media_src_class_init):
(webkit_media_src_init):
(webKitMediaSrcFinalize):
(debugProbe):
(collectionPlusStream):
(collectionMinusStream):
(gstStreamType):
(webKitMediaSrcAddStream):
(webKitMediaSrcRemoveStream):
(webKitMediaSrcActivateMode):
(webKitMediaSrcPadLinked):
(webKitMediaSrcStreamNotifyLowWaterLevel):
(webKitMediaSrcLoop):
(webKitMediaSrcEnqueueObject):
(webKitMediaSrcEnqueueSample):
(webKitMediaSrcEnqueueEvent):
(webKitMediaSrcEndOfStream):
(webKitMediaSrcIsReadyForMoreSamples):
(webKitMediaSrcNotifyWhenReadyForMoreSamples):
(webKitMediaSrcChangeState):
(webKitMediaSrcStreamFlushStart):
(webKitMediaSrcStreamFlushStop):
(webKitMediaSrcFlush):
(webKitMediaSrcSeek):
(countStreamsOfType):
(webKitMediaSrcGetProperty):
(webKitMediaSrcUriGetType):
(webKitMediaSrcGetProtocols):
(webKitMediaSrcGetUri):
(webKitMediaSrcSetUri):
(webKitMediaSrcUriHandlerInit):

  • platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h:
  • platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h: Removed.

Tools:

Added WebKitMediaSourceGStreamer.cpp to the GStreamer-style coding
whitelist.

  • Scripts/webkitpy/style/checker.py:

LayoutTests:

Updated expectations.

  • platform/gtk/TestExpectations:
  • platform/mac/TestExpectations:
  • platform/ios-simulator/TestExpectations:
  • platform/mac/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt: Added.
Location: trunk
Files: 7 added, 3 deleted, 29 edited

  • trunk/LayoutTests/ChangeLog

    r249201 r249205  
     12019-08-28  Alicia Boya García  <aboya@igalia.com>
     2
     3        [MSE][GStreamer] WebKitMediaSrc rework
     4        https://bugs.webkit.org/show_bug.cgi?id=199719
     5
     6        Reviewed by Xabier Rodriguez-Calvar.
     7
     8        Updated expectations.
     9
     10        * platform/gtk/TestExpectations:
     11        * platform/mac/TestExpectations:
     12        * platform/ios-simulator/TestExpectations:
     13        * platform/mac/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt: Added.
     14
    1152019-08-28  Jer Noble  <jer.noble@apple.com>
    216
  • trunk/LayoutTests/imported/w3c/ChangeLog

    r249130 r249205  
     12019-08-28  Alicia Boya García  <aboya@igalia.com>
     2
     3        [MSE][GStreamer] WebKitMediaSrc rework
     4        https://bugs.webkit.org/show_bug.cgi?id=199719
     5
     6        Reviewed by Xabier Rodriguez-Calvar.
     7
     8        * web-platform-tests/html/semantics/embedded-content/the-video-element/timeout_on_seek.py: Added.
     9        (parse_range):
     10        (main):
     11        * web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html: Added.
     12        * web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek-expected.txt: Added.
     13        * web-platform-tests/media-source/mediasource-buffered-seek-expected.txt: Added.
     14        * web-platform-tests/media-source/mediasource-buffered-seek.html: Added.
     15
    1162019-08-26  Chris Dumez  <cdumez@apple.com>
    217
  • trunk/LayoutTests/platform/gtk/TestExpectations

    r248521 r249205  
    229229webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-changetype.html [ Failure Crash ]
    230230webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-changetype-play.html [ Failure ]
    231 webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-config-change-webm-v-framesize.html [ Failure Pass ]
    232231# Crash is webkit.org/b/176020
    233232webkit.org/b/167108 imported/w3c/web-platform-tests/media-source/mediasource-duration.html [ Failure Crash ]
     
    246245# We don't support multiple streams per sourcebuffer nor dynamic type changes (audio/video/text)
    247246webkit.org/b/165394 media/media-source/media-source-seek-detach-crash.html [ Skip ]
     247
     248# There is an oggdemux bug that deadlocks WebKit: https://gitlab.freedesktop.org/gstreamer/gst-plugins-base/issues/639
     249imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html [ Timeout ]
    248250
    249251# CSS filters related failures
     
    24602462webkit.org/b/172284 svg/animations/animated-svg-image-outside-viewport-paused.html [ Timeout ]
    24612463
    2462 webkit.org/b/172816 media/media-source/media-source-paint-to-canvas.html [ Timeout ]
     2464webkit.org/b/172816 media/media-source/media-source-paint-to-canvas.html [ Failure ]
    24632465
    24642466webkit.org/b/174242 media/media-fullscreen-pause-inline.html [ Skip ]
  • trunk/LayoutTests/platform/ios-simulator/TestExpectations

    r247671 r249205  
    127127
    128128imported/w3c/web-platform-tests/wasm [ Skip ]
     129
     130webkit.org/b/200128 imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html [ Timeout Pass ]
  • trunk/LayoutTests/platform/mac/TestExpectations

    r249153 r249205  
    19871987[ Catalina+ ] fast/text/international/system-language/han-quotes.html [ ImageOnlyFailure ]
    19881988
     1989webkit.org/b/200128 imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html [ Timeout Pass ]
     1990
    19891991# rdar://52557916 (REGRESSION: fast/css/paint-order.html and fast/css/paint-order-shadow.html are failing)
    19901992[ Catalina+ ] fast/css/paint-order.html [ ImageOnlyFailure ]
  • trunk/Source/WebCore/ChangeLog

    r249203 r249205  
     12019-08-28  Alicia Boya García  <aboya@igalia.com>
     2
     3        [MSE][GStreamer] WebKitMediaSrc rework
     4        https://bugs.webkit.org/show_bug.cgi?id=199719
     5
     6        Reviewed by Xabier Rodriguez-Calvar.
     7
     8        This patch reworks the WebKitMediaSrc element and many of the player
     9        private methods that interacted with it.
     10
     11        In comparison with the old WebKitMediaSrc, in the new one seeks have
     12        been massively simplified.
     13
     14        The new WebKitMediaSrc no longer relies on a bin or appsrc, having
     15        greater control over its operation. This made it comparatively much
     16        easier to implement features such as seek before playback or
     17        single-stream flushing.
     18
     19        stream-collection events are emitted from the WebKitMediaSrc to reuse
     20        the track handling in MediaPlayerPrivateGStreamer for playbin3, which
     21        is now used for MSE pipelines.
     22
     23        Additional tests have been added to check some assumptions, and some
     24        bugs that have surfaced with the changes have been fixed but no new
     25        features (like multi-track support) are implemented in this patch.
     26
     27        One instance of these bugs is `resized` events, which were previously
      28        being emitted when frames with different resolutions were appended.
     29        This is a wrong behavior that has not been preserved in the rework, as
     30        resize events should be emitted when the frames are shown, not
     31        just appended.
     32
     33        There are subtler bugfixes, such as ignoring PTS-less frames in
     34        AppendPipeline::appsinkNewSample(). These frames are problematic for
     35        MSE, yet they were somehow passing through the pipelines. Since
     36        WebKitMediaSrc is stricter with assertions, these have to be filtered.
     37
      38        This patch gets rid of !m_mseSeekCompleted assertion failures in tests
      39        and potentially other hard-to-debug bugs in the previous seek
     40        algorithm.
     41
     42        This patch makes the following existing tests pass:
     43
     44        imported/w3c/web-platform-tests/media-source/mediasource-config-change-webm-a-bitrate.html
     45        imported/w3c/web-platform-tests/media-source/mediasource-config-change-webm-v-framesize.html
     46
     47        New test: imported/w3c/web-platform-tests/media-source/mediasource-buffered-seek.html
     48        New test: LayoutTests/imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html (non-MSE related)
     49
     50        * Headers.cmake:
     51        * platform/GStreamer.cmake:
     52        * platform/graphics/gstreamer/GRefPtrGStreamer.cpp:
     53        (WTF::adoptGRef):
     54        (WTF::refGPtr<GstMiniObject>):
     55        (WTF::derefGPtr<GstMiniObject>):
     56        * platform/graphics/gstreamer/GRefPtrGStreamer.h:
     57        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp:
     58        (WebCore::MediaPlayerPrivateGStreamer::playbackPosition const):
     59        (WebCore::MediaPlayerPrivateGStreamer::paused const):
     60        (WebCore::MediaPlayerPrivateGStreamer::updateTracks):
     61        (WebCore::MediaPlayerPrivateGStreamer::enableTrack):
     62        (WebCore::MediaPlayerPrivateGStreamer::notifyPlayerOfVideo):
     63        (WebCore::MediaPlayerPrivateGStreamer::sourceSetup):
     64        (WebCore::MediaPlayerPrivateGStreamer::handleSyncMessage):
     65        (WebCore::MediaPlayerPrivateGStreamer::createGSTPlayBin):
     66        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h:
     67        (WebCore::MediaPlayerPrivateGStreamer::invalidateCachedPosition):
     68        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp:
     69        (WebCore::MediaPlayerPrivateGStreamerBase::naturalSize const):
     70        (WebCore::MediaPlayerPrivateGStreamerBase::naturalSizeFromCaps const):
     71        (WebCore::MediaPlayerPrivateGStreamerBase::samplesHaveDifferentNaturalSize const):
     72        (WebCore::MediaPlayerPrivateGStreamerBase::triggerRepaint):
     73        * platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h:
     74        * platform/graphics/gstreamer/MediaSampleGStreamer.cpp:
     75        (WebCore::MediaSampleGStreamer::MediaSampleGStreamer):
     76        * platform/graphics/gstreamer/mse/AppendPipeline.cpp:
     77        (WebCore::AppendPipeline::appsinkNewSample):
     78        (WebCore::AppendPipeline::connectDemuxerSrcPadToAppsink):
     79        * platform/graphics/gstreamer/mse/AppendPipeline.h:
     80        (WebCore::AppendPipeline::appsinkCaps):
     81        (WebCore::AppendPipeline::streamType):
     82        (WebCore::AppendPipeline::demuxerSrcPadCaps):
     83        * platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp:
     84        (WebCore::MediaPlayerPrivateGStreamerMSE::~MediaPlayerPrivateGStreamerMSE):
     85        (WebCore::MediaPlayerPrivateGStreamerMSE::load):
     86        (WebCore::MediaPlayerPrivateGStreamerMSE::play):
     87        (WebCore::MediaPlayerPrivateGStreamerMSE::pause):
     88        (WebCore::MediaPlayerPrivateGStreamerMSE::seek):
     89        (WebCore::MediaPlayerPrivateGStreamerMSE::seekCompleted):
     90        (WebCore::MediaPlayerPrivateGStreamerMSE::setReadyState):
     91        (WebCore::MediaPlayerPrivateGStreamerMSE::sourceSetup):
     92        (WebCore::MediaPlayerPrivateGStreamerMSE::updateStates):
     93        (WebCore::MediaPlayerPrivateGStreamerMSE::didEnd):
     94        (WebCore::MediaPlayerPrivateGStreamerMSE::unblockDurationChanges):
     95        (WebCore::MediaPlayerPrivateGStreamerMSE::durationChanged):
     96        (WebCore::MediaPlayerPrivateGStreamerMSE::trackDetected):
     97        * platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h:
     98        * platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp:
     99        (WebCore::MediaSourceClientGStreamerMSE::addSourceBuffer):
     100        (WebCore::MediaSourceClientGStreamerMSE::removedFromMediaSource):
     101        (WebCore::MediaSourceClientGStreamerMSE::flush):
     102        (WebCore::MediaSourceClientGStreamerMSE::enqueueSample):
     103        (WebCore::MediaSourceClientGStreamerMSE::isReadyForMoreSamples):
     104        (WebCore::MediaSourceClientGStreamerMSE::notifyClientWhenReadyForMoreSamples):
     105        (WebCore::MediaSourceClientGStreamerMSE::allSamplesInTrackEnqueued):
     106        * platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h:
     107        * platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp:
     108        (WebCore::MediaSourceGStreamer::markEndOfStream):
     109        (WebCore::MediaSourceGStreamer::unmarkEndOfStream):
     110        (WebCore::MediaSourceGStreamer::waitForSeekCompleted):
     111        * platform/graphics/gstreamer/mse/MediaSourceGStreamer.h:
     112        * platform/graphics/gstreamer/mse/PlaybackPipeline.cpp: Removed.
     113        * platform/graphics/gstreamer/mse/PlaybackPipeline.h: Removed.
     114        * platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp:
     115        (WebCore::SourceBufferPrivateGStreamer::enqueueSample):
     116        (WebCore::SourceBufferPrivateGStreamer::isReadyForMoreSamples):
     117        (WebCore::SourceBufferPrivateGStreamer::notifyClientWhenReadyForMoreSamples):
     118        * platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h:
     119        * platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp:
     120        (WebKitMediaSrcPrivate::streamByName):
     121        (Stream::Stream):
     122        (Stream::StreamingMembers::StreamingMembers):
     123        (Stream::StreamingMembers::durationEnqueued const):
     124        (findPipeline):
     125        (webkit_media_src_class_init):
     126        (webkit_media_src_init):
     127        (webKitMediaSrcFinalize):
     128        (debugProbe):
     129        (collectionPlusStream):
     130        (collectionMinusStream):
     131        (gstStreamType):
     132        (webKitMediaSrcAddStream):
     133        (webKitMediaSrcRemoveStream):
     134        (webKitMediaSrcActivateMode):
     135        (webKitMediaSrcPadLinked):
     136        (webKitMediaSrcStreamNotifyLowWaterLevel):
     137        (webKitMediaSrcLoop):
     138        (webKitMediaSrcEnqueueObject):
     139        (webKitMediaSrcEnqueueSample):
     140        (webKitMediaSrcEnqueueEvent):
     141        (webKitMediaSrcEndOfStream):
     142        (webKitMediaSrcIsReadyForMoreSamples):
     143        (webKitMediaSrcNotifyWhenReadyForMoreSamples):
     144        (webKitMediaSrcChangeState):
     145        (webKitMediaSrcStreamFlushStart):
     146        (webKitMediaSrcStreamFlushStop):
     147        (webKitMediaSrcFlush):
     148        (webKitMediaSrcSeek):
     149        (countStreamsOfType):
     150        (webKitMediaSrcGetProperty):
     151        (webKitMediaSrcUriGetType):
     152        (webKitMediaSrcGetProtocols):
     153        (webKitMediaSrcGetUri):
     154        (webKitMediaSrcSetUri):
     155        (webKitMediaSrcUriHandlerInit):
     156        * platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h:
     157        * platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamerPrivate.h: Removed.
     158
    11592019-08-28  Simon Fraser  <simon.fraser@apple.com>
    2160
  • trunk/Source/WebCore/platform/GStreamer.cmake

    r244443 r249205  
    3333        platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp
    3434        platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp
    35         platform/graphics/gstreamer/mse/PlaybackPipeline.cpp
    3635        platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp
    3736        platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp
  • trunk/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.cpp

    r246677 r249205  
    2626namespace WTF {
    2727
     28template<> GRefPtr<GstMiniObject> adoptGRef(GstMiniObject* ptr)
     29{
     30    return GRefPtr<GstMiniObject>(ptr, GRefPtrAdopt);
     31}
     32
     33template<> GstMiniObject* refGPtr<GstMiniObject>(GstMiniObject* ptr)
     34{
     35    if (ptr)
     36        gst_mini_object_ref(ptr);
     37
     38    return ptr;
     39}
     40
     41template<> void derefGPtr<GstMiniObject>(GstMiniObject* ptr)
     42{
     43    if (ptr)
     44        gst_mini_object_unref(ptr);
     45}
     46
    2847template <> GRefPtr<GstElement> adoptGRef(GstElement* ptr)
    2948{
  • trunk/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.h

    r246677 r249205  
    3434
    3535namespace WTF {
     36
     37template<> GRefPtr<GstMiniObject> adoptGRef(GstMiniObject* ptr);
     38template<> GstMiniObject* refGPtr<GstMiniObject>(GstMiniObject* ptr);
     39template<> void derefGPtr<GstMiniObject>(GstMiniObject* ptr);
    3640
    3741template<> GRefPtr<GstElement> adoptGRef(GstElement* ptr);
  • trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.cpp

    r248846 r249205  
    359359{
    360360    GST_TRACE_OBJECT(pipeline(), "isEndReached: %s, seeking: %s, seekTime: %s", boolForPrinting(m_isEndReached), boolForPrinting(m_seeking), m_seekTime.toString().utf8().data());
    361     if (m_isEndReached && m_seeking)
    362         return m_seekTime;
     361    if (m_isEndReached) {
     362        // Position queries on a pipeline that is not running return 0. This is the case when the prerolling
     363        // from a seek is still not done and after EOS. In these cases we want to report the seek time or the
     364        // duration respectively.
     365        if (m_seeking)
     366            return m_seekTime;
     367
     368        MediaTime duration = durationMediaTime();
     369        return duration.isInvalid() ? MediaTime::zeroTime() : duration;
     370    }
    363371
    364372    // This constant should remain lower than HTMLMediaElement's maxTimeupdateEventFrequency.
     
    662670    }
    663671
    664     GstState state;
    665     gst_element_get_state(m_pipeline.get(), &state, nullptr, 0);
     672    GstState state, pending;
     673    gst_element_get_state(m_pipeline.get(), &state, &pending, 0);
    666674    bool paused = state <= GST_STATE_PAUSED;
    667     GST_LOG_OBJECT(pipeline(), "Paused: %s", toString(paused).utf8().data());
     675    GST_LOG_OBJECT(pipeline(), "Paused: %s (pending state: %s)", toString(paused).utf8().data(), gst_element_state_get_name(pending));
    668676    return paused;
    669677}
     
    711719#define CREATE_TRACK(type, Type) G_STMT_START {                         \
    712720        m_has##Type = true;                                             \
    713         if (!useMediaSource) {                                          \
    714             RefPtr<Type##TrackPrivateGStreamer> track = Type##TrackPrivateGStreamer::create(makeWeakPtr(*this), i, stream); \
    715             m_##type##Tracks.add(track->id(), track);                   \
    716             m_player->add##Type##Track(*track);                         \
    717             if (gst_stream_get_stream_flags(stream.get()) & GST_STREAM_FLAG_SELECT) \
    718                 m_current##Type##StreamId = String(gst_stream_get_stream_id(stream.get())); \
    719         }                                                               \
     721        RefPtr<Type##TrackPrivateGStreamer> track = Type##TrackPrivateGStreamer::create(makeWeakPtr(*this), i, stream); \
     722        m_##type##Tracks.add(track->id(), track);                       \
     723        m_player->add##Type##Track(*track);                             \
     724        if (gst_stream_get_stream_flags(stream.get()) & GST_STREAM_FLAG_SELECT) \
     725            m_current##Type##StreamId = String(gst_stream_get_stream_id(stream.get())); \
    720726    } G_STMT_END
    721727#else
     
    731737    bool useMediaSource = isMediaSource();
    732738    unsigned length = gst_stream_collection_get_size(m_streamCollection.get());
     739    GST_DEBUG_OBJECT(pipeline(), "Inspecting stream collection: %s %" GST_PTR_FORMAT,
     740        gst_stream_collection_get_upstream_id(m_streamCollection.get()), m_streamCollection.get());
    733741
    734742    bool oldHasAudio = m_hasAudio;
     
    770778void MediaPlayerPrivateGStreamer::enableTrack(TrackPrivateBaseGStreamer::TrackType trackType, unsigned index)
    771779{
    772     // FIXME: Remove isMediaSource() test below when fixing https://bugs.webkit.org/show_bug.cgi?id=182531.
    773     if (isMediaSource()) {
    774         GST_FIXME_OBJECT(m_pipeline.get(), "Audio/Video/Text track switching is not yet supported by the MSE backend.");
    775         return;
    776     }
    777 
    778780    const char* propertyName;
    779781    const char* trackTypeAsString;
     
    781783    String selectedStreamId;
    782784
    783     GstStream* stream = nullptr;
    784 
    785785    if (!m_isLegacyPlaybin) {
    786         stream = gst_stream_collection_get_stream(m_streamCollection.get(), index);
     786        GstStream* stream = gst_stream_collection_get_stream(m_streamCollection.get(), index);
    787787        if (!stream) {
    788788            GST_WARNING_OBJECT(pipeline(), "No stream to select at index %u", index);
     
    865865        return;
    866866
    867     ASSERT(m_isLegacyPlaybin || isMediaSource());
     867    ASSERT(m_isLegacyPlaybin);
     868    ASSERT(!isMediaSource());
    868869
    869870    gint numTracks = 0;
     
    915916#endif
    916917
    917     m_player->client().mediaPlayerEngineUpdated(m_player);
    918 }
    919 
    920 void MediaPlayerPrivateGStreamer::videoSinkCapsChangedCallback(MediaPlayerPrivateGStreamer* player)
    921 {
    922     player->m_notifier->notify(MainThreadNotification::VideoCapsChanged, [player] {
    923         player->notifyPlayerOfVideoCaps();
    924     });
    925 }
    926 
    927 void MediaPlayerPrivateGStreamer::notifyPlayerOfVideoCaps()
    928 {
    929     m_videoSize = IntSize();
    930918    m_player->client().mediaPlayerEngineUpdated(m_player);
    931919}
     
    18491837void MediaPlayerPrivateGStreamer::sourceSetup(GstElement* sourceElement)
    18501838{
     1839    ASSERT(!isMediaSource());
    18511840    GST_DEBUG_OBJECT(pipeline(), "Source element set-up for %s", GST_ELEMENT_NAME(sourceElement));
    18521841
     
    20992088{
    21002089    if (GST_MESSAGE_TYPE(message) == GST_MESSAGE_STREAM_COLLECTION && !m_isLegacyPlaybin) {
     2090        // GStreamer workaround:
     2091        // Unfortunately, when we have a stream-collection aware source (like WebKitMediaSrc) parsebin and decodebin3 emit
     2092        // their own stream-collection messages, but late, and sometimes with duplicated streams. Let's only listen for
     2093        // stream-collection messages from the source in the MSE case to avoid these issues.
     2094        if (isMediaSource() && message->src != GST_OBJECT(m_source.get()))
     2095            return true;
     2096
    21012097        GRefPtr<GstStreamCollection> collection;
    21022098        gst_message_parse_stream_collection(message, &collection.outPtr());
    2103 
    2104         if (collection) {
    2105             m_streamCollection.swap(collection);
    2106             m_notifier->notify(MainThreadNotification::StreamCollectionChanged, [this] {
    2107                 this->updateTracks();
    2108             });
    2109         }
     2099        ASSERT(collection);
     2100        m_streamCollection.swap(collection);
     2101
     2102        m_notifier->notify(MainThreadNotification::StreamCollectionChanged, [this] {
     2103            this->updateTracks();
     2104        });
    21102105    }
    21112106
     
    23912386    const gchar* playbinName = "playbin";
    23922387
    2393     // MSE doesn't support playbin3. Mediastream requires playbin3. Regular
    2394     // playback can use playbin3 on-demand with the WEBKIT_GST_USE_PLAYBIN3
    2395     // environment variable.
    2396     if ((!isMediaSource() && g_getenv("WEBKIT_GST_USE_PLAYBIN3")) || url.protocolIs("mediastream"))
     2388    // MSE and Mediastream require playbin3. Regular playback can use playbin3 on-demand with the
     2389    // WEBKIT_GST_USE_PLAYBIN3 environment variable.
     2390    if ((isMediaSource() || url.protocolIs("mediastream") || g_getenv("WEBKIT_GST_USE_PLAYBIN3")))
    23972391        playbinName = "playbin3";
    23982392
     
    24742468
    24752469    g_object_set(m_pipeline.get(), "video-sink", createVideoSink(), "audio-sink", createAudioSink(), nullptr);
    2476 
    2477     configurePlaySink();
    24782470
    24792471    if (m_preservesPitch) {
     
    24962488            GST_WARNING("The videoflip element is missing, video rotation support is now disabled. Please check your gst-plugins-good installation.");
    24972489    }
    2498 
    2499     GRefPtr<GstPad> videoSinkPad = adoptGRef(gst_element_get_static_pad(m_videoSink.get(), "sink"));
    2500     if (videoSinkPad)
    2501         g_signal_connect_swapped(videoSinkPad.get(), "notify::caps", G_CALLBACK(videoSinkCapsChangedCallback), this);
    25022490}
    25032491
  • trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamer.h

    r247010 r249205  
    115115    void loadStateChanged();
    116116    void timeChanged();
    117     void didEnd();
    118117    virtual void durationChanged();
    119118    void loadingFailed(MediaPlayer::NetworkState, MediaPlayer::ReadyState = MediaPlayer::HaveNothing, bool forceNotifications = false);
     
    122121
    123122    GstElement* audioSink() const override;
    124     virtual void configurePlaySink() { }
    125123
    126124    void simulateAudioInterruption() override;
     
    207205    bool m_resetPipeline;
    208206    bool m_seeking;
    209     bool m_seekIsPending;
     207    bool m_seekIsPending; // Set when the user requests a seek but gst can't handle it yet, so it's deferred until we're >=PAUSED.
    210208    MediaTime m_seekTime;
    211209    GRefPtr<GstElement> m_source;
     
    215213
    216214    void notifyPlayerOfVideo();
    217     void notifyPlayerOfVideoCaps();
    218215    void notifyPlayerOfAudio();
    219216
     
    226223    void setAudioStreamProperties(GObject*);
    227224
     225    virtual void didEnd();
     226    void invalidateCachedPosition() { m_lastQueryTime.reset(); }
     227
    228228    static void setAudioStreamPropertiesCallback(MediaPlayerPrivateGStreamer*, GObject*);
    229229
    230230    static void sourceSetupCallback(MediaPlayerPrivateGStreamer*, GstElement*);
    231231    static void videoChangedCallback(MediaPlayerPrivateGStreamer*);
    232     static void videoSinkCapsChangedCallback(MediaPlayerPrivateGStreamer*);
    233232    static void audioChangedCallback(MediaPlayerPrivateGStreamer*);
    234233#if ENABLE(VIDEO_TRACK)
  • trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.cpp

    r249043 r249205  
    487487FloatSize MediaPlayerPrivateGStreamerBase::naturalSize() const
    488488{
     489    ASSERT(isMainThread());
    489490#if USE(GSTREAMER_HOLEPUNCH)
    490491    // When using the holepuch we may not be able to get the video frames size, so we can't use
     
    501502
    502503    auto sampleLocker = holdLock(m_sampleMutex);
     504
    503505    if (!GST_IS_SAMPLE(m_sample.get()))
    504506        return FloatSize();
     
    508510        return FloatSize();
    509511
     512    m_videoSize = naturalSizeFromCaps(caps);
     513    GST_DEBUG_OBJECT(pipeline(), "Natural size: %.0fx%.0f", m_videoSize.width(), m_videoSize.height());
     514    return m_videoSize;
     515}
     516
     517FloatSize MediaPlayerPrivateGStreamerBase::naturalSizeFromCaps(GstCaps* caps) const
     518{
     519    ASSERT(caps);
    510520
    511521    // TODO: handle possible clean aperture data. See
     
    558568    }
    559569
    560     GST_DEBUG_OBJECT(pipeline(), "Natural size: %" G_GUINT64_FORMAT "x%" G_GUINT64_FORMAT, width, height);
    561     m_videoSize = FloatSize(static_cast<int>(width), static_cast<int>(height));
    562     return m_videoSize;
     570    return FloatSize(static_cast<int>(width), static_cast<int>(height));
    563571}
    564572
     
    612620{
    613621    return m_readyState;
    614 }
    615 
    616 void MediaPlayerPrivateGStreamerBase::sizeChanged()
    617 {
    618     notImplemented();
    619622}
    620623
     
    750753}
    751754
     755bool MediaPlayerPrivateGStreamerBase::doSamplesHaveDifferentNaturalSizes(GstSample* sampleA, GstSample* sampleB) const
     756{
     757    ASSERT(sampleA);
     758    ASSERT(sampleB);
     759
     760    GstCaps* capsA = gst_sample_get_caps(sampleA);
     761    GstCaps* capsB = gst_sample_get_caps(sampleB);
     762
     763    if (LIKELY(capsA == capsB))
     764        return false;
     765
     766    return naturalSizeFromCaps(capsA) != naturalSizeFromCaps(capsB);
     767}
     768
    752769void MediaPlayerPrivateGStreamerBase::triggerRepaint(GstSample* sample)
    753770{
     
    755772    {
    756773        auto sampleLocker = holdLock(m_sampleMutex);
    757         triggerResize = !m_sample;
     774        triggerResize = !m_sample || doSamplesHaveDifferentNaturalSizes(m_sample.get(), sample);
     775        if (triggerResize)
     776            m_videoSize = FloatSize(); // Force re-calculation in next call to naturalSize().
    758777        m_sample = sample;
    759778    }
  • trunk/Source/WebCore/platform/graphics/gstreamer/MediaPlayerPrivateGStreamerBase.h

    r248762 r249205  
    121121    void setVisible(bool) override { }
    122122    void setSize(const IntSize&) override;
    123     void sizeChanged();
    124123
    125124    // Prefer MediaTime based methods over float based.
     
    250249    enum MainThreadNotification {
    251250        VideoChanged = 1 << 0,
    252         VideoCapsChanged = 1 << 1,
    253251        AudioChanged = 1 << 2,
    254252        VolumeChanged = 1 << 3,
     
    270268    mutable MediaPlayer::NetworkState m_networkState;
    271269    IntSize m_size;
     270
    272271    mutable Lock m_sampleMutex;
    273272    GRefPtr<GstSample> m_sample;
    274 
    275273    mutable FloatSize m_videoSize;
     274
    276275    bool m_usingFallbackVideoSink { false };
    277276    bool m_renderingCanBeAccelerated { false };
     
    310309    enum class WebKitGstVideoDecoderPlatform { Video4Linux };
    311310    Optional<WebKitGstVideoDecoderPlatform> m_videoDecoderPlatform;
     311
     312private:
     313    FloatSize naturalSizeFromCaps(GstCaps*) const;
     314    bool doSamplesHaveDifferentNaturalSizes(GstSample* sampleA, GstSample* sampleB) const;
    312315};
    313316
  • trunk/Source/WebCore/platform/graphics/gstreamer/MediaSampleGStreamer.cpp

    r246490 r249205  
    4444    auto createMediaTime =
    4545        [](GstClockTime time) -> MediaTime {
    46             return MediaTime(GST_TIME_AS_USECONDS(time), G_USEC_PER_SEC);
     46            return MediaTime(time, GST_SECOND);
    4747        };
    4848
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.cpp

    r248521 r249205  
    453453    }
    454454
     455    if (!GST_BUFFER_PTS_IS_VALID(gst_sample_get_buffer(sample.get()))) {
     456        // When demuxing Vorbis, matroskademux creates several PTS-less frames with header information. We don't need those.
     457        GST_DEBUG("Ignoring sample without PTS: %" GST_PTR_FORMAT, gst_sample_get_buffer(sample.get()));
     458        return;
     459    }
     460
    455461    auto mediaSample = WebCore::MediaSampleGStreamer::create(WTFMove(sample), m_presentationSize, trackId());
    456462
     
    741747    ASSERT(!gst_pad_is_linked(sinkSinkPad.get()));
    742748
     749    // As it is now, resetParserState() will cause the pads to be disconnected, so they will later be re-added on the next initialization segment.
     750    bool firstTimeConnectingTrack = m_track == nullptr;
     751
    743752    GRefPtr<GstCaps> caps = adoptGRef(gst_pad_get_current_caps(GST_PAD(demuxerSrcPad)));
    744753
     
    781790
    782791    m_appsinkCaps = WTFMove(caps);
    783     m_playerPrivate->trackDetected(this, m_track, true);
     792    m_playerPrivate->trackDetected(this, m_track, firstTimeConnectingTrack);
    784793}
    785794
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/AppendPipeline.h

    r246490 r249205  
    5353    void resetParserState();
    5454    Ref<SourceBufferPrivateGStreamer> sourceBufferPrivate() { return m_sourceBufferPrivate.get(); }
    55     GstCaps* appsinkCaps() { return m_appsinkCaps.get(); }
     55    const GRefPtr<GstCaps>& appsinkCaps() { return m_appsinkCaps; }
    5656    RefPtr<WebCore::TrackPrivateBase> track() { return m_track; }
     57    MediaSourceStreamTypeGStreamer streamType() { return m_streamType; }
    5758    MediaPlayerPrivateGStreamerMSE* playerPrivate() { return m_playerPrivate; }
    5859
     
    8182    GstElement* appsink() { return m_appsink.get(); }
    8283    GstCaps* demuxerSrcPadCaps() { return m_demuxerSrcPadCaps.get(); }
    83     WebCore::MediaSourceStreamTypeGStreamer streamType() { return m_streamType; }
    8484
    8585    void disconnectDemuxerSrcPadFromAppsinkFromAnyThread(GstPad*);
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.cpp

    r248846 r249205  
    44 * Copyright (C) 2007 Alp Toker <alp@atoker.com>
    55 * Copyright (C) 2009 Gustavo Noronha Silva <gns@gnome.org>
    6  * Copyright (C) 2009, 2010, 2011, 2012, 2013, 2016, 2017 Igalia S.L
     6 * Copyright (C) 2009, 2010, 2011, 2012, 2013, 2016, 2017, 2018, 2019 Igalia S.L
    77 * Copyright (C) 2015 Sebastian Dröge <sebastian@centricular.com>
    8  * Copyright (C) 2015, 2016, 2017 Metrological Group B.V.
     8 * Copyright (C) 2015, 2016, 2017, 2018, 2019 Metrological Group B.V.
    99 *
    1010 * This library is free software; you can redistribute it and/or
     
    3838#include "MediaPlayer.h"
    3939#include "NotImplemented.h"
    40 #include "PlaybackPipeline.h"
    4140#include "SourceBufferPrivateGStreamer.h"
    4241#include "TimeRanges.h"
     
    101100    m_appendPipelinesMap.clear();
    102101
    103     if (m_source) {
    104         webKitMediaSrcSetMediaPlayerPrivate(WEBKIT_MEDIA_SRC(m_source.get()), nullptr);
    105         g_signal_handlers_disconnect_by_data(m_source.get(), this);
    106     }
    107 
    108     if (m_playbackPipeline)
    109         m_playbackPipeline->setWebKitMediaSrc(nullptr);
     102    m_source.clear();
    110103}
    111104
     
    119112    }
    120113
    121     if (!m_playbackPipeline)
    122         m_playbackPipeline = PlaybackPipeline::create();
    123 
    124114    MediaPlayerPrivateGStreamer::load(urlString);
    125115}
     
    131121}
    132122
     123void MediaPlayerPrivateGStreamerMSE::play()
     124{
     125    GST_DEBUG_OBJECT(pipeline(), "Play requested");
     126    m_paused = false;
     127    updateStates();
     128}
     129
    133130void MediaPlayerPrivateGStreamerMSE::pause()
    134131{
     132    GST_DEBUG_OBJECT(pipeline(), "Pause requested");
    135133    m_paused = true;
    136     MediaPlayerPrivateGStreamer::pause();
     134    updateStates();
    137135}
    138136
     
    147145void MediaPlayerPrivateGStreamerMSE::seek(const MediaTime& time)
    148146{
    149     if (UNLIKELY(!m_pipeline || m_errorOccured))
    150         return;
    151 
    152     GST_INFO("[Seek] seek attempt to %s secs", toString(time).utf8().data());
    153 
    154     // Avoid useless seeking.
    155     MediaTime current = currentMediaTime();
    156     if (time == current) {
    157         if (!m_seeking)
    158             timeChanged();
    159         return;
    160     }
    161 
    162     if (isLiveStream())
    163         return;
    164 
    165     if (m_seeking && m_seekIsPending) {
    166         m_seekTime = time;
    167         return;
    168     }
    169 
    170     GST_DEBUG("Seeking from %s to %s seconds", toString(current).utf8().data(), toString(time).utf8().data());
    171 
    172     MediaTime previousSeekTime = m_seekTime;
    173147    m_seekTime = time;
    174 
    175     if (!doSeek()) {
    176         m_seekTime = previousSeekTime;
    177         GST_WARNING("Seeking to %s failed", toString(time).utf8().data());
    178         return;
    179     }
    180 
    181     m_isEndReached = false;
    182     GST_DEBUG("m_seeking=%s, m_seekTime=%s", boolForPrinting(m_seeking), toString(m_seekTime).utf8().data());
    183 }
    184 
    185 void MediaPlayerPrivateGStreamerMSE::configurePlaySink()
    186 {
    187     MediaPlayerPrivateGStreamer::configurePlaySink();
    188 
    189     GRefPtr<GstElement> playsink = adoptGRef(gst_bin_get_by_name(GST_BIN(m_pipeline.get()), "playsink"));
    190     if (playsink) {
    191         // The default value (0) means "send events to all the sinks", instead
    192         // of "only to the first that returns true". This is needed for MSE seek.
    193         g_object_set(G_OBJECT(playsink.get()), "send-event-mode", 0, nullptr);
    194     }
    195 }
    196 
    197 bool MediaPlayerPrivateGStreamerMSE::changePipelineState(GstState newState)
    198 {
    199     if (seeking()) {
    200         GST_DEBUG("Rejected state change to %s while seeking",
    201             gst_element_state_get_name(newState));
    202         return true;
    203     }
    204 
    205     return MediaPlayerPrivateGStreamer::changePipelineState(newState);
    206 }
    207 
    208 void MediaPlayerPrivateGStreamerMSE::notifySeekNeedsDataForTime(const MediaTime& seekTime)
    209 {
    210     // Reenqueue samples needed to resume playback in the new position.
    211     m_mediaSource->seekToTime(seekTime);
    212 
    213     GST_DEBUG("MSE seek to %s finished", toString(seekTime).utf8().data());
    214 
    215     if (!m_gstSeekCompleted) {
    216         m_gstSeekCompleted = true;
    217         maybeFinishSeek();
    218     }
    219 }
    220 
    221 bool MediaPlayerPrivateGStreamerMSE::doSeek(const MediaTime&, float, GstSeekFlags)
    222 {
    223     // Use doSeek() instead. If anybody is calling this version of doSeek(), something is wrong.
    224     ASSERT_NOT_REACHED();
    225     return false;
    226 }
    227 
    228 bool MediaPlayerPrivateGStreamerMSE::doSeek()
    229 {
    230     MediaTime seekTime = m_seekTime;
    231     double rate = m_player->rate();
    232     GstSeekFlags seekType = static_cast<GstSeekFlags>(GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_ACCURATE);
    233 
    234     // Always move to seeking state to report correct 'currentTime' while pending for actual seek to complete.
    235148    m_seeking = true;
    236149
    237     // Check if playback pipeline is ready for seek.
    238     GstState state, newState;
    239     GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &state, &newState, 0);
    240     if (getStateResult == GST_STATE_CHANGE_FAILURE || getStateResult == GST_STATE_CHANGE_NO_PREROLL) {
    241         GST_DEBUG("[Seek] cannot seek, current state change is %s", gst_element_state_change_return_get_name(getStateResult));
    242         webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
    243         m_seeking = false;
    244         return false;
    245     }
    246     if ((getStateResult == GST_STATE_CHANGE_ASYNC
    247         && !(state == GST_STATE_PLAYING && newState == GST_STATE_PAUSED))
    248         || state < GST_STATE_PAUSED
    249         || m_isEndReached
    250         || !m_gstSeekCompleted) {
    251         CString reason = "Unknown reason";
    252         if (getStateResult == GST_STATE_CHANGE_ASYNC) {
    253             reason = makeString("In async change ",
    254                 gst_element_state_get_name(state), " --> ",
    255                 gst_element_state_get_name(newState)).utf8();
    256         } else if (state < GST_STATE_PAUSED)
    257             reason = "State less than PAUSED";
    258         else if (m_isEndReached)
    259             reason = "End reached";
    260         else if (!m_gstSeekCompleted)
    261             reason = "Previous seek is not finished yet";
    262 
    263         GST_DEBUG("[Seek] Delaying the seek: %s", reason.data());
    264 
    265         m_seekIsPending = true;
    266 
    267         if (m_isEndReached) {
    268             GST_DEBUG("[Seek] reset pipeline");
    269             m_resetPipeline = true;
    270             m_seeking = false;
    271             if (!changePipelineState(GST_STATE_PAUSED))
    272                 loadingFailed(MediaPlayer::Empty);
    273             else
    274                 m_seeking = true;
    275         }
    276 
    277         return m_seeking;
    278     }
    279 
    280     // Stop accepting new samples until actual seek is finished.
    281     webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), false);
    282 
    283     // Correct seek time if it helps to fix a small gap.
    284     if (!isTimeBuffered(seekTime)) {
    285         // Look if a near future time (<0.1 sec.) is buffered and change the seek target time.
    286         if (m_mediaSource) {
    287             const MediaTime miniGap = MediaTime(1, 10);
    288             MediaTime nearest = m_mediaSource->buffered()->nearest(seekTime);
    289             if (nearest.isValid() && nearest > seekTime && (nearest - seekTime) <= miniGap && isTimeBuffered(nearest + miniGap)) {
    290                 GST_DEBUG("[Seek] Changed the seek target time from %s to %s, a near point in the future", toString(seekTime).utf8().data(), toString(nearest).utf8().data());
    291                 seekTime = nearest;
    292             }
    293         }
    294     }
    295 
    296     // Check if MSE has samples for requested time and defer actual seek if needed.
    297     if (!isTimeBuffered(seekTime)) {
    298         GST_DEBUG("[Seek] Delaying the seek: MSE is not ready");
    299         GstStateChangeReturn setStateResult = gst_element_set_state(m_pipeline.get(), GST_STATE_PAUSED);
    300         if (setStateResult == GST_STATE_CHANGE_FAILURE) {
    301             GST_DEBUG("[Seek] Cannot seek, failed to pause playback pipeline.");
    302             webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
    303             m_seeking = false;
    304             return false;
    305         }
    306         m_readyState = MediaPlayer::HaveMetadata;
    307         notifySeekNeedsDataForTime(seekTime);
    308         ASSERT(!m_mseSeekCompleted);
    309         return true;
    310     }
    311 
    312     // Complete previous MSE seek if needed.
    313     if (!m_mseSeekCompleted) {
    314         m_mediaSource->monitorSourceBuffers();
    315         ASSERT(m_mseSeekCompleted);
    316         // Note: seekCompleted will recursively call us.
    317         return m_seeking;
    318     }
    319 
    320     GST_DEBUG("We can seek now");
    321 
    322     MediaTime startTime = seekTime, endTime = MediaTime::invalidTime();
    323 
    324     if (rate < 0) {
    325         startTime = MediaTime::zeroTime();
    326         endTime = seekTime;
    327     }
    328 
    329     if (!rate)
    330         rate = 1;
    331 
    332     GST_DEBUG("Actual seek to %s, end time:  %s, rate: %f", toString(startTime).utf8().data(), toString(endTime).utf8().data(), rate);
    333 
    334     // This will call notifySeekNeedsData() after some time to tell that the pipeline is ready for sample enqueuing.
    335     webKitMediaSrcPrepareSeek(WEBKIT_MEDIA_SRC(m_source.get()), seekTime);
    336 
    337     m_gstSeekCompleted = false;
    338     if (!gst_element_seek(m_pipeline.get(), rate, GST_FORMAT_TIME, seekType, GST_SEEK_TYPE_SET, toGstClockTime(startTime), GST_SEEK_TYPE_SET, toGstClockTime(endTime))) {
    339         webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
    340         m_seeking = false;
    341         m_gstSeekCompleted = true;
    342         GST_DEBUG("doSeek(): gst_element_seek() failed, returning false");
    343         return false;
    344     }
    345 
    346     // The samples will be enqueued in notifySeekNeedsData().
    347     GST_DEBUG("doSeek(): gst_element_seek() succeeded, returning true");
    348     return true;
    349 }
    350 
    351 void MediaPlayerPrivateGStreamerMSE::maybeFinishSeek()
    352 {
    353     if (!m_seeking || !m_mseSeekCompleted || !m_gstSeekCompleted)
    354         return;
    355 
    356     GstState state, newState;
    357     GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &state, &newState, 0);
    358 
    359     if (getStateResult == GST_STATE_CHANGE_ASYNC
    360         && !(state == GST_STATE_PLAYING && newState == GST_STATE_PAUSED)) {
    361         GST_DEBUG("[Seek] Delaying seek finish");
    362         return;
    363     }
    364 
    365     if (m_seekIsPending) {
    366         GST_DEBUG("[Seek] Committing pending seek to %s", toString(m_seekTime).utf8().data());
    367         m_seekIsPending = false;
    368         if (!doSeek()) {
    369             GST_WARNING("[Seek] Seeking to %s failed", toString(m_seekTime).utf8().data());
    370             m_cachedPosition = MediaTime::invalidTime();
    371         }
    372         return;
    373     }
    374 
    375     GST_DEBUG("[Seek] Seeked to %s", toString(m_seekTime).utf8().data());
    376 
    377     webKitMediaSrcSetReadyForSamples(WEBKIT_MEDIA_SRC(m_source.get()), true);
     150    webKitMediaSrcSeek(WEBKIT_MEDIA_SRC(m_source.get()), toGstClockTime(m_seekTime), m_playbackRate);
     151
     152    invalidateCachedPosition();
     153    m_canFallBackToLastFinishedSeekPosition = true;
     154
     155    // Notify MediaSource and have new frames enqueued (when they're available).
     156    m_mediaSource->seekToTime(time);
     157}
     158
     159void MediaPlayerPrivateGStreamerMSE::reportSeekCompleted()
     160{
    378161    m_seeking = false;
    379     m_cachedPosition = MediaTime::invalidTime();
    380     // The pipeline can still have a pending state. In this case a position query will fail.
    381     // Right now we can use m_seekTime as a fallback.
    382     m_canFallBackToLastFinishedSeekPosition = true;
    383     timeChanged();
    384 }
    385 
    386 void MediaPlayerPrivateGStreamerMSE::updatePlaybackRate()
    387 {
    388     notImplemented();
    389 }
    390 
    391 bool MediaPlayerPrivateGStreamerMSE::seeking() const
    392 {
    393     return m_seeking;
    394 }
    395 
    396 // FIXME: MediaPlayerPrivateGStreamer manages the ReadyState on its own. We shouldn't change it manually.
     162    m_player->timeChanged();
     163}
     164
    397165void MediaPlayerPrivateGStreamerMSE::setReadyState(MediaPlayer::ReadyState readyState)
    398166{
     
    400168        return;
    401169
    402     if (seeking()) {
    403         GST_DEBUG("Skip ready state change(%s -> %s) due to seek\n", dumpReadyState(m_readyState), dumpReadyState(readyState));
    404         return;
    405     }
    406 
    407     GST_DEBUG("Ready State Changed manually from %u to %u", m_readyState, readyState);
    408     MediaPlayer::ReadyState oldReadyState = m_readyState;
     170    GST_DEBUG("MediaPlayerPrivateGStreamerMSE::setReadyState(%p): %s -> %s", this, dumpReadyState(m_readyState), dumpReadyState(readyState));
    409171    m_readyState = readyState;
    410     GST_DEBUG("m_readyState: %s -> %s", dumpReadyState(oldReadyState), dumpReadyState(m_readyState));
    411 
    412     if (oldReadyState < MediaPlayer::HaveCurrentData && m_readyState >= MediaPlayer::HaveCurrentData) {
    413         GST_DEBUG("[Seek] Reporting load state changed to trigger seek continuation");
    414         loadStateChanged();
    415     }
     172    updateStates();
     173
     174    // Both readyStateChanged() and timeChanged() check for "seeked" condition, which requires all the following three things:
     175    //   1. HTMLMediaPlayer.m_seekRequested == true.
     176    //   2. Our seeking() method to return false (that is, we have completed the seek).
     177    //   3. readyState > HaveMetadata.
     178    //
     179    // We normally would set m_seeking = false in seekCompleted(), but unfortunately by that time, playback has already
     180    // started which means that the "playing" event is emitted before "seeked". In order to avoid that wrong order,
     181    // we do it here already.
     182    if (m_seeking && readyState > MediaPlayer::ReadyState::HaveMetadata)
     183        m_seeking = false;
    416184    m_player->readyStateChanged();
    417185
    418     GstState pipelineState;
    419     GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &pipelineState, nullptr, 250 * GST_NSECOND);
    420     bool isPlaying = (getStateResult == GST_STATE_CHANGE_SUCCESS && pipelineState == GST_STATE_PLAYING);
    421 
    422     if (m_readyState == MediaPlayer::HaveMetadata && oldReadyState > MediaPlayer::HaveMetadata && isPlaying) {
    423         GST_TRACE("Changing pipeline to PAUSED...");
    424         bool ok = changePipelineState(GST_STATE_PAUSED);
    425         GST_TRACE("Changed pipeline to PAUSED: %s", ok ? "Success" : "Error");
    426     }
    427 }
    428 
    429 void MediaPlayerPrivateGStreamerMSE::waitForSeekCompleted()
    430 {
    431     if (!m_seeking)
    432         return;
    433 
    434     GST_DEBUG("Waiting for MSE seek completed");
    435     m_mseSeekCompleted = false;
    436 }
    437 
    438 void MediaPlayerPrivateGStreamerMSE::seekCompleted()
    439 {
    440     if (m_mseSeekCompleted)
    441         return;
    442 
    443     GST_DEBUG("MSE seek completed");
    444     m_mseSeekCompleted = true;
    445 
    446     doSeek();
    447 
    448     if (!seeking() && m_readyState >= MediaPlayer::HaveFutureData)
    449         changePipelineState(GST_STATE_PLAYING);
    450 
    451     if (!seeking())
    452         m_player->timeChanged();
     186    // The readyState change may be a result of monitorSourceBuffers() finding that currentTime == duration, which
     187    // should cause the video to be marked as ended. Let's have the player check that.
     188    m_player->timeChanged();
    453189}
    454190
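
The ordering constraint spelled out in the comment above can be reduced to a small self-contained sketch. This is an illustration only, not code from the changeset: ReadyState and SeekedCheck are simplified stand-ins for the HTMLMediaElement/MediaPlayer state involved.

    // Illustrative sketch only: simplified stand-ins for the real HTMLMediaElement state.
    #include <cassert>

    enum class ReadyState { HaveNothing, HaveMetadata, HaveCurrentData, HaveFutureData, HaveEnoughData };

    struct SeekedCheck {
        bool seekRequested { false };                      // 1. HTMLMediaElement requested a seek.
        bool seeking { true };                             // 2. Player private still seeking?
        ReadyState readyState { ReadyState::HaveNothing }; // 3. Current ready state.

        // "seeked" may only fire once all three conditions hold.
        bool shouldFireSeeked() const
        {
            return seekRequested && !seeking && readyState > ReadyState::HaveMetadata;
        }
    };

    int main()
    {
        SeekedCheck check;
        check.seekRequested = true;

        // Still seeking and only HaveMetadata: "seeked" must not fire yet.
        assert(!check.shouldFireSeeked());

        // Clearing `seeking` before readyStateChanged() is what lets "seeked"
        // be dispatched before "playing" once the ready state advances.
        check.seeking = false;
        check.readyState = ReadyState::HaveCurrentData;
        assert(check.shouldFireSeeked());
        return 0;
    }
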
     
    466202{
    467203    m_source = sourceElement;
    468 
    469204    ASSERT(WEBKIT_IS_MEDIA_SRC(m_source.get()));
    470 
    471     m_playbackPipeline->setWebKitMediaSrc(WEBKIT_MEDIA_SRC(m_source.get()));
    472 
    473205    MediaSourceGStreamer::open(*m_mediaSource.get(), *this);
    474     g_signal_connect_swapped(m_source.get(), "video-changed", G_CALLBACK(videoChangedCallback), this);
    475     g_signal_connect_swapped(m_source.get(), "audio-changed", G_CALLBACK(audioChangedCallback), this);
    476     g_signal_connect_swapped(m_source.get(), "text-changed", G_CALLBACK(textChangedCallback), this);
    477     webKitMediaSrcSetMediaPlayerPrivate(WEBKIT_MEDIA_SRC(m_source.get()), this);
    478206}
    479207
    480208void MediaPlayerPrivateGStreamerMSE::updateStates()
    481209{
    482     if (UNLIKELY(!m_pipeline || m_errorOccured))
    483         return;
    484 
    485     MediaPlayer::NetworkState oldNetworkState = m_networkState;
    486     MediaPlayer::ReadyState oldReadyState = m_readyState;
    487     GstState state, pending;
    488 
    489     GstStateChangeReturn getStateResult = gst_element_get_state(m_pipeline.get(), &state, &pending, 250 * GST_NSECOND);
    490 
    491     bool shouldUpdatePlaybackState = false;
    492     switch (getStateResult) {
    493     case GST_STATE_CHANGE_SUCCESS: {
    494         GST_DEBUG("State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
    495 
    496         // Do nothing if on EOS and state changed to READY to avoid recreating the player
    497         // on HTMLMediaElement and properly generate the video 'ended' event.
    498         if (m_isEndReached && state == GST_STATE_READY)
    499             break;
    500 
    501         m_resetPipeline = (state <= GST_STATE_READY);
    502         if (m_resetPipeline)
    503             m_mediaTimeDuration = MediaTime::zeroTime();
    504 
    505         // Update ready and network states.
    506         switch (state) {
    507         case GST_STATE_NULL:
    508             m_readyState = MediaPlayer::HaveNothing;
    509             GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
    510             m_networkState = MediaPlayer::Empty;
    511             break;
    512         case GST_STATE_READY:
    513             m_readyState = MediaPlayer::HaveMetadata;
    514             GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
    515             m_networkState = MediaPlayer::Empty;
    516             break;
    517         case GST_STATE_PAUSED:
    518         case GST_STATE_PLAYING:
    519             if (seeking()) {
    520                 m_readyState = MediaPlayer::HaveMetadata;
    521                 // FIXME: Should we manage NetworkState too?
    522                 GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
    523             } else {
    524                 if (m_readyState < MediaPlayer::HaveFutureData)
    525                     m_readyState = MediaPlayer::HaveFutureData;
    526                 GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
    527                 m_networkState = MediaPlayer::Loading;
    528             }
    529 
    530             if (m_eosMarked && state == GST_STATE_PLAYING)
    531                 m_eosPending = true;
    532 
    533             break;
    534         default:
    535             ASSERT_NOT_REACHED();
    536             break;
    537         }
    538 
    539         // Sync states where needed.
    540         if (state == GST_STATE_PAUSED) {
    541             if (!m_volumeAndMuteInitialized) {
    542                 notifyPlayerOfVolumeChange();
    543                 notifyPlayerOfMute();
    544                 m_volumeAndMuteInitialized = true;
    545             }
    546 
    547             if (!seeking() && !m_paused && m_playbackRate) {
    548                 GST_DEBUG("[Buffering] Restarting playback.");
    549                 changePipelineState(GST_STATE_PLAYING);
    550             }
    551         } else if (state == GST_STATE_PLAYING) {
    552             m_paused = false;
    553 
    554             if (!m_playbackRate) {
    555                 GST_DEBUG("[Buffering] Pausing stream for buffering.");
    556                 changePipelineState(GST_STATE_PAUSED);
    557             }
    558         } else
    559             m_paused = true;
    560 
    561         if (m_requestedState == GST_STATE_PAUSED && state == GST_STATE_PAUSED) {
    562             shouldUpdatePlaybackState = true;
    563             GST_DEBUG("Requested state change to %s was completed", gst_element_state_get_name(state));
    564         }
    565 
    566         break;
    567     }
    568     case GST_STATE_CHANGE_ASYNC:
    569         GST_DEBUG("Async: State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
    570         // Change in progress.
    571         break;
    572     case GST_STATE_CHANGE_FAILURE:
    573         GST_WARNING("Failure: State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
    574         // Change failed.
    575         return;
    576     case GST_STATE_CHANGE_NO_PREROLL:
    577         GST_DEBUG("No preroll: State: %s, pending: %s", gst_element_state_get_name(state), gst_element_state_get_name(pending));
    578 
    579         // Live pipelines go in PAUSED without prerolling.
    580         m_isStreaming = true;
    581 
    582         if (state == GST_STATE_READY) {
    583             m_readyState = MediaPlayer::HaveNothing;
    584             GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
    585         } else if (state == GST_STATE_PAUSED) {
    586             m_readyState = MediaPlayer::HaveEnoughData;
    587             GST_DEBUG("m_readyState=%s", dumpReadyState(m_readyState));
    588             m_paused = true;
    589         } else if (state == GST_STATE_PLAYING)
    590             m_paused = false;
    591 
    592         if (!m_paused && m_playbackRate)
    593             changePipelineState(GST_STATE_PLAYING);
    594 
    595         m_networkState = MediaPlayer::Loading;
    596         break;
    597     default:
    598         GST_DEBUG("Else : %d", getStateResult);
    599         break;
    600     }
    601 
    602     m_requestedState = GST_STATE_VOID_PENDING;
    603 
    604     if (shouldUpdatePlaybackState)
    605         m_player->playbackStateChanged();
    606 
    607     if (m_networkState != oldNetworkState) {
    608         GST_DEBUG("Network State Changed from %u to %u", oldNetworkState, m_networkState);
    609         m_player->networkStateChanged();
    610     }
    611     if (m_readyState != oldReadyState) {
    612         GST_DEBUG("Ready State Changed from %u to %u", oldReadyState, m_readyState);
    613         m_player->readyStateChanged();
    614     }
    615 
    616     if (getStateResult == GST_STATE_CHANGE_SUCCESS && state >= GST_STATE_PAUSED) {
    617         updatePlaybackRate();
    618         maybeFinishSeek();
    619     }
    620 }
    621 void MediaPlayerPrivateGStreamerMSE::asyncStateChangeDone()
    622 {
    623     if (UNLIKELY(!m_pipeline || m_errorOccured))
    624         return;
    625 
    626     if (m_seeking)
    627         maybeFinishSeek();
    628     else
    629         updateStates();
     210    bool shouldBePlaying = !m_paused && readyState() >= MediaPlayer::ReadyState::HaveFutureData;
     211    GST_DEBUG_OBJECT(pipeline(), "shouldBePlaying = %d, m_isPipelinePlaying = %d", static_cast<int>(shouldBePlaying), static_cast<int>(m_isPipelinePlaying));
     212    if (shouldBePlaying && !m_isPipelinePlaying) {
     213        if (!changePipelineState(GST_STATE_PLAYING))
     214            GST_ERROR_OBJECT(pipeline(), "Setting the pipeline to PLAYING failed");
     215        m_isPipelinePlaying = true;
     216    } else if (!shouldBePlaying && m_isPipelinePlaying) {
     217        if (!changePipelineState(GST_STATE_PAUSED))
     218            GST_ERROR_OBJECT(pipeline(), "Setting the pipeline to PAUSED failed");
     219        m_isPipelinePlaying = false;
     220    }
     221}
     222
     223void MediaPlayerPrivateGStreamerMSE::didEnd()
     224{
     225    GST_DEBUG_OBJECT(pipeline(), "EOS received, currentTime=%s duration=%s", currentMediaTime().toString().utf8().data(), durationMediaTime().toString().utf8().data());
     226    m_isEndReached = true;
     227    invalidateCachedPosition();
     228    // HTMLMediaElement will emit ended if currentTime >= duration (which should now be the case).
     229    ASSERT(currentMediaTime() == durationMediaTime());
     230    m_player->timeChanged();
    630231}
    631232
     
    642243}
    643244
    644 RefPtr<MediaSourceClientGStreamerMSE> MediaPlayerPrivateGStreamerMSE::mediaSourceClient()
    645 {
    646     return m_mediaSourceClient;
    647 }
    648 
    649245void MediaPlayerPrivateGStreamerMSE::blockDurationChanges()
    650246{
     
    659255    if (m_shouldReportDurationWhenUnblocking) {
    660256        m_player->durationChanged();
    661         m_playbackPipeline->notifyDurationChanged();
    662257        m_shouldReportDurationWhenUnblocking = false;
    663258    }
     
    682277    // by the HTMLMediaElement.
    683278    if (m_mediaTimeDuration != previousDuration && m_mediaTimeDuration.isValid() && previousDuration.isValid()) {
    684         if (!m_areDurationChangesBlocked) {
     279        if (!m_areDurationChangesBlocked)
    685280            m_player->durationChanged();
    686             m_playbackPipeline->notifyDurationChanged();
    687         } else
     281        else
    688282            m_shouldReportDurationWhenUnblocking = true;
    689283        m_mediaSource->durationChanged(m_mediaTimeDuration);
     
    695289    ASSERT(appendPipeline->track() == newTrack);
    696290
    697     GstCaps* caps = appendPipeline->appsinkCaps();
     291    GRefPtr<GstCaps> caps = appendPipeline->appsinkCaps();
    698292    ASSERT(caps);
    699     GST_DEBUG("track ID: %s, caps: %" GST_PTR_FORMAT, newTrack->id().string().latin1().data(), caps);
    700 
    701     if (doCapsHaveType(caps, GST_VIDEO_CAPS_TYPE_PREFIX)) {
    702         Optional<FloatSize> size = getVideoResolutionFromCaps(caps);
     293    GST_DEBUG("track ID: %s, caps: %" GST_PTR_FORMAT, newTrack->id().string().latin1().data(), caps.get());
     294
     295    if (doCapsHaveType(caps.get(), GST_VIDEO_CAPS_TYPE_PREFIX)) {
     296        Optional<FloatSize> size = getVideoResolutionFromCaps(caps.get());
    703297        if (size.hasValue())
    704298            m_videoSize = size.value();
     
    706300
    707301    if (firstTrackDetected)
    708         m_playbackPipeline->attachTrack(appendPipeline->sourceBufferPrivate(), newTrack, caps);
    709     else
    710         m_playbackPipeline->reattachTrack(appendPipeline->sourceBufferPrivate(), newTrack, caps);
     302        webKitMediaSrcAddStream(WEBKIT_MEDIA_SRC(m_source.get()), newTrack->id(), appendPipeline->streamType(), WTFMove(caps));
    711303}
    712304
     
    743335    GST_DEBUG("Supported: %s", convertEnumerationToString(finalResult).utf8().data());
    744336    return finalResult;
    745 }
    746 
    747 void MediaPlayerPrivateGStreamerMSE::markEndOfStream(MediaSourcePrivate::EndOfStreamStatus status)
    748 {
    749     if (status != MediaSourcePrivate::EosNoError)
    750         return;
    751 
    752     GST_DEBUG("Marking end of stream");
    753     m_eosMarked = true;
    754     updateStates();
    755 }
    756 
    757 MediaTime MediaPlayerPrivateGStreamerMSE::currentMediaTime() const
    758 {
    759     MediaTime position = MediaPlayerPrivateGStreamer::currentMediaTime();
    760 
    761     if (m_eosPending && position >= durationMediaTime()) {
    762         if (m_networkState != MediaPlayer::Loaded) {
    763             m_networkState = MediaPlayer::Loaded;
    764             m_player->networkStateChanged();
    765         }
    766 
    767         m_eosPending = false;
    768         m_isEndReached = true;
    769         m_cachedPosition = m_mediaTimeDuration;
    770         m_player->timeChanged();
    771     }
    772     return position;
    773337}
    774338
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/MediaPlayerPrivateGStreamerMSE.h

    r247010 r249205  
    5656
    5757    bool isLiveStream() const override { return false; }
    58     MediaTime currentMediaTime() const override;
    5958
     59    void play() override;
    6060    void pause() override;
    61     bool seeking() const override;
    6261    void seek(const MediaTime&) override;
    63     void configurePlaySink() override;
    64     bool changePipelineState(GstState) override;
     62    void reportSeekCompleted();
     63    void updatePipelineState(GstState);
    6564
    6665    void durationChanged() override;
     
    7473
    7574    void setReadyState(MediaPlayer::ReadyState);
    76     void waitForSeekCompleted();
    77     void seekCompleted();
    7875    MediaSourcePrivateClient* mediaSourcePrivateClient() { return m_mediaSource.get(); }
    7976
    80     void markEndOfStream(MediaSourcePrivate::EndOfStreamStatus);
    81 
    8277    void trackDetected(RefPtr<AppendPipeline>, RefPtr<WebCore::TrackPrivateBase>, bool firstTrackDetected);
    83     void notifySeekNeedsDataForTime(const MediaTime&);
    8478
    8579    void blockDurationChanges();
    8680    void unblockDurationChanges();
     81
     82    void asyncStateChangeDone() override { }
     83
     84protected:
     85    void didEnd() override;
    8786
    8887private:
     
    9089    static MediaPlayer::SupportsType supportsType(const MediaEngineSupportParameters&);
    9190
    92     // FIXME: Reduce code duplication.
    9391    void updateStates() override;
    94 
    95     bool doSeek(const MediaTime&, float, GstSeekFlags) override;
    96     bool doSeek();
    97     void maybeFinishSeek();
    98     void updatePlaybackRate() override;
    99     void asyncStateChangeDone() override;
    10092
    10193    // FIXME: Implement videoPlaybackQualityMetrics.
     
    10597
    10698    void setMediaSourceClient(Ref<MediaSourceClientGStreamerMSE>);
    107     RefPtr<MediaSourceClientGStreamerMSE> mediaSourceClient();
    10899
    109100    HashMap<RefPtr<SourceBufferPrivateGStreamer>, RefPtr<AppendPipeline>> m_appendPipelinesMap;
    110     bool m_eosMarked = false;
    111     mutable bool m_eosPending = false;
    112     bool m_gstSeekCompleted = true;
    113101    RefPtr<MediaSourcePrivateClient> m_mediaSource;
    114102    RefPtr<MediaSourceClientGStreamerMSE> m_mediaSourceClient;
    115103    MediaTime m_mediaTimeDuration;
    116     bool m_mseSeekCompleted = true;
    117104    bool m_areDurationChangesBlocked = false;
    118105    bool m_shouldReportDurationWhenUnblocking = false;
    119     RefPtr<PlaybackPipeline> m_playbackPipeline;
     106    bool m_isPipelinePlaying = true;
    120107};
    121108
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.cpp

    r246490 r249205  
    2424#include "AppendPipeline.h"
    2525#include "MediaPlayerPrivateGStreamerMSE.h"
    26 #include "PlaybackPipeline.h"
    2726#include "WebKitMediaSourceGStreamer.h"
    2827#include <gst/gst.h>
     
    6160    ASSERT(WTF::isMainThread());
    6261
    63     ASSERT(m_playerPrivate.m_playbackPipeline);
    6462    ASSERT(sourceBufferPrivate);
    6563
     
    6866    m_playerPrivate.m_appendPipelinesMap.add(sourceBufferPrivate, appendPipeline);
    6967
    70     return m_playerPrivate.m_playbackPipeline->addSourceBuffer(sourceBufferPrivate);
     68    return MediaSourcePrivate::Ok;
    7169}
    7270
     
    138136}
    139137
    140 void MediaSourceClientGStreamerMSE::markEndOfStream(MediaSourcePrivate::EndOfStreamStatus status)
    141 {
    142     ASSERT(WTF::isMainThread());
    143 
    144     m_playerPrivate.markEndOfStream(status);
    145 }
    146 
    147138void MediaSourceClientGStreamerMSE::removedFromMediaSource(RefPtr<SourceBufferPrivateGStreamer> sourceBufferPrivate)
    148139{
    149140    ASSERT(WTF::isMainThread());
    150 
    151     ASSERT(m_playerPrivate.m_playbackPipeline);
    152141
    153142    // Remove the AppendPipeline from the map. This should cause its destruction since there should be no alive
     
    156145    m_playerPrivate.m_appendPipelinesMap.remove(sourceBufferPrivate);
    157146
    158     m_playerPrivate.m_playbackPipeline->removeSourceBuffer(sourceBufferPrivate);
     147    if (!sourceBufferPrivate->trackId().isNull())
     148        webKitMediaSrcRemoveStream(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), sourceBufferPrivate->trackId());
    159149}
    160150
     
    165155    // This is only for on-the-fly reenqueues after appends. When seeking, the seek will do its own flush.
    166156    if (!m_playerPrivate.m_seeking)
    167         m_playerPrivate.m_playbackPipeline->flush(trackId);
    168 }
    169 
    170 void MediaSourceClientGStreamerMSE::enqueueSample(Ref<MediaSample>&& sample)
    171 {
    172     ASSERT(WTF::isMainThread());
    173 
    174     m_playerPrivate.m_playbackPipeline->enqueueSample(WTFMove(sample));
     157        webKitMediaSrcFlush(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId);
     158}
     159
     160void MediaSourceClientGStreamerMSE::enqueueSample(Ref<MediaSample>&& sample, AtomString trackId)
     161{
     162    ASSERT(WTF::isMainThread());
     163
     164    GST_TRACE("enqueing sample trackId=%s PTS=%f presentationSize=%.0fx%.0f at %" GST_TIME_FORMAT " duration: %" GST_TIME_FORMAT,
     165        trackId.string().utf8().data(), sample->presentationTime().toFloat(),
     166        sample->presentationSize().width(), sample->presentationSize().height(),
     167        GST_TIME_ARGS(WebCore::toGstClockTime(sample->presentationTime())),
     168        GST_TIME_ARGS(WebCore::toGstClockTime(sample->duration())));
     169
     170    GRefPtr<GstSample> gstSample = sample->platformSample().sample.gstSample;
     171    ASSERT(gstSample);
     172    ASSERT(gst_sample_get_buffer(gstSample.get()));
     173
     174    webKitMediaSrcEnqueueSample(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId, WTFMove(gstSample));
     175}
     176
     177bool MediaSourceClientGStreamerMSE::isReadyForMoreSamples(const AtomString& trackId)
     178{
     179    return webKitMediaSrcIsReadyForMoreSamples(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId);
     180}
     181
     182void MediaSourceClientGStreamerMSE::notifyClientWhenReadyForMoreSamples(const AtomString& trackId, SourceBufferPrivateClient* sourceBuffer)
     183{
     184    webKitMediaSrcNotifyWhenReadyForMoreSamples(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId, sourceBuffer);
    175185}
    176186
     
    179189    ASSERT(WTF::isMainThread());
    180190
    181     m_playerPrivate.m_playbackPipeline->allSamplesInTrackEnqueued(trackId);
     191    webKitMediaSrcEndOfStream(WEBKIT_MEDIA_SRC(m_playerPrivate.m_source.get()), trackId);
    182192}
    183193
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceClientGStreamerMSE.h

    r246490 r249205  
    4444    MediaSourcePrivate::AddStatus addSourceBuffer(RefPtr<SourceBufferPrivateGStreamer>, const ContentType&);
    4545    void durationChanged(const MediaTime&);
    46     void markEndOfStream(MediaSourcePrivate::EndOfStreamStatus);
    4746
    4847    // From SourceBufferPrivateGStreamer.
     
    5251    void removedFromMediaSource(RefPtr<SourceBufferPrivateGStreamer>);
    5352    void flush(AtomString);
    54     void enqueueSample(Ref<MediaSample>&&);
     53    void enqueueSample(Ref<MediaSample>&&, AtomString trackId);
    5554    void allSamplesInTrackEnqueued(const AtomString&);
     55
     56    bool isReadyForMoreSamples(const AtomString&);
     57    void notifyClientWhenReadyForMoreSamples(const AtomString&, SourceBufferPrivateClient*);
    5658
    5759    const MediaTime& duration();
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.cpp

    r216702 r249205  
    9393}
    9494
    95 void MediaSourceGStreamer::markEndOfStream(EndOfStreamStatus status)
     95void MediaSourceGStreamer::markEndOfStream(EndOfStreamStatus)
    9696{
    97     m_client->markEndOfStream(status);
      97    // We don't need to do anything in the AppendPipeline or the playback pipeline. Instead, SourceBuffer knows better
     98    // when .endOfStream() has been called and there are no more samples to enqueue, which it will signal with a call
     99    // to SourceBufferPrivateGStreamer::allSamplesInTrackEnqueued(), where we enqueue an EOS event into WebKitMediaSrc.
     100
     101    // At this point it would be dangerously early to do that! There may be samples waiting to reach WebKitMediaSrc
      102    // (e.g. because the high water level is hit) that will not be shown if we enqueue an EOS now.
    98103}
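
The ordering described in the comment above can be condensed into a minimal sketch, assuming a simplified per-track queue. TrackQueue is a made-up type for illustration; the real queue lives inside WebKitMediaSrc's Stream.

    // Minimal sketch, not changeset code: EOS must only be appended once every
    // pending sample for the track has been enqueued.
    #include <cassert>
    #include <deque>
    #include <string>

    struct TrackQueue {
        std::deque<std::string> items; // Pending samples, optionally followed by "EOS".

        void enqueueSample(const std::string& sample) { items.push_back(sample); }
        void enqueueEndOfStream() { items.push_back("EOS"); }
    };

    int main()
    {
        TrackQueue queue;
        // endOfStream() has been called, but samples may still be on their way to the
        // source element; enqueuing EOS at markEndOfStream() time would get ahead of them.
        queue.enqueueSample("pending frame");
        // Only when allSamplesInTrackEnqueued() fires is it safe to append the EOS marker.
        queue.enqueueEndOfStream();
        assert(queue.items.back() == "EOS");
        return 0;
    }
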
    99104
    100105void MediaSourceGStreamer::unmarkEndOfStream()
    101106{
    102     notImplemented();
    103107}
    104108
     
    115119void MediaSourceGStreamer::waitForSeekCompleted()
    116120{
    117     m_playerPrivate.waitForSeekCompleted();
    118121}
    119122
    120123void MediaSourceGStreamer::seekCompleted()
    121124{
    122     m_playerPrivate.seekCompleted();
     125    m_playerPrivate.reportSeekCompleted();
    123126}
    124127
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/MediaSourceGStreamer.h

    r207880 r249205  
    4040#include <wtf/Forward.h>
    4141#include <wtf/HashSet.h>
    42 
    43 typedef struct _WebKitMediaSrc WebKitMediaSrc;
    4442
    4543namespace WebCore {
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.cpp

    r246490 r249205  
    4646#include "NotImplemented.h"
    4747#include "WebKitMediaSourceGStreamer.h"
     48
     49GST_DEBUG_CATEGORY_EXTERN(webkit_mse_debug);
     50#define GST_CAT_DEFAULT webkit_mse_debug
    4851
    4952namespace WebCore {
     
    110113}
    111114
    112 void SourceBufferPrivateGStreamer::enqueueSample(Ref<MediaSample>&& sample, const AtomString&)
     115void SourceBufferPrivateGStreamer::enqueueSample(Ref<MediaSample>&& sample, const AtomString& trackId)
    113116{
    114     m_notifyWhenReadyForMoreSamples = false;
    115 
    116     m_client->enqueueSample(WTFMove(sample));
     117    m_client->enqueueSample(WTFMove(sample), trackId);
    117118}
    118119
     
    122123}
    123124
    124 bool SourceBufferPrivateGStreamer::isReadyForMoreSamples(const AtomString&)
    125 {
    126     return m_isReadyForMoreSamples;
    127 }
    128 
    129 void SourceBufferPrivateGStreamer::setReadyForMoreSamples(bool isReady)
     125bool SourceBufferPrivateGStreamer::isReadyForMoreSamples(const AtomString& trackId)
    130126{
    131127    ASSERT(WTF::isMainThread());
    132     m_isReadyForMoreSamples = isReady;
    133 }
    134 
    135 void SourceBufferPrivateGStreamer::notifyReadyForMoreSamples()
    136 {
    137     ASSERT(WTF::isMainThread());
    138     setReadyForMoreSamples(true);
    139     if (m_notifyWhenReadyForMoreSamples)
    140         m_sourceBufferPrivateClient->sourceBufferPrivateDidBecomeReadyForMoreSamples(m_trackId);
     128    bool isReadyForMoreSamples = m_client->isReadyForMoreSamples(trackId);
     129    GST_DEBUG("SourceBufferPrivate(%p) - isReadyForMoreSamples: %d", this, (int) isReadyForMoreSamples);
     130    return isReadyForMoreSamples;
    141131}
    142132
     
    150140{
    151141    ASSERT(WTF::isMainThread());
    152     m_notifyWhenReadyForMoreSamples = true;
    153     m_trackId = trackId;
     142    return m_client->notifyClientWhenReadyForMoreSamples(trackId, m_sourceBufferPrivateClient);
    154143}
    155144
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/SourceBufferPrivateGStreamer.h

    r246490 r249205  
    7070    void notifyClientWhenReadyForMoreSamples(const AtomString&) final;
    7171
    72     void setReadyForMoreSamples(bool);
    73     void notifyReadyForMoreSamples();
    74 
    7572    void didReceiveInitializationSegment(const SourceBufferPrivateClient::InitializationSegment&);
    7673    void didReceiveSample(MediaSample&);
     
    7976
    8077    ContentType type() const { return m_type; }
     78    AtomString trackId() const { return m_trackId; }
    8179
    8280private:
     
    8886    Ref<MediaSourceClientGStreamerMSE> m_client;
    8987    SourceBufferPrivateClient* m_sourceBufferPrivateClient { nullptr };
    90     bool m_isReadyForMoreSamples = true;
    91     bool m_notifyWhenReadyForMoreSamples = false;
    9288    AtomString m_trackId;
    9389};
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.cpp

    r239477 r249205  
    44 *  Copyright (C) 2013 Orange
    55 *  Copyright (C) 2014, 2015 Sebastian Dröge <sebastian@centricular.com>
    6  *  Copyright (C) 2015, 2016 Metrological Group B.V.
    7  *  Copyright (C) 2015, 2016 Igalia, S.L
     6 *  Copyright (C) 2015, 2016, 2018, 2019 Metrological Group B.V.
     7 *  Copyright (C) 2015, 2016, 2018, 2019 Igalia, S.L
    88 *
    99 *  This library is free software; you can redistribute it and/or
     
    2525#include "WebKitMediaSourceGStreamer.h"
    2626
    27 #include "PlaybackPipeline.h"
    28 
    2927#if ENABLE(VIDEO) && ENABLE(MEDIA_SOURCE) && USE(GSTREAMER)
    3028
    31 #include "AudioTrackPrivateGStreamer.h"
    3229#include "GStreamerCommon.h"
    33 #include "MediaDescription.h"
    34 #include "MediaPlayerPrivateGStreamerMSE.h"
    35 #include "MediaSample.h"
    36 #include "MediaSourceGStreamer.h"
    37 #include "NotImplemented.h"
    38 #include "SourceBufferPrivateGStreamer.h"
    39 #include "TimeRanges.h"
    4030#include "VideoTrackPrivateGStreamer.h"
    41 #include "WebKitMediaSourceGStreamerPrivate.h"
    42 
    43 #include <gst/pbutils/pbutils.h>
    44 #include <gst/video/video.h>
     31
     32#include <gst/gst.h>
    4533#include <wtf/Condition.h>
     34#include <wtf/DataMutex.h>
     35#include <wtf/HashMap.h>
    4636#include <wtf/MainThread.h>
     37#include <wtf/MainThreadData.h>
    4738#include <wtf/RefPtr.h>
     39#include <wtf/glib/WTFGType.h>
     40#include <wtf/text/AtomString.h>
     41#include <wtf/text/AtomStringHash.h>
    4842#include <wtf/text/CString.h>
     43
     44using namespace WTF;
     45using namespace WebCore;
    4946
    5047GST_DEBUG_CATEGORY_STATIC(webkit_media_src_debug);
    5148#define GST_CAT_DEFAULT webkit_media_src_debug
    5249
     50static GstStaticPadTemplate srcTemplate = GST_STATIC_PAD_TEMPLATE("src_%s", GST_PAD_SRC,
     51    GST_PAD_SOMETIMES, GST_STATIC_CAPS_ANY);
     52
     53enum {
     54    PROP_0,
     55    PROP_N_AUDIO,
     56    PROP_N_VIDEO,
     57    PROP_N_TEXT,
     58    PROP_LAST
     59};
     60
     61struct Stream;
     62
     63struct WebKitMediaSrcPrivate {
     64    HashMap<AtomString, RefPtr<Stream>> streams;
     65    Stream* streamByName(const AtomString& name)
     66    {
     67        Stream* stream = streams.get(name);
     68        ASSERT(stream);
     69        return stream;
     70    }
     71
     72    // Used for stream-start events, shared by all streams.
     73    const unsigned groupId { gst_util_group_id_next() };
     74
      75    // Every time a track is added or removed, this collection is replaced with an updated one and a STREAM_COLLECTION
      76    // message is posted on the bus.
     77    GRefPtr<GstStreamCollection> collection { adoptGRef(gst_stream_collection_new("WebKitMediaSrc")) };
     78
     79    // Changed on seeks.
     80    GstClockTime startTime { 0 };
     81    double rate { 1.0 };
     82
     83    // Only used by URI Handler API implementation.
     84    GUniquePtr<char> uri;
     85};
     86
     87static void webKitMediaSrcUriHandlerInit(gpointer, gpointer);
     88static void webKitMediaSrcFinalize(GObject*);
     89static GstStateChangeReturn webKitMediaSrcChangeState(GstElement*, GstStateChange);
     90static gboolean webKitMediaSrcActivateMode(GstPad*, GstObject*, GstPadMode, gboolean activate);
     91static void webKitMediaSrcLoop(void*);
     92static void webKitMediaSrcStreamFlushStart(const RefPtr<Stream>&);
     93static void webKitMediaSrcStreamFlushStop(const RefPtr<Stream>&, bool resetTime);
     94static void webKitMediaSrcGetProperty(GObject*, unsigned propId, GValue*, GParamSpec*);
     95
    5396#define webkit_media_src_parent_class parent_class
    54 #define WEBKIT_MEDIA_SRC_CATEGORY_INIT GST_DEBUG_CATEGORY_INIT(webkit_media_src_debug, "webkitmediasrc", 0, "websrc element");
    55 
    56 static GstStaticPadTemplate srcTemplate = GST_STATIC_PAD_TEMPLATE("src_%u", GST_PAD_SRC,
    57     GST_PAD_SOMETIMES, GST_STATIC_CAPS_ANY);
    58 
    59 static void enabledAppsrcNeedData(GstAppSrc*, guint, gpointer);
    60 static void enabledAppsrcEnoughData(GstAppSrc*, gpointer);
    61 static gboolean enabledAppsrcSeekData(GstAppSrc*, guint64, gpointer);
    62 
    63 static void disabledAppsrcNeedData(GstAppSrc*, guint, gpointer) { };
    64 static void disabledAppsrcEnoughData(GstAppSrc*, gpointer) { };
    65 static gboolean disabledAppsrcSeekData(GstAppSrc*, guint64, gpointer)
    66 {
    67     return FALSE;
     97
     98struct WebKitMediaSrcPadPrivate {
     99    RefPtr<Stream> stream;
    68100};
    69101
    70 GstAppSrcCallbacks enabledAppsrcCallbacks = {
    71     enabledAppsrcNeedData,
    72     enabledAppsrcEnoughData,
    73     enabledAppsrcSeekData,
    74     { 0 }
     102struct WebKitMediaSrcPad {
     103    GstPad parent;
     104    WebKitMediaSrcPadPrivate* priv;
    75105};
    76106
    77 GstAppSrcCallbacks disabledAppsrcCallbacks = {
    78     disabledAppsrcNeedData,
    79     disabledAppsrcEnoughData,
    80     disabledAppsrcSeekData,
    81     { 0 }
     107struct WebKitMediaSrcPadClass {
     108    GstPadClass parent;
    82109};
    83110
    84 static Stream* getStreamByAppsrc(WebKitMediaSrc*, GstElement*);
    85 static void seekNeedsDataMainThread(WebKitMediaSrc*);
    86 static void notifyReadyForMoreSamplesMainThread(WebKitMediaSrc*, Stream*);
    87 
    88 static void enabledAppsrcNeedData(GstAppSrc* appsrc, guint, gpointer userData)
    89 {
    90     WebKitMediaSrc* webKitMediaSrc = static_cast<WebKitMediaSrc*>(userData);
    91     ASSERT(WEBKIT_IS_MEDIA_SRC(webKitMediaSrc));
    92 
    93     GST_OBJECT_LOCK(webKitMediaSrc);
    94     OnSeekDataAction appsrcSeekDataNextAction = webKitMediaSrc->priv->appsrcSeekDataNextAction;
    95     Stream* appsrcStream = getStreamByAppsrc(webKitMediaSrc, GST_ELEMENT(appsrc));
    96     bool allAppsrcNeedDataAfterSeek = false;
    97 
    98     if (webKitMediaSrc->priv->appsrcSeekDataCount > 0) {
    99         if (appsrcStream && !appsrcStream->appsrcNeedDataFlag) {
    100             ++webKitMediaSrc->priv->appsrcNeedDataCount;
    101             appsrcStream->appsrcNeedDataFlag = true;
    102         }
    103         int numAppsrcs = webKitMediaSrc->priv->streams.size();
    104         if (webKitMediaSrc->priv->appsrcSeekDataCount == numAppsrcs && webKitMediaSrc->priv->appsrcNeedDataCount == numAppsrcs) {
    105             GST_DEBUG("All needDatas completed");
    106             allAppsrcNeedDataAfterSeek = true;
    107             webKitMediaSrc->priv->appsrcSeekDataCount = 0;
    108             webKitMediaSrc->priv->appsrcNeedDataCount = 0;
    109             webKitMediaSrc->priv->appsrcSeekDataNextAction = Nothing;
    110 
    111             for (Stream* stream : webKitMediaSrc->priv->streams)
    112                 stream->appsrcNeedDataFlag = false;
    113         }
    114     }
    115     GST_OBJECT_UNLOCK(webKitMediaSrc);
    116 
    117     if (allAppsrcNeedDataAfterSeek) {
    118         GST_DEBUG("All expected appsrcSeekData() and appsrcNeedData() calls performed. Running next action (%d)", static_cast<int>(appsrcSeekDataNextAction));
    119 
    120         switch (appsrcSeekDataNextAction) {
    121         case MediaSourceSeekToTime:
    122             webKitMediaSrc->priv->notifier->notify(WebKitMediaSrcMainThreadNotification::SeekNeedsData, [webKitMediaSrc] {
    123                 seekNeedsDataMainThread(webKitMediaSrc);
     111namespace WTF {
     112
     113template<> GRefPtr<WebKitMediaSrcPad> adoptGRef(WebKitMediaSrcPad* ptr)
     114{
     115    ASSERT(!ptr || !g_object_is_floating(ptr));
     116    return GRefPtr<WebKitMediaSrcPad>(ptr, GRefPtrAdopt);
     117}
     118
     119template<> WebKitMediaSrcPad* refGPtr<WebKitMediaSrcPad>(WebKitMediaSrcPad* ptr)
     120{
     121    if (ptr)
     122        gst_object_ref_sink(GST_OBJECT(ptr));
     123
     124    return ptr;
     125}
     126
     127template<> void derefGPtr<WebKitMediaSrcPad>(WebKitMediaSrcPad* ptr)
     128{
     129    if (ptr)
     130        gst_object_unref(ptr);
     131}
     132
     133} // namespace WTF
     134
     135static GType webkit_media_src_pad_get_type();
     136WEBKIT_DEFINE_TYPE(WebKitMediaSrcPad, webkit_media_src_pad, GST_TYPE_PAD);
     137#define WEBKIT_TYPE_MEDIA_SRC_PAD (webkit_media_src_pad_get_type())
     138#define WEBKIT_MEDIA_SRC_PAD(obj) (G_TYPE_CHECK_INSTANCE_CAST((obj), WEBKIT_TYPE_MEDIA_SRC_PAD, WebKitMediaSrcPad))
     139
     140static void webkit_media_src_pad_class_init(WebKitMediaSrcPadClass*)
     141{
     142}
     143
     144G_DEFINE_TYPE_WITH_CODE(WebKitMediaSrc, webkit_media_src, GST_TYPE_ELEMENT,
     145    G_IMPLEMENT_INTERFACE(GST_TYPE_URI_HANDLER, webKitMediaSrcUriHandlerInit);
     146    G_ADD_PRIVATE(WebKitMediaSrc);
     147    GST_DEBUG_CATEGORY_INIT(webkit_media_src_debug, "webkitmediasrc", 0, "WebKit MSE source element"));
     148
     149struct Stream : public ThreadSafeRefCounted<Stream> {
     150    Stream(WebKitMediaSrc* source, GRefPtr<GstPad>&& pad, const AtomString& name, WebCore::MediaSourceStreamTypeGStreamer type, GRefPtr<GstCaps>&& initialCaps, GRefPtr<GstStream>&& streamInfo)
     151        : source(source)
     152        , pad(WTFMove(pad))
     153        , name(name)
     154        , type(type)
     155        , streamInfo(WTFMove(streamInfo))
     156        , streamingMembersDataMutex(WTFMove(initialCaps), source->priv->startTime, source->priv->rate, adoptGRef(gst_event_new_stream_collection(source->priv->collection.get())))
     157    { }
     158
     159    WebKitMediaSrc* const source;
     160    GRefPtr<GstPad> const pad;
     161    AtomString const name;
     162    WebCore::MediaSourceStreamTypeGStreamer type;
     163    GRefPtr<GstStream> streamInfo;
     164
     165    // The point of having a queue in WebKitMediaSource is to limit the number of context switches per second.
      166    // If we had no queue, the main thread would have to be woken up for every frame. On the other hand, if the
     167    // queue had unlimited size WebKit would end up requesting flushes more often than necessary when frames
     168    // in the future are re-appended. As a sweet spot between these extremes we choose to allow enqueueing a
     169    // few seconds worth of samples.
     170
      171    // `isReadyForMoreSamples` follows the classical two-water-level strategy: initially it's true until the
     172    // high water level is reached, then it becomes false until the queue drains down to the low water level
     173    // and the cycle repeats. This way we avoid stalls and minimize context switches.
     174
     175    static const uint64_t durationEnqueuedHighWaterLevel = 5 * GST_SECOND;
     176    static const uint64_t durationEnqueuedLowWaterLevel = 2 * GST_SECOND;
     177
     178    struct StreamingMembers {
     179        StreamingMembers(GRefPtr<GstCaps>&& initialCaps, GstClockTime startTime, double rate, GRefPtr<GstEvent>&& pendingStreamCollectionEvent)
     180            : pendingStreamCollectionEvent(WTFMove(pendingStreamCollectionEvent))
     181            , pendingInitialCaps(WTFMove(initialCaps))
     182        {
     183            gst_segment_init(&segment, GST_FORMAT_TIME);
     184            segment.start = segment.time = startTime;
     185            segment.rate = rate;
     186
     187            GstStreamCollection* collection;
     188            gst_event_parse_stream_collection(this->pendingStreamCollectionEvent.get(), &collection);
     189            ASSERT(collection);
     190        }
     191
     192        bool hasPushedFirstBuffer { false };
     193        bool wasStreamStartSent { false };
     194        bool doesNeedSegmentEvent { true };
     195        GstSegment segment;
     196        GRefPtr<GstEvent> pendingStreamCollectionEvent;
     197        GRefPtr<GstCaps> pendingInitialCaps;
     198        GRefPtr<GstCaps> previousCaps;
     199
     200        Condition padLinkedOrFlushedCondition;
     201        Condition queueChangedOrFlushedCondition;
     202        Deque<GRefPtr<GstMiniObject>> queue;
     203        bool isFlushing { false };
     204        bool doesNeedToNotifyOnLowWaterLevel { false };
     205
     206        uint64_t durationEnqueued() const
     207        {
     208            // Find the first and last GstSample in the queue and subtract their DTS.
     209            auto frontIter = std::find_if(queue.begin(), queue.end(), [](const GRefPtr<GstMiniObject>& object) {
     210                return GST_IS_SAMPLE(object.get());
    124211            });
    125             break;
    126         case Nothing:
    127             break;
    128         }
    129     } else if (appsrcSeekDataNextAction == Nothing) {
    130         LockHolder locker(webKitMediaSrc->priv->streamLock);
    131 
    132         GST_OBJECT_LOCK(webKitMediaSrc);
    133 
    134         // Search again for the Stream, just in case it was removed between the previous lock and this one.
    135         appsrcStream = getStreamByAppsrc(webKitMediaSrc, GST_ELEMENT(appsrc));
    136 
    137         if (appsrcStream && appsrcStream->type != WebCore::Invalid)
    138             webKitMediaSrc->priv->notifier->notify(WebKitMediaSrcMainThreadNotification::ReadyForMoreSamples, [webKitMediaSrc, appsrcStream] {
    139                 notifyReadyForMoreSamplesMainThread(webKitMediaSrc, appsrcStream);
     212
      213            // If there are no samples in the queue, the total duration of enqueued frames is zero.
     214            if (frontIter == queue.end())
     215                return 0;
     216
     217            auto backIter = std::find_if(queue.rbegin(), queue.rend(), [](const GRefPtr<GstMiniObject>& object) {
     218                return GST_IS_SAMPLE(object.get());
    140219            });
    141220
    142         GST_OBJECT_UNLOCK(webKitMediaSrc);
    143     }
    144 }
    145 
    146 static void enabledAppsrcEnoughData(GstAppSrc *appsrc, gpointer userData)
    147 {
    148     // No need to lock on webKitMediaSrc, we're on the main thread and nobody is going to remove the stream in the meantime.
    149     ASSERT(WTF::isMainThread());
    150 
    151     WebKitMediaSrc* webKitMediaSrc = static_cast<WebKitMediaSrc*>(userData);
    152     ASSERT(WEBKIT_IS_MEDIA_SRC(webKitMediaSrc));
    153     Stream* stream = getStreamByAppsrc(webKitMediaSrc, GST_ELEMENT(appsrc));
    154 
    155     // This callback might have been scheduled from a child thread before the stream was removed.
    156     // Then, the removal code might have run, and later this callback.
    157     // This check solves the race condition.
    158     if (!stream || stream->type == WebCore::Invalid)
    159         return;
    160 
    161     stream->sourceBuffer->setReadyForMoreSamples(false);
    162 }
    163 
    164 static gboolean enabledAppsrcSeekData(GstAppSrc*, guint64, gpointer userData)
    165 {
    166     ASSERT(WTF::isMainThread());
    167 
    168     WebKitMediaSrc* webKitMediaSrc = static_cast<WebKitMediaSrc*>(userData);
    169 
    170     ASSERT(WEBKIT_IS_MEDIA_SRC(webKitMediaSrc));
    171 
    172     GST_OBJECT_LOCK(webKitMediaSrc);
    173     webKitMediaSrc->priv->appsrcSeekDataCount++;
    174     GST_OBJECT_UNLOCK(webKitMediaSrc);
    175 
    176     return TRUE;
    177 }
    178 
    179 static Stream* getStreamByAppsrc(WebKitMediaSrc* source, GstElement* appsrc)
    180 {
    181     for (Stream* stream : source->priv->streams) {
    182         if (stream->appsrc == appsrc)
    183             return stream;
    184     }
    185     return nullptr;
    186 }
    187 
    188 G_DEFINE_TYPE_WITH_CODE(WebKitMediaSrc, webkit_media_src, GST_TYPE_BIN,
    189     G_IMPLEMENT_INTERFACE(GST_TYPE_URI_HANDLER, webKitMediaSrcUriHandlerInit);
    190     WEBKIT_MEDIA_SRC_CATEGORY_INIT);
    191 
    192 guint webKitMediaSrcSignals[LAST_SIGNAL] = { 0 };
     221            const GstBuffer* front = gst_sample_get_buffer(GST_SAMPLE(frontIter->get()));
     222            const GstBuffer* back = gst_sample_get_buffer(GST_SAMPLE(backIter->get()));
     223            return GST_BUFFER_DTS_OR_PTS(back) - GST_BUFFER_DTS_OR_PTS(front);
     224        }
     225    };
     226    DataMutex<StreamingMembers> streamingMembersDataMutex;
     227
     228    struct ReportedStatus {
     229        // Set to true when the pad is removed. In the case where a reference to the Stream object is alive because of
     230        // a posted task to notify isReadyForMoreSamples, the notification must not be delivered if this flag is true.
     231        bool wasRemoved { false };
     232
     233        bool isReadyForMoreSamples { true };
     234        SourceBufferPrivateClient* sourceBufferPrivateToNotify { nullptr };
     235    };
     236    MainThreadData<ReportedStatus> reportedStatus;
     237};
     238
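
The two-water-level behaviour described in the Stream comments above can be condensed into a standalone sketch. This is an illustration only: WaterLevelGate is a hypothetical class with made-up thresholds, not the real GST_SECOND-based constants in WebKitMediaSrc.

    // Hypothetical sketch of the high/low water level hysteresis.
    #include <cassert>
    #include <cstdint>

    class WaterLevelGate {
    public:
        WaterLevelGate(uint64_t high, uint64_t low)
            : m_high(high), m_low(low) { }

        // Called whenever the amount of enqueued data changes.
        void update(uint64_t durationEnqueued)
        {
            if (m_isReadyForMoreSamples && durationEnqueued > m_high)
                m_isReadyForMoreSamples = false; // High level hit: stop requesting samples.
            else if (!m_isReadyForMoreSamples && durationEnqueued < m_low)
                m_isReadyForMoreSamples = true;  // Drained below the low level: request again.
        }

        bool isReadyForMoreSamples() const { return m_isReadyForMoreSamples; }

    private:
        const uint64_t m_high;
        const uint64_t m_low;
        bool m_isReadyForMoreSamples { true };
    };

    int main()
    {
        WaterLevelGate gate(5, 2); // e.g. 5 units high, 2 units low.
        gate.update(3);
        assert(gate.isReadyForMoreSamples());  // Below the high level: keep requesting.
        gate.update(6);
        assert(!gate.isReadyForMoreSamples()); // Above the high level: stop.
        gate.update(4);
        assert(!gate.isReadyForMoreSamples()); // Hysteresis: still above the low level.
        gate.update(1);
        assert(gate.isReadyForMoreSamples());  // Drained: resume requesting samples.
        return 0;
    }

The hysteresis keeps the main thread from being woken on every frame while bounding how much queued data might need to be flushed when future frames are re-appended.
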
     239static GRefPtr<GstElement> findPipeline(GRefPtr<GstElement> element)
     240{
     241    while (true) {
     242        GRefPtr<GstElement> parentElement = adoptGRef(GST_ELEMENT(gst_element_get_parent(element.get())));
     243        if (!parentElement)
     244            return element;
     245        element = parentElement;
     246    }
     247}
    193248
    194249static void webkit_media_src_class_init(WebKitMediaSrcClass* klass)
     
    198253
    199254    oklass->finalize = webKitMediaSrcFinalize;
    200     oklass->set_property = webKitMediaSrcSetProperty;
    201255    oklass->get_property = webKitMediaSrcGetProperty;
    202256
    203     gst_element_class_add_pad_template(eklass, gst_static_pad_template_get(&srcTemplate));
    204 
    205     gst_element_class_set_static_metadata(eklass, "WebKit Media source element", "Source", "Handles Blob uris", "Stephane Jadaud <sjadaud@sii.fr>, Sebastian Dröge <sebastian@centricular.com>, Enrique Ocaña González <eocanha@igalia.com>");
    206 
    207     // Allows setting the uri using the 'location' property, which is used for example by gst_element_make_from_uri().
    208     g_object_class_install_property(oklass,
    209         PROP_LOCATION,
    210         g_param_spec_string("location", "location", "Location to read from", nullptr,
    211         GParamFlags(G_PARAM_READWRITE | G_PARAM_STATIC_STRINGS)));
     257    gst_element_class_add_static_pad_template_with_gtype(eklass, &srcTemplate, webkit_media_src_pad_get_type());
     258
     259    gst_element_class_set_static_metadata(eklass, "WebKit MediaSource source element", "Source/Network", "Feeds samples coming from WebKit MediaSource object", "Igalia <aboya@igalia.com>");
     260
     261    eklass->change_state = webKitMediaSrcChangeState;
     262
    212263    g_object_class_install_property(oklass,
    213264        PROP_N_AUDIO,
     
    222273        g_param_spec_int("n-text", "Number Text", "Total number of text streams",
    223274        0, G_MAXINT, 0, GParamFlags(G_PARAM_READABLE | G_PARAM_STATIC_STRINGS)));
    224 
    225     webKitMediaSrcSignals[SIGNAL_VIDEO_CHANGED] =
    226         g_signal_new("video-changed", G_TYPE_FROM_CLASS(oklass),
    227         G_SIGNAL_RUN_LAST,
    228         G_STRUCT_OFFSET(WebKitMediaSrcClass, videoChanged), nullptr, nullptr,
    229         g_cclosure_marshal_generic, G_TYPE_NONE, 0, G_TYPE_NONE);
    230     webKitMediaSrcSignals[SIGNAL_AUDIO_CHANGED] =
    231         g_signal_new("audio-changed", G_TYPE_FROM_CLASS(oklass),
    232         G_SIGNAL_RUN_LAST,
    233         G_STRUCT_OFFSET(WebKitMediaSrcClass, audioChanged), nullptr, nullptr,
    234         g_cclosure_marshal_generic, G_TYPE_NONE, 0, G_TYPE_NONE);
    235     webKitMediaSrcSignals[SIGNAL_TEXT_CHANGED] =
    236         g_signal_new("text-changed", G_TYPE_FROM_CLASS(oklass),
    237         G_SIGNAL_RUN_LAST,
    238         G_STRUCT_OFFSET(WebKitMediaSrcClass, textChanged), nullptr, nullptr,
    239         g_cclosure_marshal_generic, G_TYPE_NONE, 0, G_TYPE_NONE);
    240 
    241     eklass->change_state = webKitMediaSrcChangeState;
    242 
    243     g_type_class_add_private(klass, sizeof(WebKitMediaSrcPrivate));
    244 }
    245 
    246 static GstFlowReturn webkitMediaSrcChain(GstPad* pad, GstObject* parent, GstBuffer* buffer)
    247 {
    248     GRefPtr<WebKitMediaSrc> self = adoptGRef(WEBKIT_MEDIA_SRC(gst_object_get_parent(parent)));
    249 
    250     return gst_flow_combiner_update_pad_flow(self->priv->flowCombiner.get(), pad, gst_proxy_pad_chain_default(pad, GST_OBJECT(self.get()), buffer));
    251275}
    252276
    253277static void webkit_media_src_init(WebKitMediaSrc* source)
    254278{
    255     source->priv = WEBKIT_MEDIA_SRC_GET_PRIVATE(source);
     279    ASSERT(isMainThread());
     280
     281    GST_OBJECT_FLAG_SET(source, GST_ELEMENT_FLAG_SOURCE);
     282    source->priv = G_TYPE_INSTANCE_GET_PRIVATE((source), WEBKIT_TYPE_MEDIA_SRC, WebKitMediaSrcPrivate);
    256283    new (source->priv) WebKitMediaSrcPrivate();
    257     source->priv->seekTime = MediaTime::invalidTime();
    258     source->priv->appsrcSeekDataCount = 0;
    259     source->priv->appsrcNeedDataCount = 0;
    260     source->priv->appsrcSeekDataNextAction = Nothing;
    261     source->priv->flowCombiner = GUniquePtr<GstFlowCombiner>(gst_flow_combiner_new());
    262     source->priv->notifier = WebCore::MainThreadNotifier<WebKitMediaSrcMainThreadNotification>::create();
    263 
    264     // No need to reset Stream.appsrcNeedDataFlag because there are no Streams at this point yet.
    265 }
    266 
    267 void webKitMediaSrcFinalize(GObject* object)
    268 {
    269     ASSERT(WTF::isMainThread());
     284}
     285
     286static void webKitMediaSrcFinalize(GObject* object)
     287{
     288    ASSERT(isMainThread());
    270289
    271290    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
    272     WebKitMediaSrcPrivate* priv = source->priv;
    273 
    274     Vector<Stream*> oldStreams;
    275     source->priv->streams.swap(oldStreams);
    276 
    277     for (Stream* stream : oldStreams)
    278         webKitMediaSrcFreeStream(source, stream);
    279 
    280     priv->seekTime = MediaTime::invalidTime();
    281 
    282     source->priv->notifier->invalidate();
    283 
    284     if (priv->mediaPlayerPrivate)
    285         webKitMediaSrcSetMediaPlayerPrivate(source, nullptr);
    286 
    287     // We used a placement new for construction, the destructor won't be called automatically.
    288     priv->~_WebKitMediaSrcPrivate();
    289 
     291    source->priv->~WebKitMediaSrcPrivate();
    290292    GST_CALL_PARENT(G_OBJECT_CLASS, finalize, (object));
    291293}
    292294
    293 void webKitMediaSrcSetProperty(GObject* object, guint propId, const GValue* value, GParamSpec* pspec)
     295static GstPadProbeReturn debugProbe(GstPad* pad, GstPadProbeInfo* info, void*)
     296{
     297    RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
     298    GST_TRACE_OBJECT(stream->source, "track %s: %" GST_PTR_FORMAT, stream->name.string().utf8().data(), info->data);
     299    return GST_PAD_PROBE_OK;
     300}
     301
     302// GstStreamCollection are immutable objects once posted. THEY MUST NOT BE MODIFIED once they have been posted.
     303// Instead, when stream changes occur a new collection must be made. The following functions help to create
     304// such new collections:
     305
     306static GRefPtr<GstStreamCollection> copyCollectionAndAddStream(GstStreamCollection* collection, GRefPtr<GstStream>&& stream)
     307{
     308    GRefPtr<GstStreamCollection> newCollection = adoptGRef(gst_stream_collection_new(collection->upstream_id));
     309
     310    unsigned n = gst_stream_collection_get_size(collection);
     311    for (unsigned i = 0; i < n; i++)
     312        gst_stream_collection_add_stream(newCollection.get(), static_cast<GstStream*>(gst_object_ref(gst_stream_collection_get_stream(collection, i))));
     313    gst_stream_collection_add_stream(newCollection.get(), stream.leakRef());
     314
     315    return newCollection;
     316}
     317
     318static GRefPtr<GstStreamCollection> copyCollectionWithoutStream(GstStreamCollection* collection, const GstStream* stream)
     319{
     320    GRefPtr<GstStreamCollection> newCollection = adoptGRef(gst_stream_collection_new(collection->upstream_id));
     321
     322    unsigned n = gst_stream_collection_get_size(collection);
     323    for (unsigned i = 0; i < n; i++) {
     324        GRefPtr<GstStream> oldStream = gst_stream_collection_get_stream(collection, i);
     325        if (oldStream.get() != stream)
     326            gst_stream_collection_add_stream(newCollection.get(), oldStream.leakRef());
     327    }
     328
     329    return newCollection;
     330}
     331
     332static GstStreamType gstStreamType(WebCore::MediaSourceStreamTypeGStreamer type)
     333{
     334    switch (type) {
     335    case WebCore::MediaSourceStreamTypeGStreamer::Video:
     336        return GST_STREAM_TYPE_VIDEO;
     337    case WebCore::MediaSourceStreamTypeGStreamer::Audio:
     338        return GST_STREAM_TYPE_AUDIO;
     339    case WebCore::MediaSourceStreamTypeGStreamer::Text:
     340        return GST_STREAM_TYPE_TEXT;
     341    default:
     342        GST_ERROR("Received unexpected stream type");
     343        return GST_STREAM_TYPE_UNKNOWN;
     344    }
     345}
     346
     347void webKitMediaSrcAddStream(WebKitMediaSrc* source, const AtomString& name, WebCore::MediaSourceStreamTypeGStreamer type, GRefPtr<GstCaps>&& initialCaps)
     348{
     349    ASSERT(isMainThread());
     350    ASSERT(!source->priv->streams.contains(name));
     351
     352    GRefPtr<GstStream> streamInfo = adoptGRef(gst_stream_new(name.string().utf8().data(), initialCaps.get(), gstStreamType(type), GST_STREAM_FLAG_SELECT));
     353    source->priv->collection = copyCollectionAndAddStream(source->priv->collection.get(), GRefPtr<GstStream>(streamInfo));
     354    gst_element_post_message(GST_ELEMENT(source), gst_message_new_stream_collection(GST_OBJECT(source), source->priv->collection.get()));
     355
     356    GRefPtr<WebKitMediaSrcPad> pad = WEBKIT_MEDIA_SRC_PAD(g_object_new(webkit_media_src_pad_get_type(), "name", makeString("src_", name).utf8().data(), "direction", GST_PAD_SRC, NULL));
     357    gst_pad_set_activatemode_function(GST_PAD(pad.get()), webKitMediaSrcActivateMode);
     358
     359    {
     360        RefPtr<Stream> stream = adoptRef(new Stream(source, GRefPtr<GstPad>(GST_PAD(pad.get())), name, type, WTFMove(initialCaps), WTFMove(streamInfo)));
     361        pad->priv->stream = stream;
     362        source->priv->streams.set(name, WTFMove(stream));
     363    }
     364
     365    if (gst_debug_category_get_threshold(webkit_media_src_debug) >= GST_LEVEL_TRACE)
     366        gst_pad_add_probe(GST_PAD(pad.get()), static_cast<GstPadProbeType>(GST_PAD_PROBE_TYPE_DATA_DOWNSTREAM | GST_PAD_PROBE_TYPE_EVENT_FLUSH), debugProbe, nullptr, nullptr);
     367
     368    // Workaround: gst_element_add_pad() should already call gst_pad_set_active() if the element is PAUSED or
     369    // PLAYING. Unfortunately, as of GStreamer 1.14.4 it does so with the element lock taken, causing a deadlock
      370    // in gst_pad_start_task(), which tries to post a `stream-status` message on the element, which also requires
     371    // the element lock. Activating the pad beforehand avoids that codepath.
     372    GstState state;
     373    gst_element_get_state(GST_ELEMENT(source), &state, nullptr, 0);
     374    if (state > GST_STATE_READY)
     375        gst_pad_set_active(GST_PAD(pad.get()), true);
     376
     377    gst_element_add_pad(GST_ELEMENT(source), GST_PAD(pad.get()));
     378}
     379
     380void webKitMediaSrcRemoveStream(WebKitMediaSrc* source, const AtomString& name)
     381{
     382    ASSERT(isMainThread());
     383    Stream* stream = source->priv->streamByName(name);
     384
     385    source->priv->collection = copyCollectionWithoutStream(source->priv->collection.get(), stream->streamInfo.get());
     386    gst_element_post_message(GST_ELEMENT(source), gst_message_new_stream_collection(GST_OBJECT(source), source->priv->collection.get()));
     387
     388    // Flush the source element **and** downstream. We want to stop the streaming thread and for that we need all elements downstream to be idle.
     389    webKitMediaSrcStreamFlushStart(stream);
     390    webKitMediaSrcStreamFlushStop(stream, false);
     391    // Stop the thread now.
     392    gst_pad_set_active(stream->pad.get(), false);
     393
     394    stream->reportedStatus->wasRemoved = true;
     395    gst_element_remove_pad(GST_ELEMENT(source), stream->pad.get());
     396    source->priv->streams.remove(name);
     397}
     398
     399static gboolean webKitMediaSrcActivateMode(GstPad* pad, GstObject* source, GstPadMode mode, gboolean active)
     400{
     401    if (mode != GST_PAD_MODE_PUSH) {
     402        GST_ERROR_OBJECT(source, "Unexpected pad mode in WebKitMediaSrc");
     403        return false;
     404    }
     405
     406    if (active)
     407        gst_pad_start_task(pad, webKitMediaSrcLoop, pad, nullptr);
     408    else {
     409        // Unblock the streaming thread.
     410        RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
     411        {
     412            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     413            streamingMembers->isFlushing = true;
     414            streamingMembers->padLinkedOrFlushedCondition.notifyOne();
     415            streamingMembers->queueChangedOrFlushedCondition.notifyOne();
     416        }
     417        // Following gstbasesrc implementation, this code is not flushing downstream.
     418        // If there is any possibility of the streaming thread being blocked downstream the caller MUST flush before.
     419        // Otherwise a deadlock would occur as the next function tries to join the thread.
     420        gst_pad_stop_task(pad);
     421        {
     422            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     423            streamingMembers->isFlushing = false;
     424        }
     425    }
     426    return true;
     427}
     428
     429static void webKitMediaSrcPadLinked(GstPad* pad, GstPad*, void*)
     430{
     431    RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
     432    DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     433    streamingMembers->padLinkedOrFlushedCondition.notifyOne();
     434}
     435
     436static void webKitMediaSrcStreamNotifyLowWaterLevel(const RefPtr<Stream>& stream)
     437{
     438    RunLoop::main().dispatch([stream]() {
     439        if (stream->reportedStatus->wasRemoved)
     440            return;
     441
     442        stream->reportedStatus->isReadyForMoreSamples = true;
     443        if (stream->reportedStatus->sourceBufferPrivateToNotify) {
     444            // We need to clear sourceBufferPrivateToNotify BEFORE calling sourceBufferPrivateDidBecomeReadyForMoreSamples(),
     445            // not after, since clearing it afterwards would destroy a new notification request made by the callback.
     446            SourceBufferPrivateClient* sourceBuffer = stream->reportedStatus->sourceBufferPrivateToNotify;
     447            stream->reportedStatus->sourceBufferPrivateToNotify = nullptr;
     448            sourceBuffer->sourceBufferPrivateDidBecomeReadyForMoreSamples(stream->name);
     449        }
     450    });
     451}
     452
     453// Called with STREAM_LOCK.
     454static void webKitMediaSrcLoop(void* userData)
     455{
     456    GstPad* pad = GST_PAD(userData);
     457    RefPtr<Stream>& stream = WEBKIT_MEDIA_SRC_PAD(pad)->priv->stream;
     458
     459    DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     460    if (streamingMembers->isFlushing) {
     461        gst_pad_pause_task(pad);
     462        return;
     463    }
     464
     465    // Since the pad can and will be added when the element is in PLAYING state, this task can start running
     466    // before the pad is linked. Wait for the pad to be linked to avoid buffers being lost to not-linked errors.
     467    GST_OBJECT_LOCK(pad);
     468    if (!GST_PAD_IS_LINKED(pad)) {
     469        g_signal_connect(pad, "linked", G_CALLBACK(webKitMediaSrcPadLinked), nullptr);
     470        GST_OBJECT_UNLOCK(pad);
     471
     472        streamingMembers->padLinkedOrFlushedCondition.wait(streamingMembers.mutex());
     473
     474        g_signal_handlers_disconnect_by_func(pad, reinterpret_cast<void*>(webKitMediaSrcPadLinked), nullptr);
     475        if (streamingMembers->isFlushing)
     476            return;
     477    } else
     478        GST_OBJECT_UNLOCK(pad);
     479    ASSERT(gst_pad_is_linked(pad));
     480
     481    // By keeping the lock we are guaranteed that a flush will not happen while we send essential events.
     482    // These events should never block downstream, so the lock should be released quickly in
     483    // every case.
     484
     485    if (streamingMembers->pendingStreamCollectionEvent)
     486        gst_pad_push_event(stream->pad.get(), streamingMembers->pendingStreamCollectionEvent.leakRef());
     487
     488    if (!streamingMembers->wasStreamStartSent) {
     489        GUniquePtr<char> streamId(g_strdup_printf("mse/%s", stream->name.string().utf8().data()));
     490        GRefPtr<GstEvent> event = adoptGRef(gst_event_new_stream_start(streamId.get()));
     491        gst_event_set_group_id(event.get(), stream->source->priv->groupId);
     492        gst_event_set_stream(event.get(), stream->streamInfo.get());
     493
     494        bool wasStreamStartSent = gst_pad_push_event(pad, event.leakRef());
     495        streamingMembers->wasStreamStartSent = wasStreamStartSent;
     496    }
     497
     498    if (streamingMembers->pendingInitialCaps) {
     499        GRefPtr<GstEvent> event = adoptGRef(gst_event_new_caps(streamingMembers->pendingInitialCaps.get()));
     500
     501        gst_pad_push_event(pad, event.leakRef());
     502
     503        streamingMembers->previousCaps = WTFMove(streamingMembers->pendingInitialCaps);
     504        ASSERT(!streamingMembers->pendingInitialCaps);
     505    }
     506
     507    streamingMembers->queueChangedOrFlushedCondition.wait(streamingMembers.mutex(), [&]() {
     508        return !streamingMembers->queue.isEmpty() || streamingMembers->isFlushing;
     509    });
     510    if (streamingMembers->isFlushing)
     511        return;
     512
     513    // We wait until we have a sample before emitting the first segment. This way, if a seek happens before
     514    // any sample is enqueued, only one segment is sent. It also ensures that when such a seek is made (in which
     515    // case the flush is also omitted, see webKitMediaSrcFlush) we emit the updated, correct segment.
     516    if (streamingMembers->doesNeedSegmentEvent) {
     517        gst_pad_push_event(pad, gst_event_new_segment(&streamingMembers->segment));
     518        streamingMembers->doesNeedSegmentEvent = false;
     519    }
     520
     521    GRefPtr<GstMiniObject> object = streamingMembers->queue.takeFirst();
     522    if (GST_IS_SAMPLE(object.get())) {
     523        GRefPtr<GstSample> sample = adoptGRef(GST_SAMPLE(object.leakRef()));
     524
     525        if (!gst_caps_is_equal(gst_sample_get_caps(sample.get()), streamingMembers->previousCaps.get())) {
     526            // This sample needs new caps (typically because of a quality change).
     527            gst_pad_push_event(stream->pad.get(), gst_event_new_caps(gst_sample_get_caps(sample.get())));
     528            streamingMembers->previousCaps = gst_sample_get_caps(sample.get());
     529        }
     530
     531        if (streamingMembers->doesNeedToNotifyOnLowWaterLevel && streamingMembers->durationEnqueued() <= Stream::durationEnqueuedLowWaterLevel) {
     532            streamingMembers->doesNeedToNotifyOnLowWaterLevel = false;
     533            webKitMediaSrcStreamNotifyLowWaterLevel(RefPtr<Stream>(stream));
     534        }
     535
     536        GRefPtr<GstBuffer> buffer = gst_sample_get_buffer(sample.get());
     537        sample.clear();
     538
     539        if (!streamingMembers->hasPushedFirstBuffer) {
     540            GUniquePtr<char> fileName { g_strdup_printf("playback-pipeline-before-playback-%s", stream->name.string().utf8().data()) };
     541            GST_DEBUG_BIN_TO_DOT_FILE_WITH_TS(GST_BIN(findPipeline(GRefPtr<GstElement>(GST_ELEMENT(stream->source))).get()),
     542                GST_DEBUG_GRAPH_SHOW_ALL, fileName.get());
     543            streamingMembers->hasPushedFirstBuffer = true;
     544        }
     545
     546        // Push the buffer without the streamingMembers lock so that flushes can happen while it travels downstream.
     547        streamingMembers.lockHolder().unlockEarly();
     548
     549        ASSERT(GST_BUFFER_PTS_IS_VALID(buffer.get()));
     550        GstFlowReturn ret = gst_pad_push(pad, buffer.leakRef());
     551        if (ret != GST_FLOW_OK && ret != GST_FLOW_FLUSHING) {
     552            GST_ERROR_OBJECT(pad, "Pushing buffer returned %s", gst_flow_get_name(ret));
     553            gst_pad_pause_task(pad);
     554        }
     555    } else if (GST_IS_EVENT(object.get())) {
     556        // EOS events and other enqueued events are also sent unlocked so they can react to flushes if necessary.
     557        GRefPtr<GstEvent> event = GRefPtr<GstEvent>(GST_EVENT(object.leakRef()));
     558
     559        streamingMembers.lockHolder().unlockEarly();
     560        bool eventHandled = gst_pad_push_event(pad, GRefPtr<GstEvent>(event).leakRef());
     561        if (!eventHandled)
     562            GST_DEBUG_OBJECT(pad, "Pushed event was not handled: %" GST_PTR_FORMAT, event.get());
     563    } else
     564        ASSERT_NOT_REACHED();
     565}
     566
     567static void webKitMediaSrcEnqueueObject(WebKitMediaSrc* source, const AtomString& streamName, GRefPtr<GstMiniObject>&& object)
     568{
     569    ASSERT(isMainThread());
     570    ASSERT(object);
     571
     572    Stream* stream = source->priv->streamByName(streamName);
     573    DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     574    streamingMembers->queue.append(WTFMove(object));
     575    if (stream->reportedStatus->isReadyForMoreSamples && streamingMembers->durationEnqueued() > Stream::durationEnqueuedHighWaterLevel) {
     576        stream->reportedStatus->isReadyForMoreSamples = false;
     577        streamingMembers->doesNeedToNotifyOnLowWaterLevel = true;
     578    }
     579    streamingMembers->queueChangedOrFlushedCondition.notifyOne();
     580}
     581
     582void webKitMediaSrcEnqueueSample(WebKitMediaSrc* source, const AtomString& streamName, GRefPtr<GstSample>&& sample)
     583{
     584    ASSERT(GST_BUFFER_PTS_IS_VALID(gst_sample_get_buffer(sample.get())));
     585    webKitMediaSrcEnqueueObject(source, streamName, adoptGRef(GST_MINI_OBJECT(sample.leakRef())));
     586}
     587
     588static void webKitMediaSrcEnqueueEvent(WebKitMediaSrc* source, const AtomString& streamName, GRefPtr<GstEvent>&& event)
     589{
     590    webKitMediaSrcEnqueueObject(source, streamName, adoptGRef(GST_MINI_OBJECT(event.leakRef())));
     591}
     592
     593void webKitMediaSrcEndOfStream(WebKitMediaSrc* source, const AtomString& streamName)
     594{
     595    webKitMediaSrcEnqueueEvent(source, streamName, adoptGRef(gst_event_new_eos()));
     596}
     597
     598bool webKitMediaSrcIsReadyForMoreSamples(WebKitMediaSrc* source, const AtomString& streamName)
     599{
     600    ASSERT(isMainThread());
     601    Stream* stream = source->priv->streamByName(streamName);
     602    return stream->reportedStatus->isReadyForMoreSamples;
     603}
     604
     605void webKitMediaSrcNotifyWhenReadyForMoreSamples(WebKitMediaSrc* source, const AtomString& streamName, WebCore::SourceBufferPrivateClient* sourceBufferPrivate)
     606{
     607    ASSERT(isMainThread());
     608    Stream* stream = source->priv->streamByName(streamName);
     609    ASSERT(!stream->reportedStatus->isReadyForMoreSamples);
     610    stream->reportedStatus->sourceBufferPrivateToNotify = sourceBufferPrivate;
     611}
     612
     613static GstStateChangeReturn webKitMediaSrcChangeState(GstElement* element, GstStateChange transition)
     614{
     615    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(element);
     616    if (transition == GST_STATE_CHANGE_PAUSED_TO_READY) {
     617        while (!source->priv->streams.isEmpty())
     618            webKitMediaSrcRemoveStream(source, source->priv->streams.begin()->key);
     619    }
     620    return GST_ELEMENT_CLASS(webkit_media_src_parent_class)->change_state(element, transition);
     621}
     622
     623static void webKitMediaSrcStreamFlushStart(const RefPtr<Stream>& stream)
     624{
     625    ASSERT(isMainThread());
     626    {
     627        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     628
     629        streamingMembers->isFlushing = true;
     630        streamingMembers->queueChangedOrFlushedCondition.notifyOne();
     631        streamingMembers->padLinkedOrFlushedCondition.notifyOne();
     632    }
     633
     634    gst_pad_push_event(stream->pad.get(), gst_event_new_flush_start());
     635}
     636
     637static void webKitMediaSrcStreamFlushStop(const RefPtr<Stream>& stream, bool resetTime)
     638{
     639    ASSERT(isMainThread());
     640
     641    // By taking the stream lock we are waiting for the streaming thread task to stop if it hadn't yet.
     642    GST_PAD_STREAM_LOCK(stream->pad.get());
     643    {
     644        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     645
     646        streamingMembers->isFlushing = false;
     647        streamingMembers->doesNeedSegmentEvent = true;
     648        streamingMembers->queue.clear();
     649        if (streamingMembers->doesNeedToNotifyOnLowWaterLevel) {
     650            streamingMembers->doesNeedToNotifyOnLowWaterLevel = false;
     651            webKitMediaSrcStreamNotifyLowWaterLevel(stream);
     652        }
     653    }
     654
     655    // Since FLUSH_STOP is a synchronized event, we send it while we still hold the stream lock of the pad.
     656    gst_pad_push_event(stream->pad.get(), gst_event_new_flush_stop(resetTime));
     657
     658    gst_pad_start_task(stream->pad.get(), webKitMediaSrcLoop, stream->pad.get(), nullptr);
     659    GST_PAD_STREAM_UNLOCK(stream->pad.get());
     660}
     661
     662void webKitMediaSrcFlush(WebKitMediaSrc* source, const AtomString& streamName)
     663{
     664    ASSERT(isMainThread());
     665    Stream* stream = source->priv->streamByName(streamName);
     666
     667    bool hasPushedFirstBuffer;
     668    {
     669        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     670        hasPushedFirstBuffer = streamingMembers->hasPushedFirstBuffer;
     671    }
     672
     673    if (hasPushedFirstBuffer) {
     674        // If no buffer has been pushed yet there is no need to flush... and flushing at that point could
     675        // expose bugs in downstream elements which may not have completely initialized yet (e.g. decodebin3 not
     676        // having linked the chain yet and forgetting to do it after the flush).
     677        webKitMediaSrcStreamFlushStart(stream);
     678    }
     679
     680    GstClockTime pipelineStreamTime;
     681    gst_element_query_position(findPipeline(GRefPtr<GstElement>(GST_ELEMENT(source))).get(), GST_FORMAT_TIME,
     682        reinterpret_cast<gint64*>(&pipelineStreamTime));
     683    // -1 is returned when the pipeline is not yet pre-rolled (e.g. just after a seek). In this case we don't need to
     684    // adjust the segment though, as running time has not advanced.
     685    if (GST_CLOCK_TIME_IS_VALID(pipelineStreamTime)) {
     686        DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     687        // We need to increase the base by the running time accumulated during the previous segment.
     688
     689        GstClockTime pipelineRunningTime = gst_segment_to_running_time(&streamingMembers->segment, GST_FORMAT_TIME, pipelineStreamTime);
     690        assert(GST_CLOCK_TIME_IS_VALID(pipelineRunningTime));
     691        streamingMembers->segment.base = pipelineRunningTime;
     692
     693        streamingMembers->segment.start = streamingMembers->segment.time = static_cast<GstClockTime>(pipelineStreamTime);
     694    }
     695
     696    if (hasPushedFirstBuffer)
     697        webKitMediaSrcStreamFlushStop(stream, false);
     698}
     699
     700void webKitMediaSrcSeek(WebKitMediaSrc* source, uint64_t startTime, double rate)
     701{
     702    ASSERT(isMainThread());
     703    source->priv->startTime = startTime;
     704    source->priv->rate = rate;
     705
     706    for (auto& pair : source->priv->streams) {
     707        const RefPtr<Stream>& stream = pair.value;
     708        bool hasPushedFirstBuffer;
     709        {
     710            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     711            hasPushedFirstBuffer = streamingMembers->hasPushedFirstBuffer;
     712        }
     713
     714        if (hasPushedFirstBuffer) {
     715            // If no buffer has been pushed yet there is no need to flush... and flushing at that point could
     716            // expose bugs in downstream elements which may not have completely initialized yet (e.g. decodebin3 not
     717            // having linked the chain yet and forgetting to do it after the flush).
     718            webKitMediaSrcStreamFlushStart(stream);
     719        }
     720
     721        {
     722            DataMutex<Stream::StreamingMembers>::LockedWrapper streamingMembers(stream->streamingMembersDataMutex);
     723            streamingMembers->segment.base = 0;
     724            streamingMembers->segment.rate = rate;
     725            streamingMembers->segment.start = streamingMembers->segment.time = startTime;
     726        }
     727
     728        if (hasPushedFirstBuffer)
     729            webKitMediaSrcStreamFlushStop(stream, true);
     730    }
     731}
     732
     733static int countStreamsOfType(WebKitMediaSrc* source, WebCore::MediaSourceStreamTypeGStreamer type)
     734{
     735    // Barring pipeline dumps someone may add during debugging, WebKit will only read these properties (n-video etc.) from the main thread.
     736    return std::count_if(source->priv->streams.begin(), source->priv->streams.end(), [type](auto item) {
     737        return item.value->type == type;
     738    });
     739}
     740
     741static void webKitMediaSrcGetProperty(GObject* object, unsigned propId, GValue* value, GParamSpec* pspec)
    294742{
    295743    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
    296744
    297745    switch (propId) {
    298     case PROP_LOCATION:
    299         gst_uri_handler_set_uri(reinterpret_cast<GstURIHandler*>(source), g_value_get_string(value), nullptr);
     746    case PROP_N_AUDIO:
     747        g_value_set_int(value, countStreamsOfType(source, WebCore::MediaSourceStreamTypeGStreamer::Audio));
     748        break;
     749    case PROP_N_VIDEO:
     750        g_value_set_int(value, countStreamsOfType(source, WebCore::MediaSourceStreamTypeGStreamer::Video));
     751        break;
     752    case PROP_N_TEXT:
     753        g_value_set_int(value, countStreamsOfType(source, WebCore::MediaSourceStreamTypeGStreamer::Text));
    300754        break;
    301755    default:
    302756        G_OBJECT_WARN_INVALID_PROPERTY_ID(object, propId, pspec);
    303         break;
    304     }
    305 }
    306 
    307 void webKitMediaSrcGetProperty(GObject* object, guint propId, GValue* value, GParamSpec* pspec)
    308 {
    309     WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(object);
    310     WebKitMediaSrcPrivate* priv = source->priv;
    311 
    312     GST_OBJECT_LOCK(source);
    313     switch (propId) {
    314     case PROP_LOCATION:
    315         g_value_set_string(value, priv->location.get());
    316         break;
    317     case PROP_N_AUDIO:
    318         g_value_set_int(value, priv->numberOfAudioStreams);
    319         break;
    320     case PROP_N_VIDEO:
    321         g_value_set_int(value, priv->numberOfVideoStreams);
    322         break;
    323     case PROP_N_TEXT:
    324         g_value_set_int(value, priv->numberOfTextStreams);
    325         break;
    326     default:
    327         G_OBJECT_WARN_INVALID_PROPERTY_ID(object, propId, pspec);
    328         break;
    329     }
    330     GST_OBJECT_UNLOCK(source);
    331 }
    332 
    333 void webKitMediaSrcDoAsyncStart(WebKitMediaSrc* source)
    334 {
    335     source->priv->asyncStart = true;
    336     GST_BIN_CLASS(parent_class)->handle_message(GST_BIN(source),
    337         gst_message_new_async_start(GST_OBJECT(source)));
    338 }
    339 
    340 void webKitMediaSrcDoAsyncDone(WebKitMediaSrc* source)
    341 {
    342     WebKitMediaSrcPrivate* priv = source->priv;
    343     if (priv->asyncStart) {
    344         GST_BIN_CLASS(parent_class)->handle_message(GST_BIN(source),
    345             gst_message_new_async_done(GST_OBJECT(source), GST_CLOCK_TIME_NONE));
    346         priv->asyncStart = false;
    347     }
    348 }
    349 
    350 GstStateChangeReturn webKitMediaSrcChangeState(GstElement* element, GstStateChange transition)
    351 {
    352     WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(element);
    353     WebKitMediaSrcPrivate* priv = source->priv;
    354 
    355     switch (transition) {
    356     case GST_STATE_CHANGE_READY_TO_PAUSED:
    357         priv->allTracksConfigured = false;
    358         webKitMediaSrcDoAsyncStart(source);
    359         break;
    360     default:
    361         break;
    362     }
    363 
    364     GstStateChangeReturn result = GST_ELEMENT_CLASS(parent_class)->change_state(element, transition);
    365     if (G_UNLIKELY(result == GST_STATE_CHANGE_FAILURE)) {
    366         GST_WARNING_OBJECT(source, "State change failed");
    367         webKitMediaSrcDoAsyncDone(source);
    368         return result;
    369     }
    370 
    371     switch (transition) {
    372     case GST_STATE_CHANGE_READY_TO_PAUSED:
    373         result = GST_STATE_CHANGE_ASYNC;
    374         break;
    375     case GST_STATE_CHANGE_PAUSED_TO_READY:
    376         webKitMediaSrcDoAsyncDone(source);
    377         priv->allTracksConfigured = false;
    378         break;
    379     default:
    380         break;
    381     }
    382 
    383     return result;
    384 }
    385 
    386 gint64 webKitMediaSrcGetSize(WebKitMediaSrc* webKitMediaSrc)
    387 {
    388     gint64 duration = 0;
    389     for (Stream* stream : webKitMediaSrc->priv->streams)
    390         duration = std::max<gint64>(duration, gst_app_src_get_size(GST_APP_SRC(stream->appsrc)));
    391     return duration;
    392 }
    393 
    394 gboolean webKitMediaSrcQueryWithParent(GstPad* pad, GstObject* parent, GstQuery* query)
    395 {
    396     WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(GST_ELEMENT(parent));
    397     gboolean result = FALSE;
    398 
    399     switch (GST_QUERY_TYPE(query)) {
    400     case GST_QUERY_DURATION: {
    401         GstFormat format;
    402         gst_query_parse_duration(query, &format, nullptr);
    403 
    404         GST_DEBUG_OBJECT(source, "duration query in format %s", gst_format_get_name(format));
    405         GST_OBJECT_LOCK(source);
    406         switch (format) {
    407         case GST_FORMAT_TIME: {
    408             if (source->priv && source->priv->mediaPlayerPrivate) {
    409                 MediaTime duration = source->priv->mediaPlayerPrivate->durationMediaTime();
    410                 if (duration > MediaTime::zeroTime()) {
    411                     gst_query_set_duration(query, format, WebCore::toGstClockTime(duration));
    412                     GST_DEBUG_OBJECT(source, "Answering: duration=%" GST_TIME_FORMAT, GST_TIME_ARGS(WebCore::toGstClockTime(duration)));
    413                     result = TRUE;
    414                 }
    415             }
    416             break;
    417         }
    418         case GST_FORMAT_BYTES: {
    419             if (source->priv) {
    420                 gint64 duration = webKitMediaSrcGetSize(source);
    421                 if (duration) {
    422                     gst_query_set_duration(query, format, duration);
    423                     GST_DEBUG_OBJECT(source, "size: %" G_GINT64_FORMAT, duration);
    424                     result = TRUE;
    425                 }
    426             }
    427             break;
    428         }
    429         default:
    430             break;
    431         }
    432 
    433         GST_OBJECT_UNLOCK(source);
    434         break;
    435     }
    436     case GST_QUERY_URI:
    437         if (source) {
    438             GST_OBJECT_LOCK(source);
    439             if (source->priv)
    440                 gst_query_set_uri(query, source->priv->location.get());
    441             GST_OBJECT_UNLOCK(source);
    442         }
    443         result = TRUE;
    444         break;
    445     default: {
    446         GRefPtr<GstPad> target = adoptGRef(gst_ghost_pad_get_target(GST_GHOST_PAD_CAST(pad)));
    447         // Forward the query to the proxy target pad.
    448         if (target)
    449             result = gst_pad_query(target.get(), query);
    450         break;
    451     }
    452     }
    453 
    454     return result;
    455 }
    456 
    457 void webKitMediaSrcUpdatePresentationSize(GstCaps* caps, Stream* stream)
    458 {
    459     GST_OBJECT_LOCK(stream->parent);
    460     if (WebCore::doCapsHaveType(caps, GST_VIDEO_CAPS_TYPE_PREFIX)) {
    461         Optional<WebCore::FloatSize> size = WebCore::getVideoResolutionFromCaps(caps);
    462         if (size.hasValue())
    463             stream->presentationSize = size.value();
    464         else
    465             stream->presentationSize = WebCore::FloatSize();
    466     } else
    467         stream->presentationSize = WebCore::FloatSize();
    468 
    469     gst_caps_ref(caps);
    470     stream->caps = adoptGRef(caps);
    471     GST_OBJECT_UNLOCK(stream->parent);
    472 }
    473 
    474 void webKitMediaSrcLinkStreamToSrcPad(GstPad* sourcePad, Stream* stream)
    475 {
    476     unsigned padId = static_cast<unsigned>(GPOINTER_TO_INT(g_object_get_data(G_OBJECT(sourcePad), "padId")));
    477     GST_DEBUG_OBJECT(stream->parent, "linking stream to src pad (id: %u)", padId);
    478 
    479     GUniquePtr<gchar> padName(g_strdup_printf("src_%u", padId));
    480     GstPad* ghostpad = WebCore::webkitGstGhostPadFromStaticTemplate(&srcTemplate, padName.get(), sourcePad);
    481 
    482     auto proxypad = adoptGRef(GST_PAD(gst_proxy_pad_get_internal(GST_PROXY_PAD(ghostpad))));
    483     gst_flow_combiner_add_pad(stream->parent->priv->flowCombiner.get(), proxypad.get());
    484     gst_pad_set_chain_function(proxypad.get(), static_cast<GstPadChainFunction>(webkitMediaSrcChain));
    485     gst_pad_set_query_function(ghostpad, webKitMediaSrcQueryWithParent);
    486 
    487     gst_pad_set_active(ghostpad, TRUE);
    488     gst_element_add_pad(GST_ELEMENT(stream->parent), ghostpad);
    489 }
    490 
    491 void webKitMediaSrcLinkSourcePad(GstPad* sourcePad, GstCaps* caps, Stream* stream)
    492 {
    493     ASSERT(caps && stream->parent);
    494     if (!caps || !stream->parent) {
    495         GST_ERROR("Unable to link parser");
    496         return;
    497     }
    498 
    499     webKitMediaSrcUpdatePresentationSize(caps, stream);
    500 
    501     // FIXME: drop webKitMediaSrcLinkStreamToSrcPad() and move its code here.
    502     if (!gst_pad_is_linked(sourcePad)) {
    503         GST_DEBUG_OBJECT(stream->parent, "pad not linked yet");
    504         webKitMediaSrcLinkStreamToSrcPad(sourcePad, stream);
    505     }
    506 
    507     webKitMediaSrcCheckAllTracksConfigured(stream->parent);
    508 }
    509 
    510 void webKitMediaSrcFreeStream(WebKitMediaSrc* source, Stream* stream)
    511 {
    512     if (GST_IS_APP_SRC(stream->appsrc)) {
    513         // Don't trigger callbacks from this appsrc to avoid using the stream anymore.
    514         gst_app_src_set_callbacks(GST_APP_SRC(stream->appsrc), &disabledAppsrcCallbacks, nullptr, nullptr);
    515         gst_app_src_end_of_stream(GST_APP_SRC(stream->appsrc));
    516     }
    517 
    518     GST_OBJECT_LOCK(source);
    519     switch (stream->type) {
    520     case WebCore::Audio:
    521         source->priv->numberOfAudioStreams--;
    522         break;
    523     case WebCore::Video:
    524         source->priv->numberOfVideoStreams--;
    525         break;
    526     case WebCore::Text:
    527         source->priv->numberOfTextStreams--;
    528         break;
    529     default:
    530         break;
    531     }
    532     GST_OBJECT_UNLOCK(source);
    533 
    534     if (stream->type != WebCore::Invalid) {
    535         GST_DEBUG("Freeing track-related info on stream %p", stream);
    536 
    537         LockHolder locker(source->priv->streamLock);
    538 
    539         if (stream->caps)
    540             stream->caps = nullptr;
    541 
    542         if (stream->audioTrack)
    543             stream->audioTrack = nullptr;
    544         if (stream->videoTrack)
    545             stream->videoTrack = nullptr;
    546 
    547         int signal = -1;
    548         switch (stream->type) {
    549         case WebCore::Audio:
    550             signal = SIGNAL_AUDIO_CHANGED;
    551             break;
    552         case WebCore::Video:
    553             signal = SIGNAL_VIDEO_CHANGED;
    554             break;
    555         case WebCore::Text:
    556             signal = SIGNAL_TEXT_CHANGED;
    557             break;
    558         default:
    559             break;
    560         }
    561         stream->type = WebCore::Invalid;
    562 
    563         if (signal != -1)
    564             g_signal_emit(G_OBJECT(source), webKitMediaSrcSignals[signal], 0, nullptr);
    565 
    566         source->priv->streamCondition.notifyOne();
    567     }
    568 
    569     GST_DEBUG("Releasing stream: %p", stream);
    570     delete stream;
    571 }
    572 
    573 void webKitMediaSrcCheckAllTracksConfigured(WebKitMediaSrc* webKitMediaSrc)
    574 {
    575     bool allTracksConfigured = false;
    576 
    577     GST_OBJECT_LOCK(webKitMediaSrc);
    578     if (!webKitMediaSrc->priv->allTracksConfigured) {
    579         allTracksConfigured = true;
    580         for (Stream* stream : webKitMediaSrc->priv->streams) {
    581             if (stream->type == WebCore::Invalid) {
    582                 allTracksConfigured = false;
    583                 break;
    584             }
    585         }
    586         if (allTracksConfigured)
    587             webKitMediaSrc->priv->allTracksConfigured = true;
    588     }
    589     GST_OBJECT_UNLOCK(webKitMediaSrc);
    590 
    591     if (allTracksConfigured) {
    592         GST_DEBUG("All tracks attached. Completing async state change operation.");
    593         gst_element_no_more_pads(GST_ELEMENT(webKitMediaSrc));
    594         webKitMediaSrcDoAsyncDone(webKitMediaSrc);
    595     }
    596 }
    597 
    598 // Uri handler interface.
    599 GstURIType webKitMediaSrcUriGetType(GType)
     757    }
     758}
     759
     760// URI handler interface. Its only purpose is to let the element be instantiated by playbin on "mediasourceblob:"
     761// URIs. The actual URI does not matter.
     762static GstURIType webKitMediaSrcUriGetType(GType)
    600763{
    601764    return GST_URI_SRC;
    602765}
    603766
    604 const gchar* const* webKitMediaSrcGetProtocols(GType)
     767static const gchar* const* webKitMediaSrcGetProtocols(GType)
    605768{
    606769    static const char* protocols[] = {"mediasourceblob", nullptr };
    607770    return protocols;
    608771}
    609772
    610 gchar* webKitMediaSrcGetUri(GstURIHandler* handler)
     773static gchar* webKitMediaSrcGetUri(GstURIHandler* handler)
    611774{
    612775    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(handler);
    613776    gchar* result;
    614777
    615778    GST_OBJECT_LOCK(source);
    616     result = g_strdup(source->priv->location.get());
     779    result = g_strdup(source->priv->uri.get());
    617780    GST_OBJECT_UNLOCK(source);
    618781    return result;
    619782}
    620783
    621 gboolean webKitMediaSrcSetUri(GstURIHandler* handler, const gchar* uri, GError**)
     784static gboolean webKitMediaSrcSetUri(GstURIHandler* handler, const gchar* uri, GError**)
    622785{
    623786    WebKitMediaSrc* source = WEBKIT_MEDIA_SRC(handler);
    624787
    625788    if (GST_STATE(source) >= GST_STATE_PAUSED) {
    626789        GST_ERROR_OBJECT(source, "URI can only be set in states < PAUSED");
    627         return FALSE;
     790        return false;
    628791    }
    629792
    630793    GST_OBJECT_LOCK(source);
    631     WebKitMediaSrcPrivate* priv = source->priv;
    632     priv->location = nullptr;
    633     if (!uri) {
    634         GST_OBJECT_UNLOCK(source);
    635         return TRUE;
    636     }
    637 
    638     URL url(URL(), uri);
    639 
    640     priv->location = GUniquePtr<gchar>(g_strdup(url.string().utf8().data()));
     794    source->priv->uri = GUniquePtr<char>(g_strdup(uri));
    641795    GST_OBJECT_UNLOCK(source);
    642796    return TRUE;
    643797}
    644798
    645 void webKitMediaSrcUriHandlerInit(gpointer gIface, gpointer)
     799static void webKitMediaSrcUriHandlerInit(void* gIface, void*)
    646800{
    647801    GstURIHandlerInterface* iface = (GstURIHandlerInterface *) gIface;
    648802
    649803    iface->get_type = webKitMediaSrcUriGetType;
    650804    iface->get_protocols = webKitMediaSrcGetProtocols;
    651805    iface->get_uri = webKitMediaSrcGetUri;
    652806    iface->set_uri = webKitMediaSrcSetUri;
    653807}
    654808
    655 static void seekNeedsDataMainThread(WebKitMediaSrc* source)
    656 {
    657     GST_DEBUG("Buffering needed before seek");
    658 
    659     ASSERT(WTF::isMainThread());
    660 
    661     GST_OBJECT_LOCK(source);
    662     MediaTime seekTime = source->priv->seekTime;
    663     WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate = source->priv->mediaPlayerPrivate;
    664 
    665     if (!mediaPlayerPrivate) {
    666         GST_OBJECT_UNLOCK(source);
    667         return;
    668     }
    669 
    670     for (Stream* stream : source->priv->streams) {
    671         if (stream->type != WebCore::Invalid)
    672             stream->sourceBuffer->setReadyForMoreSamples(true);
    673     }
    674     GST_OBJECT_UNLOCK(source);
    675     mediaPlayerPrivate->notifySeekNeedsDataForTime(seekTime);
    676 }
    677 
    678 static void notifyReadyForMoreSamplesMainThread(WebKitMediaSrc* source, Stream* appsrcStream)
    679 {
    680     GST_OBJECT_LOCK(source);
    681 
    682     auto it = std::find(source->priv->streams.begin(), source->priv->streams.end(), appsrcStream);
    683     if (it == source->priv->streams.end()) {
    684         GST_OBJECT_UNLOCK(source);
    685         return;
    686     }
    687 
    688     WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate = source->priv->mediaPlayerPrivate;
    689     if (mediaPlayerPrivate && !mediaPlayerPrivate->seeking())
    690         appsrcStream->sourceBuffer->notifyReadyForMoreSamples();
    691 
    692     GST_OBJECT_UNLOCK(source);
    693 }
    694 
    695 void webKitMediaSrcSetMediaPlayerPrivate(WebKitMediaSrc* source, WebCore::MediaPlayerPrivateGStreamerMSE* mediaPlayerPrivate)
    696 {
    697     GST_OBJECT_LOCK(source);
    698 
    699     // Set to nullptr on MediaPlayerPrivateGStreamer destruction, never a dangling pointer.
    700     source->priv->mediaPlayerPrivate = mediaPlayerPrivate;
    701     GST_OBJECT_UNLOCK(source);
    702 }
    703 
    704 void webKitMediaSrcSetReadyForSamples(WebKitMediaSrc* source, bool isReady)
    705 {
    706     if (source) {
    707         GST_OBJECT_LOCK(source);
    708         for (Stream* stream : source->priv->streams)
    709             stream->sourceBuffer->setReadyForMoreSamples(isReady);
    710         GST_OBJECT_UNLOCK(source);
    711     }
    712 }
    713 
    714 void webKitMediaSrcPrepareSeek(WebKitMediaSrc* source, const MediaTime& time)
    715 {
    716     GST_OBJECT_LOCK(source);
    717     source->priv->seekTime = time;
    718     source->priv->appsrcSeekDataCount = 0;
    719     source->priv->appsrcNeedDataCount = 0;
    720 
    721     for (Stream* stream : source->priv->streams) {
    722         stream->appsrcNeedDataFlag = false;
    723         // Don't allow samples away from the seekTime to be enqueued.
    724         stream->lastEnqueuedTime = time;
    725     }
    726 
    727     // The pending action will be performed in enabledAppsrcSeekData().
    728     source->priv->appsrcSeekDataNextAction = MediaSourceSeekToTime;
    729     GST_OBJECT_UNLOCK(source);
    730 }
    731 
    732809namespace WTF {
    733810template <> GRefPtr<WebKitMediaSrc> adoptGRef(WebKitMediaSrc* ptr)
    734811{
    735812    ASSERT(!ptr || !g_object_is_floating(ptr));
    736813    return GRefPtr<WebKitMediaSrc>(ptr, GRefPtrAdopt);
    737814}
    738815
    739816template <> WebKitMediaSrc* refGPtr<WebKitMediaSrc>(WebKitMediaSrc* ptr)
    740817{
    741818    if (ptr)
    742819        gst_object_ref_sink(GST_OBJECT(ptr));
    743820
    744821    return ptr;
    745822}
    746823
    747824template <> void derefGPtr<WebKitMediaSrc>(WebKitMediaSrc* ptr)
    748825{
    749826    if (ptr)
    750827        gst_object_unref(ptr);
    751828}
    752 };
     829} // namespace WTF
    753830
    754831#endif // USE(GSTREAMER)
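
The segment bookkeeping in webKitMediaSrcFlush() and webKitMediaSrcSeek() above is plain GstSegment arithmetic: after a flush, the running time already accumulated is folded into segment.base so that running time keeps growing monotonically downstream, while segment.start and segment.time are reset to the current stream position. The standalone sketch below is illustrative only, not code from the changeset, and the stream time value is hypothetical.

    // Illustrative sketch, not from the changeset: shows the GstSegment arithmetic
    // applied on flush. The 8 second stream time is a made-up value.
    #include <gst/gst.h>

    int main(int argc, char** argv)
    {
        gst_init(&argc, &argv);

        GstSegment segment;
        gst_segment_init(&segment, GST_FORMAT_TIME);

        // Suppose playback had reached stream time 8s when the flush happens.
        GstClockTime pipelineStreamTime = 8 * GST_SECOND;

        // The running time accumulated under the previous segment becomes the new base...
        GstClockTime pipelineRunningTime = gst_segment_to_running_time(&segment, GST_FORMAT_TIME, pipelineStreamTime);
        segment.base = pipelineRunningTime;

        // ...and the segment restarts at the current stream position, so downstream sees
        // running time continue monotonically across the flush.
        segment.start = segment.time = pipelineStreamTime;

        g_print("base=%" GST_TIME_FORMAT " start=%" GST_TIME_FORMAT "\n",
            GST_TIME_ARGS(segment.base), GST_TIME_ARGS(segment.start));
        return 0;
    }

webKitMediaSrcSeek() writes the same fields but resets base to 0 and sends flush-stop with resetTime set, so running time restarts from zero at the seek target.
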
  • trunk/Source/WebCore/platform/graphics/gstreamer/mse/WebKitMediaSourceGStreamer.h

    r230625 r249205  
    44 *  Copyright (C) 2013 Orange
    55 *  Copyright (C) 2014, 2015 Sebastian Dröge <sebastian@centricular.com>
    6  *  Copyright (C) 2015, 2016 Metrological Group B.V.
    7  *  Copyright (C) 2015, 2016 Igalia, S.L
     6 *  Copyright (C) 2015, 2016, 2018, 2019 Metrological Group B.V.
     7 *  Copyright (C) 2015, 2016, 2018, 2019 Igalia, S.L
    88 *
    99 *  This library is free software; you can redistribute it and/or
     
    4040enum MediaSourceStreamTypeGStreamer { Invalid, Unknown, Audio, Video, Text };
    4141
    42 }
     42} // namespace WebCore
    4343
    4444G_BEGIN_DECLS
     
    5050#define WEBKIT_IS_MEDIA_SRC_CLASS(klass) (G_TYPE_CHECK_CLASS_TYPE ((klass), WEBKIT_TYPE_MEDIA_SRC))
    5151
    52 typedef struct _WebKitMediaSrc        WebKitMediaSrc;
    53 typedef struct _WebKitMediaSrcClass   WebKitMediaSrcClass;
    54 typedef struct _WebKitMediaSrcPrivate WebKitMediaSrcPrivate;
     52struct WebKitMediaSrcPrivate;
    5553
    56 struct _WebKitMediaSrc {
    57     GstBin parent;
     54struct WebKitMediaSrc {
     55    GstElement parent;
    5856
    5957    WebKitMediaSrcPrivate *priv;
    6058};
    6159
    62 struct _WebKitMediaSrcClass {
    63     GstBinClass parentClass;
    64 
    65     // Notify app that number of audio/video/text streams changed.
    66     void (*videoChanged)(WebKitMediaSrc*);
    67     void (*audioChanged)(WebKitMediaSrc*);
    68     void (*textChanged)(WebKitMediaSrc*);
     60struct WebKitMediaSrcClass {
     61    GstElementClass parentClass;
    6962};
    7063
    7164GType webkit_media_src_get_type(void);
    7265
    73 void webKitMediaSrcSetMediaPlayerPrivate(WebKitMediaSrc*, WebCore::MediaPlayerPrivateGStreamerMSE*);
     66void webKitMediaSrcAddStream(WebKitMediaSrc*, const AtomString& name, WebCore::MediaSourceStreamTypeGStreamer, GRefPtr<GstCaps>&& initialCaps);
     67void webKitMediaSrcRemoveStream(WebKitMediaSrc*, const AtomString& name);
    7468
    75 void webKitMediaSrcPrepareSeek(WebKitMediaSrc*, const MediaTime&);
    76 void webKitMediaSrcSetReadyForSamples(WebKitMediaSrc*, bool);
     69void webKitMediaSrcEnqueueSample(WebKitMediaSrc*, const AtomString& streamName, GRefPtr<GstSample>&&);
     70void webKitMediaSrcEndOfStream(WebKitMediaSrc*, const AtomString& streamName);
     71
     72bool webKitMediaSrcIsReadyForMoreSamples(WebKitMediaSrc*, const AtomString& streamName);
     73void webKitMediaSrcNotifyWhenReadyForMoreSamples(WebKitMediaSrc*, const AtomString& streamName, WebCore::SourceBufferPrivateClient*);
     74
     75void webKitMediaSrcFlush(WebKitMediaSrc*, const AtomString& streamName);
     76void webKitMediaSrcSeek(WebKitMediaSrc*, guint64 startTime, double rate);
    7777
    7878G_END_DECLS
    7979
     80namespace WTF {
     81template<> GRefPtr<WebKitMediaSrc> adoptGRef(WebKitMediaSrc* ptr);
     82template<> WebKitMediaSrc* refGPtr<WebKitMediaSrc>(WebKitMediaSrc* ptr);
     83template<> void derefGPtr<WebKitMediaSrc>(WebKitMediaSrc* ptr);
     84} // namespace WTF
     85
    8086#endif // USE(GSTREAMER)
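
The new header boils the element's surface down to a per-stream API driven from the main thread. A rough usage sketch follows; it is illustrative only (not code from the changeset), and the stream name, caps and helper names are placeholders for whatever the SourceBuffer-side caller actually uses.

    #include "WebKitMediaSourceGStreamer.h"

    // Illustrative sketch, not from the changeset. Registers a video stream once, with
    // its initial caps, before any sample is pushed. The caps are a placeholder.
    static void exampleRegisterVideoStream(WebKitMediaSrc* source, const AtomString& name)
    {
        GRefPtr<GstCaps> initialCaps = adoptGRef(gst_caps_new_empty_simple("video/x-h264"));
        webKitMediaSrcAddStream(source, name, WebCore::MediaSourceStreamTypeGStreamer::Video, WTFMove(initialCaps));
    }

    // Append path, also on the main thread. Samples must carry a valid PTS
    // (webKitMediaSrcEnqueueSample() asserts this).
    static void exampleEnqueue(WebKitMediaSrc* source, const AtomString& name,
        GRefPtr<GstSample>&& sample, WebCore::SourceBufferPrivateClient* client)
    {
        if (webKitMediaSrcIsReadyForMoreSamples(source, name))
            webKitMediaSrcEnqueueSample(source, name, WTFMove(sample));
        else {
            // Back-pressure: the element calls the client back on the main thread once the
            // enqueued duration falls below the low water level again.
            webKitMediaSrcNotifyWhenReadyForMoreSamples(source, name, client);
        }
    }

Seeks then go through webKitMediaSrcFlush() and webKitMediaSrcSeek() with the same stream names, and webKitMediaSrcEndOfStream() enqueues the EOS event on the stream's own queue.
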
  • trunk/Source/cmake/GStreamerChecks.cmake

    r246677 r249205  
    3838endif ()
    3939
    40 if (ENABLE_MEDIA_SOURCE AND PC_GSTREAMER_VERSION VERSION_LESS "1.14")
    41     message(FATAL_ERROR "GStreamer 1.14 is needed for ENABLE_MEDIA_SOURCE.")
     40if (ENABLE_MEDIA_SOURCE AND PC_GSTREAMER_VERSION VERSION_LESS "1.16")
     41    message(FATAL_ERROR "GStreamer 1.16 is needed for ENABLE_MEDIA_SOURCE.")
    4242endif ()
    4343
  • trunk/Tools/ChangeLog

    r249204 r249205  
     12019-08-28  Alicia Boya García  <aboya@igalia.com>
     2
     3        [MSE][GStreamer] WebKitMediaSrc rework
     4        https://bugs.webkit.org/show_bug.cgi?id=199719
     5
     6        Reviewed by Xabier Rodriguez-Calvar.
     7
     8        Added WebKitMediaSourceGStreamer.cpp to the GStreamer-style coding
     9        whitelist.
     10
     11        * Scripts/webkitpy/style/checker.py:
     12
    1132019-08-28  Alexey Proskuryakov  <ap@apple.com>
    214
  • trunk/Tools/Scripts/webkitpy/style/checker.py

    r248530 r249205  
    212212      os.path.join('Source', 'WebCore', 'platform', 'graphics', 'gstreamer', 'VideoSinkGStreamer.cpp'),
    213213      os.path.join('Source', 'WebCore', 'platform', 'graphics', 'gstreamer', 'WebKitWebSourceGStreamer.cpp'),
     214      os.path.join('Source', 'WebCore', 'platform', 'graphics', 'gstreamer', 'mse', 'WebKitMediaSourceGStreamer.cpp'),
    214215      os.path.join('Source', 'WebCore', 'platform', 'audio', 'gstreamer', 'WebKitWebAudioSourceGStreamer.cpp'),
    215216      os.path.join('Source', 'WebCore', 'platform', 'mediastream', 'gstreamer', 'GStreamerMediaStreamSource.h'),