Changeset 248530 in webkit


Timestamp:
Aug 12, 2019 9:56:01 AM
Author:
commit-queue@webkit.org
Message:

[GStreamer][WebRTC] Handle broken data in the libwebrtc GStreamer decoders
https://bugs.webkit.org/show_bug.cgi?id=200584

Patch by Thibault Saunier <tsaunier@igalia.com> on 2019-08-12
Reviewed by Philippe Normand.

Source/WebCore:

Listen to parser warning and error messages (synchronously, so that we react
right away) and request keyframes from the peer.

Also simplify the decoder code by making decoding happen in a single
pass, hiding away GStreamer threading and allowing us to react as soon
as the decoder or parser fails.
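
As a rough sketch of these two ideas (hypothetical and condensed for illustration; this is
not the code from the diff below, and the names Decoder, watchDecodeWarnings and
decodeOnePass are made up):

    // Hypothetical sketch: (1) listen for decode warnings synchronously on the
    // pipeline bus and remember that a keyframe is needed; (2) decode in a single
    // pass by pushing the encoded sample and immediately trying to pull the
    // decoded one, instead of waiting for an asynchronous "new-sample" callback.
    #include <gst/gst.h>
    #include <gst/app/gstappsrc.h>
    #include <gst/app/gstappsink.h>

    struct Decoder {
        GstElement* pipeline { nullptr };
        GstElement* src { nullptr };  // appsrc fed with encoded frames
        GstElement* sink { nullptr }; // appsink producing decoded frames
        bool needsKeyframe { true };
    };

    static void watchDecodeWarnings(Decoder* decoder)
    {
        GstBus* bus = gst_pipeline_get_bus(GST_PIPELINE(decoder->pipeline));
        // Synchronous emission: the handler runs on the streaming thread, so the
        // decoder reacts as soon as the parser reports broken data.
        gst_bus_enable_sync_message_emission(bus);
        g_signal_connect(bus, "sync-message::warning",
            G_CALLBACK(+[](GstBus*, GstMessage* message, Decoder* self) {
                GError* error = nullptr;
                gst_message_parse_warning(message, &error, nullptr);
                if (g_error_matches(error, GST_STREAM_ERROR, GST_STREAM_ERROR_DECODE))
                    self->needsKeyframe = true; // Decode() will then reject delta frames.
                g_clear_error(&error);
            }), decoder);
        gst_object_unref(bus);
    }

    static GstSample* decodeOnePass(Decoder* decoder, GstSample* encodedSample)
    {
        if (gst_app_src_push_sample(GST_APP_SRC(decoder->src), encodedSample) != GST_FLOW_OK)
            return nullptr;
        // Wait at most one frame duration; nullptr means the decoder needs more data.
        return gst_app_sink_try_pull_sample(GST_APP_SINK(decoder->sink), GST_SECOND / 30);
    }

In the actual patch the keyframe request is indirect: while m_needsKeyframe is set, Decode()
rejects delta frames with WEBRTC_VIDEO_CODEC_ERROR, which leads libwebrtc to request a
keyframe from the peer.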

  • platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.cpp:

(WebCore::GStreamerVideoDecoder::GStreamerVideoDecoder):
(WebCore::GStreamerVideoDecoder::pullSample):
(WebCore::H264Decoder::H264Decoder):

  • platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.cpp:

Tools:

Added an h264parse patch that posts a WARNING on the bus when a broken frame is detected.
Ignore the style checker's wtf_optional warning for files using the WebRTC optional type.
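
The patch file itself is only referenced below (under Tools/gstreamer); purely as a hypothetical
illustration, a parser element can post such a warning with the standard GST_ELEMENT_WARNING
macro, using the GST_STREAM_ERROR_DECODE code that the decoder-side handler in
GStreamerVideoDecoderFactory.cpp matches:

    // Hypothetical illustration only (not the patch's actual code): how an
    // element like h264parse can post a WARNING message on the bus when it
    // detects a broken frame.
    #include <gst/gst.h>

    static void reportBrokenFrame(GstElement* parser)
    {
        GST_ELEMENT_WARNING(parser, STREAM, DECODE,
            ("Broken frame detected"),
            ("Unable to parse the incoming H.264 data"));
    }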

  • Scripts/webkitpy/style/checker.py:
  • gstreamer/jhbuild.modules:
  • gstreamer/patches/gst-plugins-bad-0001-h264parse-Post-a-WARNING-when-data-is-broken.patch: Added.
Location:
trunk
Files:
1 added
6 edited

  • trunk/Source/WebCore/ChangeLog

    r248528 r248530  
     12019-08-12  Thibault Saunier  <tsaunier@igalia.com>
     2
     3        [GStreamer][WebRTC] Handle broken data in the libwebrtc GStreamer decoders
     4        https://bugs.webkit.org/show_bug.cgi?id=200584
     5
     6        Reviewed by Philippe Normand.
     7
     8        Listen to parser warning and error messages (synchronously, so that we react
     9        right away) and request keyframes from the peer.
     10
     11        Also simplify the decoder code by making decoding happen
     12        in a single pass, hiding away GStreamer threading and allowing
     13        us to react as soon as the decoder or parser fails.
     14
     15        * platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.cpp:
     16        (WebCore::GStreamerVideoDecoder::GStreamerVideoDecoder):
     17        (WebCore::GStreamerVideoDecoder::pullSample):
     18        (WebCore::H264Decoder::H264Decoder):
     19        * platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.cpp:
     20
    1212019-08-12  Antti Koivisto  <antti@apple.com>
    222
  • trunk/Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoDecoderFactory.cpp

    r248040 r248530  
    5757        , m_width(0)
    5858        , m_height(0)
     59        , m_requireParse(false)
     60        , m_needsKeyframe(true)
    5961        , m_firstBufferPts(GST_CLOCK_TIME_NONE)
    6062        , m_firstBufferDts(GST_CLOCK_TIME_NONE)
     
    8789        m_src = makeElement("appsrc");
    8890
     91        GRefPtr<GstCaps> caps = nullptr;
    8992        auto capsfilter = CreateFilter();
    9093        auto decoder = makeElement("decodebin");
     
    9598        }
    9699
     100        m_pipeline = makeElement("pipeline");
     101        connectSimpleBusMessageCallback(m_pipeline.get());
     102
     103        auto sinkpad = adoptGRef(gst_element_get_static_pad(capsfilter, "sink"));
     104        g_signal_connect(decoder, "pad-added", G_CALLBACK(decodebinPadAddedCb), sinkpad.get());
    97105        // Make the decoder output "parsed" frames only and let the main decodebin
    98106        // do the real decoding. This allows us to have optimized decoding/rendering
    99107        // happening in the main pipeline.
    100         g_object_set(decoder, "caps", adoptGRef(gst_caps_from_string(Caps())).get(), nullptr);
    101         auto sinkpad = gst_element_get_static_pad(capsfilter, "sink");
    102         g_signal_connect(decoder, "pad-added", G_CALLBACK(decodebinPadAddedCb), sinkpad);
    103 
    104         m_pipeline = makeElement("pipeline");
    105         connectSimpleBusMessageCallback(m_pipeline.get());
    106 
    107         auto sink = makeElement("appsink");
    108         gst_app_sink_set_emit_signals(GST_APP_SINK(sink), true);
    109         g_signal_connect(sink, "new-sample", G_CALLBACK(newSampleCallbackTramp), this);
    110         // This is an encoder, everything should happen as fast as possible and not
     108        if (m_requireParse) {
     109            caps = gst_caps_new_simple(Caps(), "parsed", G_TYPE_BOOLEAN, TRUE, nullptr);
     110            GRefPtr<GstBus> bus = adoptGRef(gst_pipeline_get_bus(GST_PIPELINE(m_pipeline.get())));
     111
     112            gst_bus_enable_sync_message_emission(bus.get());
     113            g_signal_connect(bus.get(), "sync-message::warning",
     114                G_CALLBACK(+[](GstBus*, GstMessage* message, GStreamerVideoDecoder* justThis) {
     115                GUniqueOutPtr<GError> err;
     116
     117                switch (GST_MESSAGE_TYPE(message)) {
     118                case GST_MESSAGE_WARNING: {
     119                    gst_message_parse_warning(message, &err.outPtr(), nullptr);
     120                    FALLTHROUGH;
     121                }
     122                case GST_MESSAGE_ERROR: {
     123                    if (!err)
     124                        gst_message_parse_error(message, &err.outPtr(), nullptr);
     125
     126                    if (g_error_matches(err.get(), GST_STREAM_ERROR, GST_STREAM_ERROR_DECODE)) {
     127                        GST_INFO_OBJECT(justThis->pipeline(), "--> needs keyframe (%s)",
     128                            err->message);
     129                        justThis->m_needsKeyframe = true;
     130                    }
     131                    break;
     132                }
     133                default:
     134                    break;
     135                }
     136                }), this);
     137        } else {
      138            /* FIXME: How should we handle missing keyframes when we do not plug in parsers? */
     139            caps = gst_caps_new_empty_simple(Caps());
     140        }
     141        g_object_set(decoder, "caps", caps.get(), nullptr);
     142
     143        m_sink = makeElement("appsink");
     144        gst_app_sink_set_emit_signals(GST_APP_SINK(m_sink), true);
     145        // This is a decoder, everything should happen as fast as possible and not
    111146        // be synced on the clock.
    112         g_object_set(sink, "sync", false, nullptr);
    113 
    114         gst_bin_add_many(GST_BIN(pipeline()), m_src, decoder, capsfilter, sink, nullptr);
     147        g_object_set(m_sink, "sync", false, nullptr);
     148
     149        gst_bin_add_many(GST_BIN(pipeline()), m_src, decoder, capsfilter, m_sink, nullptr);
    115150        if (!gst_element_link(m_src, decoder)) {
    116151            GST_ERROR_OBJECT(pipeline(), "Could not link src to decoder.");
     
    118153        }
    119154
    120         if (!gst_element_link(capsfilter, sink)) {
     155        if (!gst_element_link(capsfilter, m_sink)) {
    121156            GST_ERROR_OBJECT(pipeline(), "Could not link capsfilter to sink.");
    122157            return WEBRTC_VIDEO_CODEC_ERROR;
     
    151186            gst_element_set_state(m_pipeline.get(), GST_STATE_NULL);
    152187            m_src = nullptr;
     188            m_sink = nullptr;
    153189            m_pipeline = nullptr;
    154190        }
     
    162198        int64_t renderTimeMs) override
    163199    {
     200        if (m_needsKeyframe) {
     201            if (inputImage._frameType != webrtc::kVideoFrameKey) {
     202                GST_ERROR("Waiting for keyframe but got a delta unit... asking for keyframe");
     203                return WEBRTC_VIDEO_CODEC_ERROR;
     204            }
     205            if (inputImage._completeFrame)
     206                m_needsKeyframe = false;
     207            else {
     208                GST_ERROR("Waiting for keyframe but didn't get full frame, getting out");
     209                return WEBRTC_VIDEO_CODEC_ERROR;
     210            }
     211        }
     212
     213
    164214        if (!m_src) {
    165215            GST_ERROR("No source set, can't decode.");
     
    179229        GST_BUFFER_DTS(buffer.get()) = (static_cast<guint64>(inputImage.Timestamp()) * GST_MSECOND) - m_firstBufferDts;
    180230        GST_BUFFER_PTS(buffer.get()) = (static_cast<guint64>(renderTimeMs) * GST_MSECOND) - m_firstBufferPts;
    181         {
    182             auto locker = holdLock(m_bufferMapLock);
    183             InputTimestamps timestamps = {inputImage.Timestamp(), renderTimeMs};
    184             m_dtsPtsMap[GST_BUFFER_PTS(buffer.get())] = timestamps;
    185         }
     231        InputTimestamps timestamps = {inputImage.Timestamp(), renderTimeMs};
     232        m_dtsPtsMap[GST_BUFFER_PTS(buffer.get())] = timestamps;
    186233
    187234        GST_LOG_OBJECT(pipeline(), "%" G_GINT64_FORMAT " Decoding: %" GST_PTR_FORMAT, renderTimeMs, buffer.get());
     
    189236        switch (gst_app_src_push_sample(GST_APP_SRC(m_src), sample.get())) {
    190237        case GST_FLOW_OK:
    191             return WEBRTC_VIDEO_CODEC_OK;
     238            break;
    192239        case GST_FLOW_FLUSHING:
    193240            return WEBRTC_VIDEO_CODEC_UNINITIALIZED;
     
    195242            return WEBRTC_VIDEO_CODEC_ERROR;
    196243        }
     244
     245        return pullSample();
     246    }
     247
     248    int32_t pullSample()
     249    {
     250        auto sample = gst_app_sink_try_pull_sample(GST_APP_SINK(m_sink), GST_SECOND / 30);
     251        if (!sample) {
     252            GST_ERROR("Needs more data");
     253            return WEBRTC_VIDEO_CODEC_OK;
     254        }
     255        auto buffer = gst_sample_get_buffer(sample);
     256
      257        // Make sure frame.timestamp matches the _timeStamp of the previously input frame,
      258        // as required by the VideoDecoder base class.
     259        auto timestamps = m_dtsPtsMap[GST_BUFFER_PTS(buffer)];
     260        m_dtsPtsMap.erase(GST_BUFFER_PTS(buffer));
     261
     262        auto frame(LibWebRTCVideoFrameFromGStreamerSample(sample, webrtc::kVideoRotation_0,
     263            timestamps.timestamp, timestamps.renderTimeMs));
     264
     265        GST_BUFFER_DTS(buffer) = GST_CLOCK_TIME_NONE;
     266        GST_LOG_OBJECT(pipeline(), "Output decoded frame! %d -> %" GST_PTR_FORMAT,
     267            frame->timestamp(), buffer);
     268
     269        m_imageReadyCb->Decoded(*frame.get(), absl::optional<int32_t>(), absl::optional<uint8_t>());
     270
     271        return WEBRTC_VIDEO_CODEC_OK;
    197272    }
    198273
     
    245320    }
    246321
    247     GstFlowReturn newSampleCallback(GstElement* sink)
    248     {
    249         auto sample = gst_app_sink_pull_sample(GST_APP_SINK(sink));
    250         auto buffer = gst_sample_get_buffer(sample);
    251 
    252         m_bufferMapLock.lock();
    253         // Make sure that the frame.timestamp == previsouly input_frame._timeStamp
    254         // as it is required by the VideoDecoder baseclass.
    255         auto timestamps = m_dtsPtsMap[GST_BUFFER_PTS(buffer)];
    256         m_dtsPtsMap.erase(GST_BUFFER_PTS(buffer));
    257         m_bufferMapLock.unlock();
    258 
    259         auto frame(LibWebRTCVideoFrameFromGStreamerSample(sample, webrtc::kVideoRotation_0,
    260             timestamps.timestamp, timestamps.renderTimeMs));
    261 
    262         GST_BUFFER_DTS(buffer) = GST_CLOCK_TIME_NONE;
    263         GST_LOG_OBJECT(pipeline(), "Output decoded frame! %d -> %" GST_PTR_FORMAT,
    264             frame->timestamp(), buffer);
    265 
    266         m_imageReadyCb->Decoded(*frame.get(), absl::optional<int32_t>(), absl::optional<uint8_t>());
    267 
    268         return GST_FLOW_OK;
    269     }
    270 
    271322    virtual const gchar* Caps() = 0;
    272323    virtual webrtc::VideoCodecType CodecType() = 0;
     
    279330    gint m_width;
    280331    gint m_height;
     332    bool m_requireParse = false;
     333    bool m_needsKeyframe;
    281334
    282335private:
    283     static GstFlowReturn newSampleCallbackTramp(GstElement* sink, GStreamerVideoDecoder* enc)
    284     {
    285         return enc->newSampleCallback(sink);
    286     }
    287 
    288336    GRefPtr<GstElement> m_pipeline;
     337    GstElement* m_sink;
    289338    GstElement* m_src;
    290339
     
    292341    webrtc::DecodedImageCallback* m_imageReadyCb;
    293342
    294     Lock m_bufferMapLock;
    295343    StdMap<GstClockTime, InputTimestamps> m_dtsPtsMap;
    296344    GstClockTime m_firstBufferPts;
     
    300348class H264Decoder : public GStreamerVideoDecoder {
    301349public:
    302     H264Decoder() { }
     350    H264Decoder() { m_requireParse = true; }
    303351
    304352    int32_t InitDecode(const webrtc::VideoCodec* codecInfo, int32_t nCores) final
  • trunk/Source/WebCore/platform/mediastream/libwebrtc/GStreamerVideoEncoderFactory.cpp

    r248500 r248530  
    196196    }
    197197
    198 
    199198    int32_t Encode(const webrtc::VideoFrame& frame,
    200199        const webrtc::CodecSpecificInfo*,
  • trunk/Tools/ChangeLog

    r248526 r248530  
     12019-08-12  Thibault Saunier  <tsaunier@igalia.com>
     2
     3        [GStreamer][WebRTC] Handle broken data in the libwebrtc GStreamer decoders
     4        https://bugs.webkit.org/show_bug.cgi?id=200584
     5
     6        Reviewed by Philippe Normand.
     7
      8        Added an h264parse patch that posts a WARNING on the bus when a broken frame is detected.
      9        Ignore the style checker's wtf_optional warning for files using the WebRTC optional type.
     10
     11        * Scripts/webkitpy/style/checker.py:
     12        * gstreamer/jhbuild.modules:
     13        * gstreamer/patches/gst-plugins-bad-0001-h264parse-Post-a-WARNING-when-data-is-broken.patch: Added.
     14
    1152019-08-12  Youenn Fablet  <youenn@apple.com>
    216
  • trunk/Tools/Scripts/webkitpy/style/checker.py

    r247628 r248530  
    254254      "-readability/naming/underscores",
    255255      "-readability/enum_casing",
     256     ]),
     257
      258    ([  # Files using the WebRTC optional type
     259      os.path.join('Source', 'WebCore', 'platform', 'mediastream', 'libwebrtc', 'GStreamerVideoDecoderFactory.cpp'),
     260     ],
     261     ["-runtime/wtf_optional",
    256262     ]),
    257263
  • trunk/Tools/gstreamer/jhbuild.modules

    r247215 r248530  
    9393    </dependencies>
    9494    <branch hash="sha256:22139de35626ada6090bdfa3423b27b7fc15a0198331d25c95e6b12cb1072b05" module="gst-plugins-bad/gst-plugins-bad-${version}.tar.xz" repo="gstreamer" version="1.16.0">
    95       <patch file="gst-plugins-bad-do-not-retry-downloads-during-shutdown.patch" strip="1"/> <!-- In review: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/merge_requests/427 -->
     95      <patch file="gst-plugins-bad-do-not-retry-downloads-during-shutdown.patch" strip="1"/> <!-- Merged, discussing backporting: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/merge_requests/427 -->
      96      <patch file="gst-plugins-bad-0001-h264parse-Post-a-WARNING-when-data-is-broken.patch" strip="1"/> <!-- Merged, discussing backporting: https://gitlab.freedesktop.org/gstreamer/gst-plugins-bad/merge_requests/386 -->
    9697    </branch>
    9798  </meson>