Changeset 138786 in webkit


Timestamp: Jan 4, 2013 2:14:43 AM (11 years ago)
Author: Philippe Normand
Message:

[GStreamer] Port WebAudio backend to 1.0 APIs
https://bugs.webkit.org/show_bug.cgi?id=105293

Reviewed by Martin Robinson.

Port the AudioFileReader and AudioDestination to GStreamer 1.0
APIs. GStreamer 1.0.4 or newer is preferable for this to work
properly, as that release contains two bug fixes for the
deinterleave and interleave elements.
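
Most of the port follows one pattern: a GST_API_VERSION_1 compile-time
switch selects between the 0.10 and 1.0 variants of an API. The caps
helper added to GStreamerVersioning.cpp is a representative example
(condensed from the diff below, with explanatory comments added):

    #include <gst/gst.h>
    #ifdef GST_API_VERSION_1
    #include <gst/audio/audio.h>
    #endif

    // Build raw 32-bit float audio caps for whichever GStreamer API
    // WebKit was built against.
    GstCaps* getGstAudioCaps(int channels, float sampleRate)
    {
    #ifdef GST_API_VERSION_1
        // 1.0: a single "audio/x-raw" media type; the sample format and
        // layout are plain string fields.
        return gst_caps_new_simple("audio/x-raw", "rate", G_TYPE_INT, static_cast<int>(sampleRate),
            "channels", G_TYPE_INT, channels,
            "format", G_TYPE_STRING, gst_audio_format_to_string(GST_AUDIO_FORMAT_F32),
            "layout", G_TYPE_STRING, "interleaved", NULL);
    #else
        // 0.10: a dedicated "audio/x-raw-float" media type described by
        // width and endianness.
        return gst_caps_new_simple("audio/x-raw-float", "rate", G_TYPE_INT, static_cast<int>(sampleRate),
            "channels", G_TYPE_INT, channels,
            "endianness", G_TYPE_INT, G_BYTE_ORDER,
            "width", G_TYPE_INT, 32, NULL);
    #endif
    }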

  • platform/audio/FFTFrame.cpp:

(WebCore::FFTFrame::reportMemoryUsage): Don't report GstFFTF32
structures anymore because they're opaque in GStreamer 1.0.

  • platform/audio/gstreamer/AudioDestinationGStreamer.cpp:

(WebCore):
(WebCore::AudioDestinationGStreamer::AudioDestinationGStreamer):
The wavparse element in 1.0 no longer has sometimes-pads.
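
Because the 1.0 wavparse exposes its src pad statically, the
destination pipeline can be completed synchronously instead of waiting
for a "pad-added" signal. The relevant part of the constructor,
excerpted from the diff below with comments added:

    // 0.10: wavparse only creates its src pad once it has parsed the WAV
    // header, so the pipeline is finished from a "pad-added" callback.
    #ifndef GST_API_VERSION_1
        g_signal_connect(wavParser, "pad-added", G_CALLBACK(onGStreamerWavparsePadAddedCallback), this);
    #endif
        gst_bin_add_many(GST_BIN(m_pipeline), webkitAudioSrc, wavParser, NULL);
        gst_element_link_pads_full(webkitAudioSrc, "src", wavParser, "sink", GST_PAD_LINK_CHECK_NOTHING);

    // 1.0: the src pad is always available, so the rest of the pipeline
    // is linked immediately.
    #ifdef GST_API_VERSION_1
        GRefPtr<GstPad> srcPad = adoptGRef(gst_element_get_static_pad(wavParser, "src"));
        finishBuildingPipelineAfterWavParserPadReady(srcPad.get());
    #endif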

  • platform/audio/gstreamer/AudioFileReaderGStreamer.cpp:

(AudioFileReader): The decodebin2 element has been renamed to
decodebin in GStreamer 1.0.
(WebCore::getGStreamerAudioCaps): The audio caps description changed a
lot in GStreamer 1.0; the function now handles both APIs.
(WebCore::copyGstreamerBuffersToAudioChannel): Adapted to
GstBufferList and GstBuffer API changes.
(WebCore::onAppsinkPullRequiredCallback): Pull a sample or buffer,
depending on which API we use.
(WebCore::AudioFileReader::~AudioFileReader): Protect
GstBufferListIterators in 0.10-only code path.
(WebCore):
(WebCore::AudioFileReader::handleSample): Pull an audio sample
from appsink and insert it in the appropriate buffer list (see the
sketch after this file's entry).
(WebCore::AudioFileReader::handleNewDeinterleavePad): Handle
appsink API changes from GStreamer 0.10 to 1.0.
(WebCore::AudioFileReader::decodeAudioForBusCreation): Create the
correct decodebin element.
(WebCore::AudioFileReader::createBus): Protect GstBufferListIterators
in 0.10-only code path.
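
The appsink and GstBuffer changes above come down to two differences:
in 1.0 appsink hands out GstSample objects that wrap a buffer together
with its caps, and buffer memory must be mapped before it can be read.
A minimal sketch, using a hypothetical pullOneChunk() helper rather
than the actual WebCore methods:

    #include <gst/gst.h>
    #include <gst/app/gstappsink.h>
    #include <string.h>

    // Pull one chunk of audio from appsink and copy its samples into
    // destination (assumed large enough). Illustration only.
    static GstFlowReturn pullOneChunk(GstAppSink* sink, float* destination)
    {
    #ifdef GST_API_VERSION_1
        // 1.0: pull a GstSample and map the buffer before touching its data.
        GstSample* sample = gst_app_sink_pull_sample(sink);
        if (!sample)
            return GST_FLOW_ERROR;

        GstBuffer* buffer = gst_sample_get_buffer(sample);
        GstMapInfo info;
        gst_buffer_map(buffer, &info, GST_MAP_READ);
        memcpy(destination, info.data, info.size);
        gst_buffer_unmap(buffer, &info);
        gst_sample_unref(sample);
    #else
        // 0.10: pull the GstBuffer directly; its data pointer is reachable
        // through the GST_BUFFER_DATA() macro.
        GstBuffer* buffer = gst_app_sink_pull_buffer(sink);
        if (!buffer)
            return GST_FLOW_ERROR;

        memcpy(destination, GST_BUFFER_DATA(buffer), GST_BUFFER_SIZE(buffer));
        gst_buffer_unref(buffer);
    #endif
        return GST_FLOW_OK;
    }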

  • platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp:

(_WebKitWebAudioSourcePrivate): GstTask in GStreamer 1.0 uses a
GRecMutex instead of a (deprecated) GStaticRecMutex.
(getGStreamerMonoAudioCaps): Handle caps description changes
between GStreamer 0.10 and 1.0.
(webKitWebAudioGStreamerChannelPosition): POSITION_LFE in
GStreamer 1.0 is now POSITION_LFE1. Also map ChannelCenter to its
GStreamer equivalent.
(webkit_web_audio_src_class_init): Use generic setGstElementClassMetadata.
(webkit_web_audio_src_init): Handle GRecMutex initialisation.
(webKitWebAudioSrcConstructed): Set channel position on
capsfilter. This is done for GStreamer 1.0 code path only because
in 0.10 the caps have no way to store this information.
(webKitWebAudioSrcFinalize): Clear GRecMutex.
(webKitWebAudioSrcLoop): Handle GstBuffer API changes and report an
error if buffers can't be chained to the queue's sink pad.
(webKitWebAudioSrcChangeState): As advised in the GStreamer docs,
fix up the state changes for this live source element: return
NO_PREROLL for the READY->PAUSED transition and start/stop the
GstTask when entering/leaving PLAYING.
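
A live source produces no data while paused, so the element must tell
the pipeline not to wait for preroll and should only run its pulling
task while PLAYING. Simplified from webKitWebAudioSrcChangeState() in
the diff below:

    switch (transition) {
    case GST_STATE_CHANGE_READY_TO_PAUSED:
        // No data is produced in PAUSED, so don't wait for preroll.
        returnValue = GST_STATE_CHANGE_NO_PREROLL;
        break;
    case GST_STATE_CHANGE_PAUSED_TO_PLAYING:
        // Start pulling audio from WebCore only when actually playing.
        if (!gst_task_start(src->priv->task.get()))
            returnValue = GST_STATE_CHANGE_FAILURE;
        break;
    case GST_STATE_CHANGE_PLAYING_TO_PAUSED:
        // Stop the task and wait for its loop function to return.
        if (!gst_task_join(src->priv->task.get()))
            returnValue = GST_STATE_CHANGE_FAILURE;
        break;
    default:
        break;
    }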

Location: trunk/Source/WebCore
Files: 9 edited

Legend: unchanged lines are shown without a prefix; added lines are prefixed with "+", removed lines with "-".
  • trunk/Source/WebCore/ChangeLog

  • trunk/Source/WebCore/platform/audio/FFTFrame.cpp

    r135603 r138786  
#if USE(WEBAUDIO_GSTREAMER)
+#ifndef GST_API_VERSION_1
+    // The GstFFTF32 structure is exposed publicly in GStreamer 0.10 only.
    info.addMember(m_fft);
    info.addMember(m_inverseFft);
+#endif
    info.addMember(m_complexData);
    info.addMember(m_realData);
  • trunk/Source/WebCore/platform/audio/gstreamer/AudioDestinationGStreamer.cpp

    r138576 r138786  
/*
- *  Copyright (C) 2011 Igalia S.L
+ *  Copyright (C) 2011, 2012 Igalia S.L
 *
 *  This library is free software; you can redistribute it and/or
…
}

+#ifndef GST_API_VERSION_1
static void onGStreamerWavparsePadAddedCallback(GstElement*, GstPad* pad, AudioDestinationGStreamer* destination)
{
    destination->finishBuildingPipelineAfterWavParserPadReady(pad);
}
+#endif

AudioDestinationGStreamer::AudioDestinationGStreamer(AudioIOCallback& callback, float sampleRate)
…
        return;

+#ifndef GST_API_VERSION_1
    g_signal_connect(wavParser, "pad-added", G_CALLBACK(onGStreamerWavparsePadAddedCallback), this);
+#endif
    gst_bin_add_many(GST_BIN(m_pipeline), webkitAudioSrc, wavParser, NULL);
    gst_element_link_pads_full(webkitAudioSrc, "src", wavParser, "sink", GST_PAD_LINK_CHECK_NOTHING);
+
+#ifdef GST_API_VERSION_1
+    GRefPtr<GstPad> srcPad = adoptGRef(gst_element_get_static_pad(wavParser, "src"));
+    finishBuildingPipelineAfterWavParserPadReady(srcPad.get());
+#endif
}

…
    // Link wavparse's src pad to audioconvert sink pad.
    GRefPtr<GstPad> sinkPad = adoptGRef(gst_element_get_static_pad(audioConvert, "sink"));
-    gst_pad_link(pad, sinkPad.get());
+    gst_pad_link_full(pad, sinkPad.get(), GST_PAD_LINK_CHECK_NOTHING);

    // Link audioconvert to audiosink and roll states.
  • trunk/Source/WebCore/platform/audio/gstreamer/AudioDestinationGStreamer.h

    r138545 r138786  
/*
- *  Copyright (C) 2011 Philippe Normand <pnormand@igalia.com>
+ *  Copyright (C) 2011, 2012 Igalia S.L
 *
 *  This library is free software; you can redistribute it and/or
  • trunk/Source/WebCore/platform/audio/gstreamer/AudioFileReaderGStreamer.cpp

    r138580 r138786  
#include "AudioBus.h"
+#include "GStreamerVersioning.h"

#if PLATFORM(QT)
…
#include <gio/gio.h>
#include <gst/app/gstappsink.h>
-#include <gst/audio/multichannel.h>
#include <gst/gst.h>
#include <gst/pbutils/pbutils.h>
…
#include <wtf/gobject/GRefPtr.h>

+#ifdef GST_API_VERSION_1
+#include <gst/audio/audio.h>
+#else
+#include <gst/audio/multichannel.h>
+#endif
+
+#ifdef GST_API_VERSION_1
+static const char* gDecodebinName = "decodebin";
+#else
+static const char* gDecodebinName = "decodebin2";
+#endif
+
namespace WebCore {

…
    PassOwnPtr<AudioBus> createBus(float sampleRate, bool mixToMono);

+#ifdef GST_API_VERSION_1
+    GstFlowReturn handleSample(GstAppSink*);
+#else
    GstFlowReturn handleBuffer(GstAppSink*);
+#endif
    gboolean handleMessage(GstMessage*);
    void handleNewDeinterleavePad(GstPad*);
…
    float m_sampleRate;
    GstBufferList* m_frontLeftBuffers;
+    GstBufferList* m_frontRightBuffers;
+
+#ifndef GST_API_VERSION_1
    GstBufferListIterator* m_frontLeftBuffersIterator;
-    GstBufferList* m_frontRightBuffers;
    GstBufferListIterator* m_frontRightBuffersIterator;
+#endif
+
    GstElement* m_pipeline;
    unsigned m_channelSize;
…
};

-static GstCaps* getGStreamerAudioCaps(int channels, float sampleRate)
-{
-    return gst_caps_new_simple("audio/x-raw-float", "rate", G_TYPE_INT, static_cast<int>(sampleRate), "channels", G_TYPE_INT, channels, "endianness", G_TYPE_INT, G_BYTE_ORDER, "width", G_TYPE_INT, 32, NULL);
-}
-
static void copyGstreamerBuffersToAudioChannel(GstBufferList* buffers, AudioChannel* audioChannel)
{
+#ifdef GST_API_VERSION_1
+    gsize offset = 0;
+    for (unsigned i = 0; i < gst_buffer_list_length(buffers); i++) {
+        GstBuffer* buffer = gst_buffer_list_get(buffers, i);
+        if (!buffer)
+            continue;
+        GstMapInfo info;
+        gst_buffer_map(buffer, &info, GST_MAP_READ);
+        memcpy(audioChannel->mutableData() + offset, reinterpret_cast<float*>(info.data), info.size);
+        offset += info.size / sizeof(float);
+        gst_buffer_unmap(buffer, &info);
+    }
+#else
    GstBufferListIterator* iter = gst_buffer_list_iterate(buffers);
    gst_buffer_list_iterator_next_group(iter);
…

    gst_buffer_list_iterator_free(iter);
-}
-
-static GstFlowReturn onAppsinkNewBufferCallback(GstAppSink* sink, gpointer userData)
-{
+#endif
+}
+
+static GstFlowReturn onAppsinkPullRequiredCallback(GstAppSink* sink, gpointer userData)
+{
+#ifdef GST_API_VERSION_1
+    return static_cast<AudioFileReader*>(userData)->handleSample(sink);
+#else
    return static_cast<AudioFileReader*>(userData)->handleBuffer(sink);
+#endif
}

…
    }

+#ifndef GST_API_VERSION_1
    gst_buffer_list_iterator_free(m_frontLeftBuffersIterator);
    gst_buffer_list_iterator_free(m_frontRightBuffersIterator);
+#endif
    gst_buffer_list_unref(m_frontLeftBuffers);
    gst_buffer_list_unref(m_frontRightBuffers);
}

+#ifdef GST_API_VERSION_1
+GstFlowReturn AudioFileReader::handleSample(GstAppSink* sink)
+{
+    GstSample* sample = gst_app_sink_pull_sample(sink);
+    if (!sample)
+        return GST_FLOW_ERROR;
+
+    GstBuffer* buffer = gst_sample_get_buffer(sample);
+    if (!buffer) {
+        gst_sample_unref(sample);
+        return GST_FLOW_ERROR;
+    }
+
+    GstCaps* caps = gst_sample_get_caps(sample);
+    if (!caps) {
+        gst_sample_unref(sample);
+        return GST_FLOW_ERROR;
+    }
+
+    GstAudioInfo info;
+    gst_audio_info_from_caps(&info, caps);
+    int frames = GST_CLOCK_TIME_TO_FRAMES(GST_BUFFER_DURATION(buffer), GST_AUDIO_INFO_RATE(&info));
+
+    // Check the first audio channel. The buffer is supposed to store
+    // data of a single channel anyway.
+    switch (GST_AUDIO_INFO_POSITION(&info, 0)) {
+    case GST_AUDIO_CHANNEL_POSITION_FRONT_LEFT:
+        gst_buffer_list_add(m_frontLeftBuffers, gst_buffer_ref(buffer));
+        m_channelSize += frames;
+        break;
+    case GST_AUDIO_CHANNEL_POSITION_FRONT_RIGHT:
+        gst_buffer_list_add(m_frontRightBuffers, gst_buffer_ref(buffer));
+        break;
+    default:
+        break;
+    }
+
+    gst_sample_unref(sample);
+    return GST_FLOW_OK;
+
+}
+#endif
+
+#ifndef GST_API_VERSION_1
GstFlowReturn AudioFileReader::handleBuffer(GstAppSink* sink)
{
…
    return GST_FLOW_OK;
}
+#endif

gboolean AudioFileReader::handleMessage(GstMessage* message)
…
    callbacks.eos = 0;
    callbacks.new_preroll = 0;
+#ifdef GST_API_VERSION_1
+    callbacks.new_sample = onAppsinkPullRequiredCallback;
+#else
    callbacks.new_buffer_list = 0;
-    callbacks.new_buffer = onAppsinkNewBufferCallback;
+    callbacks.new_buffer = onAppsinkPullRequiredCallback;
+#endif
    gst_app_sink_set_callbacks(GST_APP_SINK(sink), &callbacks, this, 0);

    g_object_set(sink, "sync", FALSE, NULL);

-    GstCaps* caps = getGStreamerAudioCaps(1, m_sampleRate);
-    gst_app_sink_set_caps(GST_APP_SINK(sink), caps);
-    gst_caps_unref(caps);
-
    gst_bin_add_many(GST_BIN(m_pipeline), queue, sink, NULL);

    GstPad* sinkPad = gst_element_get_static_pad(queue, "sink");
-    gst_pad_link(pad, sinkPad);
+    gst_pad_link_full(pad, sinkPad, GST_PAD_LINK_CHECK_NOTHING);
    gst_object_unref(GST_OBJECT(sinkPad));

…
    g_signal_connect(m_deInterleave.get(), "no-more-pads", G_CALLBACK(onGStreamerDeinterleaveReadyCallback), this);

-    GstCaps* caps = getGStreamerAudioCaps(2, m_sampleRate);
+    GstCaps* caps = getGstAudioCaps(2, m_sampleRate);
    g_object_set(capsFilter, "caps", caps, NULL);
    gst_caps_unref(caps);
…

    GstPad* sinkPad = gst_element_get_static_pad(audioConvert, "sink");
-    gst_pad_link(pad, sinkPad);
+    gst_pad_link_full(pad, sinkPad, GST_PAD_LINK_CHECK_NOTHING);
    gst_object_unref(GST_OBJECT(sinkPad));

…
    }

-    m_decodebin = gst_element_factory_make("decodebin2", "decodebin");
+    m_decodebin = gst_element_factory_make(gDecodebinName, "decodebin");
    g_signal_connect(m_decodebin.get(), "pad-added", G_CALLBACK(onGStreamerDecodebinPadAddedCallback), this);

…

    m_frontLeftBuffers = gst_buffer_list_new();
+    m_frontRightBuffers = gst_buffer_list_new();
+
+#ifndef GST_API_VERSION_1
    m_frontLeftBuffersIterator = gst_buffer_list_iterate(m_frontLeftBuffers);
    gst_buffer_list_iterator_add_group(m_frontLeftBuffersIterator);

-    m_frontRightBuffers = gst_buffer_list_new();
    m_frontRightBuffersIterator = gst_buffer_list_iterate(m_frontRightBuffers);
    gst_buffer_list_iterator_add_group(m_frontRightBuffersIterator);
+#endif

    GRefPtr<GMainContext> context = g_main_context_new();
  • trunk/Source/WebCore/platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp

    r137925 r138786  
/*
- *  Copyright (C) 2011 Igalia S.L
+ *  Copyright (C) 2011, 2012 Igalia S.L
 *
 *  This library is free software; you can redistribute it and/or
…
#include "GRefPtrGStreamer.h"
#include "GStreamerVersioning.h"
+#ifdef GST_API_VERSION_1
+#include <gst/audio/audio.h>
+#else
#include <gst/audio/multichannel.h>
+#endif
#include <gst/pbutils/pbutils.h>

…
};

+#define WEBKIT_WEB_AUDIO_SRC_GET_PRIVATE(obj) (G_TYPE_INSTANCE_GET_PRIVATE((obj), WEBKIT_TYPE_WEBAUDIO_SRC, WebKitWebAudioSourcePrivate))
struct _WebKitWebAudioSourcePrivate {
    gfloat sampleRate;
…

    GRefPtr<GstTask> task;
+#ifdef GST_API_VERSION_1
+    GRecMutex mutex;
+#else
    GStaticRecMutex mutex;
+#endif

    GSList* pads; // List of queue sink pads. One queue for each planar audio channel.
…
static GstCaps* getGStreamerMonoAudioCaps(float sampleRate)
{
+#ifdef GST_API_VERSION_1
+    return gst_caps_new_simple("audio/x-raw", "rate", G_TYPE_INT, static_cast<int>(sampleRate),
+        "channels", G_TYPE_INT, 1,
+        "format", G_TYPE_STRING, gst_audio_format_to_string(GST_AUDIO_FORMAT_F32),
+        "layout", G_TYPE_STRING, "non-interleaved", NULL);
+#else
    return gst_caps_new_simple("audio/x-raw-float", "rate", G_TYPE_INT, static_cast<int>(sampleRate),
-                                "channels", G_TYPE_INT, 1,
-                                "endianness", G_TYPE_INT, G_BYTE_ORDER,
-                                "width", G_TYPE_INT, 32, NULL);
+        "channels", G_TYPE_INT, 1,
+        "endianness", G_TYPE_INT, G_BYTE_ORDER,
+        "width", G_TYPE_INT, 32, NULL);
+#endif
}

…
        break;
    case AudioBus::ChannelCenter:
-        // Center and mono are the same.
-        position = GST_AUDIO_CHANNEL_POSITION_FRONT_MONO;
+        position = GST_AUDIO_CHANNEL_POSITION_FRONT_CENTER;
        break;
    case AudioBus::ChannelLFE:
+#ifdef GST_API_VERSION_1
+        position = GST_AUDIO_CHANNEL_POSITION_LFE1;
+#else
        position = GST_AUDIO_CHANNEL_POSITION_LFE;
+#endif
        break;
    case AudioBus::ChannelSurroundLeft:
…

    gst_element_class_add_pad_template(elementClass, gst_static_pad_template_get(&srcTemplate));
-    gst_element_class_set_details_simple(elementClass,
-                                         "WebKit WebAudio source element",
-                                         "Source",
-                                         "Handles WebAudio data from WebCore",
-                                         "Philippe Normand <pnormand@igalia.com>");
+    setGstElementClassMetadata(elementClass, "WebKit WebAudio source element", "Source", "Handles WebAudio data from WebCore", "Philippe Normand <pnormand@igalia.com>");

    objectClass->constructed = webKitWebAudioSrcConstructed;
…
    priv->bus = 0;

+#ifdef GST_API_VERSION_1
+    g_rec_mutex_init(&priv->mutex);
+    priv->task = gst_task_new(reinterpret_cast<GstTaskFunction>(webKitWebAudioSrcLoop), src, reinterpret_cast<GDestroyNotify>(g_object_unref));
+#else
    g_static_rec_mutex_init(&priv->mutex);
-
    priv->task = gst_task_create(reinterpret_cast<GstTaskFunction>(webKitWebAudioSrcLoop), src);
+#endif
+
    gst_task_set_lock(priv->task.get(), &priv->mutex);
}
…

        GRefPtr<GstCaps> monoCaps = adoptGRef(getGStreamerMonoAudioCaps(priv->sampleRate));
+
+#ifdef GST_API_VERSION_1
+        GstAudioInfo info;
+        gst_audio_info_from_caps(&info, monoCaps.get());
+        GST_AUDIO_INFO_POSITION(&info, 0) = webKitWebAudioGStreamerChannelPosition(channelIndex);
+        GRefPtr<GstCaps> caps = adoptGRef(gst_audio_info_to_caps(&info));
+        g_object_set(capsfilter, "caps", caps.get(), NULL);
+#else
        g_object_set(capsfilter, "caps", monoCaps.get(), NULL);
+#endif

        // Configure the queue for minimal latency.
…
    WebKitWebAudioSourcePrivate* priv = src->priv;

+#ifdef GST_API_VERSION_1
+    g_rec_mutex_clear(&priv->mutex);
+#else
    g_static_rec_mutex_free(&priv->mutex);
+#endif

    g_slist_free_full(priv->pads, reinterpret_cast<GDestroyNotify>(gst_object_unref));
…
        ASSERT(channelBuffer);
        channelBufferList = g_slist_prepend(channelBufferList, channelBuffer);
+#ifdef GST_API_VERSION_1
+        GstMapInfo info;
+        gst_buffer_map(channelBuffer, &info, GST_MAP_READ);
+        priv->bus->setChannelMemory(i, reinterpret_cast<float*>(info.data), priv->framesToPull);
+        gst_buffer_unmap(channelBuffer, &info);
+#else
        priv->bus->setChannelMemory(i, reinterpret_cast<float*>(GST_BUFFER_DATA(channelBuffer)), priv->framesToPull);
+#endif
    }
    channelBufferList = g_slist_reverse(channelBufferList);
…
        GstBuffer* channelBuffer = static_cast<GstBuffer*>(g_slist_nth_data(channelBufferList, i));

+#ifndef GST_API_VERSION_1
        GRefPtr<GstCaps> monoCaps = adoptGRef(getGStreamerMonoAudioCaps(priv->sampleRate));
        GstStructure* structure = gst_caps_get_structure(monoCaps.get(), 0);
…
        gst_audio_set_channel_positions(structure, &channelPosition);
        gst_buffer_set_caps(channelBuffer, monoCaps.get());
-
-        gst_pad_chain(pad, channelBuffer);
+#endif
+
+        GstFlowReturn ret = gst_pad_chain(pad, channelBuffer);
+        if (ret != GST_FLOW_OK)
+            GST_ELEMENT_ERROR(src, CORE, PAD, ("Internal WebAudioSrc error"), ("Failed to push buffer on %s", GST_DEBUG_PAD_NAME(pad)));
    }

…
    case GST_STATE_CHANGE_READY_TO_PAUSED:
        GST_DEBUG_OBJECT(src, "READY->PAUSED");
+        returnValue = GST_STATE_CHANGE_NO_PREROLL;
+        break;
+    case GST_STATE_CHANGE_PAUSED_TO_PLAYING:
+        GST_DEBUG_OBJECT(src, "PAUSED->PLAYING");
        if (!gst_task_start(src->priv->task.get()))
            returnValue = GST_STATE_CHANGE_FAILURE;
        break;
-    case GST_STATE_CHANGE_PAUSED_TO_READY:
-        GST_DEBUG_OBJECT(src, "PAUSED->READY");
+    case GST_STATE_CHANGE_PLAYING_TO_PAUSED:
+        GST_DEBUG_OBJECT(src, "PLAYING->PAUSED");
        if (!gst_task_join(src->priv->task.get()))
            returnValue = GST_STATE_CHANGE_FAILURE;
  • trunk/Source/WebCore/platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.h

    r101138 r138786  
/*
- *  Copyright (C) 2011 Igalia S.L
+ *  Copyright (C) 2011, 2012 Igalia S.L
 *
 *  This library is free software; you can redistribute it and/or
  • trunk/Source/WebCore/platform/graphics/gstreamer/GStreamerVersioning.cpp

    r137301 r138786  
#include "IntSize.h"
#include <wtf/UnusedParam.h>
+
+#ifdef GST_API_VERSION_1
+#include <gst/audio/audio.h>
+#else
+#include <gst/audio/multichannel.h>
+#endif

void webkitGstObjectRefSink(GstObject* gstObject)
…
#endif
}
+
+#if ENABLE(WEB_AUDIO)
+GstCaps* getGstAudioCaps(int channels, float sampleRate)
+{
+#ifdef GST_API_VERSION_1
+    return gst_caps_new_simple("audio/x-raw", "rate", G_TYPE_INT, static_cast<int>(sampleRate),
+        "channels", G_TYPE_INT, channels,
+        "format", G_TYPE_STRING, gst_audio_format_to_string(GST_AUDIO_FORMAT_F32),
+        "layout", G_TYPE_STRING, "interleaved", NULL);
+#else
+    return gst_caps_new_simple("audio/x-raw-float", "rate", G_TYPE_INT, static_cast<int>(sampleRate),
+        "channels", G_TYPE_INT, channels,
+        "endianness", G_TYPE_INT, G_BYTE_ORDER,
+        "width", G_TYPE_INT, 32, NULL);
+#endif
+}
+#endif
+
#endif // USE(GSTREAMER)
  • trunk/Source/WebCore/platform/graphics/gstreamer/GStreamerVersioning.h

    r137301 r138786  
bool gstObjectIsFloating(GstObject*);
void notifyGstTagsOnPad(GstElement*, GstPad*, GstTagList*);
+#if ENABLE(WEB_AUDIO)
+GstCaps* getGstAudioCaps(int channels, float sampleRate);
+#endif
#endif // USE(GSTREAMER)
#endif // GStreamerVersioning_h