Changeset 101138 in webkit


Timestamp: Nov 24, 2011 6:57:34 AM (12 years ago)
Author: Philippe Normand
Message:

[GStreamer] WebAudio AudioDestination
https://bugs.webkit.org/show_bug.cgi?id=69835

Reviewed by Martin Robinson.

New GStreamer source element that pulls data from the AudioBus and
outputs interleaved audio GstBuffers suitable for playback.

  • GNUmakefile.list.am: Added the new GStreamer WebAudio element source files to the build.

  • platform/audio/gstreamer/AudioDestinationGStreamer.cpp:
    (WebCore::onGStreamerWavparsePadAddedCallback): Function called
    when the playback pipeline has successfully parsed the audio
    source into a WAV stream.
    (WebCore::AudioDestinationGStreamer::AudioDestinationGStreamer):
    Configure the initial playback pipeline up to the WAV parser. The
    audio sink is added only after the WAV parser has been configured.
    (WebCore::AudioDestinationGStreamer::~AudioDestinationGStreamer):
    Reset the playback pipeline and delete it.
    (WebCore::AudioDestinationGStreamer::finishBuildingPipelineAfterWavParserPadReady):
    Method to add the audio sink to the pipeline and link it to the
    WAV parser.
    (WebCore::AudioDestinationGStreamer::start): Set the pipeline to
    PLAYING; on the first run this triggers the WAV parser and hence
    the plugging of the audio sink.
    (WebCore::AudioDestinationGStreamer::stop): Pause the pipeline.

  • platform/audio/gstreamer/AudioDestinationGStreamer.h:
  • platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp: Added.
    (getGStreamerMonoAudioCaps): Utility function to generate
    GStreamer caps representing a single audio channel for a given
    sample rate.
    (webKitWebAudioGStreamerChannelPosition): Utility function to
    convert AudioBus channel representations to GStreamer positional
    audio channel values.
    (webkit_web_audio_src_class_init): GObject configuration of the
    GStreamer source element.
    (webkit_web_audio_src_init): Initialization of the element's
    private data.
    (webKitWebAudioSourceConstructed): Configure the GstBin elements
    depending on the AudioBus layout.
    (webKitWebAudioSourceFinalize): Clean up the GstBin and free the
    element's private data.
    (webKitWebAudioSourceSetProperty): GObject property setter.
    (webKitWebAudioSourceGetProperty): GObject property getter.
    (webKitWebAudioSourceLoop): GstTask used to pull data from the
    AudioBus and push it as GstBuffers to the src pad of the element.
    (webKitWebAudioSourceChangeState): Start or stop the above GstTask
    depending on the requested state transition. (An illustrative
    sketch of the caps helper and render loop appears after this file
    list.)

  • platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.h: Added.
  • platform/graphics/gstreamer/GRefPtrGStreamer.cpp: GstTask support in GRefPtr.
    (WTF::adoptGRef):
    (WTF::GstTask):

  • platform/graphics/gstreamer/GRefPtrGStreamer.h:
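
The two added files (WebKitWebAudioSourceGStreamer.cpp/.h) are not expanded in
this changeset view. The snippet below is a minimal, illustrative sketch of the
caps helper and of one cycle of the GstTask render loop described above,
assuming the GStreamer 0.10 API in use at the time; renderAndPushOneCycle and
its parameters are hypothetical stand-ins for state the real element keeps in
its GObject private structure, and the real element feeds one such mono branch
per AudioBus channel into an interleave element.

    // Sketch only, not the actual patch contents. Assumes GStreamer 0.10 and
    // 32-bit float samples; renderAndPushOneCycle and its arguments are hypothetical.
    #include "AudioBus.h"
    #include "AudioSourceProvider.h"
    #include <gst/gst.h>
    #include <string.h>

    // Caps describing a single channel of native-endian 32-bit float audio
    // at the requested sample rate (one mono branch per AudioBus channel).
    static GstCaps* getGStreamerMonoAudioCaps(float sampleRate)
    {
        return gst_caps_new_simple("audio/x-raw-float",
                                   "rate", G_TYPE_INT, static_cast<int>(sampleRate),
                                   "channels", G_TYPE_INT, 1,
                                   "endianness", G_TYPE_INT, G_BYTE_ORDER,
                                   "width", G_TYPE_INT, 32, NULL);
    }

    // One cycle of the GstTask loop: render the next block of frames into the
    // AudioBus via the provider, then push each channel's samples downstream
    // as a GstBuffer on the matching mono source pad.
    static void renderAndPushOneCycle(WebCore::AudioSourceProvider* provider, WebCore::AudioBus* bus,
                                      unsigned framesToPull, GstPad** channelSrcPads)
    {
        provider->provideInput(bus, framesToPull);

        guint bufferSize = framesToPull * sizeof(float);
        for (unsigned i = 0; i < bus->numberOfChannels(); ++i) {
            GstBuffer* buffer = gst_buffer_new_and_alloc(bufferSize);
            memcpy(GST_BUFFER_DATA(buffer), bus->channel(i)->data(), bufferSize);
            gst_buffer_set_caps(buffer, GST_PAD_CAPS(channelSrcPads[i]));
            gst_pad_push(channelSrcPads[i], buffer); // GstFlowReturn checking omitted in this sketch.
        }
    }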
Location: trunk/Source/WebCore
Files: 2 added, 6 edited

  • trunk/Source/WebCore/ChangeLog

    r101133 → r101138: adds the 2011-10-27 ChangeLog entry by Philippe Normand <pnormand@igalia.com>, which duplicates the commit message quoted above.
  • trunk/Source/WebCore/GNUmakefile.list.am

    r101081 → r101138
         Source/WebCore/platform/audio/gstreamer/AudioDestinationGStreamer.h \
         Source/WebCore/platform/audio/gstreamer/AudioFileReaderGStreamer.cpp \
    +    Source/WebCore/platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.cpp \
    +    Source/WebCore/platform/audio/gstreamer/WebKitWebAudioSourceGStreamer.h \
         Source/WebCore/platform/audio/gtk/AudioBusGtk.cpp
     webcore_built_sources += \
  • trunk/Source/WebCore/platform/audio/gstreamer/AudioDestinationGStreamer.cpp

    r98554 → r101138
     #include "AudioSourceProvider.h"
     #include "GOwnPtr.h"
    +#include "GRefPtrGStreamer.h"
    +#include "WebKitWebAudioSourceGStreamer.h"
    +#include <gst/gst.h>
    +#include <gst/pbutils/pbutils.h>

     namespace WebCore {
    +
    +// Size of the AudioBus for playback. The webkitwebaudiosrc element
    +// needs to handle this number of frames per cycle as well.
    +const unsigned framesToPull = 128;

     PassOwnPtr<AudioDestination> AudioDestination::create(AudioSourceProvider& provider, float sampleRate)
    …
     }

    +static void onGStreamerWavparsePadAddedCallback(GstElement* element, GstPad* pad, AudioDestinationGStreamer* destination)
    +{
    +    destination->finishBuildingPipelineAfterWavParserPadReady(pad);
    +}
    +
     AudioDestinationGStreamer::AudioDestinationGStreamer(AudioSourceProvider& provider, float sampleRate)
         : m_provider(provider)
    -    , m_renderBus(2, 128, true)
    +    , m_renderBus(2, framesToPull, true)
         , m_sampleRate(sampleRate)
         , m_isPlaying(false)
     {
    +    static bool gstInitialized = false;
    +    if (!gstInitialized)
    +        gstInitialized = gst_init_check(0, 0, 0);
    +    ASSERT_WITH_MESSAGE(gstInitialized, "GStreamer initialization failed");
    +
    +    m_pipeline = gst_pipeline_new("play");
    +
    +    GstElement* webkitAudioSrc = reinterpret_cast<GstElement*>(g_object_new(WEBKIT_TYPE_WEB_AUDIO_SRC,
    +                                                                            "rate", sampleRate,
    +                                                                            "bus", &m_renderBus,
    +                                                                            "provider", &m_provider,
    +                                                                            "frames", framesToPull, NULL));
    +
    +    GstElement* wavParser = gst_element_factory_make("wavparse", 0);
    +
    +    m_wavParserAvailable = wavParser;
    +    ASSERT_WITH_MESSAGE(m_wavParserAvailable, "Failed to create GStreamer wavparse element");
    +    if (!m_wavParserAvailable)
    +        return;
    +
    +    g_signal_connect(wavParser, "pad-added", G_CALLBACK(onGStreamerWavparsePadAddedCallback), this);
    +    gst_bin_add_many(GST_BIN(m_pipeline), webkitAudioSrc, wavParser, NULL);
    +    gst_element_link_pads_full(webkitAudioSrc, "src", wavParser, "sink", GST_PAD_LINK_CHECK_NOTHING);
     }

     AudioDestinationGStreamer::~AudioDestinationGStreamer()
     {
    +    gst_element_set_state(m_pipeline, GST_STATE_NULL);
    +    gst_object_unref(m_pipeline);
    +}
    +
    +void AudioDestinationGStreamer::finishBuildingPipelineAfterWavParserPadReady(GstPad* pad)
    +{
    +    ASSERT(m_wavParserAvailable);
    +
    +    GRefPtr<GstElement> audioSink = gst_element_factory_make("autoaudiosink", 0);
    +    m_audioSinkAvailable = audioSink;
    +
    +    if (!audioSink) {
    +        LOG_ERROR("Failed to create GStreamer autoaudiosink element");
    +        return;
    +    }
    +
    +    // Autoaudiosink does the real sink detection in the GST_STATE_NULL->READY transition
    +    // so it's best to roll it to READY as soon as possible to ensure the underlying platform
    +    // audiosink was loaded correctly.
    +    GstStateChangeReturn stateChangeReturn = gst_element_set_state(audioSink.get(), GST_STATE_READY);
    +    if (stateChangeReturn == GST_STATE_CHANGE_FAILURE) {
    +        LOG_ERROR("Failed to change autoaudiosink element state");
    +        gst_element_set_state(audioSink.get(), GST_STATE_NULL);
    +        m_audioSinkAvailable = false;
    +        return;
    +    }
    +
    +    GstElement* audioConvert = gst_element_factory_make("audioconvert", 0);
    +    gst_bin_add_many(GST_BIN(m_pipeline), audioConvert, audioSink.get(), NULL);
    +
    +    // Link wavparse's src pad to audioconvert sink pad.
    +    GRefPtr<GstPad> sinkPad = adoptGRef(gst_element_get_static_pad(audioConvert, "sink"));
    +    gst_pad_link(pad, sinkPad.get());
    +
    +    // Link audioconvert to audiosink and roll states.
    +    gst_element_link_pads_full(audioConvert, "src", audioSink.get(), "sink", GST_PAD_LINK_CHECK_NOTHING);
    +    gst_element_sync_state_with_parent(audioConvert);
    +    gst_element_sync_state_with_parent(audioSink.leakRef());
     }

     void AudioDestinationGStreamer::start()
     {
    +    ASSERT(m_wavParserAvailable);
    +    if (!m_wavParserAvailable)
    +        return;
    +
    +    gst_element_set_state(m_pipeline, GST_STATE_PLAYING);
         m_isPlaying = true;
     }
    …
     void AudioDestinationGStreamer::stop()
     {
    +    ASSERT(m_wavParserAvailable && m_audioSinkAvailable);
    +    if (!m_wavParserAvailable || !m_audioSinkAvailable)
    +        return;
    +
    +    gst_element_set_state(m_pipeline, GST_STATE_PAUSED);
         m_isPlaying = false;
     }
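
For context, a minimal sketch of how calling code might drive this destination
through the AudioDestination::create factory shown above; the surrounding
function and the 44.1 kHz rate are placeholders, not part of the patch.

    // Illustrative only. "provider" stands for whichever AudioSourceProvider
    // the caller owns; 44100 Hz is an arbitrary example rate.
    #include "AudioDestination.h"
    #include "AudioSourceProvider.h"
    #include <wtf/OwnPtr.h>

    static void runDestinationExample(WebCore::AudioSourceProvider& provider)
    {
        OwnPtr<WebCore::AudioDestination> destination = WebCore::AudioDestination::create(provider, 44100);

        destination->start(); // Pipeline goes to PLAYING; wavparse then emits
                              // pad-added, which plugs audioconvert + autoaudiosink.

        // ... audio renders while webkitwebaudiosrc pulls from the AudioBus ...

        destination->stop();  // Pipeline is paused and m_isPlaying is cleared.
    }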
  • trunk/Source/WebCore/platform/audio/gstreamer/AudioDestinationGStreamer.h

    r98554 → r101138
     #include "AudioDestination.h"

    +typedef struct _GstElement GstElement;
    +typedef struct _GstPad GstPad;
    +
     namespace WebCore {

    …
         AudioSourceProvider& sourceProvider() const { return m_provider; }

    +    void finishBuildingPipelineAfterWavParserPadReady(GstPad*);
    +
     private:
         AudioSourceProvider& m_provider;
    …
         float m_sampleRate;
         bool m_isPlaying;
    +    bool m_wavParserAvailable;
    +    bool m_audioSinkAvailable;
    +    GstElement* m_pipeline;
     };

  • trunk/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.cpp

    r101130 → r101138
     }

    +
    +template <> GRefPtr<GstTask> adoptGRef(GstTask* ptr)
    +{
    +    ASSERT(!GST_OBJECT_IS_FLOATING(GST_OBJECT(ptr)));
    +    return GRefPtr<GstTask>(ptr, GRefPtrAdopt);
    +}
    +
    +template <> GstTask* refGPtr<GstTask>(GstTask* ptr)
    +{
    +    if (ptr) {
    +        gst_object_ref(GST_OBJECT(ptr));
    +        gst_object_sink(GST_OBJECT(ptr));
    +    }
    +
    +    return ptr;
    +}
    +
    +template <> void derefGPtr<GstTask>(GstTask* ptr)
    +{
    +    if (ptr)
    +        gst_object_unref(ptr);
    +}
    +
     }
     #endif // USE(GSTREAMER)
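
A short sketch of how these GstTask specializations might be exercised by the
new source element, assuming the GStreamer 0.10 task API; sourceLoop, taskLock
and startRenderTask are illustrative placeholders rather than patch code.

    // Hypothetical usage of the GRefPtr<GstTask> support added above.
    #include "GRefPtrGStreamer.h"
    #include <gst/gst.h>

    static void sourceLoop(void*)
    {
        // Pull a block of frames from the AudioBus and push GstBuffers downstream.
    }

    static void startRenderTask()
    {
        static GStaticRecMutex taskLock = G_STATIC_REC_MUTEX_INIT;

        // Assigning the freshly created task refs and sinks it through
        // refGPtr<GstTask>(); derefGPtr<GstTask>() unrefs it when the
        // GRefPtr goes out of scope.
        GRefPtr<GstTask> task = gst_task_create(sourceLoop, 0);
        gst_task_set_lock(task.get(), &taskLock);
        gst_task_start(task.get());

        // ... later, when tearing the element down:
        gst_task_stop(task.get());
        gst_task_join(task.get());
    }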
  • trunk/Source/WebCore/platform/graphics/gstreamer/GRefPtrGStreamer.h

    r101082 → r101138
     typedef struct _GstPad GstPad;
     typedef struct _GstCaps GstCaps;
    +typedef struct _GstTask GstTask;

     namespace WTF {
    …
     template<> void derefGPtr<GstCaps>(GstCaps* ptr);

    +template<> GRefPtr<GstTask> adoptGRef(GstTask* ptr);
    +template<> GstTask* refGPtr<GstTask>(GstTask* ptr);
    +template<> void derefGPtr<GstTask>(GstTask* ptr);
    +
     }
