Changeset 258977 in webkit


Timestamp:
Mar 25, 2020 6:44:17 AM
Author:
youenn@apple.com
Message:

Audio fails to capture stream in WebRTC if AudioSession gets interrupted
https://bugs.webkit.org/show_bug.cgi?id=208516
<rdar://problem/60020467>

Reviewed by Eric Carlson.

When the page becomes hidden, we continue calling each capture factory so that it can mute the corresponding sources if needed.
When the page becomes visible again, we reset all tracks according to the page's muted state. This allows restarting tracks that were
muted while the page was hidden (video tracks or suspended audio tracks).
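
For reference, the new visibility handling (see the Document.cpp and MediaStreamTrack.cpp diffs below) boils down to roughly the following; this is a condensed sketch of the relevant lines, not the full function:

    // In Document::visibilityStateChanged (iOS family):
    if (hidden()) {
        // Page went to the background: let the capture factories mute their sources as needed.
        RealtimeMediaSourceCenter::singleton().setCapturePageState(hidden(), page()->isMediaCaptureMuted());
        return;
    }
    // Page became visible again: re-apply the page muted state to the active capture tracks,
    // unless an audio session interruption is still in progress.
    if (!PlatformMediaSessionManager::sharedManager().isInterrupted())
        MediaStreamTrack::updateCaptureAccordingToMutedState(*this);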

Since tracks can become muted when visibility changes, we no longer return early when setting the muted state of a page to the same value.
Instead we always apply it, which ensures we comply with what the UIProcess wants.
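
Condensed from the Page.cpp hunk below, Page::setMuted now always stores and propagates the incoming state; the sketch elides the unchanged remainder of the function:

    void Page::setMuted(MediaProducer::MutedStateFlags muted)
    {
        // No early return when m_mutedState == muted: the state coming from the UIProcess
        // is always re-applied, since tracks may have been muted while the page was hidden.
        m_mutedState = muted;
        // ... rest of the function is unchanged ...
    }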

We start removing the concept of a RealtimeMediaSource being interrupted and instead rely on muting of sources.
This allows the UIProcess or the page to override any muted state, for instance when the page goes to the foreground again.
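
As the RealtimeMediaSource.cpp hunk below shows, setInterrupted() no longer tracks a separate interrupted state and simply forwards to setMuted(), and setMuted() no longer refuses to unmute during an interruption:

    void RealtimeMediaSource::setInterrupted(bool interrupted, bool pageMuted)
    {
        ALWAYS_LOG_IF(m_logger, LOGIDENTIFIER, interrupted, ", page muted : ", pageMuted);
        setMuted(interrupted);
    }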

We update the AudioSharedUnit to allow restarting capture even if suspended.
This ensures that we are able to restart capturing even if we never receive the audio session's end-of-interruption notification.
That notification can also take a long time to arrive, and we do not want to wait for it while the user is interacting with the page.
A future refactoring will further remove the RealtimeMediaSource interruption-related code.
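
Concretely (see the BaseAudioSharedUnit.cpp hunk below), startProducingData() now resumes a suspended unit up front instead of bailing out, and resume() becomes a no-op when the unit is not suspended; a condensed sketch of the relevant lines:

    // In BaseAudioSharedUnit::startProducingData():
    if (m_suspended)
        resume();        // restart capture without waiting for the end-of-interruption notification

    // In BaseAudioSharedUnit::resume():
    if (!m_suspended)
        return 0;        // nothing to do if we were not suspended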

Manually tested.

  • dom/Document.cpp:
    (WebCore::Document::visibilityStateChanged):
  • page/Page.cpp:
    (WebCore::Page::setMuted):
  • platform/audio/PlatformMediaSessionManager.h:
    (WebCore::PlatformMediaSessionManager::isInterrupted const):
  • platform/mediastream/RealtimeMediaSource.cpp:
    (WebCore::RealtimeMediaSource::setInterrupted):
    (WebCore::RealtimeMediaSource::setMuted):
  • platform/mediastream/mac/BaseAudioSharedUnit.cpp:
    (WebCore::BaseAudioSharedUnit::startProducingData):
    (WebCore::BaseAudioSharedUnit::resume):
    (WebCore::BaseAudioSharedUnit::suspend):

Location:
trunk/Source/WebCore
Files:
9 edited

Legend:

Lines prefixed with "+" were added, lines prefixed with "-" were removed, and unprefixed lines are unmodified context. A "…" row marks lines omitted between hunks.
  • trunk/Source/WebCore/ChangeLog

    r258975 → r258977

    +2020-03-25  Youenn Fablet  <youenn@apple.com>
    +
    +        Audio fails to capture stream in WebRTC if AudioSession gets interrupted
    +        https://bugs.webkit.org/show_bug.cgi?id=208516
    +        <rdar://problem/60020467>
    +
    +        Reviewed by Eric Carlson.
    +
    +        In case of page going to hidden, continue calling each capture factory to mute the corresponding sources if needed.
    +        In case of page being visible again, reset all tracks according page muted state. This allows restarting tracks that have been
    +        muted while page was hidden (video tracks or suspended audio tracks).
    +
    +        Since tracks can go to muted when visibility changes, we no longer return early when setting the muted state of a page to the same value.
    +        Instead we apply it which ensures we comply with what UIProcess wants.
    +
    +        We start removing the concept of a RealtimeMediaSource be interrupted. Instead we use muting of sources.
    +        This allows UIProcess or the page to override any muted state, for instance if page goes in foreground again.
    +
    +        We update the AudioSharedUnit to allow restarting capture even if suspended.
    +        This ensures that we are able to restart capturing even if we do not receive the audio session end of interruption.
    +        Also, this notification sometimes takes a long time to happen and we do not want to wait for it when user is interacting with the page.
    +        A future refactoring will further remove RealtimeMediaSource interrupted-related code.
    +
    +        Manually tested.
    +
    +        * dom/Document.cpp:
    +        (WebCore::Document::visibilityStateChanged):
    +        * page/Page.cpp:
    +        (WebCore::Page::setMuted):
    +        * platform/audio/PlatformMediaSessionManager.h:
    +        (WebCore::PlatformMediaSessionManager::isInterrupted const):
    +        * platform/mediastream/RealtimeMediaSource.cpp:
    +        (WebCore::RealtimeMediaSource::setInterrupted):
    +        (WebCore::RealtimeMediaSource::setMuted):
    +        * platform/mediastream/mac/BaseAudioSharedUnit.cpp:
    +        (WebCore::BaseAudioSharedUnit::startProducingData):
    +        (WebCore::BaseAudioSharedUnit::resume):
    +        (WebCore::BaseAudioSharedUnit::suspend):
    +
     2020-03-25  Charlie Turner  <cturner@igalia.com>
  • trunk/Source/WebCore/Modules/mediastream/MediaStreamTrack.cpp

    r257039 → r258977

     }
     
    +#if PLATFORM(IOS_FAMILY)
    +static MediaStreamTrack* findActiveCaptureTrackForDocument(Document& document, RealtimeMediaSource* activeSource, RealtimeMediaSource::Type type)
    +{
    +    MediaStreamTrack* selectedTrack = nullptr;
    +    for (auto* captureTrack : allCaptureTracks()) {
    +        if (captureTrack->document() != &document || captureTrack->ended())
    +            continue;
    +
    +        if (&captureTrack->source() == activeSource)
    +            return captureTrack;
    +
    +        // If the document has a live capture track, which is not the active one, we pick the first one.
    +        // FIXME: We should probably store per page active audio/video capture tracks.
    +        if (!selectedTrack && captureTrack->privateTrack().type() == type)
    +            selectedTrack = captureTrack;
    +    }
    +    return selectedTrack;
    +}
    +#endif
    +
     void MediaStreamTrack::updateCaptureAccordingToMutedState(Document& document)
     {
    +#if PLATFORM(IOS_FAMILY)
    +    auto* activeAudioSource = RealtimeMediaSourceCenter::singleton().audioCaptureFactory().activeSource();
    +    if (auto* audioCaptureTrack = findActiveCaptureTrackForDocument(document, activeAudioSource, RealtimeMediaSource::Type::Audio))
    +        audioCaptureTrack->setMuted(document.page()->mutedState());
    +
    +    auto* activeVideoSource = RealtimeMediaSourceCenter::singleton().videoCaptureFactory().activeSource();
    +    if (auto* videoCaptureTrack = findActiveCaptureTrackForDocument(document, activeVideoSource, RealtimeMediaSource::Type::Video))
    +        videoCaptureTrack->setMuted(document.page()->mutedState());
    +#else
         for (auto* captureTrack : allCaptureTracks()) {
             if (captureTrack->document() != &document || captureTrack->ended())
                 continue;
             captureTrack->setMuted(document.page()->mutedState());
         }
    +#endif
     }
  • trunk/Source/WebCore/Modules/mediastream/MediaStreamTrack.h

    r257039 → r258977

         void setIdForTesting(String&& id) { m_private->setIdForTesting(WTFMove(id)); }
     
    +    Document* document() const;
    +
     #if !RELEASE_LOG_DISABLED
         const Logger& logger() const final { return m_private->logger(); }
    …
         void configureTrackRendering();
     
    -    Document* document() const;
    -
         // ActiveDOMObject API.
         void stop() final { stopTrack(); }
  • trunk/Source/WebCore/dom/Document.cpp

    r258869 → r258977

             client->visibilityStateChanged();
     
    -#if ENABLE(MEDIA_STREAM)
    -    if (auto* page = this->page())
    -        RealtimeMediaSourceCenter::singleton().setCapturePageState(hidden(), page->isMediaCaptureMuted());
    +#if PLATFORM(IOS_FAMILY)
    +    if (hidden()) {
    +        RealtimeMediaSourceCenter::singleton().setCapturePageState(hidden(), page()->isMediaCaptureMuted());
    +        return;
    +    }
    +    if (!PlatformMediaSessionManager::sharedManager().isInterrupted())
    +        MediaStreamTrack::updateCaptureAccordingToMutedState(*this);
     #endif
     }
  • trunk/Source/WebCore/page/Page.cpp

    r258882 → r258977

     void Page::setMuted(MediaProducer::MutedStateFlags muted)
     {
    -    if (m_mutedState == muted)
    -        return;
    -
         m_mutedState = muted;
     
  • trunk/Source/WebCore/platform/audio/PlatformMediaSessionManager.h

    r258024 → r258977

         WEBCORE_EXPORT void processDidReceiveRemoteControlCommand(PlatformMediaSession::RemoteControlCommandType, const PlatformMediaSession::RemoteCommandArgument*);
     
    +    bool isInterrupted() const { return m_interrupted; }
    +
     protected:
         friend class PlatformMediaSession;
  • trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.cpp

    r252521 → r258977

     void RealtimeMediaSource::setInterrupted(bool interrupted, bool pageMuted)
     {
    -    if (interrupted == m_interrupted)
    -        return;
    -
         ALWAYS_LOG_IF(m_logger, LOGIDENTIFIER, interrupted, ", page muted : ", pageMuted);
    -
    -    m_interrupted = interrupted;
    -    if (!interrupted && pageMuted)
    -        return;
    -
         setMuted(interrupted);
     }
    …
     void RealtimeMediaSource::setMuted(bool muted)
     {
    -    if (!muted && interrupted()) {
    -        ALWAYS_LOG_IF(m_logger, LOGIDENTIFIER, "ignoring unmute because of interruption");
    -        return;
    -    }
    -
         ALWAYS_LOG_IF(m_logger, LOGIDENTIFIER, muted);
     
  • trunk/Source/WebCore/platform/mediastream/RealtimeMediaSourceFactory.h

    r258202 → r258977

         void unsetActiveSource(RealtimeMediaSource&);
     
    -protected:
         RealtimeMediaSource* activeSource() { return m_activeSource; }
     
  • trunk/Source/WebCore/platform/mediastream/mac/BaseAudioSharedUnit.cpp

    r257039 → r258977

         ASSERT(isMainThread());
     
    +    if (m_suspended)
    +        resume();
    +
         if (++m_producingCount != 1)
             return;
    …
         if (isProducingData())
             return;
    -
    -    if (m_suspended) {
    -        RELEASE_LOG_INFO(WebRTC, "BaseAudioSharedUnit::startProducingData - exiting early as suspended");
    -        return;
    -    }
     
         if (hasAudioUnit()) {
    …
     {
         ASSERT(isMainThread());
    -    ASSERT(m_suspended);
    +    if (!m_suspended)
    +        return 0;
    +
         ASSERT(!isProducingData());
     
    …
     
         forEachClient([](auto& client) {
    -        client.notifyMutedChange(false);
    +        client.setMuted(false);
         });
     
    …
     
         forEachClient([](auto& client) {
    -        client.notifyMutedChange(true);
    +        client.setMuted(true);
         });
     