Changeset 271154 in webkit


Timestamp:
Jan 5, 2021 10:16:02 AM
Author:
commit-queue@webkit.org
Message:

Fail speech recognition when page is muted for audio capture
https://bugs.webkit.org/show_bug.cgi?id=220133
<rdar://problem/72745232>

Patch by Sihui Liu <sihui_liu@apple.com> on 2021-01-05
Reviewed by Youenn Fablet.

Source/WebCore:

API test: WebKit2.SpeechRecognitionErrorWhenStartingAudioCaptureOnDifferentPage

  • Modules/speech/SpeechRecognitionCaptureSource.cpp:

(WebCore::SpeechRecognitionCaptureSource::mute):

  • Modules/speech/SpeechRecognitionCaptureSource.h:
  • Modules/speech/SpeechRecognitionCaptureSourceImpl.cpp:

(WebCore::SpeechRecognitionCaptureSourceImpl::mute):

  • Modules/speech/SpeechRecognitionCaptureSourceImpl.h:
  • Modules/speech/SpeechRecognizer.h:

(WebCore::SpeechRecognizer::source):

Source/WebKit:

We currently allow only one page to capture media at a time, and we enforce this by muting (that is, stopping
capture in) the other pages. To make speech recognition work with this behavior, two changes are made:

  1. When a page is muted, also mute the audio capture source used for speech recognition on that page. This
ultimately fails the recognition.
  2. When speech recognition is about to start, make sure other pages are muted for capture.
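The two steps above can be sketched as a minimal, hypothetical model. The function name `muteCaptureInPagesExcept` mirrors the new WebKit method, but the `Browser` and `Page` types here are invented for illustration and are not WebKit classes:

```cpp
#include <cassert>
#include <cstdint>
#include <unordered_map>

// Simplified model of the two changes: starting recognition on one page
// mutes capture in every other page, and muting a page's capture fails
// any speech recognition running on it.

using PageID = uint64_t;

struct Page {
    bool captureMuted = false;
    bool recognitionFailed = false;
};

struct Browser {
    std::unordered_map<PageID, Page> pages;

    // Change 1: muting a page's capture also fails its speech recognition,
    // because the recognizer's audio source reports itself as muted.
    void setCaptureMuted(PageID id, bool muted)
    {
        auto& page = pages[id];
        page.captureMuted = muted;
        if (muted)
            page.recognitionFailed = true;
    }

    // Change 2: before recognition starts, mute capture in every other page
    // (analogous to WebProcessProxy::muteCaptureInPagesExcept).
    void muteCaptureInPagesExcept(PageID current)
    {
        for (auto& [id, page] : pages) {
            if (id != current)
                setCaptureMuted(id, true);
        }
    }

    void startSpeechRecognition(PageID id)
    {
        muteCaptureInPagesExcept(id);
        pages[id].captureMuted = false;
        pages[id].recognitionFailed = false;
    }
};
```

With two pages, starting recognition on the second page mutes the first and fails its in-progress recognition, which is the scenario the new API test exercises.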
  • UIProcess/SpeechRecognitionPermissionManager.h:

(WebKit::SpeechRecognitionPermissionManager::page):

  • UIProcess/SpeechRecognitionServer.cpp:

(WebKit::SpeechRecognitionServer::handleRequest):
(WebKit::SpeechRecognitionServer::mute):

  • UIProcess/SpeechRecognitionServer.h:
  • UIProcess/UserMediaProcessManager.cpp:

(WebKit::UserMediaProcessManager::muteCaptureMediaStreamsExceptIn): Deleted.

  • UIProcess/UserMediaProcessManager.h:
  • UIProcess/WebPageProxy.cpp:

(WebKit::WebPageProxy::activateMediaStreamCaptureInPage):
(WebKit::WebPageProxy::setMuted):

  • UIProcess/WebProcessProxy.cpp:

(WebKit::WebProcessProxy::muteCaptureInPagesExcept):
(WebKit::WebProcessProxy::pageMutedStateChanged):

  • UIProcess/WebProcessProxy.h:

Tools:

  • TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj:
  • TestWebKitAPI/Tests/WebKitCocoa/SpeechRecognition.mm:

(-[SpeechRecognitionPermissionUIDelegate _webView:requestMediaCaptureAuthorization:decisionHandler:]):
(-[SpeechRecognitionPermissionUIDelegate _webView:checkUserMediaPermissionForURL:mainFrameURL:frameIdentifier:decisionHandler:]):
(TestWebKitAPI::TEST):

  • TestWebKitAPI/Tests/WebKitCocoa/speechrecognition-basic.html: Added.
Location:
trunk
Files:
1 added
18 edited

  • trunk/Source/WebCore/ChangeLog

    r271153 r271154  
     12021-01-05  Sihui Liu  <sihui_liu@apple.com>
     2
     3        Fail speech recognition when page is muted for audio capture
     4        https://bugs.webkit.org/show_bug.cgi?id=220133
     5        <rdar://problem/72745232>
     6
     7        Reviewed by Youenn Fablet.
     8
     9        API test: WebKit2.SpeechRecognitionErrorWhenStartingAudioCaptureOnDifferentPage
     10
     11        * Modules/speech/SpeechRecognitionCaptureSource.cpp:
     12        (WebCore::SpeechRecognitionCaptureSource::mute):
     13        * Modules/speech/SpeechRecognitionCaptureSource.h:
     14        * Modules/speech/SpeechRecognitionCaptureSourceImpl.cpp:
     15        (WebCore::SpeechRecognitionCaptureSourceImpl::mute):
     16        * Modules/speech/SpeechRecognitionCaptureSourceImpl.h:
     17        * Modules/speech/SpeechRecognizer.h:
     18        (WebCore::SpeechRecognizer::source):
     19
    1202021-01-05  Alex Christensen  <achristensen@webkit.org>
    221
  • trunk/Source/WebCore/Modules/speech/SpeechRecognitionCaptureSource.cpp

    r270574 r271154  
    3535namespace WebCore {
    3636
     37void SpeechRecognitionCaptureSource::mute()
     38{
     39#if ENABLE(MEDIA_STREAM)
     40    m_impl->mute();
     41#endif
     42}
     43
    3744#if ENABLE(MEDIA_STREAM)
    3845
  • trunk/Source/WebCore/Modules/speech/SpeechRecognitionCaptureSource.h

    r270574 r271154  
    4545    SpeechRecognitionCaptureSource() = default;
    4646    ~SpeechRecognitionCaptureSource() = default;
     47    WEBCORE_EXPORT void mute();
     48
     49#if ENABLE(MEDIA_STREAM)
    4750    using DataCallback = Function<void(const WTF::MediaTime&, const PlatformAudioData&, const AudioStreamDescription&, size_t)>;
    4851    using StateUpdateCallback = Function<void(const SpeechRecognitionUpdate&)>;
    49 
    50 #if ENABLE(MEDIA_STREAM)
    5152    SpeechRecognitionCaptureSource(SpeechRecognitionConnectionClientIdentifier, DataCallback&&, StateUpdateCallback&&, Ref<RealtimeMediaSource>&&);
    5253    WEBCORE_EXPORT static Optional<WebCore::CaptureDevice> findCaptureDevice();
  • trunk/Source/WebCore/Modules/speech/SpeechRecognitionCaptureSourceImpl.cpp

    r270158 r271154  
    147147}
    148148
     149void SpeechRecognitionCaptureSourceImpl::mute()
     150{
     151    m_source->setMuted(true);
     152}
     153
    149154} // namespace WebCore
    150155
  • trunk/Source/WebCore/Modules/speech/SpeechRecognitionCaptureSourceImpl.h

    r270158 r271154  
    5555    SpeechRecognitionCaptureSourceImpl(SpeechRecognitionConnectionClientIdentifier, DataCallback&&, StateUpdateCallback&&, Ref<RealtimeMediaSource>&&);
    5656    ~SpeechRecognitionCaptureSourceImpl();
     57    void mute();
    5758
    5859private:
  • trunk/Source/WebCore/Modules/speech/SpeechRecognizer.h

    r270772 r271154  
    5353
    5454    Optional<SpeechRecognitionConnectionClientIdentifier> currentClientIdentifier() const { return m_clientIdentifier; }
     55    SpeechRecognitionCaptureSource* source() { return m_source.get(); }
    5556
    5657private:
  • trunk/Source/WebKit/ChangeLog

    r271153 r271154  
     12021-01-05  Sihui Liu  <sihui_liu@apple.com>
     2
     3        Fail speech recognition when page is muted for audio capture
     4        https://bugs.webkit.org/show_bug.cgi?id=220133
     5        <rdar://problem/72745232>
     6
     7        Reviewed by Youenn Fablet.
     8
     9        We currently only allow one page to capture media at a time and we did this by muting (stop capture in) other
     10        pages. To make speech recognition work with this behavior, two changes are made:
     11        1. when page is muted, mute audio capture source used for speech recognition on the page. This will
     12        ultimately fail recognition.
     13        2. when speech recognition is about to start, make sure other pages are muted for capture.
     14
     15        * UIProcess/SpeechRecognitionPermissionManager.h:
     16        (WebKit::SpeechRecognitionPermissionManager::page):
     17        * UIProcess/SpeechRecognitionServer.cpp:
     18        (WebKit::SpeechRecognitionServer::handleRequest):
     19        (WebKit::SpeechRecognitionServer::mute):
     20        * UIProcess/SpeechRecognitionServer.h:
     21        * UIProcess/UserMediaProcessManager.cpp:
     22        (WebKit::UserMediaProcessManager::muteCaptureMediaStreamsExceptIn): Deleted.
     23        * UIProcess/UserMediaProcessManager.h:
     24        * UIProcess/WebPageProxy.cpp:
     25        (WebKit::WebPageProxy::activateMediaStreamCaptureInPage):
     26        (WebKit::WebPageProxy::setMuted):
     27        * UIProcess/WebProcessProxy.cpp:
     28        (WebKit::WebProcessProxy::muteCaptureInPagesExcept):
     29        (WebKit::WebProcessProxy::pageMutedStateChanged):
     30        * UIProcess/WebProcessProxy.h:
     31
    1322021-01-05  Alex Christensen  <achristensen@webkit.org>
    233
  • trunk/Source/WebKit/UIProcess/SpeechRecognitionPermissionManager.h

    r271031 r271154  
    4343
    4444    void decideByDefaultAction(const WebCore::SecurityOrigin&, CompletionHandler<void(bool)>&&);
     45    WebPageProxy& page() { return m_page; }
    4546
    4647private:
  • trunk/Source/WebKit/UIProcess/SpeechRecognitionServer.cpp

    r271031 r271154  
    2828
    2929#include "SpeechRecognitionPermissionRequest.h"
     30#include "UserMediaProcessManager.h"
    3031#include "WebProcessProxy.h"
    3132#include "WebSpeechRecognitionConnectionMessages.h"
     
    125126    }
    126127
     128    WebProcessProxy::muteCaptureInPagesExcept(m_identifier);
     129
    127130    bool mockDeviceCapturesEnabled = m_checkIfMockSpeechRecognitionEnabled();
    128131    m_recognizer->start(clientIdentifier, sourceOrError.source(), mockDeviceCapturesEnabled, request.lang(), request.continuous(), request.interimResults(), request.maxAlternatives());
     
    191194}
    192195
     196void SpeechRecognitionServer::mute()
     197{
     198    if (!m_recognizer)
     199        return;
     200   
     201    if (auto* source = m_recognizer->source())
     202        source->mute();
     203}
     204
    193205} // namespace WebKit
    194206
  • trunk/Source/WebKit/UIProcess/SpeechRecognitionServer.h

    r271031 r271154  
    6464    void abort(WebCore::SpeechRecognitionConnectionClientIdentifier);
    6565    void invalidate(WebCore::SpeechRecognitionConnectionClientIdentifier);
     66    void mute();
    6667
    6768private:
  • trunk/Source/WebKit/UIProcess/UserMediaProcessManager.cpp

    r264521 r271154  
    2424#include "Logging.h"
    2525#include "MediaDeviceSandboxExtensions.h"
     26#include "SpeechRecognitionPermissionManager.h"
    2627#include "WebPageProxy.h"
    2728#include "WebProcessMessages.h"
     
    5758}
    5859
    59 void UserMediaProcessManager::muteCaptureMediaStreamsExceptIn(WebPageProxy& pageStartingCapture)
    60 {
    61 #if PLATFORM(COCOA)
    62     UserMediaPermissionRequestManagerProxy::forEach([&pageStartingCapture](auto& proxy) {
    63         if (&proxy.page() != &pageStartingCapture)
    64             proxy.page().setMediaStreamCaptureMuted(true);
    65     });
    66 #else
    67     UNUSED_PARAM(pageStartingCapture);
    68 #endif
    69 }
    70 
    7160#if ENABLE(SANDBOX_EXTENSIONS)
    7261static bool needsAppleCameraService()
  • trunk/Source/WebKit/UIProcess/UserMediaProcessManager.h

    r245335 r271154  
    2323#include "UserMediaPermissionRequestManagerProxy.h"
    2424#include <WebCore/CaptureDevice.h>
     25#include <WebCore/PageIdentifier.h>
    2526#include <wtf/RunLoop.h>
    2627
     
    3738
    3839    bool willCreateMediaStream(UserMediaPermissionRequestManagerProxy&, bool withAudio, bool withVideo);
    39     void muteCaptureMediaStreamsExceptIn(WebPageProxy&);
    4040
    4141    void revokeSandboxExtensionsIfNeeded(WebProcessProxy&);
  • trunk/Source/WebKit/UIProcess/WebPageProxy.cpp

    r271059 r271154  
    23882388{
    23892389#if ENABLE(MEDIA_STREAM)
    2390     UserMediaProcessManager::singleton().muteCaptureMediaStreamsExceptIn(*this);
     2390    WebProcessProxy::muteCaptureInPagesExcept(m_webPageID);
    23912391#endif
    23922392    setMuted(m_mutedState & ~WebCore::MediaProducer::MediaStreamCaptureIsMuted);
     
    59555955    bool hasMutedCaptureStreams = m_mediaState & WebCore::MediaProducer::MutedCaptureMask;
    59565956    if (hasMutedCaptureStreams && !(state & WebCore::MediaProducer::MediaStreamCaptureIsMuted))
    5957         UserMediaProcessManager::singleton().muteCaptureMediaStreamsExceptIn(*this);
    5958 #endif
     5957        WebProcessProxy::muteCaptureInPagesExcept(m_webPageID);
     5958#endif
     5959
     5960    m_process->pageMutedStateChanged(m_webPageID, state);
    59595961
    59605962    send(Messages::WebPage::SetMuted(state));
  • trunk/Source/WebKit/UIProcess/WebProcessProxy.cpp

    r271031 r271154  
    17651765}
    17661766
    1767 #endif
     1767void WebProcessProxy::muteCaptureInPagesExcept(WebCore::PageIdentifier pageID)
     1768{
     1769#if PLATFORM(COCOA)
     1770    for (auto* page : globalPageMap().values()) {
     1771        if (page->webPageID() != pageID)
     1772            page->setMediaStreamCaptureMuted(true);
     1773    }
     1774#else
     1775    UNUSED_PARAM(pageID);
     1776#endif
     1777}
     1778
     1779#endif
     1780
     1781void WebProcessProxy::pageMutedStateChanged(WebCore::PageIdentifier identifier, WebCore::MediaProducer::MutedStateFlags flags)
     1782{
     1783    bool mutedForCapture = flags & MediaProducer::AudioAndVideoCaptureIsMuted;
     1784    if (!mutedForCapture)
     1785        return;
     1786
     1787    if (auto speechRecognitionServer = m_speechRecognitionServerMap.get(identifier))
     1788        speechRecognitionServer->mute();
     1789}
    17681790
    17691791#if PLATFORM(WATCHOS)
  • trunk/Source/WebKit/UIProcess/WebProcessProxy.h

    r270574 r271154  
    4646#include "WebProcessProxyMessagesReplies.h"
    4747#include <WebCore/FrameIdentifier.h>
     48#include <WebCore/MediaProducer.h>
    4849#include <WebCore/PageIdentifier.h>
    4950#include <WebCore/ProcessIdentifier.h>
     
    399400    void setIgnoreInvalidMessageForTesting();
    400401#endif
    401    
     402
    402403#if ENABLE(MEDIA_STREAM)
     404    static void muteCaptureInPagesExcept(WebCore::PageIdentifier);
    403405    SpeechRecognitionRemoteRealtimeMediaSourceManager& ensureSpeechRecognitionRemoteRealtimeMediaSourceManager();
    404406#endif
     407    void pageMutedStateChanged(WebCore::PageIdentifier, WebCore::MediaProducer::MutedStateFlags);
    405408
    406409protected:
  • trunk/Tools/ChangeLog

    r271151 r271154  
     12021-01-05  Sihui Liu  <sihui_liu@apple.com>
     2
     3        Fail speech recognition when page is muted for audio capture
     4        https://bugs.webkit.org/show_bug.cgi?id=220133
     5        <rdar://problem/72745232>
     6
     7        Reviewed by Youenn Fablet.
     8
     9        * TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj:
     10        * TestWebKitAPI/Tests/WebKitCocoa/SpeechRecognition.mm:
     11        (-[SpeechRecognitionPermissionUIDelegate _webView:requestMediaCaptureAuthorization:decisionHandler:]):
     12        (-[SpeechRecognitionPermissionUIDelegate _webView:checkUserMediaPermissionForURL:mainFrameURL:frameIdentifier:decisionHandler:]):
     13        (TestWebKitAPI::TEST):
     14        * TestWebKitAPI/Tests/WebKitCocoa/speechrecognition-basic.html: Added.
     15
    1162021-01-05  Alexey Proskuryakov  <ap@apple.com>
    217
  • trunk/Tools/TestWebKitAPI/TestWebKitAPI.xcodeproj/project.pbxproj

    r270638 r271154  
    810810                935786CD20F6A2910000CDFC /* IndexedDB.sqlite3 in Copy Resources */ = {isa = PBXBuildFile; fileRef = 934FA5C720F69FEE0040DC1B /* IndexedDB.sqlite3 */; };
    811811                935786CE20F6A2A10000CDFC /* IndexedDB.sqlite3-shm in Copy Resources */ = {isa = PBXBuildFile; fileRef = 934FA5C620F69FED0040DC1B /* IndexedDB.sqlite3-shm */; };
     812                9360270625A3CF7600367670 /* speechrecognition-basic.html in Copy Resources */ = {isa = PBXBuildFile; fileRef = 9360270525A3B28E00367670 /* speechrecognition-basic.html */; };
    812813                9361002914DC95A70061379D /* lots-of-iframes.html in Copy Resources */ = {isa = PBXBuildFile; fileRef = 9361002814DC957B0061379D /* lots-of-iframes.html */; };
    813814                93625D271CD9741C006DC1F1 /* large-video-without-audio.html in Copy Resources */ = {isa = PBXBuildFile; fileRef = 93625D261CD973AF006DC1F1 /* large-video-without-audio.html */; };
     
    16201621                                C01A23F21266156700C9ED55 /* spacebar-scrolling.html in Copy Resources */,
    16211622                                F4CFCDDA249FC9E400527482 /* SpaceOnly.otf in Copy Resources */,
     1623                                9360270625A3CF7600367670 /* speechrecognition-basic.html in Copy Resources */,
    16221624                                9342589E255B6A120059EEDD /* speechrecognition-user-permission-persistence.html in Copy Resources */,
    16231625                                E194E1BD177E53C7009C4D4E /* StopLoadingFromDidReceiveResponse.html in Copy Resources */,
     
    24232425                934FA5C720F69FEE0040DC1B /* IndexedDB.sqlite3 */ = {isa = PBXFileReference; lastKnownFileType = file; path = IndexedDB.sqlite3; sourceTree = "<group>"; };
    24242426                93575C551D30366E000D604D /* focus-inputs.html */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.html; path = "focus-inputs.html"; sourceTree = "<group>"; };
     2427                9360270525A3B28E00367670 /* speechrecognition-basic.html */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.html; path = "speechrecognition-basic.html"; sourceTree = "<group>"; };
    24252428                9361002814DC957B0061379D /* lots-of-iframes.html */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.html; path = "lots-of-iframes.html"; sourceTree = "<group>"; };
    24262429                93625D261CD973AF006DC1F1 /* large-video-without-audio.html */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text.html; path = "large-video-without-audio.html"; sourceTree = "<group>"; };
     
    39303933                                F4F405BB1D4C0CF8007A9707 /* skinny-autoplaying-video-with-audio.html */,
    39313934                                F4CFCDD8249FC9D900527482 /* SpaceOnly.otf */,
     3935                                9360270525A3B28E00367670 /* speechrecognition-basic.html */,
    39323936                                9342589D255B66A00059EEDD /* speechrecognition-user-permission-persistence.html */,
    39333937                                515BE16E1D4288FF00DD7C68 /* StoreBlobToBeDeleted.html */,
  • trunk/Tools/TestWebKitAPI/Tests/WebKitCocoa/SpeechRecognition.mm

    r269810 r271154  
    3131#import <WebKit/WKUIDelegatePrivate.h>
    3232#import <WebKit/WKWebView.h>
    33 #import <WebKit/WKWebViewConfiguration.h>
     33#import <WebKit/WKWebViewConfigurationPrivate.h>
    3434#import <wtf/RetainPtr.h>
    3535
     
    4141@interface SpeechRecognitionPermissionUIDelegate : NSObject<WKUIDelegatePrivate>
    4242- (void)_webView:(WKWebView *)webView requestSpeechRecognitionPermissionForOrigin:(WKSecurityOrigin *)origin decisionHandler:(void (^)(BOOL))decisionHandler;
     43- (void)_webView:(WKWebView *)webView requestMediaCaptureAuthorization: (_WKCaptureDevices)devices decisionHandler:(void (^)(BOOL))decisionHandler;
     44- (void)_webView:(WKWebView *)webView checkUserMediaPermissionForURL:(NSURL *)url mainFrameURL:(NSURL *)mainFrameURL frameIdentifier:(NSUInteger)frameIdentifier decisionHandler:(void (^)(NSString *salt, BOOL authorized))decisionHandler;
    4345@end
    4446
     
    4850    permissionRequested = true;
    4951    decisionHandler(shouldGrantPermissionRequest);
     52}
     53
     54- (void)_webView:(WKWebView *)webView requestMediaCaptureAuthorization: (_WKCaptureDevices)devices decisionHandler:(void (^)(BOOL))decisionHandler
     55{
     56    decisionHandler(YES);
     57}
     58
     59- (void)_webView:(WKWebView *)webView checkUserMediaPermissionForURL:(NSURL *)url mainFrameURL:(NSURL *)mainFrameURL frameIdentifier:(NSUInteger)frameIdentifier decisionHandler:(void (^)(NSString *salt, BOOL authorized))decisionHandler
     60{
     61    decisionHandler(@"0x9876543210", YES);
    5062}
    5163@end
     
    104116}
    105117
     118TEST(WebKit2, SpeechRecognitionErrorWhenStartingAudioCaptureOnDifferentPage)
     119{
     120    shouldGrantPermissionRequest = true;
     121
     122    auto configuration = adoptNS([[WKWebViewConfiguration alloc] init]);
     123    auto handler = adoptNS([[SpeechRecognitionMessageHandler alloc] init]);
     124    [[configuration userContentController] addScriptMessageHandler:handler.get() name:@"testHandler"];
     125    configuration.get()._mediaCaptureEnabled = YES;
     126    auto preferences = [configuration preferences];
     127    preferences._mockCaptureDevicesEnabled = YES;
     128    preferences._speechRecognitionEnabled = YES;
     129    preferences._mediaCaptureRequiresSecureConnection = NO;
     130    auto delegate = adoptNS([[SpeechRecognitionPermissionUIDelegate alloc] init]);
     131    auto firstWebView = adoptNS([[TestWKWebView alloc] initWithFrame:CGRectMake(0, 0, 800, 600) configuration:configuration.get()]);
     132    [firstWebView setUIDelegate:delegate.get()];
     133    auto secondWebView = adoptNS([[TestWKWebView alloc] initWithFrame:CGRectMake(0, 0, 800, 600) configuration:configuration.get()]);
     134    [secondWebView setUIDelegate:delegate.get()];
     135
     136    // First page starts recognition successfully.
     137    receivedScriptMessage = false;
     138    [firstWebView synchronouslyLoadTestPageNamed:@"speechrecognition-basic"];
     139    [firstWebView stringByEvaluatingJavaScript:@"start()"];
     140    TestWebKitAPI::Util::run(&receivedScriptMessage);
     141    EXPECT_WK_STREQ(@"Start", [lastScriptMessage body]);
     142
     143    // First page is muted when second page starts recognition.
     144    // Load html string instead of test page to make sure message only comes from one page.
     145    receivedScriptMessage = false;
     146    [secondWebView synchronouslyLoadHTMLString:@"<script>speechRecognition = new webkitSpeechRecognition(); speechRecognition.start();</script>" baseURL:[NSURL URLWithString:@"https://webkit.org"]];
     147    TestWebKitAPI::Util::run(&receivedScriptMessage);
     148    EXPECT_WK_STREQ(@"Error: audio-capture - Source is muted", [lastScriptMessage body]);
     149
     150    // First page restarts recognition successfully.
     151    receivedScriptMessage = false;
     152    [firstWebView stringByEvaluatingJavaScript:@"start()"];
     153    TestWebKitAPI::Util::run(&receivedScriptMessage);
     154    EXPECT_WK_STREQ(@"Start", [lastScriptMessage body]);
     155
     156    // First page is muted when second page starts media recorder.
     157    receivedScriptMessage = false;
     158    [secondWebView synchronouslyLoadTestPageNamed:@"speechrecognition-basic"];
     159    [secondWebView stringByEvaluatingJavaScript:@"startRecorder()"];
     160    TestWebKitAPI::Util::run(&receivedScriptMessage);
     161    EXPECT_WK_STREQ(@"Error: audio-capture - Source is muted", [lastScriptMessage body]);
     162
     163    // Second page is muted when first page starts recognition.
     164    receivedScriptMessage = false;
     165    [firstWebView synchronouslyLoadHTMLString:@"<script>speechRecognition = new webkitSpeechRecognition(); speechRecognition.start();</script>" baseURL:[NSURL URLWithString:@"https://webkit.org"]];
     166    TestWebKitAPI::Util::run(&receivedScriptMessage);
     167    EXPECT_WK_STREQ(@"Recorder Mute", [lastScriptMessage body]);
     168}
     169
    106170} // namespace TestWebKitAPI