Changeset 272434 in webkit
- Timestamp: Feb 5, 2021 12:27:19 PM
- Location: trunk
- Files: 15 edited
- LayoutTests/ChangeLog (modified) (1 diff)
- LayoutTests/fast/speechrecognition/ios/restart-recognition-after-stop.html (modified) (1 diff)
- LayoutTests/fast/speechrecognition/ios/start-recognition-then-stop.html (modified) (1 diff)
- LayoutTests/fast/speechrecognition/start-recognition-then-stop.html (modified) (1 diff)
- LayoutTests/fast/speechrecognition/start-second-recognition.html (modified) (1 diff)
- Source/WebCore/ChangeLog (modified) (1 diff)
- Source/WebCore/Modules/speech/SpeechRecognitionCaptureSource.cpp (modified) (1 diff)
- Source/WebKit/ChangeLog (modified) (1 diff)
- Source/WebKit/UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp (modified) (2 diffs)
- Source/WebKit/UIProcess/UserMediaPermissionRequestManagerProxy.cpp (modified) (1 diff)
- Source/WebKit/UIProcess/WebPageProxy.cpp (modified) (1 diff)
- Source/WebKit/WebProcess/Speech/SpeechRecognitionRealtimeMediaSourceManager.cpp (modified) (3 diffs)
- Source/WebKit/WebProcess/cocoa/RemoteRealtimeMediaSource.cpp (modified) (6 diffs)
- Source/WebKit/WebProcess/cocoa/RemoteRealtimeMediaSource.h (modified) (6 diffs)
- Source/WebKit/WebProcess/cocoa/UserMediaCaptureManager.cpp (modified) (5 diffs)
trunk/LayoutTests/ChangeLog
r272433 → r272434:

+2021-02-05  Youenn Fablet  <youenn@apple.com>
+
+        Enable audio capture for speech recognition in GPUProcess
+        https://bugs.webkit.org/show_bug.cgi?id=221457
+
+        Reviewed by Eric Carlson.
+
+        * fast/speechrecognition/ios/restart-recognition-after-stop.html:
+        * fast/speechrecognition/ios/start-recognition-then-stop.html:
+        * fast/speechrecognition/start-recognition-then-stop.html:
+        * fast/speechrecognition/start-second-recognition.html:
+
 2021-02-05  Patrick Angle  <pangle@apple.com>
trunk/LayoutTests/fast/speechrecognition/ios/restart-recognition-after-stop.html
r272304 → r272434:

-<!DOCTYPE html> <!-- webkit-test-runner [ CaptureAudioInGPUProcessEnabled=false ] -->
+<!DOCTYPE html>
 <html>
 <body>
trunk/LayoutTests/fast/speechrecognition/ios/start-recognition-then-stop.html
r272304 → r272434:

-<!DOCTYPE html> <!-- webkit-test-runner [ CaptureAudioInGPUProcessEnabled=false ] -->
+<!DOCTYPE html>
 <html>
 <body>
trunk/LayoutTests/fast/speechrecognition/start-recognition-then-stop.html
r272053 → r272434:

-<!DOCTYPE html> <!-- webkit-test-runner [ CaptureAudioInGPUProcessEnabled=false ] -->
+<!DOCTYPE html>
 <html>
 <body>
trunk/LayoutTests/fast/speechrecognition/start-second-recognition.html
r272053 → r272434:

-<!DOCTYPE html> <!-- webkit-test-runner [ CaptureAudioInGPUProcessEnabled=false ] -->
+<!DOCTYPE html>
 <html>
 <body>
trunk/Source/WebCore/ChangeLog
r272433 → r272434:

+2021-02-05  Youenn Fablet  <youenn@apple.com>
+
+        Enable audio capture for speech recognition in GPUProcess
+        https://bugs.webkit.org/show_bug.cgi?id=221457
+
+        Reviewed by Eric Carlson.
+
+        Add fake deviceId to play nice with capture ASSERTs.
+        Covered by updated tests.
+
+        * Modules/speech/SpeechRecognitionCaptureSource.cpp:
+        (WebCore::SpeechRecognitionCaptureSource::createRealtimeMediaSource):
+
 2021-02-05  Patrick Angle  <pangle@apple.com>
trunk/Source/WebCore/Modules/speech/SpeechRecognitionCaptureSource.cpp
r271154 → r272434:

 CaptureSourceOrError SpeechRecognitionCaptureSource::createRealtimeMediaSource(const CaptureDevice& captureDevice)
 {
-    return RealtimeMediaSourceCenter::singleton().audioCaptureFactory().createAudioCaptureSource(captureDevice, { }, { });
+    return RealtimeMediaSourceCenter::singleton().audioCaptureFactory().createAudioCaptureSource(captureDevice, "SpeechID"_s, { });
 }
trunk/Source/WebKit/ChangeLog
r272425 → r272434:

+2021-02-05  Youenn Fablet  <youenn@apple.com>
+
+        Enable audio capture for speech recognition in GPUProcess
+        https://bugs.webkit.org/show_bug.cgi?id=221457
+
+        Reviewed by Eric Carlson.
+
+        Allow creating remote sources without any constraints.
+        To do so, we serialize through IPC a MediaConstraints with isValid = false and treat it as "no constraints" in the capture process.
+
+        Make sure to send sandbox extensions and authorizations for GPUProcess capture in the case of a speech recognition audio capture request.
+
+        In the case of GPUProcess audio capture, send the capture request to the WebProcess, as is done on iOS.
+        The WebProcess is then responsible for getting audio samples from the GPUProcess and forwarding them to the UIProcess.
+        A future refactoring should move speech recognition to the GPUProcess.
+
+        * UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp:
+        (WebKit::UserMediaCaptureManagerProxy::createMediaSourceForCaptureDeviceWithConstraints):
+        * UIProcess/UserMediaPermissionRequestManagerProxy.cpp:
+        (WebKit::UserMediaPermissionRequestManagerProxy::grantRequest):
+        * UIProcess/WebPageProxy.cpp:
+        (WebKit::WebPageProxy::createRealtimeMediaSourceForSpeechRecognition):
+        * WebProcess/Speech/SpeechRecognitionRealtimeMediaSourceManager.cpp:
+        (WebKit::SpeechRecognitionRealtimeMediaSourceManager::grantSandboxExtensions):
+        (WebKit::SpeechRecognitionRealtimeMediaSourceManager::createSource):
+        * WebProcess/cocoa/RemoteRealtimeMediaSource.cpp:
+        (WebKit::RemoteRealtimeMediaSource::create):
+        (WebKit::RemoteRealtimeMediaSource::RemoteRealtimeMediaSource):
+        (WebKit::RemoteRealtimeMediaSource::createRemoteMediaSource):
+        (WebKit::RemoteRealtimeMediaSource::~RemoteRealtimeMediaSource):
+        (WebKit::RemoteRealtimeMediaSource::cloneVideoSource):
+        (WebKit::RemoteRealtimeMediaSource::gpuProcessConnectionDidClose):
+        * WebProcess/cocoa/RemoteRealtimeMediaSource.h:
+        * WebProcess/cocoa/UserMediaCaptureManager.cpp:
+        (WebKit::UserMediaCaptureManager::AudioFactory::createAudioCaptureSource):
+        (WebKit::UserMediaCaptureManager::VideoFactory::createVideoCaptureSource):
+        (WebKit::UserMediaCaptureManager::DisplayFactory::createDisplayCaptureSource):
+
 2021-02-05  Kate Cheney  <katherine_cheney@apple.com>
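The "invalid constraints means no constraints" convention described in the ChangeLog above can be sketched as follows. This is a minimal, illustrative stand-in, not the real WebKit declarations: the simplified `MediaConstraints` struct and the `constraintsOrNull` helper name are assumptions made for the example.

```cpp
// Simplified stand-in for WebCore::MediaConstraints (illustrative only).
// The real struct carries mandatory/advanced constraint sets; the part that
// matters here is the isValid flag, which survives IPC serialization.
struct MediaConstraints {
    bool isValid { false };
};

// Receiver-side pattern: a constraints object decoded with isValid == false
// is treated as "no constraints", so downstream capture factories receive a
// null pointer instead of a reference to an empty constraint set.
inline const MediaConstraints* constraintsOrNull(const MediaConstraints& decoded)
{
    return decoded.isValid ? &decoded : nullptr;
}
```

This avoids adding an optional type to the IPC message: the sender always encodes a `MediaConstraints`, and the receiver demotes an invalid one to `nullptr` before dispatching to the audio/video/display factories.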
trunk/Source/WebKit/UIProcess/Cocoa/UserMediaCaptureManagerProxy.cpp
r270961 → r272434:

-void UserMediaCaptureManagerProxy::createMediaSourceForCaptureDeviceWithConstraints(RealtimeMediaSourceIdentifier id, const CaptureDevice& device, String&& hashSalt, const MediaConstraints& constraints, CompletionHandler<void(bool succeeded, String invalidConstraints, WebCore::RealtimeMediaSourceSettings&&, WebCore::RealtimeMediaSourceCapabilities&&)>&& completionHandler)
+void UserMediaCaptureManagerProxy::createMediaSourceForCaptureDeviceWithConstraints(RealtimeMediaSourceIdentifier id, const CaptureDevice& device, String&& hashSalt, const MediaConstraints& mediaConstraints, CompletionHandler<void(bool succeeded, String invalidConstraints, WebCore::RealtimeMediaSourceSettings&&, WebCore::RealtimeMediaSourceCapabilities&&)>&& completionHandler)
 {
     if (!m_connectionProxy->willStartCapture(device.type()))
         return completionHandler(false, "Request is not allowed"_s, RealtimeMediaSourceSettings { }, { });
+
+    auto* constraints = mediaConstraints.isValid ? &mediaConstraints : nullptr;
 
     CaptureSourceOrError sourceOrError;
     switch (device.type()) {
     case WebCore::CaptureDevice::DeviceType::Microphone:
-        sourceOrError = RealtimeMediaSourceCenter::singleton().audioCaptureFactory().createAudioCaptureSource(device, WTFMove(hashSalt), &constraints);
+        sourceOrError = RealtimeMediaSourceCenter::singleton().audioCaptureFactory().createAudioCaptureSource(device, WTFMove(hashSalt), constraints);
         break;
     case WebCore::CaptureDevice::DeviceType::Camera:
-        sourceOrError = RealtimeMediaSourceCenter::singleton().videoCaptureFactory().createVideoCaptureSource(device, WTFMove(hashSalt), &constraints);
+        sourceOrError = RealtimeMediaSourceCenter::singleton().videoCaptureFactory().createVideoCaptureSource(device, WTFMove(hashSalt), constraints);
         if (sourceOrError)
             sourceOrError.captureSource->monitorOrientation(m_orientationNotifier);
    …
     case WebCore::CaptureDevice::DeviceType::Screen:
     case WebCore::CaptureDevice::DeviceType::Window:
-        sourceOrError = RealtimeMediaSourceCenter::singleton().displayCaptureFactory().createDisplayCaptureSource(device, &constraints);
+        sourceOrError = RealtimeMediaSourceCenter::singleton().displayCaptureFactory().createDisplayCaptureSource(device, constraints);
         break;
     case WebCore::CaptureDevice::DeviceType::Speaker:
trunk/Source/WebKit/UIProcess/UserMediaPermissionRequestManagerProxy.cpp
r272165 → r272434:

     if (auto callback = request.decisionCompletionHandler()) {
+        m_page.willStartCapture(request, [callback = WTFMove(callback)]() mutable {
+            callback(true);
+        });
         m_grantedRequests.append(makeRef(request));
-        callback(true);
         return;
     }
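The grantRequest() change above defers the decision callback until the page has finished its capture setup (sandbox extensions, GPUProcess authorization), instead of invoking it synchronously. A minimal sketch of that ordering, using hypothetical stand-in types rather than the real WebKit classes:

```cpp
#include <functional>

// Hypothetical stand-in for WebPageProxy: willStartCapture() performs the
// capture setup and only then runs the completion handler it was given.
struct FakePage {
    bool setupDone { false };
    void willStartCapture(std::function<void()>&& completion)
    {
        setupDone = true; // stands in for the real (possibly async) setup work
        completion();
    }
};

// Mirrors the reordered grantRequest(): the decision callback fires from
// inside willStartCapture's completion, not before it. Returns whether the
// callback observed the setup as already finished.
inline bool grantRequest(FakePage& page)
{
    bool sawSetupDone = false;
    page.willStartCapture([&page, &sawSetupDone] {
        sawSetupDone = page.setupDone;
    });
    return sawSetupDone;
}
```

The design point is that the capture process must hold its authorizations before the web content is told the request succeeded; running the callback first would race the setup.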
trunk/Source/WebKit/UIProcess/WebPageProxy.cpp
r272417 → r272434:

 WebCore::CaptureSourceOrError WebPageProxy::createRealtimeMediaSourceForSpeechRecognition()
 {
-    if (preferences().captureAudioInGPUProcessEnabled())
-        return CaptureSourceOrError { "Not implemented for GPU process" };
-
     auto captureDevice = SpeechRecognitionCaptureSource::findCaptureDevice();
     if (!captureDevice)
         return CaptureSourceOrError { "No device is available for capture" };
+
+    if (preferences().captureAudioInGPUProcessEnabled())
+        return CaptureSourceOrError { SpeechRecognitionRemoteRealtimeMediaSource::create(m_process->ensureSpeechRecognitionRemoteRealtimeMediaSourceManager(), *captureDevice) };
 
 #if PLATFORM(IOS_FAMILY)
trunk/Source/WebKit/WebProcess/Speech/SpeechRecognitionRealtimeMediaSourceManager.cpp
r272408 → r272434:

     m_sandboxExtensionForTCCD = SandboxExtension::create(WTFMove(sandboxHandleForTCCD));
     if (!m_sandboxExtensionForTCCD)
-        LOG_ERROR("Failed to create sandbox extension for tccd");
+        RELEASE_LOG_ERROR(Media, "Failed to create sandbox extension for tccd");
     else
         m_sandboxExtensionForTCCD->consume();
    …
     m_sandboxExtensionForMicrophone = SandboxExtension::create(WTFMove(sandboxHandleForMicrophone));
     if (!m_sandboxExtensionForMicrophone)
-        LOG_ERROR("Failed to create sandbox extension for microphone");
+        RELEASE_LOG_ERROR(Media, "Failed to create sandbox extension for microphone");
     else
         m_sandboxExtensionForMicrophone->consume();
    …
     auto result = SpeechRecognitionCaptureSource::createRealtimeMediaSource(device);
     if (!result) {
-        LOG_ERROR("Failed to create realtime source");
+        RELEASE_LOG_ERROR(Media, "Failed to create realtime source");
         send(Messages::SpeechRecognitionRemoteRealtimeMediaSourceManager::RemoteCaptureFailed(identifier), 0);
         return;
trunk/Source/WebKit/WebProcess/cocoa/RemoteRealtimeMediaSource.cpp
r272381 → r272434:

 using namespace WebCore;
 
-Ref<RealtimeMediaSource> RemoteRealtimeMediaSource::create(const CaptureDevice& device, const MediaConstraints& constraints, String&& name, String&& hashSalt, UserMediaCaptureManager& manager, bool shouldCaptureInGPUProcess)
+Ref<RealtimeMediaSource> RemoteRealtimeMediaSource::create(const CaptureDevice& device, const MediaConstraints* constraints, String&& name, String&& hashSalt, UserMediaCaptureManager& manager, bool shouldCaptureInGPUProcess)
 {
-    auto source = adoptRef(*new RemoteRealtimeMediaSource(RealtimeMediaSourceIdentifier::generate(), device.type(), WTFMove(name), WTFMove(hashSalt), manager, shouldCaptureInGPUProcess));
+    auto source = adoptRef(*new RemoteRealtimeMediaSource(RealtimeMediaSourceIdentifier::generate(), device, constraints, WTFMove(name), WTFMove(hashSalt), manager, shouldCaptureInGPUProcess));
     manager.addSource(source.copyRef());
-    source->createRemoteMediaSource(device, constraints);
+    source->createRemoteMediaSource();
     return source;
 }
    …
-RemoteRealtimeMediaSource::RemoteRealtimeMediaSource(RealtimeMediaSourceIdentifier identifier, CaptureDevice::DeviceType deviceType, String&& name, String&& hashSalt, UserMediaCaptureManager& manager, bool shouldCaptureInGPUProcess)
-    : RealtimeMediaSource(sourceTypeFromDeviceType(deviceType), WTFMove(name), String::number(identifier.toUInt64()), WTFMove(hashSalt))
+RemoteRealtimeMediaSource::RemoteRealtimeMediaSource(RealtimeMediaSourceIdentifier identifier, const CaptureDevice& device, const MediaConstraints* constraints, String&& name, String&& hashSalt, UserMediaCaptureManager& manager, bool shouldCaptureInGPUProcess)
+    : RealtimeMediaSource(sourceTypeFromDeviceType(device.type()), WTFMove(name), String::number(identifier.toUInt64()), WTFMove(hashSalt))
     , m_identifier(identifier)
     , m_manager(manager)
-    , m_deviceType(deviceType)
+    , m_device(device)
     , m_shouldCaptureInGPUProcess(shouldCaptureInGPUProcess)
 {
-    switch (m_deviceType) {
+    if (constraints)
+        m_constraints = *constraints;
+
+    switch (m_device.type()) {
     case CaptureDevice::DeviceType::Microphone:
 #if PLATFORM(IOS_FAMILY)
    …
-void RemoteRealtimeMediaSource::createRemoteMediaSource(const CaptureDevice& device, const MediaConstraints& constraints)
-{
-    if (m_shouldCaptureInGPUProcess) {
-        m_device = device;
-        m_constraints = constraints;
-    }
-
-    connection()->sendWithAsyncReply(Messages::UserMediaCaptureManagerProxy::CreateMediaSourceForCaptureDeviceWithConstraints(identifier(), device, deviceIDHashSalt(), constraints), [this, protectedThis = makeRef(*this)](bool succeeded, auto&& errorMessage, auto&& settings, auto&& capabilities) {
+void RemoteRealtimeMediaSource::createRemoteMediaSource()
+{
+    connection()->sendWithAsyncReply(Messages::UserMediaCaptureManagerProxy::CreateMediaSourceForCaptureDeviceWithConstraints(identifier(), m_device, deviceIDHashSalt(), m_constraints), [this, protectedThis = makeRef(*this)](bool succeeded, auto&& errorMessage, auto&& settings, auto&& capabilities) {
         if (!succeeded) {
             didFail(WTFMove(errorMessage));
    …
         WebProcess::singleton().ensureGPUProcessConnection().removeClient(*this);
 
-    switch (m_deviceType) {
+    switch (m_device.type()) {
     case CaptureDevice::DeviceType::Microphone:
 #if PLATFORM(IOS_FAMILY)
    …
         return *this;
 
-    auto cloneSource = adoptRef(*new RemoteRealtimeMediaSource(identifier, deviceType(), String { m_settings.label().string() }, deviceIDHashSalt(), m_manager, m_shouldCaptureInGPUProcess));
+    auto cloneSource = adoptRef(*new RemoteRealtimeMediaSource(identifier, m_device, &m_constraints, String { m_settings.label().string() }, deviceIDHashSalt(), m_manager, m_shouldCaptureInGPUProcess));
     cloneSource->setSettings(RealtimeMediaSourceSettings { m_settings });
     m_manager.addSource(cloneSource.copyRef());
    …
     m_manager.didUpdateSourceConnection(*this);
-    createRemoteMediaSource(m_device, m_constraints);
+    createRemoteMediaSource();
     // FIXME: We should update the track according to current settings.
     if (isProducingData())
trunk/Source/WebKit/WebProcess/cocoa/RemoteRealtimeMediaSource.h
r272381 → r272434:

 public:
-    static Ref<WebCore::RealtimeMediaSource> create(const WebCore::CaptureDevice&, const WebCore::MediaConstraints&, String&& name, String&& hashSalt, UserMediaCaptureManager&, bool shouldCaptureInGPUProcess);
+    static Ref<WebCore::RealtimeMediaSource> create(const WebCore::CaptureDevice&, const WebCore::MediaConstraints*, String&& name, String&& hashSalt, UserMediaCaptureManager&, bool shouldCaptureInGPUProcess);
     ~RemoteRealtimeMediaSource();
    …
 private:
-    RemoteRealtimeMediaSource(WebCore::RealtimeMediaSourceIdentifier, WebCore::CaptureDevice::DeviceType, String&& name, String&& hashSalt, UserMediaCaptureManager&, bool shouldCaptureInGPUProcess);
+    RemoteRealtimeMediaSource(WebCore::RealtimeMediaSourceIdentifier, const WebCore::CaptureDevice&, const WebCore::MediaConstraints*, String&& name, String&& hashSalt, UserMediaCaptureManager&, bool shouldCaptureInGPUProcess);
 
     // RealtimeMediaSource
    …
     const WebCore::RealtimeMediaSourceCapabilities& capabilities() final;
     void whenReady(CompletionHandler<void(String)>&&) final;
-    WebCore::CaptureDevice::DeviceType deviceType() const final { return m_deviceType; }
+    WebCore::CaptureDevice::DeviceType deviceType() const final { return m_device.type(); }
     Ref<RealtimeMediaSource> clone() final;
    …
-    void createRemoteMediaSource(const WebCore::CaptureDevice&, const WebCore::MediaConstraints&);
+    void createRemoteMediaSource();
     void didFail(String&& errorMessage);
     void setAsReady();
    …
     WebCore::RealtimeMediaSourceSettings m_settings;
 
+    WebCore::CaptureDevice m_device;
+    WebCore::MediaConstraints m_constraints;
+
     std::unique_ptr<WebCore::ImageTransferSessionVT> m_imageTransferSession;
-    WebCore::CaptureDevice::DeviceType m_deviceType { WebCore::CaptureDevice::DeviceType::Unknown };
 
     Deque<ApplyConstraintsHandler> m_pendingApplyConstraintsCallbacks;
    …
     String m_errorMessage;
     CompletionHandler<void(String)> m_callback;
-    WebCore::CaptureDevice m_device;
-    WebCore::MediaConstraints m_constraints;
 };
trunk/Source/WebKit/WebProcess/cocoa/UserMediaCaptureManager.cpp
r272205 → r272434:

 CaptureSourceOrError UserMediaCaptureManager::AudioFactory::createAudioCaptureSource(const CaptureDevice& device, String&& hashSalt, const MediaConstraints* constraints)
 {
-    if (!constraints)
-        return { };
-
 #if !ENABLE(GPU_PROCESS)
     if (m_shouldCaptureInGPUProcess)
    …
 #endif
 
-    return RemoteRealtimeMediaSource::create(device, *constraints, { }, WTFMove(hashSalt), m_manager, m_shouldCaptureInGPUProcess);
+    return RemoteRealtimeMediaSource::create(device, constraints, { }, WTFMove(hashSalt), m_manager, m_shouldCaptureInGPUProcess);
 }
    …
 CaptureSourceOrError UserMediaCaptureManager::VideoFactory::createVideoCaptureSource(const CaptureDevice& device, String&& hashSalt, const MediaConstraints* constraints)
 {
-    if (!constraints)
-        return { };
-
 #if !ENABLE(GPU_PROCESS)
     if (m_shouldCaptureInGPUProcess)
    …
 #endif
 
-    return RemoteRealtimeMediaSource::create(device, *constraints, { }, WTFMove(hashSalt), m_manager, m_shouldCaptureInGPUProcess);
+    return RemoteRealtimeMediaSource::create(device, constraints, { }, WTFMove(hashSalt), m_manager, m_shouldCaptureInGPUProcess);
 }
    …
 CaptureSourceOrError UserMediaCaptureManager::DisplayFactory::createDisplayCaptureSource(const CaptureDevice& device, const MediaConstraints* constraints)
 {
-    if (!constraints)
-        return { };
-
-    return RemoteRealtimeMediaSource::create(device, *constraints, { }, { }, m_manager, false);
+    return RemoteRealtimeMediaSource::create(device, constraints, { }, { }, m_manager, false);
 }