Changeset 189913 in webkit

- Timestamp: Sep 17, 2015, 8:33:29 AM
- Location: trunk/Source/WebCore
- Files: 10 edited
trunk/Source/WebCore/ChangeLog (r189911 → r189913)

2015-09-17  Eric Carlson  <eric.carlson@apple.com>

        [Mac MediaStream] Cleanup capture source classes
        https://bugs.webkit.org/show_bug.cgi?id=149233

        Reviewed by Jer Noble.

        * platform/cf/CoreMediaSoftLink.cpp: Soft-link CMAudioFormatDescriptionGetStreamBasicDescription,
        CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer, and CMSampleBufferGetNumSamples.
        * platform/cf/CoreMediaSoftLink.h:

        * platform/mediastream/mac/AVAudioCaptureSource.h:
        (WebCore::AVAudioCaptureSource::Observer::~Observer):
        * platform/mediastream/mac/AVAudioCaptureSource.mm:
        (WebCore::AVAudioCaptureSource::AVAudioCaptureSource): Initialize m_inputDescription.
        (WebCore::AVAudioCaptureSource::capabilities): 0 -> nullptr.
        (WebCore::AVAudioCaptureSource::addObserver): New; add an observer and tell it to prepare.
        (WebCore::AVAudioCaptureSource::removeObserver): New.
        (WebCore::operator==): Compare AudioStreamBasicDescription.
        (WebCore::operator!=):
        (WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection): Call
        observer->prepare() when passed a new stream description, then call observer->process().

        * platform/mediastream/mac/AVCaptureDeviceManager.mm:
        (WebCore::refreshCaptureDeviceList): Set m_groupID and m_localizedName.
        (WebCore::AVCaptureDeviceManager::sessionSupportsConstraint): Invalid constraint names should
        be ignored, so return true when passed one.
        (WebCore::AVCaptureDeviceManager::getSourcesInfo): This just didn't work; fix it.
        (WebCore::AVCaptureDeviceManager::verifyConstraintsForMediaType): Optional constraints are
        optional, so they don't need to be validated.
        (WebCore::AVCaptureDeviceManager::bestSourcesForTypeAndConstraints): m_audioSource -> m_audioAVMediaCaptureSource,
        m_videoSource -> m_videoAVMediaCaptureSource.
        (WebCore::AVCaptureDeviceManager::sourceWithUID): Ditto.

        * platform/mediastream/mac/AVMediaCaptureSource.h:
        (WebCore::AVMediaCaptureSource::session):
        (WebCore::AVMediaCaptureSource::device):
        (WebCore::AVMediaCaptureSource::currentStates):
        (WebCore::AVMediaCaptureSource::constraints):
        (WebCore::AVMediaCaptureSource::statesDidChanged):
        (WebCore::AVMediaCaptureSource::createWeakPtr):
        (WebCore::AVMediaCaptureSource::buffer): Deleted.
        (WebCore::AVMediaCaptureSource::setBuffer): Deleted.
        * platform/mediastream/mac/AVMediaCaptureSource.mm:
        (WebCore::AVMediaCaptureSource::AVMediaCaptureSource): Initialize m_weakPtrFactory.
        (WebCore::AVMediaCaptureSource::scheduleDeferredTask): New; call a function asynchronously on
        the main thread.
        (-[WebCoreAVMediaCaptureSourceObserver captureOutput:didOutputSampleBuffer:fromConnection:]): Don't
        dispatch calls to the main thread; let the derived classes do that if necessary.

        * platform/mediastream/mac/AVVideoCaptureSource.h:
        (WebCore::AVVideoCaptureSource::width):
        (WebCore::AVVideoCaptureSource::height):
        (WebCore::AVVideoCaptureSource::previewLayer):
        (WebCore::AVVideoCaptureSource::currentFrameSampleBuffer):
        * platform/mediastream/mac/AVVideoCaptureSource.mm:
        (WebCore::AVVideoCaptureSource::setFrameRateConstraint): Remove unwanted logging.
        (WebCore::AVVideoCaptureSource::setupCaptureSession): Configure the AVCaptureVideoDataOutput so
        it delivers 32-bit BGRA samples.
        (WebCore::AVVideoCaptureSource::calculateFramerate): Return a bool signaling whether the frame
        rate changed.
        (WebCore::AVVideoCaptureSource::processNewFrame): New. Process the sample buffer, invalidate the
        cached image, and signal when characteristics change.
        (WebCore::AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection): Schedule a
        call to processNewFrame on the main thread so all video processing happens on the main thread.
        (WebCore::AVVideoCaptureSource::currentFrameImage): Create and return a CGImage of the current
        video frame.
        (WebCore::AVVideoCaptureSource::paintCurrentFrameInContext): Draw the current frame into a context.
trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp (r187987 → r189913)

  SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTimebaseNotification_EffectiveRateChanged, CFStringRef)
  SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTimebaseNotification_TimeJumped, CFStringRef)
+ SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMAudioFormatDescriptionGetStreamBasicDescription, const AudioStreamBasicDescription *, (CMAudioFormatDescriptionRef desc), (desc))
+ SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer, OSStatus, (CMSampleBufferRef sbuf, size_t *bufferListSizeNeededOut, AudioBufferList *bufferListOut, size_t bufferListSize, CFAllocatorRef bbufStructAllocator, CFAllocatorRef bbufMemoryAllocator, uint32_t flags, CMBlockBufferRef *blockBufferOut), (sbuf, bufferListSizeNeededOut, bufferListOut, bufferListSize, bbufStructAllocator, bbufMemoryAllocator, flags, blockBufferOut))
+ SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetNumSamples, CMItemCount, (CMSampleBufferRef sbuf), (sbuf))
  #endif // PLATFORM(COCOA)
trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h (r187987 → r189913)

  SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreMedia, kCMTimebaseNotification_TimeJumped, CFStringRef)
  #define kCMTimebaseNotification_TimeJumped get_CoreMedia_kCMTimebaseNotification_TimeJumped()
+ SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMAudioFormatDescriptionGetStreamBasicDescription, const AudioStreamBasicDescription *, (CMAudioFormatDescriptionRef desc), (desc))
+ #define CMAudioFormatDescriptionGetStreamBasicDescription softLink_CoreMedia_CMAudioFormatDescriptionGetStreamBasicDescription
+ SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer, OSStatus, (CMSampleBufferRef sbuf, size_t *bufferListSizeNeededOut, AudioBufferList *bufferListOut, size_t bufferListSize, CFAllocatorRef bbufStructAllocator, CFAllocatorRef bbufMemoryAllocator, uint32_t flags, CMBlockBufferRef *blockBufferOut), (sbuf, bufferListSizeNeededOut, bufferListOut, bufferListSize, bbufStructAllocator, bbufMemoryAllocator, flags, blockBufferOut))
+ #define CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer softLink_CoreMedia_CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer
+ SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetNumSamples, CMItemCount, (CMSampleBufferRef sbuf), (sbuf))
+ #define CMSampleBufferGetNumSamples softLink_CoreMedia_CMSampleBufferGetNumSamples

  #endif // PLATFORM(COCOA)
trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h (r181152 → r189913)

- * Copyright (C) 2013 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
…
  #include "AVMediaCaptureSource.h"
+ #include <wtf/Lock.h>

+ typedef struct AudioStreamBasicDescription AudioStreamBasicDescription;
  typedef const struct opaqueCMFormatDescription *CMFormatDescriptionRef;

  namespace WebCore {

  class AVAudioCaptureSource : public AVMediaCaptureSource {
  public:
+     class Observer {
+     public:
+         virtual ~Observer() { }
+         virtual void prepare(const AudioStreamBasicDescription *) = 0;
+         virtual void unprepare() = 0;
+         virtual void process(CMFormatDescriptionRef, CMSampleBufferRef) = 0;
+     };
+
      static RefPtr<AVMediaCaptureSource> create(AVCaptureDevice*, const AtomicString&, PassRefPtr<MediaConstraints>);

+     void addObserver(Observer*);
+     void removeObserver(Observer*);
+
  private:
      AVAudioCaptureSource(AVCaptureDevice*, const AtomicString&, PassRefPtr<MediaConstraints>);
      virtual ~AVAudioCaptureSource();

-     virtual RefPtr<RealtimeMediaSourceCapabilities> capabilities() const override;
-     virtual void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) override;
+     RefPtr<RealtimeMediaSourceCapabilities> capabilities() const override;
+     void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) override;

-     virtual void setupCaptureSession() override;
-     virtual void updateStates() override;
+     void setupCaptureSession() override;
+     void updateStates() override;

      RetainPtr<AVCaptureConnection> m_audioConnection;
-     RetainPtr<CMFormatDescriptionRef> m_audioFormatDescription;
+
+     std::unique_ptr<AudioStreamBasicDescription> m_inputDescription;
+     Vector<Observer*> m_observers;
+     Lock m_lock;
  };
trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm (r186182 → r189913)

- * Copyright (C) 2013 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
…
  #import "config.h"
+ #import "AVAudioCaptureSource.h"

  #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)

- #import "AVAudioCaptureSource.h"
- #import "CoreMediaSoftLink.h"
  #import "Logging.h"
  #import "MediaConstraints.h"
…
  #import <AVFoundation/AVFoundation.h>
- #import <objc/runtime.h>
+ #import <CoreAudio/CoreAudioTypes.h>
+ #import <wtf/HashSet.h>
+
+ #import "CoreMediaSoftLink.h"
…
@@ AVAudioCaptureSource::AVAudioCaptureSource @@
      currentStates()->setSourceId(id);
      currentStates()->setSourceType(RealtimeMediaSourceStates::Microphone);
+     m_inputDescription = std::make_unique<AudioStreamBasicDescription>();
  }
…
@@ AVAudioCaptureSource::capabilities @@
      notImplemented();
-     return 0;
+     return nullptr;
  }
…
+ void AVAudioCaptureSource::addObserver(AVAudioCaptureSource::Observer* observer)
+ {
+     {
+         LockHolder lock(m_lock);
+         m_observers.append(observer);
+     }
+
+     if (m_inputDescription->mSampleRate)
+         observer->prepare(m_inputDescription.get());
+ }
+
+ void AVAudioCaptureSource::removeObserver(AVAudioCaptureSource::Observer* observer)
+ {
+     LockHolder lock(m_lock);
+     size_t pos = m_observers.find(observer);
+     if (pos != notFound)
+         m_observers.remove(pos);
+ }
…
+ static bool operator==(const AudioStreamBasicDescription& a, const AudioStreamBasicDescription& b)
+ {
+     return a.mSampleRate == b.mSampleRate
+         && a.mFormatID == b.mFormatID
+         && a.mFormatFlags == b.mFormatFlags
+         && a.mBytesPerPacket == b.mBytesPerPacket
+         && a.mFramesPerPacket == b.mFramesPerPacket
+         && a.mBytesPerFrame == b.mBytesPerFrame
+         && a.mChannelsPerFrame == b.mChannelsPerFrame
+         && a.mBitsPerChannel == b.mBitsPerChannel;
+ }
+
+ static bool operator!=(const AudioStreamBasicDescription& a, const AudioStreamBasicDescription& b)
+ {
+     return !(a == b);
+ }
+
  void AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
  {
+     Vector<Observer*> observers;
+     {
+         LockHolder lock(m_lock);
+         if (m_observers.isEmpty())
+             return;
+
+         copyToVector(m_observers, observers);
+     }
+
      CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
      if (!formatDescription)
          return;

-     CFRetain(formatDescription);
-     m_audioFormatDescription = adoptCF(formatDescription);
+     const AudioStreamBasicDescription* streamDescription = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription);
+     if (*m_inputDescription != *streamDescription) {
+         m_inputDescription = std::make_unique<AudioStreamBasicDescription>(*streamDescription);
+         for (auto& observer : observers)
+             observer->prepare(m_inputDescription.get());
+     }
+
+     for (auto& observer : observers)
+         observer->process(formatDescription, sampleBuffer);
  }
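The audio delivery path above follows a common pattern: observers are registered under a lock, but notified from a snapshot copied outside the lock (via copyToVector), so a callback can safely add or remove observers without deadlocking. A minimal standalone sketch of that pattern, using std::mutex and std::vector in place of WTF's Lock and Vector (all names here are illustrative, not WebKit's):

```cpp
#include <algorithm>
#include <mutex>
#include <vector>

struct Observer {
    virtual ~Observer() = default;
    virtual void process(int sample) = 0;
};

// Trivial observer used for demonstration: counts processed samples.
struct CountingObserver : Observer {
    int processed = 0;
    void process(int) override { ++processed; }
};

class ObserverList {
public:
    void addObserver(Observer* observer)
    {
        std::lock_guard<std::mutex> lock(m_lock);
        m_observers.push_back(observer);
    }

    void removeObserver(Observer* observer)
    {
        std::lock_guard<std::mutex> lock(m_lock);
        m_observers.erase(std::remove(m_observers.begin(), m_observers.end(), observer), m_observers.end());
    }

    void notify(int sample)
    {
        std::vector<Observer*> snapshot;
        {
            // Copy the list under the lock...
            std::lock_guard<std::mutex> lock(m_lock);
            if (m_observers.empty())
                return;
            snapshot = m_observers;
        }
        // ...but call out to observers without holding it, so a callback
        // may re-enter addObserver()/removeObserver() without deadlock.
        for (auto* observer : snapshot)
            observer->process(sample);
    }

private:
    std::mutex m_lock;
    std::vector<Observer*> m_observers;
};
```

The trade-off is that an observer removed concurrently with a notification may still receive one final callback from the snapshot, which is why the patch also guards against an empty list before doing any work.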
trunk/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm (r187282 → r189913)

  #import "config.h"
+ #import "AVCaptureDeviceManager.h"

  #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)

- #import "AVCaptureDeviceManager.h"
  #import "AVAudioCaptureSource.h"
  #import "AVMediaCaptureSource.h"
  #import "AVVideoCaptureSource.h"
+ #import "AudioSourceProvider.h"
  #import "Logging.h"
…
      String m_captureDeviceID;
+     String m_localizedName;
+     String m_groupID;

      String m_audioSourceId;
-     RefPtr<AVMediaCaptureSource> m_audioSource;
+     RefPtr<AVMediaCaptureSource> m_audioAVMediaCaptureSource;

      String m_videoSourceId;
-     RefPtr<AVMediaCaptureSource> m_videoSource;
+     RefPtr<AVMediaCaptureSource> m_videoAVMediaCaptureSource;

      bool m_enabled;
…
@@ refreshCaptureDeviceList @@
      if ([device hasMediaType:AVMediaTypeVideo] || [device hasMediaType:AVMediaTypeMuxed])
          source.m_videoSourceId = createCanonicalUUIDString();
+
+     source.m_groupID = createCanonicalUUIDString();
+     source.m_localizedName = device.localizedName;

      devices.append(source);
…
@@ AVCaptureDeviceManager::sessionSupportsConstraint @@
      size_t constraint = validConstraintNames().find(name);
      if (constraint == notFound)
-         return false;
+         return true;

      switch (constraint) {
…
@@ AVCaptureDeviceManager::getSourcesInfo @@
      Vector<CaptureDevice>& devices = captureDeviceList();
-     size_t count = devices.size();
-     for (size_t i = 0; i < count; ++i) {
-         AVCaptureDeviceType *device = [AVCaptureDevice deviceWithUniqueID:devices[i].m_captureDeviceID];
-         ASSERT(device);
-
-         if (!devices[i].m_enabled)
-             continue;
-         // FIXME: Change groupID from localizedName to something more meaningful
-         if (devices[i].m_videoSource)
-             sourcesInfo.append(TrackSourceInfo::create(devices[i].m_videoSourceId, TrackSourceInfo::Video, device.localizedName, device.localizedName, devices[i].m_captureDeviceID));
-         if (devices[i].m_audioSource)
-             sourcesInfo.append(TrackSourceInfo::create(devices[i].m_audioSourceId, TrackSourceInfo::Audio, device.localizedName, device.localizedName, devices[i].m_captureDeviceID));
-     }
+     for (auto captureDevice : devices) {
+         if (!captureDevice.m_enabled)
+             continue;
+
+         if (!captureDevice.m_videoSourceId.isEmpty())
+             sourcesInfo.append(TrackSourceInfo::create(captureDevice.m_videoSourceId, TrackSourceInfo::Video, captureDevice.m_localizedName, captureDevice.m_groupID, captureDevice.m_captureDeviceID));
+         if (!captureDevice.m_audioSourceId.isEmpty())
+             sourcesInfo.append(TrackSourceInfo::create(captureDevice.m_audioSourceId, TrackSourceInfo::Audio, captureDevice.m_localizedName, captureDevice.m_groupID, captureDevice.m_captureDeviceID));
+     }

      LOG(Media, "AVCaptureDeviceManager::getSourcesInfo(%p), found %d active devices", this, sourcesInfo.size());
…
@@ AVCaptureDeviceManager::verifyConstraintsForMediaType @@
      constraints->getMandatoryConstraints(mandatoryConstraints);
      if (mandatoryConstraints.size()) {
+         // FIXME: this method should take an AVCaptureDevice and use its AVCaptureSession instead of creating a new one.
          RetainPtr<AVCaptureSessionType> session = adoptNS([allocAVCaptureSessionInstance() init]);
          for (size_t i = 0; i < mandatoryConstraints.size(); ++i) {
…
      }
-
-     Vector<MediaConstraint> optionalConstraints;
-     constraints->getOptionalConstraints(optionalConstraints);
-     if (!optionalConstraints.size())
-         return true;
-
-     for (size_t i = 0; i < optionalConstraints.size(); ++i) {
-         const MediaConstraint& constraint = optionalConstraints[i];
-         if (!isValidConstraint(type, constraint.m_name)) {
-             invalidConstraint = constraint.m_name;
-             return false;
-         }
-     }

      return true;
…
@@ AVCaptureDeviceManager::bestSourcesForTypeAndConstraints @@
-     for (auto& captureDevice : captureDeviceList()) {
+     Vector<CaptureDevice>& devices = captureDeviceList();
+
+     for (auto& captureDevice : devices) {
          if (!captureDevice.m_enabled)
              continue;
…
          if (type == RealtimeMediaSource::Audio && !captureDevice.m_audioSourceId.isEmpty()) {
-             if (!captureDevice.m_audioSource) {
+             if (!captureDevice.m_audioAVMediaCaptureSource) {
                  AVCaptureDeviceType *device = [AVCaptureDevice deviceWithUniqueID:captureDevice.m_captureDeviceID];
                  ASSERT(device);
-                 captureDevice.m_audioSource = AVAudioCaptureSource::create(device, captureDevice.m_audioSourceId, constraints);
+                 captureDevice.m_audioAVMediaCaptureSource = AVAudioCaptureSource::create(device, captureDevice.m_audioSourceId, constraints);
              }
-             bestSourcesList.append(captureDevice.m_audioSource);
+             bestSourcesList.append(captureDevice.m_audioAVMediaCaptureSource);
          }

          if (type == RealtimeMediaSource::Video && !captureDevice.m_videoSourceId.isEmpty()) {
-             if (!captureDevice.m_videoSource) {
+             if (!captureDevice.m_videoAVMediaCaptureSource) {
                  AVCaptureDeviceType *device = [AVCaptureDevice deviceWithUniqueID:captureDevice.m_captureDeviceID];
                  ASSERT(device);
-                 captureDevice.m_videoSource = AVVideoCaptureSource::create(device, captureDevice.m_videoSourceId, constraints);
+                 captureDevice.m_videoAVMediaCaptureSource = AVVideoCaptureSource::create(device, captureDevice.m_videoSourceId, constraints);
              }
-             bestSourcesList.append(captureDevice.m_videoSource);
+             bestSourcesList.append(captureDevice.m_videoAVMediaCaptureSource);
          }
      }
…
@@ AVCaptureDeviceManager::sourceWithUID @@
      ASSERT(device);
      if (type == RealtimeMediaSource::Type::Audio && !captureDevice.m_audioSourceId.isEmpty()) {
-         captureDevice.m_audioSource = AVAudioCaptureSource::create(device, captureDevice.m_audioSourceId, constraints);
-         return captureDevice.m_audioSource;
+         if (!captureDevice.m_audioAVMediaCaptureSource)
+             captureDevice.m_audioAVMediaCaptureSource = AVAudioCaptureSource::create(device, captureDevice.m_audioSourceId, constraints);
+         return captureDevice.m_audioAVMediaCaptureSource;
      }
      if (type == RealtimeMediaSource::Type::Video && !captureDevice.m_videoSourceId.isEmpty()) {
-         captureDevice.m_videoSource = AVVideoCaptureSource::create(device, captureDevice.m_videoSourceId, constraints);
-         return captureDevice.m_videoSource;
+         if (!captureDevice.m_videoAVMediaCaptureSource)
+             captureDevice.m_videoAVMediaCaptureSource = AVVideoCaptureSource::create(device, captureDevice.m_videoSourceId, constraints);
+         return captureDevice.m_videoAVMediaCaptureSource;
      }
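The constraint changes in this file encode two rules: a constraint with an unrecognized name must be ignored (treated as satisfiable) rather than rejected, and optional constraints are never validated at all, since failing one must not make getUserMedia fail. A self-contained sketch of those two rules with std::string/std::set stand-ins (the constraint names and the "supported" set below are illustrative assumptions, not WebKit's actual tables):

```cpp
#include <set>
#include <string>
#include <vector>

// Illustrative set of constraint names this backend recognizes at all.
static const std::set<std::string>& validConstraintNames()
{
    static const std::set<std::string> names { "width", "height", "frameRate", "facingMode" };
    return names;
}

static bool sessionSupportsConstraint(const std::string& name)
{
    // Rule 1: unknown constraint names must be ignored, so report them
    // as supported instead of failing.
    if (!validConstraintNames().count(name))
        return true;

    // For known names, check against what the (hypothetical) session can
    // actually satisfy; here only width/height are "supported".
    static const std::set<std::string> supported { "width", "height" };
    return supported.count(name) > 0;
}

static bool verifyConstraints(const std::vector<std::string>& mandatory,
    const std::vector<std::string>& optional, std::string& invalidConstraint)
{
    // Only mandatory constraints can cause rejection.
    for (const auto& name : mandatory) {
        if (!sessionSupportsConstraint(name)) {
            invalidConstraint = name;
            return false;
        }
    }

    // Rule 2: optional constraints are optional, so they are not validated.
    (void)optional;
    return true;
}
```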
trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.h (r187898 → r189913)

- * Copyright (C) 2013 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
…
+ #include "GenericTaskQueue.h"
  #include "RealtimeMediaSource.h"
+ #include "Timer.h"
  #include <wtf/RetainPtr.h>
+ #include <wtf/WeakPtr.h>

  OBJC_CLASS AVCaptureAudioDataOutput;
…
      AVCaptureSession *session() const { return m_session.get(); }

+     void startProducingData() override;
+     void stopProducingData() override;
+
  protected:
      AVMediaCaptureSource(AVCaptureDevice*, const AtomicString&, RealtimeMediaSource::Type, PassRefPtr<MediaConstraints>);

-     virtual const RealtimeMediaSourceStates& states() override;
-
-     virtual void startProducingData() override;
-     virtual void stopProducingData() override;
+     const RealtimeMediaSourceStates& states() override;

      virtual void setupCaptureSession() = 0;
…
      RealtimeMediaSourceStates* currentStates() { return &m_currentStates; }
      MediaConstraints* constraints() { return m_constraints.get(); }
-     CMSampleBufferRef buffer() const { return m_buffer.get(); }

      void setVideoSampleBufferDelegate(AVCaptureVideoDataOutput*);
      void setAudioSampleBufferDelegate(AVCaptureAudioDataOutput*);

+     void scheduleDeferredTask(std::function<void ()>);
+
+     void statesDidChanged() { }

-     void setBuffer(CMSampleBufferRef buffer) { m_buffer = buffer; }
-
  private:
      void setupSession();
+     WeakPtr<AVMediaCaptureSource> createWeakPtr() { return m_weakPtrFactory.createWeakPtr(); }

+     WeakPtrFactory<AVMediaCaptureSource> m_weakPtrFactory;
      RetainPtr<WebCoreAVMediaCaptureSourceObserver> m_objcObserver;
      RefPtr<MediaConstraints> m_constraints;
…
      RetainPtr<AVCaptureSession> m_session;
      RetainPtr<AVCaptureDevice> m_device;
-     RetainPtr<CMSampleBufferRef> m_buffer;

      bool m_isRunning;
trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm (r186182 → r189913)

- * Copyright (C) 2013 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
…
  #import "config.h"
+ #import "AVMediaCaptureSource.h"

  #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)

- #import "AVMediaCaptureSource.h"
  #import "AudioSourceProvider.h"
  #import "Logging.h"
…
  AVMediaCaptureSource::AVMediaCaptureSource(AVCaptureDeviceType* device, const AtomicString& id, RealtimeMediaSource::Type type, PassRefPtr<MediaConstraints> constraints)
      : RealtimeMediaSource(id, type, emptyString())
+     , m_weakPtrFactory(this)
      , m_objcObserver(adoptNS([[WebCoreAVMediaCaptureSourceObserver alloc] initWithCallback:this]))
      , m_constraints(constraints)
…
      , m_isRunning(false)
  {
-     setName([device localizedName]);
+     setName(device.localizedName);
      m_currentStates.setSourceType(type == RealtimeMediaSource::Video ? RealtimeMediaSourceStates::Camera : RealtimeMediaSourceStates::Microphone);
  }
…
@@ AVMediaCaptureSource::setAudioSampleBufferDelegate @@
  {
      [audioOutput setSampleBufferDelegate:m_objcObserver.get() queue:globaAudioCaptureSerialQueue()];
  }
+
+ void AVMediaCaptureSource::scheduleDeferredTask(std::function<void ()> function)
+ {
+     ASSERT(function);
+
+     auto weakThis = createWeakPtr();
+     callOnMainThread([weakThis, function] {
+         if (!weakThis)
+             return;
+
+         function();
+     });
+ }
…
@@ -[WebCoreAVMediaCaptureSourceObserver captureOutput:didOutputSampleBuffer:fromConnection:] @@
          return;

-     CFRetain(sampleBuffer);
-     dispatch_async(dispatch_get_main_queue(), ^{
-         if (m_callback)
-             m_callback->captureOutputDidOutputSampleBufferFromConnection(captureOutput, sampleBuffer, connection);
-         CFRelease(sampleBuffer);
-     });
+     m_callback->captureOutputDidOutputSampleBufferFromConnection(captureOutput, sampleBuffer, connection);
  }
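scheduleDeferredTask() captures a weak pointer to the source before hopping to the main thread, so a task that runs after the source has been destroyed becomes a harmless no-op instead of a use-after-free. A minimal sketch of the same idea using std::weak_ptr and a single-threaded task queue standing in for callOnMainThread() (all names here are illustrative assumptions, not WebKit's):

```cpp
#include <functional>
#include <memory>
#include <queue>

// Stand-in for the main-thread run loop: tasks queue up and run later.
static std::queue<std::function<void()>>& taskQueue()
{
    static std::queue<std::function<void()>> queue;
    return queue;
}

static void drainTasks()
{
    while (!taskQueue().empty()) {
        taskQueue().front()();
        taskQueue().pop();
    }
}

class CaptureSource : public std::enable_shared_from_this<CaptureSource> {
public:
    int framesProcessed = 0;

    void scheduleDeferredTask(std::function<void(CaptureSource&)> function)
    {
        // Capture a weak reference, not `this`: if the source is destroyed
        // before the task runs, lock() fails and the task is skipped.
        std::weak_ptr<CaptureSource> weakThis = weak_from_this();
        taskQueue().push([weakThis, function] {
            if (auto strongThis = weakThis.lock())
                function(*strongThis);
        });
    }
};
```

WebKit uses WeakPtrFactory rather than shared ownership, but the safety argument is the same: the deferred task never touches an object whose lifetime has already ended.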
trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h (r187208 → r189913)

- * Copyright (C) 2013 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
…
  OBJC_CLASS AVCaptureVideoPreviewLayer;

+ typedef struct CGImage *CGImageRef;
  typedef const struct opaqueCMFormatDescription *CMFormatDescriptionRef;
+ typedef struct opaqueCMSampleBuffer *CMSampleBufferRef;

  namespace WebCore {
+
+ class FloatRect;
+ class GraphicsContext;

  class AVVideoCaptureSource : public AVMediaCaptureSource {
…
      static RefPtr<AVMediaCaptureSource> create(AVCaptureDevice*, const AtomicString&, PassRefPtr<MediaConstraints>);

-     virtual RefPtr<RealtimeMediaSourceCapabilities> capabilities() const override;
-     virtual void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) override;
-
-     virtual int32_t width() const { return m_width; }
-     virtual int32_t height() const { return m_height; }
+     int32_t width() const { return m_width; }
+     int32_t height() const { return m_height; }

      AVCaptureVideoPreviewLayer* previewLayer() { return m_videoPreviewLayer.get(); }

+     CMSampleBufferRef currentFrameSampleBuffer() const { return m_buffer.get(); }
+     void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&);
+     RetainPtr<CGImageRef> currentFrameImage();
+
  private:
      AVVideoCaptureSource(AVCaptureDevice*, const AtomicString&, PassRefPtr<MediaConstraints>);
      virtual ~AVVideoCaptureSource();

-     virtual void setupCaptureSession() override;
-     virtual void updateStates() override;
+     void setupCaptureSession() override;
+     void updateStates() override;
+
+     RefPtr<RealtimeMediaSourceCapabilities> capabilities() const override;

      bool applyConstraints(MediaConstraints*);
      bool setFrameRateConstraint(float minFrameRate, float maxFrameRate);

-     void calculateFramerate(CMSampleBufferRef);
+     bool calculateFramerate(CMSampleBufferRef);
+
+     void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) override;
+     void processNewFrame(RetainPtr<CMSampleBufferRef>);

      RetainPtr<AVCaptureConnection> m_videoConnection;
-     RetainPtr<CMFormatDescriptionRef> m_videoFormatDescription;
      RetainPtr<AVCaptureVideoPreviewLayer> m_videoPreviewLayer;
+     RetainPtr<CMSampleBufferRef> m_buffer;
+     RetainPtr<CGImageRef> m_lastImage;
      Vector<Float64> m_videoFrameTimeStamps;
      Float64 m_frameRate;
trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm (r187898 → r189913)

- * Copyright (C) 2013,2015 Apple Inc. All rights reserved.
+ * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
…
  #import "config.h"
+ #import "AVVideoCaptureSource.h"

  #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)

- #import "AVVideoCaptureSource.h"
  #import "AVCaptureDeviceManager.h"
  #import "BlockExceptions.h"
+ #import "GraphicsContextCG.h"
+ #import "IntRect.h"
  #import "Logging.h"
…
  SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
+ SOFT_LINK_FRAMEWORK_OPTIONAL(CoreVideo)

  SOFT_LINK_CLASS(AVFoundation, AVCaptureConnection)
…
  #define AVCaptureSessionPreset352x288 getAVCaptureSessionPreset352x288()
  #define AVCaptureSessionPresetLow getAVCaptureSessionPresetLow()
+
+ SOFT_LINK(CoreVideo, CVPixelBufferGetWidth, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
+ SOFT_LINK(CoreVideo, CVPixelBufferGetHeight, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
+ SOFT_LINK(CoreVideo, CVPixelBufferGetBaseAddress, void*, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
+ SOFT_LINK(CoreVideo, CVPixelBufferGetBytesPerRow, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
+ SOFT_LINK(CoreVideo, CVPixelBufferGetPixelFormatType, OSType, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
+ SOFT_LINK(CoreVideo, CVPixelBufferLockBaseAddress, CVReturn, (CVPixelBufferRef pixelBuffer, CVOptionFlags lockFlags), (pixelBuffer, lockFlags))
+ SOFT_LINK(CoreVideo, CVPixelBufferUnlockBaseAddress, CVReturn, (CVPixelBufferRef pixelBuffer, CVOptionFlags lockFlags), (pixelBuffer, lockFlags))
+
+ SOFT_LINK_POINTER(CoreVideo, kCVPixelBufferPixelFormatTypeKey, NSString *)
+ #define kCVPixelBufferPixelFormatTypeKey getkCVPixelBufferPixelFormatTypeKey()
…
@@ AVVideoCaptureSource::setFrameRateConstraint @@
      }

-     NSLog(@"set frame rate to %f", [bestFrameRateRange minFrameRate]);
      LOG(Media, "AVVideoCaptureSource::setFrameRateConstraint(%p) - set frame rate range to %f..%f", this, minFrameRate, maxFrameRate);
      return true;
…
@@ AVVideoCaptureSource::setupCaptureSession @@
      RetainPtr<AVCaptureVideoDataOutputType> videoOutput = adoptNS([allocAVCaptureVideoDataOutputInstance() init]);
+     RetainPtr<NSDictionary> settingsDictionary = adoptNS([[NSDictionary alloc] initWithObjectsAndKeys:
+         [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil]);
+     [videoOutput setVideoSettings:settingsDictionary.get()];
      setVideoSampleBufferDelegate(videoOutput.get());
      ASSERT([session() canAddOutput:videoOutput.get()]);
…
- void AVVideoCaptureSource::calculateFramerate(CMSampleBufferRef sampleBuffer)
+ bool AVVideoCaptureSource::calculateFramerate(CMSampleBufferRef sampleBuffer)
  {
      CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
      if (!CMTIME_IS_NUMERIC(sampleTime))
-         return;
+         return false;

      Float64 frameTime = CMTimeGetSeconds(sampleTime);
…
      while (m_videoFrameTimeStamps[0] < oneSecondAgo)
          m_videoFrameTimeStamps.remove(0);

+     Float64 frameRate = m_frameRate;
      m_frameRate = (m_frameRate + m_videoFrameTimeStamps.size()) / 2;
+
+     return frameRate != m_frameRate;
  }

- void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
- {
-     CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
+ void AVVideoCaptureSource::processNewFrame(RetainPtr<CMSampleBufferRef> sampleBuffer)
+ {
+     CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer.get());
      if (!formatDescription)
          return;

-     CFRetain(formatDescription);
-     m_videoFormatDescription = adoptCF(formatDescription);
-     calculateFramerate(sampleBuffer);
+     bool statesChanged = calculateFramerate(sampleBuffer.get());
+     m_buffer = sampleBuffer;
+     m_lastImage = nullptr;

      CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(formatDescription);
-     m_width = dimensions.width;
-     m_height = dimensions.height;
-
-     setBuffer(sampleBuffer);
+     if (dimensions.width != m_width || dimensions.height != m_height) {
+         m_width = dimensions.width;
+         m_height = dimensions.height;
+         statesChanged = true;
+     }
+
+     if (statesChanged)
+         this->statesDidChanged();
+ }
+
+ void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
+ {
+     RetainPtr<CMSampleBufferRef> buffer = sampleBuffer;
+
+     scheduleDeferredTask([this, buffer] {
+         this->processNewFrame(buffer);
+     });
+ }
+
+ RetainPtr<CGImageRef> AVVideoCaptureSource::currentFrameImage()
+ {
+     if (m_lastImage)
+         return m_lastImage;
+
+     if (!m_buffer)
+         return nullptr;
+
+     CVPixelBufferRef pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(m_buffer.get()));
+     ASSERT(CVPixelBufferGetPixelFormatType(pixelBuffer) == kCVPixelFormatType_32BGRA);
+
+     CVPixelBufferLockBaseAddress(pixelBuffer, 0);
+     void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
+     size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
+     size_t width = CVPixelBufferGetWidth(pixelBuffer);
+     size_t height = CVPixelBufferGetHeight(pixelBuffer);
+
+     RetainPtr<CGDataProviderRef> provider = adoptCF(CGDataProviderCreateWithData(NULL, baseAddress, bytesPerRow * height, NULL));
+     m_lastImage = adoptCF(CGImageCreate(width, height, 8, 32, bytesPerRow, deviceRGBColorSpaceRef(), kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst, provider.get(), NULL, true, kCGRenderingIntentDefault));
+
+     CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
+
+     return m_lastImage;
+ }
+
+ void AVVideoCaptureSource::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
+ {
+     if (context.paintingDisabled() || !currentFrameImage())
+         return;
+
+     GraphicsContextStateSaver stateSaver(context);
+     context.translate(rect.x(), rect.y() + rect.height());
+     context.scale(FloatSize(1, -1));
+     context.setImageInterpolationQuality(InterpolationLow);
+     IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
+     CGContextDrawImage(context.platformContext(), CGRectMake(0, 0, paintRect.width(), paintRect.height()), m_lastImage.get());
  }
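The reworked calculateFramerate() keeps a one-second sliding window of frame timestamps, folds the window size into a running average, and now reports whether the estimate changed so processNewFrame() can fire a state-change notification. A self-contained sketch of that logic with plain std types (class and method names are illustrative; the averaging formula mirrors the patch):

```cpp
#include <vector>

class FrameRateEstimator {
public:
    // Records one frame's presentation time (in seconds) and returns true
    // when the smoothed frame-rate estimate changed.
    bool addFrame(double frameTimeSeconds)
    {
        m_timestamps.push_back(frameTimeSeconds);

        // Keep only timestamps from the last second; the number remaining
        // is the instantaneous frames-per-second count.
        double oneSecondAgo = frameTimeSeconds - 1.0;
        while (m_timestamps.front() < oneSecondAgo)
            m_timestamps.erase(m_timestamps.begin());

        // Average the instantaneous count into the running estimate,
        // as the patch does: rate = (rate + windowSize) / 2.
        double previousRate = m_frameRate;
        m_frameRate = (m_frameRate + m_timestamps.size()) / 2;
        return previousRate != m_frameRate;
    }

    double frameRate() const { return m_frameRate; }

private:
    std::vector<double> m_timestamps;
    double m_frameRate { 0 };
};
```

Returning a bool instead of void is what lets the caller coalesce frame-rate and dimension changes into a single statesDidChanged() notification per frame.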