Changeset 189913 in webkit


Timestamp: Sep 17, 2015 8:33:29 AM
Author: eric.carlson@apple.com
Message:

[Mac MediaStream] Cleanup capture source classes
https://bugs.webkit.org/show_bug.cgi?id=149233

Reviewed by Jer Noble.

  • platform/cf/CoreMediaSoftLink.cpp: Soft-link CMAudioFormatDescriptionGetStreamBasicDescription, CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer, and CMSampleBufferGetNumSamples.
  • platform/cf/CoreMediaSoftLink.h:
  • platform/mediastream/mac/AVAudioCaptureSource.h:

(WebCore::AVAudioCaptureSource::Observer::~Observer):

  • platform/mediastream/mac/AVAudioCaptureSource.mm:

(WebCore::AVAudioCaptureSource::AVAudioCaptureSource): Initialize m_inputDescription.
(WebCore::AVAudioCaptureSource::capabilities): 0 -> nullptr.
(WebCore::AVAudioCaptureSource::addObserver): New, add an observer and tell it to prepare.
(WebCore::AVAudioCaptureSource::removeObserver): New.
(WebCore::operator==): Compare AudioStreamBasicDescription.
(WebCore::operator!=):
(WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection): Call observer->prepare when passed a new stream description, call observer->process.
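The audio observer pattern above can be sketched stand-alone. This is a minimal illustration, not WebKit code: `StreamDescription`, `AudioSource`, and the `int` sample-buffer stand-in replace CoreAudio's AudioStreamBasicDescription (which has more fields: mFormatFlags, mBytesPerPacket, mFramesPerPacket, mBytesPerFrame), AVAudioCaptureSource, and CMSampleBufferRef, and std::mutex replaces WTF::Lock. It shows the shape the patch adopts: snapshot the observer list under a lock, compare stream descriptions field-by-field, and call prepare only when the format changes.

```cpp
#include <cassert>
#include <mutex>
#include <vector>

// Simplified stand-in for CoreAudio's AudioStreamBasicDescription; the real
// struct has no operator==, so it must be compared field-by-field.
struct StreamDescription {
    double mSampleRate { 0 };
    unsigned mFormatID { 0 };
    unsigned mChannelsPerFrame { 0 };
    unsigned mBitsPerChannel { 0 };
};

static bool operator==(const StreamDescription& a, const StreamDescription& b)
{
    return a.mSampleRate == b.mSampleRate
        && a.mFormatID == b.mFormatID
        && a.mChannelsPerFrame == b.mChannelsPerFrame
        && a.mBitsPerChannel == b.mBitsPerChannel;
}

static bool operator!=(const StreamDescription& a, const StreamDescription& b)
{
    return !(a == b);
}

struct Observer {
    virtual ~Observer() = default;
    virtual void prepare(const StreamDescription*) = 0;
    virtual void process(int sampleBuffer) = 0; // int stands in for CMSampleBufferRef
};

class AudioSource {
public:
    void addObserver(Observer* observer)
    {
        {
            std::lock_guard<std::mutex> lock(m_lock);
            m_observers.push_back(observer);
        }
        // A late-joining observer learns the current format immediately.
        if (m_inputDescription.mSampleRate)
            observer->prepare(&m_inputDescription);
    }

    // Called on the capture thread: snapshot the observers under the lock,
    // then call out with the lock released so observers can add/remove freely.
    void didOutputSampleBuffer(const StreamDescription& format, int sampleBuffer)
    {
        std::vector<Observer*> observers;
        {
            std::lock_guard<std::mutex> lock(m_lock);
            if (m_observers.empty())
                return;
            observers = m_observers;
        }

        if (m_inputDescription != format) {
            m_inputDescription = format;
            for (auto* observer : observers)
                observer->prepare(&m_inputDescription); // only on format change
        }

        for (auto* observer : observers)
            observer->process(sampleBuffer); // every buffer
    }

private:
    std::mutex m_lock;
    std::vector<Observer*> m_observers;
    StreamDescription m_inputDescription;
};
```

Calling out to observers from a copied list avoids holding the lock across virtual calls, so an observer's prepare/process may itself call addObserver or removeObserver without deadlocking.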

  • platform/mediastream/mac/AVCaptureDeviceManager.mm:

(WebCore::refreshCaptureDeviceList): Set m_groupID and m_localizedName.
(WebCore::AVCaptureDeviceManager::sessionSupportsConstraint): Invalid constraint names should be ignored, so return true when passed one.

(WebCore::AVCaptureDeviceManager::getSourcesInfo): This just didn't work, fix it: report devices by their stored localized name and group ID, and check the source IDs instead of the source objects.
(WebCore::AVCaptureDeviceManager::verifyConstraintsForMediaType): Optional constraints are optional, so they don't need to be validated.

(WebCore::AVCaptureDeviceManager::bestSourcesForTypeAndConstraints): m_audioSource -> m_audioAVMediaCaptureSource, m_videoSource -> m_videoAVMediaCaptureSource.

(WebCore::AVCaptureDeviceManager::sourceWithUID): Ditto.
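The constraint-handling rules above (ignore unknown names, never reject on optional constraints) can be sketched as follows. `MediaConstraint`, `validConstraintNames`, and the `sessionSupports` callback are simplified stand-ins for WebKit's types, not the real API; the point is the control flow the patch establishes.

```cpp
#include <algorithm>
#include <cassert>
#include <string>
#include <vector>

// Hypothetical constraint model; WebKit's MediaConstraint carries more state.
struct MediaConstraint {
    std::string m_name;
    std::string m_value;
};

static const std::vector<std::string>& validConstraintNames()
{
    static const std::vector<std::string> names { "width", "height", "frameRate", "facingMode" };
    return names;
}

static bool isValidConstraint(const std::string& name)
{
    auto& names = validConstraintNames();
    return std::find(names.begin(), names.end(), name) != names.end();
}

// Unknown mandatory constraint names are ignored (treated as satisfiable),
// and optional constraints never cause failure: only a known mandatory
// constraint the session cannot satisfy is a hard error.
static bool verifyConstraints(const std::vector<MediaConstraint>& mandatory,
    const std::vector<MediaConstraint>& optional,
    bool (*sessionSupports)(const MediaConstraint&),
    std::string& invalidConstraint)
{
    for (auto& constraint : mandatory) {
        if (!isValidConstraint(constraint.m_name))
            continue; // unknown names are ignored, not rejected
        if (!sessionSupports(constraint)) {
            invalidConstraint = constraint.m_name;
            return false;
        }
    }
    (void)optional; // optional constraints are optional: nothing to validate
    return true;
}
```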

  • platform/mediastream/mac/AVMediaCaptureSource.h:

(WebCore::AVMediaCaptureSource::session):
(WebCore::AVMediaCaptureSource::device):
(WebCore::AVMediaCaptureSource::currentStates):
(WebCore::AVMediaCaptureSource::constraints):
(WebCore::AVMediaCaptureSource::statesDidChanged):
(WebCore::AVMediaCaptureSource::createWeakPtr):
(WebCore::AVMediaCaptureSource::buffer): Deleted.
(WebCore::AVMediaCaptureSource::setBuffer): Deleted.

  • platform/mediastream/mac/AVMediaCaptureSource.mm:

(WebCore::AVMediaCaptureSource::AVMediaCaptureSource): Initialize m_weakPtrFactory.
(WebCore::AVMediaCaptureSource::scheduleDeferredTask): New, call a function asynchronously on the main thread.

(-[WebCoreAVMediaCaptureSourceObserver captureOutput:didOutputSampleBuffer:fromConnection:]): Don't dispatch calls to the main thread; let the derived classes do that if necessary.
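The scheduleDeferredTask pattern can be illustrated with std::weak_ptr standing in for WTF::WeakPtr/WeakPtrFactory and a simple task queue standing in for callOnMainThread; both substitutions are assumptions of this sketch. The key property is that a task scheduled by a source that has since been destroyed is silently dropped instead of touching a dead object.

```cpp
#include <cassert>
#include <functional>
#include <memory>
#include <queue>

// Stand-in for WebKit's callOnMainThread(): tasks queued here are drained
// later, "on the main thread".
static std::queue<std::function<void()>>& mainThreadQueue()
{
    static std::queue<std::function<void()>> queue;
    return queue;
}

class CaptureSource : public std::enable_shared_from_this<CaptureSource> {
public:
    int processedFrames { 0 };

    // Run `function` asynchronously on the main thread, but only if this
    // source is still alive when the task finally runs.
    void scheduleDeferredTask(std::function<void()> function)
    {
        std::weak_ptr<CaptureSource> weakThis = weak_from_this();
        mainThreadQueue().push([weakThis, function] {
            if (auto strongThis = weakThis.lock())
                function(); // source still alive: run the deferred work
            // otherwise the task is dropped, avoiding a use-after-free
        });
    }

    void didReceiveFrame() // e.g. called on the capture thread
    {
        scheduleDeferredTask([this] { ++processedFrames; });
    }
};

static void drainMainThread()
{
    auto& queue = mainThreadQueue();
    while (!queue.empty()) {
        queue.front()();
        queue.pop();
    }
}
```

Usage: create the source with std::make_shared so weak_from_this() is valid, schedule work from any thread, and drain the queue on the main thread; destroying the source between scheduling and draining is safe.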

  • platform/mediastream/mac/AVVideoCaptureSource.h:

(WebCore::AVVideoCaptureSource::width):
(WebCore::AVVideoCaptureSource::height):
(WebCore::AVVideoCaptureSource::previewLayer):
(WebCore::AVVideoCaptureSource::currentFrameSampleBuffer):

  • platform/mediastream/mac/AVVideoCaptureSource.mm:

(WebCore::AVVideoCaptureSource::setFrameRateConstraint): Remove unwanted logging.
(WebCore::AVVideoCaptureSource::setupCaptureSession): Configure the AVCaptureVideoDataOutput so it delivers 32-bit BGRA samples.

(WebCore::AVVideoCaptureSource::calculateFramerate): Return bool to signal whether the frame rate changed.

(WebCore::AVVideoCaptureSource::processNewFrame): New. Process the sample buffer, invalidate the cached image, signal when characteristics change.

(WebCore::AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection): Schedule a call to processNewFrame on the main thread so all video processing happens on the main thread.

(WebCore::AVVideoCaptureSource::currentFrameImage): Create and return a CGImage of the current video frame.

(WebCore::AVVideoCaptureSource::paintCurrentFrameInContext): Draw the current frame to a context.
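The timestamp-based frame-rate estimation that calculateFramerate performs can be sketched stand-alone. The class name, the two-second window, and the plain-double timestamps are illustrative assumptions of this sketch; the real code derives timestamps from CMSampleBufferGetPresentationTimeStamp and, per the patch, returns a bool signaling whether the rate changed.

```cpp
#include <cassert>
#include <vector>

// Estimate frame rate from a rolling window of presentation timestamps and
// report whether the derived rate changed since the last sample.
class FramerateEstimator {
public:
    // Returns true when the observed frame rate changed.
    bool addTimestamp(double seconds)
    {
        m_timestamps.push_back(seconds);

        // Keep only timestamps from (roughly) the last two seconds.
        while (m_timestamps.size() > 1 && seconds - m_timestamps.front() > 2.0)
            m_timestamps.erase(m_timestamps.begin());

        double frameRate = m_frameRate;
        if (m_timestamps.size() > 1) {
            double interval = m_timestamps.back() - m_timestamps.front();
            frameRate = (m_timestamps.size() - 1) / interval;
        }

        bool changed = frameRate != m_frameRate;
        m_frameRate = frameRate;
        return changed;
    }

    double frameRate() const { return m_frameRate; }

private:
    std::vector<double> m_timestamps;
    double m_frameRate { 0 };
};
```

Returning "did it change" lets the caller fire a characteristics-changed notification only when needed, which is why processNewFrame can cheaply run on every frame.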

Location: trunk/Source/WebCore
Files: 10 edited

  • trunk/Source/WebCore/ChangeLog

    r189911 r189913  
     12015-09-17  Eric Carlson  <eric.carlson@apple.com>
     2
     3        [Mac MediaStream] Cleanup capture source classes
     4        https://bugs.webkit.org/show_bug.cgi?id=149233
     5
     6        Reviewed by Jer Noble.
     7
     8        * platform/cf/CoreMediaSoftLink.cpp: Soft-link CMAudioFormatDescriptionGetStreamBasicDescription,
     9          CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer, and CMSampleBufferGetNumSamples.
     10        * platform/cf/CoreMediaSoftLink.h:
     11
     12        * platform/mediastream/mac/AVAudioCaptureSource.h:
     13        (WebCore::AVAudioCaptureSource::Observer::~Observer):
     14        * platform/mediastream/mac/AVAudioCaptureSource.mm:
     15        (WebCore::AVAudioCaptureSource::AVAudioCaptureSource): Initialize m_inputDescription.
     16        (WebCore::AVAudioCaptureSource::capabilities): 0 -> nullptr.
     17        (WebCore::AVAudioCaptureSource::addObserver): New, add an observer and tell it to prepare.
     18        (WebCore::AVAudioCaptureSource::removeObserver): New.
     19        (WebCore::operator==): Compare AudioStreamBasicDescription.
     20        (WebCore::operator!=):
     21        (WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection): Call
     22          observer->prepare when passed a new stream description, call observer->process.
     23
     24        * platform/mediastream/mac/AVCaptureDeviceManager.mm:
     25        (WebCore::refreshCaptureDeviceList): Set m_groupID and m_localizedName.
     26        (WebCore::AVCaptureDeviceManager::sessionSupportsConstraint): Invalid constraint names should
     27          be ignored, so return true when passed one.
     28        (WebCore::AVCaptureDeviceManager::getSourcesInfo): This just didn't work, fix it.
     29        (WebCore::AVCaptureDeviceManager::verifyConstraintsForMediaType): Optional constraints are
     30          optional so they don't need to be validated.
     31        (WebCore::AVCaptureDeviceManager::bestSourcesForTypeAndConstraints): m_audioSource -> m_audioAVMediaCaptureSource,
     32          m_videoSource -> m_videoAVMediaCaptureSource.
     33        (WebCore::AVCaptureDeviceManager::sourceWithUID): Ditto.
     34
     35        * platform/mediastream/mac/AVMediaCaptureSource.h:
     36        (WebCore::AVMediaCaptureSource::session):
     37        (WebCore::AVMediaCaptureSource::device):
     38        (WebCore::AVMediaCaptureSource::currentStates):
     39        (WebCore::AVMediaCaptureSource::constraints):
     40        (WebCore::AVMediaCaptureSource::statesDidChanged):
     41        (WebCore::AVMediaCaptureSource::createWeakPtr):
     42        (WebCore::AVMediaCaptureSource::buffer): Deleted.
     43        (WebCore::AVMediaCaptureSource::setBuffer): Deleted.
     44        * platform/mediastream/mac/AVMediaCaptureSource.mm:
     46        (WebCore::AVMediaCaptureSource::AVMediaCaptureSource): Initialize m_weakPtrFactory.
     46        (WebCore::AVMediaCaptureSource::scheduleDeferredTask): New, call a function asynchronously on
     47          the main thread.
     48        (-[WebCoreAVMediaCaptureSourceObserver captureOutput:didOutputSampleBuffer:fromConnection:]): Don't
     49          dispatch calls to the main thread, let the derived classes do that if necessary.
     50
     51        * platform/mediastream/mac/AVVideoCaptureSource.h:
     52        (WebCore::AVVideoCaptureSource::width):
     53        (WebCore::AVVideoCaptureSource::height):
     54        (WebCore::AVVideoCaptureSource::previewLayer):
     55        (WebCore::AVVideoCaptureSource::currentFrameSampleBuffer):
     56        * platform/mediastream/mac/AVVideoCaptureSource.mm:
     57        (WebCore::AVVideoCaptureSource::setFrameRateConstraint): Remove unwanted logging.
     58        (WebCore::AVVideoCaptureSource::setupCaptureSession): Configure the AVCaptureVideoDataOutput so
     59          it delivers 32-bit BGRA samples.
     60        (WebCore::AVVideoCaptureSource::calculateFramerate): Return bool to signal if the frame rate
     61          changed.
     62        (WebCore::AVVideoCaptureSource::processNewFrame): New. Process sample buffer, invalidate cached
     63          image, signal when characteristics change.
     64        (WebCore::AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection): Schedule
     65          call to processNewFrame on the main thread so we do all video processing on main thread.
     66        (WebCore::AVVideoCaptureSource::currentFrameImage): Create and return a CVImageBuffer of the
     67          current video frame.
     68        (WebCore::AVVideoCaptureSource::paintCurrentFrameInContext): Draw the current frame to a context.
     69
    1702015-09-15  Sergio Villar Senin  <svillar@igalia.com>
    271
  • trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp

    r187987 r189913  
    102102SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTimebaseNotification_EffectiveRateChanged, CFStringRef)
    103103SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTimebaseNotification_TimeJumped, CFStringRef)
     104SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMAudioFormatDescriptionGetStreamBasicDescription, const AudioStreamBasicDescription *, (CMAudioFormatDescriptionRef desc), (desc))
     105SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer, OSStatus, (CMSampleBufferRef sbuf, size_t *bufferListSizeNeededOut, AudioBufferList *bufferListOut, size_t bufferListSize, CFAllocatorRef bbufStructAllocator, CFAllocatorRef bbufMemoryAllocator, uint32_t flags, CMBlockBufferRef *blockBufferOut), (sbuf, bufferListSizeNeededOut, bufferListOut, bufferListSize, bbufStructAllocator, bbufMemoryAllocator, flags, blockBufferOut))
     106SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetNumSamples, CMItemCount, (CMSampleBufferRef sbuf), (sbuf))
    104107#endif // PLATFORM(COCOA)
    105108
  • trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h

    r187987 r189913  
    169169SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreMedia, kCMTimebaseNotification_TimeJumped, CFStringRef)
    170170#define kCMTimebaseNotification_TimeJumped get_CoreMedia_kCMTimebaseNotification_TimeJumped()
     171SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMAudioFormatDescriptionGetStreamBasicDescription, const AudioStreamBasicDescription *, (CMAudioFormatDescriptionRef desc), (desc))
     172#define CMAudioFormatDescriptionGetStreamBasicDescription softLink_CoreMedia_CMAudioFormatDescriptionGetStreamBasicDescription
     173SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer, OSStatus, (CMSampleBufferRef sbuf, size_t *bufferListSizeNeededOut, AudioBufferList *bufferListOut, size_t bufferListSize, CFAllocatorRef bbufStructAllocator, CFAllocatorRef bbufMemoryAllocator, uint32_t flags, CMBlockBufferRef *blockBufferOut), (sbuf, bufferListSizeNeededOut, bufferListOut, bufferListSize, bbufStructAllocator, bbufMemoryAllocator, flags, blockBufferOut))
     174#define CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer softLink_CoreMedia_CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer
     175SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetNumSamples, CMItemCount, (CMSampleBufferRef sbuf), (sbuf))
     176#define CMSampleBufferGetNumSamples softLink_CoreMedia_CMSampleBufferGetNumSamples
    171177
    172178#endif // PLATFORM(COCOA)
  • trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h

    r181152 r189913  
    11/*
    2  * Copyright (C) 2013 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    3030
    3131#include "AVMediaCaptureSource.h"
     32#include <wtf/Lock.h>
    3233
     34typedef struct AudioStreamBasicDescription AudioStreamBasicDescription;
    3335typedef const struct opaqueCMFormatDescription *CMFormatDescriptionRef;
    3436
    3537namespace WebCore {
    36    
     38
    3739class AVAudioCaptureSource : public AVMediaCaptureSource {
    3840public:
     41
     42    class Observer {
     43    public:
     44        virtual ~Observer() { }
     45        virtual void prepare(const AudioStreamBasicDescription *) = 0;
     46        virtual void unprepare() = 0;
     47        virtual void process(CMFormatDescriptionRef, CMSampleBufferRef) = 0;
     48    };
     49
    3950    static RefPtr<AVMediaCaptureSource> create(AVCaptureDevice*, const AtomicString&, PassRefPtr<MediaConstraints>);
    40    
     51
     52    void addObserver(Observer*);
     53    void removeObserver(Observer*);
     54
    4155private:
    4256    AVAudioCaptureSource(AVCaptureDevice*, const AtomicString&, PassRefPtr<MediaConstraints>);
    4357    virtual ~AVAudioCaptureSource();
    4458   
    45     virtual RefPtr<RealtimeMediaSourceCapabilities> capabilities() const override;
    46     virtual void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) override;
     59    RefPtr<RealtimeMediaSourceCapabilities> capabilities() const override;
     60    void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) override;
    4761   
    48     virtual void setupCaptureSession() override;
    49     virtual void updateStates() override;
    50        
     62    void setupCaptureSession() override;
     63    void updateStates() override;
     64
    5165    RetainPtr<AVCaptureConnection> m_audioConnection;
    52     RetainPtr<CMFormatDescriptionRef> m_audioFormatDescription;
     66
     67    std::unique_ptr<AudioStreamBasicDescription> m_inputDescription;
     68    Vector<Observer*> m_observers;
     69    Lock m_lock;
    5370};
    5471
  • trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm

    r186182 r189913  
    11/*
    2  * Copyright (C) 2013 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    2525
    2626#import "config.h"
     27#import "AVAudioCaptureSource.h"
    2728
    2829#if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
    2930
    30 #import "AVAudioCaptureSource.h"
    31 
    32 #import "CoreMediaSoftLink.h"
    3331#import "Logging.h"
    3432#import "MediaConstraints.h"
     
    3735#import "SoftLinking.h"
    3836#import <AVFoundation/AVFoundation.h>
    39 #import <objc/runtime.h>
     37#import <CoreAudio/CoreAudioTypes.h>
     38#import <wtf/HashSet.h>
     39
     40#import "CoreMediaSoftLink.h"
    4041
    4142typedef AVCaptureConnection AVCaptureConnectionType;
     
    6970    currentStates()->setSourceId(id);
    7071    currentStates()->setSourceType(RealtimeMediaSourceStates::Microphone);
     72    m_inputDescription = std::make_unique<AudioStreamBasicDescription>();
    7173}
    7274   
     
    7880{
    7981    notImplemented();
    80     return 0;
     82    return nullptr;
    8183}
    8284
     
    8486{
    8587    // FIXME: use [AVCaptureAudioPreviewOutput volume] for volume
     88}
     89
     90void AVAudioCaptureSource::addObserver(AVAudioCaptureSource::Observer* observer)
     91{
     92    {
     93        LockHolder lock(m_lock);
     94        m_observers.append(observer);
     95    }
     96
     97    if (m_inputDescription->mSampleRate)
     98        observer->prepare(m_inputDescription.get());
     99}
     100
     101void AVAudioCaptureSource::removeObserver(AVAudioCaptureSource::Observer* observer)
     102{
     103    LockHolder lock(m_lock);
     104    size_t pos = m_observers.find(observer);
     105    if (pos != notFound)
     106        m_observers.remove(pos);
    86107}
    87108
     
    101122}
    102123
     124static bool operator==(const AudioStreamBasicDescription& a, const AudioStreamBasicDescription& b)
     125{
     126    return a.mSampleRate == b.mSampleRate
     127        && a.mFormatID == b.mFormatID
     128        && a.mFormatFlags == b.mFormatFlags
     129        && a.mBytesPerPacket == b.mBytesPerPacket
     130        && a.mFramesPerPacket == b.mFramesPerPacket
     131        && a.mBytesPerFrame == b.mBytesPerFrame
     132        && a.mChannelsPerFrame == b.mChannelsPerFrame
     133        && a.mBitsPerChannel == b.mBitsPerChannel;
     134}
     135
     136static bool operator!=(const AudioStreamBasicDescription& a, const AudioStreamBasicDescription& b)
     137{
     138    return !(a == b);
     139}
     140
    103141void AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
    104142{
     143    Vector<Observer*> observers;
     144    {
     145        LockHolder lock(m_lock);
     146        if (m_observers.isEmpty())
     147            return;
     148
     149        copyToVector(m_observers, observers);
     150    }
     151
    105152    CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
    106153    if (!formatDescription)
    107154        return;
    108155
    109     CFRetain(formatDescription);
    110     m_audioFormatDescription = adoptCF(formatDescription);
     156    const AudioStreamBasicDescription* streamDescription = CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription);
     157    if (*m_inputDescription != *streamDescription) {
     158        m_inputDescription = std::make_unique<AudioStreamBasicDescription>(*streamDescription);
     159        for (auto& observer : observers)
     160            observer->prepare(m_inputDescription.get());
     161    }
     162
     163    for (auto& observer : observers)
     164        observer->process(formatDescription, sampleBuffer);
    111165}
    112166
  • trunk/Source/WebCore/platform/mediastream/mac/AVCaptureDeviceManager.mm

    r187282 r189913  
    2525
    2626#import "config.h"
     27#import "AVCaptureDeviceManager.h"
    2728
    2829#if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
    29 
    30 #import "AVCaptureDeviceManager.h"
    3130
    3231#import "AVAudioCaptureSource.h"
    3332#import "AVMediaCaptureSource.h"
    3433#import "AVVideoCaptureSource.h"
     34#import "AudioSourceProvider.h"
    3535#import "Logging.h"
    3636#import "MediaConstraints.h"
     
    100100
    101101    String m_captureDeviceID;
     102    String m_localizedName;
     103    String m_groupID;
    102104
    103105    String m_audioSourceId;
    104     RefPtr<AVMediaCaptureSource> m_audioSource;
     106    RefPtr<AVMediaCaptureSource> m_audioAVMediaCaptureSource;
    105107
    106108    String m_videoSourceId;
    107     RefPtr<AVMediaCaptureSource> m_videoSource;
     109    RefPtr<AVMediaCaptureSource> m_videoAVMediaCaptureSource;
    108110
    109111    bool m_enabled;
     
    158160            if ([device hasMediaType:AVMediaTypeVideo] || [device hasMediaType:AVMediaTypeMuxed])
    159161                source.m_videoSourceId = createCanonicalUUIDString();
     162
     163            source.m_groupID = createCanonicalUUIDString();
     164            source.m_localizedName = device.localizedName;
    160165
    161166            devices.append(source);
     
    259264    size_t constraint = validConstraintNames().find(name);
    260265    if (constraint == notFound)
    261         return false;
     266        return true;
    262267   
    263268    switch (constraint) {
     
    323328
    324329    Vector<CaptureDevice>& devices = captureDeviceList();
    325     size_t count = devices.size();
    326     for (size_t i = 0; i < count; ++i) {
    327         AVCaptureDeviceType *device = [AVCaptureDevice deviceWithUniqueID:devices[i].m_captureDeviceID];
    328         ASSERT(device);
    329 
    330         if (!devices[i].m_enabled)
     330    for (auto captureDevice : devices) {
     331
     332        if (!captureDevice.m_enabled)
    331333            continue;
    332         // FIXME: Change groupID from localizedName to something more meaningful
    333         if (devices[i].m_videoSource)
    334             sourcesInfo.append(TrackSourceInfo::create(devices[i].m_videoSourceId, TrackSourceInfo::Video, device.localizedName, device.localizedName, devices[i].m_captureDeviceID));
    335         if (devices[i].m_audioSource)
    336             sourcesInfo.append(TrackSourceInfo::create(devices[i].m_audioSourceId, TrackSourceInfo::Audio, device.localizedName, device.localizedName, devices[i].m_captureDeviceID));
    337     }
    338    
     334
     335        if (!captureDevice.m_videoSourceId.isEmpty())
     336            sourcesInfo.append(TrackSourceInfo::create(captureDevice.m_videoSourceId, TrackSourceInfo::Video, captureDevice.m_localizedName, captureDevice.m_groupID, captureDevice.m_captureDeviceID));
     337        if (!captureDevice.m_audioSourceId.isEmpty())
     338            sourcesInfo.append(TrackSourceInfo::create(captureDevice.m_audioSourceId, TrackSourceInfo::Audio, captureDevice.m_localizedName, captureDevice.m_groupID, captureDevice.m_captureDeviceID));
     339    }
     340
    339341    LOG(Media, "AVCaptureDeviceManager::getSourcesInfo(%p), found %d active devices", this, sourcesInfo.size());
    340342
     
    353355    constraints->getMandatoryConstraints(mandatoryConstraints);
    354356    if (mandatoryConstraints.size()) {
     357
     358        // FIXME: this method should take an AVCaptureDevice and use its AVCaptureSession instead of creating a new one.
    355359        RetainPtr<AVCaptureSessionType> session = adoptNS([allocAVCaptureSessionInstance() init]);
    356360        for (size_t i = 0; i < mandatoryConstraints.size(); ++i) {
     
    362366        }
    363367    }
    364    
    365     Vector<MediaConstraint> optionalConstraints;
    366     constraints->getOptionalConstraints(optionalConstraints);
    367     if (!optionalConstraints.size())
    368         return true;
    369 
    370     for (size_t i = 0; i < optionalConstraints.size(); ++i) {
    371         const MediaConstraint& constraint = optionalConstraints[i];
    372         if (!isValidConstraint(type, constraint.m_name)) {
    373             invalidConstraint = constraint.m_name;
    374             return false;
    375         }
    376     }
    377368
    378369    return true;
     
    393384    } sortBasedOffFitnessScore;
    394385
    395     for (auto& captureDevice : captureDeviceList()) {
     386    Vector<CaptureDevice>& devices = captureDeviceList();
     387
     388    for (auto& captureDevice : devices) {
    396389        if (!captureDevice.m_enabled)
    397390            continue;
     
    400393        // device of the appropriate type.
    401394        if (type == RealtimeMediaSource::Audio && !captureDevice.m_audioSourceId.isEmpty()) {
    402             if (!captureDevice.m_audioSource) {
     395            if (!captureDevice.m_audioAVMediaCaptureSource) {
    403396                AVCaptureDeviceType *device = [AVCaptureDevice deviceWithUniqueID:captureDevice.m_captureDeviceID];
    404397                ASSERT(device);
    405                 captureDevice.m_audioSource = AVAudioCaptureSource::create(device, captureDevice.m_audioSourceId, constraints);
     398                captureDevice.m_audioAVMediaCaptureSource = AVAudioCaptureSource::create(device, captureDevice.m_audioSourceId, constraints);
    406399            }
    407             bestSourcesList.append(captureDevice.m_audioSource);
     400            bestSourcesList.append(captureDevice.m_audioAVMediaCaptureSource);
    408401        }
    409402
    410403        if (type == RealtimeMediaSource::Video && !captureDevice.m_videoSourceId.isEmpty()) {
    411             if (!captureDevice.m_videoSource) {
     404            if (!captureDevice.m_videoAVMediaCaptureSource) {
    412405                AVCaptureDeviceType *device = [AVCaptureDevice deviceWithUniqueID:captureDevice.m_captureDeviceID];
    413406                ASSERT(device);
    414                 captureDevice.m_videoSource = AVVideoCaptureSource::create(device, captureDevice.m_videoSourceId, constraints);
     407                captureDevice.m_videoAVMediaCaptureSource = AVVideoCaptureSource::create(device, captureDevice.m_videoSourceId, constraints);
    415408            }
    416             bestSourcesList.append(captureDevice.m_videoSource);
     409            bestSourcesList.append(captureDevice.m_videoAVMediaCaptureSource);
    417410        }
    418411    }
     
    443436        ASSERT(device);
    444437        if (type == RealtimeMediaSource::Type::Audio && !captureDevice.m_audioSourceId.isEmpty()) {
    445             captureDevice.m_audioSource = AVAudioCaptureSource::create(device, captureDevice.m_audioSourceId, constraints);
    446             return captureDevice.m_audioSource;
     438            if (!captureDevice.m_audioAVMediaCaptureSource)
     439                captureDevice.m_audioAVMediaCaptureSource = AVAudioCaptureSource::create(device, captureDevice.m_audioSourceId, constraints);
     440            return captureDevice.m_audioAVMediaCaptureSource;
    447441        }
    448442        if (type == RealtimeMediaSource::Type::Video && !captureDevice.m_videoSourceId.isEmpty()) {
    449             captureDevice.m_videoSource = AVVideoCaptureSource::create(device, captureDevice.m_videoSourceId, constraints);
    450             return captureDevice.m_videoSource;
     443            if (!captureDevice.m_videoAVMediaCaptureSource)
     444                captureDevice.m_videoAVMediaCaptureSource = AVVideoCaptureSource::create(device, captureDevice.m_videoSourceId, constraints);
     445            return captureDevice.m_videoAVMediaCaptureSource;
    451446        }
    452447    }
  • trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.h

    r187898 r189913  
    11/*
    2  * Copyright (C) 2013 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    2929#if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
    3030
     31#include "GenericTaskQueue.h"
    3132#include "RealtimeMediaSource.h"
     33#include "Timer.h"
    3234#include <wtf/RetainPtr.h>
     35#include <wtf/WeakPtr.h>
    3336
    3437OBJC_CLASS AVCaptureAudioDataOutput;
     
    5457    AVCaptureSession *session() const { return m_session.get(); }
    5558
     59    void startProducingData() override;
     60    void stopProducingData() override;
     61
    5662protected:
    5763    AVMediaCaptureSource(AVCaptureDevice*, const AtomicString&, RealtimeMediaSource::Type, PassRefPtr<MediaConstraints>);
    5864
    59     virtual const RealtimeMediaSourceStates& states() override;
    60 
    61     virtual void startProducingData() override;
    62     virtual void stopProducingData() override;
     65    const RealtimeMediaSourceStates& states() override;
    6366
    6467    virtual void setupCaptureSession() = 0;
     
    6871    RealtimeMediaSourceStates* currentStates() { return &m_currentStates; }
    6972    MediaConstraints* constraints() { return m_constraints.get(); }
    70     CMSampleBufferRef buffer() const { return m_buffer.get(); }
    7173
    7274    void setVideoSampleBufferDelegate(AVCaptureVideoDataOutput*);
    7375    void setAudioSampleBufferDelegate(AVCaptureAudioDataOutput*);
     76
     77    void scheduleDeferredTask(std::function<void ()>);
     78
     79    void statesDidChanged() { }
    7480   
    75     void setBuffer(CMSampleBufferRef buffer) { m_buffer = buffer; }
    76 
    7781private:
    7882    void setupSession();
     83    WeakPtr<AVMediaCaptureSource> createWeakPtr() { return m_weakPtrFactory.createWeakPtr(); }
    7984
     85    WeakPtrFactory<AVMediaCaptureSource> m_weakPtrFactory;
    8086    RetainPtr<WebCoreAVMediaCaptureSourceObserver> m_objcObserver;
    8187    RefPtr<MediaConstraints> m_constraints;
     
    8389    RetainPtr<AVCaptureSession> m_session;
    8490    RetainPtr<AVCaptureDevice> m_device;
    85     RetainPtr<CMSampleBufferRef> m_buffer;
    8691   
    8792    bool m_isRunning;
  • trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm

    r186182 r189913  
    11/*
    2  * Copyright (C) 2013 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    2525
    2626#import "config.h"
     27#import "AVMediaCaptureSource.h"
    2728
    2829#if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
    2930
    30 #import "AVMediaCaptureSource.h"
    31 
     31#import "AudioSourceProvider.h"
    3232#import "Logging.h"
    3333#import "MediaConstraints.h"
     
    124124AVMediaCaptureSource::AVMediaCaptureSource(AVCaptureDeviceType* device, const AtomicString& id, RealtimeMediaSource::Type type, PassRefPtr<MediaConstraints> constraints)
    125125    : RealtimeMediaSource(id, type, emptyString())
     126    , m_weakPtrFactory(this)
    126127    , m_objcObserver(adoptNS([[WebCoreAVMediaCaptureSourceObserver alloc] initWithCallback:this]))
    127128    , m_constraints(constraints)
     
    129130    , m_isRunning(false)
    130131{
    131     setName([device localizedName]);
     132    setName(device.localizedName);
    132133    m_currentStates.setSourceType(type == RealtimeMediaSource::Video ? RealtimeMediaSourceStates::Camera : RealtimeMediaSourceStates::Microphone);
    133134}
     
    194195{
    195196    [audioOutput setSampleBufferDelegate:m_objcObserver.get() queue:globaAudioCaptureSerialQueue()];
     197}
     198
     199void AVMediaCaptureSource::scheduleDeferredTask(std::function<void ()> function)
     200{
     201    ASSERT(function);
     202
     203    auto weakThis = createWeakPtr();
     204    callOnMainThread([weakThis, function] {
     205        if (!weakThis)
     206            return;
     207
     208        function();
     209    });
    196210}
    197211
     
    233247        return;
    234248
    235     CFRetain(sampleBuffer);
    236     dispatch_async(dispatch_get_main_queue(), ^{
    237         if (m_callback)
    238             m_callback->captureOutputDidOutputSampleBufferFromConnection(captureOutput, sampleBuffer, connection);
    239         CFRelease(sampleBuffer);
    240     });
     249    m_callback->captureOutputDidOutputSampleBufferFromConnection(captureOutput, sampleBuffer, connection);
    241250}
    242251
  • trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h

    r187208 r189913  
    11/*
    2  * Copyright (C) 2013 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    3333OBJC_CLASS AVCaptureVideoPreviewLayer;
    3434
     35typedef struct CGImage *CGImageRef;
    3536typedef const struct opaqueCMFormatDescription *CMFormatDescriptionRef;
     37typedef struct opaqueCMSampleBuffer *CMSampleBufferRef;
    3638
    3739namespace WebCore {
     40
     41class FloatRect;
     42class GraphicsContext;
    3843
    3944class AVVideoCaptureSource : public AVMediaCaptureSource {
     
    4146    static RefPtr<AVMediaCaptureSource> create(AVCaptureDevice*, const AtomicString&, PassRefPtr<MediaConstraints>);
    4247
    43     virtual RefPtr<RealtimeMediaSourceCapabilities> capabilities() const override;
    44     virtual void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) override;
    45    
    46     virtual int32_t width() const { return m_width; }
    47     virtual int32_t height() const { return m_height; }
     48    int32_t width() const { return m_width; }
     49    int32_t height() const { return m_height; }
    4850
    4951    AVCaptureVideoPreviewLayer* previewLayer() { return m_videoPreviewLayer.get(); }
    50    
     52    CMSampleBufferRef currentFrameSampleBuffer() const { return m_buffer.get(); }
     53    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&);
     54    RetainPtr<CGImageRef> currentFrameImage();
     55
    5156private:
    5257    AVVideoCaptureSource(AVCaptureDevice*, const AtomicString&, PassRefPtr<MediaConstraints>);
    5358    virtual ~AVVideoCaptureSource();
    5459
    55     virtual void setupCaptureSession() override;
    56     virtual void updateStates() override;
     60    void setupCaptureSession() override;
     61    void updateStates() override;
     62
     63    RefPtr<RealtimeMediaSourceCapabilities> capabilities() const override;
    5764
    5865    bool applyConstraints(MediaConstraints*);
    5966    bool setFrameRateConstraint(float minFrameRate, float maxFrameRate);
    6067
    61     void calculateFramerate(CMSampleBufferRef);
     68    bool calculateFramerate(CMSampleBufferRef);
     69
     70    void captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutput*, CMSampleBufferRef, AVCaptureConnection*) override;
     71    void processNewFrame(RetainPtr<CMSampleBufferRef>);
    6272
    6373    RetainPtr<AVCaptureConnection> m_videoConnection;
    64     RetainPtr<CMFormatDescriptionRef> m_videoFormatDescription;
    6574    RetainPtr<AVCaptureVideoPreviewLayer> m_videoPreviewLayer;
     75    RetainPtr<CMSampleBufferRef> m_buffer;
     76    RetainPtr<CGImageRef> m_lastImage;
    6677    Vector<Float64> m_videoFrameTimeStamps;
    6778    Float64 m_frameRate;
  • trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm

    r187898 r189913  
    11/*
    2  * Copyright (C) 2013, 2015 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    2525
    2626#import "config.h"
     27#import "AVVideoCaptureSource.h"
    2728
    2829#if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
    29 
    30 #import "AVVideoCaptureSource.h"
    3130
    3231#import "AVCaptureDeviceManager.h"
    3332#import "BlockExceptions.h"
     33#import "GraphicsContextCG.h"
     34#import "IntRect.h"
    3435#import "Logging.h"
    3536#import "MediaConstraints.h"
     
    4950
    5051SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
     52SOFT_LINK_FRAMEWORK_OPTIONAL(CoreVideo)
    5153
    5254SOFT_LINK_CLASS(AVFoundation, AVCaptureConnection)
     
    7678#define AVCaptureSessionPreset352x288 getAVCaptureSessionPreset352x288()
    7779#define AVCaptureSessionPresetLow getAVCaptureSessionPresetLow()
     80
     81SOFT_LINK(CoreVideo, CVPixelBufferGetWidth, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
     82SOFT_LINK(CoreVideo, CVPixelBufferGetHeight, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
     83SOFT_LINK(CoreVideo, CVPixelBufferGetBaseAddress, void*, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
     84SOFT_LINK(CoreVideo, CVPixelBufferGetBytesPerRow, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
     85SOFT_LINK(CoreVideo, CVPixelBufferGetPixelFormatType, OSType, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
     86SOFT_LINK(CoreVideo, CVPixelBufferLockBaseAddress, CVReturn, (CVPixelBufferRef pixelBuffer, CVOptionFlags lockFlags), (pixelBuffer, lockFlags))
     87SOFT_LINK(CoreVideo, CVPixelBufferUnlockBaseAddress, CVReturn, (CVPixelBufferRef pixelBuffer, CVOptionFlags lockFlags), (pixelBuffer, lockFlags))
     88
     89SOFT_LINK_POINTER(CoreVideo, kCVPixelBufferPixelFormatTypeKey, NSString *)
     90#define kCVPixelBufferPixelFormatTypeKey getkCVPixelBufferPixelFormatTypeKey()
    7891
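The `SOFT_LINK(CoreVideo, ...)` lines added above bind each CoreVideo function at runtime rather than at link time. A rough sketch of what such a macro expands to, with a fake resolver standing in for the real framework lookup (the names `fakeGetWidth`/`resolveGetWidth` are hypothetical):

```cpp
#include <cassert>
#include <cstddef>

typedef size_t (*GetWidthFn)(const void*);

// Hypothetical stand-in for the symbol that dlsym() would return.
static size_t fakeGetWidth(const void*) { return 640; }

static GetWidthFn resolveGetWidth()
{
    // Real soft-linking opens the framework and looks the symbol up by
    // name, so the code loads even where the framework is unavailable.
    return &fakeGetWidth;
}

// Resolve once, on first call, then reuse the cached function pointer.
static size_t softLinkedGetWidth(const void* pixelBuffer)
{
    static GetWidthFn cached = resolveGetWidth();
    return cached(pixelBuffer);
}
```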
    7992namespace WebCore {
     
    158171    }
    159172
    160 NSLog(@"set frame rate to %f", [bestFrameRateRange minFrameRate]);
    161173    LOG(Media, "AVVideoCaptureSource::setFrameRateConstraint(%p) - set frame rate range to %f..%f", this, minFrameRate, maxFrameRate);
    162174    return true;
     
    210222    if ([session() canAddInput:videoIn.get()])
    211223        [session() addInput:videoIn.get()];
    212    
     224
    213225    if (constraints())
    214226        applyConstraints(constraints());
    215227
    216228    RetainPtr<AVCaptureVideoDataOutputType> videoOutput = adoptNS([allocAVCaptureVideoDataOutputInstance() init]);
     229    RetainPtr<NSDictionary> settingsDictionary = adoptNS([[NSDictionary alloc] initWithObjectsAndKeys:
     230                                                         [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey
     231                                                         , nil]);
     232    [videoOutput setVideoSettings:settingsDictionary.get()];
    217233    setVideoSampleBufferDelegate(videoOutput.get());
    218234    ASSERT([session() canAddOutput:videoOutput.get()]);
    219235    if ([session() canAddOutput:videoOutput.get()])
    220236        [session() addOutput:videoOutput.get()];
    221    
     237
    222238    m_videoConnection = adoptNS([videoOutput.get() connectionWithMediaType:AVMediaTypeVideo]);
    223    
    224239    m_videoPreviewLayer = adoptNS([[AVCaptureVideoPreviewLayer alloc] initWithSession:session()]);
    225240}
    226241
    227 void AVVideoCaptureSource::calculateFramerate(CMSampleBufferRef sampleBuffer)
     242bool AVVideoCaptureSource::calculateFramerate(CMSampleBufferRef sampleBuffer)
    228243{
    229244    CMTime sampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    230245    if (!CMTIME_IS_NUMERIC(sampleTime))
    231         return;
     246        return false;
    232247
    233248    Float64 frameTime = CMTimeGetSeconds(sampleTime);
     
    238253    while (m_videoFrameTimeStamps[0] < oneSecondAgo)
    239254        m_videoFrameTimeStamps.remove(0);
    240    
     255
     256    Float64 frameRate = m_frameRate;
    241257    m_frameRate = (m_frameRate + m_videoFrameTimeStamps.size()) / 2;
    242 }
    243    
    244 void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
    245 {
    246     CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
     258
     259    return frameRate != m_frameRate;
     260}
     261
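The reworked `calculateFramerate()` above keeps a one-second sliding window of timestamps, smooths the rate by averaging it with the window size, and now returns whether the smoothed rate changed so the caller can fire a states-changed notification. A standalone sketch of that logic (struct and member names are illustrative):

```cpp
#include <cassert>
#include <vector>

struct FrameRateTracker {
    std::vector<double> timestamps;
    double frameRate = 0;

    // Returns true when the smoothed rate changed, mirroring the new
    // bool return value of calculateFramerate().
    bool addFrame(double frameTime)
    {
        timestamps.push_back(frameTime);

        // Keep only timestamps from the last second. (An empty() guard
        // is added here for safety; the diffed code always has at least
        // the sample just appended.)
        double oneSecondAgo = frameTime - 1.0;
        while (!timestamps.empty() && timestamps.front() < oneSecondAgo)
            timestamps.erase(timestamps.begin());

        double oldRate = frameRate;
        frameRate = (frameRate + timestamps.size()) / 2;
        return oldRate != frameRate;
    }
};
```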
     262void AVVideoCaptureSource::processNewFrame(RetainPtr<CMSampleBufferRef> sampleBuffer)
     263{
     264    CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer.get());
    247265    if (!formatDescription)
    248266        return;
    249267
    250     CFRetain(formatDescription);
    251     m_videoFormatDescription = adoptCF(formatDescription);
    252     calculateFramerate(sampleBuffer);
     268    bool statesChanged = false;
     269
     270    statesChanged = calculateFramerate(sampleBuffer.get());
     271    m_buffer = sampleBuffer;
     272    m_lastImage = nullptr;
    253273
    254274    CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions(formatDescription);
    255     m_width = dimensions.width;
    256     m_height = dimensions.height;
    257    
    258     setBuffer(sampleBuffer);
     275    if (dimensions.width != m_width || dimensions.height != m_height) {
     276        m_width = dimensions.width;
     277        m_height = dimensions.height;
     278        statesChanged = true;
     279    }
     280
     281    if (statesChanged)
     282        this->statesDidChanged();
     283}
     284
     285void AVVideoCaptureSource::captureOutputDidOutputSampleBufferFromConnection(AVCaptureOutputType*, CMSampleBufferRef sampleBuffer, AVCaptureConnectionType*)
     286{
     287    RetainPtr<CMSampleBufferRef> buffer = sampleBuffer;
     288
     289    scheduleDeferredTask([this, buffer] {
     290        this->processNewFrame(buffer);
     291    });
     292}
     293
     294RetainPtr<CGImageRef> AVVideoCaptureSource::currentFrameImage()
     295{
     296    if (m_lastImage)
     297        return m_lastImage;
     298
     299    if (!m_buffer)
     300        return nullptr;
     301
     302    CVPixelBufferRef pixelBuffer = static_cast<CVPixelBufferRef>(CMSampleBufferGetImageBuffer(m_buffer.get()));
     303    ASSERT(CVPixelBufferGetPixelFormatType(pixelBuffer) == kCVPixelFormatType_32BGRA);
     304
     305    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
     306    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
     307    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
     308    size_t width = CVPixelBufferGetWidth(pixelBuffer);
     309    size_t height = CVPixelBufferGetHeight(pixelBuffer);
     310
     311    RetainPtr<CGDataProviderRef> provider = adoptCF(CGDataProviderCreateWithData(NULL, baseAddress, bytesPerRow * height, NULL));
     312    m_lastImage = adoptCF(CGImageCreate(width, height, 8, 32, bytesPerRow, deviceRGBColorSpaceRef(), kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst, provider.get(), NULL, true, kCGRenderingIntentDefault));
     313
     314    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
     315
     316    return m_lastImage;
     317}
     318
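`currentFrameImage()` above hands `CVPixelBufferGetBaseAddress()` to `CGDataProviderCreateWithData` with a length of `bytesPerRow * height`. The stride matters: CoreVideo may pad rows beyond `width * 4`, so a pixel's byte offset must use `bytesPerRow`, not the tight width. A small illustrative helper (not WebCore code) for 32BGRA addressing:

```cpp
#include <cassert>
#include <cstddef>

// Byte offset of pixel (x, y) in a 32BGRA buffer whose rows may be
// padded: index with the row stride, never with width * 4.
size_t bgraOffset(size_t x, size_t y, size_t bytesPerRow)
{
    return y * bytesPerRow + x * 4; // 4 bytes per 32BGRA pixel
}
```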
     319void AVVideoCaptureSource::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& rect)
     320{
     321    if (context.paintingDisabled() || !currentFrameImage())
     322        return;
     323
     324    GraphicsContextStateSaver stateSaver(context);
     325    context.translate(rect.x(), rect.y() + rect.height());
     326    context.scale(FloatSize(1, -1));
     327    context.setImageInterpolationQuality(InterpolationLow);
     328    IntRect paintRect(IntPoint(0, 0), IntSize(rect.width(), rect.height()));
     329    CGContextDrawImage(context.platformContext(), CGRectMake(0, 0, paintRect.width(), paintRect.height()), m_lastImage.get());
    259330}
    260331
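`paintCurrentFrameInContext()` above translates to `(rect.x(), rect.y() + rect.height())` and scales by `(1, -1)` before `CGContextDrawImage`, converting CoreGraphics' bottom-left origin into the top-left origin WebCore drawing expects. Mapping a point through that composed transform can be sketched as (names here are illustrative):

```cpp
#include <cassert>

struct FloatPoint { double x, y; };

// Apply translate(rectX, rectY + rectHeight) followed by scale(1, -1):
// x passes through shifted; y is measured down from the rect's bottom.
FloatPoint mapThroughFlip(FloatPoint p, double rectX, double rectY, double rectHeight)
{
    return { rectX + p.x, rectY + rectHeight - p.y };
}
```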