Changeset 210621 in webkit


Timestamp: Jan 11, 2017 9:22:32 PM
Author: eric.carlson@apple.com
Message:

[MediaStream, Mac] Render media stream audio buffers
https://bugs.webkit.org/show_bug.cgi?id=159836
<rdar://problem/27380390>

Reviewed by Jer Noble.

No new tests; it isn't possible to test audio rendering directly. A follow-up patch will
add a mock audio source that will enable audio testing.

  • WebCore.xcodeproj/project.pbxproj: Remove references to the deleted previews.
  • platform/Logging.h: Add MediaCaptureSamples.
  • platform/MediaSample.h: Add outputPresentationTime and outputDuration.
  • platform/cf/CoreMediaSoftLink.cpp: Add the newly used functions CMSampleBufferGetOutputDuration,
    CMSampleBufferGetOutputPresentationTimeStamp, CMTimeConvertScale, CMTimebaseGetEffectiveRate,
    CMAudioSampleBufferCreateWithPacketDescriptions, CMSampleBufferSetDataBufferFromAudioBufferList,
    CMSampleBufferSetDataReady, CMAudioFormatDescriptionCreate, CMClockGetHostTimeClock, and CMClockGetTime.
  • platform/cf/CoreMediaSoftLink.h: Ditto.

Create and use an AVSampleBufferAudioRenderer for each audio stream track, when it is available,
to render audio samples. Store the offset between the output presentation time of the first
sample received from a track and the synchronizer time so we can adjust sample timestamps to be
relative to the synchronizer's timeline regardless of their source. Remove the use of source
previews because not all sources will have them.
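
As a rough sketch of that mapping (illustrative values only; rendererLatency is the 0.02 second
constant the patch adds, and the MediaTime/MediaSample calls are the ones used in the diff below):

    // Sketch: the first sample of a track pins the track's timeline offset.
    MediaTime synchronizerNow = MediaTime::createWithDouble(2.0);    // streamTime()
    MediaTime firstSampleTime = MediaTime::createWithDouble(1500.0); // sample.outputPresentationTime()
    MediaTime offset = synchronizerNow - firstSampleTime + MediaTime::createWithDouble(0.02);
    // offset == -1497.98 seconds. It is cached on the track, and every later
    // sample from the track is shifted with sample.offsetTimestampsBy(offset)
    // so its timestamps land on the synchronizer's timeline.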

  • platform/graphics/avfoundation/MediaSampleAVFObjC.h:
  • platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
  • platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:

Add an ObjC helper to catch renderer status changes.
(-[WebAVSampleBufferStatusChangeListener initWithParent:]):
(-[WebAVSampleBufferStatusChangeListener dealloc]):
(-[WebAVSampleBufferStatusChangeListener invalidate]):
(-[WebAVSampleBufferStatusChangeListener beginObservingLayer:]):
(-[WebAVSampleBufferStatusChangeListener stopObservingLayer:]):
(-[WebAVSampleBufferStatusChangeListener beginObservingRenderer:]):
(-[WebAVSampleBufferStatusChangeListener stopObservingRenderer:]):
(-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::removeOldSamplesFromPendingQueue):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::addSampleToPendingQueue):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateSampleTimes):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForVideoData):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play):
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSampleBufferFromTrack): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForMediaData): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSampleBuffer): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::prepareVideoSampleBufferFromTrack): Deleted.
(WebCore::MediaPlayerPrivateMediaStreamAVFObjC::internalSetVolume): Deleted.

  • platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm:

(WebCore::MediaSampleAVFObjC::outputPresentationTime): New.
(WebCore::MediaSampleAVFObjC::outputDuration): New.
(WebCore::MediaSampleAVFObjC::dump): Log outputPresentationTime.

  • platform/mediastream/AudioTrackPrivateMediaStream.h: Add timelineOffset.
  • platform/mediastream/MediaStreamTrackPrivate.cpp:

(WebCore::MediaStreamTrackPrivate::setEnabled): No more m_preview.
(WebCore::MediaStreamTrackPrivate::endTrack): Ditto.
(WebCore::MediaStreamTrackPrivate::preview): Deleted.

  • platform/mediastream/MediaStreamTrackPrivate.h:
  • platform/mediastream/RealtimeMediaSource.h:

(WebCore::RealtimeMediaSource::preview): Deleted.

  • platform/mediastream/RealtimeMediaSourcePreview.h: Removed.
  • platform/mediastream/VideoTrackPrivateMediaStream.h: Add timelineOffset.
  • platform/mediastream/mac/AVAudioCaptureSource.h:
  • platform/mediastream/mac/AVAudioCaptureSource.mm:

(WebCore::AVAudioCaptureSource::updateSettings):
(WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection): Pass the
sample buffer up the chain.
(WebCore::AVAudioSourcePreview::create): Deleted.
(WebCore::AVAudioSourcePreview::AVAudioSourcePreview): Deleted.
(WebCore::AVAudioSourcePreview::invalidate): Deleted.
(WebCore::AVAudioSourcePreview::play): Deleted.
(WebCore::AVAudioSourcePreview::pause): Deleted.
(WebCore::AVAudioSourcePreview::setEnabled): Deleted.
(WebCore::AVAudioSourcePreview::setVolume): Deleted.
(WebCore::AVAudioSourcePreview::updateState): Deleted.
(WebCore::AVAudioCaptureSource::createPreview): Deleted.

  • platform/mediastream/mac/AVMediaCaptureSource.h:

(WebCore::AVMediaSourcePreview): Deleted.
(WebCore::AVMediaCaptureSource::createWeakPtr): Deleted.

  • platform/mediastream/mac/AVMediaCaptureSource.mm:

(WebCore::AVMediaCaptureSource::AVMediaCaptureSource): No more preview.
(WebCore::AVMediaCaptureSource::reset):
(WebCore::AVMediaCaptureSource::preview): Deleted.
(WebCore::AVMediaCaptureSource::removePreview): Deleted.
(WebCore::AVMediaSourcePreview::AVMediaSourcePreview): Deleted.
(WebCore::AVMediaSourcePreview::~AVMediaSourcePreview): Deleted.
(WebCore::AVMediaSourcePreview::invalidate): Deleted.

  • platform/mediastream/mac/AVVideoCaptureSource.h:
  • platform/mediastream/mac/AVVideoCaptureSource.mm:

(WebCore::AVVideoCaptureSource::processNewFrame): Don't set the "display immediately" attachment (see the sketch after this list).
(WebCore::AVVideoSourcePreview::create): Deleted.
(WebCore::AVVideoSourcePreview::AVVideoSourcePreview): Deleted.
(WebCore::AVVideoSourcePreview::backgroundLayerBoundsChanged): Deleted.
(WebCore::AVVideoSourcePreview::invalidate): Deleted.
(WebCore::AVVideoSourcePreview::play): Deleted.
(WebCore::AVVideoSourcePreview::pause): Deleted.
(WebCore::AVVideoSourcePreview::setPaused): Deleted.
(WebCore::AVVideoSourcePreview::setEnabled): Deleted.
(WebCore::AVVideoCaptureSource::createPreview): Deleted.
(-[WebCoreAVVideoCaptureSourceObserver setParent:]): Deleted.
(-[WebCoreAVVideoCaptureSourceObserver observeValueForKeyPath:ofObject:change:context:]): Deleted.
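
For context, a hedged sketch of the "display immediately" attachment that processNewFrame no
longer sets (the standard CoreMedia pattern, not code from this patch):

    // Tagging a sample this way makes AVSampleBufferDisplayLayer show it at once,
    // ignoring its timestamps; with the render synchronizer driving presentation,
    // samples now keep their timing instead.
    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, true);
    auto dictionary = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    CFDictionarySetValue(dictionary, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);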

  • platform/mediastream/mac/MockRealtimeVideoSourceMac.mm:

(WebCore::MockRealtimeVideoSourceMac::CMSampleBufferFromPixelBuffer): Use a more typical video
time scale. Set the sample decode time.
(WebCore::MockRealtimeVideoSourceMac::pixelBufferFromCGImage): Use a static for colorspace
instead of fetching it for every frame.
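
A minimal sketch of the colorspace change (assumed shape, not the patch's exact code):

    // Created once and deliberately never released; reused for every frame instead
    // of calling CGColorSpaceCreateDeviceRGB() per captured frame.
    static CGColorSpaceRef deviceRGBColorSpace = CGColorSpaceCreateDeviceRGB();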

  • platform/mock/mediasource/MockSourceBufferPrivate.cpp: Add outputPresentationTime and outputDuration.
Location: trunk/Source
Files: 1 deleted, 24 edited

  • trunk/Source/WebCore/ChangeLog

    r210616 r210621  
  • trunk/Source/WebCore/Modules/webaudio/ScriptProcessorNode.cpp

    r207050 r210621  

             callOnMainThread([this] {
+                if (!m_hasAudioProcessListener)
+                    return;
+
                 fireProcessEvent();

  • trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj

    r210588 r210621  
                 07C1C0E51BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h in Headers */ = {isa = PBXBuildFile; fileRef = 07C1C0E41BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h */; settings = {ATTRIBUTES = (Private, ); }; };
                 07CE77D516712A6A00C55A47 /* InbandTextTrackPrivateClient.h in Headers */ = {isa = PBXBuildFile; fileRef = 07CE77D416712A6A00C55A47 /* InbandTextTrackPrivateClient.h */; settings = {ATTRIBUTES = (Private, ); }; };
-                07D1503B1DDB6965008F7598 /* RealtimeMediaSourcePreview.h in Headers */ = {isa = PBXBuildFile; fileRef = 07D1503A1DDB6688008F7598 /* RealtimeMediaSourcePreview.h */; settings = {ATTRIBUTES = (Private, ); }; };
                 07D637401BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.h in Headers */ = {isa = PBXBuildFile; fileRef = 07D6373E1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.h */; };
                 07D637411BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.mm in Sources */ = {isa = PBXBuildFile; fileRef = 07D6373F1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.mm */; };
…
                 07C8AD121D073D630087C5CE /* AVFoundationMIMETypeCache.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AVFoundationMIMETypeCache.h; sourceTree = "<group>"; };
                 07CE77D416712A6A00C55A47 /* InbandTextTrackPrivateClient.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = InbandTextTrackPrivateClient.h; sourceTree = "<group>"; };
-                07D1503A1DDB6688008F7598 /* RealtimeMediaSourcePreview.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RealtimeMediaSourcePreview.h; sourceTree = "<group>"; };
                 07D6373E1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebAudioSourceProviderAVFObjC.h; sourceTree = "<group>"; };
                 07D6373F1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebAudioSourceProviderAVFObjC.mm; sourceTree = "<group>"; };
…
                 4A0FFA9F1AAF5EA20062803B /* RealtimeMediaSourceCenter.cpp */,
                 4A0FFAA01AAF5EA20062803B /* RealtimeMediaSourceCenter.h */,
-                07D1503A1DDB6688008F7598 /* RealtimeMediaSourcePreview.h */,
                 4A4F656E1AA997F100E38CDD /* RealtimeMediaSourceSettings.cpp */,
                 4A4F656F1AA997F100E38CDD /* RealtimeMediaSourceSettings.h */,
…
                 4A0FFAA21AAF5EA20062803B /* RealtimeMediaSourceCenter.h in Headers */,
                 4A0FFAA61AAF5EF60062803B /* RealtimeMediaSourceCenterMac.h in Headers */,
-                07D1503B1DDB6965008F7598 /* RealtimeMediaSourcePreview.h in Headers */,
                 4A4F65741AA997F100E38CDD /* RealtimeMediaSourceSettings.h in Headers */,
                 07C1C0E51BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h in Headers */,
  • trunk/Source/WebCore/platform/Logging.h

    r209873 r210621  
     M(MediaSource) \
     M(MediaSourceSamples) \
+    M(MediaCaptureSamples) \
     M(MemoryPressure) \
     M(Network) \
  • trunk/Source/WebCore/platform/MediaSample.h

    r207694 r210621  

     virtual MediaTime presentationTime() const = 0;
+    virtual MediaTime outputPresentationTime() const { return presentationTime(); }
     virtual MediaTime decodeTime() const = 0;
     virtual MediaTime duration() const = 0;
+    virtual MediaTime outputDuration() const { return duration(); }
     virtual AtomicString trackID() const = 0;
     virtual void setTrackID(const String&) = 0;
  • trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp

    r208444 r210621  
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetImageBuffer, CVImageBufferRef, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetOutputDuration, CMTime, (CMSampleBufferRef sbuf), (sbuf))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetOutputPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetSampleAttachmentsArray, CFArrayRef, (CMSampleBufferRef sbuf, Boolean createIfNecessary), (sbuf, createIfNecessary))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetSampleTimingInfoArray, OSStatus, (CMSampleBufferRef sbuf, CMItemCount timingArrayEntries, CMSampleTimingInfo *timingArrayOut, CMItemCount *timingArrayEntriesNeededOut), (sbuf, timingArrayEntries, timingArrayOut, timingArrayEntriesNeededOut))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeConvertScale, CMTime, (CMTime time, int32_t newTimescale, CMTimeRoundingMethod method), (time, newTimescale, method))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetTotalSampleSize, size_t, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSetAttachment, void, (CMAttachmentBearerRef target, CFStringRef key, CFTypeRef value, CMAttachmentMode attachmentMode), (target, key, value, attachmentMode))
…
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetRate, OSStatus, (CMTimebaseRef timebase, Float64 rate), (timebase, rate))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (CMTimebaseRef timebase, CMTime time), (timebase, time))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionCreateForImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef* outDesc), (allocator, imageBuffer, outDesc))
…
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferCopySampleBufferForRange, OSStatus, (CFAllocatorRef allocator, CMSampleBufferRef sbuf, CFRange sampleRange, CMSampleBufferRef* sBufOut), (allocator, sbuf, sampleRange, sBufOut))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetSampleSizeArray, OSStatus, (CMSampleBufferRef sbuf, CMItemCount sizeArrayEntries, size_t* sizeArrayOut, CMItemCount* sizeArrayEntriesNeededOut), (sbuf, sizeArrayEntries, sizeArrayOut, sizeArrayEntriesNeededOut))
+
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMAudioSampleBufferCreateWithPacketDescriptions, OSStatus, (CFAllocatorRef allocator, CMBlockBufferRef dataBuffer, Boolean dataReady, CMSampleBufferMakeDataReadyCallback makeDataReadyCallback, void *makeDataReadyRefcon, CMFormatDescriptionRef formatDescription, CMItemCount numSamples, CMTime sbufPTS, const AudioStreamPacketDescription *packetDescriptions, CMSampleBufferRef *sBufOut), (allocator, dataBuffer, dataReady, makeDataReadyCallback, makeDataReadyRefcon, formatDescription, numSamples, sbufPTS, packetDescriptions, sBufOut))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferSetDataBufferFromAudioBufferList, OSStatus, (CMSampleBufferRef sbuf, CFAllocatorRef bbufStructAllocator, CFAllocatorRef bbufMemoryAllocator, uint32_t flags, const AudioBufferList *bufferList), (sbuf, bbufStructAllocator, bbufMemoryAllocator, flags, bufferList))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferSetDataReady, OSStatus, (CMSampleBufferRef sbuf), (sbuf))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMAudioFormatDescriptionCreate, OSStatus, (CFAllocatorRef allocator, const AudioStreamBasicDescription* asbd, size_t layoutSize, const AudioChannelLayout* layout, size_t magicCookieSize, const void* magicCookie, CFDictionaryRef extensions, CMAudioFormatDescriptionRef* outDesc), (allocator, asbd, layoutSize, layout, magicCookieSize, magicCookie, extensions, outDesc))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMClockGetHostTimeClock, CMClockRef, (void), ())
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock))
 #endif // PLATFORM(COCOA)

  • trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h

    r208444 r210621  
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetSampleTimingInfo, OSStatus, (CMSampleBufferRef sbuf, CMItemIndex sampleIndex, CMSampleTimingInfo* timingInfoOut), (sbuf, sampleIndex, timingInfoOut))
 #define CMSampleBufferGetSampleTimingInfo softLink_CoreMedia_CMSampleBufferGetSampleTimingInfo
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeConvertScale, CMTime, (CMTime time, int32_t newTimescale, CMTimeRoundingMethod method), (time, newTimescale, method))
+#define CMTimeConvertScale softLink_CoreMedia_CMTimeConvertScale
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeAdd, CMTime, (CMTime time1, CMTime time2), (time1, time2))
 #define CMTimeAdd softLink_CoreMedia_CMTimeAdd
…
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf))
 #define CMSampleBufferGetPresentationTimeStamp softLink_CoreMedia_CMSampleBufferGetPresentationTimeStamp
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetOutputDuration, CMTime, (CMSampleBufferRef sbuf), (sbuf))
+#define CMSampleBufferGetOutputDuration softLink_CoreMedia_CMSampleBufferGetOutputDuration
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetOutputPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf))
+#define CMSampleBufferGetOutputPresentationTimeStamp softLink_CoreMedia_CMSampleBufferGetOutputPresentationTimeStamp
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetSampleAttachmentsArray, CFArrayRef, (CMSampleBufferRef sbuf, Boolean createIfNecessary), (sbuf, createIfNecessary))
 #define CMSampleBufferGetSampleAttachmentsArray softLink_CoreMedia_CMSampleBufferGetSampleAttachmentsArray
…
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (CMTimebaseRef timebase, CMTime time), (timebase, time))
 #define CMTimebaseSetTime softLink_CoreMedia_CMTimebaseSetTime
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase))
+#define CMTimebaseGetEffectiveRate softLink_CoreMedia_CMTimebaseGetEffectiveRate
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 #define CMTimeCopyAsDictionary softLink_CoreMedia_CMTimeCopyAsDictionary
…
 #define CMSampleBufferGetSampleSizeArray softLink_CoreMedia_CMSampleBufferGetSampleSizeArray

+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMAudioSampleBufferCreateWithPacketDescriptions, OSStatus, (CFAllocatorRef allocator, CMBlockBufferRef dataBuffer, Boolean dataReady, CMSampleBufferMakeDataReadyCallback makeDataReadyCallback, void *makeDataReadyRefcon, CMFormatDescriptionRef formatDescription, CMItemCount numSamples, CMTime sbufPTS, const AudioStreamPacketDescription *packetDescriptions, CMSampleBufferRef *sBufOut), (allocator, dataBuffer, dataReady, makeDataReadyCallback, makeDataReadyRefcon, formatDescription, numSamples, sbufPTS, packetDescriptions, sBufOut))
+#define CMAudioSampleBufferCreateWithPacketDescriptions softLink_CoreMedia_CMAudioSampleBufferCreateWithPacketDescriptions
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferSetDataBufferFromAudioBufferList, OSStatus, (CMSampleBufferRef sbuf, CFAllocatorRef bbufStructAllocator, CFAllocatorRef bbufMemoryAllocator, uint32_t flags, const AudioBufferList *bufferList), (sbuf, bbufStructAllocator, bbufMemoryAllocator, flags, bufferList))
+#define CMSampleBufferSetDataBufferFromAudioBufferList softLink_CoreMedia_CMSampleBufferSetDataBufferFromAudioBufferList
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferSetDataReady, OSStatus, (CMSampleBufferRef sbuf), (sbuf))
+#define CMSampleBufferSetDataReady softLink_CoreMedia_CMSampleBufferSetDataReady
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMAudioFormatDescriptionCreate, OSStatus, (CFAllocatorRef allocator, const AudioStreamBasicDescription* asbd, size_t layoutSize, const AudioChannelLayout* layout, size_t magicCookieSize, const void* magicCookie, CFDictionaryRef extensions, CMAudioFormatDescriptionRef* outDesc), (allocator, asbd, layoutSize, layout, magicCookieSize, magicCookie, extensions, outDesc))
+#define CMAudioFormatDescriptionCreate softLink_CoreMedia_CMAudioFormatDescriptionCreate
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMClockGetHostTimeClock, CMClockRef, (void), ())
+#define CMClockGetHostTimeClock softLink_CoreMedia_CMClockGetHostTimeClock
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock))
+#define CMClockGetTime softLink_CoreMedia_CMClockGetTime
 #endif // PLATFORM(COCOA)

  • trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h

    r207694 r210621  

     MediaTime presentationTime() const override;
+    MediaTime outputPresentationTime() const override;
     MediaTime decodeTime() const override;
     MediaTime duration() const override;
+    MediaTime outputDuration() const override;

     AtomicString trackID() const override { return m_id; }
  • trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h

    r208851 r210621  
 /*
- * Copyright (C) 2015 Apple Inc. All rights reserved.
+ * Copyright (C) 2015-2017 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
…
 OBJC_CLASS AVSampleBufferRenderSynchronizer;
 OBJC_CLASS AVStreamSession;
+OBJC_CLASS NSNumber;
+OBJC_CLASS WebAVSampleBufferStatusChangeListener;
 typedef struct opaqueCMSampleBuffer *CMSampleBufferRef;

…
 #endif

+#if __has_include(<AVFoundation/AVSampleBufferRenderSynchronizer.h>)
+#define USE_RENDER_SYNCHRONIZER 1
+#endif
+
 class MediaPlayerPrivateMediaStreamAVFObjC final : public MediaPlayerPrivateInterface, private MediaStreamPrivate::Observer, private MediaStreamTrackPrivate::Observer {
 public:
…
     void destroyLayer();

+    void rendererStatusDidChange(AVSampleBufferAudioRenderer*, NSNumber*);
+    void layerStatusDidChange(AVSampleBufferDisplayLayer*, NSNumber*);
+
 private:
     // MediaPlayerPrivateInterface
…

     void setVolume(float) override;
-    void internalSetVolume(float, bool);
     void setMuted(bool) override;
     bool supportsMuting() const override { return true; }
…
     void setSize(const IntSize&) override { /* No-op */ }

-    void enqueueAudioSampleBufferFromTrack(MediaStreamTrackPrivate&, MediaSample&);
-
-    void prepareVideoSampleBufferFromTrack(MediaStreamTrackPrivate&, MediaSample&);
-    void enqueueVideoSampleBuffer(MediaSample&);
+    void flushRenderers();
+
+    using PendingSampleQueue = Deque<Ref<MediaSample>>;
+    void addSampleToPendingQueue(PendingSampleQueue&, MediaSample&);
+    void removeOldSamplesFromPendingQueue(PendingSampleQueue&);
+
+    void updateSampleTimes(MediaSample&, const MediaTime&, const char*);
+    MediaTime calculateTimelineOffset(const MediaSample&, double);
+
+    void enqueueVideoSample(MediaStreamTrackPrivate&, MediaSample&);
     bool shouldEnqueueVideoSampleBuffer() const;
     void flushAndRemoveVideoSampleBuffers();
-    void requestNotificationWhenReadyForMediaData();
+    void requestNotificationWhenReadyForVideoData();
+
+    void enqueueAudioSample(MediaStreamTrackPrivate&, MediaSample&);
+    void createAudioRenderer(AtomicString);
+    void destroyAudioRenderer(AVSampleBufferAudioRenderer*);
+    void destroyAudioRenderer(AtomicString);
+    void destroyAudioRenderers();
+    void requestNotificationWhenReadyForAudioData(AtomicString);

     void paint(GraphicsContext&, const FloatRect&) override;
…
     void updateTracks();
     void renderingModeChanged();
+    void checkSelectedVideoTrack();

     void scheduleDeferredTask(Function<void ()>&&);
…
 #endif

-    bool haveVideoLayer() const { return m_sampleBufferDisplayLayer || m_videoPreviewPlayer; }
+    MediaTime streamTime() const;
+
+#if USE(RENDER_SYNCHRONIZER)
+    AudioSourceProvider* audioSourceProvider() final;
+#endif

     MediaPlayer* m_player { nullptr };
…
     RefPtr<MediaStreamPrivate> m_mediaStreamPrivate;

-    RefPtr<RealtimeMediaSourcePreview> m_videoPreviewPlayer;
-    RefPtr<MediaStreamTrackPrivate> m_videoTrack;
-
+    RefPtr<MediaStreamTrackPrivate> m_activeVideoTrack;
+
+    RetainPtr<WebAVSampleBufferStatusChangeListener> m_statusChangeListener;
     RetainPtr<AVSampleBufferDisplayLayer> m_sampleBufferDisplayLayer;
-#if PLATFORM(MAC)
+#if USE(RENDER_SYNCHRONIZER)
+    HashMap<String, RetainPtr<AVSampleBufferAudioRenderer>> m_audioRenderers;
     RetainPtr<AVSampleBufferRenderSynchronizer> m_synchronizer;
-#endif
+#else
+    std::unique_ptr<Clock> m_clock;
+#endif
+
+    MediaTime m_pausedTime;
     RetainPtr<CGImageRef> m_pausedImage;
-    double m_pausedTime { 0 };
-    std::unique_ptr<Clock> m_clock;

     HashMap<String, RefPtr<AudioTrackPrivateMediaStream>> m_audioTrackMap;
     HashMap<String, RefPtr<VideoTrackPrivateMediaStream>> m_videoTrackMap;
-    Deque<Ref<MediaSample>> m_sampleQueue;
+    PendingSampleQueue m_pendingVideoSampleQueue;
+#if USE(RENDER_SYNCHRONIZER)
+    PendingSampleQueue m_pendingAudioSampleQueue;
+#endif

     MediaPlayer::NetworkState m_networkState { MediaPlayer::Empty };
…
     bool m_hasReceivedMedia { false };
     bool m_isFrameDisplayed { false };
+    bool m_pendingSelectedTrackCheck { false };

 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
  • trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm

    r210319 r210621  
 /*
- * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
+ * Copyright (C) 2015-2017 Apple Inc. All rights reserved.
  *
  * Redistribution and use in source and binary forms, with or without
…
 SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)

+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferAudioRenderer)
 SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer)
 SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferRenderSynchronizer)

+SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString*)
+SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString*)
+
+#define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral()
+#define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed()
+
+using namespace WebCore;
+
+@interface WebAVSampleBufferStatusChangeListener : NSObject {
+    MediaPlayerPrivateMediaStreamAVFObjC* _parent;
+    Vector<RetainPtr<AVSampleBufferDisplayLayer>> _layers;
+    Vector<RetainPtr<AVSampleBufferAudioRenderer>> _renderers;
+}
+
+- (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)callback;
+- (void)invalidate;
+- (void)beginObservingLayer:(AVSampleBufferDisplayLayer *)layer;
+- (void)stopObservingLayer:(AVSampleBufferDisplayLayer *)layer;
+- (void)beginObservingRenderer:(AVSampleBufferAudioRenderer *)renderer;
+- (void)stopObservingRenderer:(AVSampleBufferAudioRenderer *)renderer;
+@end
+
+@implementation WebAVSampleBufferStatusChangeListener
+
+- (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)parent
+{
+    if (!(self = [super init]))
+        return nil;
+
+    _parent = parent;
+    return self;
+}
+
+- (void)dealloc
+{
+    [self invalidate];
+    [super dealloc];
+}
+
+- (void)invalidate
+{
+    for (auto& layer : _layers)
+        [layer removeObserver:self forKeyPath:@"status"];
+    _layers.clear();
+
+    for (auto& renderer : _renderers)
+        [renderer removeObserver:self forKeyPath:@"status"];
+    _renderers.clear();
+
+    [[NSNotificationCenter defaultCenter] removeObserver:self];
+
+    _parent = nullptr;
+}
+
+- (void)beginObservingLayer:(AVSampleBufferDisplayLayer*)layer
+{
+    ASSERT(_parent);
+    ASSERT(!_layers.contains(layer));
+
+    _layers.append(layer);
+    [layer addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nullptr];
+}
+
+- (void)stopObservingLayer:(AVSampleBufferDisplayLayer*)layer
+{
+    ASSERT(_parent);
+    ASSERT(_layers.contains(layer));
+
+    [layer removeObserver:self forKeyPath:@"status"];
+    _layers.remove(_layers.find(layer));
+}
+
+- (void)beginObservingRenderer:(AVSampleBufferAudioRenderer*)renderer
+{
+    ASSERT(_parent);
+    ASSERT(!_renderers.contains(renderer));
+
+    _renderers.append(renderer);
+    [renderer addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nullptr];
+}
+
+- (void)stopObservingRenderer:(AVSampleBufferAudioRenderer*)renderer
+{
+    ASSERT(_parent);
+    ASSERT(_renderers.contains(renderer));
+
+    [renderer removeObserver:self forKeyPath:@"status"];
+    _renderers.remove(_renderers.find(renderer));
+}
+
+- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
+{
+    UNUSED_PARAM(context);
+    UNUSED_PARAM(keyPath);
+    ASSERT(_parent);
+
+    RetainPtr<WebAVSampleBufferStatusChangeListener> protectedSelf = self;
+    if ([object isKindOfClass:getAVSampleBufferDisplayLayerClass()]) {
+        RetainPtr<AVSampleBufferDisplayLayer> layer = (AVSampleBufferDisplayLayer *)object;
+        RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey];
+
+        ASSERT(_layers.contains(layer.get()));
+        ASSERT([keyPath isEqualToString:@"status"]);
+
+        callOnMainThread([protectedSelf = WTFMove(protectedSelf), layer = WTFMove(layer), status = WTFMove(status)] {
+            protectedSelf->_parent->layerStatusDidChange(layer.get(), status.get());
+        });
+
+    } else if ([object isKindOfClass:getAVSampleBufferAudioRendererClass()]) {
+        RetainPtr<AVSampleBufferAudioRenderer> renderer = (AVSampleBufferAudioRenderer *)object;
+        RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey];
+
+        ASSERT(_renderers.contains(renderer.get()));
+        ASSERT([keyPath isEqualToString:@"status"]);
+
+        callOnMainThread([protectedSelf = WTFMove(protectedSelf), renderer = WTFMove(renderer), status = WTFMove(status)] {
+            protectedSelf->_parent->rendererStatusDidChange(renderer.get(), status.get());
+        });
+    } else
+        ASSERT_NOT_REACHED();
+}
+@end
+
 namespace WebCore {

 #pragma mark -
 #pragma mark MediaPlayerPrivateMediaStreamAVFObjC
+
+static const double rendererLatency = 0.02;

 MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC(MediaPlayer* player)
     : m_player(player)
     , m_weakPtrFactory(this)
+    , m_statusChangeListener(adoptNS([[WebAVSampleBufferStatusChangeListener alloc] initWithParent:this]))
+#if USE(RENDER_SYNCHRONIZER)
+    , m_synchronizer(adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init]))
+#else
     , m_clock(Clock::create())
+#endif
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     , m_videoFullscreenLayerManager(VideoFullscreenLayerManager::create())
…
     }

+    destroyLayer();
+#if USE(RENDER_SYNCHRONIZER)
+    destroyAudioRenderers();
+#endif
+
     m_audioTrackMap.clear();
     m_videoTrackMap.clear();
-
-    destroyLayer();
 }

…
 #pragma mark AVSampleBuffer Methods

-void MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSampleBufferFromTrack(MediaStreamTrackPrivate&, MediaSample&)
-{
-    // FIXME: https://bugs.webkit.org/show_bug.cgi?id=159836
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForMediaData()
-{
-    [m_sampleBufferDisplayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
-        [m_sampleBufferDisplayLayer stopRequestingMediaData];
-
-        while (!m_sampleQueue.isEmpty()) {
-            if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
-                requestNotificationWhenReadyForMediaData();
-                return;
-            }
-
-            auto sample = m_sampleQueue.takeFirst();
-            enqueueVideoSampleBuffer(sample.get());
-        }
-    }];
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSampleBuffer(MediaSample& sample)
-{
+void MediaPlayerPrivateMediaStreamAVFObjC::removeOldSamplesFromPendingQueue(PendingSampleQueue& queue)
+{
+    MediaTime now = streamTime();
+    while (!queue.isEmpty()) {
+        if (queue.first()->decodeTime() > now)
+            break;
+        queue.removeFirst();
+    };
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::addSampleToPendingQueue(PendingSampleQueue& queue, MediaSample& sample)
+{
+    removeOldSamplesFromPendingQueue(queue);
+    queue.append(sample);
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::updateSampleTimes(MediaSample& sample, const MediaTime& timelineOffset, const char* loggingPrefix)
+{
+    LOG(MediaCaptureSamples, "%s(%p): original sample = %s", loggingPrefix, this, toString(sample).utf8().data());
+    sample.offsetTimestampsBy(timelineOffset);
+    LOG(MediaCaptureSamples, "%s(%p): adjusted sample = %s", loggingPrefix, this, toString(sample).utf8().data());
+
+#if !LOG_DISABLED
+    MediaTime now = streamTime();
+    double delta = (sample.presentationTime() - now).toDouble();
+    if (delta < 0)
+        LOG(Media, "%s(%p): *NOTE* audio sample at time %s is %f seconds late", loggingPrefix, this, toString(now).utf8().data(), -delta);
+    else if (delta < .01)
+        LOG(Media, "%s(%p): *NOTE* audio sample at time %s is only %s seconds early", loggingPrefix, this, toString(now).utf8().data(), delta);
+    else if (delta > .3)
+        LOG(Media, "%s(%p): *NOTE* audio sample at time %s is %s seconds early!", loggingPrefix, this, toString(now).utf8().data(), delta);
+#else
+    UNUSED_PARAM(loggingPrefix);
+#endif
+
+}
+
+MediaTime MediaPlayerPrivateMediaStreamAVFObjC::calculateTimelineOffset(const MediaSample& sample, double latency)
+{
+    MediaTime sampleTime = sample.outputPresentationTime();
+    if (!sampleTime || !sampleTime.isValid())
+        sampleTime = sample.presentationTime();
+    MediaTime timelineOffset = streamTime() - sampleTime + MediaTime::createWithDouble(latency);
+    if (timelineOffset.timeScale() != sampleTime.timeScale())
+        timelineOffset = toMediaTime(CMTimeConvertScale(toCMTime(timelineOffset), sampleTime.timeScale(), kCMTimeRoundingMethod_Default));
+    return timelineOffset;
+}
+
+#if USE(RENDER_SYNCHRONIZER)
+void MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample(MediaStreamTrackPrivate& track, MediaSample& sample)
+{
+    ASSERT(m_audioTrackMap.contains(track.id()));
+    ASSERT(m_audioRenderers.contains(sample.trackID()));
+
+    auto audioTrack = m_audioTrackMap.get(track.id());
+    MediaTime timelineOffset = audioTrack->timelineOffset();
+    if (timelineOffset == MediaTime::invalidTime()) {
+        timelineOffset = calculateTimelineOffset(sample, rendererLatency);
+        audioTrack->setTimelineOffset(timelineOffset);
+        LOG(MediaCaptureSamples, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample: timeline offset for track %s set to (%lld/%d)", track.id().utf8().data(), timelineOffset.timeValue(), timelineOffset.timeScale());
+    }
+
+    updateSampleTimes(sample, timelineOffset, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample");
+
+    auto renderer = m_audioRenderers.get(sample.trackID());
+    if (![renderer isReadyForMoreMediaData]) {
+        addSampleToPendingQueue(m_pendingAudioSampleQueue, sample);
+        requestNotificationWhenReadyForAudioData(sample.trackID());
+        return;
+    }
+
+    [renderer enqueueSampleBuffer:sample.platformSample().sample.cmSampleBuffer];
+}
+#endif
+
+void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPrivate& track, MediaSample& sample)
+{
+    ASSERT(m_videoTrackMap.contains(track.id()));
+
+    if (&track != m_mediaStreamPrivate->activeVideoTrack())
+        return;
+
+    m_hasReceivedMedia = true;
+    updateReadyState();
+    if (m_displayMode != LivePreview || (m_displayMode == PausedImage && m_isFrameDisplayed))
+        return;
+
+    auto videoTrack = m_videoTrackMap.get(track.id());
+    MediaTime timelineOffset = videoTrack->timelineOffset();
+    if (timelineOffset == MediaTime::invalidTime()) {
+        timelineOffset = calculateTimelineOffset(sample, rendererLatency);
+        videoTrack->setTimelineOffset(timelineOffset);
+        LOG(MediaCaptureSamples, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample: timeline offset for track %s set to %f", track.id().utf8().data(), timelineOffset.toDouble());
+    }
+
+    updateSampleTimes(sample, timelineOffset, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample");
+
     if (m_sampleBufferDisplayLayer) {
         if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
-            m_sampleQueue.append(sample);
-            requestNotificationWhenReadyForMediaData();
+            addSampleToPendingQueue(m_pendingVideoSampleQueue, sample);
+            requestNotificationWhenReadyForVideoData();
             return;
         }
…
     if (!m_hasEverEnqueuedVideoFrame) {
         m_hasEverEnqueuedVideoFrame = true;
+        if (m_displayMode == PausedImage)
+            updatePausedImage();
         m_player->firstVideoFrameAvailable();
-        updatePausedImage();
-    }
-}
-
-void MediaPlayerPrivateMediaStreamAVFObjC::prepareVideoSampleBufferFromTrack(MediaStreamTrackPrivate& track, MediaSample& sample)
-{
-    if (&track != m_mediaStreamPrivate->activeVideoTrack() || !shouldEnqueueVideoSampleBuffer())
-        return;
-
-    enqueueVideoSampleBuffer(sample);
+    }
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForVideoData()
+{
+    [m_sampleBufferDisplayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
+        [m_sampleBufferDisplayLayer stopRequestingMediaData];
+
+        while (!m_pendingVideoSampleQueue.isEmpty()) {
+            if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) {
+                requestNotificationWhenReadyForVideoData();
+                return;
+            }
+
+            auto sample = m_pendingVideoSampleQueue.takeFirst();
+            enqueueVideoSample(*m_activeVideoTrack.get(), sample.get());
+        }
+    }];
+}
+
+#if USE(RENDER_SYNCHRONIZER)
+void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData(AtomicString trackID)
+{
+    if (!m_audioRenderers.contains(trackID))
+        return;
+
+    auto renderer = m_audioRenderers.get(trackID);
+    [renderer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ {
+        [renderer stopRequestingMediaData];
+
+        auto audioTrack = m_audioTrackMap.get(trackID);
+        while (!m_pendingAudioSampleQueue.isEmpty()) {
+            if (![renderer isReadyForMoreMediaData]) {
+                requestNotificationWhenReadyForAudioData(trackID);
+                return;
+            }
+
+            auto sample = m_pendingAudioSampleQueue.takeFirst();
+            enqueueAudioSample(audioTrack->streamTrack(), sample.get());
+        }
+    }];
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer(AtomicString trackID)
+{
+    ASSERT(!m_audioRenderers.contains(trackID));
+    auto renderer = adoptNS([allocAVSampleBufferAudioRendererInstance() init]);
+    [renderer setAudioTimePitchAlgorithm:(m_player->preservesPitch() ? AVAudioTimePitchAlgorithmSpectral : AVAudioTimePitchAlgorithmVarispeed)];
+    m_audioRenderers.set(trackID, renderer);
+    [m_synchronizer addRenderer:renderer.get()];
+    [m_statusChangeListener beginObservingRenderer:renderer.get()];
+    if (m_audioRenderers.size() == 1)
+        renderingModeChanged();
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer(AVSampleBufferAudioRenderer* renderer)
+{
+    [m_statusChangeListener stopObservingRenderer:renderer];
+    [renderer flush];
+    [renderer stopRequestingMediaData];
+
+    CMTime now = CMTimebaseGetTime([m_synchronizer timebase]);
+    [m_synchronizer removeRenderer:renderer atTime:now withCompletionHandler:^(BOOL) { }];
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer(AtomicString trackID)
+{
+    if (!m_audioRenderers.contains(trackID))
+        return;
+
+    destroyAudioRenderer(m_audioRenderers.get(trackID).get());
+    m_audioRenderers.remove(trackID);
+    if (!m_audioRenderers.size())
+        renderingModeChanged();
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers()
+{
+    m_pendingAudioSampleQueue.clear();
+    for (auto& renderer : m_audioRenderers.values())
+        destroyAudioRenderer(renderer.get());
+    m_audioRenderers.clear();
+}
+
+AudioSourceProvider* MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider()
+{
+    // FIXME: This should return a mix of all audio tracks - https://bugs.webkit.org/show_bug.cgi?id=160305
+    for (const auto& track : m_audioTrackMap.values()) {
+        if (track->streamTrack().ended() || !track->streamTrack().enabled() || track->streamTrack().muted())
+            continue;
+
+        return track->streamTrack().audioSourceProvider();
+    }
+    return nullptr;
+}
+#endif
+
+void MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange(AVSampleBufferAudioRenderer* renderer, NSNumber* status)
+{
+#if USE(RENDER_SYNCHRONIZER)
+    String trackID;
+    for (auto& pair : m_audioRenderers) {
+        if (pair.value == renderer) {
+            trackID = pair.key;
+            break;
+        }
+    }
+    ASSERT(!trackID.isEmpty());
+    if (status.integerValue == AVQueuedSampleBufferRenderingStatusRendering)
+        m_audioTrackMap.get(trackID)->setTimelineOffset(MediaTime::invalidTime());
+#else
+    UNUSED_PARAM(renderer);
+    UNUSED_PARAM(status);
+#endif
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer, NSNumber* status)
+{
+    ASSERT_UNUSED(layer, layer == m_sampleBufferDisplayLayer);
+    ASSERT(m_activeVideoTrack);
+    if (status.integerValue == AVQueuedSampleBufferRenderingStatusRendering)
+        m_videoTrackMap.get(m_activeVideoTrack->id())->setTimelineOffset(MediaTime::invalidTime());
+}
+
+void MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers()
+{
+    if (m_sampleBufferDisplayLayer)
+        [m_sampleBufferDisplayLayer flush];
+
+#if USE(RENDER_SYNCHRONIZER)
+    for (auto& renderer : m_audioRenderers.values())
+        [renderer flush];
+#endif
 }

…
 void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer()
 {
-    if (!m_mediaStreamPrivate || haveVideoLayer())
-        return;
-
-    CALayer *videoLayer = nil;
-    if (m_mediaStreamPrivate->activeVideoTrack()) {
-        m_videoPreviewPlayer = m_mediaStreamPrivate->activeVideoTrack()->preview();
-        if (m_videoPreviewPlayer)
-            videoLayer = m_videoPreviewPlayer->platformLayer();
-    }
-
-    if (!videoLayer) {
-        m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]);
-        videoLayer = m_sampleBufferDisplayLayer.get();
+    if (m_sampleBufferDisplayLayer)
+        return;
+
+    m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]);
 #ifndef NDEBUG
-        [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"];
-#endif
-        m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
-
-#if PLATFORM(MAC)
-        m_synchronizer = adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init]);
-        [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()];
-#endif
-    }
+    [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"];
+#endif
+    m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black);
+    [m_statusChangeListener beginObservingLayer:m_sampleBufferDisplayLayer.get()];
+
+#if USE(RENDER_SYNCHRONIZER)
+    [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()];
+#endif

     renderingModeChanged();

 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
-    m_videoFullscreenLayerManager->setVideoLayer(videoLayer, snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
+    m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());
 #endif
 }
…
 void MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer()
 {
-    if (!haveVideoLayer())
-        return;
-
-    m_videoPreviewPlayer = nullptr;
+    if (!m_sampleBufferDisplayLayer)
+        return;
    236556
    237557    if (m_sampleBufferDisplayLayer) {
     558        m_pendingVideoSampleQueue.clear();
     559        [m_statusChangeListener stopObservingLayer:m_sampleBufferDisplayLayer.get()];
    238560        [m_sampleBufferDisplayLayer stopRequestingMediaData];
    239561        [m_sampleBufferDisplayLayer flush];
    240 #if PLATFORM(MAC)
     562#if USE(RENDER_SYNCHRONIZER)
    241563        CMTime currentTime = CMTimebaseGetTime([m_synchronizer timebase]);
    242564        [m_synchronizer removeRenderer:m_sampleBufferDisplayLayer.get() atTime:currentTime withCompletionHandler:^(BOOL) {
     
    306628PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::platformLayer() const
    307629{
    308     if (!haveVideoLayer() || m_displayMode == None)
     630    if (!m_sampleBufferDisplayLayer || m_displayMode == None)
    309631        return nullptr;
    310632
     
    312634    return m_videoFullscreenLayerManager->videoInlineLayer();
    313635#else
    314     if (m_videoPreviewPlayer)
    315         return m_videoPreviewPlayer->platformLayer();
    316636
    317637    return m_sampleBufferDisplayLayer.get();
     
    321641MediaPlayerPrivateMediaStreamAVFObjC::DisplayMode MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode() const
    322642{
    323     if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || !haveVideoLayer())
     643    if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || !m_sampleBufferDisplayLayer)
    324644        return None;
    325645
     
    369689        return;
    370690
    371     m_clock->start();
    372691    m_playing = true;
    373 
    374     if (m_videoPreviewPlayer)
    375         m_videoPreviewPlayer->play();
    376 #if PLATFORM(MAC)
    377     else
    378         [m_synchronizer setRate:1];
    379 #endif
    380 
    381     for (const auto& track : m_audioTrackMap.values()) {
    382         if (!track->enabled() || !track->streamTrack().preview())
    383             continue;
    384 
    385         track->streamTrack().preview()->play();
    386     }
     692#if USE(RENDER_SYNCHRONIZER)
     693    if (!m_synchronizer.get().rate)
      694        [m_synchronizer setRate:1]; // Start the synchronizer's timebase so queued samples render.
     695#else
     696    if (!m_clock->isRunning())
     697        m_clock->start();
     698#endif
    387699
    388700    m_haveEverPlayed = true;
     
    400712        return;
    401713
    402     m_pausedTime = m_clock->currentTime();
     714    m_pausedTime = currentMediaTime();
    403715    m_playing = false;
    404 
    405     if (m_videoPreviewPlayer)
    406         m_videoPreviewPlayer->pause();
    407 #if PLATFORM(MAC)
    408     else
    409         [m_synchronizer setRate:0];
    410 #endif
    411 
    412     for (const auto& track : m_audioTrackMap.values()) {
    413         if (!track->enabled() || !track->streamTrack().preview())
    414             continue;
    415 
    416         track->streamTrack().preview()->pause();
    417     }
    418716
    419717    updateDisplayMode();
    420718    updatePausedImage();
     719    flushRenderers();
    421720}
    422721
     
    426725}
    427726
    428 void MediaPlayerPrivateMediaStreamAVFObjC::internalSetVolume(float volume, bool internal)
    429 {
    430     if (!internal)
    431         m_volume = volume;
    432 
    433     if (!metaDataAvailable())
    434         return;
    435 
    436     for (const auto& track : m_audioTrackMap.values()) {
    437         if (!track->enabled() || !track->streamTrack().preview())
    438             continue;
    439 
    440         track->streamTrack().preview()->setVolume(volume);
    441     }
    442 }
    443 
    444727void MediaPlayerPrivateMediaStreamAVFObjC::setVolume(float volume)
    445728{
    446     internalSetVolume(volume, false);
      729    LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::setVolume(%p) - volume = %f", this, volume);
     730
     731    if (m_volume == volume)
     732        return;
     733
     734    m_volume = volume;
     735
     736#if USE(RENDER_SYNCHRONIZER)
     737    for (auto& renderer : m_audioRenderers.values())
     738        [renderer setVolume:volume];
     739#endif
    447740}
    448741
     
    456749    m_muted = muted;
    457750   
    458     internalSetVolume(muted ? 0 : m_volume, true);
     751#if USE(RENDER_SYNCHRONIZER)
     752    for (auto& renderer : m_audioRenderers.values())
     753        [renderer setMuted:muted];
     754#endif
    459755}
    460756
     
    482778MediaTime MediaPlayerPrivateMediaStreamAVFObjC::currentMediaTime() const
    483779{
    484     return MediaTime::createWithDouble(m_playing ? m_clock->currentTime() : m_pausedTime);
     780    if (!m_playing)
     781        return m_pausedTime;
     782
     783    return streamTime();
     784}
     785
     786MediaTime MediaPlayerPrivateMediaStreamAVFObjC::streamTime() const
     787{
     788#if USE(RENDER_SYNCHRONIZER)
     789    return toMediaTime(CMTimebaseGetTime([m_synchronizer timebase]));
     790#else
     791    return MediaTime::createWithDouble(m_clock->currentTime());
     792#endif
    485793}
    486794
     
    599907    ASSERT(m_mediaStreamPrivate);
    600908
     909    if (!m_hasReceivedMedia) {
     910        m_hasReceivedMedia = true;
     911        updateReadyState();
     912    }
     913
     914    if (!m_playing || streamTime().toDouble() < 0)
     915        return;
     916
     917#if USE(RENDER_SYNCHRONIZER)
     918    if (!CMTimebaseGetEffectiveRate([m_synchronizer timebase]))
     919        return;
     920#endif
     921
    601922    switch (track.type()) {
    602923    case RealtimeMediaSource::None:
     
    604925        break;
    605926    case RealtimeMediaSource::Audio:
    606         // FIXME: https://bugs.webkit.org/show_bug.cgi?id=159836
     927#if USE(RENDER_SYNCHRONIZER)
     928        enqueueAudioSample(track, mediaSample);
     929#endif
    607930        break;
    608931    case RealtimeMediaSource::Video:
    609         prepareVideoSampleBufferFromTrack(track, mediaSample);
    610         m_hasReceivedMedia = true;
    611         scheduleDeferredTask([this] {
    612             updateReadyState();
    613         });
     932        if (&track == m_activeVideoTrack.get())
     933            enqueueVideoSample(track, mediaSample);
    614934        break;
    615935    }
     
    617937
    618938#if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
    619 
    620939void MediaPlayerPrivateMediaStreamAVFObjC::setVideoFullscreenLayer(PlatformLayer *videoFullscreenLayer, std::function<void()> completionHandler)
    621940{
     
    627946    m_videoFullscreenLayerManager->setVideoFullscreenFrame(frame);
    628947}
    629 
    630 #endif
    631 
    632 template <typename RefT, typename PassRefT>
    633 void updateTracksOfType(HashMap<String, RefT>& trackMap, RealtimeMediaSource::Type trackType, MediaStreamTrackPrivateVector& currentTracks, RefT (*itemFactory)(MediaStreamTrackPrivate&), MediaPlayer* player, void (MediaPlayer::*removedFunction)(PassRefT), void (MediaPlayer::*addedFunction)(PassRefT), std::function<void(RefT, int)> configureCallback, MediaStreamTrackPrivate::Observer* trackObserver)
     948#endif
     949
      950enum class TrackState {
      951    Add,
      952    Remove,
      953    Configure
      954};
     955
     956template <typename RefT>
     957void updateTracksOfType(HashMap<String, RefT>& trackMap, RealtimeMediaSource::Type trackType, MediaStreamTrackPrivateVector& currentTracks, RefT (*itemFactory)(MediaStreamTrackPrivate&), const Function<void(RefT, int, TrackState)>& configureTrack)
    634958{
    635959    Vector<RefT> removedTracks;
     
    661985
    662986    int index = 0;
     987    for (auto& track : removedTracks)
     988        configureTrack(track, index++, TrackState::Remove);
     989
     990    index = 0;
     991    for (auto& track : addedTracks)
     992        configureTrack(track, index++, TrackState::Add);
     993
     994    index = 0;
    663995    for (const auto& track : trackMap.values())
    664         configureCallback(track, index++);
    665 
    666     for (auto& track : removedTracks) {
    667         (player->*removedFunction)(*track);
    668         track->streamTrack().removeObserver(*trackObserver);
    669     }
    670 
    671     for (auto& track : addedTracks) {
    672         (player->*addedFunction)(*track);
    673         track->streamTrack().addObserver(*trackObserver);
    674     }
     996        configureTrack(track, index++, TrackState::Configure);
     997}
     998
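The reworked helper makes three ordered passes — removals, then additions, then a configuration pass over every track remaining in the map — all through one callback, replacing the previous player member-function and observer parameters. A condensed view of the calling protocol (the lambda body is illustrative only):

    updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStream::create,
        [](RefPtr<AudioTrackPrivateMediaStream> track, int index, TrackState state) {
            // Called once per removed track (TrackState::Remove), once per added
            // track (TrackState::Add), then once per surviving track with its
            // ordinal index (TrackState::Configure).
        });
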
     999void MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack()
     1000{
     1001    if (m_pendingSelectedTrackCheck)
     1002        return;
     1003
     1004    m_pendingSelectedTrackCheck = true;
     1005    scheduleDeferredTask([this] {
     1006        bool hideVideoLayer = true;
     1007        m_activeVideoTrack = nullptr;
     1008        if (m_mediaStreamPrivate->activeVideoTrack()) {
     1009            for (const auto& track : m_videoTrackMap.values()) {
     1010                if (&track->streamTrack() == m_mediaStreamPrivate->activeVideoTrack()) {
     1011                    m_activeVideoTrack = m_mediaStreamPrivate->activeVideoTrack();
     1012                    if (track->selected())
     1013                        hideVideoLayer = false;
     1014                    break;
     1015                }
     1016            }
     1017        }
     1018
     1019        ensureLayer();
     1020        m_sampleBufferDisplayLayer.get().hidden = hideVideoLayer;
     1021        m_pendingSelectedTrackCheck = false;
     1022    });
    6751023}
    6761024
     
    6791027    MediaStreamTrackPrivateVector currentTracks = m_mediaStreamPrivate->tracks();
    6801028
    681     std::function<void(RefPtr<AudioTrackPrivateMediaStream>, int)> enableAudioTrack = [this](auto track, int index)
      1029    Function<void(RefPtr<AudioTrackPrivateMediaStream>, int, TrackState)> setAudioTrackState = [this](auto track, int index, TrackState state)
    6821030    {
    683         track->setTrackIndex(index);
    684         track->setEnabled(track->streamTrack().enabled() && !track->streamTrack().muted());
     1031        switch (state) {
     1032        case TrackState::Remove:
     1033            track->streamTrack().removeObserver(*this);
     1034            m_player->removeAudioTrack(*track);
     1035#if USE(RENDER_SYNCHRONIZER)
     1036            destroyAudioRenderer(track->id());
     1037#endif
     1038            break;
     1039        case TrackState::Add:
     1040            track->streamTrack().addObserver(*this);
     1041            m_player->addAudioTrack(*track);
     1042#if USE(RENDER_SYNCHRONIZER)
     1043            createAudioRenderer(track->id());
     1044#endif
     1045            break;
     1046        case TrackState::Configure:
     1047            track->setTrackIndex(index);
     1048            bool enabled = track->streamTrack().enabled() && !track->streamTrack().muted();
     1049            track->setEnabled(enabled);
     1050#if USE(RENDER_SYNCHRONIZER)
     1051            auto renderer = m_audioRenderers.get(track->id());
     1052            ASSERT(renderer);
     1053            renderer.get().muted = !enabled;
     1054#endif
     1055            break;
     1056        }
    6851057    };
    686     updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStream::create, m_player, &MediaPlayer::removeAudioTrack, &MediaPlayer::addAudioTrack, enableAudioTrack, (MediaStreamTrackPrivate::Observer*) this);
    687 
    688     std::function<void(RefPtr<VideoTrackPrivateMediaStream>, int)> enableVideoTrack = [this](auto track, int index)
     1058    updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStream::create, setAudioTrackState);
     1059
     1060    Function<void(RefPtr<VideoTrackPrivateMediaStream>, int, TrackState)> setVideoTrackState = [&](auto track, int index, TrackState state)
    6891061    {
    690         track->setTrackIndex(index);
    691         bool selected = &track->streamTrack() == m_mediaStreamPrivate->activeVideoTrack();
    692         track->setSelected(selected);
    693 
    694         if (selected)
    695             ensureLayer();
     1062        switch (state) {
     1063        case TrackState::Remove:
     1064            track->streamTrack().removeObserver(*this);
     1065            m_player->removeVideoTrack(*track);
     1066            checkSelectedVideoTrack();
     1067            break;
     1068        case TrackState::Add:
     1069            track->streamTrack().addObserver(*this);
     1070            m_player->addVideoTrack(*track);
     1071            break;
     1072        case TrackState::Configure:
     1073            track->setTrackIndex(index);
     1074            bool selected = &track->streamTrack() == m_mediaStreamPrivate->activeVideoTrack();
     1075            track->setSelected(selected);
     1076            checkSelectedVideoTrack();
     1077            break;
     1078        }
    6961079    };
    697     updateTracksOfType(m_videoTrackMap, RealtimeMediaSource::Video, currentTracks, &VideoTrackPrivateMediaStream::create, m_player, &MediaPlayer::removeVideoTrack, &MediaPlayer::addVideoTrack, enableVideoTrack, (MediaStreamTrackPrivate::Observer*) this);
     1080    updateTracksOfType(m_videoTrackMap, RealtimeMediaSource::Video, currentTracks, &VideoTrackPrivateMediaStream::create, setVideoTrackState);
    6981081}
    6991082
  • trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm

    r207694 r210621  
    3838}
    3939
     40MediaTime MediaSampleAVFObjC::outputPresentationTime() const
     41{
     42    return toMediaTime(CMSampleBufferGetOutputPresentationTimeStamp(m_sample.get()));
     43}
     44
    4045MediaTime MediaSampleAVFObjC::decodeTime() const
    4146{
     
    4651{
    4752    return toMediaTime(CMSampleBufferGetDuration(m_sample.get()));
     53}
     54
     55MediaTime MediaSampleAVFObjC::outputDuration() const
     56{
     57    return toMediaTime(CMSampleBufferGetOutputDuration(m_sample.get()));
    4858}
    4959
     
    112122void MediaSampleAVFObjC::dump(PrintStream& out) const
    113123{
    114     out.print("{PTS(", presentationTime(), "), DTS(", decodeTime(), "), duration(", duration(), "), flags(", (int)flags(), "), presentationSize(", presentationSize().width(), "x", presentationSize().height(), ")}");
     124    out.print("{PTS(", presentationTime(), "), OPTS(", outputPresentationTime(), "), DTS(", decodeTime(), "), duration(", duration(), "), flags(", (int)flags(), "), presentationSize(", presentationSize().width(), "x", presentationSize().height(), ")}");
    115125}
    116126
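The outputPresentationTime()/outputDuration() accessors added here are what allow capture samples to be retimed onto the synchronizer's timebase. A minimal sketch of how they can combine with the per-track timelineOffset introduced below — an illustration of the idea, not the patch's exact arithmetic (synchronizerTime stands in for the current stream time):

    // On the first sample after a renderer (re)starts rendering, record the gap
    // between the sample's output timeline and the synchronizer's timebase, then
    // shift every subsequent sample by that constant offset.
    MediaTime offset = track.timelineOffset();
    if (offset == MediaTime::invalidTime()) {
        offset = synchronizerTime - sample.outputPresentationTime();
        track.setTimelineOffset(offset);
    }
    MediaTime adjustedPTS = sample.outputPresentationTime() + offset;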
  • trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h

    r210319 r210621  
    5151    MediaStreamTrackPrivate& streamTrack() { return m_streamTrack.get(); }
    5252
     53    MediaTime timelineOffset() const { return m_timelineOffset; }
     54    void setTimelineOffset(const MediaTime& offset) { m_timelineOffset = offset; }
     55
    5356private:
    5457    AudioTrackPrivateMediaStream(MediaStreamTrackPrivate& track)
     
    5659        , m_id(track.id())
    5760        , m_label(track.label())
     61        , m_timelineOffset(MediaTime::invalidTime())
    5862    {
    5963    }
     
    6367    AtomicString m_label;
    6468    int m_index { 0 };
     69    MediaTime m_timelineOffset;
    6570};
    6671
  • trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp

    r209959 r210621  
    101101    m_isEnabled = enabled;
    102102
    103     if (m_preview)
    104         m_preview->setEnabled(enabled);
    105 
    106103    for (auto& observer : m_observers)
    107104        observer->trackEnabledChanged(*this);
     
    118115    m_isEnded = true;
    119116
    120     m_preview = nullptr;
    121117    m_source->requestStop(this);
    122118
     
    164160}
    165161
    166 RealtimeMediaSourcePreview* MediaStreamTrackPrivate::preview()
    167 {
    168     if (m_preview)
    169         return m_preview.get();
    170 
    171     m_preview = m_source->preview();
    172     return m_preview.get();
    173 }
    174 
    175162void MediaStreamTrackPrivate::applyConstraints(const MediaConstraints& constraints, RealtimeMediaSource::SuccessHandler successHandler, RealtimeMediaSource::FailureHandler failureHandler)
    176163{
  • trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h

    r209959 r210621  
    9292
    9393    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&);
    94     RealtimeMediaSourcePreview* preview();
    9594
    9695private:
     
    106105    Vector<Observer*> m_observers;
    107106    Ref<RealtimeMediaSource> m_source;
    108     RefPtr<RealtimeMediaSourcePreview> m_preview;
    109107
    110108    String m_id;
  • trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h

    r208985 r210621  
    4343#include "PlatformLayer.h"
    4444#include "RealtimeMediaSourceCapabilities.h"
    45 #include "RealtimeMediaSourcePreview.h"
    4645#include <wtf/RefCounted.h>
    4746#include <wtf/Vector.h>
     
    130129    virtual RefPtr<Image> currentFrameImage() { return nullptr; }
    131130    virtual void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) { }
    132     virtual RefPtr<RealtimeMediaSourcePreview> preview() { return nullptr; }
    133131
    134132    void setWidth(int);
  • trunk/Source/WebCore/platform/mediastream/VideoTrackPrivateMediaStream.h

    r210319 r210621  
    4141    }
    4242
    43     Kind kind() const override { return Kind::Main; }
    44     AtomicString id() const override { return m_id; }
    45     AtomicString label() const override { return m_label; }
    46     AtomicString language() const override { return emptyAtom; }
    47     int trackIndex() const override { return m_index; }
    48 
    4943    void setTrackIndex(int index) { m_index = index; }
    5044
    5145    MediaStreamTrackPrivate& streamTrack() { return m_streamTrack.get(); }
     46
     47    MediaTime timelineOffset() const { return m_timelineOffset; }
     48    void setTimelineOffset(const MediaTime& offset) { m_timelineOffset = offset; }
    5249
    5350private:
     
    5653        , m_id(track.id())
    5754        , m_label(track.label())
     55        , m_timelineOffset(MediaTime::invalidTime())
    5856    {
    5957    }
     58
     59    Kind kind() const final { return Kind::Main; }
     60    AtomicString id() const final { return m_id; }
     61    AtomicString label() const final { return m_label; }
     62    AtomicString language() const final { return emptyAtom; }
     63    int trackIndex() const final { return m_index; }
    6064
    6165    Ref<MediaStreamTrackPrivate> m_streamTrack;
     
    6367    AtomicString m_label;
    6468    int m_index { 0 };
     69    MediaTime m_timelineOffset;
    6570};
    6671
  • trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h

    r208851 r210621  
    11/*
    2  * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    6868    void updateSettings(RealtimeMediaSourceSettings&) override;
    6969    AudioSourceProvider* audioSourceProvider() override;
    70     RefPtr<AVMediaSourcePreview> createPreview() final;
    7170
    7271    RetainPtr<AVCaptureConnection> m_audioConnection;
  • trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm

    r210105 r210621  
    11/*
    2  * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    3131#import "Logging.h"
    3232#import "MediaConstraints.h"
    33 #import "NotImplemented.h"
     33#import "MediaSampleAVFObjC.h"
    3434#import "RealtimeMediaSourceSettings.h"
    3535#import "SoftLinking.h"
     
    5050typedef AVCaptureOutput AVCaptureOutputType;
    5151
    52 #if !PLATFORM(IOS)
    53 typedef AVCaptureAudioPreviewOutput AVCaptureAudioPreviewOutputType;
    54 #endif
    55 
    5652SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
    5753
    5854SOFT_LINK_CLASS(AVFoundation, AVCaptureAudioChannel)
    5955SOFT_LINK_CLASS(AVFoundation, AVCaptureAudioDataOutput)
    60 SOFT_LINK_CLASS(AVFoundation, AVCaptureAudioPreviewOutput)
    6156SOFT_LINK_CLASS(AVFoundation, AVCaptureConnection)
    6257SOFT_LINK_CLASS(AVFoundation, AVCaptureDevice)
    6358SOFT_LINK_CLASS(AVFoundation, AVCaptureDeviceInput)
    6459SOFT_LINK_CLASS(AVFoundation, AVCaptureOutput)
    65 
    66 #define AVCaptureAudioPreviewOutput getAVCaptureAudioPreviewOutputClass()
    6760
    6861#define AVCaptureAudioChannel getAVCaptureAudioChannelClass()
     
    8174namespace WebCore {
    8275
    83 #if !PLATFORM(IOS)
    84 class AVAudioSourcePreview: public AVMediaSourcePreview {
    85 public:
    86     static RefPtr<AVMediaSourcePreview> create(AVCaptureSession *, AVAudioCaptureSource*);
    87 
    88 private:
    89     AVAudioSourcePreview(AVCaptureSession *, AVAudioCaptureSource*);
    90 
    91     void invalidate() final;
    92 
    93     void play() const final;
    94     void pause() const final;
    95     void setVolume(double) const final;
    96     void setEnabled(bool) final;
    97     PlatformLayer* platformLayer() const final { return nullptr; }
    98 
    99     void updateState() const;
    100 
    101     RetainPtr<AVCaptureAudioPreviewOutputType> m_audioPreviewOutput;
    102     mutable double m_volume { 1 };
    103     mutable bool m_paused { false };
    104     mutable bool m_enabled { true };
    105 };
    106 
    107 RefPtr<AVMediaSourcePreview> AVAudioSourcePreview::create(AVCaptureSession *session, AVAudioCaptureSource* parent)
    108 {
    109     return adoptRef(new AVAudioSourcePreview(session, parent));
    110 }
    111 
    112 AVAudioSourcePreview::AVAudioSourcePreview(AVCaptureSession *session, AVAudioCaptureSource* parent)
    113     : AVMediaSourcePreview(parent)
    114 {
    115     m_audioPreviewOutput = adoptNS([allocAVCaptureAudioPreviewOutputInstance() init]);
    116     setVolume(1);
    117     [session addOutput:m_audioPreviewOutput.get()];
    118 }
    119 
    120 void AVAudioSourcePreview::invalidate()
    121 {
    122     m_audioPreviewOutput = nullptr;
    123     AVMediaSourcePreview::invalidate();
    124 }
    125 
    126 void AVAudioSourcePreview::play() const
    127 {
    128     m_paused = false;
    129     updateState();
    130 }
    131 
    132 void AVAudioSourcePreview::pause() const
    133 {
    134     m_paused = true;
    135     updateState();
    136 }
    137 
    138 void AVAudioSourcePreview::setEnabled(bool enabled)
    139 {
    140     m_enabled = enabled;
    141     updateState();
    142 }
    143 
    144 void AVAudioSourcePreview::setVolume(double volume) const
    145 {
    146     m_volume = volume;
    147     m_audioPreviewOutput.get().volume = volume;
    148 }
    149 
    150 void AVAudioSourcePreview::updateState() const
    151 {
    152     m_audioPreviewOutput.get().volume = (!m_enabled || m_paused) ? 0 : m_volume;
    153 }
    154 #endif
    155 
    15676RefPtr<AVMediaCaptureSource> AVAudioCaptureSource::create(AVCaptureDeviceTypedef* device, const AtomicString& id, const MediaConstraints* constraints, String& invalidConstraint)
    15777{
     
    191111void AVAudioCaptureSource::updateSettings(RealtimeMediaSourceSettings& settings)
    192112{
    193     // FIXME: use [AVCaptureAudioPreviewOutput volume] for volume
     113    // FIXME: support volume
    194114
    195115    settings.setDeviceId(id());
     
    277197        return;
    278198
     199    RetainPtr<CMSampleBufferRef> buffer = sampleBuffer;
     200    scheduleDeferredTask([this, buffer] {
     201        mediaDataUpdated(MediaSampleAVFObjC::create(buffer.get()));
     202    });
     203
    279204    std::unique_lock<Lock> lock(m_lock, std::try_to_lock);
    280205    if (!lock.owns_lock()) {
     
    305230}
    306231
    307 RefPtr<AVMediaSourcePreview> AVAudioCaptureSource::createPreview()
    308 {
    309 #if !PLATFORM(IOS)
    310     return AVAudioSourcePreview::create(session(), this);
    311 #else
    312     return nullptr;
    313 #endif
    314 }
    315    
    316232} // namespace WebCore
    317233
  • trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.h

    r208851 r210621  
    11/*
    2  * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    4848class AVMediaCaptureSource;
    4949
    50 class AVMediaSourcePreview: public RealtimeMediaSourcePreview {
    51 public:
    52     virtual ~AVMediaSourcePreview();
    53 
    54     void invalidate() override;
    55 
    56 protected:
    57     AVMediaSourcePreview(AVMediaCaptureSource*);
    58 
    59 private:
    60     WeakPtr<AVMediaCaptureSource> m_parent;
    61 };
    62 
    6350class AVMediaCaptureSource : public RealtimeMediaSource {
    6451public:
     
    7663    void stopProducingData() final;
    7764    bool isProducingData() const final { return m_isRunning; }
    78 
    79     RefPtr<RealtimeMediaSourcePreview> preview() final;
    80     void removePreview(AVMediaSourcePreview*);
    81     WeakPtr<AVMediaCaptureSource> createWeakPtr() { return m_weakPtrFactory.createWeakPtr(); }
    8265
    8366protected:
     
    10083    void setAudioSampleBufferDelegate(AVCaptureAudioDataOutput*);
    10184
    102     virtual RefPtr<AVMediaSourcePreview> createPreview() = 0;
    103 
    10485private:
    10586    void setupSession();
     
    11899    RetainPtr<AVCaptureSession> m_session;
    119100    RetainPtr<AVCaptureDevice> m_device;
    120     Vector<WeakPtr<RealtimeMediaSourcePreview>> m_previews;
    121     WeakPtrFactory<AVMediaCaptureSource> m_weakPtrFactory;
    122101    bool m_isRunning { false};
    123102};
  • trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm

    r210105 r210621  
    130130    , m_objcObserver(adoptNS([[WebCoreAVMediaCaptureSourceObserver alloc] initWithCallback:this]))
    131131    , m_device(device)
    132     , m_weakPtrFactory(this)
    133132{
    134133    setName(device.localizedName);
     
    241240        [m_session removeObserver:m_objcObserver.get() forKeyPath:keyName];
    242241
    243     for (const auto& preview : m_previews) {
    244         if (preview)
    245             preview->invalidate();
    246     }
    247     m_previews.clear();
    248 
    249242    shutdownCaptureSession();
    250243    m_session = nullptr;
     
    276269    ASSERT_NOT_REACHED();
    277270    return nullptr;
    278 }
    279 
    280 RefPtr<RealtimeMediaSourcePreview> AVMediaCaptureSource::preview()
    281 {
    282     RefPtr<AVMediaSourcePreview> preview = createPreview();
    283     if (!preview)
    284         return nullptr;
    285 
    286     m_previews.append(preview->createWeakPtr());
    287     return preview.leakRef();
    288 }
    289 
    290 void AVMediaCaptureSource::removePreview(AVMediaSourcePreview* preview)
    291 {
    292     size_t index;
    293     for (index = 0; index < m_previews.size(); ++index) {
    294         if (m_previews[index].get() == preview)
    295             break;
    296     }
    297 
    298     if (index < m_previews.size())
    299         m_previews.remove(index);
    300 }
    301 
    302 AVMediaSourcePreview::AVMediaSourcePreview(AVMediaCaptureSource* parent)
    303     : m_parent(parent->createWeakPtr())
    304 {
    305 }
    306 
    307 AVMediaSourcePreview::~AVMediaSourcePreview()
    308 {
    309     if (m_parent)
    310         m_parent->removePreview(this);
    311 }
    312 
    313 void AVMediaSourcePreview::invalidate()
    314 {
    315     m_parent = nullptr;
    316     RealtimeMediaSourcePreview::invalidate();
    317271}
    318272
  • trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h

    r209188 r210621  
    8080    void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) final;
    8181
    82     RefPtr<AVMediaSourcePreview> createPreview() final;
    8382    RetainPtr<CGImageRef> currentFrameCGImage();
    8483    RefPtr<Image> currentFrameImage() final;
  • trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm

    r210105 r210621  
    11/*
    2  * Copyright (C) 2013-2015 Apple Inc. All rights reserved.
     2 * Copyright (C) 2013-2017 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    3939#import "PlatformLayer.h"
    4040#import "RealtimeMediaSourceCenter.h"
    41 #import "RealtimeMediaSourcePreview.h"
    4241#import "RealtimeMediaSourceSettings.h"
    4342#import "WebActionDisablingCALayerDelegate.h"
     
    102101using namespace WebCore;
    103102
    104 @interface WebCoreAVVideoCaptureSourceObserver : NSObject<CALayerDelegate> {
    105     AVVideoSourcePreview *_parent;
    106     BOOL _hasObserver;
    107 }
    108 
    109 - (void)setParent:(AVVideoSourcePreview *)parent;
    110 - (void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context;
    111 @end
    112 
    113103namespace WebCore {
    114 
    115 class AVVideoSourcePreview: public AVMediaSourcePreview {
    116 public:
    117     static RefPtr<AVMediaSourcePreview> create(AVCaptureSession*, AVCaptureDeviceTypedef*, AVVideoCaptureSource*);
    118 
    119     void backgroundLayerBoundsChanged();
    120     PlatformLayer* platformLayer() const final { return m_previewBackgroundLayer.get(); }
    121 
    122 private:
    123     AVVideoSourcePreview(AVCaptureSession*, AVCaptureDeviceTypedef*, AVVideoCaptureSource*);
    124 
    125     void invalidate() final;
    126 
    127     void play() const final;
    128     void pause() const final;
    129     void setVolume(double) const final { };
    130     void setEnabled(bool) final;
    131     void setPaused(bool) const;
    132 
    133     RetainPtr<AVCaptureVideoPreviewLayerType> m_previewLayer;
    134     RetainPtr<PlatformLayer> m_previewBackgroundLayer;
    135     RetainPtr<AVCaptureDeviceTypedef> m_device;
    136     RetainPtr<WebCoreAVVideoCaptureSourceObserver> m_objcObserver;
    137 };
    138 
    139 RefPtr<AVMediaSourcePreview> AVVideoSourcePreview::create(AVCaptureSession *session, AVCaptureDeviceTypedef* device, AVVideoCaptureSource* parent)
    140 {
    141     return adoptRef(new AVVideoSourcePreview(session, device, parent));
    142 }
    143 
    144 AVVideoSourcePreview::AVVideoSourcePreview(AVCaptureSession *session, AVCaptureDeviceTypedef* device, AVVideoCaptureSource* parent)
    145     : AVMediaSourcePreview(parent)
    146     , m_objcObserver(adoptNS([[WebCoreAVVideoCaptureSourceObserver alloc] init]))
    147 {
    148     m_device = device;
    149     m_previewLayer = adoptNS([allocAVCaptureVideoPreviewLayerInstance() initWithSession:session]);
    150     m_previewLayer.get().contentsGravity = kCAGravityResize;
    151     m_previewLayer.get().anchorPoint = CGPointZero;
    152     [m_previewLayer.get() setDelegate:[WebActionDisablingCALayerDelegate shared]];
    153 
    154     m_previewBackgroundLayer = adoptNS([[CALayer alloc] init]);
    155     m_previewBackgroundLayer.get().contentsGravity = kCAGravityResizeAspect;
    156     m_previewBackgroundLayer.get().anchorPoint = CGPointZero;
    157     m_previewBackgroundLayer.get().needsDisplayOnBoundsChange = YES;
    158     [m_previewBackgroundLayer.get() setDelegate:[WebActionDisablingCALayerDelegate shared]];
    159 
    160 #ifndef NDEBUG
    161     m_previewLayer.get().name = @"AVVideoCaptureSource preview layer";
    162     m_previewBackgroundLayer.get().name = @"AVVideoSourcePreview parent layer";
    163 #endif
    164 
    165     [m_previewBackgroundLayer addSublayer:m_previewLayer.get()];
    166 
    167     [m_objcObserver.get() setParent:this];
    168 }
    169 
    170 void AVVideoSourcePreview::backgroundLayerBoundsChanged()
    171 {
    172     if (m_previewBackgroundLayer && m_previewLayer)
    173         [m_previewLayer.get() setBounds:m_previewBackgroundLayer.get().bounds];
    174 }
    175 
    176 void AVVideoSourcePreview::invalidate()
    177 {
    178     [m_objcObserver.get() setParent:nil];
    179     m_objcObserver = nullptr;
    180     m_previewLayer = nullptr;
    181     m_previewBackgroundLayer = nullptr;
    182     m_device = nullptr;
    183     AVMediaSourcePreview::invalidate();
    184 }
    185 
    186 void AVVideoSourcePreview::play() const
    187 {
    188     setPaused(false);
    189 }
    190 
    191 void AVVideoSourcePreview::pause() const
    192 {
    193     setPaused(true);
    194 }
    195 
    196 void AVVideoSourcePreview::setPaused(bool paused) const
    197 {
    198     [m_device lockForConfiguration:nil];
    199     m_previewLayer.get().connection.enabled = !paused;
    200     [m_device unlockForConfiguration];
    201 }
    202 
    203 void AVVideoSourcePreview::setEnabled(bool enabled)
    204 {
    205     m_previewLayer.get().hidden = !enabled;
    206 }
    207104
    208105const OSType videoCaptureFormat = kCVPixelFormatType_32BGRA;
     
    513410
    514411    updateFramerate(sampleBuffer.get());
    515 
    516     CMSampleBufferRef newSampleBuffer = 0;
    517     CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer.get(), &newSampleBuffer);
    518     ASSERT(newSampleBuffer);
    519 
    520     CFArrayRef attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(newSampleBuffer, true);
    521     if (attachmentsArray) {
    522         for (CFIndex i = 0; i < CFArrayGetCount(attachmentsArray); ++i) {
    523             CFMutableDictionaryRef attachments = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachmentsArray, i);
    524             CFDictionarySetValue(attachments, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
    525         }
    526     }
    527 
    528     m_buffer = adoptCF(newSampleBuffer);
     412    m_buffer = sampleBuffer;
    529413    m_lastImage = nullptr;
    530414
     
    606490}
    607491
    608 RefPtr<AVMediaSourcePreview> AVVideoCaptureSource::createPreview()
    609 {
    610     return AVVideoSourcePreview::create(session(), device(), this);
    611 }
    612 
    613492NSString* AVVideoCaptureSource::bestSessionPresetForVideoDimensions(std::optional<int> width, std::optional<int> height) const
    614493{
     
    657536} // namespace WebCore
    658537
    659 @implementation WebCoreAVVideoCaptureSourceObserver
    660 
    661 static NSString * const KeyValueBoundsChangeKey = @"bounds";
    662 
    663 - (void)setParent:(AVVideoSourcePreview *)parent
    664 {
    665     if (_parent && _hasObserver && _parent->platformLayer()) {
    666         _hasObserver = false;
    667         [_parent->platformLayer() removeObserver:self forKeyPath:KeyValueBoundsChangeKey];
    668     }
    669 
    670     _parent = parent;
    671 
    672     if (_parent && _parent->platformLayer()) {
    673         _hasObserver = true;
    674         [_parent->platformLayer() addObserver:self forKeyPath:KeyValueBoundsChangeKey options:0 context:nullptr];
    675     }
    676 }
    677 
    678 - (void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
    679 {
    680     UNUSED_PARAM(context);
    681 
    682     if (!_parent)
    683         return;
    684 
    685     if ([[change valueForKey:NSKeyValueChangeNotificationIsPriorKey] boolValue])
    686         return;
    687 
    688 #if PLATFORM(IOS)
    689     WebThreadRun(^ {
    690         if ([keyPath isEqual:KeyValueBoundsChangeKey] && object == _parent->platformLayer())
    691             _parent->backgroundLayerBoundsChanged();
    692     });
    693 #else
    694     if ([keyPath isEqual:KeyValueBoundsChangeKey] && object == _parent->platformLayer())
    695         _parent->backgroundLayerBoundsChanged();
    696 #endif
    697 }
    698 
    699 @end
    700 
    701538#endif // ENABLE(MEDIA_STREAM)
  • trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm

    r208851 r210621  
    4949namespace WebCore {
    5050
     51static const int videoSampleRate = 90000;
     52
    5153RefPtr<MockRealtimeVideoSource> MockRealtimeVideoSource::create(const String& name, const MediaConstraints* constraints)
    5254{
     
    7577        return nullptr;
    7678
    77     CMSampleTimingInfo timingInfo;
    78 
    79     timingInfo.presentationTimeStamp = CMTimeMake(elapsedTime() * 1000, 1000);
    80     timingInfo.decodeTimeStamp = kCMTimeInvalid;
    81     timingInfo.duration = kCMTimeInvalid;
     79    CMTime sampleTime = CMTimeMake((elapsedTime() + .1) * videoSampleRate, videoSampleRate);
      80    CMSampleTimingInfo timingInfo = { kCMTimeInvalid, sampleTime, sampleTime }; // { duration, presentationTimeStamp, decodeTimeStamp }
    8281
    8382    CMVideoFormatDescriptionRef formatDescription = nullptr;
     
    101100RetainPtr<CVPixelBufferRef> MockRealtimeVideoSourceMac::pixelBufferFromCGImage(CGImageRef image) const
    102101{
     102    static CGColorSpaceRef deviceRGBColorSpace = CGColorSpaceCreateDeviceRGB();
     103
    103104    CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image));
    104105    CFDictionaryRef options = (__bridge CFDictionaryRef) @{
     
    113114    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    114115    void* data = CVPixelBufferGetBaseAddress(pixelBuffer);
    115     auto rgbColorSpace = adoptCF(CGColorSpaceCreateDeviceRGB());
    116     auto context = adoptCF(CGBitmapContextCreate(data, frameSize.width, frameSize.height, 8, CVPixelBufferGetBytesPerRow(pixelBuffer), rgbColorSpace.get(), (CGBitmapInfo) kCGImageAlphaNoneSkipFirst));
     116    auto context = adoptCF(CGBitmapContextCreate(data, frameSize.width, frameSize.height, 8, CVPixelBufferGetBytesPerRow(pixelBuffer), deviceRGBColorSpace, (CGBitmapInfo) kCGImageAlphaNoneSkipFirst));
    117117    CGContextDrawImage(context.get(), CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image);
    118118    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
  • trunk/Source/WebKit2/WebProcess/com.apple.WebProcess.sb.in

    r210076 r210621  
    449449        (iokit-user-client-class "IOUSBInterfaceUserClientV2"))
    450450    (allow device-camera))
     451
     452;; @@@@@
     453(allow device-microphone)
     454;; @@@@@
     455