Changeset 210621 in webkit
- Timestamp: Jan 11, 2017 9:22:32 PM
- Location: trunk/Source
- Files: 1 deleted, 24 edited
trunk/Source/WebCore/ChangeLog
(r210616 → r210621)

2017-01-11  Eric Carlson  <eric.carlson@apple.com>

    [MediaStream, Mac] Render media stream audio buffers
    https://bugs.webkit.org/show_bug.cgi?id=159836
    <rdar://problem/27380390>

    Reviewed by Jer Noble.

    No new tests, it isn't possible to test audio rendering directly. A follow-up patch will
    add a mock audio source that will enable audio testing.

    * platform/cf/CoreMediaSoftLink.cpp: Include new functions used.
    * platform/cf/CoreMediaSoftLink.h:

    * WebCore.xcodeproj/project.pbxproj: Remove references to the deleted previews.

    * platform/Logging.h: Add MediaCaptureSamples.

    * platform/MediaSample.h: Add outputPresentationTime and outputDuration.

    * platform/cf/CoreMediaSoftLink.cpp: Add CMSampleBufferGetOutputDuration, CMSampleBufferGetOutputPresentationTimeStamp,
    CMTimeConvertScale, CMTimebaseGetEffectiveRate, CMAudioSampleBufferCreateWithPacketDescriptions,
    CMSampleBufferSetDataBufferFromAudioBufferList, CMSampleBufferSetDataReady,
    CMAudioFormatDescriptionCreate, CMClockGetHostTimeClock, and CMClockGetTime.
    * platform/cf/CoreMediaSoftLink.h:

    Create and use an AVSampleBufferAudioRenderer for each audio stream track, when it is available,
    to render audio samples. Store the offset between the first sample received from a track's
    output presentation time and the synchronizer time so we can adjust sample timestamps to be
    relative to the synchronizer's timeline regardless of their source. Remove the use of source
    previews because not all sources will have them.

    * platform/graphics/avfoundation/MediaSampleAVFObjC.h:
    * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h:
    * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm:

    Add an ObjC helper to catch renderer status changes.
    (-[WebAVSampleBufferStatusChangeListener initWithParent:]):
    (-[WebAVSampleBufferStatusChangeListener dealloc]):
    (-[WebAVSampleBufferStatusChangeListener invalidate]):
    (-[WebAVSampleBufferStatusChangeListener beginObservingLayer:]):
    (-[WebAVSampleBufferStatusChangeListener stopObservingLayer:]):
    (-[WebAVSampleBufferStatusChangeListener beginObservingRenderer:]):
    (-[WebAVSampleBufferStatusChangeListener stopObservingRenderer:]):
    (-[WebAVSampleBufferStatusChangeListener observeValueForKeyPath:ofObject:change:context:]):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::~MediaPlayerPrivateMediaStreamAVFObjC):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::removeOldSamplesFromPendingQueue):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::addSampleToPendingQueue):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::updateSampleTimes):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForVideoData):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::platformLayer):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::play):
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSampleBufferFromTrack): Deleted.
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForMediaData): Deleted.
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSampleBuffer): Deleted.
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::prepareVideoSampleBufferFromTrack): Deleted.
    (WebCore::MediaPlayerPrivateMediaStreamAVFObjC::internalSetVolume): Deleted.

    * platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm:
    (WebCore::MediaSampleAVFObjC::outputPresentationTime): New.
    (WebCore::MediaSampleAVFObjC::outputDuration): New.
    (WebCore::MediaSampleAVFObjC::dump): Log outputPresentationTime.

    * platform/mediastream/AudioTrackPrivateMediaStream.h: Add timelineOffset.

    * platform/mediastream/MediaStreamTrackPrivate.cpp:
    (WebCore::MediaStreamTrackPrivate::setEnabled): No more m_preview.
    (WebCore::MediaStreamTrackPrivate::endTrack): Ditto.
    (WebCore::MediaStreamTrackPrivate::preview): Deleted.
    * platform/mediastream/MediaStreamTrackPrivate.h:

    * platform/mediastream/RealtimeMediaSource.h:
    (WebCore::RealtimeMediaSource::preview): Deleted.

    * platform/mediastream/RealtimeMediaSourcePreview.h: Removed.

    * platform/mediastream/VideoTrackPrivateMediaStream.h: Add timelineOffset.

    * platform/mediastream/mac/AVAudioCaptureSource.h:
    * platform/mediastream/mac/AVAudioCaptureSource.mm:
    (WebCore::AVAudioCaptureSource::updateSettings):
    (WebCore::AVAudioCaptureSource::captureOutputDidOutputSampleBufferFromConnection): Pass the
    sample buffer up the chain.
    (WebCore::AVAudioSourcePreview::create): Deleted.
    (WebCore::AVAudioSourcePreview::AVAudioSourcePreview): Deleted.
    (WebCore::AVAudioSourcePreview::invalidate): Deleted.
    (WebCore::AVAudioSourcePreview::play): Deleted.
    (WebCore::AVAudioSourcePreview::pause): Deleted.
    (WebCore::AVAudioSourcePreview::setEnabled): Deleted.
    (WebCore::AVAudioSourcePreview::setVolume): Deleted.
    (WebCore::AVAudioSourcePreview::updateState): Deleted.
    (WebCore::AVAudioCaptureSource::createPreview): Deleted.

    * platform/mediastream/mac/AVMediaCaptureSource.h:
    (WebCore::AVMediaSourcePreview): Deleted.
    (WebCore::AVMediaCaptureSource::createWeakPtr): Deleted.

    * platform/mediastream/mac/AVMediaCaptureSource.mm:
    (WebCore::AVMediaCaptureSource::AVMediaCaptureSource): No more preview.
    (WebCore::AVMediaCaptureSource::reset):
    (WebCore::AVMediaCaptureSource::preview): Deleted.
    (WebCore::AVMediaCaptureSource::removePreview): Deleted.
    (WebCore::AVMediaSourcePreview::AVMediaSourcePreview): Deleted.
    (WebCore::AVMediaSourcePreview::~AVMediaSourcePreview): Deleted.
    (WebCore::AVMediaSourcePreview::invalidate): Deleted.

    * platform/mediastream/mac/AVVideoCaptureSource.h:
    * platform/mediastream/mac/AVVideoCaptureSource.mm:
    (WebCore::AVVideoCaptureSource::processNewFrame): Don't set the "display immediately" attachment.
    (WebCore::AVVideoSourcePreview::create): Deleted.
    (WebCore::AVVideoSourcePreview::AVVideoSourcePreview): Deleted.
    (WebCore::AVVideoSourcePreview::backgroundLayerBoundsChanged): Deleted.
    (WebCore::AVVideoSourcePreview::invalidate): Deleted.
    (WebCore::AVVideoSourcePreview::play): Deleted.
    (WebCore::AVVideoSourcePreview::pause): Deleted.
    (WebCore::AVVideoSourcePreview::setPaused): Deleted.
    (WebCore::AVVideoSourcePreview::setEnabled): Deleted.
    (WebCore::AVVideoCaptureSource::createPreview): Deleted.
    (-[WebCoreAVVideoCaptureSourceObserver setParent:]): Deleted.
    (-[WebCoreAVVideoCaptureSourceObserver observeValueForKeyPath:ofObject:change:context:]): Deleted.

    * platform/mediastream/mac/MockRealtimeVideoSourceMac.mm:
    (WebCore::MockRealtimeVideoSourceMac::CMSampleBufferFromPixelBuffer): Use a more typical video
    time scale. Set the sample decode time.
    (WebCore::MockRealtimeVideoSourceMac::pixelBufferFromCGImage): Use a static for colorspace
    instead of fetching it for every frame.

    * platform/mock/mediasource/MockSourceBufferPrivate.cpp: Add outputPresentationTime and outputDuration.
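The timestamp handling the ChangeLog describes can be summarized with a short sketch. This is a simplified model, not the patch's code: it uses plain doubles in place of WebCore::MediaTime, and the names (SampleTimes, TrackTimeline, adjustToSynchronizerTimeline) are illustrative; the real logic lives in MediaPlayerPrivateMediaStreamAVFObjC::calculateTimelineOffset() and updateSampleTimes() in the diff below.

    // Simplified sketch of the per-track timeline adjustment (assumed names, plain doubles).
    struct SampleTimes {
        double outputPresentationTime;    // CMSampleBufferGetOutputPresentationTimeStamp
        double presentationTime;          // CMSampleBufferGetPresentationTimeStamp
        bool hasOutputPresentationTime;   // output PTS may be invalid for some sources
    };

    struct TrackTimeline {
        bool offsetKnown { false };
        double offset { 0 };
    };

    // The first sample seen for a track establishes the offset between the sample's
    // timeline and the synchronizer's timeline; every later sample is shifted by it.
    // rendererLatency mirrors the small constant added so the first adjusted sample
    // lands slightly ahead of the synchronizer time (0.02 s in the patch).
    double adjustToSynchronizerTimeline(TrackTimeline& track, const SampleTimes& sample,
        double synchronizerTime, double rendererLatency = 0.02)
    {
        if (!track.offsetKnown) {
            double sampleTime = sample.hasOutputPresentationTime
                ? sample.outputPresentationTime : sample.presentationTime;
            track.offset = synchronizerTime - sampleTime + rendererLatency;
            track.offsetKnown = true;
        }
        return sample.presentationTime + track.offset; // offsetTimestampsBy() in the patch
    }

In the actual patch, the cached offset is reset to MediaTime::invalidTime() whenever a renderer or the display layer reports AVQueuedSampleBufferRenderingStatusRendering (see rendererStatusDidChange and layerStatusDidChange), so it is recomputed after a flush.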
trunk/Source/WebCore/Modules/webaudio/ScriptProcessorNode.cpp
(r207050 → r210621)

     callOnMainThread([this] {
+        if (!m_hasAudioProcessListener)
+            return;
+
         fireProcessEvent();
trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj
r210588 r210621 280 280 07C1C0E51BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h in Headers */ = {isa = PBXBuildFile; fileRef = 07C1C0E41BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h */; settings = {ATTRIBUTES = (Private, ); }; }; 281 281 07CE77D516712A6A00C55A47 /* InbandTextTrackPrivateClient.h in Headers */ = {isa = PBXBuildFile; fileRef = 07CE77D416712A6A00C55A47 /* InbandTextTrackPrivateClient.h */; settings = {ATTRIBUTES = (Private, ); }; }; 282 07D1503B1DDB6965008F7598 /* RealtimeMediaSourcePreview.h in Headers */ = {isa = PBXBuildFile; fileRef = 07D1503A1DDB6688008F7598 /* RealtimeMediaSourcePreview.h */; settings = {ATTRIBUTES = (Private, ); }; };283 282 07D637401BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.h in Headers */ = {isa = PBXBuildFile; fileRef = 07D6373E1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.h */; }; 284 283 07D637411BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.mm in Sources */ = {isa = PBXBuildFile; fileRef = 07D6373F1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.mm */; }; … … 7254 7253 07C8AD121D073D630087C5CE /* AVFoundationMIMETypeCache.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AVFoundationMIMETypeCache.h; sourceTree = "<group>"; }; 7255 7254 07CE77D416712A6A00C55A47 /* InbandTextTrackPrivateClient.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = InbandTextTrackPrivateClient.h; sourceTree = "<group>"; }; 7256 07D1503A1DDB6688008F7598 /* RealtimeMediaSourcePreview.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RealtimeMediaSourcePreview.h; sourceTree = "<group>"; };7257 7255 07D6373E1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebAudioSourceProviderAVFObjC.h; sourceTree = "<group>"; }; 7258 7256 07D6373F1BB0B11300256CE9 /* WebAudioSourceProviderAVFObjC.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebAudioSourceProviderAVFObjC.mm; sourceTree = "<group>"; }; … … 15201 15199 4A0FFA9F1AAF5EA20062803B /* RealtimeMediaSourceCenter.cpp */, 15202 15200 4A0FFAA01AAF5EA20062803B /* RealtimeMediaSourceCenter.h */, 15203 07D1503A1DDB6688008F7598 /* RealtimeMediaSourcePreview.h */,15204 15201 4A4F656E1AA997F100E38CDD /* RealtimeMediaSourceSettings.cpp */, 15205 15202 4A4F656F1AA997F100E38CDD /* RealtimeMediaSourceSettings.h */, … … 27227 27224 4A0FFAA21AAF5EA20062803B /* RealtimeMediaSourceCenter.h in Headers */, 27228 27225 4A0FFAA61AAF5EF60062803B /* RealtimeMediaSourceCenterMac.h in Headers */, 27229 07D1503B1DDB6965008F7598 /* RealtimeMediaSourcePreview.h in Headers */,27230 27226 4A4F65741AA997F100E38CDD /* RealtimeMediaSourceSettings.h in Headers */, 27231 27227 07C1C0E51BFB60ED00BD2256 /* RealtimeMediaSourceSupportedConstraints.h in Headers */, -
trunk/Source/WebCore/platform/Logging.h
(r209873 → r210621)

     M(MediaSource) \
     M(MediaSourceSamples) \
+    M(MediaCaptureSamples) \
     M(MemoryPressure) \
     M(Network) \
trunk/Source/WebCore/platform/MediaSample.h
(r207694 → r210621)

     virtual MediaTime presentationTime() const = 0;
+    virtual MediaTime outputPresentationTime() const { return presentationTime(); }
     virtual MediaTime decodeTime() const = 0;
     virtual MediaTime duration() const = 0;
+    virtual MediaTime outputDuration() const { return duration(); }
     virtual AtomicString trackID() const = 0;
     virtual void setTrackID(const String&) = 0;
trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp
r208444 r210621 86 86 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetImageBuffer, CVImageBufferRef, (CMSampleBufferRef sbuf), (sbuf)) 87 87 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf)) 88 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetOutputDuration, CMTime, (CMSampleBufferRef sbuf), (sbuf)) 89 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetOutputPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf)) 88 90 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetSampleAttachmentsArray, CFArrayRef, (CMSampleBufferRef sbuf, Boolean createIfNecessary), (sbuf, createIfNecessary)) 89 91 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetSampleTimingInfoArray, OSStatus, (CMSampleBufferRef sbuf, CMItemCount timingArrayEntries, CMSampleTimingInfo *timingArrayOut, CMItemCount *timingArrayEntriesNeededOut), (sbuf, timingArrayEntries, timingArrayOut, timingArrayEntriesNeededOut)) 92 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeConvertScale, CMTime, (CMTime time, int32_t newTimescale, CMTimeRoundingMethod method), (time, newTimescale, method)) 90 93 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetTotalSampleSize, size_t, (CMSampleBufferRef sbuf), (sbuf)) 91 94 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSetAttachment, void, (CMAttachmentBearerRef target, CFStringRef key, CFTypeRef value, CMAttachmentMode attachmentMode), (target, key, value, attachmentMode)) … … 94 97 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetRate, OSStatus, (CMTimebaseRef timebase, Float64 rate), (timebase, rate)) 95 98 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (CMTimebaseRef timebase, CMTime time), (timebase, time)) 99 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase)) 96 100 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator)) 97 101 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionCreateForImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef* outDesc), (allocator, imageBuffer, outDesc)) … … 115 119 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferCopySampleBufferForRange, OSStatus, (CFAllocatorRef allocator, CMSampleBufferRef sbuf, CFRange sampleRange, CMSampleBufferRef* sBufOut), (allocator, sbuf, sampleRange, sBufOut)) 116 120 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetSampleSizeArray, OSStatus, (CMSampleBufferRef sbuf, CMItemCount sizeArrayEntries, size_t* sizeArrayOut, CMItemCount* sizeArrayEntriesNeededOut), (sbuf, sizeArrayEntries, sizeArrayOut, sizeArrayEntriesNeededOut)) 121 122 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMAudioSampleBufferCreateWithPacketDescriptions, OSStatus, (CFAllocatorRef allocator, CMBlockBufferRef dataBuffer, Boolean dataReady, CMSampleBufferMakeDataReadyCallback makeDataReadyCallback, void *makeDataReadyRefcon, CMFormatDescriptionRef formatDescription, CMItemCount numSamples, CMTime sbufPTS, const AudioStreamPacketDescription *packetDescriptions, CMSampleBufferRef *sBufOut), (allocator, dataBuffer, dataReady, makeDataReadyCallback, makeDataReadyRefcon, formatDescription, numSamples, sbufPTS, packetDescriptions, sBufOut)) 123 
SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferSetDataBufferFromAudioBufferList, OSStatus, (CMSampleBufferRef sbuf, CFAllocatorRef bbufStructAllocator, CFAllocatorRef bbufMemoryAllocator, uint32_t flags, const AudioBufferList *bufferList), (sbuf, bbufStructAllocator, bbufMemoryAllocator, flags, bufferList)) 124 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferSetDataReady, OSStatus, (CMSampleBufferRef sbuf), (sbuf)) 125 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMAudioFormatDescriptionCreate, OSStatus, (CFAllocatorRef allocator, const AudioStreamBasicDescription* asbd, size_t layoutSize, const AudioChannelLayout* layout, size_t magicCookieSize, const void* magicCookie, CFDictionaryRef extensions, CMAudioFormatDescriptionRef* outDesc), (allocator, asbd, layoutSize, layout, magicCookieSize, magicCookie, extensions, outDesc)) 126 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMClockGetHostTimeClock, CMClockRef, (void), ()) 127 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock)) 117 128 #endif // PLATFORM(COCOA) 118 129 -
trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h
r208444 r210621 48 48 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetSampleTimingInfo, OSStatus, (CMSampleBufferRef sbuf, CMItemIndex sampleIndex, CMSampleTimingInfo* timingInfoOut), (sbuf, sampleIndex, timingInfoOut)) 49 49 #define CMSampleBufferGetSampleTimingInfo softLink_CoreMedia_CMSampleBufferGetSampleTimingInfo 50 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeConvertScale, CMTime, (CMTime time, int32_t newTimescale, CMTimeRoundingMethod method), (time, newTimescale, method)) 51 #define CMTimeConvertScale softLink_CoreMedia_CMTimeConvertScale 50 52 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeAdd, CMTime, (CMTime time1, CMTime time2), (time1, time2)) 51 53 #define CMTimeAdd softLink_CoreMedia_CMTimeAdd … … 138 140 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf)) 139 141 #define CMSampleBufferGetPresentationTimeStamp softLink_CoreMedia_CMSampleBufferGetPresentationTimeStamp 142 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetOutputDuration, CMTime, (CMSampleBufferRef sbuf), (sbuf)) 143 #define CMSampleBufferGetOutputDuration softLink_CoreMedia_CMSampleBufferGetOutputDuration 144 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetOutputPresentationTimeStamp, CMTime, (CMSampleBufferRef sbuf), (sbuf)) 145 #define CMSampleBufferGetOutputPresentationTimeStamp softLink_CoreMedia_CMSampleBufferGetOutputPresentationTimeStamp 140 146 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetSampleAttachmentsArray, CFArrayRef, (CMSampleBufferRef sbuf, Boolean createIfNecessary), (sbuf, createIfNecessary)) 141 147 #define CMSampleBufferGetSampleAttachmentsArray softLink_CoreMedia_CMSampleBufferGetSampleAttachmentsArray … … 154 160 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (CMTimebaseRef timebase, CMTime time), (timebase, time)) 155 161 #define CMTimebaseSetTime softLink_CoreMedia_CMTimebaseSetTime 162 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase)) 163 #define CMTimebaseGetEffectiveRate softLink_CoreMedia_CMTimebaseGetEffectiveRate 156 164 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator)) 157 165 #define CMTimeCopyAsDictionary softLink_CoreMedia_CMTimeCopyAsDictionary … … 194 202 #define CMSampleBufferGetSampleSizeArray softLink_CoreMedia_CMSampleBufferGetSampleSizeArray 195 203 204 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMAudioSampleBufferCreateWithPacketDescriptions, OSStatus, (CFAllocatorRef allocator, CMBlockBufferRef dataBuffer, Boolean dataReady, CMSampleBufferMakeDataReadyCallback makeDataReadyCallback, void *makeDataReadyRefcon, CMFormatDescriptionRef formatDescription, CMItemCount numSamples, CMTime sbufPTS, const AudioStreamPacketDescription *packetDescriptions, CMSampleBufferRef *sBufOut), (allocator, dataBuffer, dataReady, makeDataReadyCallback, makeDataReadyRefcon, formatDescription, numSamples, sbufPTS, packetDescriptions, sBufOut)) 205 #define CMAudioSampleBufferCreateWithPacketDescriptions softLink_CoreMedia_CMAudioSampleBufferCreateWithPacketDescriptions 206 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferSetDataBufferFromAudioBufferList, OSStatus, (CMSampleBufferRef sbuf, CFAllocatorRef bbufStructAllocator, CFAllocatorRef bbufMemoryAllocator, uint32_t flags, const 
AudioBufferList *bufferList), (sbuf, bbufStructAllocator, bbufMemoryAllocator, flags, bufferList)) 207 #define CMSampleBufferSetDataBufferFromAudioBufferList softLink_CoreMedia_CMSampleBufferSetDataBufferFromAudioBufferList 208 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferSetDataReady, OSStatus, (CMSampleBufferRef sbuf), (sbuf)) 209 #define CMSampleBufferSetDataReady softLink_CoreMedia_CMSampleBufferSetDataReady 210 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMAudioFormatDescriptionCreate, OSStatus, (CFAllocatorRef allocator, const AudioStreamBasicDescription* asbd, size_t layoutSize, const AudioChannelLayout* layout, size_t magicCookieSize, const void* magicCookie, CFDictionaryRef extensions, CMAudioFormatDescriptionRef* outDesc), (allocator, asbd, layoutSize, layout, magicCookieSize, magicCookie, extensions, outDesc)) 211 #define CMAudioFormatDescriptionCreate softLink_CoreMedia_CMAudioFormatDescriptionCreate 212 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMClockGetHostTimeClock, CMClockRef, (void), ()) 213 #define CMClockGetHostTimeClock softLink_CoreMedia_CMClockGetHostTimeClock 214 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock)) 215 #define CMClockGetTime softLink_CoreMedia_CMClockGetTime 196 216 #endif // PLATFORM(COCOA) 197 217 -
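For context, each SOFT_LINK_FUNCTION_FOR_SOURCE/_FOR_HEADER pair above generates a stub that loads CoreMedia and resolves the symbol lazily, and the accompanying #define keeps the ordinary CoreMedia spelling at call sites. A minimal, hypothetical call site (the helper name below is illustrative and not part of the patch):

    #include "CoreMediaSoftLink.h"

    // Expands to softLink_CoreMedia_CMTimeConvertScale(...) via the #define above;
    // CoreMedia is loaded and the symbol resolved on the first call.
    static CMTime rescaleToTimescale(CMTime time, int32_t timescale)
    {
        return CMTimeConvertScale(time, timescale, kCMTimeRoundingMethod_Default);
    }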
trunk/Source/WebCore/platform/graphics/avfoundation/MediaSampleAVFObjC.h
(r207694 → r210621)

     MediaTime presentationTime() const override;
+    MediaTime outputPresentationTime() const override;
     MediaTime decodeTime() const override;
     MediaTime duration() const override;
+    MediaTime outputDuration() const override;

     AtomicString trackID() const override { return m_id; }
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.h
r208851 r210621 1 1 /* 2 * Copyright (C) 2015 Apple Inc. All rights reserved.2 * Copyright (C) 2015-2017 Apple Inc. All rights reserved. 3 3 * 4 4 * Redistribution and use in source and binary forms, with or without … … 40 40 OBJC_CLASS AVSampleBufferRenderSynchronizer; 41 41 OBJC_CLASS AVStreamSession; 42 OBJC_CLASS NSNumber; 43 OBJC_CLASS WebAVSampleBufferStatusChangeListener; 42 44 typedef struct opaqueCMSampleBuffer *CMSampleBufferRef; 43 45 … … 54 56 #endif 55 57 58 #if __has_include(<AVFoundation/AVSampleBufferRenderSynchronizer.h>) 59 #define USE_RENDER_SYNCHRONIZER 1 60 #endif 61 56 62 class MediaPlayerPrivateMediaStreamAVFObjC final : public MediaPlayerPrivateInterface, private MediaStreamPrivate::Observer, private MediaStreamTrackPrivate::Observer { 57 63 public: … … 76 82 void destroyLayer(); 77 83 84 void rendererStatusDidChange(AVSampleBufferAudioRenderer*, NSNumber*); 85 void layerStatusDidChange(AVSampleBufferDisplayLayer*, NSNumber*); 86 78 87 private: 79 88 // MediaPlayerPrivateInterface … … 98 107 99 108 void setVolume(float) override; 100 void internalSetVolume(float, bool);101 109 void setMuted(bool) override; 102 110 bool supportsMuting() const override { return true; } … … 123 131 void setSize(const IntSize&) override { /* No-op */ } 124 132 125 void enqueueAudioSampleBufferFromTrack(MediaStreamTrackPrivate&, MediaSample&); 126 127 void prepareVideoSampleBufferFromTrack(MediaStreamTrackPrivate&, MediaSample&); 128 void enqueueVideoSampleBuffer(MediaSample&); 133 void flushRenderers(); 134 135 using PendingSampleQueue = Deque<Ref<MediaSample>>; 136 void addSampleToPendingQueue(PendingSampleQueue&, MediaSample&); 137 void removeOldSamplesFromPendingQueue(PendingSampleQueue&); 138 139 void updateSampleTimes(MediaSample&, const MediaTime&, const char*); 140 MediaTime calculateTimelineOffset(const MediaSample&, double); 141 142 void enqueueVideoSample(MediaStreamTrackPrivate&, MediaSample&); 129 143 bool shouldEnqueueVideoSampleBuffer() const; 130 144 void flushAndRemoveVideoSampleBuffers(); 131 void requestNotificationWhenReadyForMediaData(); 145 void requestNotificationWhenReadyForVideoData(); 146 147 void enqueueAudioSample(MediaStreamTrackPrivate&, MediaSample&); 148 void createAudioRenderer(AtomicString); 149 void destroyAudioRenderer(AVSampleBufferAudioRenderer*); 150 void destroyAudioRenderer(AtomicString); 151 void destroyAudioRenderers(); 152 void requestNotificationWhenReadyForAudioData(AtomicString); 132 153 133 154 void paint(GraphicsContext&, const FloatRect&) override; … … 156 177 void updateTracks(); 157 178 void renderingModeChanged(); 179 void checkSelectedVideoTrack(); 158 180 159 181 void scheduleDeferredTask(Function<void ()>&&); … … 187 209 #endif 188 210 189 bool haveVideoLayer() const { return m_sampleBufferDisplayLayer || m_videoPreviewPlayer; } 211 MediaTime streamTime() const; 212 213 #if USE(RENDER_SYNCHRONIZER) 214 AudioSourceProvider* audioSourceProvider() final; 215 #endif 190 216 191 217 MediaPlayer* m_player { nullptr }; … … 193 219 RefPtr<MediaStreamPrivate> m_mediaStreamPrivate; 194 220 195 RefPtr< RealtimeMediaSourcePreview> m_videoPreviewPlayer;196 RefPtr<MediaStreamTrackPrivate> m_videoTrack; 197 221 RefPtr<MediaStreamTrackPrivate> m_activeVideoTrack; 222 223 RetainPtr<WebAVSampleBufferStatusChangeListener> m_statusChangeListener; 198 224 RetainPtr<AVSampleBufferDisplayLayer> m_sampleBufferDisplayLayer; 199 #if PLATFORM(MAC) 225 #if USE(RENDER_SYNCHRONIZER) 226 HashMap<String, RetainPtr<AVSampleBufferAudioRenderer>> m_audioRenderers; 
200 227 RetainPtr<AVSampleBufferRenderSynchronizer> m_synchronizer; 201 #endif 228 #else 229 std::unique_ptr<Clock> m_clock; 230 #endif 231 232 MediaTime m_pausedTime; 202 233 RetainPtr<CGImageRef> m_pausedImage; 203 double m_pausedTime { 0 };204 std::unique_ptr<Clock> m_clock;205 234 206 235 HashMap<String, RefPtr<AudioTrackPrivateMediaStream>> m_audioTrackMap; 207 236 HashMap<String, RefPtr<VideoTrackPrivateMediaStream>> m_videoTrackMap; 208 Deque<Ref<MediaSample>> m_sampleQueue; 237 PendingSampleQueue m_pendingVideoSampleQueue; 238 #if USE(RENDER_SYNCHRONIZER) 239 PendingSampleQueue m_pendingAudioSampleQueue; 240 #endif 209 241 210 242 MediaPlayer::NetworkState m_networkState { MediaPlayer::Empty }; … … 220 252 bool m_hasReceivedMedia { false }; 221 253 bool m_isFrameDisplayed { false }; 254 bool m_pendingSelectedTrackCheck { false }; 222 255 223 256 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE) -
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaStreamAVFObjC.mm
r210319 r210621 1 1 /* 2 * Copyright (C) 201 3-2017 Apple Inc. All rights reserved.2 * Copyright (C) 2015-2017 Apple Inc. All rights reserved. 3 3 * 4 4 * Redistribution and use in source and binary forms, with or without … … 53 53 SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) 54 54 55 SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferAudioRenderer) 55 56 SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferDisplayLayer) 56 57 SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferRenderSynchronizer) 57 58 59 SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmSpectral, NSString*) 60 SOFT_LINK_CONSTANT(AVFoundation, AVAudioTimePitchAlgorithmVarispeed, NSString*) 61 62 #define AVAudioTimePitchAlgorithmSpectral getAVAudioTimePitchAlgorithmSpectral() 63 #define AVAudioTimePitchAlgorithmVarispeed getAVAudioTimePitchAlgorithmVarispeed() 64 65 using namespace WebCore; 66 67 @interface WebAVSampleBufferStatusChangeListener : NSObject { 68 MediaPlayerPrivateMediaStreamAVFObjC* _parent; 69 Vector<RetainPtr<AVSampleBufferDisplayLayer>> _layers; 70 Vector<RetainPtr<AVSampleBufferAudioRenderer>> _renderers; 71 } 72 73 - (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)callback; 74 - (void)invalidate; 75 - (void)beginObservingLayer:(AVSampleBufferDisplayLayer *)layer; 76 - (void)stopObservingLayer:(AVSampleBufferDisplayLayer *)layer; 77 - (void)beginObservingRenderer:(AVSampleBufferAudioRenderer *)renderer; 78 - (void)stopObservingRenderer:(AVSampleBufferAudioRenderer *)renderer; 79 @end 80 81 @implementation WebAVSampleBufferStatusChangeListener 82 83 - (id)initWithParent:(MediaPlayerPrivateMediaStreamAVFObjC*)parent 84 { 85 if (!(self = [super init])) 86 return nil; 87 88 _parent = parent; 89 return self; 90 } 91 92 - (void)dealloc 93 { 94 [self invalidate]; 95 [super dealloc]; 96 } 97 98 - (void)invalidate 99 { 100 for (auto& layer : _layers) 101 [layer removeObserver:self forKeyPath:@"status"]; 102 _layers.clear(); 103 104 for (auto& renderer : _renderers) 105 [renderer removeObserver:self forKeyPath:@"status"]; 106 _renderers.clear(); 107 108 [[NSNotificationCenter defaultCenter] removeObserver:self]; 109 110 _parent = nullptr; 111 } 112 113 - (void)beginObservingLayer:(AVSampleBufferDisplayLayer*)layer 114 { 115 ASSERT(_parent); 116 ASSERT(!_layers.contains(layer)); 117 118 _layers.append(layer); 119 [layer addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nullptr]; 120 } 121 122 - (void)stopObservingLayer:(AVSampleBufferDisplayLayer*)layer 123 { 124 ASSERT(_parent); 125 ASSERT(_layers.contains(layer)); 126 127 [layer removeObserver:self forKeyPath:@"status"]; 128 _layers.remove(_layers.find(layer)); 129 } 130 131 - (void)beginObservingRenderer:(AVSampleBufferAudioRenderer*)renderer 132 { 133 ASSERT(_parent); 134 ASSERT(!_renderers.contains(renderer)); 135 136 _renderers.append(renderer); 137 [renderer addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nullptr]; 138 } 139 140 - (void)stopObservingRenderer:(AVSampleBufferAudioRenderer*)renderer 141 { 142 ASSERT(_parent); 143 ASSERT(_renderers.contains(renderer)); 144 145 [renderer removeObserver:self forKeyPath:@"status"]; 146 _renderers.remove(_renderers.find(renderer)); 147 } 148 149 - (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context 150 { 151 UNUSED_PARAM(context); 152 UNUSED_PARAM(keyPath); 153 ASSERT(_parent); 154 155 RetainPtr<WebAVSampleBufferStatusChangeListener> protectedSelf = self; 156 
if ([object isKindOfClass:getAVSampleBufferDisplayLayerClass()]) { 157 RetainPtr<AVSampleBufferDisplayLayer> layer = (AVSampleBufferDisplayLayer *)object; 158 RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey]; 159 160 ASSERT(_layers.contains(layer.get())); 161 ASSERT([keyPath isEqualToString:@"status"]); 162 163 callOnMainThread([protectedSelf = WTFMove(protectedSelf), layer = WTFMove(layer), status = WTFMove(status)] { 164 protectedSelf->_parent->layerStatusDidChange(layer.get(), status.get()); 165 }); 166 167 } else if ([object isKindOfClass:getAVSampleBufferAudioRendererClass()]) { 168 RetainPtr<AVSampleBufferAudioRenderer> renderer = (AVSampleBufferAudioRenderer *)object; 169 RetainPtr<NSNumber> status = [change valueForKey:NSKeyValueChangeNewKey]; 170 171 ASSERT(_renderers.contains(renderer.get())); 172 ASSERT([keyPath isEqualToString:@"status"]); 173 174 callOnMainThread([protectedSelf = WTFMove(protectedSelf), renderer = WTFMove(renderer), status = WTFMove(status)] { 175 protectedSelf->_parent->rendererStatusDidChange(renderer.get(), status.get()); 176 }); 177 } else 178 ASSERT_NOT_REACHED(); 179 } 180 @end 181 58 182 namespace WebCore { 59 183 60 184 #pragma mark - 61 185 #pragma mark MediaPlayerPrivateMediaStreamAVFObjC 186 187 static const double rendererLatency = 0.02; 62 188 63 189 MediaPlayerPrivateMediaStreamAVFObjC::MediaPlayerPrivateMediaStreamAVFObjC(MediaPlayer* player) 64 190 : m_player(player) 65 191 , m_weakPtrFactory(this) 192 , m_statusChangeListener(adoptNS([[WebAVSampleBufferStatusChangeListener alloc] initWithParent:this])) 193 #if USE(RENDER_SYNCHRONIZER) 194 , m_synchronizer(adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init])) 195 #else 66 196 , m_clock(Clock::create()) 197 #endif 67 198 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE) 68 199 , m_videoFullscreenLayerManager(VideoFullscreenLayerManager::create()) … … 82 213 } 83 214 215 destroyLayer(); 216 #if USE(RENDER_SYNCHRONIZER) 217 destroyAudioRenderers(); 218 #endif 219 84 220 m_audioTrackMap.clear(); 85 221 m_videoTrackMap.clear(); 86 87 destroyLayer();88 222 } 89 223 … … 128 262 #pragma mark AVSampleBuffer Methods 129 263 130 void MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSampleBufferFromTrack(MediaStreamTrackPrivate&, MediaSample&) 131 { 132 // FIXME: https://bugs.webkit.org/show_bug.cgi?id=159836 133 } 134 135 void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForMediaData() 136 { 137 [m_sampleBufferDisplayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ { 138 [m_sampleBufferDisplayLayer stopRequestingMediaData]; 139 140 while (!m_sampleQueue.isEmpty()) { 141 if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) { 142 requestNotificationWhenReadyForMediaData(); 143 return; 144 } 145 146 auto sample = m_sampleQueue.takeFirst(); 147 enqueueVideoSampleBuffer(sample.get()); 148 } 149 }]; 150 } 151 152 void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSampleBuffer(MediaSample& sample) 153 { 264 void MediaPlayerPrivateMediaStreamAVFObjC::removeOldSamplesFromPendingQueue(PendingSampleQueue& queue) 265 { 266 MediaTime now = streamTime(); 267 while (!queue.isEmpty()) { 268 if (queue.first()->decodeTime() > now) 269 break; 270 queue.removeFirst(); 271 }; 272 } 273 274 void MediaPlayerPrivateMediaStreamAVFObjC::addSampleToPendingQueue(PendingSampleQueue& queue, MediaSample& sample) 275 { 276 removeOldSamplesFromPendingQueue(queue); 277 queue.append(sample); 278 } 279 280 void 
MediaPlayerPrivateMediaStreamAVFObjC::updateSampleTimes(MediaSample& sample, const MediaTime& timelineOffset, const char* loggingPrefix) 281 { 282 LOG(MediaCaptureSamples, "%s(%p): original sample = %s", loggingPrefix, this, toString(sample).utf8().data()); 283 sample.offsetTimestampsBy(timelineOffset); 284 LOG(MediaCaptureSamples, "%s(%p): adjusted sample = %s", loggingPrefix, this, toString(sample).utf8().data()); 285 286 #if !LOG_DISABLED 287 MediaTime now = streamTime(); 288 double delta = (sample.presentationTime() - now).toDouble(); 289 if (delta < 0) 290 LOG(Media, "%s(%p): *NOTE* audio sample at time %s is %f seconds late", loggingPrefix, this, toString(now).utf8().data(), -delta); 291 else if (delta < .01) 292 LOG(Media, "%s(%p): *NOTE* audio sample at time %s is only %s seconds early", loggingPrefix, this, toString(now).utf8().data(), delta); 293 else if (delta > .3) 294 LOG(Media, "%s(%p): *NOTE* audio sample at time %s is %s seconds early!", loggingPrefix, this, toString(now).utf8().data(), delta); 295 #else 296 UNUSED_PARAM(loggingPrefix); 297 #endif 298 299 } 300 301 MediaTime MediaPlayerPrivateMediaStreamAVFObjC::calculateTimelineOffset(const MediaSample& sample, double latency) 302 { 303 MediaTime sampleTime = sample.outputPresentationTime(); 304 if (!sampleTime || !sampleTime.isValid()) 305 sampleTime = sample.presentationTime(); 306 MediaTime timelineOffset = streamTime() - sampleTime + MediaTime::createWithDouble(latency); 307 if (timelineOffset.timeScale() != sampleTime.timeScale()) 308 timelineOffset = toMediaTime(CMTimeConvertScale(toCMTime(timelineOffset), sampleTime.timeScale(), kCMTimeRoundingMethod_Default)); 309 return timelineOffset; 310 } 311 312 #if USE(RENDER_SYNCHRONIZER) 313 void MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample(MediaStreamTrackPrivate& track, MediaSample& sample) 314 { 315 ASSERT(m_audioTrackMap.contains(track.id())); 316 ASSERT(m_audioRenderers.contains(sample.trackID())); 317 318 auto audioTrack = m_audioTrackMap.get(track.id()); 319 MediaTime timelineOffset = audioTrack->timelineOffset(); 320 if (timelineOffset == MediaTime::invalidTime()) { 321 timelineOffset = calculateTimelineOffset(sample, rendererLatency); 322 audioTrack->setTimelineOffset(timelineOffset); 323 LOG(MediaCaptureSamples, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample: timeline offset for track %s set to (%lld/%d)", track.id().utf8().data(), timelineOffset.timeValue(), timelineOffset.timeScale()); 324 } 325 326 updateSampleTimes(sample, timelineOffset, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueAudioSample"); 327 328 auto renderer = m_audioRenderers.get(sample.trackID()); 329 if (![renderer isReadyForMoreMediaData]) { 330 addSampleToPendingQueue(m_pendingAudioSampleQueue, sample); 331 requestNotificationWhenReadyForAudioData(sample.trackID()); 332 return; 333 } 334 335 [renderer enqueueSampleBuffer:sample.platformSample().sample.cmSampleBuffer]; 336 } 337 #endif 338 339 void MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample(MediaStreamTrackPrivate& track, MediaSample& sample) 340 { 341 ASSERT(m_videoTrackMap.contains(track.id())); 342 343 if (&track != m_mediaStreamPrivate->activeVideoTrack()) 344 return; 345 346 m_hasReceivedMedia = true; 347 updateReadyState(); 348 if (m_displayMode != LivePreview || (m_displayMode == PausedImage && m_isFrameDisplayed)) 349 return; 350 351 auto videoTrack = m_videoTrackMap.get(track.id()); 352 MediaTime timelineOffset = videoTrack->timelineOffset(); 353 if (timelineOffset == MediaTime::invalidTime()) { 354 
timelineOffset = calculateTimelineOffset(sample, rendererLatency); 355 videoTrack->setTimelineOffset(timelineOffset); 356 LOG(MediaCaptureSamples, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample: timeline offset for track %s set to %f", track.id().utf8().data(), timelineOffset.toDouble()); 357 } 358 359 updateSampleTimes(sample, timelineOffset, "MediaPlayerPrivateMediaStreamAVFObjC::enqueueVideoSample"); 360 154 361 if (m_sampleBufferDisplayLayer) { 155 362 if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) { 156 m_sampleQueue.append(sample);157 requestNotificationWhenReadyFor MediaData();363 addSampleToPendingQueue(m_pendingVideoSampleQueue, sample); 364 requestNotificationWhenReadyForVideoData(); 158 365 return; 159 366 } … … 165 372 if (!m_hasEverEnqueuedVideoFrame) { 166 373 m_hasEverEnqueuedVideoFrame = true; 374 if (m_displayMode == PausedImage) 375 updatePausedImage(); 167 376 m_player->firstVideoFrameAvailable(); 168 updatePausedImage(); 169 } 170 } 171 172 void MediaPlayerPrivateMediaStreamAVFObjC::prepareVideoSampleBufferFromTrack(MediaStreamTrackPrivate& track, MediaSample& sample) 173 { 174 if (&track != m_mediaStreamPrivate->activeVideoTrack() || !shouldEnqueueVideoSampleBuffer()) 175 return; 176 177 enqueueVideoSampleBuffer(sample); 377 } 378 } 379 380 void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForVideoData() 381 { 382 [m_sampleBufferDisplayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ { 383 [m_sampleBufferDisplayLayer stopRequestingMediaData]; 384 385 while (!m_pendingVideoSampleQueue.isEmpty()) { 386 if (![m_sampleBufferDisplayLayer isReadyForMoreMediaData]) { 387 requestNotificationWhenReadyForVideoData(); 388 return; 389 } 390 391 auto sample = m_pendingVideoSampleQueue.takeFirst(); 392 enqueueVideoSample(*m_activeVideoTrack.get(), sample.get()); 393 } 394 }]; 395 } 396 397 #if USE(RENDER_SYNCHRONIZER) 398 void MediaPlayerPrivateMediaStreamAVFObjC::requestNotificationWhenReadyForAudioData(AtomicString trackID) 399 { 400 if (!m_audioRenderers.contains(trackID)) 401 return; 402 403 auto renderer = m_audioRenderers.get(trackID); 404 [renderer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ { 405 [renderer stopRequestingMediaData]; 406 407 auto audioTrack = m_audioTrackMap.get(trackID); 408 while (!m_pendingAudioSampleQueue.isEmpty()) { 409 if (![renderer isReadyForMoreMediaData]) { 410 requestNotificationWhenReadyForAudioData(trackID); 411 return; 412 } 413 414 auto sample = m_pendingAudioSampleQueue.takeFirst(); 415 enqueueAudioSample(audioTrack->streamTrack(), sample.get()); 416 } 417 }]; 418 } 419 420 void MediaPlayerPrivateMediaStreamAVFObjC::createAudioRenderer(AtomicString trackID) 421 { 422 ASSERT(!m_audioRenderers.contains(trackID)); 423 auto renderer = adoptNS([allocAVSampleBufferAudioRendererInstance() init]); 424 [renderer setAudioTimePitchAlgorithm:(m_player->preservesPitch() ? 
AVAudioTimePitchAlgorithmSpectral : AVAudioTimePitchAlgorithmVarispeed)]; 425 m_audioRenderers.set(trackID, renderer); 426 [m_synchronizer addRenderer:renderer.get()]; 427 [m_statusChangeListener beginObservingRenderer:renderer.get()]; 428 if (m_audioRenderers.size() == 1) 429 renderingModeChanged(); 430 } 431 432 void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer(AVSampleBufferAudioRenderer* renderer) 433 { 434 [m_statusChangeListener stopObservingRenderer:renderer]; 435 [renderer flush]; 436 [renderer stopRequestingMediaData]; 437 438 CMTime now = CMTimebaseGetTime([m_synchronizer timebase]); 439 [m_synchronizer removeRenderer:renderer atTime:now withCompletionHandler:^(BOOL) { }]; 440 } 441 442 void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderer(AtomicString trackID) 443 { 444 if (!m_audioRenderers.contains(trackID)) 445 return; 446 447 destroyAudioRenderer(m_audioRenderers.get(trackID).get()); 448 m_audioRenderers.remove(trackID); 449 if (!m_audioRenderers.size()) 450 renderingModeChanged(); 451 } 452 453 void MediaPlayerPrivateMediaStreamAVFObjC::destroyAudioRenderers() 454 { 455 m_pendingAudioSampleQueue.clear(); 456 for (auto& renderer : m_audioRenderers.values()) 457 destroyAudioRenderer(renderer.get()); 458 m_audioRenderers.clear(); 459 } 460 461 AudioSourceProvider* MediaPlayerPrivateMediaStreamAVFObjC::audioSourceProvider() 462 { 463 // FIXME: This should return a mix of all audio tracks - https://bugs.webkit.org/show_bug.cgi?id=160305 464 for (const auto& track : m_audioTrackMap.values()) { 465 if (track->streamTrack().ended() || !track->streamTrack().enabled() || track->streamTrack().muted()) 466 continue; 467 468 return track->streamTrack().audioSourceProvider(); 469 } 470 return nullptr; 471 } 472 #endif 473 474 void MediaPlayerPrivateMediaStreamAVFObjC::rendererStatusDidChange(AVSampleBufferAudioRenderer* renderer, NSNumber* status) 475 { 476 #if USE(RENDER_SYNCHRONIZER) 477 String trackID; 478 for (auto& pair : m_audioRenderers) { 479 if (pair.value == renderer) { 480 trackID = pair.key; 481 break; 482 } 483 } 484 ASSERT(!trackID.isEmpty()); 485 if (status.integerValue == AVQueuedSampleBufferRenderingStatusRendering) 486 m_audioTrackMap.get(trackID)->setTimelineOffset(MediaTime::invalidTime()); 487 #else 488 UNUSED_PARAM(renderer); 489 UNUSED_PARAM(status); 490 #endif 491 } 492 493 void MediaPlayerPrivateMediaStreamAVFObjC::layerStatusDidChange(AVSampleBufferDisplayLayer* layer, NSNumber* status) 494 { 495 ASSERT_UNUSED(layer, layer == m_sampleBufferDisplayLayer); 496 ASSERT(m_activeVideoTrack); 497 if (status.integerValue == AVQueuedSampleBufferRenderingStatusRendering) 498 m_videoTrackMap.get(m_activeVideoTrack->id())->setTimelineOffset(MediaTime::invalidTime()); 499 } 500 501 void MediaPlayerPrivateMediaStreamAVFObjC::flushRenderers() 502 { 503 if (m_sampleBufferDisplayLayer) 504 [m_sampleBufferDisplayLayer flush]; 505 506 #if USE(RENDER_SYNCHRONIZER) 507 for (auto& renderer : m_audioRenderers.values()) 508 [renderer flush]; 509 #endif 178 510 } 179 511 … … 197 529 void MediaPlayerPrivateMediaStreamAVFObjC::ensureLayer() 198 530 { 199 if (!m_mediaStreamPrivate || haveVideoLayer()) 200 return; 201 202 CALayer *videoLayer = nil; 203 if (m_mediaStreamPrivate->activeVideoTrack()) { 204 m_videoPreviewPlayer = m_mediaStreamPrivate->activeVideoTrack()->preview(); 205 if (m_videoPreviewPlayer) 206 videoLayer = m_videoPreviewPlayer->platformLayer(); 207 } 208 209 if (!videoLayer) { 210 m_sampleBufferDisplayLayer = 
adoptNS([allocAVSampleBufferDisplayLayerInstance() init]); 211 videoLayer = m_sampleBufferDisplayLayer.get(); 531 if (m_sampleBufferDisplayLayer) 532 return; 533 534 m_sampleBufferDisplayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]); 212 535 #ifndef NDEBUG 213 [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"]; 214 #endif 215 m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black); 216 217 #if PLATFORM(MAC) 218 m_synchronizer = adoptNS([allocAVSampleBufferRenderSynchronizerInstance() init]); 219 [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()]; 220 #endif 221 } 536 [m_sampleBufferDisplayLayer setName:@"MediaPlayerPrivateMediaStreamAVFObjC AVSampleBufferDisplayLayer"]; 537 #endif 538 m_sampleBufferDisplayLayer.get().backgroundColor = cachedCGColor(Color::black); 539 [m_statusChangeListener beginObservingLayer:m_sampleBufferDisplayLayer.get()]; 540 541 #if USE(RENDER_SYNCHRONIZER) 542 [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()]; 543 #endif 222 544 223 545 renderingModeChanged(); 224 546 225 547 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE) 226 m_videoFullscreenLayerManager->setVideoLayer( videoLayer, snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());548 m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size()); 227 549 #endif 228 550 } … … 230 552 void MediaPlayerPrivateMediaStreamAVFObjC::destroyLayer() 231 553 { 232 if (!haveVideoLayer()) 233 return; 234 235 m_videoPreviewPlayer = nullptr; 554 if (!m_sampleBufferDisplayLayer) 555 return; 236 556 237 557 if (m_sampleBufferDisplayLayer) { 558 m_pendingVideoSampleQueue.clear(); 559 [m_statusChangeListener stopObservingLayer:m_sampleBufferDisplayLayer.get()]; 238 560 [m_sampleBufferDisplayLayer stopRequestingMediaData]; 239 561 [m_sampleBufferDisplayLayer flush]; 240 #if PLATFORM(MAC)562 #if USE(RENDER_SYNCHRONIZER) 241 563 CMTime currentTime = CMTimebaseGetTime([m_synchronizer timebase]); 242 564 [m_synchronizer removeRenderer:m_sampleBufferDisplayLayer.get() atTime:currentTime withCompletionHandler:^(BOOL) { … … 306 628 PlatformLayer* MediaPlayerPrivateMediaStreamAVFObjC::platformLayer() const 307 629 { 308 if (! haveVideoLayer()|| m_displayMode == None)630 if (!m_sampleBufferDisplayLayer || m_displayMode == None) 309 631 return nullptr; 310 632 … … 312 634 return m_videoFullscreenLayerManager->videoInlineLayer(); 313 635 #else 314 if (m_videoPreviewPlayer)315 return m_videoPreviewPlayer->platformLayer();316 636 317 637 return m_sampleBufferDisplayLayer.get(); … … 321 641 MediaPlayerPrivateMediaStreamAVFObjC::DisplayMode MediaPlayerPrivateMediaStreamAVFObjC::currentDisplayMode() const 322 642 { 323 if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || ! 
haveVideoLayer())643 if (m_ended || m_intrinsicSize.isEmpty() || !metaDataAvailable() || !m_sampleBufferDisplayLayer) 324 644 return None; 325 645 … … 369 689 return; 370 690 371 m_clock->start();372 691 m_playing = true; 373 374 if (m_videoPreviewPlayer) 375 m_videoPreviewPlayer->play(); 376 #if PLATFORM(MAC) 377 else 378 [m_synchronizer setRate:1]; 379 #endif 380 381 for (const auto& track : m_audioTrackMap.values()) { 382 if (!track->enabled() || !track->streamTrack().preview()) 383 continue; 384 385 track->streamTrack().preview()->play(); 386 } 692 #if USE(RENDER_SYNCHRONIZER) 693 if (!m_synchronizer.get().rate) 694 [m_synchronizer setRate:1 ]; // streamtime 695 #else 696 if (!m_clock->isRunning()) 697 m_clock->start(); 698 #endif 387 699 388 700 m_haveEverPlayed = true; … … 400 712 return; 401 713 402 m_pausedTime = m_clock->currentTime();714 m_pausedTime = currentMediaTime(); 403 715 m_playing = false; 404 405 if (m_videoPreviewPlayer)406 m_videoPreviewPlayer->pause();407 #if PLATFORM(MAC)408 else409 [m_synchronizer setRate:0];410 #endif411 412 for (const auto& track : m_audioTrackMap.values()) {413 if (!track->enabled() || !track->streamTrack().preview())414 continue;415 416 track->streamTrack().preview()->pause();417 }418 716 419 717 updateDisplayMode(); 420 718 updatePausedImage(); 719 flushRenderers(); 421 720 } 422 721 … … 426 725 } 427 726 428 void MediaPlayerPrivateMediaStreamAVFObjC::internalSetVolume(float volume, bool internal)429 {430 if (!internal)431 m_volume = volume;432 433 if (!metaDataAvailable())434 return;435 436 for (const auto& track : m_audioTrackMap.values()) {437 if (!track->enabled() || !track->streamTrack().preview())438 continue;439 440 track->streamTrack().preview()->setVolume(volume);441 }442 }443 444 727 void MediaPlayerPrivateMediaStreamAVFObjC::setVolume(float volume) 445 728 { 446 internalSetVolume(volume, false); 729 LOG(Media, "MediaPlayerPrivateMediaStreamAVFObjC::setVolume(%p)", this); 730 731 if (m_volume == volume) 732 return; 733 734 m_volume = volume; 735 736 #if USE(RENDER_SYNCHRONIZER) 737 for (auto& renderer : m_audioRenderers.values()) 738 [renderer setVolume:volume]; 739 #endif 447 740 } 448 741 … … 456 749 m_muted = muted; 457 750 458 internalSetVolume(muted ? 0 : m_volume, true); 751 #if USE(RENDER_SYNCHRONIZER) 752 for (auto& renderer : m_audioRenderers.values()) 753 [renderer setMuted:muted]; 754 #endif 459 755 } 460 756 … … 482 778 MediaTime MediaPlayerPrivateMediaStreamAVFObjC::currentMediaTime() const 483 779 { 484 return MediaTime::createWithDouble(m_playing ? 
m_clock->currentTime() : m_pausedTime); 780 if (!m_playing) 781 return m_pausedTime; 782 783 return streamTime(); 784 } 785 786 MediaTime MediaPlayerPrivateMediaStreamAVFObjC::streamTime() const 787 { 788 #if USE(RENDER_SYNCHRONIZER) 789 return toMediaTime(CMTimebaseGetTime([m_synchronizer timebase])); 790 #else 791 return MediaTime::createWithDouble(m_clock->currentTime()); 792 #endif 485 793 } 486 794 … … 599 907 ASSERT(m_mediaStreamPrivate); 600 908 909 if (!m_hasReceivedMedia) { 910 m_hasReceivedMedia = true; 911 updateReadyState(); 912 } 913 914 if (!m_playing || streamTime().toDouble() < 0) 915 return; 916 917 #if USE(RENDER_SYNCHRONIZER) 918 if (!CMTimebaseGetEffectiveRate([m_synchronizer timebase])) 919 return; 920 #endif 921 601 922 switch (track.type()) { 602 923 case RealtimeMediaSource::None: … … 604 925 break; 605 926 case RealtimeMediaSource::Audio: 606 // FIXME: https://bugs.webkit.org/show_bug.cgi?id=159836 927 #if USE(RENDER_SYNCHRONIZER) 928 enqueueAudioSample(track, mediaSample); 929 #endif 607 930 break; 608 931 case RealtimeMediaSource::Video: 609 prepareVideoSampleBufferFromTrack(track, mediaSample); 610 m_hasReceivedMedia = true; 611 scheduleDeferredTask([this] { 612 updateReadyState(); 613 }); 932 if (&track == m_activeVideoTrack.get()) 933 enqueueVideoSample(track, mediaSample); 614 934 break; 615 935 } … … 617 937 618 938 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE) 619 620 939 void MediaPlayerPrivateMediaStreamAVFObjC::setVideoFullscreenLayer(PlatformLayer *videoFullscreenLayer, std::function<void()> completionHandler) 621 940 { … … 627 946 m_videoFullscreenLayerManager->setVideoFullscreenFrame(frame); 628 947 } 629 630 #endif 631 632 template <typename RefT, typename PassRefT> 633 void updateTracksOfType(HashMap<String, RefT>& trackMap, RealtimeMediaSource::Type trackType, MediaStreamTrackPrivateVector& currentTracks, RefT (*itemFactory)(MediaStreamTrackPrivate&), MediaPlayer* player, void (MediaPlayer::*removedFunction)(PassRefT), void (MediaPlayer::*addedFunction)(PassRefT), std::function<void(RefT, int)> configureCallback, MediaStreamTrackPrivate::Observer* trackObserver) 948 #endif 949 950 typedef enum { 951 Add, 952 Remove, 953 Configure 954 } TrackState; 955 956 template <typename RefT> 957 void updateTracksOfType(HashMap<String, RefT>& trackMap, RealtimeMediaSource::Type trackType, MediaStreamTrackPrivateVector& currentTracks, RefT (*itemFactory)(MediaStreamTrackPrivate&), const Function<void(RefT, int, TrackState)>& configureTrack) 634 958 { 635 959 Vector<RefT> removedTracks; … … 661 985 662 986 int index = 0; 987 for (auto& track : removedTracks) 988 configureTrack(track, index++, TrackState::Remove); 989 990 index = 0; 991 for (auto& track : addedTracks) 992 configureTrack(track, index++, TrackState::Add); 993 994 index = 0; 663 995 for (const auto& track : trackMap.values()) 664 configureCallback(track, index++); 665 666 for (auto& track : removedTracks) { 667 (player->*removedFunction)(*track); 668 track->streamTrack().removeObserver(*trackObserver); 669 } 670 671 for (auto& track : addedTracks) { 672 (player->*addedFunction)(*track); 673 track->streamTrack().addObserver(*trackObserver); 674 } 996 configureTrack(track, index++, TrackState::Configure); 997 } 998 999 void MediaPlayerPrivateMediaStreamAVFObjC::checkSelectedVideoTrack() 1000 { 1001 if (m_pendingSelectedTrackCheck) 1002 return; 1003 1004 m_pendingSelectedTrackCheck = true; 1005 scheduleDeferredTask([this] { 1006 bool hideVideoLayer = true; 1007 m_activeVideoTrack = nullptr; 1008 
if (m_mediaStreamPrivate->activeVideoTrack()) { 1009 for (const auto& track : m_videoTrackMap.values()) { 1010 if (&track->streamTrack() == m_mediaStreamPrivate->activeVideoTrack()) { 1011 m_activeVideoTrack = m_mediaStreamPrivate->activeVideoTrack(); 1012 if (track->selected()) 1013 hideVideoLayer = false; 1014 break; 1015 } 1016 } 1017 } 1018 1019 ensureLayer(); 1020 m_sampleBufferDisplayLayer.get().hidden = hideVideoLayer; 1021 m_pendingSelectedTrackCheck = false; 1022 }); 675 1023 } 676 1024 … … 679 1027 MediaStreamTrackPrivateVector currentTracks = m_mediaStreamPrivate->tracks(); 680 1028 681 std::function<void(RefPtr<AudioTrackPrivateMediaStream>, int)> enableAudioTrack = [this](auto track, int index)1029 Function<void(RefPtr<AudioTrackPrivateMediaStream>, int, TrackState)> setAudioTrackState = [this](auto track, int index, TrackState state) 682 1030 { 683 track->setTrackIndex(index); 684 track->setEnabled(track->streamTrack().enabled() && !track->streamTrack().muted()); 1031 switch (state) { 1032 case TrackState::Remove: 1033 track->streamTrack().removeObserver(*this); 1034 m_player->removeAudioTrack(*track); 1035 #if USE(RENDER_SYNCHRONIZER) 1036 destroyAudioRenderer(track->id()); 1037 #endif 1038 break; 1039 case TrackState::Add: 1040 track->streamTrack().addObserver(*this); 1041 m_player->addAudioTrack(*track); 1042 #if USE(RENDER_SYNCHRONIZER) 1043 createAudioRenderer(track->id()); 1044 #endif 1045 break; 1046 case TrackState::Configure: 1047 track->setTrackIndex(index); 1048 bool enabled = track->streamTrack().enabled() && !track->streamTrack().muted(); 1049 track->setEnabled(enabled); 1050 #if USE(RENDER_SYNCHRONIZER) 1051 auto renderer = m_audioRenderers.get(track->id()); 1052 ASSERT(renderer); 1053 renderer.get().muted = !enabled; 1054 #endif 1055 break; 1056 } 685 1057 }; 686 updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStream::create, m_player, &MediaPlayer::removeAudioTrack, &MediaPlayer::addAudioTrack, enableAudioTrack, (MediaStreamTrackPrivate::Observer*) this);687 688 std::function<void(RefPtr<VideoTrackPrivateMediaStream>, int)> enableVideoTrack = [this](auto track, int index)1058 updateTracksOfType(m_audioTrackMap, RealtimeMediaSource::Audio, currentTracks, &AudioTrackPrivateMediaStream::create, setAudioTrackState); 1059 1060 Function<void(RefPtr<VideoTrackPrivateMediaStream>, int, TrackState)> setVideoTrackState = [&](auto track, int index, TrackState state) 689 1061 { 690 track->setTrackIndex(index); 691 bool selected = &track->streamTrack() == m_mediaStreamPrivate->activeVideoTrack(); 692 track->setSelected(selected); 693 694 if (selected) 695 ensureLayer(); 1062 switch (state) { 1063 case TrackState::Remove: 1064 track->streamTrack().removeObserver(*this); 1065 m_player->removeVideoTrack(*track); 1066 checkSelectedVideoTrack(); 1067 break; 1068 case TrackState::Add: 1069 track->streamTrack().addObserver(*this); 1070 m_player->addVideoTrack(*track); 1071 break; 1072 case TrackState::Configure: 1073 track->setTrackIndex(index); 1074 bool selected = &track->streamTrack() == m_mediaStreamPrivate->activeVideoTrack(); 1075 track->setSelected(selected); 1076 checkSelectedVideoTrack(); 1077 break; 1078 } 696 1079 }; 697 updateTracksOfType(m_videoTrackMap, RealtimeMediaSource::Video, currentTracks, &VideoTrackPrivateMediaStream::create, m_player, &MediaPlayer::removeVideoTrack, &MediaPlayer::addVideoTrack, enableVideoTrack, (MediaStreamTrackPrivate::Observer*) this);1080 updateTracksOfType(m_videoTrackMap, 
RealtimeMediaSource::Video, currentTracks, &VideoTrackPrivateMediaStream::create, setVideoTrackState); 698 1081 } 699 1082 -
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaSampleAVFObjC.mm
r207694 r210621 38 38 } 39 39 40 MediaTime MediaSampleAVFObjC::outputPresentationTime() const 41 { 42 return toMediaTime(CMSampleBufferGetOutputPresentationTimeStamp(m_sample.get())); 43 } 44 40 45 MediaTime MediaSampleAVFObjC::decodeTime() const 41 46 { … … 46 51 { 47 52 return toMediaTime(CMSampleBufferGetDuration(m_sample.get())); 53 } 54 55 MediaTime MediaSampleAVFObjC::outputDuration() const 56 { 57 return toMediaTime(CMSampleBufferGetOutputDuration(m_sample.get())); 48 58 } 49 59 … … 112 122 void MediaSampleAVFObjC::dump(PrintStream& out) const 113 123 { 114 out.print("{PTS(", presentationTime(), "), DTS(", decodeTime(), "), duration(", duration(), "), flags(", (int)flags(), "), presentationSize(", presentationSize().width(), "x", presentationSize().height(), ")}");124 out.print("{PTS(", presentationTime(), "), OPTS(", outputPresentationTime(), "), DTS(", decodeTime(), "), duration(", duration(), "), flags(", (int)flags(), "), presentationSize(", presentationSize().width(), "x", presentationSize().height(), ")}"); 115 125 } 116 126 -
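The two accessors added above are thin wrappers over CMSampleBufferGetOutputPresentationTimeStamp and CMSampleBufferGetOutputDuration. A minimal sketch of reading those fields directly (assumes macOS and linking against CoreMedia; toMediaTime() is WebCore-internal, so the CMTime values are inspected as-is):

    // Reads the output timing that outputPresentationTime()/outputDuration() expose.
    #include <CoreMedia/CoreMedia.h>
    #include <cstdio>

    static void dumpOutputTiming(CMSampleBufferRef sample)
    {
        CMTime outputPTS = CMSampleBufferGetOutputPresentationTimeStamp(sample);
        CMTime outputDuration = CMSampleBufferGetOutputDuration(sample);

        if (CMTIME_IS_VALID(outputPTS))
            std::printf("output PTS: %.3f s\n", CMTimeGetSeconds(outputPTS));
        if (CMTIME_IS_VALID(outputDuration))
            std::printf("output duration: %.3f s\n", CMTimeGetSeconds(outputDuration));
    }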
trunk/Source/WebCore/platform/mediastream/AudioTrackPrivateMediaStream.h
r210319 r210621 51 51 MediaStreamTrackPrivate& streamTrack() { return m_streamTrack.get(); } 52 52 53 MediaTime timelineOffset() const { return m_timelineOffset; } 54 void setTimelineOffset(const MediaTime& offset) { m_timelineOffset = offset; } 55 53 56 private: 54 57 AudioTrackPrivateMediaStream(MediaStreamTrackPrivate& track) … … 56 59 , m_id(track.id()) 57 60 , m_label(track.label()) 61 , m_timelineOffset(MediaTime::invalidTime()) 58 62 { 59 63 } … … 63 67 AtomicString m_label; 64 68 int m_index { 0 }; 69 MediaTime m_timelineOffset; 65 70 }; 66 71 -
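timelineOffset() above starts out as MediaTime::invalidTime() and is filled in later through setTimelineOffset(). As an illustration of how a once-captured offset like this can be used to rebase timestamps, here is a hedged sketch; plain doubles stand in for WebCore::MediaTime, and the rebasing logic is an assumption for illustration, not the WebKit code.

    // Illustrative sketch: capture an offset from the first sample, then shift
    // every subsequent timestamp by it.
    #include <cmath>
    #include <cstdio>

    struct TrackTimeline {
        double timelineOffset { NAN }; // stand-in for MediaTime::invalidTime()

        double adjustToTimeline(double samplePTS, double rendererTime)
        {
            if (std::isnan(timelineOffset))
                timelineOffset = rendererTime - samplePTS;
            return samplePTS + timelineOffset;
        }
    };

    int main()
    {
        TrackTimeline track;
        std::printf("%.2f\n", track.adjustToTimeline(100.0, 2.5));  // 2.50
        std::printf("%.2f\n", track.adjustToTimeline(100.1, 99.0)); // 2.60, offset reused
        return 0;
    }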
trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.cpp
r209959 r210621 101 101 m_isEnabled = enabled; 102 102 103 if (m_preview)104 m_preview->setEnabled(enabled);105 106 103 for (auto& observer : m_observers) 107 104 observer->trackEnabledChanged(*this); … … 118 115 m_isEnded = true; 119 116 120 m_preview = nullptr;121 117 m_source->requestStop(this); 122 118 … … 164 160 } 165 161 166 RealtimeMediaSourcePreview* MediaStreamTrackPrivate::preview()167 {168 if (m_preview)169 return m_preview.get();170 171 m_preview = m_source->preview();172 return m_preview.get();173 }174 175 162 void MediaStreamTrackPrivate::applyConstraints(const MediaConstraints& constraints, RealtimeMediaSource::SuccessHandler successHandler, RealtimeMediaSource::FailureHandler failureHandler) 176 163 { -
trunk/Source/WebCore/platform/mediastream/MediaStreamTrackPrivate.h
r209959 r210621 92 92 93 93 void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&); 94 RealtimeMediaSourcePreview* preview();95 94 96 95 private: … … 106 105 Vector<Observer*> m_observers; 107 106 Ref<RealtimeMediaSource> m_source; 108 RefPtr<RealtimeMediaSourcePreview> m_preview;109 107 110 108 String m_id; -
trunk/Source/WebCore/platform/mediastream/RealtimeMediaSource.h
r208985 r210621 43 43 #include "PlatformLayer.h" 44 44 #include "RealtimeMediaSourceCapabilities.h" 45 #include "RealtimeMediaSourcePreview.h"46 45 #include <wtf/RefCounted.h> 47 46 #include <wtf/Vector.h> … … 130 129 virtual RefPtr<Image> currentFrameImage() { return nullptr; } 131 130 virtual void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) { } 132 virtual RefPtr<RealtimeMediaSourcePreview> preview() { return nullptr; }133 131 134 132 void setWidth(int); -
trunk/Source/WebCore/platform/mediastream/VideoTrackPrivateMediaStream.h
r210319 r210621 41 41 } 42 42 43 Kind kind() const override { return Kind::Main; }44 AtomicString id() const override { return m_id; }45 AtomicString label() const override { return m_label; }46 AtomicString language() const override { return emptyAtom; }47 int trackIndex() const override { return m_index; }48 49 43 void setTrackIndex(int index) { m_index = index; } 50 44 51 45 MediaStreamTrackPrivate& streamTrack() { return m_streamTrack.get(); } 46 47 MediaTime timelineOffset() const { return m_timelineOffset; } 48 void setTimelineOffset(const MediaTime& offset) { m_timelineOffset = offset; } 52 49 53 50 private: … … 56 53 , m_id(track.id()) 57 54 , m_label(track.label()) 55 , m_timelineOffset(MediaTime::invalidTime()) 58 56 { 59 57 } 58 59 Kind kind() const final { return Kind::Main; } 60 AtomicString id() const final { return m_id; } 61 AtomicString label() const final { return m_label; } 62 AtomicString language() const final { return emptyAtom; } 63 int trackIndex() const final { return m_index; } 60 64 61 65 Ref<MediaStreamTrackPrivate> m_streamTrack; … … 63 67 AtomicString m_label; 64 68 int m_index { 0 }; 69 MediaTime m_timelineOffset; 65 70 }; 66 71 -
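The VideoTrackPrivateMediaStream change above moves the TrackPrivateBase accessors into the private section and marks them final. A small standalone example of that pattern, using made-up class names:

    // Private "final" overrides: the accessors remain reachable through the base
    // interface, while the derived class only exposes its own public API.
    #include <cstdio>

    class TrackPrivateBase {
    public:
        virtual ~TrackPrivateBase() = default;
        virtual int trackIndex() const = 0;
    };

    class VideoTrack final : public TrackPrivateBase {
    public:
        void setTrackIndex(int index) { m_index = index; }

    private:
        int trackIndex() const final { return m_index; }
        int m_index { 0 };
    };

    int main()
    {
        VideoTrack track;
        track.setTrackIndex(2);
        const TrackPrivateBase& base = track;
        std::printf("%d\n", base.trackIndex()); // 2
        return 0;
    }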
trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.h
r208851 r210621 1 1 /* 2 * Copyright (C) 2013-201 5Apple Inc. All rights reserved.2 * Copyright (C) 2013-2017 Apple Inc. All rights reserved. 3 3 * 4 4 * Redistribution and use in source and binary forms, with or without … … 68 68 void updateSettings(RealtimeMediaSourceSettings&) override; 69 69 AudioSourceProvider* audioSourceProvider() override; 70 RefPtr<AVMediaSourcePreview> createPreview() final;71 70 72 71 RetainPtr<AVCaptureConnection> m_audioConnection; -
trunk/Source/WebCore/platform/mediastream/mac/AVAudioCaptureSource.mm
r210105 r210621 1 1 /* 2 * Copyright (C) 2013-201 5Apple Inc. All rights reserved.2 * Copyright (C) 2013-2017 Apple Inc. All rights reserved. 3 3 * 4 4 * Redistribution and use in source and binary forms, with or without … … 31 31 #import "Logging.h" 32 32 #import "MediaConstraints.h" 33 #import " NotImplemented.h"33 #import "MediaSampleAVFObjC.h" 34 34 #import "RealtimeMediaSourceSettings.h" 35 35 #import "SoftLinking.h" … … 50 50 typedef AVCaptureOutput AVCaptureOutputType; 51 51 52 #if !PLATFORM(IOS)53 typedef AVCaptureAudioPreviewOutput AVCaptureAudioPreviewOutputType;54 #endif55 56 52 SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation) 57 53 58 54 SOFT_LINK_CLASS(AVFoundation, AVCaptureAudioChannel) 59 55 SOFT_LINK_CLASS(AVFoundation, AVCaptureAudioDataOutput) 60 SOFT_LINK_CLASS(AVFoundation, AVCaptureAudioPreviewOutput)61 56 SOFT_LINK_CLASS(AVFoundation, AVCaptureConnection) 62 57 SOFT_LINK_CLASS(AVFoundation, AVCaptureDevice) 63 58 SOFT_LINK_CLASS(AVFoundation, AVCaptureDeviceInput) 64 59 SOFT_LINK_CLASS(AVFoundation, AVCaptureOutput) 65 66 #define AVCaptureAudioPreviewOutput getAVCaptureAudioPreviewOutputClass()67 60 68 61 #define AVCaptureAudioChannel getAVCaptureAudioChannelClass() … … 81 74 namespace WebCore { 82 75 83 #if !PLATFORM(IOS)84 class AVAudioSourcePreview: public AVMediaSourcePreview {85 public:86 static RefPtr<AVMediaSourcePreview> create(AVCaptureSession *, AVAudioCaptureSource*);87 88 private:89 AVAudioSourcePreview(AVCaptureSession *, AVAudioCaptureSource*);90 91 void invalidate() final;92 93 void play() const final;94 void pause() const final;95 void setVolume(double) const final;96 void setEnabled(bool) final;97 PlatformLayer* platformLayer() const final { return nullptr; }98 99 void updateState() const;100 101 RetainPtr<AVCaptureAudioPreviewOutputType> m_audioPreviewOutput;102 mutable double m_volume { 1 };103 mutable bool m_paused { false };104 mutable bool m_enabled { true };105 };106 107 RefPtr<AVMediaSourcePreview> AVAudioSourcePreview::create(AVCaptureSession *session, AVAudioCaptureSource* parent)108 {109 return adoptRef(new AVAudioSourcePreview(session, parent));110 }111 112 AVAudioSourcePreview::AVAudioSourcePreview(AVCaptureSession *session, AVAudioCaptureSource* parent)113 : AVMediaSourcePreview(parent)114 {115 m_audioPreviewOutput = adoptNS([allocAVCaptureAudioPreviewOutputInstance() init]);116 setVolume(1);117 [session addOutput:m_audioPreviewOutput.get()];118 }119 120 void AVAudioSourcePreview::invalidate()121 {122 m_audioPreviewOutput = nullptr;123 AVMediaSourcePreview::invalidate();124 }125 126 void AVAudioSourcePreview::play() const127 {128 m_paused = false;129 updateState();130 }131 132 void AVAudioSourcePreview::pause() const133 {134 m_paused = true;135 updateState();136 }137 138 void AVAudioSourcePreview::setEnabled(bool enabled)139 {140 m_enabled = enabled;141 updateState();142 }143 144 void AVAudioSourcePreview::setVolume(double volume) const145 {146 m_volume = volume;147 m_audioPreviewOutput.get().volume = volume;148 }149 150 void AVAudioSourcePreview::updateState() const151 {152 m_audioPreviewOutput.get().volume = (!m_enabled || m_paused) ? 
0 : m_volume;153 }154 #endif155 156 76 RefPtr<AVMediaCaptureSource> AVAudioCaptureSource::create(AVCaptureDeviceTypedef* device, const AtomicString& id, const MediaConstraints* constraints, String& invalidConstraint) 157 77 { … … 191 111 void AVAudioCaptureSource::updateSettings(RealtimeMediaSourceSettings& settings) 192 112 { 193 // FIXME: use [AVCaptureAudioPreviewOutput volume] forvolume113 // FIXME: support volume 194 114 195 115 settings.setDeviceId(id()); … … 277 197 return; 278 198 199 RetainPtr<CMSampleBufferRef> buffer = sampleBuffer; 200 scheduleDeferredTask([this, buffer] { 201 mediaDataUpdated(MediaSampleAVFObjC::create(buffer.get())); 202 }); 203 279 204 std::unique_lock<Lock> lock(m_lock, std::try_to_lock); 280 205 if (!lock.owns_lock()) { … … 305 230 } 306 231 307 RefPtr<AVMediaSourcePreview> AVAudioCaptureSource::createPreview()308 {309 #if !PLATFORM(IOS)310 return AVAudioSourcePreview::create(session(), this);311 #else312 return nullptr;313 #endif314 }315 316 232 } // namespace WebCore 317 233 -
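The new audio path above retains the incoming CMSampleBufferRef and wraps it in a MediaSampleAVFObjC on a deferred task. A hedged sketch of that "retain, then deliver later" step, with plain CoreMedia and libdispatch standing in for the WebKit-internal RetainPtr, scheduleDeferredTask and MediaSampleAVFObjC:

    // Assumes macOS; build as Objective-C++ (or with -fblocks) and link CoreMedia.
    #include <CoreMedia/CoreMedia.h>
    #include <dispatch/dispatch.h>
    #include <cstdio>

    static void deliverSampleLater(CMSampleBufferRef sampleBuffer)
    {
        CFRetain(sampleBuffer); // keep the buffer alive until the deferred task runs
        dispatch_async(dispatch_get_main_queue(), ^{
            // A real client would wrap the buffer here and hand it to whoever
            // consumes the samples.
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            std::printf("delivering sample at %.3f s\n", CMTimeGetSeconds(pts));
            CFRelease(sampleBuffer);
        });
    }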
trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.h
r208851 r210621 1 1 /* 2 * Copyright (C) 2013-201 5Apple Inc. All rights reserved.2 * Copyright (C) 2013-2017 Apple Inc. All rights reserved. 3 3 * 4 4 * Redistribution and use in source and binary forms, with or without … … 48 48 class AVMediaCaptureSource; 49 49 50 class AVMediaSourcePreview: public RealtimeMediaSourcePreview {51 public:52 virtual ~AVMediaSourcePreview();53 54 void invalidate() override;55 56 protected:57 AVMediaSourcePreview(AVMediaCaptureSource*);58 59 private:60 WeakPtr<AVMediaCaptureSource> m_parent;61 };62 63 50 class AVMediaCaptureSource : public RealtimeMediaSource { 64 51 public: … … 76 63 void stopProducingData() final; 77 64 bool isProducingData() const final { return m_isRunning; } 78 79 RefPtr<RealtimeMediaSourcePreview> preview() final;80 void removePreview(AVMediaSourcePreview*);81 WeakPtr<AVMediaCaptureSource> createWeakPtr() { return m_weakPtrFactory.createWeakPtr(); }82 65 83 66 protected: … … 100 83 void setAudioSampleBufferDelegate(AVCaptureAudioDataOutput*); 101 84 102 virtual RefPtr<AVMediaSourcePreview> createPreview() = 0;103 104 85 private: 105 86 void setupSession(); … … 118 99 RetainPtr<AVCaptureSession> m_session; 119 100 RetainPtr<AVCaptureDevice> m_device; 120 Vector<WeakPtr<RealtimeMediaSourcePreview>> m_previews;121 WeakPtrFactory<AVMediaCaptureSource> m_weakPtrFactory;122 101 bool m_isRunning { false}; 123 102 }; -
trunk/Source/WebCore/platform/mediastream/mac/AVMediaCaptureSource.mm
r210105 r210621 130 130 , m_objcObserver(adoptNS([[WebCoreAVMediaCaptureSourceObserver alloc] initWithCallback:this])) 131 131 , m_device(device) 132 , m_weakPtrFactory(this)133 132 { 134 133 setName(device.localizedName); … … 241 240 [m_session removeObserver:m_objcObserver.get() forKeyPath:keyName]; 242 241 243 for (const auto& preview : m_previews) {244 if (preview)245 preview->invalidate();246 }247 m_previews.clear();248 249 242 shutdownCaptureSession(); 250 243 m_session = nullptr; … … 276 269 ASSERT_NOT_REACHED(); 277 270 return nullptr; 278 }279 280 RefPtr<RealtimeMediaSourcePreview> AVMediaCaptureSource::preview()281 {282 RefPtr<AVMediaSourcePreview> preview = createPreview();283 if (!preview)284 return nullptr;285 286 m_previews.append(preview->createWeakPtr());287 return preview.leakRef();288 }289 290 void AVMediaCaptureSource::removePreview(AVMediaSourcePreview* preview)291 {292 size_t index;293 for (index = 0; index < m_previews.size(); ++index) {294 if (m_previews[index].get() == preview)295 break;296 }297 298 if (index < m_previews.size())299 m_previews.remove(index);300 }301 302 AVMediaSourcePreview::AVMediaSourcePreview(AVMediaCaptureSource* parent)303 : m_parent(parent->createWeakPtr())304 {305 }306 307 AVMediaSourcePreview::~AVMediaSourcePreview()308 {309 if (m_parent)310 m_parent->removePreview(this);311 }312 313 void AVMediaSourcePreview::invalidate()314 {315 m_parent = nullptr;316 RealtimeMediaSourcePreview::invalidate();317 271 } 318 272 -
trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.h
r209188 r210621 80 80 void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) final; 81 81 82 RefPtr<AVMediaSourcePreview> createPreview() final;83 82 RetainPtr<CGImageRef> currentFrameCGImage(); 84 83 RefPtr<Image> currentFrameImage() final; -
trunk/Source/WebCore/platform/mediastream/mac/AVVideoCaptureSource.mm
r210105 r210621 1 1 /* 2 * Copyright (C) 2013-201 5Apple Inc. All rights reserved.2 * Copyright (C) 2013-2017 Apple Inc. All rights reserved. 3 3 * 4 4 * Redistribution and use in source and binary forms, with or without … … 39 39 #import "PlatformLayer.h" 40 40 #import "RealtimeMediaSourceCenter.h" 41 #import "RealtimeMediaSourcePreview.h"42 41 #import "RealtimeMediaSourceSettings.h" 43 42 #import "WebActionDisablingCALayerDelegate.h" … … 102 101 using namespace WebCore; 103 102 104 @interface WebCoreAVVideoCaptureSourceObserver : NSObject<CALayerDelegate> {105 AVVideoSourcePreview *_parent;106 BOOL _hasObserver;107 }108 109 - (void)setParent:(AVVideoSourcePreview *)parent;110 - (void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context;111 @end112 113 103 namespace WebCore { 114 115 class AVVideoSourcePreview: public AVMediaSourcePreview {116 public:117 static RefPtr<AVMediaSourcePreview> create(AVCaptureSession*, AVCaptureDeviceTypedef*, AVVideoCaptureSource*);118 119 void backgroundLayerBoundsChanged();120 PlatformLayer* platformLayer() const final { return m_previewBackgroundLayer.get(); }121 122 private:123 AVVideoSourcePreview(AVCaptureSession*, AVCaptureDeviceTypedef*, AVVideoCaptureSource*);124 125 void invalidate() final;126 127 void play() const final;128 void pause() const final;129 void setVolume(double) const final { };130 void setEnabled(bool) final;131 void setPaused(bool) const;132 133 RetainPtr<AVCaptureVideoPreviewLayerType> m_previewLayer;134 RetainPtr<PlatformLayer> m_previewBackgroundLayer;135 RetainPtr<AVCaptureDeviceTypedef> m_device;136 RetainPtr<WebCoreAVVideoCaptureSourceObserver> m_objcObserver;137 };138 139 RefPtr<AVMediaSourcePreview> AVVideoSourcePreview::create(AVCaptureSession *session, AVCaptureDeviceTypedef* device, AVVideoCaptureSource* parent)140 {141 return adoptRef(new AVVideoSourcePreview(session, device, parent));142 }143 144 AVVideoSourcePreview::AVVideoSourcePreview(AVCaptureSession *session, AVCaptureDeviceTypedef* device, AVVideoCaptureSource* parent)145 : AVMediaSourcePreview(parent)146 , m_objcObserver(adoptNS([[WebCoreAVVideoCaptureSourceObserver alloc] init]))147 {148 m_device = device;149 m_previewLayer = adoptNS([allocAVCaptureVideoPreviewLayerInstance() initWithSession:session]);150 m_previewLayer.get().contentsGravity = kCAGravityResize;151 m_previewLayer.get().anchorPoint = CGPointZero;152 [m_previewLayer.get() setDelegate:[WebActionDisablingCALayerDelegate shared]];153 154 m_previewBackgroundLayer = adoptNS([[CALayer alloc] init]);155 m_previewBackgroundLayer.get().contentsGravity = kCAGravityResizeAspect;156 m_previewBackgroundLayer.get().anchorPoint = CGPointZero;157 m_previewBackgroundLayer.get().needsDisplayOnBoundsChange = YES;158 [m_previewBackgroundLayer.get() setDelegate:[WebActionDisablingCALayerDelegate shared]];159 160 #ifndef NDEBUG161 m_previewLayer.get().name = @"AVVideoCaptureSource preview layer";162 m_previewBackgroundLayer.get().name = @"AVVideoSourcePreview parent layer";163 #endif164 165 [m_previewBackgroundLayer addSublayer:m_previewLayer.get()];166 167 [m_objcObserver.get() setParent:this];168 }169 170 void AVVideoSourcePreview::backgroundLayerBoundsChanged()171 {172 if (m_previewBackgroundLayer && m_previewLayer)173 [m_previewLayer.get() setBounds:m_previewBackgroundLayer.get().bounds];174 }175 176 void AVVideoSourcePreview::invalidate()177 {178 [m_objcObserver.get() setParent:nil];179 m_objcObserver = nullptr;180 m_previewLayer = nullptr;181 
m_previewBackgroundLayer = nullptr;182 m_device = nullptr;183 AVMediaSourcePreview::invalidate();184 }185 186 void AVVideoSourcePreview::play() const187 {188 setPaused(false);189 }190 191 void AVVideoSourcePreview::pause() const192 {193 setPaused(true);194 }195 196 void AVVideoSourcePreview::setPaused(bool paused) const197 {198 [m_device lockForConfiguration:nil];199 m_previewLayer.get().connection.enabled = !paused;200 [m_device unlockForConfiguration];201 }202 203 void AVVideoSourcePreview::setEnabled(bool enabled)204 {205 m_previewLayer.get().hidden = !enabled;206 }207 104 208 105 const OSType videoCaptureFormat = kCVPixelFormatType_32BGRA; … … 513 410 514 411 updateFramerate(sampleBuffer.get()); 515 516 CMSampleBufferRef newSampleBuffer = 0; 517 CMSampleBufferCreateCopy(kCFAllocatorDefault, sampleBuffer.get(), &newSampleBuffer); 518 ASSERT(newSampleBuffer); 519 520 CFArrayRef attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(newSampleBuffer, true); 521 if (attachmentsArray) { 522 for (CFIndex i = 0; i < CFArrayGetCount(attachmentsArray); ++i) { 523 CFMutableDictionaryRef attachments = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachmentsArray, i); 524 CFDictionarySetValue(attachments, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue); 525 } 526 } 527 528 m_buffer = adoptCF(newSampleBuffer); 412 m_buffer = sampleBuffer; 529 413 m_lastImage = nullptr; 530 414 … … 606 490 } 607 491 608 RefPtr<AVMediaSourcePreview> AVVideoCaptureSource::createPreview()609 {610 return AVVideoSourcePreview::create(session(), device(), this);611 }612 613 492 NSString* AVVideoCaptureSource::bestSessionPresetForVideoDimensions(std::optional<int> width, std::optional<int> height) const 614 493 { … … 657 536 } // namespace WebCore 658 537 659 @implementation WebCoreAVVideoCaptureSourceObserver660 661 static NSString * const KeyValueBoundsChangeKey = @"bounds";662 663 - (void)setParent:(AVVideoSourcePreview *)parent664 {665 if (_parent && _hasObserver && _parent->platformLayer()) {666 _hasObserver = false;667 [_parent->platformLayer() removeObserver:self forKeyPath:KeyValueBoundsChangeKey];668 }669 670 _parent = parent;671 672 if (_parent && _parent->platformLayer()) {673 _hasObserver = true;674 [_parent->platformLayer() addObserver:self forKeyPath:KeyValueBoundsChangeKey options:0 context:nullptr];675 }676 }677 678 - (void)observeValueForKeyPath:keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context679 {680 UNUSED_PARAM(context);681 682 if (!_parent)683 return;684 685 if ([[change valueForKey:NSKeyValueChangeNotificationIsPriorKey] boolValue])686 return;687 688 #if PLATFORM(IOS)689 WebThreadRun(^ {690 if ([keyPath isEqual:KeyValueBoundsChangeKey] && object == _parent->platformLayer())691 _parent->backgroundLayerBoundsChanged();692 });693 #else694 if ([keyPath isEqual:KeyValueBoundsChangeKey] && object == _parent->platformLayer())695 _parent->backgroundLayerBoundsChanged();696 #endif697 }698 699 @end700 701 538 #endif // ENABLE(MEDIA_STREAM) -
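Worth noting in the AVVideoCaptureSource change above: the per-frame CMSampleBufferCreateCopy that tagged every sample with kCMSampleAttachmentKey_DisplayImmediately is gone, and the source now simply retains the incoming buffer. For reference, a standalone sketch of the removed attachment-setting step (assumes macOS + CoreMedia):

    // Marks every sample attachment dictionary with DisplayImmediately = true.
    #include <CoreMedia/CoreMedia.h>

    static void markForImmediateDisplay(CMSampleBufferRef sampleBuffer)
    {
        // Passing true creates the attachments array if the buffer has none yet.
        CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, true);
        if (!attachments)
            return;
        for (CFIndex i = 0; i < CFArrayGetCount(attachments); ++i) {
            auto dictionary = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, i);
            CFDictionarySetValue(dictionary, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
        }
    }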
trunk/Source/WebCore/platform/mediastream/mac/MockRealtimeVideoSourceMac.mm
r208851 r210621 49 49 namespace WebCore { 50 50 51 static const int videoSampleRate = 90000; 52 51 53 RefPtr<MockRealtimeVideoSource> MockRealtimeVideoSource::create(const String& name, const MediaConstraints* constraints) 52 54 { … … 75 77 return nullptr; 76 78 77 CMSampleTimingInfo timingInfo; 78 79 timingInfo.presentationTimeStamp = CMTimeMake(elapsedTime() * 1000, 1000); 80 timingInfo.decodeTimeStamp = kCMTimeInvalid; 81 timingInfo.duration = kCMTimeInvalid; 79 CMTime sampleTime = CMTimeMake((elapsedTime() + .1) * videoSampleRate, videoSampleRate); 80 CMSampleTimingInfo timingInfo = { kCMTimeInvalid, sampleTime, sampleTime }; 82 81 83 82 CMVideoFormatDescriptionRef formatDescription = nullptr; … … 101 100 RetainPtr<CVPixelBufferRef> MockRealtimeVideoSourceMac::pixelBufferFromCGImage(CGImageRef image) const 102 101 { 102 static CGColorSpaceRef deviceRGBColorSpace = CGColorSpaceCreateDeviceRGB(); 103 103 104 CGSize frameSize = CGSizeMake(CGImageGetWidth(image), CGImageGetHeight(image)); 104 105 CFDictionaryRef options = (__bridge CFDictionaryRef) @{ … … 113 114 CVPixelBufferLockBaseAddress(pixelBuffer, 0); 114 115 void* data = CVPixelBufferGetBaseAddress(pixelBuffer); 115 auto rgbColorSpace = adoptCF(CGColorSpaceCreateDeviceRGB()); 116 auto context = adoptCF(CGBitmapContextCreate(data, frameSize.width, frameSize.height, 8, CVPixelBufferGetBytesPerRow(pixelBuffer), rgbColorSpace.get(), (CGBitmapInfo) kCGImageAlphaNoneSkipFirst)); 116 auto context = adoptCF(CGBitmapContextCreate(data, frameSize.width, frameSize.height, 8, CVPixelBufferGetBytesPerRow(pixelBuffer), deviceRGBColorSpace, (CGBitmapInfo) kCGImageAlphaNoneSkipFirst)); 117 117 CGContextDrawImage(context.get(), CGRectMake(0, 0, CGImageGetWidth(image), CGImageGetHeight(image)), image); 118 118 CVPixelBufferUnlockBaseAddress(pixelBuffer, 0); -
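The mock video source above now stamps each frame against a 90 kHz timescale, leaving the duration invalid and deriving both presentation and decode timestamps from the elapsed time. A small sketch of that timing setup (assumes macOS + CoreMedia):

    #include <CoreMedia/CoreMedia.h>
    #include <cstdio>

    static const int32_t videoSampleRate = 90000;

    // Builds timing the way the mock source does: { duration, PTS, DTS }.
    static CMSampleTimingInfo timingForElapsedSeconds(double elapsedSeconds)
    {
        CMTime sampleTime = CMTimeMake(static_cast<int64_t>((elapsedSeconds + 0.1) * videoSampleRate), videoSampleRate);
        CMSampleTimingInfo timing = { kCMTimeInvalid, sampleTime, sampleTime };
        return timing;
    }

    int main()
    {
        CMSampleTimingInfo timing = timingForElapsedSeconds(1.0);
        std::printf("PTS = %.3f s\n", CMTimeGetSeconds(timing.presentationTimeStamp)); // 1.100
        return 0;
    }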
trunk/Source/WebKit2/WebProcess/com.apple.WebProcess.sb.in
r210076 r210621 449 449 (iokit-user-client-class "IOUSBInterfaceUserClientV2")) 450 450 (allow device-camera)) 451 452 ;; @@@@@ 453 (allow device-microphone) 454 ;; @@@@@ 455