Changeset 217098 in webkit

- Timestamp: May 18, 2017, 10:50:35 PM
- Location: trunk/Source/WebCore
- Files: 4 added, 16 edited
trunk/Source/WebCore/ChangeLog
(r217096 → r217098)

2017-05-18  Jer Noble  <jer.noble@apple.com>

        [MSE][Mac] Support painting MSE video-element to canvas
        https://bugs.webkit.org/show_bug.cgi?id=125157
        <rdar://problem/23062016>

        Reviewed by Eric Carlson.

        Test: media/media-source/media-source-paint-to-canvas.html

        In order to have access to decoded video data for painting, decode the encoded samples manually
        instead of adding them to the AVSampleBufferDisplayLayer. To facilitate doing so, add a new
        utility class, WebCoreDecompressionSession, which can decode samples and store them.

        For the purposes of this patch, to avoid double-decoding of video data and to avoid severely
        complicating our sample delivery pipeline, we only support painting of decoded video samples when
        the video is not displayed in the DOM.

        * Modules/mediasource/MediaSource.cpp:
        (WebCore::MediaSource::seekToTime): Always send waitForSeekCompleted() to give the private a chance to delay seek completion.
        * Modules/mediasource/SourceBuffer.cpp:
        (WebCore::SourceBuffer::sourceBufferPrivateReenqueSamples): Added.
        * Modules/mediasource/SourceBuffer.h:
        * WebCore.xcodeproj/project.pbxproj:
        * platform/cf/CoreMediaSoftLink.cpp: Added new soft-link macros.
        * platform/cf/CoreMediaSoftLink.h: Ditto.
        * platform/cocoa/CoreVideoSoftLink.cpp: Ditto.
        * platform/cocoa/CoreVideoSoftLink.h: Ditto.
        * platform/graphics/SourceBufferPrivateClient.h:
        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h:
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::sampleBufferDisplayLayer): Simple accessor.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::decompressionSession): Ditto.
        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm:
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::MediaPlayerPrivateMediaSourceAVFObjC):
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::load): Update whether we should be displaying in a layer or a decompression session.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setVisible): Ditto.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::waitForSeekCompleted): m_seekCompleted is now an enum.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::seeking): Ditto.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted): Ditto. If waiting for a video frame, delay completing the seek.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::nativeImageForCurrentTime): Call updateLastImage() and return the result.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::updateLastImage): Fetch the image for the current time.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::paint): Forward to paintCurrentFrameInContext().
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::paintCurrentFrameInContext): Get a native image, and render it.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::acceleratedRenderingStateChanged): Create or destroy a layer or decompression session as appropriate.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureLayer): Creates a layer.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::destroyLayer): Destroys a layer.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureDecompressionSession): Creates a decompression session.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::destroyDecompressionSession): Destroys a decompression session.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setHasAvailableVideoFrame): If seek completion was delayed, complete it now. Ditto for the ready state change.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setReadyState): If waiting for a video frame, delay the ready state change.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::addDisplayLayer): Deleted.
        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::removeDisplayLayer): Deleted.
        * platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.h:
        * platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.mm:
        (WebCore::MediaSourcePrivateAVFObjC::hasVideo): Promote to a class function.
        (WebCore::MediaSourcePrivateAVFObjC::hasSelectedVideo): Return whether any of the active source buffers have video and are selected.
        (WebCore::MediaSourcePrivateAVFObjC::hasSelectedVideoChanged): Call setSourceBufferWithSelectedVideo().
        (WebCore::MediaSourcePrivateAVFObjC::setVideoLayer): Set (or clear) the layer on the selected buffer.
        (WebCore::MediaSourcePrivateAVFObjC::setDecompressionSession): Ditto for the decompression session.
        (WebCore::MediaSourcePrivateAVFObjC::setSourceBufferWithSelectedVideo): Remove the layer and decompression session from the unselected buffer and add the decompression session or layer to the newly selected buffer.
        (WebCore::MediaSourcePrivateAVFObjCHasVideo): Deleted.
        * platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h:
        * platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm:
        (WebCore::SourceBufferPrivateAVFObjC::destroyRenderers): Clear the videoLayer and decompressionSession.
        (WebCore::SourceBufferPrivateAVFObjC::hasSelectedVideo): Return whether the buffer has a selected video track.
        (WebCore::SourceBufferPrivateAVFObjC::trackDidChangeEnabled): The media player now manages the video layer and decompression session lifetimes.
        (WebCore::SourceBufferPrivateAVFObjC::flush): Flush the decompression session, if it exists.
        (WebCore::SourceBufferPrivateAVFObjC::enqueueSample): Enqueue to the decompression session, if it exists.
        (WebCore::SourceBufferPrivateAVFObjC::isReadyForMoreSamples): Ask the decompression session, if it exists.
        (WebCore::SourceBufferPrivateAVFObjC::didBecomeReadyForMoreSamples): Tell the decompression session to stop requesting data, if it exists.
        (WebCore::SourceBufferPrivateAVFObjC::notifyClientWhenReadyForMoreSamples): Request media data from the decompression session, if it exists.
        (WebCore::SourceBufferPrivateAVFObjC::setVideoLayer): Added.
        (WebCore::SourceBufferPrivateAVFObjC::setDecompressionSession): Added.
        * platform/graphics/cocoa/WebCoreDecompressionSession.h: Added.
        (WebCore::WebCoreDecompressionSession::create):
        (WebCore::WebCoreDecompressionSession::isInvalidated):
        (WebCore::WebCoreDecompressionSession::createWeakPtr):
        * platform/graphics/cocoa/WebCoreDecompressionSession.mm: Added.
        (WebCore::WebCoreDecompressionSession::WebCoreDecompressionSession): Register for media data requests.
        (WebCore::WebCoreDecompressionSession::invalidate): Unregister for same.
        (WebCore::WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaDataCallback): Pass to maybeBecomeReadyForMoreMediaData.
        (WebCore::WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaData): Check in-flight decodes and decoded frame counts.
        (WebCore::WebCoreDecompressionSession::enqueueSample): Pass the sample to be decoded on a background queue.
        (WebCore::WebCoreDecompressionSession::decodeSample): Decode the sample.
        (WebCore::WebCoreDecompressionSession::decompressionOutputCallback): Call handleDecompressionOutput.
        (WebCore::WebCoreDecompressionSession::handleDecompressionOutput): Pass the decoded sample to be enqueued on the main thread.
        (WebCore::WebCoreDecompressionSession::getFirstVideoFrame):
        (WebCore::WebCoreDecompressionSession::enqueueDecodedSample): Enqueue the frame (if it's a displayed frame).
        (WebCore::WebCoreDecompressionSession::isReadyForMoreMediaData): Return whether we've hit our high-water sample count.
        (WebCore::WebCoreDecompressionSession::requestMediaDataWhenReady):
        (WebCore::WebCoreDecompressionSession::stopRequestingMediaData): Unset the same.
        (WebCore::WebCoreDecompressionSession::notifyWhenHasAvailableVideoFrame): Set a callback to notify when a decoded frame has been enqueued.
        (WebCore::WebCoreDecompressionSession::imageForTime): Successively dequeue images until reaching one at or beyond the requested time.
        (WebCore::WebCoreDecompressionSession::flush): Synchronously empty the producer and consumer queues.
        (WebCore::WebCoreDecompressionSession::getDecodeTime): Utility method.
        (WebCore::WebCoreDecompressionSession::getPresentationTime): Ditto.
        (WebCore::WebCoreDecompressionSession::getDuration): Ditto.
        (WebCore::WebCoreDecompressionSession::compareBuffers): Ditto.
        * platform/cocoa/VideoToolboxSoftLink.cpp: Added.
        * platform/cocoa/VideoToolboxSoftLink.h: Added.

2017-05-18  Said Abou-Hallawa  <sabouhallawa@apple.com>
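The dequeue-until-time behavior the ChangeLog describes for WebCoreDecompressionSession::imageForTime() can be sketched in isolation. This is an illustrative sketch, not WebKit's implementation: Frame is a hypothetical stand-in for a CVPixelBuffer plus its CMTime presentation timestamp.

```cpp
#include <cassert>
#include <deque>
#include <optional>

// Hypothetical stand-in for a decoded sample: a presentation timestamp plus an id.
struct Frame {
    double presentationTime;
    int id;
};

// Sketch of imageForTime(): successively dequeue frames whose presentation time
// has been reached, returning the last one dequeued (the frame to display).
// Frames still in the future stay queued for a later request.
std::optional<Frame> imageForTime(std::deque<Frame>& queue, double requestedTime)
{
    std::optional<Frame> result;
    while (!queue.empty() && queue.front().presentationTime <= requestedTime) {
        result = queue.front();
        queue.pop_front();
    }
    return result;
}
```

A request between two frames' timestamps drains every earlier frame and yields the most recent one, which is the behavior a paint at an arbitrary currentTime needs.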
trunk/Source/WebCore/Modules/mediasource/MediaSource.cpp
(r216860 → r217098)

     // Continue

+    m_private->waitForSeekCompleted();
     completeSeek();
 }
trunk/Source/WebCore/Modules/mediasource/SourceBuffer.cpp
(r215160 → r217098)

 }

+void SourceBuffer::sourceBufferPrivateReenqueSamples(const AtomicString& trackID)
+{
+    if (isRemoved())
+        return;
+
+    LOG(MediaSource, "SourceBuffer::sourceBufferPrivateReenqueSamples(%p)", this);
+    auto it = m_trackBufferMap.find(trackID);
+    if (it == m_trackBufferMap.end())
+        return;
+
+    auto& trackBuffer = it->value;
+    trackBuffer.needsReenqueueing = true;
+    reenqueueMediaForTime(trackBuffer, trackID, m_source->currentTime());
+}
+
 void SourceBuffer::sourceBufferPrivateDidBecomeReadyForMoreSamples(const AtomicString& trackID)
 {
+    if (isRemoved())
+        return;
+
     LOG(MediaSource, "SourceBuffer::sourceBufferPrivateDidBecomeReadyForMoreSamples(%p)", this);
     auto it = m_trackBufferMap.find(trackID);
trunk/Source/WebCore/Modules/mediasource/SourceBuffer.h
(r210319 → r217098)

     bool sourceBufferPrivateHasAudio() const final;
     bool sourceBufferPrivateHasVideo() const final;
+    void sourceBufferPrivateReenqueSamples(const AtomicString& trackID) final;
     void sourceBufferPrivateDidBecomeReadyForMoreSamples(const AtomicString& trackID) final;
     MediaTime sourceBufferPrivateFastSeekTimeForMediaTime(const MediaTime&, const MediaTime& negativeThreshold, const MediaTime& positiveThreshold) final;
trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj
(r217049 → r217098)

 CD5896E11CD2B15100B3BCC8 /* WebPlaybackControlsManager.mm in Sources */ = {isa = PBXBuildFile; fileRef = CD5896DF1CD2B15100B3BCC8 /* WebPlaybackControlsManager.mm */; };
 CD5896E21CD2B15100B3BCC8 /* WebPlaybackControlsManager.h in Headers */ = {isa = PBXBuildFile; fileRef = CD5896E01CD2B15100B3BCC8 /* WebPlaybackControlsManager.h */; settings = {ATTRIBUTES = (Private, ); }; };
+CD5D27771E8318E000D80A3D /* WebCoreDecompressionSession.mm in Sources */ = {isa = PBXBuildFile; fileRef = CD5D27751E8318E000D80A3D /* WebCoreDecompressionSession.mm */; };
+CD5D27781E8318E000D80A3D /* WebCoreDecompressionSession.h in Headers */ = {isa = PBXBuildFile; fileRef = CD5D27761E8318E000D80A3D /* WebCoreDecompressionSession.h */; };
 CD5E5B5F1A15CE54000C609E /* PageConfiguration.h in Headers */ = {isa = PBXBuildFile; fileRef = CD5E5B5E1A15CE54000C609E /* PageConfiguration.h */; settings = {ATTRIBUTES = (Private, ); }; };
 CD5E5B611A15F156000C609E /* PageConfiguration.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CD5E5B601A15F156000C609E /* PageConfiguration.cpp */; };
…
 CDC8B5AB18047FF10016E685 /* SourceBufferPrivateAVFObjC.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC8B5A918047FF10016E685 /* SourceBufferPrivateAVFObjC.h */; };
 CDC8B5AD1804AE5D0016E685 /* SourceBufferPrivateClient.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC8B5AC1804AE5D0016E685 /* SourceBufferPrivateClient.h */; };
+CDC939A71E9BDFB100BB768D /* VideoToolboxSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CDC939A51E9BDFB100BB768D /* VideoToolboxSoftLink.cpp */; };
+CDC939A81E9BDFB100BB768D /* VideoToolboxSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC939A61E9BDFB100BB768D /* VideoToolboxSoftLink.h */; };
 CDC979F41C498C0900DB50D4 /* WebCoreNSErrorExtras.mm in Sources */ = {isa = PBXBuildFile; fileRef = CDC979F21C498C0900DB50D4 /* WebCoreNSErrorExtras.mm */; };
 CDC979F51C498C0900DB50D4 /* WebCoreNSErrorExtras.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC979F31C498C0900DB50D4 /* WebCoreNSErrorExtras.h */; };
…
 CD5896DF1CD2B15100B3BCC8 /* WebPlaybackControlsManager.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebPlaybackControlsManager.mm; sourceTree = "<group>"; };
 CD5896E01CD2B15100B3BCC8 /* WebPlaybackControlsManager.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebPlaybackControlsManager.h; sourceTree = "<group>"; };
+CD5D27751E8318E000D80A3D /* WebCoreDecompressionSession.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebCoreDecompressionSession.mm; sourceTree = "<group>"; };
+CD5D27761E8318E000D80A3D /* WebCoreDecompressionSession.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebCoreDecompressionSession.h; sourceTree = "<group>"; };
 CD5E5B5E1A15CE54000C609E /* PageConfiguration.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = PageConfiguration.h; sourceTree = "<group>"; };
 CD5E5B601A15F156000C609E /* PageConfiguration.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = PageConfiguration.cpp; sourceTree = "<group>"; };
…
 CDC8B5A918047FF10016E685 /* SourceBufferPrivateAVFObjC.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = SourceBufferPrivateAVFObjC.h; sourceTree = "<group>"; };
 CDC8B5AC1804AE5D0016E685 /* SourceBufferPrivateClient.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = SourceBufferPrivateClient.h; sourceTree = "<group>"; };
+CDC939A51E9BDFB100BB768D /* VideoToolboxSoftLink.cpp */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = VideoToolboxSoftLink.cpp; sourceTree = "<group>"; };
+CDC939A61E9BDFB100BB768D /* VideoToolboxSoftLink.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = VideoToolboxSoftLink.h; sourceTree = "<group>"; };
 CDC979F21C498C0900DB50D4 /* WebCoreNSErrorExtras.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebCoreNSErrorExtras.mm; sourceTree = "<group>"; };
 CDC979F31C498C0900DB50D4 /* WebCoreNSErrorExtras.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebCoreNSErrorExtras.h; sourceTree = "<group>"; };
…
 52D5A1A51C57488900DE34A3 /* WebVideoFullscreenModelVideoElement.h */,
 52D5A1A61C57488900DE34A3 /* WebVideoFullscreenModelVideoElement.mm */,
+CDC939A51E9BDFB100BB768D /* VideoToolboxSoftLink.cpp */,
+CDC939A61E9BDFB100BB768D /* VideoToolboxSoftLink.h */,
 );
 path = cocoa;
…
 2D3EF4461917915C00034184 /* WebCoreCALayerExtras.h */,
 2D3EF4471917915C00034184 /* WebCoreCALayerExtras.mm */,
+CD5D27751E8318E000D80A3D /* WebCoreDecompressionSession.mm */,
+CD5D27761E8318E000D80A3D /* WebCoreDecompressionSession.h */,
 316BDB8A1E6E153000DE0D5A /* WebGPULayer.h */,
 316BDB891E6E153000DE0D5A /* WebGPULayer.mm */,
…
 E440AA961C68420800A265CC /* ElementAndTextDescendantIterator.h in Headers */,
 E46A2B1E17CA76B1000DBCD8 /* ElementChildIterator.h in Headers */,
+CD5D27781E8318E000D80A3D /* WebCoreDecompressionSession.h in Headers */,
 B5B7A17117C10AC000E4AA0A /* ElementData.h in Headers */,
 93D437A11D57B3F400AB85EA /* ElementDescendantIterator.h in Headers */,
…
 413E00791DB0E4F2002341D2 /* MemoryRelease.h in Headers */,
 93309DFA099E64920056E581 /* MergeIdenticalElementsCommand.h in Headers */,
+CDC939A81E9BDFB100BB768D /* VideoToolboxSoftLink.h in Headers */,
 E1ADECCE0E76AD8B004A1A5E /* MessageChannel.h in Headers */,
 75793E840D0CE0B3007FC0AC /* MessageEvent.h in Headers */,
…
 FD31603012B0267600C1A359 /* DelayProcessor.cpp in Sources */,
 93309DDE099E64920056E581 /* DeleteFromTextNodeCommand.cpp in Sources */,
+CDC939A71E9BDFB100BB768D /* VideoToolboxSoftLink.cpp in Sources */,
 93309DE0099E64920056E581 /* DeleteSelectionCommand.cpp in Sources */,
 9479493C1E045CF300018D85 /* DeprecatedCSSOMPrimitiveValue.cpp in Sources */,
…
 B2227AB70D00BF220071B782 /* SVGStyleElement.cpp in Sources */,
 B2227ABA0D00BF220071B782 /* SVGSVGElement.cpp in Sources */,
+CD5D27771E8318E000D80A3D /* WebCoreDecompressionSession.mm in Sources */,
 B2227ABD0D00BF220071B782 /* SVGSwitchElement.cpp in Sources */,
 B2227AC00D00BF220071B782 /* SVGSymbolElement.cpp in Sources */,
trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp
(r210621 → r217098)

 #include "CoreMediaSPI.h"
 #include "SoftLinking.h"
+#include <CoreVideo/CoreVideo.h>

 SOFT_LINK_FRAMEWORK_FOR_SOURCE(WebCore, CoreMedia)
…
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeRangeGetEnd, CMTime, (CMTimeRange range), (range))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeRangeMake, CMTimeRange, (CMTime start, CMTime duration), (start, duration))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueReset, OSStatus, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueCreate, OSStatus, (CFAllocatorRef allocator, CMItemCount capacity, const CMBufferCallbacks* callbacks, CMBufferQueueRef* queueOut), (allocator, capacity, callbacks, queueOut))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetHead, CMBufferRef, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueDequeueAndRetain, CMBufferRef, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueEnqueue, OSStatus, (CMBufferQueueRef queue, CMBufferRef buffer), (queue, buffer))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueIsEmpty, Boolean, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetBufferCount, CMItemCount, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut))

 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMFormatDescriptionExtension_SampleDescriptionExtensionAtoms, CFStringRef)
…
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (CMTimebaseRef timebase, CMTime time), (timebase, time))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseAddTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseRemoveTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceNextFireTime, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource, CMTime fireTime, uint32_t flags), (timebase, timerSource, fireTime, flags))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceToFireImmediately, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionCreateForImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef* outDesc), (allocator, imageBuffer, outDesc))
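The SOFT_LINK_FUNCTION_FOR_SOURCE macros above generate lazy runtime symbol lookups so WebCore does not have to hard-link CoreMedia at load time. A portable sketch of that mechanism, not WebKit's actual macro expansion: libm's cos() stands in for a framework entry point, and the library names are illustrative.

```cpp
#include <cassert>
#include <dlfcn.h>

// Minimal soft-link sketch: resolve the symbol lazily on first call and cache
// the function pointer. The real macros additionally generate the framework
// dlopen boilerplate and assert on missing symbols.
static double softLink_cos(double x)
{
    static double (*fn)(double) = [] {
        void* handle = dlopen("libm.so.6", RTLD_LAZY); // framework dylib in the real macros
        void* sym = handle ? dlsym(handle, "cos") : dlsym(RTLD_DEFAULT, "cos");
        return reinterpret_cast<double (*)(double)>(sym);
    }();
    return fn ? fn(x) : 0.0;
}
```

Subsequent calls pay only a cached-pointer indirection, which is why the headers can `#define CMBufferQueueCreate softLink_CoreMedia_CMBufferQueueCreate` and let call sites look like ordinary framework calls.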
trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h
(r210621 → r217098)

 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeRangeMake, CMTimeRange, (CMTime start, CMTime duration), (start, duration))
 #define CMTimeRangeMake softLink_CoreMedia_CMTimeRangeMake
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueCreate, OSStatus, (CFAllocatorRef allocator, CMItemCount capacity, const CMBufferCallbacks* callbacks, CMBufferQueueRef* queueOut), (allocator, capacity, callbacks, queueOut))
+#define CMBufferQueueCreate softLink_CoreMedia_CMBufferQueueCreate
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueReset, OSStatus, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueReset softLink_CoreMedia_CMBufferQueueReset
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetHead, CMBufferRef, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetHead softLink_CoreMedia_CMBufferQueueGetHead
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueDequeueAndRetain, CMBufferRef, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueDequeueAndRetain softLink_CoreMedia_CMBufferQueueDequeueAndRetain
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueEnqueue, OSStatus, (CMBufferQueueRef queue, CMBufferRef buffer), (queue, buffer))
+#define CMBufferQueueEnqueue softLink_CoreMedia_CMBufferQueueEnqueue
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueIsEmpty, Boolean, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueIsEmpty softLink_CoreMedia_CMBufferQueueIsEmpty
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetBufferCount, CMItemCount, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetBufferCount softLink_CoreMedia_CMBufferQueueGetBufferCount
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetFirstPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetFirstPresentationTimeStamp
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut))
+#define CMBufferQueueInstallTriggerWithIntegerThreshold softLink_CoreMedia_CMBufferQueueInstallTriggerWithIntegerThreshold

 SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreMedia, kCMFormatDescriptionExtension_SampleDescriptionExtensionAtoms, CFStringRef)
…
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase))
 #define CMTimebaseGetEffectiveRate softLink_CoreMedia_CMTimebaseGetEffectiveRate
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseAddTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+#define CMTimebaseAddTimerDispatchSource softLink_CoreMedia_CMTimebaseAddTimerDispatchSource
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseRemoveTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+#define CMTimebaseRemoveTimerDispatchSource softLink_CoreMedia_CMTimebaseRemoveTimerDispatchSource
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceNextFireTime, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource, CMTime fireTime, uint32_t flags), (timebase, timerSource, fireTime, flags))
+#define CMTimebaseSetTimerDispatchSourceNextFireTime softLink_CoreMedia_CMTimebaseSetTimerDispatchSourceNextFireTime
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceToFireImmediately, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+#define CMTimebaseSetTimerDispatchSourceToFireImmediately softLink_CoreMedia_CMTimebaseSetTimerDispatchSourceToFireImmediately
+
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 #define CMTimeCopyAsDictionary softLink_CoreMedia_CMTimeCopyAsDictionary
trunk/Source/WebCore/platform/cocoa/CoreVideoSoftLink.cpp
(r214302 → r217098)

 SOFT_LINK_FRAMEWORK_FOR_SOURCE(WebCore, CoreVideo)

+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferGetTypeID, CFTypeID, (), ())
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferGetWidth, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferGetHeight, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
trunk/Source/WebCore/platform/cocoa/CoreVideoSoftLink.h
(r214302 → r217098)

 SOFT_LINK_FRAMEWORK_FOR_HEADER(WebCore, CoreVideo)

+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreVideo, CVPixelBufferGetTypeID, CFTypeID, (), ())
+#define CVPixelBufferGetTypeID softLink_CoreVideo_CVPixelBufferGetTypeID
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreVideo, CVPixelBufferGetWidth, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
 #define CVPixelBufferGetWidth softLink_CoreVideo_CVPixelBufferGetWidth
trunk/Source/WebCore/platform/graphics/SourceBufferPrivateClient.h
(r210319 → r217098)

     virtual bool sourceBufferPrivateHasVideo() const = 0;

+    virtual void sourceBufferPrivateReenqueSamples(const AtomicString& trackID) = 0;
     virtual void sourceBufferPrivateDidBecomeReadyForMoreSamples(const AtomicString& trackID) = 0;
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h
(r216951 → r217098)

 typedef struct OpaqueCMTimebase* CMTimebaseRef;
+typedef struct __CVBuffer *CVPixelBufferRef;
+typedef struct __CVBuffer *CVOpenGLTextureRef;

 namespace WebCore {

 class CDMSessionMediaSourceAVFObjC;
+class MediaSourcePrivateAVFObjC;
+class PixelBufferConformerCV;
 class PlatformClockCM;
-class MediaSourcePrivateAVFObjC;
+class TextureCacheCV;
+class VideoTextureCopierCV;
+class WebCoreDecompressionSession;

 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
…
     static void getSupportedTypes(HashSet<String, ASCIICaseInsensitiveHash>& types);
     static MediaPlayer::SupportsType supportsType(const MediaEngineSupportParameters&);
-
-    void addDisplayLayer(AVSampleBufferDisplayLayer*);
-    void removeDisplayLayer(AVSampleBufferDisplayLayer*);

     void addAudioRenderer(AVSampleBufferAudioRenderer*);
…
     void characteristicsChanged();

+    MediaTime currentMediaTime() const override;
+    AVSampleBufferDisplayLayer* sampleBufferDisplayLayer() const { return m_sampleBufferDisplayLayer.get(); }
+    WebCoreDecompressionSession* decompressionSession() const { return m_decompressionSession.get(); }
+
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
     void setVideoFullscreenLayer(PlatformLayer*, std::function<void()> completionHandler) override;
…
     MediaTime durationMediaTime() const override;
-    MediaTime currentMediaTime() const override;
     MediaTime startTime() const override;
     MediaTime initialTime() const override;
…
     void setSize(const IntSize&) override;

+    NativeImagePtr nativeImageForCurrentTime() override;
+    bool updateLastPixelBuffer();
+    bool updateLastImage();
     void paint(GraphicsContext&, const FloatRect&) override;
     void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) override;
     bool copyVideoTextureToPlatformTexture(GraphicsContext3D*, Platform3DObject, GC3Denum target, GC3Dint level, GC3Denum internalFormat, GC3Denum format, GC3Denum type, bool premultiplyAlpha, bool flipY) override;
+
     bool hasAvailableVideoFrame() const override;
…
     void acceleratedRenderingStateChanged() override;
     void notifyActiveSourceBuffersChanged() override;
+
+    // NOTE: Because the only way for MSE to recieve data is through an ArrayBuffer provided by
+    // javascript running in the page, the video will, by necessity, always be CORS correct and
+    // in the page's origin.
+    bool hasSingleSecurityOrigin() const override { return true; }
+    bool didPassCORSAccessCheck() const override { return true; }

     MediaPlayer::MovieLoadType movieLoadType() const override;
…
     void ensureLayer();
     void destroyLayer();
+    void ensureDecompressionSession();
+    void destroyDecompressionSession();
+
     bool shouldBePlaying() const;
…
     RetainPtr<id> m_durationObserver;
     RetainPtr<AVStreamSession> m_streamSession;
+    RetainPtr<CVPixelBufferRef> m_lastPixelBuffer;
+    RetainPtr<CGImageRef> m_lastImage;
+    std::unique_ptr<PixelBufferConformerCV> m_rgbConformer;
+    RefPtr<WebCoreDecompressionSession> m_decompressionSession;
     Deque<RetainPtr<id>> m_sizeChangeObservers;
     Timer m_seekTimer;
…
     MediaPlayer::NetworkState m_networkState;
     MediaPlayer::ReadyState m_readyState;
+    bool m_readyStateIsWaitingForAvailableFrame { false };
     MediaTime m_lastSeekTime;
     FloatSize m_naturalSize;
…
     bool m_playing;
     bool m_seeking;
-    bool m_seekCompleted;
+    enum SeekState {
+        Seeking,
+        WaitingForAvailableFame,
+        SeekCompleted,
+    };
+    SeekState m_seekCompleted { SeekCompleted };
     mutable bool m_loadingProgressed;
-    bool m_hasAvailableVideoFrame;
+    bool m_hasBeenAskedToPaintGL { false };
+    bool m_hasAvailableVideoFrame { false };
     bool m_allRenderersHaveAvailableSamples { false };
     RetainPtr<PlatformLayer> m_textTrackRepresentationLayer;
+    std::unique_ptr<TextureCacheCV> m_textureCache;
+    std::unique_ptr<VideoTextureCopierCV> m_videoTextureCopier;
+    RetainPtr<CVOpenGLTextureRef> m_lastTexture;
 #if ENABLE(WIRELESS_PLAYBACK_TARGET)
     RefPtr<MediaPlaybackTarget> m_playbackTarget;
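The header's three-state m_seekCompleted enum replaces a boolean so that a seek can be parked until a decoded video frame actually arrives. A free-standing sketch of that state machine, with hypothetical names (the enumerator is spelled WaitingForAvailableFrame here for readability):

```cpp
#include <cassert>

// Sketch of the seek bookkeeping: a seek is not reported complete until a
// decoded video frame is available to paint.
enum class SeekState { Seeking, WaitingForAvailableFrame, SeekCompleted };

struct SeekTracker {
    SeekState state { SeekState::SeekCompleted };
    bool hasVideo { false };
    bool hasAvailableVideoFrame { false };

    void waitForSeekCompleted() { state = SeekState::Seeking; }

    // Mirrors seekCompleted(): if video is expected but no frame has been
    // decoded yet, park instead of completing.
    void seekCompleted()
    {
        if (state == SeekState::SeekCompleted)
            return;
        if (hasVideo && !hasAvailableVideoFrame) {
            state = SeekState::WaitingForAvailableFrame;
            return;
        }
        state = SeekState::SeekCompleted;
    }

    // Mirrors setHasAvailableVideoFrame(): a parked seek completes now.
    void setHasAvailableVideoFrame()
    {
        hasAvailableVideoFrame = true;
        if (state == SeekState::WaitingForAvailableFrame)
            state = SeekState::SeekCompleted;
    }

    bool seeking() const { return state != SeekState::SeekCompleted; }
};
```

The payoff is that seeking() stays true through the parked state, so the element does not fire seeked (or advance its ready state) before there is a frame to show.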
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm
(r216951 → r217098)

 #import "CDMSessionMediaSourceAVFObjC.h"
 #import "FileSystem.h"
+#import "GraphicsContextCG.h"
 #import "Logging.h"
 #import "MediaSourcePrivateAVFObjC.h"
 #import "MediaSourcePrivateClient.h"
 #import "MediaTimeAVFoundation.h"
+#import "PixelBufferConformerCV.h"
 #import "PlatformClockCM.h"
 #import "TextTrackRepresentation.h"
+#import "TextureCacheCV.h"
+#import "VideoTextureCopierCV.h"
+#import "WebCoreDecompressionSession.h"
 #import "WebCoreSystemInterface.h"
 #import <AVFoundation/AVAsset.h>
…
     , m_playing(0)
     , m_seeking(false)
-    , m_seekCompleted(true)
     , m_loadingProgressed(false)
 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)
…
     if (shouldBePlaying())
         [m_synchronizer setRate:m_rate];
-    if (!seeking())
+    if (!seeking() && m_seekCompleted == SeekCompleted)
         m_player->timeChanged();
…
     m_mediaSourcePrivate = MediaSourcePrivateAVFObjC::create(this, client);
+    m_mediaSourcePrivate->setVideoLayer(m_sampleBufferDisplayLayer.get());
+    m_mediaSourcePrivate->setDecompressionSession(m_decompressionSession.get());
+
+    acceleratedRenderingStateChanged();
 }
…
 void MediaPlayerPrivateMediaSourceAVFObjC::setVisible(bool)
 {
-    // No-op.
+    acceleratedRenderingStateChanged();
 }
…
         return;
     LOG(MediaSource, "MediaPlayerPrivateMediaSourceAVFObjC::waitForSeekCompleted(%p)", this);
-    m_seekCompleted = false;
+    m_seekCompleted = Seeking;
 }

 void MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted()
 {
-    if (m_seekCompleted)
-        return;
+    if (m_seekCompleted == SeekCompleted)
+        return;
+    if (hasVideo() && !m_hasAvailableVideoFrame) {
+        m_seekCompleted = WaitingForAvailableFame;
+        return;
+    }
     LOG(MediaSource, "MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted(%p)", this);
-    m_seekCompleted = true;
+    m_seekCompleted = SeekCompleted;
     if (shouldBePlaying())
         [m_synchronizer setRate:m_rate];
…
 bool MediaPlayerPrivateMediaSourceAVFObjC::seeking() const
 {
-    return m_seeking || !m_seekCompleted;
+    return m_seeking || m_seekCompleted != SeekCompleted;
 }
…
-void MediaPlayerPrivateMediaSourceAVFObjC::paint(GraphicsContext&, const FloatRect&)
-{
-    // FIXME(125157): Implement painting.
-}
-
-void MediaPlayerPrivateMediaSourceAVFObjC::paintCurrentFrameInContext(GraphicsContext&, const FloatRect&)
-{
-    // FIXME(125157): Implement painting.
-}
+NativeImagePtr MediaPlayerPrivateMediaSourceAVFObjC::nativeImageForCurrentTime()
+{
+    updateLastImage();
+    return m_lastImage.get();
+}
+
+bool MediaPlayerPrivateMediaSourceAVFObjC::updateLastPixelBuffer()
+{
+    if (m_sampleBufferDisplayLayer || !m_decompressionSession)
+        return false;
+
+    auto flags = !m_lastPixelBuffer ? WebCoreDecompressionSession::AllowLater : WebCoreDecompressionSession::ExactTime;
+    auto newPixelBuffer = m_decompressionSession->imageForTime(currentMediaTime(), flags);
+    if (!newPixelBuffer)
+        return false;
+
+    m_lastPixelBuffer = newPixelBuffer;
+    return true;
+}
+
+bool MediaPlayerPrivateMediaSourceAVFObjC::updateLastImage()
+{
+    if (!updateLastPixelBuffer())
+        return false;
+
+    ASSERT(m_lastPixelBuffer);
+
+    if (!m_rgbConformer) {
+        NSDictionary *attributes = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA) };
+        m_rgbConformer = std::make_unique<PixelBufferConformerCV>((CFDictionaryRef)attributes);
+    }
+
+    m_lastImage = m_rgbConformer->createImageFromPixelBuffer(m_lastPixelBuffer.get());
+    return true;
+}
+
+void MediaPlayerPrivateMediaSourceAVFObjC::paint(GraphicsContext& context, const FloatRect& rect)
+{
+    paintCurrentFrameInContext(context, rect);
+}
+
+void MediaPlayerPrivateMediaSourceAVFObjC::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& outputRect)
+{
+    if (context.paintingDisabled())
+        return;
+
+    auto image = nativeImageForCurrentTime();
+    if (!image)
+        return;
+
+    GraphicsContextStateSaver stateSaver(context);
+    FloatRect imageRect(0, 0, CGImageGetWidth(image.get()), CGImageGetHeight(image.get()));
+    context.drawNativeImage(image, imageRect.size(), outputRect, imageRect);
+}
+
+bool MediaPlayerPrivateMediaSourceAVFObjC::copyVideoTextureToPlatformTexture(GraphicsContext3D* context, Platform3DObject outputTexture, GC3Denum outputTarget, GC3Dint level, GC3Denum internalFormat, GC3Denum format, GC3Denum type, bool premultiplyAlpha, bool flipY)
+{
+    if (flipY || premultiplyAlpha)
+        return false;
+
+    // We have been asked to paint into a WebGL canvas, so take that as a signal to create
+    // a decompression session, even if that means the native video can't also be displayed
+    // in page.
592 if (!m_hasBeenAskedToPaintGL) { 593 m_hasBeenAskedToPaintGL = true; 594 acceleratedRenderingStateChanged(); 595 } 596 597 ASSERT(context); 598 599 if (updateLastPixelBuffer()) { 600 if (!m_lastPixelBuffer) 601 return false; 602 603 if (!m_textureCache) { 604 m_textureCache = TextureCacheCV::create(*context); 605 if (!m_textureCache) 606 return false; 607 } 608 609 m_lastTexture = m_textureCache->textureFromImage(m_lastPixelBuffer.get(), outputTarget, level, internalFormat, format, type); 610 } 611 612 size_t width = CVPixelBufferGetWidth(m_lastPixelBuffer.get()); 613 size_t height = CVPixelBufferGetHeight(m_lastPixelBuffer.get()); 614 615 if (!m_videoTextureCopier) 616 m_videoTextureCopier = std::make_unique<VideoTextureCopierCV>(*context); 617 618 return m_videoTextureCopier->copyVideoTextureToPlatformTexture(m_lastTexture.get(), width, height, outputTexture, outputTarget, level, internalFormat, format, type, premultiplyAlpha, flipY); 525 619 } 526 620 … … 537 631 void MediaPlayerPrivateMediaSourceAVFObjC::acceleratedRenderingStateChanged() 538 632 { 539 if (m_player->client().mediaPlayerRenderingCanBeAccelerated(m_player)) 633 if (!m_hasBeenAskedToPaintGL && m_player->visible() && m_player->client().mediaPlayerRenderingCanBeAccelerated(m_player)) { 634 destroyDecompressionSession(); 540 635 ensureLayer(); 541 else636 } else { 542 637 destroyLayer(); 638 ensureDecompressionSession(); 639 } 543 640 } 544 641 … … 609 706 610 707 [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()]; 611 612 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE) 708 if (m_mediaSourcePrivate) 709 m_mediaSourcePrivate->setVideoLayer(m_sampleBufferDisplayLayer.get()); 710 #if PLATFORM(IOS) || (PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)) 613 711 m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size()); 614 712 #endif 713 
m_player->client().mediaPlayerRenderingModeChanged(m_player); 615 714 } 616 715 … … 624 723 // No-op. 625 724 }]; 725 726 if (m_mediaSourcePrivate) 727 m_mediaSourcePrivate->setVideoLayer(nullptr); 728 #if PLATFORM(IOS) || (PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)) 729 m_videoFullscreenLayerManager->didDestroyVideoLayer(); 730 #endif 626 731 m_sampleBufferDisplayLayer = nullptr; 732 setHasAvailableVideoFrame(false); 733 m_player->client().mediaPlayerRenderingModeChanged(m_player); 734 } 735 736 void MediaPlayerPrivateMediaSourceAVFObjC::ensureDecompressionSession() 737 { 738 if (m_decompressionSession) 739 return; 740 741 m_decompressionSession = WebCoreDecompressionSession::create(); 742 m_decompressionSession->setTimebase([m_synchronizer timebase]); 743 m_mediaSourcePrivate->setDecompressionSession(m_decompressionSession.get()); 744 745 m_player->client().mediaPlayerRenderingModeChanged(m_player); 746 } 747 748 void MediaPlayerPrivateMediaSourceAVFObjC::destroyDecompressionSession() 749 { 750 if (!m_decompressionSession) 751 return; 752 753 m_mediaSourcePrivate->setDecompressionSession(nullptr); 754 m_decompressionSession->invalidate(); 755 m_decompressionSession = nullptr; 756 setHasAvailableVideoFrame(false); 627 757 } 628 758 … … 638 768 m_hasAvailableVideoFrame = flag; 639 769 updateAllRenderersHaveAvailableSamples(); 770 771 if (!m_hasAvailableVideoFrame) 772 return; 773 774 m_player->firstVideoFrameAvailable(); 775 if (m_seekCompleted == WaitingForAvailableFame) 776 seekCompleted(); 777 778 if (m_readyStateIsWaitingForAvailableFrame) { 779 m_readyStateIsWaitingForAvailableFrame = false; 780 m_player->readyStateChanged(); 781 } 640 782 } 641 783 … … 658 800 659 801 do { 660 if ( m_sampleBufferDisplayLayer&& !m_hasAvailableVideoFrame) {802 if (hasVideo() && !m_hasAvailableVideoFrame) { 661 803 allRenderersHaveAvailableSamples = false; 662 804 break; … … 815 957 [m_synchronizer setRate:0]; 816 958 959 if (m_readyState >= 
MediaPlayerEnums::HaveCurrentData && hasVideo() && !m_hasAvailableVideoFrame) { 960 m_readyStateIsWaitingForAvailableFrame = true; 961 return; 962 } 963 817 964 m_player->readyStateChanged(); 818 965 } … … 825 972 m_networkState = networkState; 826 973 m_player->networkStateChanged(); 827 }828 829 void MediaPlayerPrivateMediaSourceAVFObjC::addDisplayLayer(AVSampleBufferDisplayLayer* displayLayer)830 {831 ASSERT(displayLayer);832 if (displayLayer == m_sampleBufferDisplayLayer)833 return;834 835 m_sampleBufferDisplayLayer = displayLayer;836 [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()];837 m_player->client().mediaPlayerRenderingModeChanged(m_player);838 839 // FIXME: move this somewhere appropriate:840 m_player->firstVideoFrameAvailable();841 842 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)843 m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());844 #endif845 }846 847 void MediaPlayerPrivateMediaSourceAVFObjC::removeDisplayLayer(AVSampleBufferDisplayLayer* displayLayer)848 {849 if (displayLayer != m_sampleBufferDisplayLayer)850 return;851 852 CMTime currentTime = CMTimebaseGetTime([m_synchronizer timebase]);853 [m_synchronizer removeRenderer:m_sampleBufferDisplayLayer.get() atTime:currentTime withCompletionHandler:^(BOOL){854 // No-op.855 }];856 857 m_sampleBufferDisplayLayer = nullptr;858 m_player->client().mediaPlayerRenderingModeChanged(m_player);859 860 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)861 m_videoFullscreenLayerManager->didDestroyVideoLayer();862 #endif863 974 } 864 975 … … 895 1006 void MediaPlayerPrivateMediaSourceAVFObjC::characteristicsChanged() 896 1007 { 1008 updateAllRenderersHaveAvailableSamples(); 897 1009 m_player->characteristicChanged(); 898 1010 } -
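The hunk above replaces the boolean `m_seekCompleted` with a three-state value so that, when the element has video, seek completion (and the ready-state change below it) is parked until the decompression session has produced a decoded frame, and then replayed from `setHasAvailableVideoFrame()`. The following is a minimal sketch of that state machine only; `PlayerModel`, `timeChangedCount`, and the simplified member names are hypothetical stand-ins, not WebKit API.

```cpp
#include <cassert>

// Hypothetical model of the tri-state seek logic in this patch: seekCompleted()
// must not fire for a video track until a decoded frame is available, so
// completion is parked in WaitingForAvailableFrame and replayed from
// setHasAvailableVideoFrame(). (The patch itself spells the enumerator
// "WaitingForAvailableFame".)
enum class SeekState { Seeking, WaitingForAvailableFrame, SeekCompleted };

struct PlayerModel {
    SeekState seekState = SeekState::SeekCompleted;
    bool hasVideo = false;
    bool hasAvailableVideoFrame = false;
    int timeChangedCount = 0; // stands in for m_player->timeChanged()

    void waitForSeekCompleted() { seekState = SeekState::Seeking; }

    void seekCompleted() {
        if (seekState == SeekState::SeekCompleted)
            return;
        // Defer completion until a decoded video frame exists.
        if (hasVideo && !hasAvailableVideoFrame) {
            seekState = SeekState::WaitingForAvailableFrame;
            return;
        }
        seekState = SeekState::SeekCompleted;
        ++timeChangedCount;
    }

    void setHasAvailableVideoFrame(bool flag) {
        hasAvailableVideoFrame = flag;
        if (flag && seekState == SeekState::WaitingForAvailableFrame)
            seekCompleted(); // replay the parked completion
    }

    bool seeking() const { return seekState != SeekState::SeekCompleted; }
};
```

The same park-and-replay shape is reused for the ready-state change (`m_readyStateIsWaitingForAvailableFrame`) later in the file.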
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.h
r207694 r217098 36 36 37 37 OBJC_CLASS AVAsset; 38 OBJC_CLASS AVSampleBufferDisplayLayer; 38 39 OBJC_CLASS AVStreamDataParser; 39 40 OBJC_CLASS NSError; … … 48 49 class SourceBufferPrivateAVFObjC; 49 50 class TimeRanges; 51 class WebCoreDecompressionSession; 50 52 51 53 class MediaSourcePrivateAVFObjC final : public MediaSourcePrivate { … … 72 74 bool hasAudio() const; 73 75 bool hasVideo() const; 76 bool hasSelectedVideo() const; 74 77 75 78 void willSeek(); … … 77 80 MediaTime fastSeekTimeForMediaTime(const MediaTime&, const MediaTime& negativeThreshold, const MediaTime& positiveThreshold); 78 81 FloatSize naturalSize() const; 82 83 void hasSelectedVideoChanged(SourceBufferPrivateAVFObjC&); 84 void setVideoLayer(AVSampleBufferDisplayLayer*); 85 void setDecompressionSession(WebCoreDecompressionSession*); 79 86 80 87 private: … … 89 96 void removeSourceBuffer(SourceBufferPrivate*); 90 97 98 void setSourceBufferWithSelectedVideo(SourceBufferPrivateAVFObjC*); 99 91 100 friend class SourceBufferPrivateAVFObjC; 92 101 … … 96 105 Vector<SourceBufferPrivateAVFObjC*> m_activeSourceBuffers; 97 106 Deque<SourceBufferPrivateAVFObjC*> m_sourceBuffersNeedingSessions; 107 SourceBufferPrivateAVFObjC* m_sourceBufferWithSelectedVideo { nullptr }; 98 108 bool m_isEnded; 99 109 }; -
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.mm
r208658 r217098 175 175 } 176 176 177 static bool MediaSourcePrivateAVFObjCHasVideo(SourceBufferPrivateAVFObjC* sourceBuffer)178 {179 return sourceBuffer->hasVideo();180 }181 182 177 bool MediaSourcePrivateAVFObjC::hasVideo() const 183 178 { 184 return std::any_of(m_activeSourceBuffers.begin(), m_activeSourceBuffers.end(), MediaSourcePrivateAVFObjCHasVideo); 179 return std::any_of(m_activeSourceBuffers.begin(), m_activeSourceBuffers.end(), [] (SourceBufferPrivateAVFObjC* sourceBuffer) { 180 return sourceBuffer->hasVideo(); 181 }); 182 } 183 184 bool MediaSourcePrivateAVFObjC::hasSelectedVideo() const 185 { 186 return std::any_of(m_activeSourceBuffers.begin(), m_activeSourceBuffers.end(), [] (SourceBufferPrivateAVFObjC* sourceBuffer) { 187 return sourceBuffer->hasSelectedVideo(); 188 }); 185 189 } 186 190 … … 219 223 } 220 224 225 void MediaSourcePrivateAVFObjC::hasSelectedVideoChanged(SourceBufferPrivateAVFObjC& sourceBuffer) 226 { 227 bool hasSelectedVideo = sourceBuffer.hasSelectedVideo(); 228 if (m_sourceBufferWithSelectedVideo == &sourceBuffer && !hasSelectedVideo) 229 setSourceBufferWithSelectedVideo(nullptr); 230 else if (m_sourceBufferWithSelectedVideo != &sourceBuffer && hasSelectedVideo) 231 setSourceBufferWithSelectedVideo(&sourceBuffer); 232 } 233 234 void MediaSourcePrivateAVFObjC::setVideoLayer(AVSampleBufferDisplayLayer* layer) 235 { 236 if (m_sourceBufferWithSelectedVideo) 237 m_sourceBufferWithSelectedVideo->setVideoLayer(layer); 238 } 239 240 void MediaSourcePrivateAVFObjC::setDecompressionSession(WebCoreDecompressionSession* decompressionSession) 241 { 242 if (m_sourceBufferWithSelectedVideo) 243 m_sourceBufferWithSelectedVideo->setDecompressionSession(decompressionSession); 244 } 245 246 void MediaSourcePrivateAVFObjC::setSourceBufferWithSelectedVideo(SourceBufferPrivateAVFObjC* sourceBuffer) 247 { 248 if (m_sourceBufferWithSelectedVideo) { 249 m_sourceBufferWithSelectedVideo->setVideoLayer(nullptr); 250 
m_sourceBufferWithSelectedVideo->setDecompressionSession(nullptr); 251 } 252 253 m_sourceBufferWithSelectedVideo = sourceBuffer; 254 255 if (m_sourceBufferWithSelectedVideo) { 256 m_sourceBufferWithSelectedVideo->setVideoLayer(m_player->sampleBufferDisplayLayer()); 257 m_sourceBufferWithSelectedVideo->setDecompressionSession(m_player->decompressionSession()); 258 } 259 } 260 221 261 } 222 262 -
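`setSourceBufferWithSelectedVideo()` above enforces that at most one source buffer ever owns the player's display layer and decompression session: the previous buffer is detached from both renderers before the new one is attached. A small sketch of that hand-off invariant, using hypothetical stand-in types (strings in place of the `AVSampleBufferDisplayLayer` and `WebCoreDecompressionSession` pointers):

```cpp
#include <cassert>
#include <string>

// Hypothetical model of the renderer hand-off: detach the old buffer from
// both renderers, then attach the new one, so only the source buffer with
// the selected video track ever feeds the layer / decompression session.
struct BufferModel {
    std::string videoLayer;            // empty == no layer attached
    std::string decompressionSession;  // empty == no session attached
};

struct MediaSourceModel {
    BufferModel* selected = nullptr;
    std::string playerLayer = "layer";      // m_player->sampleBufferDisplayLayer()
    std::string playerSession = "session";  // m_player->decompressionSession()

    void setSourceBufferWithSelectedVideo(BufferModel* buffer) {
        if (selected) {
            selected->videoLayer.clear();
            selected->decompressionSession.clear();
        }
        selected = buffer;
        if (selected) {
            selected->videoLayer = playerLayer;
            selected->decompressionSession = playerSession;
        }
    }
};
```

`hasSelectedVideoChanged()` then only has to decide whether the notifying buffer should become, or stop being, the selected one.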
trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h
r210319 r217098 63 63 class AudioTrackPrivateMediaSourceAVFObjC; 64 64 class VideoTrackPrivateMediaSourceAVFObjC; 65 class WebCoreDecompressionSession; 65 66 66 67 class SourceBufferPrivateAVFObjCErrorClient { … … 89 90 90 91 bool hasVideo() const; 92 bool hasSelectedVideo() const; 91 93 bool hasAudio() const; 92 94 … … 109 111 void layerDidReceiveError(AVSampleBufferDisplayLayer *, NSError *); 110 112 void rendererDidReceiveError(AVSampleBufferAudioRenderer *, NSError *); 113 114 void setVideoLayer(AVSampleBufferDisplayLayer*); 115 void setDecompressionSession(WebCoreDecompressionSession*); 111 116 112 117 private: … … 153 158 OSObjectPtr<dispatch_semaphore_t> m_hasSessionSemaphore; 154 159 OSObjectPtr<dispatch_group_t> m_isAppendingGroup; 160 RefPtr<WebCoreDecompressionSession> m_decompressionSession; 155 161 156 162 MediaSourcePrivateAVFObjC* m_mediaSource; -
trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm
r214930 r217098 30 30 31 31 #import "AVFoundationSPI.h" 32 #import "AudioTrackPrivateMediaSourceAVFObjC.h" 32 33 #import "CDMSessionAVContentKeySession.h" 33 34 #import "CDMSessionMediaSourceAVFObjC.h" 35 #import "InbandTextTrackPrivateAVFObjC.h" 34 36 #import "Logging.h" 35 37 #import "MediaDescription.h" … … 43 45 #import "SourceBufferPrivateClient.h" 44 46 #import "TimeRanges.h" 45 #import "AudioTrackPrivateMediaSourceAVFObjC.h"46 47 #import "VideoTrackPrivateMediaSourceAVFObjC.h" 47 #import " InbandTextTrackPrivateAVFObjC.h"48 #import "WebCoreDecompressionSession.h" 48 49 #import <AVFoundation/AVAssetTrack.h> 49 50 #import <QuartzCore/CALayer.h> 51 #import <map> 50 52 #import <objc/runtime.h> 51 53 #import <runtime/TypedArrayInlines.h> 52 #import <wtf/text/AtomicString.h>53 #import <wtf/text/CString.h>54 54 #import <wtf/BlockObjCExceptions.h> 55 55 #import <wtf/HashCountedSet.h> 56 56 #import <wtf/MainThread.h> 57 57 #import <wtf/WeakPtr.h> 58 #import <map> 58 #import <wtf/text/AtomicString.h> 59 #import <wtf/text/CString.h> 59 60 60 61 #pragma mark - Soft Linking … … 652 653 void SourceBufferPrivateAVFObjC::destroyRenderers() 653 654 { 654 if (m_displayLayer) { 655 if (m_mediaSource) 656 m_mediaSource->player()->removeDisplayLayer(m_displayLayer.get()); 657 [m_displayLayer flush]; 658 [m_displayLayer stopRequestingMediaData]; 659 [m_errorListener stopObservingLayer:m_displayLayer.get()]; 660 m_displayLayer = nullptr; 661 } 655 if (m_displayLayer) 656 setVideoLayer(nullptr); 657 658 if (m_decompressionSession) 659 setDecompressionSession(nullptr); 662 660 663 661 for (auto& renderer : m_audioRenderers.values()) { … … 697 695 } 698 696 697 bool SourceBufferPrivateAVFObjC::hasSelectedVideo() const 698 { 699 return m_enabledVideoTrackID != -1; 700 } 701 699 702 bool SourceBufferPrivateAVFObjC::hasAudio() const 700 703 { … … 708 711 m_enabledVideoTrackID = -1; 709 712 [m_parser setShouldProvideMediaData:NO forTrackID:trackID]; 710 if (m_mediaSource) 711 
m_mediaSource->player()->removeDisplayLayer(m_displayLayer.get()); 713 714 if (m_decompressionSession) 715 m_decompressionSession->stopRequestingMediaData(); 712 716 } else if (track->selected()) { 713 717 m_enabledVideoTrackID = trackID; 714 718 [m_parser setShouldProvideMediaData:YES forTrackID:trackID]; 715 if (!m_displayLayer) { 716 m_displayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]); 717 #ifndef NDEBUG 718 [m_displayLayer setName:@"SourceBufferPrivateAVFObjC AVSampleBufferDisplayLayer"]; 719 #endif 720 [m_displayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{ 719 720 if (m_decompressionSession) { 721 m_decompressionSession->requestMediaDataWhenReady([this, trackID] { 721 722 didBecomeReadyForMoreSamples(trackID); 722 }]; 723 [m_errorListener beginObservingLayer:m_displayLayer.get()]; 723 }); 724 724 } 725 if (m_mediaSource)726 m_mediaSource->player()->addDisplayLayer(m_displayLayer.get()); 727 }725 } 726 727 m_mediaSource->hasSelectedVideoChanged(*this); 728 728 } 729 729 … … 794 794 [m_displayLayer flushAndRemoveImage]; 795 795 796 if (m_decompressionSession) { 797 m_decompressionSession->flush(); 798 m_decompressionSession->notifyWhenHasAvailableVideoFrame([weakThis = createWeakPtr()] { 799 if (weakThis && weakThis->m_mediaSource) 800 weakThis->m_mediaSource->player()->setHasAvailableVideoFrame(true); 801 }); 802 } 803 796 804 for (auto& renderer : m_audioRenderers.values()) 797 805 [renderer flush]; … … 853 861 LOG(MediaSource, "SourceBufferPrivateAVFObjC::flush(%p) - trackId: %d", this, trackID); 854 862 855 if (trackID == m_enabledVideoTrackID) 863 if (trackID == m_enabledVideoTrackID) { 856 864 flush(m_displayLayer.get()); 857 else if (m_audioRenderers.contains(trackID)) 865 if (m_decompressionSession) { 866 m_decompressionSession->flush(); 867 m_decompressionSession->notifyWhenHasAvailableVideoFrame([weakThis = createWeakPtr()] { 868 if (weakThis && weakThis->m_mediaSource) 869 
weakThis->m_mediaSource->player()->setHasAvailableVideoFrame(true); 870 }); 871 } 872 } else if (m_audioRenderers.contains(trackID)) 858 873 flush(m_audioRenderers.get(trackID).get()); 859 874 } … … 905 920 } 906 921 907 [m_displayLayer enqueueSampleBuffer:platformSample.sample.cmSampleBuffer]; 908 if (m_mediaSource) 909 m_mediaSource->player()->setHasAvailableVideoFrame(!sample->isNonDisplaying()); 922 if (m_decompressionSession) 923 m_decompressionSession->enqueueSample(platformSample.sample.cmSampleBuffer); 924 925 if (m_displayLayer) { 926 [m_displayLayer enqueueSampleBuffer:platformSample.sample.cmSampleBuffer]; 927 if (m_mediaSource) 928 m_mediaSource->player()->setHasAvailableVideoFrame(!sample->isNonDisplaying()); 929 } 910 930 } else { 911 931 auto renderer = m_audioRenderers.get(trackID); … … 920 940 int trackID = trackIDString.toInt(); 921 941 if (trackID == m_enabledVideoTrackID) 922 return [m_displayLayer isReadyForMoreMediaData]; 923 else if (m_audioRenderers.contains(trackID)) 942 return !m_decompressionSession || m_decompressionSession->isReadyForMoreMediaData(); 943 944 if (m_audioRenderers.contains(trackID)) 924 945 return [m_audioRenderers.get(trackID) isReadyForMoreMediaData]; 925 946 … … 962 983 void SourceBufferPrivateAVFObjC::didBecomeReadyForMoreSamples(int trackID) 963 984 { 964 if (trackID == m_enabledVideoTrackID) 985 LOG(Media, "SourceBufferPrivateAVFObjC::didBecomeReadyForMoreSamples(%p) - track(%d)", this, trackID); 986 if (trackID == m_enabledVideoTrackID) { 987 if (m_decompressionSession) 988 m_decompressionSession->stopRequestingMediaData(); 965 989 [m_displayLayer stopRequestingMediaData]; 966 else if (m_audioRenderers.contains(trackID))990 } else if (m_audioRenderers.contains(trackID)) 967 991 [m_audioRenderers.get(trackID) stopRequestingMediaData]; 968 992 else { … … 979 1003 int trackID = trackIDString.toInt(); 980 1004 if (trackID == m_enabledVideoTrackID) { 981 [m_displayLayer 
requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{ 1005 if (m_decompressionSession) { 1006 m_decompressionSession->requestMediaDataWhenReady([this, trackID] { 1007 didBecomeReadyForMoreSamples(trackID); 1008 }); 1009 } 1010 if (m_displayLayer) { 1011 [m_displayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ { 1012 didBecomeReadyForMoreSamples(trackID); 1013 }]; 1014 } 1015 } else if (m_audioRenderers.contains(trackID)) { 1016 [m_audioRenderers.get(trackID) requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ { 982 1017 didBecomeReadyForMoreSamples(trackID); 983 1018 }]; 984 } else if (m_audioRenderers.contains(trackID)) { 985 [m_audioRenderers.get(trackID) requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{ 986 didBecomeReadyForMoreSamples(trackID); 1019 } 1020 } 1021 1022 void SourceBufferPrivateAVFObjC::setVideoLayer(AVSampleBufferDisplayLayer* layer) 1023 { 1024 if (layer == m_displayLayer) 1025 return; 1026 1027 ASSERT(!layer || !m_decompressionSession || hasSelectedVideo()); 1028 1029 if (m_displayLayer) { 1030 [m_displayLayer flush]; 1031 [m_displayLayer stopRequestingMediaData]; 1032 [m_errorListener stopObservingLayer:m_displayLayer.get()]; 1033 } 1034 1035 m_displayLayer = layer; 1036 1037 if (m_displayLayer) { 1038 [m_displayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ { 1039 didBecomeReadyForMoreSamples(m_enabledVideoTrackID); 987 1040 }]; 988 } 1041 [m_errorListener beginObservingLayer:m_displayLayer.get()]; 1042 if (m_client) 1043 m_client->sourceBufferPrivateReenqueSamples(AtomicString::number(m_enabledVideoTrackID)); 1044 } 1045 } 1046 1047 void SourceBufferPrivateAVFObjC::setDecompressionSession(WebCoreDecompressionSession* decompressionSession) 1048 { 1049 if (m_decompressionSession == decompressionSession) 1050 return; 1051 1052 if (m_decompressionSession) { 1053 m_decompressionSession->stopRequestingMediaData(); 1054 
m_decompressionSession->invalidate(); 1055 } 1056 1057 m_decompressionSession = decompressionSession; 1058 1059 if (!m_decompressionSession) 1060 return; 1061 1062 WeakPtr<SourceBufferPrivateAVFObjC> weakThis = createWeakPtr(); 1063 m_decompressionSession->requestMediaDataWhenReady([weakThis] { 1064 if (weakThis) 1065 weakThis->didBecomeReadyForMoreSamples(weakThis->m_enabledVideoTrackID); 1066 }); 1067 m_decompressionSession->notifyWhenHasAvailableVideoFrame([weakThis = createWeakPtr()] { 1068 if (weakThis && weakThis->m_mediaSource) 1069 weakThis->m_mediaSource->player()->setHasAvailableVideoFrame(true); 1070 }); 1071 if (m_client) 1072 m_client->sourceBufferPrivateReenqueSamples(AtomicString::number(m_enabledVideoTrackID)); 989 1073 } 990 1074
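After this change, `enqueueSample()` fans a video sample out to whichever renderers exist (decompression session and/or display layer), and `isReadyForMoreSamples()` gates on the decompression session only when one is present. A minimal sketch of that routing, with hypothetical names in place of the CoreMedia/AVFoundation objects:

```cpp
#include <cassert>
#include <vector>

// Hypothetical model of the video-sample routing in this patch: a sample is
// fed to the decompression session when one exists and to the display layer
// when one exists; readiness is delegated to the decompression session only
// if a session is attached, otherwise the track is considered ready.
struct RoutingModel {
    bool hasDecompressionSession = false;
    bool hasDisplayLayer = false;
    bool sessionReady = true;          // isReadyForMoreMediaData() stand-in
    std::vector<int> sessionSamples;   // samples given to the session
    std::vector<int> layerSamples;     // samples given to the layer

    void enqueueVideoSample(int sample) {
        if (hasDecompressionSession)
            sessionSamples.push_back(sample);
        if (hasDisplayLayer)
            layerSamples.push_back(sample);
    }

    bool isReadyForMoreVideoSamples() const {
        return !hasDecompressionSession || sessionReady;
    }
};
```

This mirrors the "paint without display" mode: with no layer in the DOM, samples still reach the decompression session, which is what makes `nativeImageForCurrentTime()` and the canvas/WebGL paths possible.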