Changeset 217185 in webkit

Timestamp: May 20, 2017, 9:55:01 AM
Location:  trunk
Files:     6 added, 19 edited
trunk/LayoutTests/ChangeLog
r217181 → r217185

+2017-05-20  Jer Noble  <jer.noble@apple.com>
+
+        [MSE][Mac] Support painting MSE video-element to canvas
+        https://bugs.webkit.org/show_bug.cgi?id=125157
+        <rdar://problem/23062016>
+
+        Reviewed by Eric Carlson.
+
+        * media/media-source/content/test-fragmented.mp4: Add an 'edts' atom to move the presentation time for the
+        first sample to 0:00.
+        * media/media-source/content/test-fragmented-manifest.json:
+        * media/media-source/media-source-paint-to-canvas-expected.txt: Added.
+        * media/media-source/media-source-paint-to-canvas.html: Added.
+
 2017-05-19  Chris Dumez  <cdumez@apple.com>
…
         * media/media-source/content/test-fragmented.mp4:
-
-2017-04-11  Jer Noble  <jer.noble@apple.com>
-
-        [MSE][Mac] Support painting MSE video-element to canvas
-        https://bugs.webkit.org/show_bug.cgi?id=125157
-        <rdar://problem/23062016>
-
-        Reviewed by Eric Carlson.
-
-        * media/media-source/content/test-fragmented.mp4: Add a 'edts' atom to move the presentation time for the
-        first sample to 0:00.
-        * media/media-source/media-source-paint-to-canvas-expected.txt: Added.
-        * media/media-source/media-source-paint-to-canvas.html: Added.
-
 2017-05-19  Zan Dobersek  <zdobersek@igalia.com>
trunk/LayoutTests/media/media-source/content/test-fragmented-manifest.json
r207523 → r217185

     "url": "content/test-fragmented.mp4",
     "type": "video/mp4; codecs=\"mp4a.40.2,avc1.4d281e\"",
-    "init": { "offset": 0, "size": 1231 },
-    "duration": 10.327753,
+    "init": { "offset": 0, "size": 1259 },
+    "duration": 10,
     "media": [
-        { "offset": 1231, "size": 67526, "timecode": 0.000000, "duration": 1.041668 },
-        { "offset": 68757, "size": 72683, "timecode": 1.016916, "duration": 1.024752 },
-        { "offset": 141440, "size": 78499, "timecode": 2.015374, "duration": 1.026294 },
-        { "offset": 219939, "size": 77358, "timecode": 3.013832, "duration": 1.027835 },
-        { "offset": 297297, "size": 80748, "timecode": 4.012290, "duration": 1.029377 },
-        { "offset": 378045, "size": 78038, "timecode": 5.010748, "duration": 1.030919 },
-        { "offset": 456083, "size": 82223, "timecode": 6.009206, "duration": 1.032461 },
-        { "offset": 538306, "size": 78331, "timecode": 7.007664, "duration": 1.034003 },
-        { "offset": 616637, "size": 80736, "timecode": 8.006122, "duration": 1.035545 },
-        { "offset": 697373, "size": 77752, "timecode": 9.004580, "duration": 1.044899 }
+        { "offset": 1259, "size": 67526, "timestamp": 0, "duration": 1 },
+        { "offset": 68785, "size": 72683, "timestamp": 1, "duration": 1 },
+        { "offset": 141468, "size": 78499, "timestamp": 2, "duration": 1 },
+        { "offset": 219967, "size": 77358, "timestamp": 3, "duration": 1 },
+        { "offset": 297325, "size": 80748, "timestamp": 4, "duration": 1 },
+        { "offset": 378073, "size": 78038, "timestamp": 5, "duration": 1 },
+        { "offset": 456111, "size": 82223, "timestamp": 6, "duration": 1 },
+        { "offset": 538334, "size": 78331, "timestamp": 7, "duration": 1 },
+        { "offset": 616665, "size": 80736, "timestamp": 8, "duration": 1 },
+        { "offset": 697401, "size": 77752, "timestamp": 9, "duration": 1 }
     ]
 }
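The updated manifest values are internally consistent: each media fragment's byte range begins exactly where the previous one ends (1259 + 67526 = 68785, and so on), with the init segment occupying the first 1259 bytes. A quick sketch of that consistency check; the `Segment` struct and helper below are illustrative, not part of WebKit's test harness:

```cpp
#include <cstddef>
#include <vector>

// A (offset, size) byte range, as described by the manifest entries above.
struct Segment {
    size_t offset;
    size_t size;
};

// Returns true when every segment starts exactly where its predecessor ends,
// i.e. the file is a gapless concatenation of init + media fragments.
static bool rangesAreContiguous(const std::vector<Segment>& segments)
{
    for (size_t i = 1; i < segments.size(); ++i) {
        if (segments[i - 1].offset + segments[i - 1].size != segments[i].offset)
            return false;
    }
    return true;
}
```

Running this over the manifest's numbers (init plus the ten fragments) confirms the byte map is gapless, which is what lets the test append each fragment with a simple sequential read.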
trunk/Source/WebCore/ChangeLog
r217183 → r217185

+2017-05-20  Jer Noble  <jer.noble@apple.com>
+
+        [MSE][Mac] Support painting MSE video-element to canvas
+        https://bugs.webkit.org/show_bug.cgi?id=125157
+        <rdar://problem/23062016>
+
+        Reviewed by Eric Carlson.
+
+        Test: media/media-source/media-source-paint-to-canvas.html
+
+        In order to have access to decoded video data for painting, decode the encoded samples manually
+        instead of adding them to the AVSampleBufferDisplayLayer. To facilitate doing so, add a new
+        utility class WebCoreDecompressionSession, which can decode samples and store them.
+
+        For the purposes of this patch, to avoid double-decoding of video data and to avoid severely complicating
+        our sample delivery pipeline, we will only support painting of decoded video samples when the video is
+        not displayed in the DOM.
+
+        * Modules/mediasource/MediaSource.cpp:
+        (WebCore::MediaSource::seekToTime): Always send waitForSeekCompleted() to give the private a chance to delay seek completion.
+        * Modules/mediasource/SourceBuffer.cpp:
+        (WebCore::SourceBuffer::sourceBufferPrivateReenqueSamples): Added.
+        * Modules/mediasource/SourceBuffer.h:
+        * WebCore.xcodeproj/project.pbxproj:
+        * platform/cf/CoreMediaSoftLink.cpp: Added new soft-link macros.
+        * platform/cf/CoreMediaSoftLink.h: Ditto.
+        * platform/cocoa/CoreVideoSoftLink.cpp: Ditto.
+        * platform/cocoa/CoreVideoSoftLink.h: Ditto.
+        * platform/graphics/SourceBufferPrivateClient.h:
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h:
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::sampleBufferDisplayLayer): Simple accessor.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::decompressionSession): Ditto.
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm:
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::MediaPlayerPrivateMediaSourceAVFObjC):
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::load): Update whether we should be displaying in a layer or a decompression session.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setVisible): Ditto.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::waitForSeekCompleted): m_seeking is now an enum.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::seeking): Ditto.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted): Ditto. If waiting for a video frame, delay completing the seek.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::nativeImageForCurrentTime): Call updateLastImage() and return the result.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::updateLastImage): Fetch the image for the current time.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::paint): Pass to paintCurrentFrameInContext.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::paintCurrentFrameInContext): Get a native image, and render it.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::acceleratedRenderingStateChanged): Create or destroy a layer or decompression session as appropriate.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureLayer): Creates a layer.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::destroyLayer): Destroys a layer.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureDecompressionSession): Creates a decompression session.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::destroyDecompressionSession): Destroys a decompression session.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setHasAvailableVideoFrame): If seek completion was delayed, complete it now. Ditto for the ready state change.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::setReadyState): If waiting for a video frame, delay the ready state change.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::addDisplayLayer): Deleted.
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::removeDisplayLayer): Deleted.
+        * platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.h:
+        * platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.mm:
+        (WebCore::MediaSourcePrivateAVFObjC::hasVideo): Promote to a class function.
+        (WebCore::MediaSourcePrivateAVFObjC::hasSelectedVideo): Return whether any of the active source buffers have video and are selected.
+        (WebCore::MediaSourcePrivateAVFObjC::hasSelectedVideoChanged): Call setSourceBufferWithSelectedVideo().
+        (WebCore::MediaSourcePrivateAVFObjC::setVideoLayer): Set (or clear) the layer on the selected buffer.
+        (WebCore::MediaSourcePrivateAVFObjC::setDecompressionSession): Ditto for the decompression session.
+        (WebCore::MediaSourcePrivateAVFObjC::setSourceBufferWithSelectedVideo): Remove the layer and decompression session from the unselected
+        buffer and add the decompression session or layer to the newly selected buffer.
+        (WebCore::MediaSourcePrivateAVFObjCHasVideo): Deleted.
+        * platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h:
+        * platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm:
+        (WebCore::SourceBufferPrivateAVFObjC::destroyRenderers): Clear the videoLayer and decompressionSession.
+        (WebCore::SourceBufferPrivateAVFObjC::hasSelectedVideo): Return whether the buffer has a selected video track.
+        (WebCore::SourceBufferPrivateAVFObjC::trackDidChangeEnabled): The media player now manages the video layer and decompression session lifetimes.
+        (WebCore::SourceBufferPrivateAVFObjC::flush): Flush the decompression session, if it exists.
+        (WebCore::SourceBufferPrivateAVFObjC::enqueueSample): Enqueue to the decompression session, if it exists.
+        (WebCore::SourceBufferPrivateAVFObjC::isReadyForMoreSamples): Ask the decompression session, if it exists.
+        (WebCore::SourceBufferPrivateAVFObjC::didBecomeReadyForMoreSamples): Tell the decompression session to stop requesting data, if it exists.
+        (WebCore::SourceBufferPrivateAVFObjC::notifyClientWhenReadyForMoreSamples): Request media data from the decompression session, if it exists.
+        (WebCore::SourceBufferPrivateAVFObjC::setVideoLayer): Added.
+        (WebCore::SourceBufferPrivateAVFObjC::setDecompressionSession): Added.
+        * platform/graphics/cocoa/WebCoreDecompressionSession.h: Added.
+        (WebCore::WebCoreDecompressionSession::create):
+        (WebCore::WebCoreDecompressionSession::isInvalidated):
+        (WebCore::WebCoreDecompressionSession::createWeakPtr):
+        * platform/graphics/cocoa/WebCoreDecompressionSession.mm: Added.
+        (WebCore::WebCoreDecompressionSession::WebCoreDecompressionSession): Register for media data requests.
+        (WebCore::WebCoreDecompressionSession::invalidate): Unregister for same.
+        (WebCore::WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaDataCallback): Pass to maybeBecomeReadyForMoreMediaData.
+        (WebCore::WebCoreDecompressionSession::maybeBecomeReadyForMoreMediaData): Check in-flight decodes, and decoded frame counts.
+        (WebCore::WebCoreDecompressionSession::enqueueSample): Pass the sample to be decoded on a background queue.
+        (WebCore::WebCoreDecompressionSession::decodeSample): Decode the sample.
+        (WebCore::WebCoreDecompressionSession::decompressionOutputCallback): Call handleDecompressionOutput.
+        (WebCore::WebCoreDecompressionSession::handleDecompressionOutput): Pass the decoded sample to be enqueued on the main thread.
+        (WebCore::WebCoreDecompressionSession::getFirstVideoFrame):
+        (WebCore::WebCoreDecompressionSession::enqueueDecodedSample): Enqueue the frame (if it's a displayed frame).
+        (WebCore::WebCoreDecompressionSession::isReadyForMoreMediaData): Return whether we've hit our high-water sample count.
+        (WebCore::WebCoreDecompressionSession::requestMediaDataWhenReady):
+        (WebCore::WebCoreDecompressionSession::stopRequestingMediaData): Unset the same.
+        (WebCore::WebCoreDecompressionSession::notifyWhenHasAvailableVideoFrame): Set a callback to notify when a decoded frame has been enqueued.
+        (WebCore::WebCoreDecompressionSession::imageForTime): Successively dequeue images until reaching one at or beyond the requested time.
+        (WebCore::WebCoreDecompressionSession::flush): Synchronously empty the producer and consumer queues.
+        (WebCore::WebCoreDecompressionSession::getDecodeTime): Utility method.
+        (WebCore::WebCoreDecompressionSession::getPresentationTime): Ditto.
+        (WebCore::WebCoreDecompressionSession::getDuration): Ditto.
+        (WebCore::WebCoreDecompressionSession::compareBuffers): Ditto.
+        * platform/cocoa/VideoToolboxSoftLink.cpp: Added.
+        * platform/cocoa/VideoToolboxSoftLink.h: Added.
+
 2017-05-19  Joseph Pecoraro  <pecoraro@apple.com>
trunk/Source/WebCore/Modules/mediasource/MediaSource.cpp
r217133 → r217185

         // Continue
 
+    m_private->waitForSeekCompleted();
     completeSeek();
 }
trunk/Source/WebCore/Modules/mediasource/SourceBuffer.cpp
r217133 → r217185

 }
 
+void SourceBuffer::sourceBufferPrivateReenqueSamples(const AtomicString& trackID)
+{
+    if (isRemoved())
+        return;
+
+    LOG(MediaSource, "SourceBuffer::sourceBufferPrivateReenqueSamples(%p)", this);
+    auto it = m_trackBufferMap.find(trackID);
+    if (it == m_trackBufferMap.end())
+        return;
+
+    auto& trackBuffer = it->value;
+    trackBuffer.needsReenqueueing = true;
+    reenqueueMediaForTime(trackBuffer, trackID, m_source->currentTime());
+}
+
 void SourceBuffer::sourceBufferPrivateDidBecomeReadyForMoreSamples(const AtomicString& trackID)
 {
+    if (isRemoved())
+        return;
+
     LOG(MediaSource, "SourceBuffer::sourceBufferPrivateDidBecomeReadyForMoreSamples(%p)", this);
     auto it = m_trackBufferMap.find(trackID);
…
     // 4.1 Let track ranges equal the track buffer ranges for the current track buffer.
     PlatformTimeRanges trackRanges = trackBuffer.buffered;
+    if (!trackRanges.length())
+        continue;
+
     // 4.2 If readyState is "ended", then set the end time on the last range in track ranges to highest end time.
     if (m_source->isEnded())
trunk/Source/WebCore/Modules/mediasource/SourceBuffer.h
r217133 → r217185

     bool sourceBufferPrivateHasAudio() const final;
     bool sourceBufferPrivateHasVideo() const final;
+    void sourceBufferPrivateReenqueSamples(const AtomicString& trackID) final;
     void sourceBufferPrivateDidBecomeReadyForMoreSamples(const AtomicString& trackID) final;
     MediaTime sourceBufferPrivateFastSeekTimeForMediaTime(const MediaTime&, const MediaTime& negativeThreshold, const MediaTime& positiveThreshold) final;
trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj
r217151 → r217185

 CD5896E11CD2B15100B3BCC8 /* WebPlaybackControlsManager.mm in Sources */ = {isa = PBXBuildFile; fileRef = CD5896DF1CD2B15100B3BCC8 /* WebPlaybackControlsManager.mm */; };
 CD5896E21CD2B15100B3BCC8 /* WebPlaybackControlsManager.h in Headers */ = {isa = PBXBuildFile; fileRef = CD5896E01CD2B15100B3BCC8 /* WebPlaybackControlsManager.h */; settings = {ATTRIBUTES = (Private, ); }; };
+CD5D27771E8318E000D80A3D /* WebCoreDecompressionSession.mm in Sources */ = {isa = PBXBuildFile; fileRef = CD5D27751E8318E000D80A3D /* WebCoreDecompressionSession.mm */; };
+CD5D27781E8318E000D80A3D /* WebCoreDecompressionSession.h in Headers */ = {isa = PBXBuildFile; fileRef = CD5D27761E8318E000D80A3D /* WebCoreDecompressionSession.h */; };
 CD5E5B5F1A15CE54000C609E /* PageConfiguration.h in Headers */ = {isa = PBXBuildFile; fileRef = CD5E5B5E1A15CE54000C609E /* PageConfiguration.h */; settings = {ATTRIBUTES = (Private, ); }; };
 CD5E5B611A15F156000C609E /* PageConfiguration.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CD5E5B601A15F156000C609E /* PageConfiguration.cpp */; };
…
 CDC8B5AB18047FF10016E685 /* SourceBufferPrivateAVFObjC.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC8B5A918047FF10016E685 /* SourceBufferPrivateAVFObjC.h */; };
 CDC8B5AD1804AE5D0016E685 /* SourceBufferPrivateClient.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC8B5AC1804AE5D0016E685 /* SourceBufferPrivateClient.h */; };
+CDC939A71E9BDFB100BB768D /* VideoToolboxSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CDC939A51E9BDFB100BB768D /* VideoToolboxSoftLink.cpp */; };
+CDC939A81E9BDFB100BB768D /* VideoToolboxSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC939A61E9BDFB100BB768D /* VideoToolboxSoftLink.h */; };
 CDC979F41C498C0900DB50D4 /* WebCoreNSErrorExtras.mm in Sources */ = {isa = PBXBuildFile; fileRef = CDC979F21C498C0900DB50D4 /* WebCoreNSErrorExtras.mm */; };
 CDC979F51C498C0900DB50D4 /* WebCoreNSErrorExtras.h in Headers */ = {isa = PBXBuildFile; fileRef = CDC979F31C498C0900DB50D4 /* WebCoreNSErrorExtras.h */; };
…
 CD5896DF1CD2B15100B3BCC8 /* WebPlaybackControlsManager.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebPlaybackControlsManager.mm; sourceTree = "<group>"; };
 CD5896E01CD2B15100B3BCC8 /* WebPlaybackControlsManager.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebPlaybackControlsManager.h; sourceTree = "<group>"; };
+CD5D27751E8318E000D80A3D /* WebCoreDecompressionSession.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebCoreDecompressionSession.mm; sourceTree = "<group>"; };
+CD5D27761E8318E000D80A3D /* WebCoreDecompressionSession.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebCoreDecompressionSession.h; sourceTree = "<group>"; };
 CD5E5B5E1A15CE54000C609E /* PageConfiguration.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = PageConfiguration.h; sourceTree = "<group>"; };
 CD5E5B601A15F156000C609E /* PageConfiguration.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = PageConfiguration.cpp; sourceTree = "<group>"; };
…
 CDC8B5A918047FF10016E685 /* SourceBufferPrivateAVFObjC.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = SourceBufferPrivateAVFObjC.h; sourceTree = "<group>"; };
 CDC8B5AC1804AE5D0016E685 /* SourceBufferPrivateClient.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = SourceBufferPrivateClient.h; sourceTree = "<group>"; };
+CDC939A51E9BDFB100BB768D /* VideoToolboxSoftLink.cpp */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = VideoToolboxSoftLink.cpp; sourceTree = "<group>"; };
+CDC939A61E9BDFB100BB768D /* VideoToolboxSoftLink.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = VideoToolboxSoftLink.h; sourceTree = "<group>"; };
 CDC979F21C498C0900DB50D4 /* WebCoreNSErrorExtras.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebCoreNSErrorExtras.mm; sourceTree = "<group>"; };
 CDC979F31C498C0900DB50D4 /* WebCoreNSErrorExtras.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebCoreNSErrorExtras.h; sourceTree = "<group>"; };
…
 52D5A1A51C57488900DE34A3 /* WebVideoFullscreenModelVideoElement.h */,
 52D5A1A61C57488900DE34A3 /* WebVideoFullscreenModelVideoElement.mm */,
+CDC939A51E9BDFB100BB768D /* VideoToolboxSoftLink.cpp */,
+CDC939A61E9BDFB100BB768D /* VideoToolboxSoftLink.h */,
 );
 path = cocoa;
…
 2D3EF4461917915C00034184 /* WebCoreCALayerExtras.h */,
 2D3EF4471917915C00034184 /* WebCoreCALayerExtras.mm */,
+CD5D27751E8318E000D80A3D /* WebCoreDecompressionSession.mm */,
+CD5D27761E8318E000D80A3D /* WebCoreDecompressionSession.h */,
 316BDB8A1E6E153000DE0D5A /* WebGPULayer.h */,
 316BDB891E6E153000DE0D5A /* WebGPULayer.mm */,
…
 E440AA961C68420800A265CC /* ElementAndTextDescendantIterator.h in Headers */,
 E46A2B1E17CA76B1000DBCD8 /* ElementChildIterator.h in Headers */,
+CD5D27781E8318E000D80A3D /* WebCoreDecompressionSession.h in Headers */,
 B5B7A17117C10AC000E4AA0A /* ElementData.h in Headers */,
 93D437A11D57B3F400AB85EA /* ElementDescendantIterator.h in Headers */,
…
 413E00791DB0E4F2002341D2 /* MemoryRelease.h in Headers */,
 93309DFA099E64920056E581 /* MergeIdenticalElementsCommand.h in Headers */,
+CDC939A81E9BDFB100BB768D /* VideoToolboxSoftLink.h in Headers */,
 E1ADECCE0E76AD8B004A1A5E /* MessageChannel.h in Headers */,
 75793E840D0CE0B3007FC0AC /* MessageEvent.h in Headers */,
…
 FD31603012B0267600C1A359 /* DelayProcessor.cpp in Sources */,
 93309DDE099E64920056E581 /* DeleteFromTextNodeCommand.cpp in Sources */,
+CDC939A71E9BDFB100BB768D /* VideoToolboxSoftLink.cpp in Sources */,
 93309DE0099E64920056E581 /* DeleteSelectionCommand.cpp in Sources */,
 9479493C1E045CF300018D85 /* DeprecatedCSSOMPrimitiveValue.cpp in Sources */,
…
 B2227AB70D00BF220071B782 /* SVGStyleElement.cpp in Sources */,
 B2227ABA0D00BF220071B782 /* SVGSVGElement.cpp in Sources */,
+CD5D27771E8318E000D80A3D /* WebCoreDecompressionSession.mm in Sources */,
 B2227ABD0D00BF220071B782 /* SVGSwitchElement.cpp in Sources */,
 B2227AC00D00BF220071B782 /* SVGSymbolElement.cpp in Sources */,
trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp
r217133 → r217185

 #include "CoreMediaSPI.h"
 #include "SoftLinking.h"
+
+#if PLATFORM(COCOA)
+#include <CoreVideo/CoreVideo.h>
+#endif
 
 SOFT_LINK_FRAMEWORK_FOR_SOURCE(WebCore, CoreMedia)
…
 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTimeInvalid, CMTime)
 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTimeZero, CMTime)
+SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMTimePositiveInfinity, CMTime)
 
 #if PLATFORM(COCOA)
…
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTime, OSStatus, (CMTimebaseRef timebase, CMTime time), (timebase, time))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseAddTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseRemoveTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceNextFireTime, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource, CMTime fireTime, uint32_t flags), (timebase, timerSource, fireTime, flags))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceToFireImmediately, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionCreateForImageBuffer, OSStatus, (CFAllocatorRef allocator, CVImageBufferRef imageBuffer, CMVideoFormatDescriptionRef* outDesc), (allocator, imageBuffer, outDesc))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionGetDimensions, CMVideoDimensions, (CMVideoFormatDescriptionRef videoDesc), (videoDesc))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMVideoFormatDescriptionGetPresentationDimensions, CGSize, (CMVideoFormatDescriptionRef videoDesc, Boolean usePixelAspectRatio, Boolean useCleanAperture), (videoDesc, usePixelAspectRatio, useCleanAperture))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueReset, OSStatus, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueCreate, OSStatus, (CFAllocatorRef allocator, CMItemCount capacity, const CMBufferCallbacks* callbacks, CMBufferQueueRef* queueOut), (allocator, capacity, callbacks, queueOut))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetHead, CMBufferRef, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueDequeueAndRetain, CMBufferRef, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueEnqueue, OSStatus, (CMBufferQueueRef queue, CMBufferRef buffer), (queue, buffer))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueIsEmpty, Boolean, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetBufferCount, CMItemCount, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueGetEndPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut))
 
 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, CoreMedia, kCMSampleAttachmentKey_DoNotDisplay, CFStringRef)
trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h
r217133 → r217185

 SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreMedia, kCMTimeZero, CMTime)
 #define kCMTimeZero get_CoreMedia_kCMTimeZero()
+SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreMedia, kCMTimePositiveInfinity, CMTime)
+#define kCMTimePositiveInfinity get_CoreMedia_kCMTimePositiveInfinity()
 
 #if PLATFORM(COCOA)
…
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseGetEffectiveRate, Float64, (CMTimebaseRef timebase), (timebase))
 #define CMTimebaseGetEffectiveRate softLink_CoreMedia_CMTimebaseGetEffectiveRate
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseAddTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+#define CMTimebaseAddTimerDispatchSource softLink_CoreMedia_CMTimebaseAddTimerDispatchSource
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseRemoveTimerDispatchSource, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+#define CMTimebaseRemoveTimerDispatchSource softLink_CoreMedia_CMTimebaseRemoveTimerDispatchSource
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceNextFireTime, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource, CMTime fireTime, uint32_t flags), (timebase, timerSource, fireTime, flags))
+#define CMTimebaseSetTimerDispatchSourceNextFireTime softLink_CoreMedia_CMTimebaseSetTimerDispatchSourceNextFireTime
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimebaseSetTimerDispatchSourceToFireImmediately, OSStatus, (CMTimebaseRef timebase, dispatch_source_t timerSource), (timebase, timerSource))
+#define CMTimebaseSetTimerDispatchSourceToFireImmediately softLink_CoreMedia_CMTimebaseSetTimerDispatchSourceToFireImmediately
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeCopyAsDictionary, CFDictionaryRef, (CMTime time, CFAllocatorRef allocator), (time, allocator))
 #define CMTimeCopyAsDictionary softLink_CoreMedia_CMTimeCopyAsDictionary
…
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMVideoFormatDescriptionGetPresentationDimensions, CGSize, (CMVideoFormatDescriptionRef videoDesc, Boolean usePixelAspectRatio, Boolean useCleanAperture), (videoDesc, usePixelAspectRatio, useCleanAperture))
 #define CMVideoFormatDescriptionGetPresentationDimensions softLink_CoreMedia_CMVideoFormatDescriptionGetPresentationDimensions
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueCreate, OSStatus, (CFAllocatorRef allocator, CMItemCount capacity, const CMBufferCallbacks* callbacks, CMBufferQueueRef* queueOut), (allocator, capacity, callbacks, queueOut))
+#define CMBufferQueueCreate softLink_CoreMedia_CMBufferQueueCreate
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueReset, OSStatus, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueReset softLink_CoreMedia_CMBufferQueueReset
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetHead, CMBufferRef, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetHead softLink_CoreMedia_CMBufferQueueGetHead
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueDequeueAndRetain, CMBufferRef, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueDequeueAndRetain softLink_CoreMedia_CMBufferQueueDequeueAndRetain
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueEnqueue, OSStatus, (CMBufferQueueRef queue, CMBufferRef buffer), (queue, buffer))
+#define CMBufferQueueEnqueue softLink_CoreMedia_CMBufferQueueEnqueue
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueIsEmpty, Boolean, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueIsEmpty softLink_CoreMedia_CMBufferQueueIsEmpty
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetBufferCount, CMItemCount, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetBufferCount softLink_CoreMedia_CMBufferQueueGetBufferCount
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetFirstPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetFirstPresentationTimeStamp
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueGetEndPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueGetEndPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetEndPresentationTimeStamp
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut))
+#define CMBufferQueueInstallTriggerWithIntegerThreshold softLink_CoreMedia_CMBufferQueueInstallTriggerWithIntegerThreshold
 
 SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, CoreMedia, kCMSampleAttachmentKey_DoNotDisplay, CFStringRef)
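The SOFT_LINK_FUNCTION macros above generate wrappers that load the framework and resolve each symbol lazily on first call, so WebCore does not have to hard-link against CoreMedia. A rough sketch of the underlying pattern, illustrated with libm's `cos` rather than a CoreMedia symbol; the helper names here are ours, the library path is an assumption about the host system, and the real WebKit macros generate considerably more (per-symbol caching, assertions, constant getters):

```cpp
#include <cmath>
#include <dlfcn.h> // POSIX dynamic loading: dlopen / dlsym

// Load the library once and cache the handle, like SOFT_LINK_FRAMEWORK_FOR_SOURCE.
static void* softLinkLibrary(const char* name)
{
    static void* handle = dlopen(name, RTLD_NOW);
    return handle;
}

typedef double (*CosPtr)(double);

// Like the softLink_<Framework>_<Function> wrappers: resolve the symbol on
// first call, then reuse the cached function pointer on every later call.
static double softLink_cos(double x)
{
    static CosPtr fn = reinterpret_cast<CosPtr>(
        softLinkLibrary("libm.so.6") ? dlsym(softLinkLibrary("libm.so.6"), "cos") : nullptr);
    return fn ? fn(x) : std::cos(x); // fall back if the symbol is unavailable
}
```

The `#define CMBufferQueueEnqueue softLink_CoreMedia_CMBufferQueueEnqueue` lines then let call sites use the ordinary API name while actually invoking the lazy wrapper.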
trunk/Source/WebCore/platform/cocoa/CoreVideoSoftLink.cpp
r217133 → r217185

 SOFT_LINK_FRAMEWORK_FOR_SOURCE(WebCore, CoreVideo)
 
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferGetTypeID, CFTypeID, (), ())
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferGetWidth, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreVideo, CVPixelBufferGetHeight, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
trunk/Source/WebCore/platform/cocoa/CoreVideoSoftLink.h
r217133 → r217185

 SOFT_LINK_FRAMEWORK_FOR_HEADER(WebCore, CoreVideo)
 
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreVideo, CVPixelBufferGetTypeID, CFTypeID, (), ())
+#define CVPixelBufferGetTypeID softLink_CoreVideo_CVPixelBufferGetTypeID
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreVideo, CVPixelBufferGetWidth, size_t, (CVPixelBufferRef pixelBuffer), (pixelBuffer))
 #define CVPixelBufferGetWidth softLink_CoreVideo_CVPixelBufferGetWidth
trunk/Source/WebCore/platform/graphics/SourceBufferPrivateClient.h
r217133 → r217185

     virtual bool sourceBufferPrivateHasVideo() const = 0;
 
+    virtual void sourceBufferPrivateReenqueSamples(const AtomicString& trackID) = 0;
     virtual void sourceBufferPrivateDidBecomeReadyForMoreSamples(const AtomicString& trackID) = 0;
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.h
r217175 r217185 42 42 43 43 typedef struct OpaqueCMTimebase* CMTimebaseRef; 44 typedef struct __CVBuffer *CVPixelBufferRef; 45 typedef struct __CVBuffer *CVOpenGLTextureRef; 44 46 45 47 namespace WebCore { 46 48 47 49 class CDMSessionMediaSourceAVFObjC; 50 class MediaSourcePrivateAVFObjC; 51 class PixelBufferConformerCV; 48 52 class PlatformClockCM; 49 class MediaSourcePrivateAVFObjC; 53 class TextureCacheCV; 54 class VideoTextureCopierCV; 55 class WebCoreDecompressionSession; 50 56 51 57 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE) … … 64 70 static void getSupportedTypes(HashSet<String, ASCIICaseInsensitiveHash>& types); 65 71 static MediaPlayer::SupportsType supportsType(const MediaEngineSupportParameters&); 66 67 void addDisplayLayer(AVSampleBufferDisplayLayer*);68 void removeDisplayLayer(AVSampleBufferDisplayLayer*);69 72 70 73 void addAudioRenderer(AVSampleBufferAudioRenderer*); … … 92 95 void characteristicsChanged(); 93 96 97 MediaTime currentMediaTime() const override; 98 AVSampleBufferDisplayLayer* sampleBufferDisplayLayer() const { return m_sampleBufferDisplayLayer.get(); } 99 WebCoreDecompressionSession* decompressionSession() const { return m_decompressionSession.get(); } 100 94 101 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE) 95 102 void setVideoFullscreenLayer(PlatformLayer*, std::function<void()> completionHandler) override; … … 151 158 152 159 MediaTime durationMediaTime() const override; 153 MediaTime currentMediaTime() const override;154 160 MediaTime startTime() const override; 155 161 MediaTime initialTime() const override; … … 170 176 void setSize(const IntSize&) override; 171 177 178 NativeImagePtr nativeImageForCurrentTime() override; 179 bool updateLastPixelBuffer(); 180 bool updateLastImage(); 172 181 void paint(GraphicsContext&, const FloatRect&) override; 173 182 void paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) override; 174 183 bool copyVideoTextureToPlatformTexture(GraphicsContext3D*, Platform3DObject, 
GC3Denum target, GC3Dint level, GC3Denum internalFormat, GC3Denum format, GC3Denum type, bool premultiplyAlpha, bool flipY) override; 184 175 185 bool hasAvailableVideoFrame() const override; 176 186 … … 179 189 void acceleratedRenderingStateChanged() override; 180 190 void notifyActiveSourceBuffersChanged() override; 191 192 // NOTE: Because the only way for MSE to receive data is through an ArrayBuffer provided by 193 // JavaScript running in the page, the video will, by necessity, always be CORS correct and 194 // in the page's origin. 195 bool hasSingleSecurityOrigin() const override { return true; } 196 bool didPassCORSAccessCheck() const override { return true; } 181 197 182 198 MediaPlayer::MovieLoadType movieLoadType() const override; … … 204 220 void ensureLayer(); 205 221 void destroyLayer(); 222 void ensureDecompressionSession(); 223 void destroyDecompressionSession(); 224 206 225 bool shouldBePlaying() const; 207 226 … … 236 255 RetainPtr<id> m_durationObserver; 237 256 RetainPtr<AVStreamSession> m_streamSession; 257 RetainPtr<CVPixelBufferRef> m_lastPixelBuffer; 258 RetainPtr<CGImageRef> m_lastImage; 259 std::unique_ptr<PixelBufferConformerCV> m_rgbConformer; 260 RefPtr<WebCoreDecompressionSession> m_decompressionSession; 238 261 Deque<RetainPtr<id>> m_sizeChangeObservers; 239 262 Timer m_seekTimer; … … 241 264 MediaPlayer::NetworkState m_networkState; 242 265 MediaPlayer::ReadyState m_readyState; 266 bool m_readyStateIsWaitingForAvailableFrame { false }; 243 267 MediaTime m_lastSeekTime; 244 268 FloatSize m_naturalSize; … … 246 270 bool m_playing; 247 271 bool m_seeking; 248 bool m_seekCompleted; 272 enum SeekState { 273 Seeking, 274 WaitingForAvailableFame, 275 SeekCompleted, 276 }; 277 SeekState m_seekCompleted { SeekCompleted }; 249 278 mutable bool m_loadingProgressed; 250 bool m_hasAvailableVideoFrame; 279 bool m_hasBeenAskedToPaintGL { false }; 280 bool m_hasAvailableVideoFrame { false }; 251 281 bool m_allRenderersHaveAvailableSamples { false
}; 252 282 RetainPtr<PlatformLayer> m_textTrackRepresentationLayer; 283 std::unique_ptr<TextureCacheCV> m_textureCache; 284 std::unique_ptr<VideoTextureCopierCV> m_videoTextureCopier; 285 RetainPtr<CVOpenGLTextureRef> m_lastTexture; 253 286 #if ENABLE(WIRELESS_PLAYBACK_TARGET) 254 287 RefPtr<MediaPlaybackTarget> m_playbackTarget; -
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm
r217133 r217185 34 34 #import "CDMSessionMediaSourceAVFObjC.h" 35 35 #import "FileSystem.h" 36 #import "GraphicsContextCG.h" 36 37 #import "Logging.h" 37 38 #import "MediaSourcePrivateAVFObjC.h" 38 39 #import "MediaSourcePrivateClient.h" 39 40 #import "MediaTimeAVFoundation.h" 41 #import "PixelBufferConformerCV.h" 40 42 #import "PlatformClockCM.h" 41 43 #import "TextTrackRepresentation.h" 44 #import "TextureCacheCV.h" 45 #import "VideoTextureCopierCV.h" 46 #import "WebCoreDecompressionSession.h" 42 47 #import "WebCoreSystemInterface.h" 43 48 #import <AVFoundation/AVAsset.h> … … 122 127 , m_playing(0) 123 128 , m_seeking(false) 124 , m_seekCompleted(true)125 129 , m_loadingProgressed(false) 126 130 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE) … … 149 153 if (shouldBePlaying()) 150 154 [m_synchronizer setRate:m_rate]; 151 if (!seeking() )155 if (!seeking() && m_seekCompleted == SeekCompleted) 152 156 m_player->timeChanged(); 153 157 } … … 242 246 243 247 m_mediaSourcePrivate = MediaSourcePrivateAVFObjC::create(this, client); 248 m_mediaSourcePrivate->setVideoLayer(m_sampleBufferDisplayLayer.get()); 249 m_mediaSourcePrivate->setDecompressionSession(m_decompressionSession.get()); 250 251 acceleratedRenderingStateChanged(); 244 252 } 245 253 … … 355 363 void MediaPlayerPrivateMediaSourceAVFObjC::setVisible(bool) 356 364 { 357 // No-op.365 acceleratedRenderingStateChanged(); 358 366 } 359 367 … … 438 446 return; 439 447 LOG(MediaSource, "MediaPlayerPrivateMediaSourceAVFObjC::waitForSeekCompleted(%p)", this); 440 m_seekCompleted = false;448 m_seekCompleted = Seeking; 441 449 } 442 450 443 451 void MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted() 444 452 { 445 if (m_seekCompleted) 446 return; 453 if (m_seekCompleted == SeekCompleted) 454 return; 455 if (hasVideo() && !m_hasAvailableVideoFrame) { 456 m_seekCompleted = WaitingForAvailableFame; 457 return; 458 } 447 459 LOG(MediaSource, "MediaPlayerPrivateMediaSourceAVFObjC::seekCompleted(%p)", this); 448 
m_seekCompleted = true;460 m_seekCompleted = SeekCompleted; 449 461 if (shouldBePlaying()) 450 462 [m_synchronizer setRate:m_rate]; … … 455 467 bool MediaPlayerPrivateMediaSourceAVFObjC::seeking() const 456 468 { 457 return m_seeking || !m_seekCompleted;469 return m_seeking || m_seekCompleted != SeekCompleted; 458 470 } 459 471 … … 515 527 } 516 528 517 void MediaPlayerPrivateMediaSourceAVFObjC::paint(GraphicsContext&, const FloatRect&) 518 { 519 // FIXME(125157): Implement painting. 520 } 521 522 void MediaPlayerPrivateMediaSourceAVFObjC::paintCurrentFrameInContext(GraphicsContext&, const FloatRect&) 523 { 524 // FIXME(125157): Implement painting. 529 NativeImagePtr MediaPlayerPrivateMediaSourceAVFObjC::nativeImageForCurrentTime() 530 { 531 updateLastImage(); 532 return m_lastImage.get(); 533 } 534 535 bool MediaPlayerPrivateMediaSourceAVFObjC::updateLastPixelBuffer() 536 { 537 if (m_sampleBufferDisplayLayer || !m_decompressionSession) 538 return false; 539 540 auto flags = !m_lastPixelBuffer ? 
WebCoreDecompressionSession::AllowLater : WebCoreDecompressionSession::ExactTime; 541 auto newPixelBuffer = m_decompressionSession->imageForTime(currentMediaTime(), flags); 542 if (!newPixelBuffer) 543 return false; 544 545 m_lastPixelBuffer = newPixelBuffer; 546 return true; 547 } 548 549 bool MediaPlayerPrivateMediaSourceAVFObjC::updateLastImage() 550 { 551 if (!updateLastPixelBuffer()) 552 return false; 553 554 ASSERT(m_lastPixelBuffer); 555 556 if (!m_rgbConformer) { 557 NSDictionary *attributes = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA) }; 558 m_rgbConformer = std::make_unique<PixelBufferConformerCV>((CFDictionaryRef)attributes); 559 } 560 561 m_lastImage = m_rgbConformer->createImageFromPixelBuffer(m_lastPixelBuffer.get()); 562 return true; 563 } 564 565 void MediaPlayerPrivateMediaSourceAVFObjC::paint(GraphicsContext& context, const FloatRect& rect) 566 { 567 paintCurrentFrameInContext(context, rect); 568 } 569 570 void MediaPlayerPrivateMediaSourceAVFObjC::paintCurrentFrameInContext(GraphicsContext& context, const FloatRect& outputRect) 571 { 572 if (context.paintingDisabled()) 573 return; 574 575 auto image = nativeImageForCurrentTime(); 576 if (!image) 577 return; 578 579 GraphicsContextStateSaver stateSaver(context); 580 FloatRect imageRect(0, 0, CGImageGetWidth(image.get()), CGImageGetHeight(image.get())); 581 context.drawNativeImage(image, imageRect.size(), outputRect, imageRect); 582 } 583 584 bool MediaPlayerPrivateMediaSourceAVFObjC::copyVideoTextureToPlatformTexture(GraphicsContext3D* context, Platform3DObject outputTexture, GC3Denum outputTarget, GC3Dint level, GC3Denum internalFormat, GC3Denum format, GC3Denum type, bool premultiplyAlpha, bool flipY) 585 { 586 if (flipY || premultiplyAlpha) 587 return false; 588 589 // We have been asked to paint into a WebGL canvas, so take that as a signal to create 590 // a decompression session, even if that means the native video can't also be displayed 591 // in page. 
592 if (!m_hasBeenAskedToPaintGL) { 593 m_hasBeenAskedToPaintGL = true; 594 acceleratedRenderingStateChanged(); 595 } 596 597 ASSERT(context); 598 599 if (updateLastPixelBuffer()) { 600 if (!m_lastPixelBuffer) 601 return false; 602 603 if (!m_textureCache) { 604 m_textureCache = TextureCacheCV::create(*context); 605 if (!m_textureCache) 606 return false; 607 } 608 609 m_lastTexture = m_textureCache->textureFromImage(m_lastPixelBuffer.get(), outputTarget, level, internalFormat, format, type); 610 } 611 612 size_t width = CVPixelBufferGetWidth(m_lastPixelBuffer.get()); 613 size_t height = CVPixelBufferGetHeight(m_lastPixelBuffer.get()); 614 615 if (!m_videoTextureCopier) 616 m_videoTextureCopier = std::make_unique<VideoTextureCopierCV>(*context); 617 618 return m_videoTextureCopier->copyVideoTextureToPlatformTexture(m_lastTexture.get(), width, height, outputTexture, outputTarget, level, internalFormat, format, type, premultiplyAlpha, flipY); 525 619 } 526 620 … … 537 631 void MediaPlayerPrivateMediaSourceAVFObjC::acceleratedRenderingStateChanged() 538 632 { 539 if (m_player->client().mediaPlayerRenderingCanBeAccelerated(m_player)) 633 if (!m_hasBeenAskedToPaintGL && m_player->visible() && m_player->client().mediaPlayerRenderingCanBeAccelerated(m_player)) { 634 destroyDecompressionSession(); 540 635 ensureLayer(); 541 else636 } else { 542 637 destroyLayer(); 638 ensureDecompressionSession(); 639 } 543 640 } 544 641 … … 609 706 610 707 [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()]; 611 612 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE) 708 if (m_mediaSourcePrivate) 709 m_mediaSourcePrivate->setVideoLayer(m_sampleBufferDisplayLayer.get()); 710 #if PLATFORM(IOS) || (PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)) 613 711 m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size()); 614 712 #endif 713 
m_player->client().mediaPlayerRenderingModeChanged(m_player); 615 714 } 616 715 … … 624 723 // No-op. 625 724 }]; 725 726 if (m_mediaSourcePrivate) 727 m_mediaSourcePrivate->setVideoLayer(nullptr); 728 #if PLATFORM(IOS) || (PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)) 729 m_videoFullscreenLayerManager->didDestroyVideoLayer(); 730 #endif 626 731 m_sampleBufferDisplayLayer = nullptr; 732 setHasAvailableVideoFrame(false); 733 m_player->client().mediaPlayerRenderingModeChanged(m_player); 734 } 735 736 void MediaPlayerPrivateMediaSourceAVFObjC::ensureDecompressionSession() 737 { 738 if (m_decompressionSession) 739 return; 740 741 m_decompressionSession = WebCoreDecompressionSession::create(); 742 m_decompressionSession->setTimebase([m_synchronizer timebase]); 743 744 if (m_mediaSourcePrivate) 745 m_mediaSourcePrivate->setDecompressionSession(m_decompressionSession.get()); 746 747 m_player->client().mediaPlayerRenderingModeChanged(m_player); 748 } 749 750 void MediaPlayerPrivateMediaSourceAVFObjC::destroyDecompressionSession() 751 { 752 if (!m_decompressionSession) 753 return; 754 755 if (m_mediaSourcePrivate) 756 m_mediaSourcePrivate->setDecompressionSession(nullptr); 757 758 m_decompressionSession->invalidate(); 759 m_decompressionSession = nullptr; 760 setHasAvailableVideoFrame(false); 627 761 } 628 762 … … 638 772 m_hasAvailableVideoFrame = flag; 639 773 updateAllRenderersHaveAvailableSamples(); 774 775 if (!m_hasAvailableVideoFrame) 776 return; 777 778 m_player->firstVideoFrameAvailable(); 779 if (m_seekCompleted == WaitingForAvailableFame) 780 seekCompleted(); 781 782 if (m_readyStateIsWaitingForAvailableFrame) { 783 m_readyStateIsWaitingForAvailableFrame = false; 784 m_player->readyStateChanged(); 785 } 640 786 } 641 787 … … 658 804 659 805 do { 660 if ( m_sampleBufferDisplayLayer&& !m_hasAvailableVideoFrame) {806 if (hasVideo() && !m_hasAvailableVideoFrame) { 661 807 allRenderersHaveAvailableSamples = false; 662 808 break; … … 815 961 [m_synchronizer 
setRate:0]; 816 962 963 if (m_readyState >= MediaPlayerEnums::HaveCurrentData && hasVideo() && !m_hasAvailableVideoFrame) { 964 m_readyStateIsWaitingForAvailableFrame = true; 965 return; 966 } 967 817 968 m_player->readyStateChanged(); 818 969 } … … 825 976 m_networkState = networkState; 826 977 m_player->networkStateChanged(); 827 }828 829 void MediaPlayerPrivateMediaSourceAVFObjC::addDisplayLayer(AVSampleBufferDisplayLayer* displayLayer)830 {831 ASSERT(displayLayer);832 if (displayLayer == m_sampleBufferDisplayLayer)833 return;834 835 m_sampleBufferDisplayLayer = displayLayer;836 [m_synchronizer addRenderer:m_sampleBufferDisplayLayer.get()];837 m_player->client().mediaPlayerRenderingModeChanged(m_player);838 839 // FIXME: move this somewhere appropriate:840 m_player->firstVideoFrameAvailable();841 842 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)843 m_videoFullscreenLayerManager->setVideoLayer(m_sampleBufferDisplayLayer.get(), snappedIntRect(m_player->client().mediaPlayerContentBoxRect()).size());844 #endif845 }846 847 void MediaPlayerPrivateMediaSourceAVFObjC::removeDisplayLayer(AVSampleBufferDisplayLayer* displayLayer)848 {849 if (displayLayer != m_sampleBufferDisplayLayer)850 return;851 852 CMTime currentTime = CMTimebaseGetTime([m_synchronizer timebase]);853 [m_synchronizer removeRenderer:m_sampleBufferDisplayLayer.get() atTime:currentTime withCompletionHandler:^(BOOL){854 // No-op.855 }];856 857 m_sampleBufferDisplayLayer = nullptr;858 m_player->client().mediaPlayerRenderingModeChanged(m_player);859 860 #if PLATFORM(MAC) && ENABLE(VIDEO_PRESENTATION_MODE)861 m_videoFullscreenLayerManager->didDestroyVideoLayer();862 #endif863 978 } 864 979 … … 895 1010 void MediaPlayerPrivateMediaSourceAVFObjC::characteristicsChanged() 896 1011 { 1012 updateAllRenderersHaveAvailableSamples(); 897 1013 m_player->characteristicChanged(); 898 1014 } -
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.h
r217133 r217185 36 36 37 37 OBJC_CLASS AVAsset; 38 OBJC_CLASS AVSampleBufferDisplayLayer; 38 39 OBJC_CLASS AVStreamDataParser; 39 40 OBJC_CLASS NSError; … … 48 49 class SourceBufferPrivateAVFObjC; 49 50 class TimeRanges; 51 class WebCoreDecompressionSession; 50 52 51 53 class MediaSourcePrivateAVFObjC final : public MediaSourcePrivate { … … 72 74 bool hasAudio() const; 73 75 bool hasVideo() const; 76 bool hasSelectedVideo() const; 74 77 75 78 void willSeek(); … … 77 80 MediaTime fastSeekTimeForMediaTime(const MediaTime&, const MediaTime& negativeThreshold, const MediaTime& positiveThreshold); 78 81 FloatSize naturalSize() const; 82 83 void hasSelectedVideoChanged(SourceBufferPrivateAVFObjC&); 84 void setVideoLayer(AVSampleBufferDisplayLayer*); 85 void setDecompressionSession(WebCoreDecompressionSession*); 79 86 80 87 private: … … 89 96 void removeSourceBuffer(SourceBufferPrivate*); 90 97 98 void setSourceBufferWithSelectedVideo(SourceBufferPrivateAVFObjC*); 99 91 100 friend class SourceBufferPrivateAVFObjC; 92 101 … … 96 105 Vector<SourceBufferPrivateAVFObjC*> m_activeSourceBuffers; 97 106 Deque<SourceBufferPrivateAVFObjC*> m_sourceBuffersNeedingSessions; 107 SourceBufferPrivateAVFObjC* m_sourceBufferWithSelectedVideo { nullptr }; 98 108 bool m_isEnded; 99 109 }; -
trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaSourcePrivateAVFObjC.mm
r217133 r217185 175 175 } 176 176 177 static bool MediaSourcePrivateAVFObjCHasVideo(SourceBufferPrivateAVFObjC* sourceBuffer)178 {179 return sourceBuffer->hasVideo();180 }181 182 177 bool MediaSourcePrivateAVFObjC::hasVideo() const 183 178 { 184 return std::any_of(m_activeSourceBuffers.begin(), m_activeSourceBuffers.end(), MediaSourcePrivateAVFObjCHasVideo); 179 return std::any_of(m_activeSourceBuffers.begin(), m_activeSourceBuffers.end(), [] (SourceBufferPrivateAVFObjC* sourceBuffer) { 180 return sourceBuffer->hasVideo(); 181 }); 182 } 183 184 bool MediaSourcePrivateAVFObjC::hasSelectedVideo() const 185 { 186 return std::any_of(m_activeSourceBuffers.begin(), m_activeSourceBuffers.end(), [] (SourceBufferPrivateAVFObjC* sourceBuffer) { 187 return sourceBuffer->hasSelectedVideo(); 188 }); 185 189 } 186 190 … … 219 223 } 220 224 225 void MediaSourcePrivateAVFObjC::hasSelectedVideoChanged(SourceBufferPrivateAVFObjC& sourceBuffer) 226 { 227 bool hasSelectedVideo = sourceBuffer.hasSelectedVideo(); 228 if (m_sourceBufferWithSelectedVideo == &sourceBuffer && !hasSelectedVideo) 229 setSourceBufferWithSelectedVideo(nullptr); 230 else if (m_sourceBufferWithSelectedVideo != &sourceBuffer && hasSelectedVideo) 231 setSourceBufferWithSelectedVideo(&sourceBuffer); 232 } 233 234 void MediaSourcePrivateAVFObjC::setVideoLayer(AVSampleBufferDisplayLayer* layer) 235 { 236 if (m_sourceBufferWithSelectedVideo) 237 m_sourceBufferWithSelectedVideo->setVideoLayer(layer); 238 } 239 240 void MediaSourcePrivateAVFObjC::setDecompressionSession(WebCoreDecompressionSession* decompressionSession) 241 { 242 if (m_sourceBufferWithSelectedVideo) 243 m_sourceBufferWithSelectedVideo->setDecompressionSession(decompressionSession); 244 } 245 246 void MediaSourcePrivateAVFObjC::setSourceBufferWithSelectedVideo(SourceBufferPrivateAVFObjC* sourceBuffer) 247 { 248 if (m_sourceBufferWithSelectedVideo) { 249 m_sourceBufferWithSelectedVideo->setVideoLayer(nullptr); 250 
m_sourceBufferWithSelectedVideo->setDecompressionSession(nullptr); 251 } 252 253 m_sourceBufferWithSelectedVideo = sourceBuffer; 254 255 if (m_sourceBufferWithSelectedVideo) { 256 m_sourceBufferWithSelectedVideo->setVideoLayer(m_player->sampleBufferDisplayLayer()); 257 m_sourceBufferWithSelectedVideo->setDecompressionSession(m_player->decompressionSession()); 258 } 259 } 260 221 261 } 222 262 -
trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h
r217133 r217185 63 63 class AudioTrackPrivateMediaSourceAVFObjC; 64 64 class VideoTrackPrivateMediaSourceAVFObjC; 65 class WebCoreDecompressionSession; 65 66 66 67 class SourceBufferPrivateAVFObjCErrorClient { … … 89 90 90 91 bool hasVideo() const; 92 bool hasSelectedVideo() const; 91 93 bool hasAudio() const; 92 94 … … 109 111 void layerDidReceiveError(AVSampleBufferDisplayLayer *, NSError *); 110 112 void rendererDidReceiveError(AVSampleBufferAudioRenderer *, NSError *); 113 114 void setVideoLayer(AVSampleBufferDisplayLayer*); 115 void setDecompressionSession(WebCoreDecompressionSession*); 111 116 112 117 private: … … 153 158 OSObjectPtr<dispatch_semaphore_t> m_hasSessionSemaphore; 154 159 OSObjectPtr<dispatch_group_t> m_isAppendingGroup; 160 RefPtr<WebCoreDecompressionSession> m_decompressionSession; 155 161 156 162 MediaSourcePrivateAVFObjC* m_mediaSource; -
trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm
r217183 r217185 30 30 31 31 #import "AVFoundationSPI.h" 32 #import "AudioTrackPrivateMediaSourceAVFObjC.h" 32 33 #import "CDMSessionAVContentKeySession.h" 33 34 #import "CDMSessionMediaSourceAVFObjC.h" 35 #import "InbandTextTrackPrivateAVFObjC.h" 34 36 #import "Logging.h" 35 37 #import "MediaDescription.h" … … 43 45 #import "SourceBufferPrivateClient.h" 44 46 #import "TimeRanges.h" 45 #import "AudioTrackPrivateMediaSourceAVFObjC.h"46 47 #import "VideoTrackPrivateMediaSourceAVFObjC.h" 47 #import " InbandTextTrackPrivateAVFObjC.h"48 #import "WebCoreDecompressionSession.h" 48 49 #import <AVFoundation/AVAssetTrack.h> 49 50 #import <QuartzCore/CALayer.h> 51 #import <map> 50 52 #import <objc/runtime.h> 51 53 #import <runtime/TypedArrayInlines.h> 52 #import <wtf/text/AtomicString.h>53 #import <wtf/text/CString.h>54 54 #import <wtf/BlockObjCExceptions.h> 55 55 #import <wtf/HashCountedSet.h> 56 56 #import <wtf/MainThread.h> 57 57 #import <wtf/WeakPtr.h> 58 #import <map> 58 #import <wtf/text/AtomicString.h> 59 #import <wtf/text/CString.h> 59 60 60 61 #pragma mark - Soft Linking … … 650 651 void SourceBufferPrivateAVFObjC::destroyRenderers() 651 652 { 652 if (m_displayLayer) { 653 if (m_mediaSource) 654 m_mediaSource->player()->removeDisplayLayer(m_displayLayer.get()); 655 [m_displayLayer flush]; 656 [m_displayLayer stopRequestingMediaData]; 657 [m_errorListener stopObservingLayer:m_displayLayer.get()]; 658 m_displayLayer = nullptr; 659 } 653 if (m_displayLayer) 654 setVideoLayer(nullptr); 655 656 if (m_decompressionSession) 657 setDecompressionSession(nullptr); 660 658 661 659 for (auto& renderer : m_audioRenderers.values()) { … … 695 693 } 696 694 695 bool SourceBufferPrivateAVFObjC::hasSelectedVideo() const 696 { 697 return m_enabledVideoTrackID != -1; 698 } 699 697 700 bool SourceBufferPrivateAVFObjC::hasAudio() const 698 701 { … … 706 709 m_enabledVideoTrackID = -1; 707 710 [m_parser setShouldProvideMediaData:NO forTrackID:trackID]; 708 if (m_mediaSource) 709 
m_mediaSource->player()->removeDisplayLayer(m_displayLayer.get()); 711 712 if (m_decompressionSession) 713 m_decompressionSession->stopRequestingMediaData(); 710 714 } else if (track->selected()) { 711 715 m_enabledVideoTrackID = trackID; 712 716 [m_parser setShouldProvideMediaData:YES forTrackID:trackID]; 713 if (!m_displayLayer) { 714 m_displayLayer = adoptNS([allocAVSampleBufferDisplayLayerInstance() init]); 715 #ifndef NDEBUG 716 [m_displayLayer setName:@"SourceBufferPrivateAVFObjC AVSampleBufferDisplayLayer"]; 717 #endif 718 [m_displayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{ 717 718 if (m_decompressionSession) { 719 m_decompressionSession->requestMediaDataWhenReady([this, trackID] { 719 720 didBecomeReadyForMoreSamples(trackID); 720 }]; 721 [m_errorListener beginObservingLayer:m_displayLayer.get()]; 721 }); 722 722 } 723 if (m_mediaSource)724 m_mediaSource->player()->addDisplayLayer(m_displayLayer.get()); 725 }723 } 724 725 m_mediaSource->hasSelectedVideoChanged(*this); 726 726 } 727 727 … … 792 792 [m_displayLayer flushAndRemoveImage]; 793 793 794 if (m_decompressionSession) { 795 m_decompressionSession->flush(); 796 m_decompressionSession->notifyWhenHasAvailableVideoFrame([weakThis = createWeakPtr()] { 797 if (weakThis && weakThis->m_mediaSource) 798 weakThis->m_mediaSource->player()->setHasAvailableVideoFrame(true); 799 }); 800 } 801 794 802 for (auto& renderer : m_audioRenderers.values()) 795 803 [renderer flush]; … … 851 859 LOG(MediaSource, "SourceBufferPrivateAVFObjC::flush(%p) - trackId: %d", this, trackID); 852 860 853 if (trackID == m_enabledVideoTrackID) 861 if (trackID == m_enabledVideoTrackID) { 854 862 flush(m_displayLayer.get()); 855 else if (m_audioRenderers.contains(trackID)) 863 if (m_decompressionSession) { 864 m_decompressionSession->flush(); 865 m_decompressionSession->notifyWhenHasAvailableVideoFrame([weakThis = createWeakPtr()] { 866 if (weakThis && weakThis->m_mediaSource) 867 
weakThis->m_mediaSource->player()->setHasAvailableVideoFrame(true); 868 }); 869 } 870 } else if (m_audioRenderers.contains(trackID)) 856 871 flush(m_audioRenderers.get(trackID).get()); 857 872 } … … 903 918 } 904 919 905 [m_displayLayer enqueueSampleBuffer:platformSample.sample.cmSampleBuffer]; 906 if (m_mediaSource) 907 m_mediaSource->player()->setHasAvailableVideoFrame(!sample->isNonDisplaying()); 920 if (m_decompressionSession) 921 m_decompressionSession->enqueueSample(platformSample.sample.cmSampleBuffer); 922 923 if (m_displayLayer) { 924 [m_displayLayer enqueueSampleBuffer:platformSample.sample.cmSampleBuffer]; 925 if (m_mediaSource) 926 m_mediaSource->player()->setHasAvailableVideoFrame(!sample->isNonDisplaying()); 927 } 908 928 } else { 909 929 auto renderer = m_audioRenderers.get(trackID); … … 918 938 int trackID = trackIDString.toInt(); 919 939 if (trackID == m_enabledVideoTrackID) 920 return [m_displayLayer isReadyForMoreMediaData]; 921 else if (m_audioRenderers.contains(trackID)) 940 return !m_decompressionSession || m_decompressionSession->isReadyForMoreMediaData(); 941 942 if (m_audioRenderers.contains(trackID)) 922 943 return [m_audioRenderers.get(trackID) isReadyForMoreMediaData]; 923 944 … … 960 981 void SourceBufferPrivateAVFObjC::didBecomeReadyForMoreSamples(int trackID) 961 982 { 962 if (trackID == m_enabledVideoTrackID) 983 LOG(Media, "SourceBufferPrivateAVFObjC::didBecomeReadyForMoreSamples(%p) - track(%d)", this, trackID); 984 if (trackID == m_enabledVideoTrackID) { 985 if (m_decompressionSession) 986 m_decompressionSession->stopRequestingMediaData(); 963 987 [m_displayLayer stopRequestingMediaData]; 964 else if (m_audioRenderers.contains(trackID))988 } else if (m_audioRenderers.contains(trackID)) 965 989 [m_audioRenderers.get(trackID) stopRequestingMediaData]; 966 990 else { … … 977 1001 int trackID = trackIDString.toInt(); 978 1002 if (trackID == m_enabledVideoTrackID) { 979 [m_displayLayer 
requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{ 1003 if (m_decompressionSession) { 1004 m_decompressionSession->requestMediaDataWhenReady([this, trackID] { 1005 didBecomeReadyForMoreSamples(trackID); 1006 }); 1007 } 1008 if (m_displayLayer) { 1009 [m_displayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ { 1010 didBecomeReadyForMoreSamples(trackID); 1011 }]; 1012 } 1013 } else if (m_audioRenderers.contains(trackID)) { 1014 [m_audioRenderers.get(trackID) requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ { 980 1015 didBecomeReadyForMoreSamples(trackID); 981 1016 }]; 982 } else if (m_audioRenderers.contains(trackID)) { 983 [m_audioRenderers.get(trackID) requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^{ 984 didBecomeReadyForMoreSamples(trackID); 1017 } 1018 } 1019 1020 void SourceBufferPrivateAVFObjC::setVideoLayer(AVSampleBufferDisplayLayer* layer) 1021 { 1022 if (layer == m_displayLayer) 1023 return; 1024 1025 ASSERT(!layer || !m_decompressionSession || hasSelectedVideo()); 1026 1027 if (m_displayLayer) { 1028 [m_displayLayer flush]; 1029 [m_displayLayer stopRequestingMediaData]; 1030 [m_errorListener stopObservingLayer:m_displayLayer.get()]; 1031 } 1032 1033 m_displayLayer = layer; 1034 1035 if (m_displayLayer) { 1036 [m_displayLayer requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:^ { 1037 didBecomeReadyForMoreSamples(m_enabledVideoTrackID); 985 1038 }]; 986 } 1039 [m_errorListener beginObservingLayer:m_displayLayer.get()]; 1040 if (m_client) 1041 m_client->sourceBufferPrivateReenqueSamples(AtomicString::number(m_enabledVideoTrackID)); 1042 } 1043 } 1044 1045 void SourceBufferPrivateAVFObjC::setDecompressionSession(WebCoreDecompressionSession* decompressionSession) 1046 { 1047 if (m_decompressionSession == decompressionSession) 1048 return; 1049 1050 if (m_decompressionSession) { 1051 m_decompressionSession->stopRequestingMediaData(); 1052 
m_decompressionSession->invalidate(); 1053 } 1054 1055 m_decompressionSession = decompressionSession; 1056 1057 if (!m_decompressionSession) 1058 return; 1059 1060 WeakPtr<SourceBufferPrivateAVFObjC> weakThis = createWeakPtr(); 1061 m_decompressionSession->requestMediaDataWhenReady([weakThis] { 1062 if (weakThis) 1063 weakThis->didBecomeReadyForMoreSamples(weakThis->m_enabledVideoTrackID); 1064 }); 1065 m_decompressionSession->notifyWhenHasAvailableVideoFrame([weakThis = createWeakPtr()] { 1066 if (weakThis && weakThis->m_mediaSource) 1067 weakThis->m_mediaSource->player()->setHasAvailableVideoFrame(true); 1068 }); 1069 if (m_client) 1070 m_client->sourceBufferPrivateReenqueSamples(AtomicString::number(m_enabledVideoTrackID)); 987 1071 } 988 1072