Changeset 254670 in webkit

- Timestamp: Jan 16, 2020 2:43:19 AM
- Location: trunk
- Files: 6 added, 3 edited
trunk/LayoutTests/ChangeLog (r254664 → r254670)

+2020-01-16  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE] Don't enqueue samples that start at a big discontinuity
+        https://bugs.webkit.org/show_bug.cgi?id=201323
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        * media/media-source/media-source-append-presentation-durations.html: Added.
+        * media/media-source/media-source-first-append-not-starting-at-zero.html: Added.
+        * media/media-source/media-source-remove-b-frame.html: Added.
+
 2020-01-15  Lauro Moura  <lmoura@igalia.com>
trunk/Source/WebCore/ChangeLog (r254669 → r254670)

+2020-01-16  Alicia Boya García  <aboya@igalia.com>
+
+        [MSE] Don't enqueue samples that start at a big discontinuity
+        https://bugs.webkit.org/show_bug.cgi?id=201323
+
+        With the old logic SourceBuffer was enqueueing the first frame to be
+        appended in any circumstances. This was a bug because the user could
+        append first [5, 10) and then [0, 5). With the old behavior [5, 10)
+        would be enqueued first despite being clearly ahead of the initial
+        playback time (zero). By the time [0, 5) is appended it can't be
+        enqueued anymore because the decodeQueue is already ahead.
+
+        This patch fixes that logic to work when the first segments are
+        appended unordered. The test
+        media-source-first-append-not-starting-at-zero.html validates it.
+
+        The test media-source-append-presentation-durations.html checks that
+        the new logic does not break in the presence of presentation
+        duration != decode duration.
+
+        As part of the same logic block, lastEnqueuedPresentationTime was
+        used to decide when it is necessary to perform a reenqueue after an
+        .erase() (it is necessary if any enqueued frames are replaced).
+        Using lastEnqueuedPresentationTime was not entirely accurate in the
+        presence of B-frames, as you could erase a frame that has a
+        presentation time higher than the last enqueued one. That logic is
+        replaced with a monotonically increasing
+        highestEnqueuedPresentationTime and is tested by
+        media-source-remove-b-frame.html.
+
+        Reviewed by Xabier Rodriguez-Calvar.
+
+        Tests: media/media-source/media-source-append-presentation-durations.html
+               media/media-source/media-source-first-append-not-starting-at-zero.html
+               media/media-source/media-source-remove-b-frame.html
+
+        * Modules/mediasource/SourceBuffer.cpp:
+        (WebCore::SourceBuffer::TrackBuffer::TrackBuffer):
+        (WebCore::SourceBuffer::removeCodedFrames):
+        (WebCore::SourceBuffer::sourceBufferPrivateDidReceiveSample):
+        (WebCore::SourceBuffer::provideMediaData):
+        (WebCore::SourceBuffer::reenqueueMediaForTime):
+        (WebCore::SourceBuffer::TrackBuffer::lastEnqueuedDecodeDuration): Deleted.
+
 2020-01-16  Cathie Chen  <cathiechen@igalia.com>
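The enqueue-gating behavior this entry describes can be sketched as a small standalone model. Everything below is hypothetical illustration, not WebKit code: times are seconds as plain doubles, `TrackBufferModel`, `Sample`, `append` and `provideMediaData` are invented names, and the ordered `std::map` stands in for WebKit's DecodeOrderSampleMap.

```cpp
#include <cassert>
#include <map>
#include <vector>

// Hypothetical model of the patch's enqueue logic (not WebKit code).
struct Sample { double decodeTime; double duration; };

struct TrackBufferModel {
    static constexpr double discontinuityTolerance = 1.0; // one second, as in the patch
    double enqueueDiscontinuityBoundary = discontinuityTolerance;
    std::map<double, Sample> decodeQueue; // ordered by decode time, like DecodeOrderSampleMap
    std::vector<double> enqueued;         // decode times actually handed to the framework

    void append(Sample s) { decodeQueue.emplace(s.decodeTime, s); }

    // Mirrors the patched provideMediaData(): stop at the first sample past the
    // boundary, and push the boundary forward past each sample that is enqueued.
    void provideMediaData() {
        while (!decodeQueue.empty()) {
            Sample s = decodeQueue.begin()->second;
            if (s.decodeTime > enqueueDiscontinuityBoundary)
                break; // big unbuffered gap: hold this sample until earlier ones arrive
            decodeQueue.erase(decodeQueue.begin());
            enqueueDiscontinuityBoundary = s.decodeTime + s.duration + discontinuityTolerance;
            enqueued.push_back(s.decodeTime);
        }
    }
};
```

In the bug scenario from the entry, appending [5, 10) first leaves it parked in the decode queue (5 is past the initial 1-second boundary from time zero); once [0, 5) is appended, it sorts first in the ordered map and both samples go out in presentation-correct order.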
trunk/Source/WebCore/Modules/mediasource/SourceBuffer.cpp (r254200 → r254670)

 static const double ExponentialMovingAverageCoefficient = 0.1;
 
+// Do not enqueue samples spanning a significant unbuffered gap.
+// NOTE: one second is somewhat arbitrary. MediaSource::monitorSourceBuffers() is run
+// on the playbackTimer, which is effectively every 350ms. Allowing > 350ms gap between
+// enqueued samples allows for situations where we overrun the end of a buffered range
+// but don't notice for 350s of playback time, and the client can enqueue data for the
+// new current time without triggering this early return.
+// FIXME(135867): Make this gap detection logic less arbitrary.
+static const MediaTime discontinuityTolerance = MediaTime(1, 1);
+
 struct SourceBuffer::TrackBuffer {
     MediaTime lastDecodeTimestamp;
…
     MediaTime lastFrameDuration;
     MediaTime highestPresentationTimestamp;
-    MediaTime lastEnqueuedPresentationTime;
+    MediaTime highestEnqueuedPresentationTime;
     MediaTime minimumEnqueuedPresentationTime;
     DecodeOrderSampleMap::KeyType lastEnqueuedDecodeKey;
-    MediaTime lastEnqueuedDecodeDuration;
+    MediaTime enqueueDiscontinuityBoundary { MediaTime::zeroTime() };
     MediaTime roundedTimestampOffset;
     uint32_t lastFrameTimescale { 0 };
…
         , lastFrameDuration(MediaTime::invalidTime())
         , highestPresentationTimestamp(MediaTime::invalidTime())
-        , lastEnqueuedPresentationTime(MediaTime::invalidTime())
+        , highestEnqueuedPresentationTime(MediaTime::invalidTime())
         , lastEnqueuedDecodeKey({MediaTime::invalidTime(), MediaTime::invalidTime()})
-        , lastEnqueuedDecodeDuration(MediaTime::invalidTime())
+        , enqueueDiscontinuityBoundary(discontinuityTolerance)
     {
     }
…
     // Only force the TrackBuffer to re-enqueue if the removed ranges overlap with enqueued and possibly
     // not yet displayed samples.
-    if (trackBuffer.lastEnqueuedPresentationTime.isValid() && currentMediaTime < trackBuffer.lastEnqueuedPresentationTime) {
-        PlatformTimeRanges possiblyEnqueuedRanges(currentMediaTime, trackBuffer.lastEnqueuedPresentationTime);
+    if (trackBuffer.highestEnqueuedPresentationTime.isValid() && currentMediaTime < trackBuffer.highestEnqueuedPresentationTime) {
+        PlatformTimeRanges possiblyEnqueuedRanges(currentMediaTime, trackBuffer.highestEnqueuedPresentationTime);
         possiblyEnqueuedRanges.intersectWith(erasedRanges);
         if (possiblyEnqueuedRanges.length()) {
…
     // not yet displayed samples.
     MediaTime currentMediaTime = m_source->currentTime();
-    if (trackBuffer.lastEnqueuedPresentationTime.isValid() && currentMediaTime < trackBuffer.lastEnqueuedPresentationTime) {
-        PlatformTimeRanges possiblyEnqueuedRanges(currentMediaTime, trackBuffer.lastEnqueuedPresentationTime);
+    if (trackBuffer.highestEnqueuedPresentationTime.isValid() && currentMediaTime < trackBuffer.highestEnqueuedPresentationTime) {
+        PlatformTimeRanges possiblyEnqueuedRanges(currentMediaTime, trackBuffer.highestEnqueuedPresentationTime);
         possiblyEnqueuedRanges.intersectWith(erasedRanges);
         if (possiblyEnqueuedRanges.length())
…
     // Note: The terminology here is confusing: "enqueuing" means providing a frame to the inner media framework.
-    // First, frames are inserted in the decode queue; later, at the end of the append all the frames in the decode
-    // queue are "enqueued" (sent to the inner media framework) in `provideMediaData()`.
+    // First, frames are inserted in the decode queue; later, at the end of the append some of the frames in the
+    // decode queue may be "enqueued" (sent to the inner media framework) in `provideMediaData()`.
     //
-    // In order to check whether a frame should be added to the decode queue we check whether it starts after the
-    // lastEnqueuedDecodeKey.
+    // In order to check whether a frame should be added to the decode queue we check that it does not precede any
+    // frame already enqueued.
+    //
+    // Note that adding a frame to the decode queue is no guarantee that it will be actually enqueued at that point.
+    // If the frame is after the discontinuity boundary, the enqueueing algorithm will hold it there until samples
+    // with earlier timestamps are enqueued. The decode queue is not FIFO, but rather an ordered map.
     DecodeOrderSampleMap::KeyType decodeKey(sample.decodeTime(), sample.presentationTime());
     if (trackBuffer.lastEnqueuedDecodeKey.first.isInvalid() || decodeKey > trackBuffer.lastEnqueuedDecodeKey) {
…
         auto sample = trackBuffer.decodeQueue.begin()->second;
 
-        // Do not enqueue samples spanning a significant unbuffered gap.
-        // NOTE: one second is somewhat arbitrary. MediaSource::monitorSourceBuffers() is run
-        // on the playbackTimer, which is effectively every 350ms. Allowing > 350ms gap between
-        // enqueued samples allows for situations where we overrun the end of a buffered range
-        // but don't notice for 350s of playback time, and the client can enqueue data for the
-        // new current time without triggering this early return.
-        // FIXME(135867): Make this gap detection logic less arbitrary.
-        MediaTime oneSecond(1, 1);
-        if (trackBuffer.lastEnqueuedDecodeKey.first.isValid()
-            && trackBuffer.lastEnqueuedDecodeDuration.isValid()
-            && sample->decodeTime() - trackBuffer.lastEnqueuedDecodeKey.first > oneSecond + trackBuffer.lastEnqueuedDecodeDuration) {
-
-            DEBUG_LOG(LOGIDENTIFIER, "bailing early because of unbuffered gap, new sample: ", sample->decodeTime(), ", last enqueued sample ends: ", trackBuffer.lastEnqueuedDecodeKey.first + trackBuffer.lastEnqueuedDecodeDuration);
+        if (sample->decodeTime() > trackBuffer.enqueueDiscontinuityBoundary) {
+            DEBUG_LOG(LOGIDENTIFIER, "bailing early because of unbuffered gap, new sample: ", sample->decodeTime(), " >= the current discontinuity boundary: ", trackBuffer.enqueueDiscontinuityBoundary);
             break;
         }
…
         trackBuffer.decodeQueue.erase(trackBuffer.decodeQueue.begin());
 
-        trackBuffer.lastEnqueuedPresentationTime = sample->presentationTime();
+        MediaTime samplePresentationEnd = sample->presentationTime() + sample->duration();
+        if (trackBuffer.highestEnqueuedPresentationTime.isInvalid() || samplePresentationEnd > trackBuffer.highestEnqueuedPresentationTime)
+            trackBuffer.highestEnqueuedPresentationTime = samplePresentationEnd;
+
         trackBuffer.lastEnqueuedDecodeKey = {sample->decodeTime(), sample->presentationTime()};
-        trackBuffer.lastEnqueuedDecodeDuration = sample->duration();
+        trackBuffer.enqueueDiscontinuityBoundary = sample->decodeTime() + sample->duration() + discontinuityTolerance;
+
         m_private->enqueueSample(sample.releaseNonNull(), trackID);
 #if !RELEASE_LOG_DISABLED
…
     m_private->flush(trackID);
     trackBuffer.decodeQueue.clear();
+
+    trackBuffer.highestEnqueuedPresentationTime = MediaTime::invalidTime();
+    trackBuffer.lastEnqueuedDecodeKey = {MediaTime::invalidTime(), MediaTime::invalidTime()};
+    trackBuffer.enqueueDiscontinuityBoundary = time + discontinuityTolerance;
 
     // Find the sample which contains the current presentation time.
…
         DecodeOrderSampleMap::KeyType decodeKey(copy->decodeTime(), copy->presentationTime());
         trackBuffer.decodeQueue.insert(DecodeOrderSampleMap::MapType::value_type(decodeKey, WTFMove(copy)));
     }
-
-    if (!trackBuffer.decodeQueue.empty()) {
-        auto lastSampleIter = trackBuffer.decodeQueue.rbegin();
-        auto lastSampleDecodeKey = lastSampleIter->first;
-        auto lastSampleDuration = lastSampleIter->second->duration();
-        trackBuffer.lastEnqueuedPresentationTime = lastSampleDecodeKey.second;
-        trackBuffer.lastEnqueuedDecodeKey = lastSampleDecodeKey;
-        trackBuffer.lastEnqueuedDecodeDuration = lastSampleDuration;
-    } else {
-        trackBuffer.lastEnqueuedPresentationTime = MediaTime::invalidTime();
-        trackBuffer.lastEnqueuedDecodeKey = {MediaTime::invalidTime(), MediaTime::invalidTime()};
-        trackBuffer.lastEnqueuedDecodeDuration = MediaTime::invalidTime();
-    }
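The B-frame problem that motivated replacing lastEnqueuedPresentationTime with a monotonically increasing highestEnqueuedPresentationTime can be shown with a minimal sketch. This is hypothetical illustration, not WebKit code: `Frame` and `trackEnqueuedTimes` are invented names, and times are plain doubles.

```cpp
#include <algorithm>
#include <cassert>
#include <utility>
#include <vector>

// Hypothetical sketch (not WebKit code): frames carry a presentation time and a
// duration and are enqueued in decode order; with B-frames, presentation times
// can move backwards as decode order advances.
struct Frame { double presentationTime; double duration; };

// Returns {lastEnqueuedPresentationTime, highestEnqueuedPresentationTime}
// after enqueueing all frames in decode order.
std::pair<double, double> trackEnqueuedTimes(const std::vector<Frame>& decodeOrder) {
    double last = -1.0;    // old scheme: presentation time of the last enqueued frame
    double highest = -1.0; // new scheme: highest presentation *end* time seen so far
    for (const Frame& f : decodeOrder) {
        last = f.presentationTime;
        highest = std::max(highest, f.presentationTime + f.duration);
    }
    return {last, highest};
}
```

For decode order I (pts 0), P (pts 2), B (pts 1), each lasting one second, `last` ends at 1.0 even though a frame covering [2, 3) was already enqueued, so removing the range [1.5, 3.0) would not appear to overlap any enqueued sample and no re-enqueue would be forced; `highest` ends at 3.0 and correctly flags the overlap.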