Changeset 255910 in webkit
- Timestamp:
- Feb 6, 2020 12:34:49 AM
- Location:
- trunk
- Files:
-
- 7 added
- 30 edited
- 1 copied
trunk/LayoutTests/ChangeLog
r255895 r255910

+ 2020-02-06  youenn fablet  <youenn@apple.com>
+
+         [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
+         https://bugs.webkit.org/show_bug.cgi?id=206582
+
+         Reviewed by Eric Carlson.
+
+         Disable tests on all platforms except the ones supporting AVAssetWriterDelegate.
+
+         * TestExpectations:
+         * http/wpt/mediarecorder/MediaRecorder-AV-audio-video-dataavailable-gpuprocess.html:
+         Remove web audio generation, since there seems to be some instability in the
+         web audio -> stream -> media recorder path, which should be fixed in follow-up patches.
+         * platform/mac/TestExpectations:
+         Enable running tests.
+
  2020-02-05  Devin Rousso  <drousso@apple.com>
trunk/LayoutTests/TestExpectations
r255875 r255910

  webgl/1.0.3/conformance/extensions/webgl-draw-buffers.html [ Skip ]

- webkit.org/b/197673 http/wpt/mediarecorder/MediaRecorder-AV-audio-video-dataavailable.html [ Pass Failure Timeout ]
+ # Not supported by default
+ http/wpt/mediarecorder [ Skip ]
+ imported/w3c/web-platform-tests/mediacapture-record [ Skip ]
+ fast/history/page-cache-media-recorder.html [ Skip ]

  # To aid transition to ANGLE backend for WebGL, skip all of WebGL 2 tests for now.
trunk/LayoutTests/http/wpt/mediarecorder/MediaRecorder-AV-audio-video-dataavailable-gpuprocess.html
r255819 r255910

  async_test(t => {
-     const ac = new AudioContext();
-     const osc = ac.createOscillator();
-     const dest = ac.createMediaStreamDestination();
-     const audio = dest.stream;
-     osc.connect(dest);
-
      const video = createVideoStream();
-     assert_equals(video.getAudioTracks().length, 0, "video mediastream starts with no audio track");
-     assert_equals(audio.getAudioTracks().length, 1, "audio mediastream starts with one audio track");
-     video.addTrack(audio.getAudioTracks()[0]);
-     assert_equals(video.getAudioTracks().length, 1, "video mediastream starts with one audio track");
      const recorder = new MediaRecorder(video);
      let mode = 0;
trunk/LayoutTests/platform/mac/TestExpectations
r255869 r255910

  [ Catalina+ ] fast/text/design-system-ui-16.html [ Pass ]

+ [ Catalina+ ] http/wpt/mediarecorder [ Pass Failure ]
+ [ Catalina+ ] imported/w3c/web-platform-tests/mediacapture-record [ Pass Failure ]
+ [ Catalina+ ] fast/history/page-cache-media-recorder.html [ Pass Failure ]

  webkit.org/b/200128 imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html [ Timeout Pass ]
trunk/Source/WTF/ChangeLog
r255905 r255910

+ 2020-02-06  youenn fablet  <youenn@apple.com>
+
+         [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
+         https://bugs.webkit.org/show_bug.cgi?id=206582
+
+         Reviewed by Eric Carlson.
+
+         * wtf/PlatformHave.h:
+
  2020-02-05  Don Olmstead  <don.olmstead@sony.com>
trunk/Source/WTF/wtf/PlatformHave.h
r255819 r255910

  #endif

+ #if PLATFORM(COCOA) && (defined __has_include && __has_include(<AVFoundation/AVAssetWriter_Private.h>))
+ #define HAVE_AVASSETWRITERDELEGATE 1
+ #endif

  #if PLATFORM(IOS_FAMILY) && !PLATFORM(WATCHOS) && !PLATFORM(APPLETV)
  #define HAVE_SYSTEM_FONT_STYLE_TITLE_0 1
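The `HAVE_AVASSETWRITERDELEGATE` guard above relies on the compiler's `__has_include` extension: the feature is enabled only when the private AVFoundation header is actually present in the SDK. A minimal, platform-free sketch of the same detection pattern, probing a standard header so it builds anywhere (the `HAVE_OPTIONAL` name and `haveOptional()` helper are illustrative, not WebKit code):

```cpp
// Compile-time feature detection in the style of WTF's PlatformHave.h:
// define a HAVE_* macro only when the probed header exists.
#if defined(__has_include)
#  if __has_include(<optional>)
#    define HAVE_OPTIONAL 1
#  endif
#endif

#ifndef HAVE_OPTIONAL
#  define HAVE_OPTIONAL 0
#endif

// Code can then gate itself the same way the MediaRecorder writer is gated
// on HAVE(AVASSETWRITERDELEGATE).
constexpr bool haveOptional()
{
#if HAVE_OPTIONAL
    return true;
#else
    return false;
#endif
}
```

Because the probe happens at preprocessing time, no link-time dependency on the optional header's framework is introduced when it is absent.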
trunk/Source/WebCore/ChangeLog
r255909 r255910

+ 2020-02-06  youenn fablet  <youenn@apple.com>
+
+         [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
+         https://bugs.webkit.org/show_bug.cgi?id=206582
+         <rdar://problem/58985368>
+
+         Reviewed by Eric Carlson.
+
+         AVAssetWriterDelegate allows grabbing recorded data whenever desired.
+         This delegate requires passing compressed samples to AVAssetWriter.
+         Implement audio and video encoding in dedicated classes, AudioSampleBufferCompressor and
+         VideoSampleBufferCompressor, and run buffers through them before adding them to AVAssetWriter.
+         These classes support AAC and H.264 so far and should be extended to support more encoding options.
+
+         Instantiate the real writer only on platforms supporting AVAssetWriterDelegate, since it is not supported everywhere.
+         The writer, which does the packaging, receives compressed buffers from the audio/video compressors.
+         When asked to flush, it sends its data to its delegate, which forwards it to the MediaRecorderPrivateWriter.
+         The MediaRecorderPrivateWriter stores the data in a SharedBuffer until MediaRecorder asks for it.
+
+         Note that whenever we request data, we flush the writer and insert an end-of-video sample to make sure video data gets flushed.
+         Data should therefore not be requested too frequently, or video compression will be inadequate.
+
+         Covered by existing tests.
+
+         * Modules/mediarecorder/MediaRecorderProvider.cpp:
+         (WebCore::MediaRecorderProvider::createMediaRecorderPrivate):
+         * WebCore.xcodeproj/project.pbxproj:
+         * platform/mediarecorder/MediaRecorderPrivateAVFImpl.cpp:
+         (WebCore::MediaRecorderPrivateAVFImpl::create):
+         * platform/mediarecorder/MediaRecorderPrivateAVFImpl.h:
+         * platform/mediarecorder/cocoa/AudioSampleBufferCompressor.h: Added.
+         * platform/mediarecorder/cocoa/AudioSampleBufferCompressor.mm: Added.
+         (WebCore::AudioSampleBufferCompressor::create):
+         (WebCore::AudioSampleBufferCompressor::AudioSampleBufferCompressor):
+         (WebCore::AudioSampleBufferCompressor::~AudioSampleBufferCompressor):
+         (WebCore::AudioSampleBufferCompressor::initialize):
+         (WebCore::AudioSampleBufferCompressor::finish):
+         (WebCore::AudioSampleBufferCompressor::initAudioConverterForSourceFormatDescription):
+         (WebCore::AudioSampleBufferCompressor::computeBufferSizeForAudioFormat):
+         (WebCore::AudioSampleBufferCompressor::attachPrimingTrimsIfNeeded):
+         (WebCore::AudioSampleBufferCompressor::gradualDecoderRefreshCount):
+         (WebCore::AudioSampleBufferCompressor::sampleBufferWithNumPackets):
+         (WebCore::AudioSampleBufferCompressor::audioConverterComplexInputDataProc):
+         (WebCore::AudioSampleBufferCompressor::provideSourceDataNumOutputPackets):
+         (WebCore::AudioSampleBufferCompressor::processSampleBuffersUntilLowWaterTime):
+         (WebCore::AudioSampleBufferCompressor::processSampleBuffer):
+         (WebCore::AudioSampleBufferCompressor::addSampleBuffer):
+         (WebCore::AudioSampleBufferCompressor::getOutputSampleBuffer):
+         (WebCore::AudioSampleBufferCompressor::takeOutputSampleBuffer):
+         * platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.h:
+         * platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm:
+         (-[WebAVAssetWriterDelegate initWithWriter:]):
+         (-[WebAVAssetWriterDelegate assetWriter:didProduceFragmentedHeaderData:]):
+         (-[WebAVAssetWriterDelegate assetWriter:didProduceFragmentedMediaData:fragmentedMediaDataReport:]):
+         (-[WebAVAssetWriterDelegate close]):
+         (WebCore::MediaRecorderPrivateWriter::create):
+         (WebCore::MediaRecorderPrivateWriter::compressedVideoOutputBufferCallback):
+         (WebCore::MediaRecorderPrivateWriter::compressedAudioOutputBufferCallback):
+         (WebCore::MediaRecorderPrivateWriter::MediaRecorderPrivateWriter):
+         (WebCore::MediaRecorderPrivateWriter::~MediaRecorderPrivateWriter):
+         (WebCore::MediaRecorderPrivateWriter::initialize):
+         (WebCore::MediaRecorderPrivateWriter::processNewCompressedVideoSampleBuffers):
+         (WebCore::MediaRecorderPrivateWriter::processNewCompressedAudioSampleBuffers):
+         (WebCore::MediaRecorderPrivateWriter::startAssetWriter):
+         (WebCore::MediaRecorderPrivateWriter::appendCompressedAudioSampleBuffer):
+         (WebCore::MediaRecorderPrivateWriter::appendCompressedVideoSampleBuffer):
+         (WebCore::MediaRecorderPrivateWriter::appendCompressedSampleBuffers):
+         (WebCore::appendEndsPreviousSampleDurationMarker):
+         (WebCore::MediaRecorderPrivateWriter::appendEndOfVideoSampleDurationIfNeeded):
+         (WebCore::MediaRecorderPrivateWriter::flushCompressedSampleBuffers):
+         (WebCore::MediaRecorderPrivateWriter::clear):
+         (WebCore::copySampleBufferWithCurrentTimeStamp):
+         (WebCore::MediaRecorderPrivateWriter::appendVideoSampleBuffer):
+         (WebCore::createAudioFormatDescription):
+         (WebCore::createAudioSampleBuffer):
+         (WebCore::MediaRecorderPrivateWriter::appendAudioSampleBuffer):
+         (WebCore::MediaRecorderPrivateWriter::stopRecording):
+         (WebCore::MediaRecorderPrivateWriter::appendData):
+         * platform/mediarecorder/cocoa/VideoSampleBufferCompressor.h: Copied from Source/WebCore/platform/mediarecorder/MediaRecorderPrivateAVFImpl.h.
+         * platform/mediarecorder/cocoa/VideoSampleBufferCompressor.mm: Added.
+         (WebCore::VideoSampleBufferCompressor::create):
+         (WebCore::VideoSampleBufferCompressor::VideoSampleBufferCompressor):
+         (WebCore::VideoSampleBufferCompressor::~VideoSampleBufferCompressor):
+         (WebCore::VideoSampleBufferCompressor::initialize):
+         (WebCore::VideoSampleBufferCompressor::finish):
+         (WebCore::VideoSampleBufferCompressor::videoCompressionCallback):
+         (WebCore::VideoSampleBufferCompressor::initCompressionSession):
+         (WebCore::VideoSampleBufferCompressor::processSampleBuffer):
+         (WebCore::VideoSampleBufferCompressor::addSampleBuffer):
+         (WebCore::VideoSampleBufferCompressor::getOutputSampleBuffer):
+         (WebCore::VideoSampleBufferCompressor::takeOutputSampleBuffer):
+
  2020-02-06  youenn fablet  <youenn@apple.com>
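The pipeline the ChangeLog describes — compressors queue compressed samples, the writer drains those queues into packaged output, and `fetchData` hands out whatever has accumulated — can be modeled without any Apple frameworks. The sketch below is a simplified, platform-free illustration of that flow; `SampleCompressor`, `RecorderWriter`, and the byte-copy "compression" are stand-ins, not WebKit API:

```cpp
#include <cstdint>
#include <deque>
#include <utility>
#include <vector>

// One compressed sample, standing in for a CMSampleBufferRef of encoded data.
struct CompressedSample { std::vector<std::uint8_t> bytes; };

// Stand-in for AudioSampleBufferCompressor / VideoSampleBufferCompressor:
// accepts raw buffers and queues "compressed" output (here, a plain copy).
class SampleCompressor {
public:
    void addSampleBuffer(std::vector<std::uint8_t> raw) { m_output.push_back({ std::move(raw) }); }
    bool hasOutput() const { return !m_output.empty(); }
    CompressedSample takeOutputSampleBuffer()
    {
        CompressedSample sample = std::move(m_output.front());
        m_output.pop_front();
        return sample;
    }

private:
    std::deque<CompressedSample> m_output;
};

// Stand-in for the writer: drains compressed samples into packaged output and
// surrenders accumulated data on request, like fetchData() on a SharedBuffer.
class RecorderWriter {
public:
    explicit RecorderWriter(SampleCompressor& compressor) : m_compressor(compressor) { }

    // Drain the compressor queue, as the real writer does when asked to flush.
    void flush()
    {
        while (m_compressor.hasOutput()) {
            auto sample = m_compressor.takeOutputSampleBuffer();
            m_data.insert(m_data.end(), sample.bytes.begin(), sample.bytes.end());
        }
    }

    // Return everything packaged so far and reset the accumulator.
    std::vector<std::uint8_t> fetchData()
    {
        flush();
        return std::exchange(m_data, {});
    }

private:
    SampleCompressor& m_compressor;
    std::vector<std::uint8_t> m_data;
};
```

The model also makes the ChangeLog's caveat visible: each `fetchData` forces a flush, so polling too often gives the compressor little to batch per flush.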
trunk/Source/WebCore/Modules/mediarecorder/MediaRecorderProvider.cpp
r255819 r255910

  std::unique_ptr<MediaRecorderPrivate> MediaRecorderProvider::createMediaRecorderPrivate(const MediaStreamPrivate& stream)
  {
+ #if HAVE(AVASSETWRITERDELEGATE)
      return MediaRecorderPrivateAVFImpl::create(stream);
+ #else
+     UNUSED_PARAM(stream);
+     return nullptr;
+ #endif
  }
trunk/Source/WebCore/PAL/ChangeLog
r255881 r255910

+ 2020-02-06  youenn fablet  <youenn@apple.com>
+
+         [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
+         https://bugs.webkit.org/show_bug.cgi?id=206582
+         <rdar://problem/58985368>
+
+         Reviewed by Eric Carlson.
+
+         Add soft link macros for VideoToolbox and AudioToolbox.
+
+         * PAL.xcodeproj/project.pbxproj:
+         * pal/cf/AudioToolboxSoftLink.cpp: Added.
+         * pal/cf/AudioToolboxSoftLink.h: Added.
+         * pal/cf/CoreMediaSoftLink.cpp:
+         * pal/cf/CoreMediaSoftLink.h:
+         * pal/cf/VideoToolboxSoftLink.cpp: Added.
+         * pal/cf/VideoToolboxSoftLink.h: Added.
+
  2020-02-05  Jer Noble  <jer.noble@apple.com>
trunk/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj
r255819 r255910 116 116 2E1342CD215AA10A007199D2 /* UIKitSoftLink.mm in Sources */ = {isa = PBXBuildFile; fileRef = 2E1342CB215AA10A007199D2 /* UIKitSoftLink.mm */; }; 117 117 31308B1420A21705003FB929 /* SystemPreviewSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 31308B1320A21705003FB929 /* SystemPreviewSPI.h */; }; 118 416E995323DAE6BE00E871CB /* AudioToolboxSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 416E995123DAE6BD00E871CB /* AudioToolboxSoftLink.cpp */; }; 119 416E995423DAE6BE00E871CB /* AudioToolboxSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = 416E995223DAE6BE00E871CB /* AudioToolboxSoftLink.h */; }; 120 416E995723DAEFF800E871CB /* VideoToolboxSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 416E995523DAEFF700E871CB /* VideoToolboxSoftLink.cpp */; }; 121 416E995823DAEFF800E871CB /* VideoToolboxSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = 416E995623DAEFF700E871CB /* VideoToolboxSoftLink.h */; }; 118 122 442956CD218A72DF0080DB54 /* RevealSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 442956CC218A72DE0080DB54 /* RevealSPI.h */; }; 119 123 4450FC9F21F5F602004DFA56 /* QuickLookSoftLink.mm in Sources */ = {isa = PBXBuildFile; fileRef = 4450FC9D21F5F602004DFA56 /* QuickLookSoftLink.mm */; }; … … 291 295 31308B1320A21705003FB929 /* SystemPreviewSPI.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = SystemPreviewSPI.h; sourceTree = "<group>"; }; 292 296 37119A7820CCB5FF002C6DC9 /* WebKitTargetConditionals.xcconfig */ = {isa = PBXFileReference; lastKnownFileType = text.xcconfig; path = WebKitTargetConditionals.xcconfig; sourceTree = "<group>"; }; 297 416E995123DAE6BD00E871CB /* AudioToolboxSoftLink.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = AudioToolboxSoftLink.cpp; sourceTree = "<group>"; }; 298 416E995223DAE6BE00E871CB /* AudioToolboxSoftLink.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = 
sourcecode.c.h; path = AudioToolboxSoftLink.h; sourceTree = "<group>"; }; 299 416E995523DAEFF700E871CB /* VideoToolboxSoftLink.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = VideoToolboxSoftLink.cpp; sourceTree = "<group>"; }; 300 416E995623DAEFF700E871CB /* VideoToolboxSoftLink.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = VideoToolboxSoftLink.h; sourceTree = "<group>"; }; 293 301 442956CC218A72DE0080DB54 /* RevealSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RevealSPI.h; sourceTree = "<group>"; }; 294 302 4450FC9D21F5F602004DFA56 /* QuickLookSoftLink.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = QuickLookSoftLink.mm; sourceTree = "<group>"; }; … … 522 530 isa = PBXGroup; 523 531 children = ( 532 416E995123DAE6BD00E871CB /* AudioToolboxSoftLink.cpp */, 533 416E995223DAE6BE00E871CB /* AudioToolboxSoftLink.h */, 524 534 0CF99CA61F738436007EE793 /* CoreMediaSoftLink.cpp */, 525 535 0CF99CA71F738437007EE793 /* CoreMediaSoftLink.h */, 536 416E995523DAEFF700E871CB /* VideoToolboxSoftLink.cpp */, 537 416E995623DAEFF700E871CB /* VideoToolboxSoftLink.h */, 526 538 ); 527 539 path = cf; … … 713 725 57FD318B22B35989008D0E8B /* AppSSOSoftLink.h in Headers */, 714 726 576CA9D622B854AB0030143C /* AppSSOSPI.h in Headers */, 727 416E995423DAE6BE00E871CB /* AudioToolboxSoftLink.h in Headers */, 715 728 2D02E93C2056FAA700A13797 /* AudioToolboxSPI.h in Headers */, 716 729 572A107822B456F500F410C8 /* AuthKitSPI.h in Headers */, … … 835 848 0C5AF9221F43A4C7002EAC02 /* UIKitSPI.h in Headers */, 836 849 0C2DA1471F3BEB4900DBC317 /* URLFormattingSPI.h in Headers */, 850 416E995823DAEFF800E871CB /* VideoToolboxSoftLink.h in Headers */, 837 851 0C2DA1591F3BEB4900DBC317 /* WebFilterEvaluatorSPI.h in Headers */, 838 852 A10826F91F576292004772AC /* WebPanel.h in Headers */, … … 927 941 files 
= ( 928 942 57FD318A22B3593E008D0E8B /* AppSSOSoftLink.mm in Sources */, 943 416E995323DAE6BE00E871CB /* AudioToolboxSoftLink.cpp in Sources */, 929 944 077E87B1226A460200A2AFF0 /* AVFoundationSoftLink.mm in Sources */, 930 945 0C5FFF0F1F78D9DA009EFF1A /* ClockCM.mm in Sources */, … … 953 968 A3AB6E651F3D217F009C14B1 /* SystemSleepListenerMac.mm in Sources */, 954 969 2E1342CD215AA10A007199D2 /* UIKitSoftLink.mm in Sources */, 970 416E995723DAEFF800E871CB /* VideoToolboxSoftLink.cpp in Sources */, 955 971 A10826FA1F576292004772AC /* WebPanel.mm in Sources */, 956 972 ); -
trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.cpp
r255819 r255910 47 47 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMBlockBufferCopyDataBytes, OSStatus, (CMBlockBufferRef theSourceBuffer, size_t offsetToData, size_t dataLength, void* destination), (theSourceBuffer, offsetToData, dataLength, destination)) 48 48 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMBlockBufferGetDataLength, size_t, (CMBlockBufferRef theBuffer), (theBuffer)) 49 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMBlockBufferReplaceDataBytes, OSStatus, (const void* sourceBytes, CMBlockBufferRef destinationBuffer, size_t offsetIntoDestination, size_t dataLength), (sourceBytes, destinationBuffer, offsetIntoDestination, dataLength)) 49 50 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMFormatDescriptionGetExtensions, CFDictionaryRef, (CMFormatDescriptionRef desc), (desc)) 50 51 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMSampleBufferGetTypeID, CFTypeID, (void), ()) … … 134 135 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMBufferQueueIsEmpty, Boolean, (CMBufferQueueRef queue), (queue)) 135 136 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMBufferQueueGetBufferCount, CMItemCount, (CMBufferQueueRef queue), (queue)) 137 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMBufferQueueGetCallbacksForUnsortedSampleBuffers, const CMBufferCallbacks *, (), ()) 136 138 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue)) 137 139 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMBufferQueueGetEndPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue)) 138 140 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut)) 141 
SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMBufferQueueMarkEndOfData, OSStatus, (CMBufferQueueRef queue), (queue)) 139 142 140 143 SOFT_LINK_CONSTANT_FOR_SOURCE(PAL, CoreMedia, kCMSampleAttachmentKey_DoNotDisplay, CFStringRef) … … 150 153 SOFT_LINK_CONSTANT_FOR_SOURCE(PAL, CoreMedia, kCMSampleAttachmentKey_IsDependedOnByOthers, CFStringRef) 151 154 SOFT_LINK_CONSTANT_FOR_SOURCE(PAL, CoreMedia, kCMSampleBufferConsumerNotification_BufferConsumed, CFStringRef) 155 SOFT_LINK_CONSTANT_FOR_SOURCE(PAL, CoreMedia, kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration, CFStringRef) 156 SOFT_LINK_CONSTANT_FOR_SOURCE(PAL, CoreMedia, kCMSampleBufferAttachmentKey_GradualDecoderRefresh, CFStringRef) 157 SOFT_LINK_CONSTANT_FOR_SOURCE(PAL, CoreMedia, get_CoreMedia_kCMSampleBufferAttachmentKey_TrimDurationAtStart, CFStringRef) 152 158 153 159 SOFT_LINK_CONSTANT_FOR_SOURCE(PAL, CoreMedia, kCMTimebaseNotification_EffectiveRateChanged, CFStringRef) … … 164 170 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMSampleBufferSetDataReady, OSStatus, (CMSampleBufferRef sbuf), (sbuf)) 165 171 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMAudioFormatDescriptionCreate, OSStatus, (CFAllocatorRef allocator, const AudioStreamBasicDescription* asbd, size_t layoutSize, const AudioChannelLayout* layout, size_t magicCookieSize, const void* magicCookie, CFDictionaryRef extensions, CMAudioFormatDescriptionRef* outDesc), (allocator, asbd, layoutSize, layout, magicCookieSize, magicCookie, extensions, outDesc)) 172 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMAudioFormatDescriptionGetMagicCookie, const void*, (CMAudioFormatDescriptionRef desc, size_t* sizeOut), (desc, sizeOut)) 173 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMAudioFormatDescriptionGetRichestDecodableFormat, const AudioFormatListItem *, (CMAudioFormatDescriptionRef desc), (desc)) 174 166 175 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, CoreMedia, CMClockGetHostTimeClock, CMClockRef, (void), ()) 167 176 SOFT_LINK_FUNCTION_FOR_SOURCE(PAL, 
CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock)) -
trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.h
r255819 r255910 50 50 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBlockBufferGetDataLength, size_t, (CMBlockBufferRef theBuffer), (theBuffer)) 51 51 #define CMBlockBufferGetDataLength softLink_CoreMedia_CMBlockBufferGetDataLength 52 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBlockBufferReplaceDataBytes, OSStatus, (const void* sourceBytes, CMBlockBufferRef destinationBuffer, size_t offsetIntoDestination, size_t dataLength), (sourceBytes, destinationBuffer, offsetIntoDestination, dataLength)) 53 #define CMBlockBufferReplaceDataBytes softLink_CoreMedia_CMBlockBufferReplaceDataBytes 52 54 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMFormatDescriptionGetExtensions, CFDictionaryRef, (CMFormatDescriptionRef desc), (desc)) 53 55 #define CMFormatDescriptionGetExtensions softLink_CoreMedia_CMFormatDescriptionGetExtensions … … 224 226 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue)) 225 227 #define CMBufferQueueGetFirstPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetFirstPresentationTimeStamp 228 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueGetCallbacksForUnsortedSampleBuffers, const CMBufferCallbacks *, (), ()) 229 #define CMBufferQueueGetCallbacksForUnsortedSampleBuffers softLink_CoreMedia_CMBufferQueueGetCallbacksForUnsortedSampleBuffers 226 230 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueGetEndPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue)) 227 231 #define CMBufferQueueGetEndPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetEndPresentationTimeStamp 228 232 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, 
triggerCondition, triggerThreshold, triggerTokenOut)) 229 233 #define CMBufferQueueInstallTriggerWithIntegerThreshold softLink_CoreMedia_CMBufferQueueInstallTriggerWithIntegerThreshold 234 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueMarkEndOfData, OSStatus, (CMBufferQueueRef queue), (queue)) 235 #define CMBufferQueueMarkEndOfData softLink_CoreMedia_CMBufferQueueMarkEndOfData 230 236 231 237 SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleAttachmentKey_DoNotDisplay, CFStringRef) … … 257 263 SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleBufferConsumerNotification_BufferConsumed, CFStringRef) 258 264 #define kCMSampleBufferConsumerNotification_BufferConsumed get_CoreMedia_kCMSampleBufferConsumerNotification_BufferConsumed() 265 SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration, CFStringRef) 266 #define kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration get_CoreMedia_kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration() 267 SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleBufferAttachmentKey_GradualDecoderRefresh, CFStringRef) 268 #define kCMSampleBufferAttachmentKey_GradualDecoderRefresh get_CoreMedia_kCMSampleBufferAttachmentKey_GradualDecoderRefresh() 269 SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, get_CoreMedia_kCMSampleBufferAttachmentKey_TrimDurationAtStart, CFStringRef) 270 #define get_CoreMedia_kCMSampleBufferAttachmentKey_TrimDurationAtStart get_CoreMedia_kCMSampleBufferAttachmentKey_TrimDurationAtStart() 271 272 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMAudioFormatDescriptionGetMagicCookie, const void*, (CMAudioFormatDescriptionRef desc, size_t* sizeOut), (desc, sizeOut)) 273 #define CMAudioFormatDescriptionGetMagicCookie softLink_CoreMedia_CMAudioFormatDescriptionGetMagicCookie 259 274 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMAudioFormatDescriptionGetStreamBasicDescription, const AudioStreamBasicDescription *, (CMAudioFormatDescriptionRef desc), 
(desc)) 260 275 #define CMAudioFormatDescriptionGetStreamBasicDescription softLink_CoreMedia_CMAudioFormatDescriptionGetStreamBasicDescription … … 265 280 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMSampleBufferGetNumSamples, CMItemCount, (CMSampleBufferRef sbuf), (sbuf)) 266 281 #define CMSampleBufferGetNumSamples softLink_CoreMedia_CMSampleBufferGetNumSamples 282 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMAudioFormatDescriptionGetRichestDecodableFormat, const AudioFormatListItem *, (CMAudioFormatDescriptionRef desc), (desc)) 283 #define CMAudioFormatDescriptionGetRichestDecodableFormat softLink_CoreMedia_CMAudioFormatDescriptionGetRichestDecodableFormat 267 284 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMSampleBufferCopySampleBufferForRange, OSStatus, (CFAllocatorRef allocator, CMSampleBufferRef sbuf, CFRange sampleRange, CMSampleBufferRef* sBufOut), (allocator, sbuf, sampleRange, sBufOut)) 268 285 #define CMSampleBufferCopySampleBufferForRange softLink_CoreMedia_CMSampleBufferCopySampleBufferForRange -
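The `SOFT_LINK_FUNCTION_FOR_SOURCE`/`FOR_HEADER` pairs above generate wrappers that resolve a framework symbol lazily and forward calls through a cached function pointer, so CoreMedia is not hard-linked. A miniature of that expansion, with a self-contained `lookupSymbol()` standing in for the real `dlsym`-based resolver (all names here are illustrative):

```cpp
#include <cmath>
#include <cstring>

using CosFn = double (*)(double);

// The "framework" symbol the wrapper will resolve to.
static double cosImpl(double x) { return std::cos(x); }

// Stand-in for dlsym() against the soft-linked framework binary.
static void* lookupSymbol(const char* name)
{
    if (std::strcmp(name, "cos") == 0)
        return reinterpret_cast<void*>(&cosImpl);
    return nullptr;
}

// What a generated softLink_CoreMedia_* wrapper boils down to:
// resolve the symbol on first call, cache the pointer, then forward.
static double softLink_cos(double x)
{
    static CosFn function = reinterpret_cast<CosFn>(lookupSymbol("cos"));
    return function(x);
}
```

The `#define CMBufferQueueMarkEndOfData softLink_CoreMedia_CMBufferQueueMarkEndOfData` lines in the header play the same role as calling `softLink_cos` under the original symbol name: callers keep the framework's spelling while the wrapper handles resolution.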
trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj
r255819 r255910 1098 1098 416E6FE91BBD12E5000A6043 /* ReadableStreamBuiltins.h in Headers */ = {isa = PBXBuildFile; fileRef = 9B03D8061BB3110D00B764D8 /* ReadableStreamBuiltins.h */; settings = {ATTRIBUTES = (Private, ); }; }; 1099 1099 416E6FE91BBD12E5000A6053 /* WritableStreamBuiltins.h in Headers */ = {isa = PBXBuildFile; fileRef = 9B03D8061BB3110D00B764E8 /* WritableStreamBuiltins.h */; settings = {ATTRIBUTES = (Private, ); }; }; 1100 416F799023D750CF00829FC1 /* AudioSampleBufferCompressor.mm in Sources */ = {isa = PBXBuildFile; fileRef = 416F798F23D750CB00829FC1 /* AudioSampleBufferCompressor.mm */; }; 1100 1101 417253AB1354BBBC00360F2A /* MediaControlElements.h in Headers */ = {isa = PBXBuildFile; fileRef = 417253A91354BBBC00360F2A /* MediaControlElements.h */; }; 1101 1102 417612AF1E3A994000C3D81D /* LibWebRTCMediaEndpoint.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 417612AB1E3A993B00C3D81D /* LibWebRTCMediaEndpoint.cpp */; }; … … 1138 1139 41BF204922BA7BE80004F812 /* RealtimeVideoSource.h in Headers */ = {isa = PBXBuildFile; fileRef = 41BF204022B947160004F812 /* RealtimeVideoSource.h */; settings = {ATTRIBUTES = (Private, ); }; }; 1139 1140 41C760B10EDE03D300C1655F /* ScriptState.h in Headers */ = {isa = PBXBuildFile; fileRef = 41C760B00EDE03D300C1655F /* ScriptState.h */; settings = {ATTRIBUTES = (Private, ); }; }; 1141 41CD6F8C23D6E82100B16421 /* VideoSampleBufferCompressor.mm in Sources */ = {isa = PBXBuildFile; fileRef = 41CD6F8B23D6E81D00B16421 /* VideoSampleBufferCompressor.mm */; }; 1140 1142 41D015CA0F4B5C71004A662F /* ContentType.h in Headers */ = {isa = PBXBuildFile; fileRef = 41D015C80F4B5C71004A662F /* ContentType.h */; settings = {ATTRIBUTES = (Private, ); }; }; 1141 1143 41D129CE1F3D0EF600D15E47 /* WorkerGlobalScopeCaches.h in Headers */ = {isa = PBXBuildFile; fileRef = 41FB278D1F34C28200795487 /* WorkerGlobalScopeCaches.h */; }; … … 7361 7363 416E0B37209BC3C2004A95D9 /* FetchIdentifier.h */ = {isa = PBXFileReference; fileEncoding = 4; 
lastKnownFileType = sourcecode.c.h; path = FetchIdentifier.h; sourceTree = "<group>"; }; 7362 7364 416E29A5102FA962007FC14E /* WorkerReportingProxy.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WorkerReportingProxy.h; sourceTree = "<group>"; }; 7365 416F798D23D750CA00829FC1 /* AudioSampleBufferCompressor.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioSampleBufferCompressor.h; sourceTree = "<group>"; }; 7366 416F798F23D750CB00829FC1 /* AudioSampleBufferCompressor.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AudioSampleBufferCompressor.mm; sourceTree = "<group>"; }; 7363 7367 4170A2E91D8C0CC000318452 /* JSDOMWrapper.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = JSDOMWrapper.cpp; sourceTree = "<group>"; }; 7364 7368 417253A81354BBBC00360F2A /* MediaControlElements.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = MediaControlElements.cpp; sourceTree = "<group>"; }; … … 7468 7472 41C7E1061E6A54360027B4DE /* CanvasCaptureMediaStreamTrack.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CanvasCaptureMediaStreamTrack.h; sourceTree = "<group>"; }; 7469 7473 41C7E1081E6AA37C0027B4DE /* CanvasCaptureMediaStreamTrack.idl */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = CanvasCaptureMediaStreamTrack.idl; sourceTree = "<group>"; }; 7474 41CD6F8923D6E81C00B16421 /* VideoSampleBufferCompressor.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = VideoSampleBufferCompressor.h; sourceTree = "<group>"; }; 7475 41CD6F8B23D6E81D00B16421 /* VideoSampleBufferCompressor.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = VideoSampleBufferCompressor.mm; sourceTree = 
"<group>"; }; 7470 7476 41CF8BE41D46222000707DC9 /* FetchBodyConsumer.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = FetchBodyConsumer.cpp; sourceTree = "<group>"; }; 7471 7477 41CF8BE51D46222000707DC9 /* FetchBodyConsumer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = FetchBodyConsumer.h; sourceTree = "<group>"; }; … … 18890 18896 isa = PBXGroup; 18891 18897 children = ( 18898 416F798D23D750CA00829FC1 /* AudioSampleBufferCompressor.h */, 18899 416F798F23D750CB00829FC1 /* AudioSampleBufferCompressor.mm */, 18892 18900 4D73F94C218C4A87003A3ED6 /* MediaRecorderPrivateWriterCocoa.h */, 18893 18901 4D73F94D218C4A87003A3ED6 /* MediaRecorderPrivateWriterCocoa.mm */, 18902 41CD6F8923D6E81C00B16421 /* VideoSampleBufferCompressor.h */, 18903 41CD6F8B23D6E81D00B16421 /* VideoSampleBufferCompressor.mm */, 18894 18904 ); 18895 18905 path = cocoa; … … 33537 33547 CD0EEE0E14743F39003EAFA2 /* AudioDestinationIOS.cpp in Sources */, 33538 33548 CD5596911475B678001D0BD0 /* AudioFileReaderIOS.cpp in Sources */, 33549 416F799023D750CF00829FC1 /* AudioSampleBufferCompressor.mm in Sources */, 33539 33550 CDA79827170A279100D45C55 /* AudioSessionIOS.mm in Sources */, 33540 33551 CD8A7BBB197735FE00CBD643 /* AudioSourceProviderAVFObjC.mm in Sources */, … … 34269 34280 3FBC4AF3189881560046EE38 /* VideoFullscreenInterfaceAVKit.mm in Sources */, 34270 34281 52D5A18F1C54592300DE34A3 /* VideoFullscreenLayerManagerObjC.mm in Sources */, 34282 41CD6F8C23D6E82100B16421 /* VideoSampleBufferCompressor.mm in Sources */, 34271 34283 BE88E0DE1715D2A200658D98 /* VideoTrack.cpp in Sources */, 34272 34284 BE88E0E11715D2A200658D98 /* VideoTrackList.cpp in Sources */, -
trunk/Source/WebCore/platform/mediarecorder/MediaRecorderPrivateAVFImpl.cpp
r255819 r255910

  #include "MediaRecorderPrivateAVFImpl.h"

- #if ENABLE(MEDIA_STREAM)
+ #if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

  #include "AudioStreamDescription.h"
+ #include "MediaRecorderPrivateWriterCocoa.h"
  #include "MediaSample.h"
  #include "MediaStreamPrivate.h"
…
          }
      }
-     auto writer = MediaRecorderPrivateWriter::create(audioTrack, videoTrack);
+
+     int width = 0, height = 0;
+     if (videoTrack) {
+         auto& settings = videoTrack->settings();
+         width = settings.width();
+         height = settings.height();
+     }
+     auto writer = MediaRecorderPrivateWriter::create(!!audioTrack, width, height);
      if (!writer)
          return nullptr;
…
  } // namespace WebCore

- #endif // ENABLE(MEDIA_STREAM)
+ #endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebCore/platform/mediarecorder/MediaRecorderPrivateAVFImpl.h
r255819 r255910

  #pragma once

- #if ENABLE(MEDIA_STREAM)
+ #if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

  #include "MediaRecorderPrivate.h"
…
  } // namespace WebCore

- #endif // ENABLE(MEDIA_STREAM)
+ #endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.h
r255819 r255910

  #pragma once

- #if ENABLE(MEDIA_STREAM)
+ #if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
+
+ #include "AudioStreamDescription.h"

  #include "SharedBuffer.h"
…
  #include <wtf/threads/BinarySemaphore.h>

+ #include <CoreAudio/CoreAudioTypes.h>
+ #include <CoreMedia/CMTime.h>
+
  typedef struct opaqueCMSampleBuffer *CMSampleBufferRef;
+ typedef const struct opaqueCMFormatDescription* CMFormatDescriptionRef;
+ typedef struct opaqueCMBufferQueueTriggerToken *CMBufferQueueTriggerToken;

  OBJC_CLASS AVAssetWriter;
  OBJC_CLASS AVAssetWriterInput;
+ OBJC_CLASS WebAVAssetWriterDelegate;

  namespace WTF {
…
  namespace WebCore {

+ class AudioSampleBufferCompressor;
  class AudioStreamDescription;
  class MediaStreamTrackPrivate;
  class PlatformAudioData;
+ class VideoSampleBufferCompressor;

- class WEBCORE_EXPORT MediaRecorderPrivateWriter : public ThreadSafeRefCounted<MediaRecorderPrivateWriter, WTF::DestructionThread::Main>, public CanMakeWeakPtr<MediaRecorderPrivateWriter> {
+ class WEBCORE_EXPORT MediaRecorderPrivateWriter : public ThreadSafeRefCounted<MediaRecorderPrivateWriter, WTF::DestructionThread::Main>, public CanMakeWeakPtr<MediaRecorderPrivateWriter, WeakPtrFactoryInitialization::Eager> {
  public:
-     static RefPtr<MediaRecorderPrivateWriter> create(const MediaStreamTrackPrivate* audioTrack, const MediaStreamTrackPrivate* videoTrack);
+     static RefPtr<MediaRecorderPrivateWriter> create(bool hasAudio, int width, int height);
      ~MediaRecorderPrivateWriter();

-     bool setupWriter();
-     bool setVideoInput(int width, int height);
-     bool setAudioInput();
      void appendVideoSampleBuffer(CMSampleBufferRef);
      void appendAudioSampleBuffer(const PlatformAudioData&, const AudioStreamDescription&, const WTF::MediaTime&, size_t);
…
      void fetchData(CompletionHandler<void(RefPtr<SharedBuffer>&&)>&&);

+     void appendData(const char*, size_t);
+     void appendData(Ref<SharedBuffer>&&);
+
  private:
-     MediaRecorderPrivateWriter(RetainPtr<AVAssetWriter>&&, String&& path);
+     MediaRecorderPrivateWriter(bool hasAudio, bool hasVideo);
      void clear();

-     RetainPtr<AVAssetWriter> m_writer;
-     RetainPtr<AVAssetWriterInput> m_videoInput;
-     RetainPtr<AVAssetWriterInput> m_audioInput;
+     bool initialize();

-     String m_path;
-     Lock m_videoLock;
-     Lock m_audioLock;
-     BinarySemaphore m_finishWritingSemaphore;
-     BinarySemaphore m_finishWritingAudioSemaphore;
-     BinarySemaphore m_finishWritingVideoSemaphore;
+     static void compressedVideoOutputBufferCallback(void*, CMBufferQueueTriggerToken);
+     static void compressedAudioOutputBufferCallback(void*, CMBufferQueueTriggerToken);
+
+     void startAssetWriter();
+     void appendCompressedSampleBuffers();
+
+     bool appendCompressedAudioSampleBuffer();
+     bool appendCompressedVideoSampleBuffer();
+
+     void processNewCompressedAudioSampleBuffers();
+     void processNewCompressedVideoSampleBuffers();
+
+     void flushCompressedSampleBuffers(CompletionHandler<void()>&&);
+     void appendEndOfVideoSampleDurationIfNeeded(CompletionHandler<void()>&&);
+
      bool m_hasStartedWriting { false };
      bool m_isStopped { false };
-     bool m_isFirstAudioSample { true };
-     dispatch_queue_t m_audioPullQueue;
-     dispatch_queue_t m_videoPullQueue;
-     Deque<RetainPtr<CMSampleBufferRef>> m_videoBufferPool;
-     Deque<RetainPtr<CMSampleBufferRef>> m_audioBufferPool;
+
+     RetainPtr<AVAssetWriter> m_writer;

      bool m_isStopping { false };
      RefPtr<SharedBuffer> m_data;
      CompletionHandler<void(RefPtr<SharedBuffer>&&)> m_fetchDataCompletionHandler;
+
+     bool m_hasAudio;
+     bool m_hasVideo;
+
+     RetainPtr<CMFormatDescriptionRef> m_audioFormatDescription;
+     std::unique_ptr<AudioSampleBufferCompressor> m_audioCompressor;
+     RetainPtr<AVAssetWriterInput> m_audioAssetWriterInput;
+
+     RetainPtr<CMFormatDescriptionRef> m_videoFormatDescription;
+     std::unique_ptr<VideoSampleBufferCompressor> m_videoCompressor;
+     RetainPtr<AVAssetWriterInput> m_videoAssetWriterInput;
+     CMTime m_lastVideoPresentationTime;
+     CMTime m_lastVideoDecodingTime;
+     bool m_hasEncodedVideoSamples { false };
+
+     RetainPtr<WebAVAssetWriterDelegate> m_writerDelegate;
  };

  } // namespace WebCore

- #endif // ENABLE(MEDIA_STREAM)
+ #endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm
r255819 r255910 27 27 #include "MediaRecorderPrivateWriterCocoa.h" 28 28 29 #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION) 30 29 #if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) 30 31 #include "AudioSampleBufferCompressor.h" 31 32 #include "AudioStreamDescription.h" 32 33 #include "Logging.h" 33 34 #include "MediaStreamTrackPrivate.h" 35 #include "VideoSampleBufferCompressor.h" 34 36 #include "WebAudioBufferList.h" 35 37 #include <AVFoundation/AVAssetWriter.h> 36 38 #include <AVFoundation/AVAssetWriterInput.h> 39 #include <AVFoundation/AVAssetWriter_Private.h> 40 #include <pal/avfoundation/MediaTimeAVFoundation.h> 37 41 #include <pal/cf/CoreMediaSoftLink.h> 42 #include <wtf/BlockPtr.h> 38 43 #include <wtf/CompletionHandler.h> 39 44 #include <wtf/FileSystem.h> 45 #include <wtf/cf/TypeCastsCF.h> 40 46 41 47 #import <pal/cocoa/AVFoundationSoftLink.h> 42 48 43 #undef AVEncoderBitRateKey 44 #define AVEncoderBitRateKey getAVEncoderBitRateKeyWithFallback() 45 #undef AVFormatIDKey 46 #define AVFormatIDKey getAVFormatIDKeyWithFallback() 47 #undef AVNumberOfChannelsKey 48 #define AVNumberOfChannelsKey getAVNumberOfChannelsKeyWithFallback() 49 #undef AVSampleRateKey 50 #define AVSampleRateKey getAVSampleRateKeyWithFallback() 49 @interface WebAVAssetWriterDelegate : NSObject <AVAssetWriterDelegate> { 50 WeakPtr<WebCore::MediaRecorderPrivateWriter> m_writer; 51 } 52 53 - (instancetype)initWithWriter:(WebCore::MediaRecorderPrivateWriter*)writer; 54 - (void)close; 55 56 @end 57 58 @implementation WebAVAssetWriterDelegate { 59 }; 60 61 - (instancetype)initWithWriter:(WebCore::MediaRecorderPrivateWriter*)writer 62 { 63 ASSERT(isMainThread()); 64 self = [super init]; 65 if (self) 66 self->m_writer = makeWeakPtr(writer); 67 68 return self; 69 } 70 71 - (void)assetWriter:(AVAssetWriter *)assetWriter didProduceFragmentedHeaderData:(NSData *)fragmentedHeaderData 72 { 73 UNUSED_PARAM(assetWriter); 74 if (!isMainThread()) { 75 if (auto size = [fragmentedHeaderData length]) { 76 
callOnMainThread([protectedSelf = RetainPtr<WebAVAssetWriterDelegate>(self), buffer = WebCore::SharedBuffer::create(static_cast<const char*>([fragmentedHeaderData bytes]), size)]() mutable { 77 if (protectedSelf->m_writer) 78 protectedSelf->m_writer->appendData(WTFMove(buffer)); 79 }); 80 } 81 return; 82 } 83 84 if (m_writer) 85 m_writer->appendData(static_cast<const char*>([fragmentedHeaderData bytes]), [fragmentedHeaderData length]); 86 } 87 88 - (void)assetWriter:(AVAssetWriter *)assetWriter didProduceFragmentedMediaData:(NSData *)fragmentedMediaData fragmentedMediaDataReport:(AVFragmentedMediaDataReport *)fragmentedMediaDataReport 89 { 90 UNUSED_PARAM(assetWriter); 91 UNUSED_PARAM(fragmentedMediaDataReport); 92 if (!isMainThread()) { 93 if (auto size = [fragmentedMediaData length]) { 94 callOnMainThread([protectedSelf = RetainPtr<WebAVAssetWriterDelegate>(self), buffer = WebCore::SharedBuffer::create(static_cast<const char*>([fragmentedMediaData bytes]), size)]() mutable { 95 if (protectedSelf->m_writer) 96 protectedSelf->m_writer->appendData(WTFMove(buffer)); 97 }); 98 } 99 return; 100 } 101 102 if (m_writer) 103 m_writer->appendData(static_cast<const char*>([fragmentedMediaData bytes]), [fragmentedMediaData length]); 104 } 105 106 - (void)close 107 { 108 m_writer = nullptr; 109 } 110 111 @end 51 112 52 113 namespace WebCore { … … 54 115 using namespace PAL; 55 116 56 static NSString *getAVFormatIDKeyWithFallback()57 {58 if (PAL::canLoad_AVFoundation_AVFormatIDKey())59 return PAL::get_AVFoundation_AVFormatIDKey();60 61 RELEASE_LOG_ERROR(Media, "Failed to load AVFormatIDKey");62 return @"AVFormatIDKey";63 }64 65 static NSString *getAVNumberOfChannelsKeyWithFallback()66 {67 if (PAL::canLoad_AVFoundation_AVNumberOfChannelsKey())68 return PAL::get_AVFoundation_AVNumberOfChannelsKey();69 70 RELEASE_LOG_ERROR(Media, "Failed to load AVNumberOfChannelsKey");71 return @"AVNumberOfChannelsKey";72 }73 74 static NSString *getAVSampleRateKeyWithFallback()75 {76 if 
(PAL::canLoad_AVFoundation_AVSampleRateKey())77 return PAL::get_AVFoundation_AVSampleRateKey();78 79 RELEASE_LOG_ERROR(Media, "Failed to load AVSampleRateKey");80 return @"AVSampleRateKey";81 }82 83 static NSString *getAVEncoderBitRateKeyWithFallback()84 {85 if (PAL::canLoad_AVFoundation_AVEncoderBitRateKey())86 return PAL::get_AVFoundation_AVEncoderBitRateKey();87 88 RELEASE_LOG_ERROR(Media, "Failed to load AVEncoderBitRateKey");89 return @"AVEncoderBitRateKey";90 }91 92 RefPtr<MediaRecorderPrivateWriter> MediaRecorderPrivateWriter::create(const MediaStreamTrackPrivate* audioTrack, const MediaStreamTrackPrivate* videoTrack)93 {94 int width = 0, height = 0;95 if (videoTrack) {96 auto& settings = videoTrack->settings();97 width = settings.width();98 height = settings.height();99 }100 return create(!!audioTrack, width, height);101 }102 103 117 RefPtr<MediaRecorderPrivateWriter> MediaRecorderPrivateWriter::create(bool hasAudio, int width, int height) 104 118 { 105 NSString *directory = FileSystem::createTemporaryDirectory(@"videos"); 106 NSString *filename = [NSString stringWithFormat:@"/%lld.mp4", CMClockGetTime(CMClockGetHostTimeClock()).value]; 107 NSString *path = [directory stringByAppendingString:filename]; 108 109 NSURL *outputURL = [NSURL fileURLWithPath:path]; 110 String filePath = [path UTF8String]; 119 auto writer = adoptRef(*new MediaRecorderPrivateWriter(hasAudio, width && height)); 120 if (!writer->initialize()) 121 return nullptr; 122 return writer; 123 } 124 125 void MediaRecorderPrivateWriter::compressedVideoOutputBufferCallback(void *mediaRecorderPrivateWriter, CMBufferQueueTriggerToken) 126 { 127 auto *writer = static_cast<MediaRecorderPrivateWriter*>(mediaRecorderPrivateWriter); 128 writer->processNewCompressedVideoSampleBuffers(); 129 } 130 131 void MediaRecorderPrivateWriter::compressedAudioOutputBufferCallback(void *mediaRecorderPrivateWriter, CMBufferQueueTriggerToken) 132 { 133 auto *writer = 
static_cast<MediaRecorderPrivateWriter*>(mediaRecorderPrivateWriter); 134 writer->processNewCompressedAudioSampleBuffers(); 135 } 136 137 MediaRecorderPrivateWriter::MediaRecorderPrivateWriter(bool hasAudio, bool hasVideo) 138 : m_hasAudio(hasAudio) 139 , m_hasVideo(hasVideo) 140 { 141 } 142 143 MediaRecorderPrivateWriter::~MediaRecorderPrivateWriter() 144 { 145 clear(); 146 } 147 148 bool MediaRecorderPrivateWriter::initialize() 149 { 111 150 NSError *error = nil; 112 auto avAssetWriter = adoptNS([PAL::allocAVAssetWriterInstance() initWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error]);151 m_writer = adoptNS([PAL::allocAVAssetWriterInstance() initWithFileType:AVFileTypeMPEG4 error:&error]); 113 152 if (error) { 114 153 RELEASE_LOG_ERROR(MediaStream, "create AVAssetWriter instance failed with error code %ld", (long)error.code); 115 return nullptr; 116 } 117 118 auto writer = adoptRef(*new MediaRecorderPrivateWriter(WTFMove(avAssetWriter), WTFMove(filePath))); 119 120 if (hasAudio && !writer->setAudioInput()) 121 return nullptr; 122 123 if (width && height) { 124 if (!writer->setVideoInput(width, height)) 125 return nullptr; 126 } 127 128 return WTFMove(writer); 129 } 130 131 MediaRecorderPrivateWriter::MediaRecorderPrivateWriter(RetainPtr<AVAssetWriter>&& avAssetWriter, String&& filePath) 132 : m_writer(WTFMove(avAssetWriter)) 133 , m_path(WTFMove(filePath)) 134 { 135 } 136 137 MediaRecorderPrivateWriter::~MediaRecorderPrivateWriter() 138 { 139 clear(); 154 return false; 155 } 156 157 m_writerDelegate = adoptNS([[WebAVAssetWriterDelegate alloc] initWithWriter: this]); 158 [m_writer.get() setDelegate:m_writerDelegate.get()]; 159 160 if (m_hasAudio) { 161 m_audioCompressor = AudioSampleBufferCompressor::create(compressedAudioOutputBufferCallback, this); 162 if (!m_audioCompressor) 163 return false; 164 } 165 if (m_hasVideo) { 166 m_videoCompressor = VideoSampleBufferCompressor::create(kCMVideoCodecType_H264, compressedVideoOutputBufferCallback, this); 167 if 
(!m_videoCompressor) 168 return false; 169 } 170 return true; 171 } 172 173 void MediaRecorderPrivateWriter::processNewCompressedVideoSampleBuffers() 174 { 175 ASSERT(m_hasVideo); 176 if (!m_videoFormatDescription) { 177 m_videoFormatDescription = CMSampleBufferGetFormatDescription(m_videoCompressor->getOutputSampleBuffer()); 178 callOnMainThread([weakThis = makeWeakPtr(this), this] { 179 if (!weakThis) 180 return; 181 182 if (m_hasAudio && !m_audioFormatDescription) 183 return; 184 185 startAssetWriter(); 186 }); 187 } 188 if (!m_hasStartedWriting) 189 return; 190 appendCompressedSampleBuffers(); 191 } 192 193 void MediaRecorderPrivateWriter::processNewCompressedAudioSampleBuffers() 194 { 195 ASSERT(m_hasAudio); 196 if (!m_audioFormatDescription) { 197 m_audioFormatDescription = CMSampleBufferGetFormatDescription(m_audioCompressor->getOutputSampleBuffer()); 198 callOnMainThread([weakThis = makeWeakPtr(this), this] { 199 if (!weakThis) 200 return; 201 202 if (m_hasVideo && !m_videoFormatDescription) 203 return; 204 205 startAssetWriter(); 206 }); 207 } 208 if (!m_hasStartedWriting) 209 return; 210 appendCompressedSampleBuffers(); 211 } 212 213 void MediaRecorderPrivateWriter::startAssetWriter() 214 { 215 if (m_hasVideo) { 216 m_videoAssetWriterInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeVideo outputSettings:nil sourceFormatHint:m_videoFormatDescription.get()]); 217 [m_videoAssetWriterInput setExpectsMediaDataInRealTime:true]; 218 if (![m_writer.get() canAddInput:m_videoAssetWriterInput.get()]) { 219 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter::startAssetWriter failed canAddInput for video"); 220 return; 221 } 222 [m_writer.get() addInput:m_videoAssetWriterInput.get()]; 223 } 224 225 if (m_hasAudio) { 226 m_audioAssetWriterInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeAudio outputSettings:nil sourceFormatHint:m_audioFormatDescription.get()]); 227 [m_audioAssetWriterInput 
setExpectsMediaDataInRealTime:true]; 228 if (![m_writer.get() canAddInput:m_audioAssetWriterInput.get()]) { 229 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter::startAssetWriter failed canAddInput for audio"); 230 return; 231 } 232 [m_writer.get() addInput:m_audioAssetWriterInput.get()]; 233 } 234 235 if (![m_writer.get() startWriting]) { 236 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter::startAssetWriter failed startWriting"); 237 return; 238 } 239 240 [m_writer.get() startSessionAtSourceTime:kCMTimeZero]; 241 242 appendCompressedSampleBuffers(); 243 244 m_hasStartedWriting = true; 245 } 246 247 bool MediaRecorderPrivateWriter::appendCompressedAudioSampleBuffer() 248 { 249 if (!m_audioCompressor) 250 return false; 251 252 if (![m_audioAssetWriterInput isReadyForMoreMediaData]) 253 return false; 254 255 auto buffer = m_audioCompressor->takeOutputSampleBuffer(); 256 if (!buffer) 257 return false; 258 259 [m_audioAssetWriterInput.get() appendSampleBuffer:buffer.get()]; 260 return true; 261 } 262 263 bool MediaRecorderPrivateWriter::appendCompressedVideoSampleBuffer() 264 { 265 if (!m_videoCompressor) 266 return false; 267 268 if (![m_videoAssetWriterInput isReadyForMoreMediaData]) 269 return false; 270 271 auto buffer = m_videoCompressor->takeOutputSampleBuffer(); 272 if (!buffer) 273 return false; 274 275 m_lastVideoPresentationTime = CMSampleBufferGetPresentationTimeStamp(buffer.get()); 276 m_lastVideoDecodingTime = CMSampleBufferGetDecodeTimeStamp(buffer.get()); 277 m_hasEncodedVideoSamples = true; 278 279 [m_videoAssetWriterInput.get() appendSampleBuffer:buffer.get()]; 280 return true; 281 } 282 283 void MediaRecorderPrivateWriter::appendCompressedSampleBuffers() 284 { 285 while (appendCompressedVideoSampleBuffer() || appendCompressedAudioSampleBuffer()) { }; 286 } 287 288 static inline void appendEndsPreviousSampleDurationMarker(AVAssetWriterInput *assetWriterInput, CMTime presentationTimeStamp, CMTime decodingTimeStamp) 289 { 290 
CMSampleTimingInfo timingInfo = { kCMTimeInvalid, presentationTimeStamp, decodingTimeStamp}; 291 292 CMSampleBufferRef buffer = NULL; 293 auto error = CMSampleBufferCreate(kCFAllocatorDefault, NULL, true, NULL, NULL, NULL, 0, 1, &timingInfo, 0, NULL, &buffer); 294 if (error) { 295 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter appendEndsPreviousSampleDurationMarker failed CMSampleBufferCreate with %d", error); 296 return; 297 } 298 auto sampleBuffer = adoptCF(buffer); 299 300 CMSetAttachment(sampleBuffer.get(), kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration, kCFBooleanTrue, kCMAttachmentMode_ShouldPropagate); 301 if (![assetWriterInput appendSampleBuffer:sampleBuffer.get()]) 302 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter appendSampleBuffer to writer input failed"); 303 } 304 305 void MediaRecorderPrivateWriter::appendEndOfVideoSampleDurationIfNeeded(CompletionHandler<void()>&& completionHandler) 306 { 307 if (!m_hasEncodedVideoSamples) { 308 completionHandler(); 309 return; 310 } 311 if ([m_videoAssetWriterInput isReadyForMoreMediaData]) { 312 appendEndsPreviousSampleDurationMarker(m_videoAssetWriterInput.get(), m_lastVideoPresentationTime, m_lastVideoDecodingTime); 313 completionHandler(); 314 return; 315 } 316 317 auto block = makeBlockPtr([this, weakThis = makeWeakPtr(this), completionHandler = WTFMove(completionHandler)]() mutable { 318 if (weakThis) { 319 appendEndsPreviousSampleDurationMarker(m_videoAssetWriterInput.get(), m_lastVideoPresentationTime, m_lastVideoDecodingTime); 320 [m_videoAssetWriterInput markAsFinished]; 321 } 322 completionHandler(); 323 }); 324 [m_videoAssetWriterInput requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:block.get()]; 325 } 326 327 void MediaRecorderPrivateWriter::flushCompressedSampleBuffers(CompletionHandler<void()>&& completionHandler) 328 { 329 appendCompressedSampleBuffers(); 330 appendEndOfVideoSampleDurationIfNeeded(WTFMove(completionHandler)); 140 331 } 141 332 
142 333 void MediaRecorderPrivateWriter::clear() 143 334 { 144 if (m_videoInput) {145 m_videoInput.clear();146 dispatch_release(m_videoPullQueue);147 }148 if (m_audioInput) {149 m_audioInput.clear();150 dispatch_release(m_audioPullQueue);151 }152 335 if (m_writer) 153 336 m_writer.clear(); … … 158 341 } 159 342 160 bool MediaRecorderPrivateWriter::setVideoInput(int width, int height)161 {162 ASSERT(!m_videoInput);163 164 NSDictionary *compressionProperties = @{165 AVVideoAverageBitRateKey : [NSNumber numberWithInt:width * height * 12],166 AVVideoExpectedSourceFrameRateKey : @(30),167 AVVideoMaxKeyFrameIntervalKey : @(120),168 AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel169 };170 171 NSDictionary *videoSettings = @{172 AVVideoCodecKey: AVVideoCodecH264,173 AVVideoWidthKey: [NSNumber numberWithInt:width],174 AVVideoHeightKey: [NSNumber numberWithInt:height],175 AVVideoCompressionPropertiesKey: compressionProperties176 };177 178 m_videoInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings sourceFormatHint:nil]);179 [m_videoInput setExpectsMediaDataInRealTime:true];180 181 if (![m_writer canAddInput:m_videoInput.get()]) {182 m_videoInput = nullptr;183 RELEASE_LOG_ERROR(MediaStream, "the video input is not allowed to add to the AVAssetWriter");184 return false;185 }186 [m_writer addInput:m_videoInput.get()];187 m_videoPullQueue = dispatch_queue_create("WebCoreVideoRecordingPullBufferQueue", DISPATCH_QUEUE_SERIAL);188 return true;189 }190 191 bool MediaRecorderPrivateWriter::setAudioInput()192 {193 ASSERT(!m_audioInput);194 195 NSDictionary *audioSettings = @{196 AVEncoderBitRateKey : @(28000),197 AVFormatIDKey : @(kAudioFormatMPEG4AAC),198 AVNumberOfChannelsKey : @(1),199 AVSampleRateKey : @(22050)200 };201 202 m_audioInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeAudio outputSettings:audioSettings sourceFormatHint:nil]);203 [m_audioInput 
setExpectsMediaDataInRealTime:true];204 205 if (![m_writer canAddInput:m_audioInput.get()]) {206 m_audioInput = nullptr;207 RELEASE_LOG_ERROR(MediaStream, "the audio input is not allowed to add to the AVAssetWriter");208 return false;209 }210 [m_writer addInput:m_audioInput.get()];211 m_audioPullQueue = dispatch_queue_create("WebCoreAudioRecordingPullBufferQueue", DISPATCH_QUEUE_SERIAL);212 return true;213 }214 343 215 344 static inline RetainPtr<CMSampleBufferRef> copySampleBufferWithCurrentTimeStamp(CMSampleBufferRef originalBuffer) … … 218 347 CMItemCount count = 0; 219 348 CMSampleBufferGetSampleTimingInfoArray(originalBuffer, 0, nil, &count); 220 349 221 350 Vector<CMSampleTimingInfo> timeInfo(count); 222 351 CMSampleBufferGetSampleTimingInfoArray(originalBuffer, count, timeInfo.data(), &count); 223 224 for ( CMItemCounti = 0; i < count; i++) {352 353 for (auto i = 0; i < count; i++) { 225 354 timeInfo[i].decodeTimeStamp = kCMTimeInvalid; 226 355 timeInfo[i].presentationTimeStamp = startTime; 227 356 } 228 357 229 358 CMSampleBufferRef newBuffer = nullptr; 230 auto error = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, originalBuffer, count, timeInfo.data(), &newBuffer); 231 if (error) 232 return nullptr; 359 if (auto error = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, originalBuffer, count, timeInfo.data(), &newBuffer)) { 360 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter CMSampleBufferCreateCopyWithNewTiming failed with %d", error); 361 return nullptr; 362 } 233 363 return adoptCF(newBuffer); 234 364 } … … 236 366 void MediaRecorderPrivateWriter::appendVideoSampleBuffer(CMSampleBufferRef sampleBuffer) 237 367 { 238 ASSERT(m_videoInput); 239 if (m_isStopped) 240 return; 241 242 if (!m_hasStartedWriting) { 243 if (![m_writer startWriting]) { 244 m_isStopped = true; 245 RELEASE_LOG_ERROR(MediaStream, "create AVAssetWriter instance failed with error code %ld", (long)[m_writer error]); 246 return; 247 } 248 [m_writer 
startSessionAtSourceTime:CMClockGetTime(CMClockGetHostTimeClock())]; 249 m_hasStartedWriting = true; 250 RefPtr<MediaRecorderPrivateWriter> protectedThis = this; 251 [m_videoInput requestMediaDataWhenReadyOnQueue:m_videoPullQueue usingBlock:[this, protectedThis = WTFMove(protectedThis)] { 252 do { 253 if (![m_videoInput isReadyForMoreMediaData]) 254 break; 255 auto locker = holdLock(m_videoLock); 256 if (m_videoBufferPool.isEmpty()) 257 break; 258 auto buffer = m_videoBufferPool.takeFirst(); 259 locker.unlockEarly(); 260 if (![m_videoInput appendSampleBuffer:buffer.get()]) 261 break; 262 } while (true); 263 if (m_isStopped && m_videoBufferPool.isEmpty()) { 264 [m_videoInput markAsFinished]; 265 m_finishWritingVideoSemaphore.signal(); 266 } 267 }]; 268 return; 269 } 270 auto bufferWithCurrentTime = copySampleBufferWithCurrentTimeStamp(sampleBuffer); 271 if (!bufferWithCurrentTime) 272 return; 273 274 auto locker = holdLock(m_videoLock); 275 m_videoBufferPool.append(WTFMove(bufferWithCurrentTime)); 368 // FIXME: We should not set the timestamps if they are already set. 
369 if (auto bufferWithCurrentTime = copySampleBufferWithCurrentTimeStamp(sampleBuffer)) 370 m_videoCompressor->addSampleBuffer(bufferWithCurrentTime.get()); 276 371 } 277 372 … … 281 376 CMFormatDescriptionRef format = nullptr; 282 377 auto error = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, basicDescription, 0, NULL, 0, NULL, NULL, &format); 283 if (error) 284 return nullptr; 378 if (error) { 379 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter CMAudioFormatDescriptionCreate failed with %d", error); 380 return nullptr; 381 } 285 382 return adoptCF(format); 286 383 } 287 384 288 static inline RetainPtr<CMSampleBufferRef> createAudioSampleBufferWithPacketDescriptions(CMFormatDescriptionRef format, size_t sampleCount) 289 { 290 CMTime startTime = CMClockGetTime(CMClockGetHostTimeClock()); 291 CMSampleBufferRef sampleBuffer = nullptr; 292 auto error = CMAudioSampleBufferCreateWithPacketDescriptions(kCFAllocatorDefault, NULL, false, NULL, NULL, format, sampleCount, startTime, NULL, &sampleBuffer); 293 if (error) 294 return nullptr; 295 return adoptCF(sampleBuffer); 296 } 297 298 void MediaRecorderPrivateWriter::appendAudioSampleBuffer(const PlatformAudioData& data, const AudioStreamDescription& description, const WTF::MediaTime&, size_t sampleCount) 299 { 300 ASSERT(m_audioInput); 301 if ((!m_hasStartedWriting && m_videoInput) || m_isStopped) 302 return; 385 static inline RetainPtr<CMSampleBufferRef> createAudioSampleBuffer(const PlatformAudioData& data, const AudioStreamDescription& description, const WTF::MediaTime& time, size_t sampleCount) 386 { 303 387 auto format = createAudioFormatDescription(description); 304 388 if (!format) 305 return; 306 if (m_isFirstAudioSample) { 307 if (!m_videoInput) { 308 // audio-only recording. 
309 if (![m_writer startWriting]) { 310 m_isStopped = true; 311 return; 312 } 313 [m_writer startSessionAtSourceTime:CMClockGetTime(CMClockGetHostTimeClock())]; 314 m_hasStartedWriting = true; 315 } 316 m_isFirstAudioSample = false; 317 RefPtr<MediaRecorderPrivateWriter> protectedThis = this; 318 [m_audioInput requestMediaDataWhenReadyOnQueue:m_audioPullQueue usingBlock:[this, protectedThis = WTFMove(protectedThis)] { 319 do { 320 if (![m_audioInput isReadyForMoreMediaData]) 321 break; 322 auto locker = holdLock(m_audioLock); 323 if (m_audioBufferPool.isEmpty()) 324 break; 325 auto buffer = m_audioBufferPool.takeFirst(); 326 locker.unlockEarly(); 327 [m_audioInput appendSampleBuffer:buffer.get()]; 328 } while (true); 329 if (m_isStopped && m_audioBufferPool.isEmpty()) { 330 [m_audioInput markAsFinished]; 331 m_finishWritingAudioSemaphore.signal(); 332 } 389 return nullptr; 390 391 CMSampleBufferRef sampleBuffer = nullptr; 392 auto error = CMAudioSampleBufferCreateWithPacketDescriptions(kCFAllocatorDefault, NULL, false, NULL, NULL, format.get(), sampleCount, toCMTime(time), NULL, &sampleBuffer); 393 if (error) { 394 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter createAudioSampleBufferWithPacketDescriptions failed with %d", error); 395 return nullptr; 396 } 397 auto buffer = adoptCF(sampleBuffer); 398 399 error = CMSampleBufferSetDataBufferFromAudioBufferList(buffer.get(), kCFAllocatorDefault, kCFAllocatorDefault, 0, downcast<WebAudioBufferList>(data).list()); 400 if (error) { 401 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter CMSampleBufferSetDataBufferFromAudioBufferList failed with %d", error); 402 return nullptr; 403 } 404 return buffer; 405 } 406 407 void MediaRecorderPrivateWriter::appendAudioSampleBuffer(const PlatformAudioData& data, const AudioStreamDescription& description, const WTF::MediaTime& time, size_t sampleCount) 408 { 409 if (auto sampleBuffer = createAudioSampleBuffer(data, description, time, sampleCount)) 410 
m_audioCompressor->addSampleBuffer(sampleBuffer.get()); 411 } 412 413 void MediaRecorderPrivateWriter::stopRecording() 414 { 415 if (m_isStopped) 416 return; 417 418 m_isStopped = true; 419 420 if (m_videoCompressor) 421 m_videoCompressor->finish(); 422 if (m_audioCompressor) 423 m_audioCompressor->finish(); 424 425 if (!m_hasStartedWriting) 426 return; 427 ASSERT([m_writer status] == AVAssetWriterStatusWriting); 428 429 m_isStopping = true; 430 431 flushCompressedSampleBuffers([this, weakThis = makeWeakPtr(this)]() mutable { 432 if (!weakThis) 433 return; 434 435 [m_writer flush]; 436 [m_writer finishWritingWithCompletionHandler:[this, weakThis = WTFMove(weakThis)]() mutable { 437 callOnMainThread([this, weakThis = WTFMove(weakThis)]() mutable { 438 if (!weakThis) 439 return; 440 441 m_isStopping = false; 442 if (m_fetchDataCompletionHandler) { 443 auto buffer = WTFMove(m_data); 444 m_fetchDataCompletionHandler(WTFMove(buffer)); 445 } 446 447 m_isStopped = false; 448 m_hasStartedWriting = false; 449 clear(); 450 }); 333 451 }]; 334 } 335 336 auto sampleBuffer = createAudioSampleBufferWithPacketDescriptions(format.get(), sampleCount); 337 if (!sampleBuffer) 338 return; 339 auto error = CMSampleBufferSetDataBufferFromAudioBufferList(sampleBuffer.get(), kCFAllocatorDefault, kCFAllocatorDefault, 0, downcast<WebAudioBufferList>(data).list()); 340 if (error) 341 return; 342 343 auto locker = holdLock(m_audioLock); 344 m_audioBufferPool.append(WTFMove(sampleBuffer)); 345 } 346 347 void MediaRecorderPrivateWriter::stopRecording() 348 { 349 if (m_isStopped) 350 return; 351 352 m_isStopped = true; 353 if (!m_hasStartedWriting) 354 return; 355 ASSERT([m_writer status] == AVAssetWriterStatusWriting); 356 if (m_videoInput) 357 m_finishWritingVideoSemaphore.wait(); 358 359 if (m_audioInput) 360 m_finishWritingAudioSemaphore.wait(); 361 362 m_isStopping = true; 363 [m_writer finishWritingWithCompletionHandler:[this, weakPtr = makeWeakPtr(*this)]() mutable { 364 
callOnMainThread([this, weakPtr = WTFMove(weakPtr), buffer = SharedBuffer::createWithContentsOfFile(m_path)]() mutable { 365 if (!weakPtr) 366 return; 367 368 m_isStopping = false; 369 if (m_fetchDataCompletionHandler) 370 m_fetchDataCompletionHandler(WTFMove(buffer)); 371 else 372 m_data = WTFMove(buffer); 373 374 m_isStopped = false; 375 m_hasStartedWriting = false; 376 m_isFirstAudioSample = true; 377 clear(); 378 }); 379 m_finishWritingSemaphore.signal(); 380 }]; 381 m_finishWritingSemaphore.wait(); 452 }); 382 453 } 383 454 … … 393 464 } 394 465 466 void MediaRecorderPrivateWriter::appendData(const char* data, size_t size) 467 { 468 if (!m_data) { 469 m_data = SharedBuffer::create(data, size); 470 return; 471 } 472 m_data->append(data, size); 473 } 474 475 void MediaRecorderPrivateWriter::appendData(Ref<SharedBuffer>&& buffer) 476 { 477 if (!m_data) { 478 m_data = WTFMove(buffer); 479 return; 480 } 481 m_data->append(WTFMove(buffer)); 482 } 483 395 484 } // namespace WebCore 396 485 397 #endif // ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)486 #endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) -
trunk/Source/WebCore/platform/mediarecorder/cocoa/VideoSampleBufferCompressor.h
r255909 r255910 1 1 /* 2 * Copyright (C) 20 18Apple Inc. All rights reserved.2 * Copyright (C) 2020 Apple Inc. All rights reserved. 3 3 * 4 4 * Redistribution and use in source and binary forms, with or without … … 25 25 #pragma once 26 26 27 #if ENABLE(MEDIA_STREAM) 27 #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION) 28 28 29 #include "MediaRecorderPrivate.h" 30 #include "MediaRecorderPrivateWriterCocoa.h" 29 #include <CoreMedia/CoreMedia.h> 30 #include <VideoToolbox/VTErrors.h> 31 32 typedef struct opaqueCMSampleBuffer *CMSampleBufferRef; 33 typedef struct OpaqueVTCompressionSession *VTCompressionSessionRef; 31 34 32 35 namespace WebCore { 33 36 34 class MediaStreamPrivate; 35 36 class MediaRecorderPrivateAVFImpl final : public MediaRecorderPrivate { 37 class VideoSampleBufferCompressor { 37 38 WTF_MAKE_FAST_ALLOCATED; 38 39 public: 39 static std::unique_ptr<MediaRecorderPrivateAVFImpl> create(const MediaStreamPrivate&); 40 static std::unique_ptr<VideoSampleBufferCompressor> create(CMVideoCodecType, CMBufferQueueTriggerCallback, void* callbackObject); 41 ~VideoSampleBufferCompressor(); 42 43 void finish(); 44 void addSampleBuffer(CMSampleBufferRef); 45 CMSampleBufferRef getOutputSampleBuffer(); 46 RetainPtr<CMSampleBufferRef> takeOutputSampleBuffer(); 40 47 41 48 private: 42 MediaRecorderPrivateAVFImpl(Ref<MediaRecorderPrivateWriter>&&, String&& audioTrackId, String&& videoTrackId);49 explicit VideoSampleBufferCompressor(CMVideoCodecType); 43 50 44 friend std::unique_ptr<MediaRecorderPrivateAVFImpl> std::make_unique<MediaRecorderPrivateAVFImpl>(Ref<MediaRecorderPrivateWriter>&&, String&&, String&&);51 bool initialize(CMBufferQueueTriggerCallback, void* callbackObject); 45 52 46 void sampleBufferUpdated(const MediaStreamTrackPrivate&, MediaSample&) final; 47 void audioSamplesAvailable(const MediaStreamTrackPrivate&, const WTF::MediaTime&, const PlatformAudioData&, const AudioStreamDescription&, size_t) final; 48 void 
fetchData(CompletionHandler<void(RefPtr<SharedBuffer>&&, const String&)>&&) final; 49 const String& mimeType(); 50 void stopRecording(); 51 52 Ref<MediaRecorderPrivateWriter> m_writer; 53 String m_recordedAudioTrackID; 54 String m_recordedVideoTrackID; 53 void processSampleBuffer(CMSampleBufferRef); 54 bool initCompressionSession(CMVideoFormatDescriptionRef); 55 56 static void videoCompressionCallback(void *refCon, void*, OSStatus, VTEncodeInfoFlags, CMSampleBufferRef); 57 58 dispatch_queue_t m_serialDispatchQueue; 59 RetainPtr<CMBufferQueueRef> m_outputBufferQueue; 60 RetainPtr<VTCompressionSessionRef> m_vtSession; 61 62 bool m_isEncoding { false }; 63 64 CMVideoCodecType m_outputCodecType; 65 float m_maxKeyFrameIntervalDuration { 2.0 }; 66 unsigned m_expectedFrameRate { 30 }; 55 67 }; 56 68 57 } // namespace WebCore69 } 58 70 59 #endif // ENABLE(MEDIA_STREAM)71 #endif -
trunk/Source/WebKit/ChangeLog
r255909 r255910 1 2020-02-06 youenn fablet <youenn@apple.com> 2 3 [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder 4 https://bugs.webkit.org/show_bug.cgi?id=206582 5 <rdar://problem/58985368> 6 7 Reviewed by Eric Carlson. 8 9 Enable RemoteMediaRecorder only for systems supporting AVAssetWriterDelegate. 10 11 * GPUProcess/GPUConnectionToWebProcess.cpp: 12 (WebKit::GPUConnectionToWebProcess::didReceiveMessage): 13 * GPUProcess/GPUConnectionToWebProcess.h: 14 * GPUProcess/webrtc/RemoteMediaRecorder.cpp: 15 * GPUProcess/webrtc/RemoteMediaRecorder.h: 16 * GPUProcess/webrtc/RemoteMediaRecorder.messages.in: 17 * GPUProcess/webrtc/RemoteMediaRecorderManager.cpp: 18 * GPUProcess/webrtc/RemoteMediaRecorderManager.h: 19 * GPUProcess/webrtc/RemoteMediaRecorderManager.messages.in: 20 * GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.h: 21 * WebProcess/GPU/webrtc/MediaRecorderPrivate.cpp: 22 * WebProcess/GPU/webrtc/MediaRecorderPrivate.h: 23 * WebProcess/GPU/webrtc/MediaRecorderProvider.cpp: 24 (WebKit::MediaRecorderProvider::createMediaRecorderPrivate): 25 1 26 2020-02-06 youenn fablet <youenn@apple.com> 2 27 -
trunk/Source/WebKit/GPUProcess/GPUConnectionToWebProcess.cpp
r255819 r255910
 }

+#if HAVE(AVASSETWRITERDELEGATE)
 RemoteMediaRecorderManager& GPUConnectionToWebProcess::mediaRecorderManager()
 {
…
     return *m_remoteMediaRecorderManager;
 }
+#endif

 #if ENABLE(VIDEO_TRACK)
…
 }
 #endif
-#endif
+#endif // PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)

 #if PLATFORM(COCOA) && USE(LIBWEBRTC)
…
         return;
     }
+#if HAVE(AVASSETWRITERDELEGATE)
     if (decoder.messageReceiverName() == Messages::RemoteMediaRecorderManager::messageReceiverName()) {
         mediaRecorderManager().didReceiveMessageFromWebProcess(connection, decoder);
…
         return;
     }
+#endif // HAVE(AVASSETWRITERDELEGATE)
 #if PLATFORM(COCOA) && ENABLE(VIDEO_TRACK)
     if (decoder.messageReceiverName() == Messages::RemoteAudioMediaStreamTrackRendererManager::messageReceiverName()) {
…
         return;
     }
-#endif
-#endif
+#endif // PLATFORM(COCOA) && ENABLE(VIDEO_TRACK)
+#endif // ENABLE(MEDIA_STREAM)
 #if PLATFORM(COCOA) && USE(LIBWEBRTC)
     if (decoder.messageReceiverName() == Messages::LibWebRTCCodecsProxy::messageReceiverName()) {
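The `didReceiveMessage` change above routes each incoming IPC message to a receiver by name, with the MediaRecorder branch compiled in only under `HAVE(AVASSETWRITERDELEGATE)`. A minimal, framework-free sketch of that dispatch-with-feature-guard shape (the names here are illustrative, not WebKit's actual classes):

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>

// Stand-in for a platform feature guard; in WebKit this would be
// HAVE(AVASSETWRITERDELEGATE) decided at build time.
#define HAVE_AVASSETWRITERDELEGATE 1

// Messages are dispatched to receivers registered by name; a receiver
// compiled out by a feature guard simply never registers, so its
// messages fall through as unhandled.
struct Dispatcher {
    std::map<std::string, std::function<void()>> receivers;

    void add(const std::string& name, std::function<void()> fn)
    {
        receivers[name] = std::move(fn);
    }

    bool dispatch(const std::string& name) const
    {
        auto it = receivers.find(name);
        if (it == receivers.end())
            return false; // unhandled, as when the feature is disabled
        it->second();
        return true;
    }
};

Dispatcher makeDispatcher()
{
    Dispatcher d;
#if HAVE_AVASSETWRITERDELEGATE
    d.add("RemoteMediaRecorderManager", [] { /* forward to the manager */ });
#endif
    return d;
}
```

Guarding registration rather than scattering checks at every call site keeps the unhandled-message path uniform when a feature is unavailable.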
trunk/Source/WebKit/GPUProcess/GPUConnectionToWebProcess.h
r255819 r255910
 #if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
     UserMediaCaptureManagerProxy& userMediaCaptureManagerProxy();
+#if HAVE(AVASSETWRITERDELEGATE)
     RemoteMediaRecorderManager& mediaRecorderManager();
+#endif
 #if ENABLE(VIDEO_TRACK)
     RemoteAudioMediaStreamTrackRendererManager& audioTrackRendererManager();
…
 #if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
     std::unique_ptr<UserMediaCaptureManagerProxy> m_userMediaCaptureManagerProxy;
+#if HAVE(AVASSETWRITERDELEGATE)
     std::unique_ptr<RemoteMediaRecorderManager> m_remoteMediaRecorderManager;
+#endif
 #if ENABLE(VIDEO_TRACK)
     std::unique_ptr<RemoteAudioMediaStreamTrackRendererManager> m_audioTrackRendererManager;
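The header pairs a `std::unique_ptr` member with an accessor (`mediaRecorderManager()`) that, as the .cpp diff shows, creates the manager on first use and then returns the cached instance. A minimal sketch of that lazily-created-manager pattern, with illustrative names in place of the WebKit types:

```cpp
#include <cassert>
#include <memory>

// Illustrative stand-in for RemoteMediaRecorderManager.
struct RemoteManager {
    int messagesHandled { 0 };
};

class Connection {
public:
    // Create the manager the first time it is needed; every later call
    // returns the same instance, so per-connection state is preserved.
    RemoteManager& manager()
    {
        if (!m_manager)
            m_manager = std::make_unique<RemoteManager>();
        return *m_manager;
    }

private:
    std::unique_ptr<RemoteManager> m_manager;
};
```

Lazy creation means a connection that never records pays nothing for the manager, which matters when the feature itself may be compiled out or unused.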
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorder.cpp
r255819 r255910
 #include "RemoteMediaRecorder.h"

-#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

 #include "SharedRingBufferStorage.h"
…
 }

-#endif
+#endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorder.h
r255819 r255910
 #pragma once

-#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

 #include "MediaRecorderIdentifier.h"
…
 }

-#endif
+#endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorder.messages.in
r255819 r255910
 # THE POSSIBILITY OF SUCH DAMAGE.

-#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

 messages -> RemoteMediaRecorder NotRefCounted {
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorderManager.cpp
r255819 r255910
 #include "RemoteMediaRecorderManager.h"

-#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

 #include "DataReference.h"
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorderManager.h
r255819 r255910
 #pragma once

-#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

 #include "MediaRecorderIdentifier.h"
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorderManager.messages.in
r255819 r255910
 # THE POSSIBILITY OF SUCH DAMAGE.

-#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

 messages -> RemoteMediaRecorderManager NotRefCounted {
trunk/Source/WebKit/GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.h
r255819 r255910
 #include "RemoteSampleBufferDisplayLayerManagerMessagesReplies.h"
 #include "SampleBufferDisplayLayerIdentifier.h"
+#include <WebCore/IntSize.h>
 #include <wtf/HashMap.h>
trunk/Source/WebKit/WebProcess/GPU/webrtc/MediaRecorderPrivate.cpp
r255819 r255910
 #include "MediaRecorderPrivate.h"

-#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

 #include "GPUProcessConnection.h"
…
 }

-#endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+#endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebKit/WebProcess/GPU/webrtc/MediaRecorderPrivate.h
r255819 r255910
 #pragma once

-#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

 #include "MediaRecorderIdentifier.h"
…
 }

-#endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+#endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebKit/WebProcess/GPU/webrtc/MediaRecorderProvider.cpp
r255819 r255910
 std::unique_ptr<WebCore::MediaRecorderPrivate> MediaRecorderProvider::createMediaRecorderPrivate(const MediaStreamPrivate& stream)
 {
-#if ENABLE(GPU_PROCESS)
+#if ENABLE(GPU_PROCESS) && HAVE(AVASSETWRITERDELEGATE)
     if (m_useGPUProcess)
         return makeUnique<MediaRecorderPrivate>(stream);
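The factory above returns the GPU-process-backed recorder only when both guards are satisfied and the GPU process is in use; otherwise it falls through to the caller's default path. A compact sketch of that compile-time-gated factory shape, using illustrative names rather than the actual WebKit types:

```cpp
#include <cassert>
#include <memory>

// Stand-ins for the build-time guards; in WebKit these would be
// ENABLE(GPU_PROCESS) and HAVE(AVASSETWRITERDELEGATE).
#define ENABLE_GPU_PROCESS 1
#define HAVE_AVASSETWRITERDELEGATE 1

// Illustrative stand-in for the GPU-process recorder implementation.
struct RecorderPrivate { };

// Returns the out-of-process recorder when the feature is compiled in
// and requested at runtime; a null result tells the caller to use the
// in-process fallback instead.
std::unique_ptr<RecorderPrivate> createRecorder(bool useGPUProcess)
{
#if ENABLE_GPU_PROCESS && HAVE_AVASSETWRITERDELEGATE
    if (useGPUProcess)
        return std::make_unique<RecorderPrivate>();
#endif
    return nullptr;
}
```

Combining a compile-time guard with a runtime flag lets one binary serve systems where AVAssetWriterDelegate exists and systems where it does not, without duplicating the call sites.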