Changeset 262708 in WebKit
- Timestamp: Jun 8, 2020 4:49:15 AM
- Location: trunk
- Files: 7 added, 33 edited, 1 copied
trunk/LayoutTests/ChangeLog
2020-06-08  youenn fablet  <youenn@apple.com>

        [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
        https://bugs.webkit.org/show_bug.cgi?id=206582

        Reviewed by Eric Carlson.

        Disable the tests on all platforms except the ones supporting AVAssetWriterDelegate.

        * TestExpectations:
        * http/wpt/mediarecorder/MediaRecorder-AV-audio-video-dataavailable-gpuprocess.html:
        Remove web audio generation, since there seems to be some instability in the
        web audio -> stream -> media recorder path; this should be fixed in specific
        follow-up patches.
        * platform/mac/TestExpectations:
        Enable running the tests.

2020-06-08  Diego Pino Garcia  <dpino@igalia.com>
trunk/LayoutTests/TestExpectations
 webgl/1.0.3/conformance/extensions/webgl-draw-buffers.html [ Skip ]

-webkit.org/b/197673 http/wpt/mediarecorder/MediaRecorder-AV-audio-video-dataavailable.html [ Pass Failure Timeout ]
+# Not supported by default
+http/wpt/mediarecorder [ Skip ]
+imported/w3c/web-platform-tests/mediacapture-record [ Skip ]
+fast/history/page-cache-media-recorder.html [ Skip ]

 # WebGL 2 Conformance Suite rules for regular bots post ANGLE backend adoption.
trunk/LayoutTests/http/wpt/mediarecorder/MediaRecorder-AV-audio-video-dataavailable-gpuprocess.html
 async_test(t => {
-    const ac = new AudioContext();
-    const osc = ac.createOscillator();
-    const dest = ac.createMediaStreamDestination();
-    const audio = dest.stream;
-    osc.connect(dest);
-
     const video = createVideoStream();
-    assert_equals(video.getAudioTracks().length, 0, "video mediastream starts with no audio track");
-    assert_equals(audio.getAudioTracks().length, 1, "audio mediastream starts with one audio track");
-    video.addTrack(audio.getAudioTracks()[0]);
-    assert_equals(video.getAudioTracks().length, 1, "video mediastream starts with one audio track");
     const recorder = new MediaRecorder(video);
     let mode = 0;
trunk/LayoutTests/platform/mac/TestExpectations
 [ Catalina+ ] fast/text/design-system-ui-16.html [ Pass ]

+[ Catalina+ ] http/wpt/mediarecorder [ Pass Failure ]
+[ Catalina+ ] imported/w3c/web-platform-tests/mediacapture-record [ Pass Failure ]
+[ Catalina+ ] fast/history/page-cache-media-recorder.html [ Pass Failure ]
+
 webkit.org/b/200128 imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html [ Timeout Pass ]
trunk/Source/WTF/ChangeLog
2020-06-08  youenn fablet  <youenn@apple.com>

        [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
        https://bugs.webkit.org/show_bug.cgi?id=206582

        Reviewed by Eric Carlson.

        * wtf/PlatformHave.h:

2020-06-07  Andy Estes  <aestes@apple.com>
trunk/Source/WTF/wtf/PlatformHave.h
 #endif

+#if ((PLATFORM(MAC) && __MAC_OS_X_VERSION_MIN_REQUIRED >= 101500) || PLATFORM(IOS)) && (defined __has_include && __has_include(<AVFoundation/AVAssetWriter_Private.h>))
+#define HAVE_AVASSETWRITERDELEGATE 1
+#endif
+
 #if PLATFORM(IOS_FAMILY) && !PLATFORM(WATCHOS) && !PLATFORM(APPLETV)
 #define HAVE_SYSTEM_FONT_STYLE_TITLE_0 1
trunk/Source/WebCore/ChangeLog
2020-06-08  youenn fablet  <youenn@apple.com>

        [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
        https://bugs.webkit.org/show_bug.cgi?id=206582
        <rdar://problem/58985368>

        Reviewed by Eric Carlson.

        AVAssetWriterDelegate allows grabbing recorded data on demand. This delegate
        requires passing compressed samples to AVAssetWriter. Implement video and audio
        encoding in dedicated classes, AudioSampleBufferCompressor and
        VideoSampleBufferCompressor, and use them before adding buffers to AVAssetWriter.
        They support AAC and H.264 so far and should be further improved to support
        more encoding options.

        Instantiate the real writer only on platforms supporting AVAssetWriterDelegate,
        since it is not supported everywhere. The writer, which does the packaging,
        receives compressed buffers from the audio/video compressors. When requested to
        flush, it sends its data to the delegate, which forwards it to the
        MediaRecorderPrivateWriter. The MediaRecorderPrivateWriter stores the data in a
        SharedBuffer until MediaRecorder asks for it.

        Note that whenever we request data, we flush the writer and insert an
        end-of-video sample to make sure video data gets flushed. Therefore data should
        not be requested too frequently, to keep video compression adequate.

        Covered by existing tests.

        * Modules/mediarecorder/MediaRecorderProvider.cpp:
        (WebCore::MediaRecorderProvider::createMediaRecorderPrivate):
        * WebCore.xcodeproj/project.pbxproj:
        * platform/mediarecorder/MediaRecorderPrivateAVFImpl.cpp:
        (WebCore::MediaRecorderPrivateAVFImpl::create):
        * platform/mediarecorder/MediaRecorderPrivateAVFImpl.h:
        * platform/mediarecorder/cocoa/AudioSampleBufferCompressor.h: Added.
        * platform/mediarecorder/cocoa/AudioSampleBufferCompressor.mm: Added.
        (WebCore::AudioSampleBufferCompressor::create):
        (WebCore::AudioSampleBufferCompressor::AudioSampleBufferCompressor):
        (WebCore::AudioSampleBufferCompressor::~AudioSampleBufferCompressor):
        (WebCore::AudioSampleBufferCompressor::initialize):
        (WebCore::AudioSampleBufferCompressor::finish):
        (WebCore::AudioSampleBufferCompressor::initAudioConverterForSourceFormatDescription):
        (WebCore::AudioSampleBufferCompressor::computeBufferSizeForAudioFormat):
        (WebCore::AudioSampleBufferCompressor::attachPrimingTrimsIfNeeded):
        (WebCore::AudioSampleBufferCompressor::gradualDecoderRefreshCount):
        (WebCore::AudioSampleBufferCompressor::sampleBufferWithNumPackets):
        (WebCore::AudioSampleBufferCompressor::audioConverterComplexInputDataProc):
        (WebCore::AudioSampleBufferCompressor::provideSourceDataNumOutputPackets):
        (WebCore::AudioSampleBufferCompressor::processSampleBuffersUntilLowWaterTime):
        (WebCore::AudioSampleBufferCompressor::processSampleBuffer):
        (WebCore::AudioSampleBufferCompressor::addSampleBuffer):
        (WebCore::AudioSampleBufferCompressor::getOutputSampleBuffer):
        (WebCore::AudioSampleBufferCompressor::takeOutputSampleBuffer):
        * platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.h:
        * platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm:
        (-[WebAVAssetWriterDelegate initWithWriter:]):
        (-[WebAVAssetWriterDelegate assetWriter:didProduceFragmentedHeaderData:]):
        (-[WebAVAssetWriterDelegate assetWriter:didProduceFragmentedMediaData:fragmentedMediaDataReport:]):
        (-[WebAVAssetWriterDelegate close]):
        (WebCore::MediaRecorderPrivateWriter::create):
        (WebCore::MediaRecorderPrivateWriter::compressedVideoOutputBufferCallback):
        (WebCore::MediaRecorderPrivateWriter::compressedAudioOutputBufferCallback):
        (WebCore::MediaRecorderPrivateWriter::MediaRecorderPrivateWriter):
        (WebCore::MediaRecorderPrivateWriter::~MediaRecorderPrivateWriter):
        (WebCore::MediaRecorderPrivateWriter::initialize):
        (WebCore::MediaRecorderPrivateWriter::processNewCompressedVideoSampleBuffers):
        (WebCore::MediaRecorderPrivateWriter::processNewCompressedAudioSampleBuffers):
        (WebCore::MediaRecorderPrivateWriter::startAssetWriter):
        (WebCore::MediaRecorderPrivateWriter::appendCompressedAudioSampleBuffer):
        (WebCore::MediaRecorderPrivateWriter::appendCompressedVideoSampleBuffer):
        (WebCore::MediaRecorderPrivateWriter::appendCompressedSampleBuffers):
        (WebCore::appendEndsPreviousSampleDurationMarker):
        (WebCore::MediaRecorderPrivateWriter::appendEndOfVideoSampleDurationIfNeeded):
        (WebCore::MediaRecorderPrivateWriter::flushCompressedSampleBuffers):
        (WebCore::MediaRecorderPrivateWriter::clear):
        (WebCore::copySampleBufferWithCurrentTimeStamp):
        (WebCore::MediaRecorderPrivateWriter::appendVideoSampleBuffer):
        (WebCore::createAudioFormatDescription):
        (WebCore::createAudioSampleBuffer):
        (WebCore::MediaRecorderPrivateWriter::appendAudioSampleBuffer):
        (WebCore::MediaRecorderPrivateWriter::stopRecording):
        (WebCore::MediaRecorderPrivateWriter::appendData):
        * platform/mediarecorder/cocoa/VideoSampleBufferCompressor.h: Copied from Source/WebCore/platform/mediarecorder/MediaRecorderPrivateAVFImpl.h.
        * platform/mediarecorder/cocoa/VideoSampleBufferCompressor.mm: Added.
        (WebCore::VideoSampleBufferCompressor::create):
        (WebCore::VideoSampleBufferCompressor::VideoSampleBufferCompressor):
        (WebCore::VideoSampleBufferCompressor::~VideoSampleBufferCompressor):
        (WebCore::VideoSampleBufferCompressor::initialize):
        (WebCore::VideoSampleBufferCompressor::finish):
        (WebCore::VideoSampleBufferCompressor::videoCompressionCallback):
        (WebCore::VideoSampleBufferCompressor::initCompressionSession):
        (WebCore::VideoSampleBufferCompressor::processSampleBuffer):
        (WebCore::VideoSampleBufferCompressor::addSampleBuffer):
        (WebCore::VideoSampleBufferCompressor::getOutputSampleBuffer):
        (WebCore::VideoSampleBufferCompressor::takeOutputSampleBuffer):

2020-06-08  Youenn Fablet  <youenn@apple.com>
trunk/Source/WebCore/Modules/mediarecorder/MediaRecorderProvider.cpp
 std::unique_ptr<MediaRecorderPrivate> MediaRecorderProvider::createMediaRecorderPrivate(MediaStreamPrivate& stream)
 {
+#if HAVE(AVASSETWRITERDELEGATE)
     return MediaRecorderPrivateAVFImpl::create(stream);
+#else
+    UNUSED_PARAM(stream);
+    return nullptr;
+#endif
 }
trunk/Source/WebCore/PAL/ChangeLog
2020-06-08  youenn fablet  <youenn@apple.com>

        [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
        https://bugs.webkit.org/show_bug.cgi?id=206582
        <rdar://problem/58985368>

        Reviewed by Eric Carlson.

        Add soft-link macros for VideoToolbox and AudioToolbox.

        * PAL.xcodeproj/project.pbxproj:
        * pal/cf/AudioToolboxSoftLink.cpp: Added.
        * pal/cf/AudioToolboxSoftLink.h: Added.
        * pal/cf/CoreMediaSoftLink.cpp:
        * pal/cf/CoreMediaSoftLink.h:
        * pal/cf/VideoToolboxSoftLink.cpp: Added.
        * pal/cf/VideoToolboxSoftLink.h: Added.

2020-06-07  Philippe Normand  <pnormand@igalia.com>
trunk/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj
 2E1342CD215AA10A007199D2 /* UIKitSoftLink.mm in Sources */ = {isa = PBXBuildFile; fileRef = 2E1342CB215AA10A007199D2 /* UIKitSoftLink.mm */; };
 31308B1420A21705003FB929 /* SystemPreviewSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 31308B1320A21705003FB929 /* SystemPreviewSPI.h */; };
+416E995323DAE6BE00E871CB /* AudioToolboxSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 416E995123DAE6BD00E871CB /* AudioToolboxSoftLink.cpp */; };
+416E995423DAE6BE00E871CB /* AudioToolboxSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = 416E995223DAE6BE00E871CB /* AudioToolboxSoftLink.h */; };
+41E1F344248A6A000022D5DE /* VideoToolboxSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 416E995523DAEFF700E871CB /* VideoToolboxSoftLink.cpp */; };
 442956CD218A72DF0080DB54 /* RevealSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 442956CC218A72DE0080DB54 /* RevealSPI.h */; };
 4450FC9F21F5F602004DFA56 /* QuickLookSoftLink.mm in Sources */ = {isa = PBXBuildFile; fileRef = 4450FC9D21F5F602004DFA56 /* QuickLookSoftLink.mm */; };
…
 31308B1320A21705003FB929 /* SystemPreviewSPI.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = SystemPreviewSPI.h; sourceTree = "<group>"; };
 37119A7820CCB5FF002C6DC9 /* WebKitTargetConditionals.xcconfig */ = {isa = PBXFileReference; lastKnownFileType = text.xcconfig; path = WebKitTargetConditionals.xcconfig; sourceTree = "<group>"; };
+416E995123DAE6BD00E871CB /* AudioToolboxSoftLink.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = AudioToolboxSoftLink.cpp; sourceTree = "<group>"; };
+416E995223DAE6BE00E871CB /* AudioToolboxSoftLink.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioToolboxSoftLink.h; sourceTree = "<group>"; };
+416E995523DAEFF700E871CB /* VideoToolboxSoftLink.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = VideoToolboxSoftLink.cpp; sourceTree = "<group>"; };
+416E995623DAEFF700E871CB /* VideoToolboxSoftLink.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = VideoToolboxSoftLink.h; sourceTree = "<group>"; };
 442956CC218A72DE0080DB54 /* RevealSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RevealSPI.h; sourceTree = "<group>"; };
 4450FC9D21F5F602004DFA56 /* QuickLookSoftLink.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = QuickLookSoftLink.mm; sourceTree = "<group>"; };
…
 isa = PBXGroup;
 children = (
+416E995123DAE6BD00E871CB /* AudioToolboxSoftLink.cpp */,
+416E995223DAE6BE00E871CB /* AudioToolboxSoftLink.h */,
 0CF99CA61F738436007EE793 /* CoreMediaSoftLink.cpp */,
 0CF99CA71F738437007EE793 /* CoreMediaSoftLink.h */,
+416E995523DAEFF700E871CB /* VideoToolboxSoftLink.cpp */,
+416E995623DAEFF700E871CB /* VideoToolboxSoftLink.h */,
 );
 path = cf;
…
 57FD318B22B35989008D0E8B /* AppSSOSoftLink.h in Headers */,
 576CA9D622B854AB0030143C /* AppSSOSPI.h in Headers */,
+416E995423DAE6BE00E871CB /* AudioToolboxSoftLink.h in Headers */,
 2D02E93C2056FAA700A13797 /* AudioToolboxSPI.h in Headers */,
 572A107822B456F500F410C8 /* AuthKitSPI.h in Headers */,
…
 293EE4A824154F8F0047493D /* AccessibilitySupportSoftLink.cpp in Sources */,
 57FD318A22B3593E008D0E8B /* AppSSOSoftLink.mm in Sources */,
+416E995323DAE6BE00E871CB /* AudioToolboxSoftLink.cpp in Sources */,
 077E87B1226A460200A2AFF0 /* AVFoundationSoftLink.mm in Sources */,
 0C5FFF0F1F78D9DA009EFF1A /* ClockCM.mm in Sources */,
…
 5C7C787423AC3E770065F47E /* ManagedConfigurationSoftLink.mm in Sources */,
 0CF99CA41F736375007EE793 /* MediaTimeAVFoundation.cpp in Sources */,
+41E1F344248A6A000022D5DE /* VideoToolboxSoftLink.cpp in Sources */,
 CDACB3602387425B0018D7CE /* MediaToolboxSoftLink.cpp in Sources */,
 A1F63CA021A4DBF7006FB43B /* PassKitSoftLink.mm in Sources */,
trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.cpp
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBlockBufferCopyDataBytes, OSStatus, (CMBlockBufferRef theSourceBuffer, size_t offsetToData, size_t dataLength, void* destination), (theSourceBuffer, offsetToData, dataLength, destination), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBlockBufferGetDataLength, size_t, (CMBlockBufferRef theBuffer), (theBuffer), PAL_EXPORT)
+SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBlockBufferReplaceDataBytes, OSStatus, (const void* sourceBytes, CMBlockBufferRef destinationBuffer, size_t offsetIntoDestination, size_t dataLength), (sourceBytes, destinationBuffer, offsetIntoDestination, dataLength), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMFormatDescriptionGetExtensions, CFDictionaryRef, (CMFormatDescriptionRef desc), (desc), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMSampleBufferGetTypeID, CFTypeID, (void), (), PAL_EXPORT)
…
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueIsEmpty, Boolean, (CMBufferQueueRef queue), (queue), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueGetBufferCount, CMItemCount, (CMBufferQueueRef queue), (queue), PAL_EXPORT)
+SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueGetCallbacksForUnsortedSampleBuffers, const CMBufferCallbacks *, (), (), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueGetEndPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut), PAL_EXPORT)
+SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueMarkEndOfData, OSStatus, (CMBufferQueueRef queue), (queue), PAL_EXPORT)

 SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleAttachmentKey_DoNotDisplay, CFStringRef, PAL_EXPORT)
…
 SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleAttachmentKey_IsDependedOnByOthers, CFStringRef, PAL_EXPORT)
 SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleBufferConsumerNotification_BufferConsumed, CFStringRef, PAL_EXPORT)
+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration, CFStringRef, PAL_EXPORT)
+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleBufferAttachmentKey_GradualDecoderRefresh, CFStringRef, PAL_EXPORT)
+SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleBufferAttachmentKey_TrimDurationAtStart, CFStringRef, PAL_EXPORT)

 SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMTimebaseNotification_EffectiveRateChanged, CFStringRef, PAL_EXPORT)
…
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMSampleBufferSetDataReady, OSStatus, (CMSampleBufferRef sbuf), (sbuf), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMAudioFormatDescriptionCreate, OSStatus, (CFAllocatorRef allocator, const AudioStreamBasicDescription* asbd, size_t layoutSize, const AudioChannelLayout* layout, size_t magicCookieSize, const void* magicCookie, CFDictionaryRef extensions, CMAudioFormatDescriptionRef* outDesc), (allocator, asbd, layoutSize, layout, magicCookieSize, magicCookie, extensions, outDesc), PAL_EXPORT)
+SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMAudioFormatDescriptionGetMagicCookie, const void*, (CMAudioFormatDescriptionRef desc, size_t* sizeOut), (desc, sizeOut), PAL_EXPORT)
+SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMAudioFormatDescriptionGetRichestDecodableFormat, const AudioFormatListItem *, (CMAudioFormatDescriptionRef desc), (desc), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMClockGetHostTimeClock, CMClockRef, (void), (), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock), PAL_EXPORT)
trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.h
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBlockBufferGetDataLength, size_t, (CMBlockBufferRef theBuffer), (theBuffer))
 #define CMBlockBufferGetDataLength softLink_CoreMedia_CMBlockBufferGetDataLength
+SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBlockBufferReplaceDataBytes, OSStatus, (const void* sourceBytes, CMBlockBufferRef destinationBuffer, size_t offsetIntoDestination, size_t dataLength), (sourceBytes, destinationBuffer, offsetIntoDestination, dataLength))
+#define CMBlockBufferReplaceDataBytes softLink_CoreMedia_CMBlockBufferReplaceDataBytes
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMFormatDescriptionGetExtensions, CFDictionaryRef, (CMFormatDescriptionRef desc), (desc))
 #define CMFormatDescriptionGetExtensions softLink_CoreMedia_CMFormatDescriptionGetExtensions
…
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
 #define CMBufferQueueGetFirstPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetFirstPresentationTimeStamp
+SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueGetCallbacksForUnsortedSampleBuffers, const CMBufferCallbacks *, (), ())
+#define CMBufferQueueGetCallbacksForUnsortedSampleBuffers softLink_CoreMedia_CMBufferQueueGetCallbacksForUnsortedSampleBuffers
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueGetEndPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
 #define CMBufferQueueGetEndPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetEndPresentationTimeStamp
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut))
 #define CMBufferQueueInstallTriggerWithIntegerThreshold softLink_CoreMedia_CMBufferQueueInstallTriggerWithIntegerThreshold
+SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueMarkEndOfData, OSStatus, (CMBufferQueueRef queue), (queue))
+#define CMBufferQueueMarkEndOfData softLink_CoreMedia_CMBufferQueueMarkEndOfData

 SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleAttachmentKey_DoNotDisplay, CFStringRef)
…
 SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleBufferConsumerNotification_BufferConsumed, CFStringRef)
 #define kCMSampleBufferConsumerNotification_BufferConsumed get_CoreMedia_kCMSampleBufferConsumerNotification_BufferConsumed()
+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration, CFStringRef)
+#define kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration get_CoreMedia_kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration()
+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleBufferAttachmentKey_GradualDecoderRefresh, CFStringRef)
+#define kCMSampleBufferAttachmentKey_GradualDecoderRefresh get_CoreMedia_kCMSampleBufferAttachmentKey_GradualDecoderRefresh()
+SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleBufferAttachmentKey_TrimDurationAtStart, CFStringRef)
+#define kCMSampleBufferAttachmentKey_TrimDurationAtStart get_CoreMedia_kCMSampleBufferAttachmentKey_TrimDurationAtStart()
+
+SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMAudioFormatDescriptionGetMagicCookie, const void*, (CMAudioFormatDescriptionRef desc, size_t* sizeOut), (desc, sizeOut))
+#define CMAudioFormatDescriptionGetMagicCookie softLink_CoreMedia_CMAudioFormatDescriptionGetMagicCookie
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMAudioFormatDescriptionGetStreamBasicDescription, const AudioStreamBasicDescription *, (CMAudioFormatDescriptionRef desc), (desc))
 #define CMAudioFormatDescriptionGetStreamBasicDescription softLink_CoreMedia_CMAudioFormatDescriptionGetStreamBasicDescription
…
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMSampleBufferGetNumSamples, CMItemCount, (CMSampleBufferRef sbuf), (sbuf))
 #define CMSampleBufferGetNumSamples softLink_CoreMedia_CMSampleBufferGetNumSamples
+SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMAudioFormatDescriptionGetRichestDecodableFormat, const AudioFormatListItem *, (CMAudioFormatDescriptionRef desc), (desc))
+#define CMAudioFormatDescriptionGetRichestDecodableFormat softLink_CoreMedia_CMAudioFormatDescriptionGetRichestDecodableFormat
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMSampleBufferCopySampleBufferForRange, OSStatus, (CFAllocatorRef allocator, CMSampleBufferRef sbuf, CFRange sampleRange, CMSampleBufferRef* sBufOut), (allocator, sbuf, sampleRange, sBufOut))
 #define CMSampleBufferCopySampleBufferForRange softLink_CoreMedia_CMSampleBufferCopySampleBufferForRange
trunk/Source/WebCore/SourcesCocoa.txt
 platform/mediarecorder/MediaRecorderPrivateAVFImpl.cpp
+platform/mediarecorder/cocoa/AudioSampleBufferCompressor.mm
 platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm
+platform/mediarecorder/cocoa/VideoSampleBufferCompressor.mm

 platform/mediasession/mac/MediaSessionInterruptionProviderMac.mm
trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj
 41C7E1061E6A54360027B4DE /* CanvasCaptureMediaStreamTrack.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CanvasCaptureMediaStreamTrack.h; sourceTree = "<group>"; };
 41C7E1081E6AA37C0027B4DE /* CanvasCaptureMediaStreamTrack.idl */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = CanvasCaptureMediaStreamTrack.idl; sourceTree = "<group>"; };
+41CD6F8923D6E81C00B16421 /* VideoSampleBufferCompressor.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = VideoSampleBufferCompressor.h; sourceTree = "<group>"; };
+41CD6F8B23D6E81D00B16421 /* VideoSampleBufferCompressor.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = VideoSampleBufferCompressor.mm; sourceTree = "<group>"; };
 41CF8BE41D46222000707DC9 /* FetchBodyConsumer.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = FetchBodyConsumer.cpp; sourceTree = "<group>"; };
 41CF8BE51D46222000707DC9 /* FetchBodyConsumer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = FetchBodyConsumer.h; sourceTree = "<group>"; };
…
 41E1B1CB0FF5986900576B3B /* AbstractWorker.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AbstractWorker.h; sourceTree = "<group>"; };
 41E1B1CC0FF5986900576B3B /* AbstractWorker.idl */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = AbstractWorker.idl; sourceTree = "<group>"; };
+41E1F33D248A62B60022D5DE /* AudioSampleBufferCompressor.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AudioSampleBufferCompressor.mm; sourceTree = "<group>"; };
+41E1F33F248A62B60022D5DE /* AudioSampleBufferCompressor.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioSampleBufferCompressor.h; sourceTree = "<group>"; };
 41E408381DCB747900EFCE19 /* PeerConnectionBackend.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = PeerConnectionBackend.cpp; sourceTree = "<group>"; };
 41E593FD214865A900D3CB61 /* RTCPriorityType.idl */ = {isa = PBXFileReference; lastKnownFileType = text; path = RTCPriorityType.idl; sourceTree = "<group>"; };
…
 isa = PBXGroup;
 children = (
+41E1F33F248A62B60022D5DE /* AudioSampleBufferCompressor.h */,
+41E1F33D248A62B60022D5DE /* AudioSampleBufferCompressor.mm */,
 4D73F94C218C4A87003A3ED6 /* MediaRecorderPrivateWriterCocoa.h */,
 4D73F94D218C4A87003A3ED6 /* MediaRecorderPrivateWriterCocoa.mm */,
+41CD6F8923D6E81C00B16421 /* VideoSampleBufferCompressor.h */,
+41CD6F8B23D6E81D00B16421 /* VideoSampleBufferCompressor.mm */,
 );
 path = cocoa;
…
 CD0EEE0E14743F39003EAFA2 /* AudioDestinationIOS.cpp in Sources */,
 CD5596911475B678001D0BD0 /* AudioFileReaderIOS.cpp in Sources */,
-41E1F343248A69D40022D5DE /* AudioSampleBufferCompressor.mm in Sources */,
 CDA79827170A279100D45C55 /* AudioSessionIOS.mm in Sources */,
 CD8A7BBB197735FE00CBD643 /* AudioSourceProviderAVFObjC.mm in Sources */,
…
 3FBC4AF3189881560046EE38 /* VideoFullscreenInterfaceAVKit.mm in Sources */,
 52D5A18F1C54592300DE34A3 /* VideoLayerManagerObjC.mm in Sources */,
-41E1F342248A69D00022D5DE /* VideoSampleBufferCompressor.mm in Sources */,
 CD336F6717FA0AC600DDDCD0 /* VideoTrackPrivateAVFObjC.cpp in Sources */,
 CD8B5A42180D149A008B8E65 /* VideoTrackPrivateMediaSourceAVFObjC.mm in Sources */,
trunk/Source/WebCore/platform/mediarecorder/MediaRecorderPrivateAVFImpl.cpp
 #include "MediaRecorderPrivateAVFImpl.h"

-#if ENABLE(MEDIA_STREAM)
+#if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

 #include "AudioStreamDescription.h"
+#include "MediaRecorderPrivateWriterCocoa.h"
 #include "MediaSample.h"
 #include "MediaStreamPrivate.h"
…
 } // namespace WebCore

-#endif // ENABLE(MEDIA_STREAM)
+#endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebCore/platform/mediarecorder/MediaRecorderPrivateAVFImpl.h
 #pragma once

-#if ENABLE(MEDIA_STREAM)
+#if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

 #include "MediaRecorderPrivate.h"
…
 } // namespace WebCore

-#endif // ENABLE(MEDIA_STREAM)
+#endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.h
r262663 r262708 25 25 #pragma once 26 26 27 #if ENABLE(MEDIA_STREAM) 27 #if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) 28 29 #include "AudioStreamDescription.h" 28 30 29 31 #include "SharedBuffer.h" … … 36 38 #include <wtf/threads/BinarySemaphore.h> 37 39 40 #include <CoreAudio/CoreAudioTypes.h> 41 #include <CoreMedia/CMTime.h> 42 38 43 typedef struct opaqueCMSampleBuffer *CMSampleBufferRef; 44 typedef const struct opaqueCMFormatDescription* CMFormatDescriptionRef; 45 typedef struct opaqueCMBufferQueueTriggerToken *CMBufferQueueTriggerToken; 39 46 40 47 OBJC_CLASS AVAssetWriter; 41 48 OBJC_CLASS AVAssetWriterInput; 49 OBJC_CLASS WebAVAssetWriterDelegate; 42 50 43 51 namespace WTF { … … 47 55 namespace WebCore { 48 56 57 class AudioSampleBufferCompressor; 49 58 class AudioStreamDescription; 50 59 class MediaStreamTrackPrivate; 51 60 class PlatformAudioData; 61 class VideoSampleBufferCompressor; 52 62 53 class WEBCORE_EXPORT MediaRecorderPrivateWriter : public ThreadSafeRefCounted<MediaRecorderPrivateWriter, WTF::DestructionThread::Main>, public CanMakeWeakPtr<MediaRecorderPrivateWriter > {63 class WEBCORE_EXPORT MediaRecorderPrivateWriter : public ThreadSafeRefCounted<MediaRecorderPrivateWriter, WTF::DestructionThread::Main>, public CanMakeWeakPtr<MediaRecorderPrivateWriter, WeakPtrFactoryInitialization::Eager> { 54 64 public: 55 65 static RefPtr<MediaRecorderPrivateWriter> create(const MediaStreamTrackPrivate* audioTrack, const MediaStreamTrackPrivate* videoTrack); 56 66 static RefPtr<MediaRecorderPrivateWriter> create(bool hasAudio, int width, int height); 57 67 ~MediaRecorderPrivateWriter(); 58 59 bool setupWriter(); 60 bool setVideoInput(int width, int height); 61 bool setAudioInput(); 68 62 69 void appendVideoSampleBuffer(CMSampleBufferRef); 63 70 void appendAudioSampleBuffer(const PlatformAudioData&, const AudioStreamDescription&, const WTF::MediaTime&, size_t); … … 65 72 void fetchData(CompletionHandler<void(RefPtr<SharedBuffer>&&)>&&); 66 73 74 void 
appendData(const char*, size_t); 75 void appendData(Ref<SharedBuffer>&&); 76 67 77 private: 68 MediaRecorderPrivateWriter( RetainPtr<AVAssetWriter>&&, String&& path);78 MediaRecorderPrivateWriter(bool hasAudio, bool hasVideo); 69 79 void clear(); 70 80 71 RetainPtr<AVAssetWriter> m_writer; 72 RetainPtr<AVAssetWriterInput> m_videoInput; 73 RetainPtr<AVAssetWriterInput> m_audioInput; 81 bool initialize(); 74 82 75 String m_path; 76 Lock m_videoLock; 77 Lock m_audioLock; 78 BinarySemaphore m_finishWritingSemaphore; 79 BinarySemaphore m_finishWritingAudioSemaphore; 80 BinarySemaphore m_finishWritingVideoSemaphore; 83 static void compressedVideoOutputBufferCallback(void*, CMBufferQueueTriggerToken); 84 static void compressedAudioOutputBufferCallback(void*, CMBufferQueueTriggerToken); 85 86 void startAssetWriter(); 87 void appendCompressedSampleBuffers(); 88 89 bool appendCompressedAudioSampleBuffer(); 90 bool appendCompressedVideoSampleBuffer(); 91 92 void processNewCompressedAudioSampleBuffers(); 93 void processNewCompressedVideoSampleBuffers(); 94 95 void flushCompressedSampleBuffers(CompletionHandler<void()>&&); 96 void appendEndOfVideoSampleDurationIfNeeded(CompletionHandler<void()>&&); 97 81 98 bool m_hasStartedWriting { false }; 82 99 bool m_isStopped { false }; 83 bool m_isFirstAudioSample { true }; 84 dispatch_queue_t m_audioPullQueue; 85 dispatch_queue_t m_videoPullQueue; 86 Deque<RetainPtr<CMSampleBufferRef>> m_videoBufferPool; 87 Deque<RetainPtr<CMSampleBufferRef>> m_audioBufferPool; 100 101 RetainPtr<AVAssetWriter> m_writer; 88 102 89 103 bool m_isStopping { false }; 90 104 RefPtr<SharedBuffer> m_data; 91 105 CompletionHandler<void(RefPtr<SharedBuffer>&&)> m_fetchDataCompletionHandler; 106 107 bool m_hasAudio; 108 bool m_hasVideo; 109 110 RetainPtr<CMFormatDescriptionRef> m_audioFormatDescription; 111 std::unique_ptr<AudioSampleBufferCompressor> m_audioCompressor; 112 RetainPtr<AVAssetWriterInput> m_audioAssetWriterInput; 113 114 
RetainPtr<CMFormatDescriptionRef> m_videoFormatDescription; 115 std::unique_ptr<VideoSampleBufferCompressor> m_videoCompressor; 116 RetainPtr<AVAssetWriterInput> m_videoAssetWriterInput; 117 CMTime m_lastVideoPresentationTime; 118 CMTime m_lastVideoDecodingTime; 119 bool m_hasEncodedVideoSamples { false }; 120 121 RetainPtr<WebAVAssetWriterDelegate> m_writerDelegate; 92 122 }; 93 123 94 124 } // namespace WebCore 95 125 96 #endif // ENABLE(MEDIA_STREAM) 126 #endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) -
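The header above implies a start-up gate: the asset writer may only start once every enabled track (audio and/or video) has reported a compressed format description, which is why `processNewCompressed*SampleBuffers` checks the other track's format before calling `startAssetWriter`. A minimal stand-alone sketch of that gate, using plain C++ in place of the WebKit types (all names here are hypothetical, not from the patch):

```cpp
#include <cassert>
#include <functional>
#include <utility>

// Sketch of the "start only when all expected formats are known" gate the
// header implies. The start callback stands in for startAssetWriter().
class WriterStartGate {
public:
    WriterStartGate(bool hasAudio, bool hasVideo, std::function<void()> start)
        : m_needsAudio(hasAudio), m_needsVideo(hasVideo), m_start(std::move(start)) { }

    void didReceiveAudioFormat() { m_hasAudioFormat = true; maybeStart(); }
    void didReceiveVideoFormat() { m_hasVideoFormat = true; maybeStart(); }
    bool hasStarted() const { return m_started; }

private:
    void maybeStart() {
        if (m_started)
            return;
        if (m_needsAudio && !m_hasAudioFormat)
            return;
        if (m_needsVideo && !m_hasVideoFormat)
            return;
        m_started = true;
        m_start(); // fires exactly once, like m_hasStartedWriting guards it
    }

    bool m_needsAudio;
    bool m_needsVideo;
    bool m_hasAudioFormat { false };
    bool m_hasVideoFormat { false };
    bool m_started { false };
    std::function<void()> m_start;
};
```

An audio+video recorder thus waits for both compressors; an audio-only or video-only recorder starts on its single track's first compressed sample.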
trunk/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm
r262663 r262708 24 24 */ 25 25 26 #import "config.h" 27 #import "MediaRecorderPrivateWriterCocoa.h" 28 29 #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION) 30 31 #import "AudioStreamDescription.h" 32 #import "Logging.h" 33 #import "MediaStreamTrackPrivate.h" 34 #import "WebAudioBufferList.h" 35 #import <AVFoundation/AVAssetWriter.h> 36 #import <AVFoundation/AVAssetWriterInput.h> 37 #import <wtf/CompletionHandler.h> 38 #import <wtf/FileSystem.h> 39 40 #import <pal/cf/CoreMediaSoftLink.h> 41 #import <pal/cocoa/AVFoundationSoftLink.h> 42 43 #undef AVEncoderBitRateKey 44 #define AVEncoderBitRateKey getAVEncoderBitRateKeyWithFallback() 45 #undef AVFormatIDKey 46 #define AVFormatIDKey getAVFormatIDKeyWithFallback() 47 #undef AVNumberOfChannelsKey 48 #define AVNumberOfChannelsKey getAVNumberOfChannelsKeyWithFallback() 49 #undef AVSampleRateKey 50 #define AVSampleRateKey getAVSampleRateKeyWithFallback() 26 #include "config.h" 27 #include "MediaRecorderPrivateWriterCocoa.h" 28 29 #if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) 30 31 #include "AudioSampleBufferCompressor.h" 32 #include "AudioStreamDescription.h" 33 #include "Logging.h" 34 #include "MediaStreamTrackPrivate.h" 35 #include "VideoSampleBufferCompressor.h" 36 #include "WebAudioBufferList.h" 37 #include <AVFoundation/AVAssetWriter.h> 38 #include <AVFoundation/AVAssetWriterInput.h> 39 #include <AVFoundation/AVAssetWriter_Private.h> 40 #include <pal/avfoundation/MediaTimeAVFoundation.h> 41 #include <wtf/BlockPtr.h> 42 #include <wtf/CompletionHandler.h> 43 #include <wtf/FileSystem.h> 44 #include <wtf/cf/TypeCastsCF.h> 45 46 #include <pal/cf/CoreMediaSoftLink.h> 47 #include <pal/cocoa/AVFoundationSoftLink.h> 48 49 @interface WebAVAssetWriterDelegate : NSObject <AVAssetWriterDelegate> { 50 WeakPtr<WebCore::MediaRecorderPrivateWriter> m_writer; 51 } 52 53 - (instancetype)initWithWriter:(WebCore::MediaRecorderPrivateWriter*)writer; 54 - (void)close; 55 56 @end 57 58 @implementation WebAVAssetWriterDelegate { 59 }; 60 
61 - (instancetype)initWithWriter:(WebCore::MediaRecorderPrivateWriter*)writer 62 { 63 ASSERT(isMainThread()); 64 self = [super init]; 65 if (self) 66 self->m_writer = makeWeakPtr(writer); 67 68 return self; 69 } 70 71 - (void)assetWriter:(AVAssetWriter *)assetWriter didProduceFragmentedHeaderData:(NSData *)fragmentedHeaderData 72 { 73 UNUSED_PARAM(assetWriter); 74 if (!isMainThread()) { 75 if (auto size = [fragmentedHeaderData length]) { 76 callOnMainThread([protectedSelf = RetainPtr<WebAVAssetWriterDelegate>(self), buffer = WebCore::SharedBuffer::create(static_cast<const char*>([fragmentedHeaderData bytes]), size)]() mutable { 77 if (protectedSelf->m_writer) 78 protectedSelf->m_writer->appendData(WTFMove(buffer)); 79 }); 80 } 81 return; 82 } 83 84 if (m_writer) 85 m_writer->appendData(static_cast<const char*>([fragmentedHeaderData bytes]), [fragmentedHeaderData length]); 86 } 87 88 - (void)assetWriter:(AVAssetWriter *)assetWriter didProduceFragmentedMediaData:(NSData *)fragmentedMediaData fragmentedMediaDataReport:(AVFragmentedMediaDataReport *)fragmentedMediaDataReport 89 { 90 UNUSED_PARAM(assetWriter); 91 UNUSED_PARAM(fragmentedMediaDataReport); 92 if (!isMainThread()) { 93 if (auto size = [fragmentedMediaData length]) { 94 callOnMainThread([protectedSelf = RetainPtr<WebAVAssetWriterDelegate>(self), buffer = WebCore::SharedBuffer::create(static_cast<const char*>([fragmentedMediaData bytes]), size)]() mutable { 95 if (protectedSelf->m_writer) 96 protectedSelf->m_writer->appendData(WTFMove(buffer)); 97 }); 98 } 99 return; 100 } 101 102 if (m_writer) 103 m_writer->appendData(static_cast<const char*>([fragmentedMediaData bytes]), [fragmentedMediaData length]); 104 } 105 106 - (void)close 107 { 108 m_writer = nullptr; 109 } 110 111 @end 51 112 52 113 namespace WebCore { … … 54 115 using namespace PAL; 55 116 56 static NSString *getAVFormatIDKeyWithFallback() 57 { 58 if (PAL::canLoad_AVFoundation_AVFormatIDKey()) 59 return PAL::get_AVFoundation_AVFormatIDKey(); 60 61 
RELEASE_LOG_ERROR(Media, "Failed to load AVFormatIDKey"); 62 return @"AVFormatIDKey"; 63 } 64 65 static NSString *getAVNumberOfChannelsKeyWithFallback() 66 { 67 if (PAL::canLoad_AVFoundation_AVNumberOfChannelsKey()) 68 return PAL::get_AVFoundation_AVNumberOfChannelsKey(); 69 70 RELEASE_LOG_ERROR(Media, "Failed to load AVNumberOfChannelsKey"); 71 return @"AVNumberOfChannelsKey"; 72 } 73 74 static NSString *getAVSampleRateKeyWithFallback() 75 { 76 if (PAL::canLoad_AVFoundation_AVSampleRateKey()) 77 return PAL::get_AVFoundation_AVSampleRateKey(); 78 79 RELEASE_LOG_ERROR(Media, "Failed to load AVSampleRateKey"); 80 return @"AVSampleRateKey"; 81 } 82 83 static NSString *getAVEncoderBitRateKeyWithFallback() 84 { 85 if (PAL::canLoad_AVFoundation_AVEncoderBitRateKey()) 86 return PAL::get_AVFoundation_AVEncoderBitRateKey(); 87 88 RELEASE_LOG_ERROR(Media, "Failed to load AVEncoderBitRateKey"); 89 return @"AVEncoderBitRateKey"; 117 RefPtr<MediaRecorderPrivateWriter> MediaRecorderPrivateWriter::create(bool hasAudio, int width, int height) 118 { 119 auto writer = adoptRef(*new MediaRecorderPrivateWriter(hasAudio, width && height)); 120 if (!writer->initialize()) 121 return nullptr; 122 return writer; 90 123 } 91 124 … … 101 134 } 102 135 103 RefPtr<MediaRecorderPrivateWriter> MediaRecorderPrivateWriter::create(bool hasAudio, int width, int height) 104 { 105 NSString *directory = FileSystem::createTemporaryDirectory(@"videos"); 106 NSString *filename = [NSString stringWithFormat:@"/%lld.mp4", CMClockGetTime(CMClockGetHostTimeClock()).value]; 107 NSString *path = [directory stringByAppendingString:filename]; 108 109 NSURL *outputURL = [NSURL fileURLWithPath:path]; 110 String filePath = [path UTF8String]; 136 void MediaRecorderPrivateWriter::compressedVideoOutputBufferCallback(void *mediaRecorderPrivateWriter, CMBufferQueueTriggerToken) 137 { 138 auto *writer = static_cast<MediaRecorderPrivateWriter*>(mediaRecorderPrivateWriter); 139 
writer->processNewCompressedVideoSampleBuffers(); 140 } 141 142 void MediaRecorderPrivateWriter::compressedAudioOutputBufferCallback(void *mediaRecorderPrivateWriter, CMBufferQueueTriggerToken) 143 { 144 auto *writer = static_cast<MediaRecorderPrivateWriter*>(mediaRecorderPrivateWriter); 145 writer->processNewCompressedAudioSampleBuffers(); 146 } 147 148 MediaRecorderPrivateWriter::MediaRecorderPrivateWriter(bool hasAudio, bool hasVideo) 149 : m_hasAudio(hasAudio) 150 , m_hasVideo(hasVideo) 151 { 152 } 153 154 MediaRecorderPrivateWriter::~MediaRecorderPrivateWriter() 155 { 156 clear(); 157 } 158 159 bool MediaRecorderPrivateWriter::initialize() 160 { 111 161 NSError *error = nil; 112 auto avAssetWriter = adoptNS([PAL::allocAVAssetWriterInstance() initWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error]); 162 ALLOW_DEPRECATED_DECLARATIONS_BEGIN 163 m_writer = adoptNS([PAL::allocAVAssetWriterInstance() initWithFileType:AVFileTypeMPEG4 error:&error]); 164 ALLOW_DEPRECATED_DECLARATIONS_END 113 165 if (error) { 114 166 RELEASE_LOG_ERROR(MediaStream, "create AVAssetWriter instance failed with error code %ld", (long)error.code); 115 return nullptr; 116 } 117 118 auto writer = adoptRef(*new MediaRecorderPrivateWriter(WTFMove(avAssetWriter), WTFMove(filePath))); 119 120 if (hasAudio && !writer->setAudioInput()) 121 return nullptr; 122 123 if (width && height) { 124 if (!writer->setVideoInput(width, height)) 125 return nullptr; 126 } 127 128 return WTFMove(writer); 129 } 130 131 MediaRecorderPrivateWriter::MediaRecorderPrivateWriter(RetainPtr<AVAssetWriter>&& avAssetWriter, String&& filePath) 132 : m_writer(WTFMove(avAssetWriter)) 133 , m_path(WTFMove(filePath)) 134 { 135 } 136 137 MediaRecorderPrivateWriter::~MediaRecorderPrivateWriter() 138 { 139 clear(); 167 return false; 168 } 169 170 m_writerDelegate = adoptNS([[WebAVAssetWriterDelegate alloc] initWithWriter: this]); 171 [m_writer.get() setDelegate:m_writerDelegate.get()]; 172 173 if (m_hasAudio) { 174 
m_audioCompressor = AudioSampleBufferCompressor::create(compressedAudioOutputBufferCallback, this); 175 if (!m_audioCompressor) 176 return false; 177 } 178 if (m_hasVideo) { 179 m_videoCompressor = VideoSampleBufferCompressor::create(kCMVideoCodecType_H264, compressedVideoOutputBufferCallback, this); 180 if (!m_videoCompressor) 181 return false; 182 } 183 return true; 184 } 185 186 void MediaRecorderPrivateWriter::processNewCompressedVideoSampleBuffers() 187 { 188 ASSERT(m_hasVideo); 189 if (!m_videoFormatDescription) { 190 m_videoFormatDescription = CMSampleBufferGetFormatDescription(m_videoCompressor->getOutputSampleBuffer()); 191 callOnMainThread([weakThis = makeWeakPtr(this), this] { 192 if (!weakThis) 193 return; 194 195 if (m_hasAudio && !m_audioFormatDescription) 196 return; 197 198 startAssetWriter(); 199 }); 200 } 201 if (!m_hasStartedWriting) 202 return; 203 appendCompressedSampleBuffers(); 204 } 205 206 void MediaRecorderPrivateWriter::processNewCompressedAudioSampleBuffers() 207 { 208 ASSERT(m_hasAudio); 209 if (!m_audioFormatDescription) { 210 m_audioFormatDescription = CMSampleBufferGetFormatDescription(m_audioCompressor->getOutputSampleBuffer()); 211 callOnMainThread([weakThis = makeWeakPtr(this), this] { 212 if (!weakThis) 213 return; 214 215 if (m_hasVideo && !m_videoFormatDescription) 216 return; 217 218 startAssetWriter(); 219 }); 220 } 221 if (!m_hasStartedWriting) 222 return; 223 appendCompressedSampleBuffers(); 224 } 225 226 void MediaRecorderPrivateWriter::startAssetWriter() 227 { 228 if (m_hasVideo) { 229 m_videoAssetWriterInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeVideo outputSettings:nil sourceFormatHint:m_videoFormatDescription.get()]); 230 [m_videoAssetWriterInput setExpectsMediaDataInRealTime:true]; 231 if (![m_writer.get() canAddInput:m_videoAssetWriterInput.get()]) { 232 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter::startAssetWriter failed canAddInput for video"); 233 return; 234 
} 235 [m_writer.get() addInput:m_videoAssetWriterInput.get()]; 236 } 237 238 if (m_hasAudio) { 239 m_audioAssetWriterInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeAudio outputSettings:nil sourceFormatHint:m_audioFormatDescription.get()]); 240 [m_audioAssetWriterInput setExpectsMediaDataInRealTime:true]; 241 if (![m_writer.get() canAddInput:m_audioAssetWriterInput.get()]) { 242 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter::startAssetWriter failed canAddInput for audio"); 243 return; 244 } 245 [m_writer.get() addInput:m_audioAssetWriterInput.get()]; 246 } 247 248 if (![m_writer.get() startWriting]) { 249 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter::startAssetWriter failed startWriting"); 250 return; 251 } 252 253 [m_writer.get() startSessionAtSourceTime:kCMTimeZero]; 254 255 appendCompressedSampleBuffers(); 256 257 m_hasStartedWriting = true; 258 } 259 260 bool MediaRecorderPrivateWriter::appendCompressedAudioSampleBuffer() 261 { 262 if (!m_audioCompressor) 263 return false; 264 265 if (![m_audioAssetWriterInput isReadyForMoreMediaData]) 266 return false; 267 268 auto buffer = m_audioCompressor->takeOutputSampleBuffer(); 269 if (!buffer) 270 return false; 271 272 [m_audioAssetWriterInput.get() appendSampleBuffer:buffer.get()]; 273 return true; 274 } 275 276 bool MediaRecorderPrivateWriter::appendCompressedVideoSampleBuffer() 277 { 278 if (!m_videoCompressor) 279 return false; 280 281 if (![m_videoAssetWriterInput isReadyForMoreMediaData]) 282 return false; 283 284 auto buffer = m_videoCompressor->takeOutputSampleBuffer(); 285 if (!buffer) 286 return false; 287 288 m_lastVideoPresentationTime = CMSampleBufferGetPresentationTimeStamp(buffer.get()); 289 m_lastVideoDecodingTime = CMSampleBufferGetDecodeTimeStamp(buffer.get()); 290 m_hasEncodedVideoSamples = true; 291 292 [m_videoAssetWriterInput.get() appendSampleBuffer:buffer.get()]; 293 return true; 294 } 295 296 void 
MediaRecorderPrivateWriter::appendCompressedSampleBuffers() 297 { 298 while (appendCompressedVideoSampleBuffer() || appendCompressedAudioSampleBuffer()) { }; 299 } 300 301 static inline void appendEndsPreviousSampleDurationMarker(AVAssetWriterInput *assetWriterInput, CMTime presentationTimeStamp, CMTime decodingTimeStamp) 302 { 303 CMSampleTimingInfo timingInfo = { kCMTimeInvalid, presentationTimeStamp, decodingTimeStamp}; 304 305 CMSampleBufferRef buffer = NULL; 306 auto error = CMSampleBufferCreate(kCFAllocatorDefault, NULL, true, NULL, NULL, NULL, 0, 1, &timingInfo, 0, NULL, &buffer); 307 if (error) { 308 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter appendEndsPreviousSampleDurationMarker failed CMSampleBufferCreate with %d", error); 309 return; 310 } 311 auto sampleBuffer = adoptCF(buffer); 312 313 CMSetAttachment(sampleBuffer.get(), kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration, kCFBooleanTrue, kCMAttachmentMode_ShouldPropagate); 314 if (![assetWriterInput appendSampleBuffer:sampleBuffer.get()]) 315 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter appendSampleBuffer to writer input failed"); 316 } 317 318 void MediaRecorderPrivateWriter::appendEndOfVideoSampleDurationIfNeeded(CompletionHandler<void()>&& completionHandler) 319 { 320 if (!m_hasEncodedVideoSamples) { 321 completionHandler(); 322 return; 323 } 324 if ([m_videoAssetWriterInput isReadyForMoreMediaData]) { 325 appendEndsPreviousSampleDurationMarker(m_videoAssetWriterInput.get(), m_lastVideoPresentationTime, m_lastVideoDecodingTime); 326 completionHandler(); 327 return; 328 } 329 330 auto block = makeBlockPtr([this, weakThis = makeWeakPtr(this), completionHandler = WTFMove(completionHandler)]() mutable { 331 if (weakThis) { 332 appendEndsPreviousSampleDurationMarker(m_videoAssetWriterInput.get(), m_lastVideoPresentationTime, m_lastVideoDecodingTime); 333 [m_videoAssetWriterInput markAsFinished]; 334 } 335 completionHandler(); 336 }); 337 [m_videoAssetWriterInput 
requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:block.get()]; 338 } 339 340 void MediaRecorderPrivateWriter::flushCompressedSampleBuffers(CompletionHandler<void()>&& completionHandler) 341 { 342 appendCompressedSampleBuffers(); 343 appendEndOfVideoSampleDurationIfNeeded(WTFMove(completionHandler)); 140 344 } 141 345 142 346 void MediaRecorderPrivateWriter::clear() 143 347 { 144 if (m_videoInput) {145 m_videoInput.clear();146 dispatch_release(m_videoPullQueue);147 }148 if (m_audioInput) {149 m_audioInput.clear();150 dispatch_release(m_audioPullQueue);151 }152 348 if (m_writer) 153 349 m_writer.clear(); … … 158 354 } 159 355 160 bool MediaRecorderPrivateWriter::setVideoInput(int width, int height)161 {162 ASSERT(!m_videoInput);163 164 NSDictionary *compressionProperties = @{165 AVVideoAverageBitRateKey : @(width * height * 12),166 AVVideoExpectedSourceFrameRateKey : @(30),167 AVVideoMaxKeyFrameIntervalKey : @(120),168 AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel169 };170 171 NSDictionary *videoSettings = @{172 AVVideoCodecKey: AVVideoCodecH264,173 AVVideoWidthKey: @(width),174 AVVideoHeightKey: @(height),175 AVVideoCompressionPropertiesKey: compressionProperties176 };177 178 m_videoInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings sourceFormatHint:nil]);179 [m_videoInput setExpectsMediaDataInRealTime:true];180 181 if (![m_writer canAddInput:m_videoInput.get()]) {182 m_videoInput = nullptr;183 RELEASE_LOG_ERROR(MediaStream, "the video input is not allowed to add to the AVAssetWriter");184 return false;185 }186 [m_writer addInput:m_videoInput.get()];187 m_videoPullQueue = dispatch_queue_create("WebCoreVideoRecordingPullBufferQueue", DISPATCH_QUEUE_SERIAL);188 return true;189 }190 191 bool MediaRecorderPrivateWriter::setAudioInput()192 {193 ASSERT(!m_audioInput);194 195 NSDictionary *audioSettings = @{196 AVEncoderBitRateKey : @(28000),197 AVFormatIDKey : 
@(kAudioFormatMPEG4AAC),198 AVNumberOfChannelsKey : @(1),199 AVSampleRateKey : @(22050)200 };201 202 m_audioInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeAudio outputSettings:audioSettings sourceFormatHint:nil]);203 [m_audioInput setExpectsMediaDataInRealTime:true];204 205 if (![m_writer canAddInput:m_audioInput.get()]) {206 m_audioInput = nullptr;207 RELEASE_LOG_ERROR(MediaStream, "the audio input is not allowed to add to the AVAssetWriter");208 return false;209 }210 [m_writer addInput:m_audioInput.get()];211 m_audioPullQueue = dispatch_queue_create("WebCoreAudioRecordingPullBufferQueue", DISPATCH_QUEUE_SERIAL);212 return true;213 }214 356 215 357 static inline RetainPtr<CMSampleBufferRef> copySampleBufferWithCurrentTimeStamp(CMSampleBufferRef originalBuffer) … … 218 360 CMItemCount count = 0; 219 361 CMSampleBufferGetSampleTimingInfoArray(originalBuffer, 0, nil, &count); 220 362 221 363 Vector<CMSampleTimingInfo> timeInfo(count); 222 364 CMSampleBufferGetSampleTimingInfoArray(originalBuffer, count, timeInfo.data(), &count); 223 224 for ( CMItemCounti = 0; i < count; i++) {365 366 for (auto i = 0; i < count; i++) { 225 367 timeInfo[i].decodeTimeStamp = kCMTimeInvalid; 226 368 timeInfo[i].presentationTimeStamp = startTime; 227 369 } 228 370 229 371 CMSampleBufferRef newBuffer = nullptr; 230 auto error = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, originalBuffer, count, timeInfo.data(), &newBuffer);231 if (error)372 if (auto error = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, originalBuffer, count, timeInfo.data(), &newBuffer)) { 373 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter CMSampleBufferCreateCopyWithNewTiming failed with %d", error); 232 374 return nullptr; 375 } 233 376 return adoptCF(newBuffer); 234 377 } … … 236 379 void MediaRecorderPrivateWriter::appendVideoSampleBuffer(CMSampleBufferRef sampleBuffer) 237 380 { 238 ASSERT(m_videoInput); 239 if (m_isStopped) 240 return; 241 
242 if (!m_hasStartedWriting) { 243 if (![m_writer startWriting]) { 244 m_isStopped = true; 245 RELEASE_LOG_ERROR(MediaStream, "create AVAssetWriter instance failed with error code %ld", (long)[m_writer error]); 246 return; 247 } 248 [m_writer startSessionAtSourceTime:CMClockGetTime(CMClockGetHostTimeClock())]; 249 m_hasStartedWriting = true; 250 RefPtr<MediaRecorderPrivateWriter> protectedThis = this; 251 [m_videoInput requestMediaDataWhenReadyOnQueue:m_videoPullQueue usingBlock:[this, protectedThis = WTFMove(protectedThis)] { 252 do { 253 if (![m_videoInput isReadyForMoreMediaData]) 254 break; 255 auto locker = holdLock(m_videoLock); 256 if (m_videoBufferPool.isEmpty()) 257 break; 258 auto buffer = m_videoBufferPool.takeFirst(); 259 locker.unlockEarly(); 260 if (![m_videoInput appendSampleBuffer:buffer.get()]) 261 break; 262 } while (true); 263 if (m_isStopped && m_videoBufferPool.isEmpty()) { 264 [m_videoInput markAsFinished]; 265 m_finishWritingVideoSemaphore.signal(); 266 } 267 }]; 268 return; 269 } 270 auto bufferWithCurrentTime = copySampleBufferWithCurrentTimeStamp(sampleBuffer); 271 if (!bufferWithCurrentTime) 272 return; 273 274 auto locker = holdLock(m_videoLock); 275 m_videoBufferPool.append(WTFMove(bufferWithCurrentTime)); 381 // FIXME: We should not set the timestamps if they are already set. 
382 if (auto bufferWithCurrentTime = copySampleBufferWithCurrentTimeStamp(sampleBuffer)) 383 m_videoCompressor->addSampleBuffer(bufferWithCurrentTime.get()); 276 384 } 277 385 … … 281 389 CMFormatDescriptionRef format = nullptr; 282 390 auto error = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, basicDescription, 0, NULL, 0, NULL, NULL, &format); 283 if (error) 391 if (error) { 392 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter CMAudioFormatDescriptionCreate failed with %d", error); 284 393 return nullptr; 394 } 285 395 return adoptCF(format); 286 396 } 287 397 288 static inline RetainPtr<CMSampleBufferRef> createAudioSampleBufferWithPacketDescriptions(CMFormatDescriptionRef format, size_t sampleCount) 289 { 290 CMTime startTime = CMClockGetTime(CMClockGetHostTimeClock()); 291 CMSampleBufferRef sampleBuffer = nullptr; 292 auto error = CMAudioSampleBufferCreateWithPacketDescriptions(kCFAllocatorDefault, NULL, false, NULL, NULL, format, sampleCount, startTime, NULL, &sampleBuffer); 293 if (error) 294 return nullptr; 295 return adoptCF(sampleBuffer); 296 } 297 298 void MediaRecorderPrivateWriter::appendAudioSampleBuffer(const PlatformAudioData& data, const AudioStreamDescription& description, const WTF::MediaTime&, size_t sampleCount) 299 { 300 ASSERT(m_audioInput); 301 if ((!m_hasStartedWriting && m_videoInput) || m_isStopped) 302 return; 398 static inline RetainPtr<CMSampleBufferRef> createAudioSampleBuffer(const PlatformAudioData& data, const AudioStreamDescription& description, const WTF::MediaTime& time, size_t sampleCount) 399 { 303 400 auto format = createAudioFormatDescription(description); 304 401 if (!format) 305 return; 306 if (m_isFirstAudioSample) { 307 if (!m_videoInput) { 308 // audio-only recording. 
309 if (![m_writer startWriting]) { 310 m_isStopped = true; 311 return; 312 } 313 [m_writer startSessionAtSourceTime:CMClockGetTime(CMClockGetHostTimeClock())]; 314 m_hasStartedWriting = true; 315 } 316 m_isFirstAudioSample = false; 317 RefPtr<MediaRecorderPrivateWriter> protectedThis = this; 318 [m_audioInput requestMediaDataWhenReadyOnQueue:m_audioPullQueue usingBlock:[this, protectedThis = WTFMove(protectedThis)] { 319 do { 320 if (![m_audioInput isReadyForMoreMediaData]) 321 break; 322 auto locker = holdLock(m_audioLock); 323 if (m_audioBufferPool.isEmpty()) 324 break; 325 auto buffer = m_audioBufferPool.takeFirst(); 326 locker.unlockEarly(); 327 [m_audioInput appendSampleBuffer:buffer.get()]; 328 } while (true); 329 if (m_isStopped && m_audioBufferPool.isEmpty()) { 330 [m_audioInput markAsFinished]; 331 m_finishWritingAudioSemaphore.signal(); 332 } 402 return nullptr; 403 404 CMSampleBufferRef sampleBuffer = nullptr; 405 auto error = CMAudioSampleBufferCreateWithPacketDescriptions(kCFAllocatorDefault, NULL, false, NULL, NULL, format.get(), sampleCount, toCMTime(time), NULL, &sampleBuffer); 406 if (error) { 407 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter createAudioSampleBufferWithPacketDescriptions failed with %d", error); 408 return nullptr; 409 } 410 auto buffer = adoptCF(sampleBuffer); 411 412 error = CMSampleBufferSetDataBufferFromAudioBufferList(buffer.get(), kCFAllocatorDefault, kCFAllocatorDefault, 0, downcast<WebAudioBufferList>(data).list()); 413 if (error) { 414 RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter CMSampleBufferSetDataBufferFromAudioBufferList failed with %d", error); 415 return nullptr; 416 } 417 return buffer; 418 } 419 420 void MediaRecorderPrivateWriter::appendAudioSampleBuffer(const PlatformAudioData& data, const AudioStreamDescription& description, const WTF::MediaTime& time, size_t sampleCount) 421 { 422 if (auto sampleBuffer = createAudioSampleBuffer(data, description, time, sampleCount)) 423 
m_audioCompressor->addSampleBuffer(sampleBuffer.get()); 424 } 425 426 void MediaRecorderPrivateWriter::stopRecording() 427 { 428 if (m_isStopped) 429 return; 430 431 m_isStopped = true; 432 433 if (m_videoCompressor) 434 m_videoCompressor->finish(); 435 if (m_audioCompressor) 436 m_audioCompressor->finish(); 437 438 if (!m_hasStartedWriting) 439 return; 440 ASSERT([m_writer status] == AVAssetWriterStatusWriting); 441 442 m_isStopping = true; 443 444 flushCompressedSampleBuffers([this, weakThis = makeWeakPtr(this)]() mutable { 445 if (!weakThis) 446 return; 447 448 ALLOW_DEPRECATED_DECLARATIONS_BEGIN 449 [m_writer flush]; 450 ALLOW_DEPRECATED_DECLARATIONS_END 451 [m_writer finishWritingWithCompletionHandler:[this, weakThis = WTFMove(weakThis)]() mutable { 452 callOnMainThread([this, weakThis = WTFMove(weakThis)]() mutable { 453 if (!weakThis) 454 return; 455 456 m_isStopping = false; 457 if (m_fetchDataCompletionHandler) { 458 auto buffer = WTFMove(m_data); 459 m_fetchDataCompletionHandler(WTFMove(buffer)); 460 } 461 462 m_isStopped = false; 463 m_hasStartedWriting = false; 464 clear(); 465 }); 333 466 }]; 334 } 335 336 auto sampleBuffer = createAudioSampleBufferWithPacketDescriptions(format.get(), sampleCount); 337 if (!sampleBuffer) 338 return; 339 auto error = CMSampleBufferSetDataBufferFromAudioBufferList(sampleBuffer.get(), kCFAllocatorDefault, kCFAllocatorDefault, 0, downcast<WebAudioBufferList>(data).list()); 340 if (error) 341 return; 342 343 auto locker = holdLock(m_audioLock); 344 m_audioBufferPool.append(WTFMove(sampleBuffer)); 345 } 346 347 void MediaRecorderPrivateWriter::stopRecording() 348 { 349 if (m_isStopped) 350 return; 351 352 m_isStopped = true; 353 if (!m_hasStartedWriting) 354 return; 355 ASSERT([m_writer status] == AVAssetWriterStatusWriting); 356 if (m_videoInput) 357 m_finishWritingVideoSemaphore.wait(); 358 359 if (m_audioInput) 360 m_finishWritingAudioSemaphore.wait(); 361 362 m_isStopping = true; 363 [m_writer 
finishWritingWithCompletionHandler:[this, weakPtr = makeWeakPtr(*this)]() mutable { 364 callOnMainThread([this, weakPtr = WTFMove(weakPtr), buffer = SharedBuffer::createWithContentsOfFile(m_path)]() mutable { 365 if (!weakPtr) 366 return; 367 368 m_isStopping = false; 369 if (m_fetchDataCompletionHandler) 370 m_fetchDataCompletionHandler(WTFMove(buffer)); 371 else 372 m_data = WTFMove(buffer); 373 374 m_isStopped = false; 375 m_hasStartedWriting = false; 376 m_isFirstAudioSample = true; 377 clear(); 378 }); 379 m_finishWritingSemaphore.signal(); 380 }]; 381 m_finishWritingSemaphore.wait(); 467 }); 382 468 } 383 469 … … 393 479 } 394 480 481 void MediaRecorderPrivateWriter::appendData(const char* data, size_t size) 482 { 483 if (!m_data) { 484 m_data = SharedBuffer::create(data, size); 485 return; 486 } 487 m_data->append(data, size); 488 } 489 490 void MediaRecorderPrivateWriter::appendData(Ref<SharedBuffer>&& buffer) 491 { 492 if (!m_data) { 493 m_data = WTFMove(buffer); 494 return; 495 } 496 m_data->append(WTFMove(buffer)); 497 } 498 395 499 } // namespace WebCore 396 500 397 #endif // ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)501 #endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) -
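With `AVAssetWriterDelegate`, the writer no longer goes through a temporary file: the delegate hands back fragmented MP4 chunks (header data first, then media fragments), and the two `appendData` overloads above concatenate them until `fetchData` delivers the result. A rough sketch of that accumulation, with `std::string` standing in for `WebCore::SharedBuffer` (an assumption for illustration, not the real type):

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <utility>

// Sketch of the appendData()/fetch accumulation pattern in the patch:
// delegate callbacks append chunks in arrival order; fetching takes the
// accumulated bytes and resets the buffer for the next fragment run.
class RecordedDataAccumulator {
public:
    void appendData(const char* data, size_t size) {
        m_data.append(data, size);
    }
    std::string takeData() {
        return std::exchange(m_data, {});
    }

private:
    std::string m_data;
};
```

Note the real delegate callbacks may arrive off the main thread; the patch marshals them to the main thread (via `callOnMainThread` with a weak pointer to the writer) before appending, so the accumulator itself needs no locking.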
trunk/Source/WebCore/platform/mediarecorder/cocoa/VideoSampleBufferCompressor.h
r262707 r262708 1 1 /* 2 * Copyright (C) 20 18Apple Inc. All rights reserved.2 * Copyright (C) 2020 Apple Inc. All rights reserved. 3 3 * 4 4 * Redistribution and use in source and binary forms, with or without … … 25 25 #pragma once 26 26 27 #if ENABLE(MEDIA_STREAM) 27 #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION) 28 28 29 #include "MediaRecorderPrivate.h" 30 #include "MediaRecorderPrivateWriterCocoa.h" 29 #include <CoreMedia/CoreMedia.h> 30 #include <VideoToolbox/VTErrors.h> 31 32 typedef struct opaqueCMSampleBuffer *CMSampleBufferRef; 33 typedef struct OpaqueVTCompressionSession *VTCompressionSessionRef; 31 34 32 35 namespace WebCore { 33 36 34 class MediaStreamPrivate; 35 36 class MediaRecorderPrivateAVFImpl final 37 : public MediaRecorderPrivate { 37 class VideoSampleBufferCompressor { 38 38 WTF_MAKE_FAST_ALLOCATED; 39 39 public: 40 static std::unique_ptr<MediaRecorderPrivateAVFImpl> create(MediaStreamPrivate&); 41 ~MediaRecorderPrivateAVFImpl(); 40 static std::unique_ptr<VideoSampleBufferCompressor> create(CMVideoCodecType, CMBufferQueueTriggerCallback, void* callbackObject); 41 ~VideoSampleBufferCompressor(); 42 43 void finish(); 44 void addSampleBuffer(CMSampleBufferRef); 45 CMSampleBufferRef getOutputSampleBuffer(); 46 RetainPtr<CMSampleBufferRef> takeOutputSampleBuffer(); 42 47 43 48 private: 44 MediaRecorderPrivateAVFImpl(Ref<MediaRecorderPrivateWriter>&&, String&& audioTrackId, String&& videoTrackId);49 explicit VideoSampleBufferCompressor(CMVideoCodecType); 45 50 46 friend std::unique_ptr<MediaRecorderPrivateAVFImpl> std::make_unique<MediaRecorderPrivateAVFImpl>(Ref<MediaRecorderPrivateWriter>&&, String&&, String&&);51 bool initialize(CMBufferQueueTriggerCallback, void* callbackObject); 47 52 48 // MediaRecorderPrivate 49 void videoSampleAvailable(MediaSample&) final; 50 void fetchData(FetchDataCallback&&) final; 51 void audioSamplesAvailable(const WTF::MediaTime&, const PlatformAudioData&, const AudioStreamDescription&, size_t) final; 53 void 
processSampleBuffer(CMSampleBufferRef); 54 bool initCompressionSession(CMVideoFormatDescriptionRef); 52 55 53 const String& mimeType(); 54 void stopRecording(); 56 static void videoCompressionCallback(void *refCon, void*, OSStatus, VTEncodeInfoFlags, CMSampleBufferRef); 55 57 56 Ref<MediaRecorderPrivateWriter> m_writer; 57 String m_recordedAudioTrackID; 58 String m_recordedVideoTrackID; 58 dispatch_queue_t m_serialDispatchQueue; 59 RetainPtr<CMBufferQueueRef> m_outputBufferQueue; 60 RetainPtr<VTCompressionSessionRef> m_vtSession; 61 62 bool m_isEncoding { false }; 63 64 CMVideoCodecType m_outputCodecType; 65 float m_maxKeyFrameIntervalDuration { 2.0 }; 66 unsigned m_expectedFrameRate { 30 }; 59 67 }; 60 68 61 } // namespace WebCore69 } 62 70 63 #endif // ENABLE(MEDIA_STREAM)71 #endif -
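The compressor exposes its output through a `CMBufferQueue` with an installed trigger: each compressed sample enqueued fires the registered callback (`compressedVideoOutputBufferCallback` / `compressedAudioOutputBufferCallback` above) so the writer can drain the queue. A simplified analogue of that trigger mechanism, with `int` standing in for `CMSampleBufferRef` (assumed behavior, not the CoreMedia API itself):

```cpp
#include <cassert>
#include <deque>
#include <functional>
#include <optional>
#include <utility>

// Rough analogue of a CMBufferQueue with a "data became ready" trigger:
// enqueue() notifies the consumer, take() mirrors takeOutputSampleBuffer().
class TriggeredQueue {
public:
    explicit TriggeredQueue(std::function<void()> onEnqueue)
        : m_onEnqueue(std::move(onEnqueue)) { }

    void enqueue(int buffer) {
        m_queue.push_back(buffer);
        m_onEnqueue(); // the consumer drains as many buffers as it can
    }
    std::optional<int> take() {
        if (m_queue.empty())
            return std::nullopt;
        int front = m_queue.front();
        m_queue.pop_front();
        return front;
    }

private:
    std::deque<int> m_queue;
    std::function<void()> m_onEnqueue;
};
```

This matches the drain loop in `appendCompressedSampleBuffers()`, which keeps taking buffers until either queue is empty or the writer input stops accepting data.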
trunk/Source/WebCore/platform/network/cocoa/ResourceRequestCocoa.mm
r261153 r262708 37 37 #import <pal/spi/cf/CFNetworkSPI.h> 38 38 #import <wtf/FileSystem.h> 39 #import <wtf/cocoa/VectorCocoa.h> 39 40 #import <wtf/text/CString.h> 40 41 -
trunk/Source/WebCore/testing/Internals.cpp
r262695 r262708 574 574 RuntimeEnabledFeatures::sharedFeatures().setInterruptAudioOnPageVisibilityChangeEnabled(false); 575 575 WebCore::MediaRecorder::setCustomPrivateRecorderCreator(nullptr); 576 page.mediaRecorderProvider().setUseGPUProcess(true);577 576 #endif 578 577 -
trunk/Source/WebKit/ChangeLog
r262703 r262708 1 2020-06-08 youenn fablet <youenn@apple.com> 2 3 [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder 4 https://bugs.webkit.org/show_bug.cgi?id=206582 5 <rdar://problem/58985368> 6 7 Reviewed by Eric Carlson. 8 9 Enable RemoteMediaRecorder only for systems supporting AVAssetWriterDelegate. 10 11 * GPUProcess/GPUConnectionToWebProcess.cpp: 12 (WebKit::GPUConnectionToWebProcess::didReceiveMessage): 13 * GPUProcess/GPUConnectionToWebProcess.h: 14 * GPUProcess/webrtc/RemoteMediaRecorder.cpp: 15 * GPUProcess/webrtc/RemoteMediaRecorder.h: 16 * GPUProcess/webrtc/RemoteMediaRecorder.messages.in: 17 * GPUProcess/webrtc/RemoteMediaRecorderManager.cpp: 18 * GPUProcess/webrtc/RemoteMediaRecorderManager.h: 19 * GPUProcess/webrtc/RemoteMediaRecorderManager.messages.in: 20 * GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.h: 21 * WebProcess/GPU/webrtc/MediaRecorderPrivate.cpp: 22 * WebProcess/GPU/webrtc/MediaRecorderPrivate.h: 23 * WebProcess/GPU/webrtc/MediaRecorderProvider.cpp: 24 (WebKit::MediaRecorderProvider::createMediaRecorderPrivate): 25 1 26 2020-06-07 Lauro Moura <lmoura@igalia.com> 2 27 -
trunk/Source/WebKit/GPUProcess/GPUConnectionToWebProcess.cpp
r262695 r262708 229 229 } 230 230 231 #if HAVE(AVASSETWRITERDELEGATE) 231 232 RemoteMediaRecorderManager& GPUConnectionToWebProcess::mediaRecorderManager() 232 233 { … … 236 237 return *m_remoteMediaRecorderManager; 237 238 } 239 #endif 238 240 239 241 RemoteAudioMediaStreamTrackRendererManager& GPUConnectionToWebProcess::audioTrackRendererManager() … … 252 254 return *m_sampleBufferDisplayLayerManager; 253 255 } 254 #endif 256 #endif // PLATFORM(COCOA) && ENABLE(MEDIA_STREAM) 255 257 256 258 #if PLATFORM(COCOA) && USE(LIBWEBRTC) … … 370 372 return true; 371 373 } 374 #if HAVE(AVASSETWRITERDELEGATE) 372 375 if (decoder.messageReceiverName() == Messages::RemoteMediaRecorderManager::messageReceiverName()) { 373 376 mediaRecorderManager().didReceiveMessageFromWebProcess(connection, decoder); … … 378 381 return true; 379 382 } 383 #endif // HAVE(AVASSETWRITERDELEGATE) 380 384 if (decoder.messageReceiverName() == Messages::RemoteAudioMediaStreamTrackRendererManager::messageReceiverName()) { 381 385 audioTrackRendererManager().didReceiveMessageFromWebProcess(connection, decoder); … … 394 398 return true; 395 399 } 396 #endif 400 #endif // PLATFORM(COCOA) && ENABLE(MEDIA_STREAM) 397 401 #if PLATFORM(COCOA) && USE(LIBWEBRTC) 398 402 if (decoder.messageReceiverName() == Messages::LibWebRTCCodecsProxy::messageReceiverName()) { -
trunk/Source/WebKit/GPUProcess/GPUConnectionToWebProcess.h
r262695	r262708
  #if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
      UserMediaCaptureManagerProxy& userMediaCaptureManagerProxy();
+ #if HAVE(AVASSETWRITERDELEGATE)
      RemoteMediaRecorderManager& mediaRecorderManager();
+ #endif
      RemoteAudioMediaStreamTrackRendererManager& audioTrackRendererManager();
      RemoteSampleBufferDisplayLayerManager& sampleBufferDisplayLayerManager();
  …
  #if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
      std::unique_ptr<UserMediaCaptureManagerProxy> m_userMediaCaptureManagerProxy;
+ #if HAVE(AVASSETWRITERDELEGATE)
      std::unique_ptr<RemoteMediaRecorderManager> m_remoteMediaRecorderManager;
+ #endif
      std::unique_ptr<RemoteAudioMediaStreamTrackRendererManager> m_audioTrackRendererManager;
      std::unique_ptr<RemoteSampleBufferDisplayLayerManager> m_sampleBufferDisplayLayerManager;
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorder.cpp
r262663 r262708 27 27 #include "RemoteMediaRecorder.h" 28 28 29 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) 29 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) 30 30 31 31 #include "SharedRingBufferStorage.h" … … 136 136 } 137 137 138 #endif 138 #endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) -
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorder.h
r262663	r262708
  #pragma once

- #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+ #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

  #include "MediaRecorderIdentifier.h"
  …
  }

- #endif
+ #endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorder.messages.in
r262663	r262708
  # THE POSSIBILITY OF SUCH DAMAGE.

- #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+ #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

  messages -> RemoteMediaRecorder NotRefCounted {
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorderManager.cpp
r262663 r262708 27 27 #include "RemoteMediaRecorderManager.h" 28 28 29 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) 29 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) 30 30 31 31 #include "DataReference.h" -
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorderManager.h
r262663	r262708
  #pragma once

- #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+ #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

  #include "MediaRecorderIdentifier.h"
trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorderManager.messages.in
r262663	r262708
  # THE POSSIBILITY OF SUCH DAMAGE.

- #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+ #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

  messages -> RemoteMediaRecorderManager NotRefCounted {
trunk/Source/WebKit/GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.h
r262695	r262708
  #include "RemoteSampleBufferDisplayLayerManagerMessagesReplies.h"
  #include "SampleBufferDisplayLayerIdentifier.h"
+ #include <WebCore/IntSize.h>
  #include <wtf/HashMap.h>
trunk/Source/WebKit/WebProcess/GPU/webrtc/MediaRecorderPrivate.cpp
r262663 r262708 27 27 #include "MediaRecorderPrivate.h" 28 28 29 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) 29 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) 30 30 31 31 #include "DataReference.h" … … 136 136 } 137 137 138 #endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) 138 #endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE) -
trunk/Source/WebKit/WebProcess/GPU/webrtc/MediaRecorderPrivate.h
r262663	r262708
  #pragma once

- #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+ #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)

  #include "MediaRecorderIdentifier.h"
  …
  }

- #endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
+ #endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
trunk/Source/WebKit/WebProcess/GPU/webrtc/MediaRecorderProvider.cpp
r262663	r262708
  std::unique_ptr<WebCore::MediaRecorderPrivate> MediaRecorderProvider::createMediaRecorderPrivate(MediaStreamPrivate& stream)
  {
- #if ENABLE(GPU_PROCESS)
+ #if ENABLE(GPU_PROCESS) && HAVE(AVASSETWRITERDELEGATE)
      if (m_useGPUProcess)
          return makeUnique<MediaRecorderPrivate>(stream);