Changeset 262708 in webkit


Timestamp:
Jun 8, 2020 4:49:15 AM (4 years ago)
Author:
youenn@apple.com
Message:

[Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
https://bugs.webkit.org/show_bug.cgi?id=206582
Source/WebCore:

<rdar://problem/58985368>

Reviewed by Eric Carlson.

AVAssetWriterDelegate allows grabbing recorded data whenever it is wanted.
This delegate requires passing compressed samples to AVAssetWriter.
Implement video encoding and audio encoding in dedicated classes and use them to compress buffers before adding them to AVAssetWriter.
These classes are AudioSampleBufferCompressor and VideoSampleBufferCompressor.
They support AAC and H.264 so far and should be further improved to support more encoding options.
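
As a rough sketch of that flow (the names come from the function listings in this change; exact signatures are assumptions, and uncompressedAudioBuffer is a placeholder, not code quoted from the patch):

    // Hypothetical usage sketch: uncompressed CMSampleBuffers go into the compressor,
    // and the writer later pulls compressed buffers out and hands them to its
    // AVAssetWriterInput once the compressor's output trigger fires.
    m_audioCompressor = AudioSampleBufferCompressor::create(compressedAudioOutputBufferCallback, this);
    m_audioCompressor->addSampleBuffer(uncompressedAudioBuffer); // queued for AAC encoding

    // In the callback path (see appendCompressedAudioSampleBuffer below):
    if ([m_audioAssetWriterInput isReadyForMoreMediaData]) {
        if (auto buffer = m_audioCompressor->takeOutputSampleBuffer())
            [m_audioAssetWriterInput appendSampleBuffer:buffer.get()];
    }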

Instantiate the real writer only on platforms supporting AVAssetWriterDelegate, since it is not available everywhere.
The writer, which does the packaging, receives compressed buffers from the audio/video compressors.
When requested to flush, it sends data to its delegate, which forwards the data to the MediaRecorderPrivateWriter.
The MediaRecorderPrivateWriter stores the data in a SharedBuffer until MediaRecorder asks for it.

Note that, whenever we request data, we flush the writer and insert an end-of-video sample marker to make sure video data gets flushed.
Therefore data should not be requested too frequently, or video compression quality will suffer.
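
A sketch of the fetchData() path this describes, using the helper names from MediaRecorderPrivateWriterCocoa.h below (a simplified illustration; the real implementation's threading and ordering may differ):

    void MediaRecorderPrivateWriter::fetchData(CompletionHandler<void(RefPtr<SharedBuffer>&&)>&& completionHandler)
    {
        // Drain both compressors, then append the end-of-video duration marker so the
        // writer flushes its pending video fragment to the delegate.
        flushCompressedSampleBuffers([this, completionHandler = WTFMove(completionHandler)]() mutable {
            appendEndOfVideoSampleDurationIfNeeded([this, completionHandler = WTFMove(completionHandler)]() mutable {
                // The AVAssetWriterDelegate callbacks have appended any produced fragments
                // to m_data via appendData(); hand back the accumulated bytes.
                completionHandler(std::exchange(m_data, nullptr));
            });
        });
    }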

Covered by existing tests.

  • Modules/mediarecorder/MediaRecorderProvider.cpp:

(WebCore::MediaRecorderProvider::createMediaRecorderPrivate):

  • WebCore.xcodeproj/project.pbxproj:
  • platform/mediarecorder/MediaRecorderPrivateAVFImpl.cpp:

(WebCore::MediaRecorderPrivateAVFImpl::create):

  • platform/mediarecorder/MediaRecorderPrivateAVFImpl.h:
  • platform/mediarecorder/cocoa/AudioSampleBufferCompressor.h: Added.
  • platform/mediarecorder/cocoa/AudioSampleBufferCompressor.mm: Added.

(WebCore::AudioSampleBufferCompressor::create):
(WebCore::AudioSampleBufferCompressor::AudioSampleBufferCompressor):
(WebCore::AudioSampleBufferCompressor::~AudioSampleBufferCompressor):
(WebCore::AudioSampleBufferCompressor::initialize):
(WebCore::AudioSampleBufferCompressor::finish):
(WebCore::AudioSampleBufferCompressor::initAudioConverterForSourceFormatDescription):
(WebCore::AudioSampleBufferCompressor::computeBufferSizeForAudioFormat):
(WebCore::AudioSampleBufferCompressor::attachPrimingTrimsIfNeeded):
(WebCore::AudioSampleBufferCompressor::gradualDecoderRefreshCount):
(WebCore::AudioSampleBufferCompressor::sampleBufferWithNumPackets):
(WebCore::AudioSampleBufferCompressor::audioConverterComplexInputDataProc):
(WebCore::AudioSampleBufferCompressor::provideSourceDataNumOutputPackets):
(WebCore::AudioSampleBufferCompressor::processSampleBuffersUntilLowWaterTime):
(WebCore::AudioSampleBufferCompressor::processSampleBuffer):
(WebCore::AudioSampleBufferCompressor::addSampleBuffer):
(WebCore::AudioSampleBufferCompressor::getOutputSampleBuffer):
(WebCore::AudioSampleBufferCompressor::takeOutputSampleBuffer):

  • platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.h:
  • platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm:

(-[WebAVAssetWriterDelegate initWithWriter:]):
(-[WebAVAssetWriterDelegate assetWriter:didProduceFragmentedHeaderData:]):
(-[WebAVAssetWriterDelegate assetWriter:didProduceFragmentedMediaData:fragmentedMediaDataReport:]):
(-[WebAVAssetWriterDelegate close]):
(WebCore::MediaRecorderPrivateWriter::create):
(WebCore::MediaRecorderPrivateWriter::compressedVideoOutputBufferCallback):
(WebCore::MediaRecorderPrivateWriter::compressedAudioOutputBufferCallback):
(WebCore::MediaRecorderPrivateWriter::MediaRecorderPrivateWriter):
(WebCore::MediaRecorderPrivateWriter::~MediaRecorderPrivateWriter):
(WebCore::MediaRecorderPrivateWriter::initialize):
(WebCore::MediaRecorderPrivateWriter::processNewCompressedVideoSampleBuffers):
(WebCore::MediaRecorderPrivateWriter::processNewCompressedAudioSampleBuffers):
(WebCore::MediaRecorderPrivateWriter::startAssetWriter):
(WebCore::MediaRecorderPrivateWriter::appendCompressedAudioSampleBuffer):
(WebCore::MediaRecorderPrivateWriter::appendCompressedVideoSampleBuffer):
(WebCore::MediaRecorderPrivateWriter::appendCompressedSampleBuffers):
(WebCore::appendEndsPreviousSampleDurationMarker):
(WebCore::MediaRecorderPrivateWriter::appendEndOfVideoSampleDurationIfNeeded):
(WebCore::MediaRecorderPrivateWriter::flushCompressedSampleBuffers):
(WebCore::MediaRecorderPrivateWriter::clear):
(WebCore::copySampleBufferWithCurrentTimeStamp):
(WebCore::MediaRecorderPrivateWriter::appendVideoSampleBuffer):
(WebCore::createAudioFormatDescription):
(WebCore::createAudioSampleBuffer):
(WebCore::MediaRecorderPrivateWriter::appendAudioSampleBuffer):
(WebCore::MediaRecorderPrivateWriter::stopRecording):
(WebCore::MediaRecorderPrivateWriter::appendData):

  • platform/mediarecorder/cocoa/VideoSampleBufferCompressor.h: Copied from Source/WebCore/platform/mediarecorder/MediaRecorderPrivateAVFImpl.h.
  • platform/mediarecorder/cocoa/VideoSampleBufferCompressor.mm: Added.

(WebCore::VideoSampleBufferCompressor::create):
(WebCore::VideoSampleBufferCompressor::VideoSampleBufferCompressor):
(WebCore::VideoSampleBufferCompressor::~VideoSampleBufferCompressor):
(WebCore::VideoSampleBufferCompressor::initialize):
(WebCore::VideoSampleBufferCompressor::finish):
(WebCore::VideoSampleBufferCompressor::videoCompressionCallback):
(WebCore::VideoSampleBufferCompressor::initCompressionSession):
(WebCore::VideoSampleBufferCompressor::processSampleBuffer):
(WebCore::VideoSampleBufferCompressor::addSampleBuffer):
(WebCore::VideoSampleBufferCompressor::getOutputSampleBuffer):
(WebCore::VideoSampleBufferCompressor::takeOutputSampleBuffer):

Source/WebCore/PAL:

<rdar://problem/58985368>

Reviewed by Eric Carlson.

Add soft link macros for VideoToolbox and AudioToolbox.
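
The new files presumably follow the same soft-link pattern as the CoreMedia additions in this patch, pairing a header declaration with a source definition; for example, the CMBufferQueueMarkEndOfData entries added below:

    // pal/cf/CoreMediaSoftLink.h
    SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueMarkEndOfData, OSStatus, (CMBufferQueueRef queue), (queue))
    #define CMBufferQueueMarkEndOfData softLink_CoreMedia_CMBufferQueueMarkEndOfData

    // pal/cf/CoreMediaSoftLink.cpp
    SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueMarkEndOfData, OSStatus, (CMBufferQueueRef queue), (queue), PAL_EXPORT)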

  • PAL.xcodeproj/project.pbxproj:
  • pal/cf/AudioToolboxSoftLink.cpp: Added.
  • pal/cf/AudioToolboxSoftLink.h: Added.
  • pal/cf/CoreMediaSoftLink.cpp:
  • pal/cf/CoreMediaSoftLink.h:
  • pal/cf/VideoToolboxSoftLink.cpp: Added.
  • pal/cf/VideoToolboxSoftLink.h: Added.

Source/WebKit:

<rdar://problem/58985368>

Reviewed by Eric Carlson.

Enable RemoteMediaRecorder only for systems supporting AVAssetWriterDelegate.
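
The gating relies on the HAVE(AVASSETWRITERDELEGATE) macro added to wtf/PlatformHave.h in this patch; the WebCore provider change below illustrates the guard pattern:

    #if HAVE(AVASSETWRITERDELEGATE)
        return MediaRecorderPrivateAVFImpl::create(stream);
    #else
        UNUSED_PARAM(stream);
        return nullptr;
    #endif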

  • GPUProcess/GPUConnectionToWebProcess.cpp:

(WebKit::GPUConnectionToWebProcess::didReceiveMessage):

  • GPUProcess/GPUConnectionToWebProcess.h:
  • GPUProcess/webrtc/RemoteMediaRecorder.cpp:
  • GPUProcess/webrtc/RemoteMediaRecorder.h:
  • GPUProcess/webrtc/RemoteMediaRecorder.messages.in:
  • GPUProcess/webrtc/RemoteMediaRecorderManager.cpp:
  • GPUProcess/webrtc/RemoteMediaRecorderManager.h:
  • GPUProcess/webrtc/RemoteMediaRecorderManager.messages.in:
  • GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.h:
  • WebProcess/GPU/webrtc/MediaRecorderPrivate.cpp:
  • WebProcess/GPU/webrtc/MediaRecorderPrivate.h:
  • WebProcess/GPU/webrtc/MediaRecorderProvider.cpp:

(WebKit::MediaRecorderProvider::createMediaRecorderPrivate):

Source/WTF:

Reviewed by Eric Carlson.

  • wtf/PlatformHave.h:

LayoutTests:

Reviewed by Eric Carlson.

Disable tests on all platforms except the ones supporting AVAssetWriterDelegate.
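
Concretely (taken from the TestExpectations diffs below), the tests are skipped globally and re-enabled on Catalina and newer:

    # LayoutTests/TestExpectations
    http/wpt/mediarecorder [ Skip ]
    imported/w3c/web-platform-tests/mediacapture-record [ Skip ]
    fast/history/page-cache-media-recorder.html [ Skip ]

    # LayoutTests/platform/mac/TestExpectations
    [ Catalina+ ] http/wpt/mediarecorder [ Pass Failure ]
    [ Catalina+ ] imported/w3c/web-platform-tests/mediacapture-record [ Pass Failure ]
    [ Catalina+ ] fast/history/page-cache-media-recorder.html [ Pass Failure ]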

  • TestExpectations:
  • http/wpt/mediarecorder/MediaRecorder-AV-audio-video-dataavailable-gpuprocess.html:

Remove web audio generation since there seems to be some instability in the web audio -> stream -> media recorder path,
which should be fixed in follow-up patches.

  • platform/mac/TestExpectations:

Enable running tests.

Location:
trunk
Files:
7 added
33 edited
1 copied

  • trunk/LayoutTests/ChangeLog

    r262706 r262708  
     12020-06-08  youenn fablet  <youenn@apple.com>
     2
     3        [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
     4        https://bugs.webkit.org/show_bug.cgi?id=206582
     5
     6        Reviewed by Eric Carlson.
     7
     8        Disable tests on all platforms except the ones supporting AVAssetWriterDelegate.
     9
     10        * TestExpectations:
     11        * http/wpt/mediarecorder/MediaRecorder-AV-audio-video-dataavailable-gpuprocess.html:
     12        Remove web audio generation since there seems to be some instability in the web audio -> stream -> media recorder path,
     13        which should be fixed in follow-up patches.
     14        * platform/mac/TestExpectations:
     15        Enable running tests.
     16
    1172020-06-08  Diego Pino Garcia  <dpino@igalia.com>
    218
  • trunk/LayoutTests/TestExpectations

    r262669 r262708  
    33003300webgl/1.0.3/conformance/extensions/webgl-draw-buffers.html [ Skip ]
    33013301
    3302 webkit.org/b/197673 http/wpt/mediarecorder/MediaRecorder-AV-audio-video-dataavailable.html [ Pass Failure Timeout ]
     3302# Not supported by default
     3303http/wpt/mediarecorder [ Skip ]
     3304imported/w3c/web-platform-tests/mediacapture-record [ Skip ]
     3305fast/history/page-cache-media-recorder.html [ Skip ]
    33033306
    33043307# WebGL 2 Conformance Suite rules for regular bots post ANGLE backend adoption.
  • trunk/LayoutTests/http/wpt/mediarecorder/MediaRecorder-AV-audio-video-dataavailable-gpuprocess.html

    r262663 r262708  
    5757
    5858    async_test(t => {
    59         const ac = new AudioContext();
    60         const osc = ac.createOscillator();
    61         const dest = ac.createMediaStreamDestination();
    62         const audio = dest.stream;
    63         osc.connect(dest);
    64 
    6559        const video = createVideoStream();
    66         assert_equals(video.getAudioTracks().length, 0, "video mediastream starts with no audio track");
    67         assert_equals(audio.getAudioTracks().length, 1, "audio mediastream starts with one audio track");
    68         video.addTrack(audio.getAudioTracks()[0]);
    69         assert_equals(video.getAudioTracks().length, 1, "video mediastream starts with one audio track");
    7060        const recorder = new MediaRecorder(video);
    7161        let mode = 0;
  • trunk/LayoutTests/platform/mac/TestExpectations

    r262688 r262708  
    17091709[ Catalina+ ] fast/text/design-system-ui-16.html [ Pass ]
    17101710
     1711[ Catalina+ ] http/wpt/mediarecorder [ Pass Failure ]
     1712[ Catalina+ ] imported/w3c/web-platform-tests/mediacapture-record [ Pass Failure ]
     1713[ Catalina+ ] fast/history/page-cache-media-recorder.html [ Pass Failure ]
     1714
    17111715webkit.org/b/200128 imported/w3c/web-platform-tests/html/semantics/embedded-content/the-video-element/video_timeupdate_on_seek.html [ Timeout Pass ]
    17121716
  • trunk/Source/WTF/ChangeLog

    r262700 r262708  
     12020-06-08  youenn fablet  <youenn@apple.com>
     2
     3        [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
     4        https://bugs.webkit.org/show_bug.cgi?id=206582
     5
     6        Reviewed by Eric Carlson.
     7
     8        * wtf/PlatformHave.h:
     9
    1102020-06-07  Andy Estes  <aestes@apple.com>
    211
  • trunk/Source/WTF/wtf/PlatformHave.h

    r262700 r262708  
    543543#endif
    544544
     545#if ((PLATFORM(MAC) && __MAC_OS_X_VERSION_MIN_REQUIRED >= 101500) || PLATFORM(IOS)) && (defined __has_include && __has_include(<AVFoundation/AVAssetWriter_Private.h>))
     546#define HAVE_AVASSETWRITERDELEGATE 1
     547#endif
     548
    545549#if PLATFORM(IOS_FAMILY) && !PLATFORM(WATCHOS) && !PLATFORM(APPLETV)
    546550#define HAVE_SYSTEM_FONT_STYLE_TITLE_0 1
  • trunk/Source/WebCore/ChangeLog

    r262707 r262708  
     12020-06-08  youenn fablet  <youenn@apple.com>
     2
     3        [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
     4        https://bugs.webkit.org/show_bug.cgi?id=206582
     5        <rdar://problem/58985368>
     6
     7        Reviewed by Eric Carlson.
     8
      9        AVAssetWriterDelegate allows grabbing recorded data whenever it is wanted.
     10        This delegate requires passing compressed samples to AVAssetWriter.
     11        Implement video encoding and audio encoding in dedicated classes and use these classes before adding buffers to AVAssetWriter.
     12        These classes are AudioSampleBufferCompressor and VideoSampleBufferCompressor.
     13        They support AAC and H264 so far and should be further improved to support more encoding options.
     14
     15        Instantiate real writer only for platforms supporting AVAssetWriterDelegate, since it is not supported everywhere.
      16        The writer, which does the packaging, receives compressed buffers from the audio/video compressors.
      17        When requested to flush, it sends data to its delegate, which forwards the data to the MediaRecorderPrivateWriter.
     18        The MediaRecorderPrivateWriter stores the data in a SharedBuffer until MediaRecorder asks for data.
     19
      20        Note that, whenever we request data, we flush the writer and insert an end-of-video sample marker to make sure video data gets flushed.
      21        Therefore data should not be requested too frequently, or video compression quality will suffer.
     22
     23        Covered by existing tests.
     24
     25        * Modules/mediarecorder/MediaRecorderProvider.cpp:
     26        (WebCore::MediaRecorderProvider::createMediaRecorderPrivate):
     27        * WebCore.xcodeproj/project.pbxproj:
     28        * platform/mediarecorder/MediaRecorderPrivateAVFImpl.cpp:
     29        (WebCore::MediaRecorderPrivateAVFImpl::create):
     30        * platform/mediarecorder/MediaRecorderPrivateAVFImpl.h:
     31        * platform/mediarecorder/cocoa/AudioSampleBufferCompressor.h: Added.
     32        * platform/mediarecorder/cocoa/AudioSampleBufferCompressor.mm: Added.
     33        (WebCore::AudioSampleBufferCompressor::create):
     34        (WebCore::AudioSampleBufferCompressor::AudioSampleBufferCompressor):
     35        (WebCore::AudioSampleBufferCompressor::~AudioSampleBufferCompressor):
     36        (WebCore::AudioSampleBufferCompressor::initialize):
     37        (WebCore::AudioSampleBufferCompressor::finish):
     38        (WebCore::AudioSampleBufferCompressor::initAudioConverterForSourceFormatDescription):
     39        (WebCore::AudioSampleBufferCompressor::computeBufferSizeForAudioFormat):
     40        (WebCore::AudioSampleBufferCompressor::attachPrimingTrimsIfNeeded):
     41        (WebCore::AudioSampleBufferCompressor::gradualDecoderRefreshCount):
     42        (WebCore::AudioSampleBufferCompressor::sampleBufferWithNumPackets):
     43        (WebCore::AudioSampleBufferCompressor::audioConverterComplexInputDataProc):
     44        (WebCore::AudioSampleBufferCompressor::provideSourceDataNumOutputPackets):
     45        (WebCore::AudioSampleBufferCompressor::processSampleBuffersUntilLowWaterTime):
     46        (WebCore::AudioSampleBufferCompressor::processSampleBuffer):
     47        (WebCore::AudioSampleBufferCompressor::addSampleBuffer):
     48        (WebCore::AudioSampleBufferCompressor::getOutputSampleBuffer):
     49        (WebCore::AudioSampleBufferCompressor::takeOutputSampleBuffer):
     50        * platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.h:
     51        * platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm:
     52        (-[WebAVAssetWriterDelegate initWithWriter:]):
     53        (-[WebAVAssetWriterDelegate assetWriter:didProduceFragmentedHeaderData:]):
     54        (-[WebAVAssetWriterDelegate assetWriter:didProduceFragmentedMediaData:fragmentedMediaDataReport:]):
     55        (-[WebAVAssetWriterDelegate close]):
     56        (WebCore::MediaRecorderPrivateWriter::create):
     57        (WebCore::MediaRecorderPrivateWriter::compressedVideoOutputBufferCallback):
     58        (WebCore::MediaRecorderPrivateWriter::compressedAudioOutputBufferCallback):
     59        (WebCore::MediaRecorderPrivateWriter::MediaRecorderPrivateWriter):
     60        (WebCore::MediaRecorderPrivateWriter::~MediaRecorderPrivateWriter):
     61        (WebCore::MediaRecorderPrivateWriter::initialize):
     62        (WebCore::MediaRecorderPrivateWriter::processNewCompressedVideoSampleBuffers):
     63        (WebCore::MediaRecorderPrivateWriter::processNewCompressedAudioSampleBuffers):
     64        (WebCore::MediaRecorderPrivateWriter::startAssetWriter):
     65        (WebCore::MediaRecorderPrivateWriter::appendCompressedAudioSampleBuffer):
     66        (WebCore::MediaRecorderPrivateWriter::appendCompressedVideoSampleBuffer):
     67        (WebCore::MediaRecorderPrivateWriter::appendCompressedSampleBuffers):
     68        (WebCore::appendEndsPreviousSampleDurationMarker):
     69        (WebCore::MediaRecorderPrivateWriter::appendEndOfVideoSampleDurationIfNeeded):
     70        (WebCore::MediaRecorderPrivateWriter::flushCompressedSampleBuffers):
     71        (WebCore::MediaRecorderPrivateWriter::clear):
     72        (WebCore::copySampleBufferWithCurrentTimeStamp):
     73        (WebCore::MediaRecorderPrivateWriter::appendVideoSampleBuffer):
     74        (WebCore::createAudioFormatDescription):
     75        (WebCore::createAudioSampleBuffer):
     76        (WebCore::MediaRecorderPrivateWriter::appendAudioSampleBuffer):
     77        (WebCore::MediaRecorderPrivateWriter::stopRecording):
     78        (WebCore::MediaRecorderPrivateWriter::appendData):
     79        * platform/mediarecorder/cocoa/VideoSampleBufferCompressor.h: Copied from Source/WebCore/platform/mediarecorder/MediaRecorderPrivateAVFImpl.h.
     80        * platform/mediarecorder/cocoa/VideoSampleBufferCompressor.mm: Added.
     81        (WebCore::VideoSampleBufferCompressor::create):
     82        (WebCore::VideoSampleBufferCompressor::VideoSampleBufferCompressor):
     83        (WebCore::VideoSampleBufferCompressor::~VideoSampleBufferCompressor):
     84        (WebCore::VideoSampleBufferCompressor::initialize):
     85        (WebCore::VideoSampleBufferCompressor::finish):
     86        (WebCore::VideoSampleBufferCompressor::videoCompressionCallback):
     87        (WebCore::VideoSampleBufferCompressor::initCompressionSession):
     88        (WebCore::VideoSampleBufferCompressor::processSampleBuffer):
     89        (WebCore::VideoSampleBufferCompressor::addSampleBuffer):
     90        (WebCore::VideoSampleBufferCompressor::getOutputSampleBuffer):
     91        (WebCore::VideoSampleBufferCompressor::takeOutputSampleBuffer):
     92
    1932020-06-08  Youenn Fablet  <youenn@apple.com>
    294
  • trunk/Source/WebCore/Modules/mediarecorder/MediaRecorderProvider.cpp

    r262663 r262708  
    3535std::unique_ptr<MediaRecorderPrivate> MediaRecorderProvider::createMediaRecorderPrivate(MediaStreamPrivate& stream)
    3636{
     37#if HAVE(AVASSETWRITERDELEGATE)
    3738    return MediaRecorderPrivateAVFImpl::create(stream);
     39#else
     40    UNUSED_PARAM(stream);
     41    return nullptr;
     42#endif
    3843}
    3944
  • trunk/Source/WebCore/PAL/ChangeLog

    r262695 r262708  
     12020-06-08  youenn fablet  <youenn@apple.com>
     2
     3        [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
     4        https://bugs.webkit.org/show_bug.cgi?id=206582
     5        <rdar://problem/58985368>
     6
     7        Reviewed by Eric Carlson.
     8
     9        Add soft link macros for VideoToolbox and AudioToolbox.
     10
     11        * PAL.xcodeproj/project.pbxproj:
     12        * pal/cf/AudioToolboxSoftLink.cpp: Added.
     13        * pal/cf/AudioToolboxSoftLink.h: Added.
     14        * pal/cf/CoreMediaSoftLink.cpp:
     15        * pal/cf/CoreMediaSoftLink.h:
     16        * pal/cf/VideoToolboxSoftLink.cpp: Added.
     17        * pal/cf/VideoToolboxSoftLink.h: Added.
     18
    1192020-06-07  Philippe Normand  <pnormand@igalia.com>
    220
  • trunk/Source/WebCore/PAL/PAL.xcodeproj/project.pbxproj

    r262663 r262708  
    120120                2E1342CD215AA10A007199D2 /* UIKitSoftLink.mm in Sources */ = {isa = PBXBuildFile; fileRef = 2E1342CB215AA10A007199D2 /* UIKitSoftLink.mm */; };
    121121                31308B1420A21705003FB929 /* SystemPreviewSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 31308B1320A21705003FB929 /* SystemPreviewSPI.h */; };
     122                416E995323DAE6BE00E871CB /* AudioToolboxSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 416E995123DAE6BD00E871CB /* AudioToolboxSoftLink.cpp */; };
     123                416E995423DAE6BE00E871CB /* AudioToolboxSoftLink.h in Headers */ = {isa = PBXBuildFile; fileRef = 416E995223DAE6BE00E871CB /* AudioToolboxSoftLink.h */; };
     124                41E1F344248A6A000022D5DE /* VideoToolboxSoftLink.cpp in Sources */ = {isa = PBXBuildFile; fileRef = 416E995523DAEFF700E871CB /* VideoToolboxSoftLink.cpp */; };
    122125                442956CD218A72DF0080DB54 /* RevealSPI.h in Headers */ = {isa = PBXBuildFile; fileRef = 442956CC218A72DE0080DB54 /* RevealSPI.h */; };
    123126                4450FC9F21F5F602004DFA56 /* QuickLookSoftLink.mm in Sources */ = {isa = PBXBuildFile; fileRef = 4450FC9D21F5F602004DFA56 /* QuickLookSoftLink.mm */; };
     
    301304                31308B1320A21705003FB929 /* SystemPreviewSPI.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = SystemPreviewSPI.h; sourceTree = "<group>"; };
    302305                37119A7820CCB5FF002C6DC9 /* WebKitTargetConditionals.xcconfig */ = {isa = PBXFileReference; lastKnownFileType = text.xcconfig; path = WebKitTargetConditionals.xcconfig; sourceTree = "<group>"; };
     306                416E995123DAE6BD00E871CB /* AudioToolboxSoftLink.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = AudioToolboxSoftLink.cpp; sourceTree = "<group>"; };
     307                416E995223DAE6BE00E871CB /* AudioToolboxSoftLink.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioToolboxSoftLink.h; sourceTree = "<group>"; };
     308                416E995523DAEFF700E871CB /* VideoToolboxSoftLink.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = VideoToolboxSoftLink.cpp; sourceTree = "<group>"; };
     309                416E995623DAEFF700E871CB /* VideoToolboxSoftLink.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = VideoToolboxSoftLink.h; sourceTree = "<group>"; };
    303310                442956CC218A72DE0080DB54 /* RevealSPI.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = RevealSPI.h; sourceTree = "<group>"; };
    304311                4450FC9D21F5F602004DFA56 /* QuickLookSoftLink.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = QuickLookSoftLink.mm; sourceTree = "<group>"; };
     
    542549                        isa = PBXGroup;
    543550                        children = (
     551                                416E995123DAE6BD00E871CB /* AudioToolboxSoftLink.cpp */,
     552                                416E995223DAE6BE00E871CB /* AudioToolboxSoftLink.h */,
    544553                                0CF99CA61F738436007EE793 /* CoreMediaSoftLink.cpp */,
    545554                                0CF99CA71F738437007EE793 /* CoreMediaSoftLink.h */,
     555                                416E995523DAEFF700E871CB /* VideoToolboxSoftLink.cpp */,
     556                                416E995623DAEFF700E871CB /* VideoToolboxSoftLink.h */,
    546557                        );
    547558                        path = cf;
     
    736747                                57FD318B22B35989008D0E8B /* AppSSOSoftLink.h in Headers */,
    737748                                576CA9D622B854AB0030143C /* AppSSOSPI.h in Headers */,
     749                                416E995423DAE6BE00E871CB /* AudioToolboxSoftLink.h in Headers */,
    738750                                2D02E93C2056FAA700A13797 /* AudioToolboxSPI.h in Headers */,
    739751                                572A107822B456F500F410C8 /* AuthKitSPI.h in Headers */,
     
    954966                                293EE4A824154F8F0047493D /* AccessibilitySupportSoftLink.cpp in Sources */,
    955967                                57FD318A22B3593E008D0E8B /* AppSSOSoftLink.mm in Sources */,
     968                                416E995323DAE6BE00E871CB /* AudioToolboxSoftLink.cpp in Sources */,
    956969                                077E87B1226A460200A2AFF0 /* AVFoundationSoftLink.mm in Sources */,
    957970                                0C5FFF0F1F78D9DA009EFF1A /* ClockCM.mm in Sources */,
     
    968981                                5C7C787423AC3E770065F47E /* ManagedConfigurationSoftLink.mm in Sources */,
    969982                                0CF99CA41F736375007EE793 /* MediaTimeAVFoundation.cpp in Sources */,
     983                                41E1F344248A6A000022D5DE /* VideoToolboxSoftLink.cpp in Sources */,
    970984                                CDACB3602387425B0018D7CE /* MediaToolboxSoftLink.cpp in Sources */,
    971985                                A1F63CA021A4DBF7006FB43B /* PassKitSoftLink.mm in Sources */,
  • trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.cpp

    r262663 r262708  
    4747SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBlockBufferCopyDataBytes, OSStatus, (CMBlockBufferRef theSourceBuffer, size_t offsetToData, size_t dataLength, void* destination), (theSourceBuffer, offsetToData, dataLength, destination), PAL_EXPORT)
    4848SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBlockBufferGetDataLength, size_t, (CMBlockBufferRef theBuffer), (theBuffer), PAL_EXPORT)
     49SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBlockBufferReplaceDataBytes, OSStatus, (const void* sourceBytes, CMBlockBufferRef destinationBuffer, size_t offsetIntoDestination, size_t dataLength), (sourceBytes, destinationBuffer, offsetIntoDestination, dataLength), PAL_EXPORT)
    4950SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMFormatDescriptionGetExtensions, CFDictionaryRef, (CMFormatDescriptionRef desc), (desc), PAL_EXPORT)
    5051SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMSampleBufferGetTypeID, CFTypeID, (void), (), PAL_EXPORT)
     
    134135SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueIsEmpty, Boolean, (CMBufferQueueRef queue), (queue), PAL_EXPORT)
    135136SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueGetBufferCount, CMItemCount, (CMBufferQueueRef queue), (queue), PAL_EXPORT)
     137SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueGetCallbacksForUnsortedSampleBuffers, const CMBufferCallbacks *, (), (), PAL_EXPORT)
    136138SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue), PAL_EXPORT)
    137139SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueGetEndPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue), PAL_EXPORT)
    138140SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut), PAL_EXPORT)
     141SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBufferQueueMarkEndOfData, OSStatus, (CMBufferQueueRef queue), (queue), PAL_EXPORT)
    139142
    140143SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleAttachmentKey_DoNotDisplay, CFStringRef, PAL_EXPORT)
     
    150153SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleAttachmentKey_IsDependedOnByOthers, CFStringRef, PAL_EXPORT)
    151154SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleBufferConsumerNotification_BufferConsumed, CFStringRef, PAL_EXPORT)
     155SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration, CFStringRef, PAL_EXPORT)
     156SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMSampleBufferAttachmentKey_GradualDecoderRefresh, CFStringRef, PAL_EXPORT)
     157SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, get_CoreMedia_kCMSampleBufferAttachmentKey_TrimDurationAtStart, CFStringRef, PAL_EXPORT)
    152158
    153159SOFT_LINK_CONSTANT_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, kCMTimebaseNotification_EffectiveRateChanged, CFStringRef, PAL_EXPORT)
     
    164170SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMSampleBufferSetDataReady, OSStatus, (CMSampleBufferRef sbuf), (sbuf), PAL_EXPORT)
    165171SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMAudioFormatDescriptionCreate, OSStatus, (CFAllocatorRef allocator, const AudioStreamBasicDescription* asbd, size_t layoutSize, const AudioChannelLayout* layout, size_t magicCookieSize, const void* magicCookie, CFDictionaryRef extensions, CMAudioFormatDescriptionRef* outDesc), (allocator, asbd, layoutSize, layout, magicCookieSize, magicCookie, extensions, outDesc), PAL_EXPORT)
     172SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMAudioFormatDescriptionGetMagicCookie, const void*, (CMAudioFormatDescriptionRef desc, size_t* sizeOut), (desc, sizeOut), PAL_EXPORT)
     173SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMAudioFormatDescriptionGetRichestDecodableFormat, const AudioFormatListItem *, (CMAudioFormatDescriptionRef desc), (desc), PAL_EXPORT)
    166174SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMClockGetHostTimeClock, CMClockRef, (void), (), PAL_EXPORT)
    167175SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMClockGetTime, CMTime, (CMClockRef clock), (clock), PAL_EXPORT)
  • trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.h

    r262663 r262708  
    5050SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBlockBufferGetDataLength, size_t, (CMBlockBufferRef theBuffer), (theBuffer))
    5151#define CMBlockBufferGetDataLength softLink_CoreMedia_CMBlockBufferGetDataLength
     52SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBlockBufferReplaceDataBytes, OSStatus, (const void* sourceBytes, CMBlockBufferRef destinationBuffer, size_t offsetIntoDestination, size_t dataLength), (sourceBytes, destinationBuffer, offsetIntoDestination, dataLength))
     53#define CMBlockBufferReplaceDataBytes softLink_CoreMedia_CMBlockBufferReplaceDataBytes
    5254SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMFormatDescriptionGetExtensions, CFDictionaryRef, (CMFormatDescriptionRef desc), (desc))
    5355#define CMFormatDescriptionGetExtensions softLink_CoreMedia_CMFormatDescriptionGetExtensions
     
    226228SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueGetFirstPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
    227229#define CMBufferQueueGetFirstPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetFirstPresentationTimeStamp
     230SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueGetCallbacksForUnsortedSampleBuffers, const CMBufferCallbacks *, (), ())
     231#define CMBufferQueueGetCallbacksForUnsortedSampleBuffers softLink_CoreMedia_CMBufferQueueGetCallbacksForUnsortedSampleBuffers
    228232SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueGetEndPresentationTimeStamp, CMTime, (CMBufferQueueRef queue), (queue))
    229233#define CMBufferQueueGetEndPresentationTimeStamp softLink_CoreMedia_CMBufferQueueGetEndPresentationTimeStamp
    230234SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueInstallTriggerWithIntegerThreshold, OSStatus, (CMBufferQueueRef queue, CMBufferQueueTriggerCallback triggerCallback, void* triggerRefcon, CMBufferQueueTriggerCondition triggerCondition, CMItemCount triggerThreshold, CMBufferQueueTriggerToken* triggerTokenOut), (queue, triggerCallback, triggerRefcon, triggerCondition, triggerThreshold, triggerTokenOut))
    231235#define CMBufferQueueInstallTriggerWithIntegerThreshold softLink_CoreMedia_CMBufferQueueInstallTriggerWithIntegerThreshold
     236SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBufferQueueMarkEndOfData, OSStatus, (CMBufferQueueRef queue), (queue))
     237#define CMBufferQueueMarkEndOfData softLink_CoreMedia_CMBufferQueueMarkEndOfData
    232238
    233239SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleAttachmentKey_DoNotDisplay, CFStringRef)
     
    259265SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleBufferConsumerNotification_BufferConsumed, CFStringRef)
    260266#define kCMSampleBufferConsumerNotification_BufferConsumed get_CoreMedia_kCMSampleBufferConsumerNotification_BufferConsumed()
     267SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration, CFStringRef)
     268#define kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration get_CoreMedia_kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration()
     269SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, kCMSampleBufferAttachmentKey_GradualDecoderRefresh, CFStringRef)
     270#define kCMSampleBufferAttachmentKey_GradualDecoderRefresh get_CoreMedia_kCMSampleBufferAttachmentKey_GradualDecoderRefresh()
     271SOFT_LINK_CONSTANT_FOR_HEADER(PAL, CoreMedia, get_CoreMedia_kCMSampleBufferAttachmentKey_TrimDurationAtStart, CFStringRef)
     272#define get_CoreMedia_kCMSampleBufferAttachmentKey_TrimDurationAtStart get_CoreMedia_kCMSampleBufferAttachmentKey_TrimDurationAtStart()
     273
     274SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMAudioFormatDescriptionGetMagicCookie, const void*, (CMAudioFormatDescriptionRef desc, size_t* sizeOut), (desc, sizeOut))
     275#define CMAudioFormatDescriptionGetMagicCookie softLink_CoreMedia_CMAudioFormatDescriptionGetMagicCookie
    261276SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMAudioFormatDescriptionGetStreamBasicDescription, const AudioStreamBasicDescription *, (CMAudioFormatDescriptionRef desc), (desc))
    262277#define CMAudioFormatDescriptionGetStreamBasicDescription softLink_CoreMedia_CMAudioFormatDescriptionGetStreamBasicDescription
     
    267282SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMSampleBufferGetNumSamples, CMItemCount, (CMSampleBufferRef sbuf), (sbuf))
    268283#define CMSampleBufferGetNumSamples softLink_CoreMedia_CMSampleBufferGetNumSamples
     284SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMAudioFormatDescriptionGetRichestDecodableFormat, const AudioFormatListItem *, (CMAudioFormatDescriptionRef desc), (desc))
     285#define CMAudioFormatDescriptionGetRichestDecodableFormat softLink_CoreMedia_CMAudioFormatDescriptionGetRichestDecodableFormat
    269286SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMSampleBufferCopySampleBufferForRange, OSStatus, (CFAllocatorRef allocator, CMSampleBufferRef sbuf, CFRange sampleRange, CMSampleBufferRef* sBufOut), (allocator, sbuf, sampleRange, sBufOut))
    270287#define CMSampleBufferCopySampleBufferForRange softLink_CoreMedia_CMSampleBufferCopySampleBufferForRange
  • trunk/Source/WebCore/SourcesCocoa.txt

    r262682 r262708  
    493493
    494494platform/mediarecorder/MediaRecorderPrivateAVFImpl.cpp
     495platform/mediarecorder/cocoa/AudioSampleBufferCompressor.mm
    495496platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm
     497platform/mediarecorder/cocoa/VideoSampleBufferCompressor.mm
    496498
    497499platform/mediasession/mac/MediaSessionInterruptionProviderMac.mm
  • trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj

    r262695 r262708  
    75887588                41C7E1061E6A54360027B4DE /* CanvasCaptureMediaStreamTrack.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = CanvasCaptureMediaStreamTrack.h; sourceTree = "<group>"; };
    75897589                41C7E1081E6AA37C0027B4DE /* CanvasCaptureMediaStreamTrack.idl */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = CanvasCaptureMediaStreamTrack.idl; sourceTree = "<group>"; };
     7590                41CD6F8923D6E81C00B16421 /* VideoSampleBufferCompressor.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = VideoSampleBufferCompressor.h; sourceTree = "<group>"; };
     7591                41CD6F8B23D6E81D00B16421 /* VideoSampleBufferCompressor.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = VideoSampleBufferCompressor.mm; sourceTree = "<group>"; };
    75907592                41CF8BE41D46222000707DC9 /* FetchBodyConsumer.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = FetchBodyConsumer.cpp; sourceTree = "<group>"; };
    75917593                41CF8BE51D46222000707DC9 /* FetchBodyConsumer.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = FetchBodyConsumer.h; sourceTree = "<group>"; };
     
    76167618                41E1B1CB0FF5986900576B3B /* AbstractWorker.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AbstractWorker.h; sourceTree = "<group>"; };
    76177619                41E1B1CC0FF5986900576B3B /* AbstractWorker.idl */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = text; path = AbstractWorker.idl; sourceTree = "<group>"; };
     7620                41E1F33D248A62B60022D5DE /* AudioSampleBufferCompressor.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = AudioSampleBufferCompressor.mm; sourceTree = "<group>"; };
     7621                41E1F33F248A62B60022D5DE /* AudioSampleBufferCompressor.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = AudioSampleBufferCompressor.h; sourceTree = "<group>"; };
    76187622                41E408381DCB747900EFCE19 /* PeerConnectionBackend.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = PeerConnectionBackend.cpp; sourceTree = "<group>"; };
    76197623                41E593FD214865A900D3CB61 /* RTCPriorityType.idl */ = {isa = PBXFileReference; lastKnownFileType = text; path = RTCPriorityType.idl; sourceTree = "<group>"; };
     
    1935719361                        isa = PBXGroup;
    1935819362                        children = (
     19363                                41E1F33F248A62B60022D5DE /* AudioSampleBufferCompressor.h */,
     19364                                41E1F33D248A62B60022D5DE /* AudioSampleBufferCompressor.mm */,
    1935919365                                4D73F94C218C4A87003A3ED6 /* MediaRecorderPrivateWriterCocoa.h */,
    1936019366                                4D73F94D218C4A87003A3ED6 /* MediaRecorderPrivateWriterCocoa.mm */,
     19367                                41CD6F8923D6E81C00B16421 /* VideoSampleBufferCompressor.h */,
     19368                                41CD6F8B23D6E81D00B16421 /* VideoSampleBufferCompressor.mm */,
    1936119369                        );
    1936219370                        path = cocoa;
     
    3431334321                                CD0EEE0E14743F39003EAFA2 /* AudioDestinationIOS.cpp in Sources */,
    3431434322                                CD5596911475B678001D0BD0 /* AudioFileReaderIOS.cpp in Sources */,
    34315                                 41E1F343248A69D40022D5DE /* AudioSampleBufferCompressor.mm in Sources */,
    3431634323                                CDA79827170A279100D45C55 /* AudioSessionIOS.mm in Sources */,
    3431734324                                CD8A7BBB197735FE00CBD643 /* AudioSourceProviderAVFObjC.mm in Sources */,
     
    3503035037                                3FBC4AF3189881560046EE38 /* VideoFullscreenInterfaceAVKit.mm in Sources */,
    3503135038                                52D5A18F1C54592300DE34A3 /* VideoLayerManagerObjC.mm in Sources */,
    35032                                 41E1F342248A69D00022D5DE /* VideoSampleBufferCompressor.mm in Sources */,
    3503335039                                CD336F6717FA0AC600DDDCD0 /* VideoTrackPrivateAVFObjC.cpp in Sources */,
    3503435040                                CD8B5A42180D149A008B8E65 /* VideoTrackPrivateMediaSourceAVFObjC.mm in Sources */,
  • trunk/Source/WebCore/platform/mediarecorder/MediaRecorderPrivateAVFImpl.cpp

    r262663 r262708  
    2727#include "MediaRecorderPrivateAVFImpl.h"
    2828
    29 #if ENABLE(MEDIA_STREAM)
     29#if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    3030
    3131#include "AudioStreamDescription.h"
     32#include "MediaRecorderPrivateWriterCocoa.h"
    3233#include "MediaSample.h"
    3334#include "MediaStreamPrivate.h"
     
    113114} // namespace WebCore
    114115
    115 #endif // ENABLE(MEDIA_STREAM)
     116#endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
  • trunk/Source/WebCore/platform/mediarecorder/MediaRecorderPrivateAVFImpl.h

    r262663 r262708  
    2525#pragma once
    2626
    27 #if ENABLE(MEDIA_STREAM)
     27#if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    2828
    2929#include "MediaRecorderPrivate.h"
     
    6161} // namespace WebCore
    6262
    63 #endif // ENABLE(MEDIA_STREAM)
     63#endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
  • trunk/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.h

    r262663 r262708  
    2525#pragma once
    2626
    27 #if ENABLE(MEDIA_STREAM)
     27#if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
     28
     29#include "AudioStreamDescription.h"
    2830
    2931#include "SharedBuffer.h"
     
    3638#include <wtf/threads/BinarySemaphore.h>
    3739
     40#include <CoreAudio/CoreAudioTypes.h>
     41#include <CoreMedia/CMTime.h>
     42
    3843typedef struct opaqueCMSampleBuffer *CMSampleBufferRef;
     44typedef const struct opaqueCMFormatDescription* CMFormatDescriptionRef;
     45typedef struct opaqueCMBufferQueueTriggerToken *CMBufferQueueTriggerToken;
    3946
    4047OBJC_CLASS AVAssetWriter;
    4148OBJC_CLASS AVAssetWriterInput;
     49OBJC_CLASS WebAVAssetWriterDelegate;
    4250
    4351namespace WTF {
     
    4755namespace WebCore {
    4856
     57class AudioSampleBufferCompressor;
    4958class AudioStreamDescription;
    5059class MediaStreamTrackPrivate;
    5160class PlatformAudioData;
     61class VideoSampleBufferCompressor;
    5262
    53 class WEBCORE_EXPORT MediaRecorderPrivateWriter : public ThreadSafeRefCounted<MediaRecorderPrivateWriter, WTF::DestructionThread::Main>, public CanMakeWeakPtr<MediaRecorderPrivateWriter> {
     63class WEBCORE_EXPORT MediaRecorderPrivateWriter : public ThreadSafeRefCounted<MediaRecorderPrivateWriter, WTF::DestructionThread::Main>, public CanMakeWeakPtr<MediaRecorderPrivateWriter, WeakPtrFactoryInitialization::Eager> {
    5464public:
    5565    static RefPtr<MediaRecorderPrivateWriter> create(const MediaStreamTrackPrivate* audioTrack, const MediaStreamTrackPrivate* videoTrack);
    5666    static RefPtr<MediaRecorderPrivateWriter> create(bool hasAudio, int width, int height);
    5767    ~MediaRecorderPrivateWriter();
    58    
    59     bool setupWriter();
    60     bool setVideoInput(int width, int height);
    61     bool setAudioInput();
     68
    6269    void appendVideoSampleBuffer(CMSampleBufferRef);
    6370    void appendAudioSampleBuffer(const PlatformAudioData&, const AudioStreamDescription&, const WTF::MediaTime&, size_t);
     
    6572    void fetchData(CompletionHandler<void(RefPtr<SharedBuffer>&&)>&&);
    6673
     74    void appendData(const char*, size_t);
     75    void appendData(Ref<SharedBuffer>&&);
     76
    6777private:
    68     MediaRecorderPrivateWriter(RetainPtr<AVAssetWriter>&&, String&& path);
     78    MediaRecorderPrivateWriter(bool hasAudio, bool hasVideo);
    6979    void clear();
    7080
    71     RetainPtr<AVAssetWriter> m_writer;
    72     RetainPtr<AVAssetWriterInput> m_videoInput;
    73     RetainPtr<AVAssetWriterInput> m_audioInput;
     81    bool initialize();
    7482
    75     String m_path;
    76     Lock m_videoLock;
    77     Lock m_audioLock;
    78     BinarySemaphore m_finishWritingSemaphore;
    79     BinarySemaphore m_finishWritingAudioSemaphore;
    80     BinarySemaphore m_finishWritingVideoSemaphore;
     83    static void compressedVideoOutputBufferCallback(void*, CMBufferQueueTriggerToken);
     84    static void compressedAudioOutputBufferCallback(void*, CMBufferQueueTriggerToken);
     85
     86    void startAssetWriter();
     87    void appendCompressedSampleBuffers();
     88
     89    bool appendCompressedAudioSampleBuffer();
     90    bool appendCompressedVideoSampleBuffer();
     91
     92    void processNewCompressedAudioSampleBuffers();
     93    void processNewCompressedVideoSampleBuffers();
     94
     95    void flushCompressedSampleBuffers(CompletionHandler<void()>&&);
     96    void appendEndOfVideoSampleDurationIfNeeded(CompletionHandler<void()>&&);
     97
    8198    bool m_hasStartedWriting { false };
    8299    bool m_isStopped { false };
    83     bool m_isFirstAudioSample { true };
    84     dispatch_queue_t m_audioPullQueue;
    85     dispatch_queue_t m_videoPullQueue;
    86     Deque<RetainPtr<CMSampleBufferRef>> m_videoBufferPool;
    87     Deque<RetainPtr<CMSampleBufferRef>> m_audioBufferPool;
     100
     101    RetainPtr<AVAssetWriter> m_writer;
    88102
    89103    bool m_isStopping { false };
    90104    RefPtr<SharedBuffer> m_data;
    91105    CompletionHandler<void(RefPtr<SharedBuffer>&&)> m_fetchDataCompletionHandler;
     106
     107    bool m_hasAudio;
     108    bool m_hasVideo;
     109
     110    RetainPtr<CMFormatDescriptionRef> m_audioFormatDescription;
     111    std::unique_ptr<AudioSampleBufferCompressor> m_audioCompressor;
     112    RetainPtr<AVAssetWriterInput> m_audioAssetWriterInput;
     113
     114    RetainPtr<CMFormatDescriptionRef> m_videoFormatDescription;
     115    std::unique_ptr<VideoSampleBufferCompressor> m_videoCompressor;
     116    RetainPtr<AVAssetWriterInput> m_videoAssetWriterInput;
     117    CMTime m_lastVideoPresentationTime;
     118    CMTime m_lastVideoDecodingTime;
     119    bool m_hasEncodedVideoSamples { false };
     120
     121    RetainPtr<WebAVAssetWriterDelegate> m_writerDelegate;
    92122};
    93123
    94124} // namespace WebCore
    95125
    96 #endif // ENABLE(MEDIA_STREAM)
     126#endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
  • trunk/Source/WebCore/platform/mediarecorder/cocoa/MediaRecorderPrivateWriterCocoa.mm

    r262663 r262708  
    2424 */
    2525
    26 #import "config.h"
    27 #import "MediaRecorderPrivateWriterCocoa.h"
    28 
    29 #if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
    30 
    31 #import "AudioStreamDescription.h"
    32 #import "Logging.h"
    33 #import "MediaStreamTrackPrivate.h"
    34 #import "WebAudioBufferList.h"
    35 #import <AVFoundation/AVAssetWriter.h>
    36 #import <AVFoundation/AVAssetWriterInput.h>
    37 #import <wtf/CompletionHandler.h>
    38 #import <wtf/FileSystem.h>
    39 
    40 #import <pal/cf/CoreMediaSoftLink.h>
    41 #import <pal/cocoa/AVFoundationSoftLink.h>
    42 
    43 #undef AVEncoderBitRateKey
    44 #define AVEncoderBitRateKey getAVEncoderBitRateKeyWithFallback()
    45 #undef AVFormatIDKey
    46 #define AVFormatIDKey getAVFormatIDKeyWithFallback()
    47 #undef AVNumberOfChannelsKey
    48 #define AVNumberOfChannelsKey getAVNumberOfChannelsKeyWithFallback()
    49 #undef AVSampleRateKey
    50 #define AVSampleRateKey getAVSampleRateKeyWithFallback()
     26#include "config.h"
     27#include "MediaRecorderPrivateWriterCocoa.h"
     28
     29#if ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
     30
     31#include "AudioSampleBufferCompressor.h"
     32#include "AudioStreamDescription.h"
     33#include "Logging.h"
     34#include "MediaStreamTrackPrivate.h"
     35#include "VideoSampleBufferCompressor.h"
     36#include "WebAudioBufferList.h"
     37#include <AVFoundation/AVAssetWriter.h>
     38#include <AVFoundation/AVAssetWriterInput.h>
     39#include <AVFoundation/AVAssetWriter_Private.h>
     40#include <pal/avfoundation/MediaTimeAVFoundation.h>
     41#include <wtf/BlockPtr.h>
     42#include <wtf/CompletionHandler.h>
     43#include <wtf/FileSystem.h>
     44#include <wtf/cf/TypeCastsCF.h>
     45
     46#include <pal/cf/CoreMediaSoftLink.h>
     47#include <pal/cocoa/AVFoundationSoftLink.h>
     48
     49@interface WebAVAssetWriterDelegate : NSObject <AVAssetWriterDelegate> {
     50    WeakPtr<WebCore::MediaRecorderPrivateWriter> m_writer;
     51}
     52
     53- (instancetype)initWithWriter:(WebCore::MediaRecorderPrivateWriter*)writer;
     54- (void)close;
     55
     56@end
     57
     58@implementation WebAVAssetWriterDelegate {
     59};
     60
     61- (instancetype)initWithWriter:(WebCore::MediaRecorderPrivateWriter*)writer
     62{
     63    ASSERT(isMainThread());
     64    self = [super init];
     65    if (self)
     66        self->m_writer = makeWeakPtr(writer);
     67
     68    return self;
     69}
     70
     71- (void)assetWriter:(AVAssetWriter *)assetWriter didProduceFragmentedHeaderData:(NSData *)fragmentedHeaderData
     72{
     73    UNUSED_PARAM(assetWriter);
     74    if (!isMainThread()) {
     75        if (auto size = [fragmentedHeaderData length]) {
     76            callOnMainThread([protectedSelf = RetainPtr<WebAVAssetWriterDelegate>(self), buffer = WebCore::SharedBuffer::create(static_cast<const char*>([fragmentedHeaderData bytes]), size)]() mutable {
     77                if (protectedSelf->m_writer)
     78                    protectedSelf->m_writer->appendData(WTFMove(buffer));
     79            });
     80        }
     81        return;
     82    }
     83
     84    if (m_writer)
     85        m_writer->appendData(static_cast<const char*>([fragmentedHeaderData bytes]), [fragmentedHeaderData length]);
     86}
     87
     88- (void)assetWriter:(AVAssetWriter *)assetWriter didProduceFragmentedMediaData:(NSData *)fragmentedMediaData fragmentedMediaDataReport:(AVFragmentedMediaDataReport *)fragmentedMediaDataReport
     89{
     90    UNUSED_PARAM(assetWriter);
     91    UNUSED_PARAM(fragmentedMediaDataReport);
     92    if (!isMainThread()) {
     93        if (auto size = [fragmentedMediaData length]) {
     94            callOnMainThread([protectedSelf = RetainPtr<WebAVAssetWriterDelegate>(self), buffer = WebCore::SharedBuffer::create(static_cast<const char*>([fragmentedMediaData bytes]), size)]() mutable {
     95                if (protectedSelf->m_writer)
     96                    protectedSelf->m_writer->appendData(WTFMove(buffer));
     97            });
     98        }
     99        return;
     100    }
     101
     102    if (m_writer)
     103        m_writer->appendData(static_cast<const char*>([fragmentedMediaData bytes]), [fragmentedMediaData length]);
     104}
     105
     106- (void)close
     107{
     108    m_writer = nullptr;
     109}
     110
     111@end
    51112
    52113namespace WebCore {
     
    54115using namespace PAL;
    55116
    56 static NSString *getAVFormatIDKeyWithFallback()
    57 {
    58     if (PAL::canLoad_AVFoundation_AVFormatIDKey())
    59         return PAL::get_AVFoundation_AVFormatIDKey();
    60 
    61     RELEASE_LOG_ERROR(Media, "Failed to load AVFormatIDKey");
    62     return @"AVFormatIDKey";
    63 }
    64 
    65 static NSString *getAVNumberOfChannelsKeyWithFallback()
    66 {
    67     if (PAL::canLoad_AVFoundation_AVNumberOfChannelsKey())
    68         return PAL::get_AVFoundation_AVNumberOfChannelsKey();
    69 
    70     RELEASE_LOG_ERROR(Media, "Failed to load AVNumberOfChannelsKey");
    71     return @"AVNumberOfChannelsKey";
    72 }
    73 
    74 static NSString *getAVSampleRateKeyWithFallback()
    75 {
    76     if (PAL::canLoad_AVFoundation_AVSampleRateKey())
    77         return PAL::get_AVFoundation_AVSampleRateKey();
    78 
    79     RELEASE_LOG_ERROR(Media, "Failed to load AVSampleRateKey");
    80     return @"AVSampleRateKey";
    81 }
    82 
    83 static NSString *getAVEncoderBitRateKeyWithFallback()
    84 {
    85     if (PAL::canLoad_AVFoundation_AVEncoderBitRateKey())
    86         return PAL::get_AVFoundation_AVEncoderBitRateKey();
    87 
    88     RELEASE_LOG_ERROR(Media, "Failed to load AVEncoderBitRateKey");
    89     return @"AVEncoderBitRateKey";
     117RefPtr<MediaRecorderPrivateWriter> MediaRecorderPrivateWriter::create(bool hasAudio, int width, int height)
     118{
     119    auto writer = adoptRef(*new MediaRecorderPrivateWriter(hasAudio, width && height));
     120    if (!writer->initialize())
     121        return nullptr;
     122    return writer;
    90123}
    91124
     
    101134}
    102135
    103 RefPtr<MediaRecorderPrivateWriter> MediaRecorderPrivateWriter::create(bool hasAudio, int width, int height)
    104 {
    105     NSString *directory = FileSystem::createTemporaryDirectory(@"videos");
    106     NSString *filename = [NSString stringWithFormat:@"/%lld.mp4", CMClockGetTime(CMClockGetHostTimeClock()).value];
    107     NSString *path = [directory stringByAppendingString:filename];
    108 
    109     NSURL *outputURL = [NSURL fileURLWithPath:path];
    110     String filePath = [path UTF8String];
     136void MediaRecorderPrivateWriter::compressedVideoOutputBufferCallback(void *mediaRecorderPrivateWriter, CMBufferQueueTriggerToken)
     137{
     138    auto *writer = static_cast<MediaRecorderPrivateWriter*>(mediaRecorderPrivateWriter);
     139    writer->processNewCompressedVideoSampleBuffers();
     140}
     141
     142void MediaRecorderPrivateWriter::compressedAudioOutputBufferCallback(void *mediaRecorderPrivateWriter, CMBufferQueueTriggerToken)
     143{
     144    auto *writer = static_cast<MediaRecorderPrivateWriter*>(mediaRecorderPrivateWriter);
     145    writer->processNewCompressedAudioSampleBuffers();
     146}
     147
     148MediaRecorderPrivateWriter::MediaRecorderPrivateWriter(bool hasAudio, bool hasVideo)
     149    : m_hasAudio(hasAudio)
     150    , m_hasVideo(hasVideo)
     151{
     152}
     153
     154MediaRecorderPrivateWriter::~MediaRecorderPrivateWriter()
     155{
     156    clear();
     157}
     158
     159bool MediaRecorderPrivateWriter::initialize()
     160{
    111161    NSError *error = nil;
    112     auto avAssetWriter = adoptNS([PAL::allocAVAssetWriterInstance() initWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error]);
     162    ALLOW_DEPRECATED_DECLARATIONS_BEGIN
     163    m_writer = adoptNS([PAL::allocAVAssetWriterInstance() initWithFileType:AVFileTypeMPEG4 error:&error]);
     164    ALLOW_DEPRECATED_DECLARATIONS_END
    113165    if (error) {
    114166        RELEASE_LOG_ERROR(MediaStream, "create AVAssetWriter instance failed with error code %ld", (long)error.code);
    115         return nullptr;
    116     }
    117 
    118     auto writer = adoptRef(*new MediaRecorderPrivateWriter(WTFMove(avAssetWriter), WTFMove(filePath)));
    119 
    120     if (hasAudio && !writer->setAudioInput())
    121         return nullptr;
    122 
    123     if (width && height) {
    124         if (!writer->setVideoInput(width, height))
    125             return nullptr;
    126     }
    127 
    128     return WTFMove(writer);
    129 }
    130 
    131 MediaRecorderPrivateWriter::MediaRecorderPrivateWriter(RetainPtr<AVAssetWriter>&& avAssetWriter, String&& filePath)
    132     : m_writer(WTFMove(avAssetWriter))
    133     , m_path(WTFMove(filePath))
    134 {
    135 }
    136 
    137 MediaRecorderPrivateWriter::~MediaRecorderPrivateWriter()
    138 {
    139     clear();
     167        return false;
     168    }
     169
     170    m_writerDelegate = adoptNS([[WebAVAssetWriterDelegate alloc] initWithWriter: this]);
     171    [m_writer.get() setDelegate:m_writerDelegate.get()];
     172
     173    if (m_hasAudio) {
     174        m_audioCompressor = AudioSampleBufferCompressor::create(compressedAudioOutputBufferCallback, this);
     175        if (!m_audioCompressor)
     176            return false;
     177    }
     178    if (m_hasVideo) {
     179        m_videoCompressor = VideoSampleBufferCompressor::create(kCMVideoCodecType_H264, compressedVideoOutputBufferCallback, this);
     180        if (!m_videoCompressor)
     181            return false;
     182    }
     183    return true;
     184}
     185
     186void MediaRecorderPrivateWriter::processNewCompressedVideoSampleBuffers()
     187{
     188    ASSERT(m_hasVideo);
     189    if (!m_videoFormatDescription) {
     190        m_videoFormatDescription = CMSampleBufferGetFormatDescription(m_videoCompressor->getOutputSampleBuffer());
     191        callOnMainThread([weakThis = makeWeakPtr(this), this] {
     192            if (!weakThis)
     193                return;
     194
     195            if (m_hasAudio && !m_audioFormatDescription)
     196                return;
     197
     198            startAssetWriter();
     199        });
     200    }
     201    if (!m_hasStartedWriting)
     202        return;
     203    appendCompressedSampleBuffers();
     204}
     205
     206void MediaRecorderPrivateWriter::processNewCompressedAudioSampleBuffers()
     207{
     208    ASSERT(m_hasAudio);
     209    if (!m_audioFormatDescription) {
     210        m_audioFormatDescription = CMSampleBufferGetFormatDescription(m_audioCompressor->getOutputSampleBuffer());
     211        callOnMainThread([weakThis = makeWeakPtr(this), this] {
     212            if (!weakThis)
     213                return;
     214
     215            if (m_hasVideo && !m_videoFormatDescription)
     216                return;
     217
     218            startAssetWriter();
     219        });
     220    }
     221    if (!m_hasStartedWriting)
     222        return;
     223    appendCompressedSampleBuffers();
     224}
     225
     226void MediaRecorderPrivateWriter::startAssetWriter()
     227{
     228    if (m_hasVideo) {
     229        m_videoAssetWriterInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeVideo outputSettings:nil sourceFormatHint:m_videoFormatDescription.get()]);
     230        [m_videoAssetWriterInput setExpectsMediaDataInRealTime:true];
     231        if (![m_writer.get() canAddInput:m_videoAssetWriterInput.get()]) {
     232            RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter::startAssetWriter failed canAddInput for video");
     233            return;
     234        }
     235        [m_writer.get() addInput:m_videoAssetWriterInput.get()];
     236    }
     237
     238    if (m_hasAudio) {
     239        m_audioAssetWriterInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeAudio outputSettings:nil sourceFormatHint:m_audioFormatDescription.get()]);
     240        [m_audioAssetWriterInput setExpectsMediaDataInRealTime:true];
     241        if (![m_writer.get() canAddInput:m_audioAssetWriterInput.get()]) {
     242            RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter::startAssetWriter failed canAddInput for audio");
     243            return;
     244        }
     245        [m_writer.get() addInput:m_audioAssetWriterInput.get()];
     246    }
     247
     248    if (![m_writer.get() startWriting]) {
     249        RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter::startAssetWriter failed startWriting");
     250        return;
     251    }
     252
     253    [m_writer.get() startSessionAtSourceTime:kCMTimeZero];
     254
     255    appendCompressedSampleBuffers();
     256
     257    m_hasStartedWriting = true;
     258}
     259
     260bool MediaRecorderPrivateWriter::appendCompressedAudioSampleBuffer()
     261{
     262    if (!m_audioCompressor)
     263        return false;
     264
     265    if (![m_audioAssetWriterInput isReadyForMoreMediaData])
     266        return false;
     267
     268    auto buffer = m_audioCompressor->takeOutputSampleBuffer();
     269    if (!buffer)
     270        return false;
     271
     272    [m_audioAssetWriterInput.get() appendSampleBuffer:buffer.get()];
     273    return true;
     274}
     275
     276bool MediaRecorderPrivateWriter::appendCompressedVideoSampleBuffer()
     277{
     278    if (!m_videoCompressor)
     279        return false;
     280
     281    if (![m_videoAssetWriterInput isReadyForMoreMediaData])
     282        return false;
     283
     284    auto buffer = m_videoCompressor->takeOutputSampleBuffer();
     285    if (!buffer)
     286        return false;
     287
     288    m_lastVideoPresentationTime = CMSampleBufferGetPresentationTimeStamp(buffer.get());
     289    m_lastVideoDecodingTime = CMSampleBufferGetDecodeTimeStamp(buffer.get());
     290    m_hasEncodedVideoSamples = true;
     291
     292    [m_videoAssetWriterInput.get() appendSampleBuffer:buffer.get()];
     293    return true;
     294}
     295
     296void MediaRecorderPrivateWriter::appendCompressedSampleBuffers()
     297{
     298    while (appendCompressedVideoSampleBuffer() || appendCompressedAudioSampleBuffer()) { };
     299}
     300
     301static inline void appendEndsPreviousSampleDurationMarker(AVAssetWriterInput *assetWriterInput, CMTime presentationTimeStamp, CMTime decodingTimeStamp)
     302{
     303    CMSampleTimingInfo timingInfo = { kCMTimeInvalid, presentationTimeStamp, decodingTimeStamp};
     304
     305    CMSampleBufferRef buffer = NULL;
     306    auto error = CMSampleBufferCreate(kCFAllocatorDefault, NULL, true, NULL, NULL, NULL, 0, 1, &timingInfo, 0, NULL, &buffer);
     307    if (error) {
     308        RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter appendEndsPreviousSampleDurationMarker failed CMSampleBufferCreate with %d", error);
     309        return;
     310    }
     311    auto sampleBuffer = adoptCF(buffer);
     312
     313    CMSetAttachment(sampleBuffer.get(), kCMSampleBufferAttachmentKey_EndsPreviousSampleDuration, kCFBooleanTrue, kCMAttachmentMode_ShouldPropagate);
     314    if (![assetWriterInput appendSampleBuffer:sampleBuffer.get()])
     315        RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter appendSampleBuffer to writer input failed");
     316}
     317
     318void MediaRecorderPrivateWriter::appendEndOfVideoSampleDurationIfNeeded(CompletionHandler<void()>&& completionHandler)
     319{
     320    if (!m_hasEncodedVideoSamples) {
     321        completionHandler();
     322        return;
     323    }
     324    if ([m_videoAssetWriterInput isReadyForMoreMediaData]) {
     325        appendEndsPreviousSampleDurationMarker(m_videoAssetWriterInput.get(), m_lastVideoPresentationTime, m_lastVideoDecodingTime);
     326        completionHandler();
     327        return;
     328    }
     329
     330    auto block = makeBlockPtr([this, weakThis = makeWeakPtr(this), completionHandler = WTFMove(completionHandler)]() mutable {
     331        if (weakThis) {
     332            appendEndsPreviousSampleDurationMarker(m_videoAssetWriterInput.get(), m_lastVideoPresentationTime, m_lastVideoDecodingTime);
     333            [m_videoAssetWriterInput markAsFinished];
     334        }
     335        completionHandler();
     336    });
     337    [m_videoAssetWriterInput requestMediaDataWhenReadyOnQueue:dispatch_get_main_queue() usingBlock:block.get()];
     338}
     339
     340void MediaRecorderPrivateWriter::flushCompressedSampleBuffers(CompletionHandler<void()>&& completionHandler)
     341{
     342    appendCompressedSampleBuffers();
     343    appendEndOfVideoSampleDurationIfNeeded(WTFMove(completionHandler));
    140344}
    141345
    142346void MediaRecorderPrivateWriter::clear()
    143347{
    144     if (m_videoInput) {
    145         m_videoInput.clear();
    146         dispatch_release(m_videoPullQueue);
    147     }
    148     if (m_audioInput) {
    149         m_audioInput.clear();
    150         dispatch_release(m_audioPullQueue);
    151     }
    152348    if (m_writer)
    153349        m_writer.clear();
     
    158354}
    159355
    160 bool MediaRecorderPrivateWriter::setVideoInput(int width, int height)
    161 {
    162     ASSERT(!m_videoInput);
    163    
    164     NSDictionary *compressionProperties = @{
    165         AVVideoAverageBitRateKey : @(width * height * 12),
    166         AVVideoExpectedSourceFrameRateKey : @(30),
    167         AVVideoMaxKeyFrameIntervalKey : @(120),
    168         AVVideoProfileLevelKey : AVVideoProfileLevelH264MainAutoLevel
    169     };
    170 
    171     NSDictionary *videoSettings = @{
    172         AVVideoCodecKey: AVVideoCodecH264,
    173         AVVideoWidthKey: @(width),
    174         AVVideoHeightKey: @(height),
    175         AVVideoCompressionPropertiesKey: compressionProperties
    176     };
    177    
    178     m_videoInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeVideo outputSettings:videoSettings sourceFormatHint:nil]);
    179     [m_videoInput setExpectsMediaDataInRealTime:true];
    180    
    181     if (![m_writer canAddInput:m_videoInput.get()]) {
    182         m_videoInput = nullptr;
    183         RELEASE_LOG_ERROR(MediaStream, "the video input is not allowed to add to the AVAssetWriter");
    184         return false;
    185     }
    186     [m_writer addInput:m_videoInput.get()];
    187     m_videoPullQueue = dispatch_queue_create("WebCoreVideoRecordingPullBufferQueue", DISPATCH_QUEUE_SERIAL);
    188     return true;
    189 }
    190 
    191 bool MediaRecorderPrivateWriter::setAudioInput()
    192 {
    193     ASSERT(!m_audioInput);
    194 
    195     NSDictionary *audioSettings = @{
    196         AVEncoderBitRateKey : @(28000),
    197         AVFormatIDKey : @(kAudioFormatMPEG4AAC),
    198         AVNumberOfChannelsKey : @(1),
    199         AVSampleRateKey : @(22050)
    200     };
    201 
    202     m_audioInput = adoptNS([PAL::allocAVAssetWriterInputInstance() initWithMediaType:AVMediaTypeAudio outputSettings:audioSettings sourceFormatHint:nil]);
    203     [m_audioInput setExpectsMediaDataInRealTime:true];
    204    
    205     if (![m_writer canAddInput:m_audioInput.get()]) {
    206         m_audioInput = nullptr;
    207         RELEASE_LOG_ERROR(MediaStream, "the audio input is not allowed to add to the AVAssetWriter");
    208         return false;
    209     }
    210     [m_writer addInput:m_audioInput.get()];
    211     m_audioPullQueue = dispatch_queue_create("WebCoreAudioRecordingPullBufferQueue", DISPATCH_QUEUE_SERIAL);
    212     return true;
    213 }
    214356
    215357static inline RetainPtr<CMSampleBufferRef> copySampleBufferWithCurrentTimeStamp(CMSampleBufferRef originalBuffer)
     
    218360    CMItemCount count = 0;
    219361    CMSampleBufferGetSampleTimingInfoArray(originalBuffer, 0, nil, &count);
    220    
     362
    221363    Vector<CMSampleTimingInfo> timeInfo(count);
    222364    CMSampleBufferGetSampleTimingInfoArray(originalBuffer, count, timeInfo.data(), &count);
    223    
    224     for (CMItemCount i = 0; i < count; i++) {
     365
     366    for (auto i = 0; i < count; i++) {
    225367        timeInfo[i].decodeTimeStamp = kCMTimeInvalid;
    226368        timeInfo[i].presentationTimeStamp = startTime;
    227369    }
    228    
     370
    229371    CMSampleBufferRef newBuffer = nullptr;
    230     auto error = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, originalBuffer, count, timeInfo.data(), &newBuffer);
    231     if (error)
     372    if (auto error = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, originalBuffer, count, timeInfo.data(), &newBuffer)) {
     373        RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter CMSampleBufferCreateCopyWithNewTiming failed with %d", error);
    232374        return nullptr;
     375    }
    233376    return adoptCF(newBuffer);
    234377}
     
    236379void MediaRecorderPrivateWriter::appendVideoSampleBuffer(CMSampleBufferRef sampleBuffer)
    237380{
    238     ASSERT(m_videoInput);
    239     if (m_isStopped)
    240         return;
    241 
    242     if (!m_hasStartedWriting) {
    243         if (![m_writer startWriting]) {
    244             m_isStopped = true;
    245             RELEASE_LOG_ERROR(MediaStream, "create AVAssetWriter instance failed with error code %ld", (long)[m_writer error]);
    246             return;
    247         }
    248         [m_writer startSessionAtSourceTime:CMClockGetTime(CMClockGetHostTimeClock())];
    249         m_hasStartedWriting = true;
    250         RefPtr<MediaRecorderPrivateWriter> protectedThis = this;
    251         [m_videoInput requestMediaDataWhenReadyOnQueue:m_videoPullQueue usingBlock:[this, protectedThis = WTFMove(protectedThis)] {
    252             do {
    253                 if (![m_videoInput isReadyForMoreMediaData])
    254                     break;
    255                 auto locker = holdLock(m_videoLock);
    256                 if (m_videoBufferPool.isEmpty())
    257                     break;
    258                 auto buffer = m_videoBufferPool.takeFirst();
    259                 locker.unlockEarly();
    260                 if (![m_videoInput appendSampleBuffer:buffer.get()])
    261                     break;
    262             } while (true);
    263             if (m_isStopped && m_videoBufferPool.isEmpty()) {
    264                 [m_videoInput markAsFinished];
    265                 m_finishWritingVideoSemaphore.signal();
    266             }
    267         }];
    268         return;
    269     }
    270     auto bufferWithCurrentTime = copySampleBufferWithCurrentTimeStamp(sampleBuffer);
    271     if (!bufferWithCurrentTime)
    272         return;
    273 
    274     auto locker = holdLock(m_videoLock);
    275     m_videoBufferPool.append(WTFMove(bufferWithCurrentTime));
     381    // FIXME: We should not set the timestamps if they are already set.
     382    if (auto bufferWithCurrentTime = copySampleBufferWithCurrentTimeStamp(sampleBuffer))
     383        m_videoCompressor->addSampleBuffer(bufferWithCurrentTime.get());
    276384}
    277385
     
    281389    CMFormatDescriptionRef format = nullptr;
    282390    auto error = CMAudioFormatDescriptionCreate(kCFAllocatorDefault, basicDescription, 0, NULL, 0, NULL, NULL, &format);
    283     if (error)
     391    if (error) {
     392        RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter CMAudioFormatDescriptionCreate failed with %d", error);
    284393        return nullptr;
     394    }
    285395    return adoptCF(format);
    286396}
    287397
    288 static inline RetainPtr<CMSampleBufferRef> createAudioSampleBufferWithPacketDescriptions(CMFormatDescriptionRef format, size_t sampleCount)
    289 {
    290     CMTime startTime = CMClockGetTime(CMClockGetHostTimeClock());
    291     CMSampleBufferRef sampleBuffer = nullptr;
    292     auto error = CMAudioSampleBufferCreateWithPacketDescriptions(kCFAllocatorDefault, NULL, false, NULL, NULL, format, sampleCount, startTime, NULL, &sampleBuffer);
    293     if (error)
    294         return nullptr;
    295     return adoptCF(sampleBuffer);
    296 }
    297 
    298 void MediaRecorderPrivateWriter::appendAudioSampleBuffer(const PlatformAudioData& data, const AudioStreamDescription& description, const WTF::MediaTime&, size_t sampleCount)
    299 {
    300     ASSERT(m_audioInput);
    301     if ((!m_hasStartedWriting && m_videoInput) || m_isStopped)
    302         return;
     398static inline RetainPtr<CMSampleBufferRef> createAudioSampleBuffer(const PlatformAudioData& data, const AudioStreamDescription& description, const WTF::MediaTime& time, size_t sampleCount)
     399{
    303400    auto format = createAudioFormatDescription(description);
    304401    if (!format)
    305         return;
    306     if (m_isFirstAudioSample) {
    307         if (!m_videoInput) {
    308             // audio-only recording.
    309             if (![m_writer startWriting]) {
    310                 m_isStopped = true;
    311                 return;
    312             }
    313             [m_writer startSessionAtSourceTime:CMClockGetTime(CMClockGetHostTimeClock())];
    314             m_hasStartedWriting = true;
    315         }
    316         m_isFirstAudioSample = false;
    317         RefPtr<MediaRecorderPrivateWriter> protectedThis = this;
    318         [m_audioInput requestMediaDataWhenReadyOnQueue:m_audioPullQueue usingBlock:[this, protectedThis = WTFMove(protectedThis)] {
    319             do {
    320                 if (![m_audioInput isReadyForMoreMediaData])
    321                     break;
    322                 auto locker = holdLock(m_audioLock);
    323                 if (m_audioBufferPool.isEmpty())
    324                     break;
    325                 auto buffer = m_audioBufferPool.takeFirst();
    326                 locker.unlockEarly();
    327                 [m_audioInput appendSampleBuffer:buffer.get()];
    328             } while (true);
    329             if (m_isStopped && m_audioBufferPool.isEmpty()) {
    330                 [m_audioInput markAsFinished];
    331                 m_finishWritingAudioSemaphore.signal();
    332             }
     402        return nullptr;
     403
     404    CMSampleBufferRef sampleBuffer = nullptr;
     405    auto error = CMAudioSampleBufferCreateWithPacketDescriptions(kCFAllocatorDefault, NULL, false, NULL, NULL, format.get(), sampleCount, toCMTime(time), NULL, &sampleBuffer);
     406    if (error) {
     407        RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter createAudioSampleBufferWithPacketDescriptions failed with %d", error);
     408        return nullptr;
     409    }
     410    auto buffer = adoptCF(sampleBuffer);
     411
     412    error = CMSampleBufferSetDataBufferFromAudioBufferList(buffer.get(), kCFAllocatorDefault, kCFAllocatorDefault, 0, downcast<WebAudioBufferList>(data).list());
     413    if (error) {
     414        RELEASE_LOG_ERROR(MediaStream, "MediaRecorderPrivateWriter CMSampleBufferSetDataBufferFromAudioBufferList failed with %d", error);
     415        return nullptr;
     416    }
     417    return buffer;
     418}
     419
     420void MediaRecorderPrivateWriter::appendAudioSampleBuffer(const PlatformAudioData& data, const AudioStreamDescription& description, const WTF::MediaTime& time, size_t sampleCount)
     421{
     422    if (auto sampleBuffer = createAudioSampleBuffer(data, description, time, sampleCount))
     423        m_audioCompressor->addSampleBuffer(sampleBuffer.get());
     424}
     425
     426void MediaRecorderPrivateWriter::stopRecording()
     427{
     428    if (m_isStopped)
     429        return;
     430
     431    m_isStopped = true;
     432
     433    if (m_videoCompressor)
     434        m_videoCompressor->finish();
     435    if (m_audioCompressor)
     436        m_audioCompressor->finish();
     437
     438    if (!m_hasStartedWriting)
     439        return;
     440    ASSERT([m_writer status] == AVAssetWriterStatusWriting);
     441
     442    m_isStopping = true;
     443
     444    flushCompressedSampleBuffers([this, weakThis = makeWeakPtr(this)]() mutable {
     445        if (!weakThis)
     446            return;
     447
     448        ALLOW_DEPRECATED_DECLARATIONS_BEGIN
     449        [m_writer flush];
     450        ALLOW_DEPRECATED_DECLARATIONS_END
     451        [m_writer finishWritingWithCompletionHandler:[this, weakThis = WTFMove(weakThis)]() mutable {
     452            callOnMainThread([this, weakThis = WTFMove(weakThis)]() mutable {
     453                if (!weakThis)
     454                    return;
     455
     456                m_isStopping = false;
     457                if (m_fetchDataCompletionHandler) {
     458                    auto buffer = WTFMove(m_data);
     459                    m_fetchDataCompletionHandler(WTFMove(buffer));
     460                }
     461
     462                m_isStopped = false;
     463                m_hasStartedWriting = false;
     464                clear();
     465            });
    333466        }];
    334     }
    335 
    336     auto sampleBuffer = createAudioSampleBufferWithPacketDescriptions(format.get(), sampleCount);
    337     if (!sampleBuffer)
    338         return;
    339     auto error = CMSampleBufferSetDataBufferFromAudioBufferList(sampleBuffer.get(), kCFAllocatorDefault, kCFAllocatorDefault, 0, downcast<WebAudioBufferList>(data).list());
    340     if (error)
    341         return;
    342 
    343     auto locker = holdLock(m_audioLock);
    344     m_audioBufferPool.append(WTFMove(sampleBuffer));
    345 }
    346 
    347 void MediaRecorderPrivateWriter::stopRecording()
    348 {
    349     if (m_isStopped)
    350         return;
    351 
    352     m_isStopped = true;
    353     if (!m_hasStartedWriting)
    354         return;
    355     ASSERT([m_writer status] == AVAssetWriterStatusWriting);
    356     if (m_videoInput)
    357         m_finishWritingVideoSemaphore.wait();
    358 
    359     if (m_audioInput)
    360         m_finishWritingAudioSemaphore.wait();
    361 
    362     m_isStopping = true;
    363     [m_writer finishWritingWithCompletionHandler:[this, weakPtr = makeWeakPtr(*this)]() mutable {
    364         callOnMainThread([this, weakPtr = WTFMove(weakPtr), buffer = SharedBuffer::createWithContentsOfFile(m_path)]() mutable {
    365             if (!weakPtr)
    366                 return;
    367 
    368             m_isStopping = false;
    369             if (m_fetchDataCompletionHandler)
    370                 m_fetchDataCompletionHandler(WTFMove(buffer));
    371             else
    372                 m_data = WTFMove(buffer);
    373 
    374             m_isStopped = false;
    375             m_hasStartedWriting = false;
    376             m_isFirstAudioSample = true;
    377             clear();
    378         });
    379         m_finishWritingSemaphore.signal();
    380     }];
    381     m_finishWritingSemaphore.wait();
     467    });
    382468}
    383469
     
    393479}
    394480
     481void MediaRecorderPrivateWriter::appendData(const char* data, size_t size)
     482{
     483    if (!m_data) {
     484        m_data = SharedBuffer::create(data, size);
     485        return;
     486    }
     487    m_data->append(data, size);
     488}
     489
     490void MediaRecorderPrivateWriter::appendData(Ref<SharedBuffer>&& buffer)
     491{
     492    if (!m_data) {
     493        m_data = WTFMove(buffer);
     494        return;
     495    }
     496    m_data->append(WTFMove(buffer));
     497}
     498
    395499} // namespace WebCore
    396500
    397 #endif // ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
     501#endif // ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
  • trunk/Source/WebCore/platform/mediarecorder/cocoa/VideoSampleBufferCompressor.h

    r262707 r262708  
    11/*
    2  * Copyright (C) 2018 Apple Inc. All rights reserved.
     2 * Copyright (C) 2020 Apple Inc. All rights reserved.
    33 *
    44 * Redistribution and use in source and binary forms, with or without
     
    2525#pragma once
    2626
    27 #if ENABLE(MEDIA_STREAM)
     27#if ENABLE(MEDIA_STREAM) && USE(AVFOUNDATION)
    2828
    29 #include "MediaRecorderPrivate.h"
    30 #include "MediaRecorderPrivateWriterCocoa.h"
     29#include <CoreMedia/CoreMedia.h>
     30#include <VideoToolbox/VTErrors.h>
     31
     32typedef struct opaqueCMSampleBuffer *CMSampleBufferRef;
     33typedef struct OpaqueVTCompressionSession *VTCompressionSessionRef;
    3134
    3235namespace WebCore {
    3336
    34 class MediaStreamPrivate;
    35 
    36 class MediaRecorderPrivateAVFImpl final
    37     : public MediaRecorderPrivate {
     37class VideoSampleBufferCompressor {
    3838    WTF_MAKE_FAST_ALLOCATED;
    3939public:
    40     static std::unique_ptr<MediaRecorderPrivateAVFImpl> create(MediaStreamPrivate&);
    41     ~MediaRecorderPrivateAVFImpl();
     40    static std::unique_ptr<VideoSampleBufferCompressor> create(CMVideoCodecType, CMBufferQueueTriggerCallback, void* callbackObject);
     41    ~VideoSampleBufferCompressor();
     42
     43    void finish();
     44    void addSampleBuffer(CMSampleBufferRef);
     45    CMSampleBufferRef getOutputSampleBuffer();
     46    RetainPtr<CMSampleBufferRef> takeOutputSampleBuffer();
    4247
    4348private:
    44     MediaRecorderPrivateAVFImpl(Ref<MediaRecorderPrivateWriter>&&, String&& audioTrackId, String&& videoTrackId);
     49    explicit VideoSampleBufferCompressor(CMVideoCodecType);
    4550
    46     friend std::unique_ptr<MediaRecorderPrivateAVFImpl> std::make_unique<MediaRecorderPrivateAVFImpl>(Ref<MediaRecorderPrivateWriter>&&, String&&, String&&);
     51    bool initialize(CMBufferQueueTriggerCallback, void* callbackObject);
    4752
    48     // MediaRecorderPrivate
    49     void videoSampleAvailable(MediaSample&) final;
    50     void fetchData(FetchDataCallback&&) final;
    51     void audioSamplesAvailable(const WTF::MediaTime&, const PlatformAudioData&, const AudioStreamDescription&, size_t) final;
     53    void processSampleBuffer(CMSampleBufferRef);
     54    bool initCompressionSession(CMVideoFormatDescriptionRef);
    5255
    53     const String& mimeType();
    54     void stopRecording();
     56    static void videoCompressionCallback(void *refCon, void*, OSStatus, VTEncodeInfoFlags, CMSampleBufferRef);
    5557
    56     Ref<MediaRecorderPrivateWriter> m_writer;
    57     String m_recordedAudioTrackID;
    58     String m_recordedVideoTrackID;
     58    dispatch_queue_t m_serialDispatchQueue;
     59    RetainPtr<CMBufferQueueRef> m_outputBufferQueue;
     60    RetainPtr<VTCompressionSessionRef> m_vtSession;
     61
     62    bool m_isEncoding { false };
     63
     64    CMVideoCodecType m_outputCodecType;
     65    float m_maxKeyFrameIntervalDuration { 2.0 };
     66    unsigned m_expectedFrameRate { 30 };
    5967};
    6068
    61 } // namespace WebCore
     69}
    6270
    63 #endif // ENABLE(MEDIA_STREAM)
     71#endif
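
The header above suggests VideoSampleBufferCompressor wraps a VideoToolbox compression session (m_vtSession) with a videoCompressionCallback feeding its output buffer queue. A rough, self-contained sketch of that common VideoToolbox pattern follows; the session settings and function names are assumptions for illustration, not this class's actual implementation:

    // Sketch only: an H.264 VTCompressionSession whose output callback could feed
    // a buffer queue like the one VideoSampleBufferCompressor exposes.
    #include <VideoToolbox/VideoToolbox.h>

    static void didCompressFrame(void* refcon, void*, OSStatus status, VTEncodeInfoFlags, CMSampleBufferRef sampleBuffer)
    {
        (void)refcon;
        if (status != noErr || !sampleBuffer)
            return;
        // The real compressor would enqueue sampleBuffer on its output queue here.
    }

    static VTCompressionSessionRef createH264Session(int32_t width, int32_t height, void* owner)
    {
        VTCompressionSessionRef session = nullptr;
        if (VTCompressionSessionCreate(kCFAllocatorDefault, width, height, kCMVideoCodecType_H264,
            nullptr, nullptr, nullptr, didCompressFrame, owner, &session) != noErr)
            return nullptr;
        // Favor real-time encoding, matching the recorder's live-capture use case.
        VTSessionSetProperty(session, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
        return session;
    }
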
  • trunk/Source/WebCore/platform/network/cocoa/ResourceRequestCocoa.mm

    r261153 r262708  
    3737#import <pal/spi/cf/CFNetworkSPI.h>
    3838#import <wtf/FileSystem.h>
     39#import <wtf/cocoa/VectorCocoa.h>
    3940#import <wtf/text/CString.h>
    4041
  • trunk/Source/WebCore/testing/Internals.cpp

    r262695 r262708  
    574574    RuntimeEnabledFeatures::sharedFeatures().setInterruptAudioOnPageVisibilityChangeEnabled(false);
    575575    WebCore::MediaRecorder::setCustomPrivateRecorderCreator(nullptr);
    576     page.mediaRecorderProvider().setUseGPUProcess(true);
    577576#endif
    578577
  • trunk/Source/WebKit/ChangeLog

    r262703 r262708  
     12020-06-08  youenn fablet  <youenn@apple.com>
     2
     3        [Cocoa] Use AVAssetWriterDelegate to implement MediaRecorder
     4        https://bugs.webkit.org/show_bug.cgi?id=206582
     5        <rdar://problem/58985368>
     6
     7        Reviewed by Eric Carlson.
     8
     9        Enable RemoteMediaRecorder only for systems supporting AVAssetWriterDelegate.
     10
     11        * GPUProcess/GPUConnectionToWebProcess.cpp:
     12        (WebKit::GPUConnectionToWebProcess::didReceiveMessage):
     13        * GPUProcess/GPUConnectionToWebProcess.h:
     14        * GPUProcess/webrtc/RemoteMediaRecorder.cpp:
     15        * GPUProcess/webrtc/RemoteMediaRecorder.h:
     16        * GPUProcess/webrtc/RemoteMediaRecorder.messages.in:
     17        * GPUProcess/webrtc/RemoteMediaRecorderManager.cpp:
     18        * GPUProcess/webrtc/RemoteMediaRecorderManager.h:
     19        * GPUProcess/webrtc/RemoteMediaRecorderManager.messages.in:
     20        * GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.h:
     21        * WebProcess/GPU/webrtc/MediaRecorderPrivate.cpp:
     22        * WebProcess/GPU/webrtc/MediaRecorderPrivate.h:
     23        * WebProcess/GPU/webrtc/MediaRecorderProvider.cpp:
     24        (WebKit::MediaRecorderProvider::createMediaRecorderPrivate):
     25
    1262020-06-07  Lauro Moura  <lmoura@igalia.com>
    227
  • trunk/Source/WebKit/GPUProcess/GPUConnectionToWebProcess.cpp

    r262695 r262708  
    229229}
    230230
     231#if HAVE(AVASSETWRITERDELEGATE)
    231232RemoteMediaRecorderManager& GPUConnectionToWebProcess::mediaRecorderManager()
    232233{
     
    236237    return *m_remoteMediaRecorderManager;
    237238}
     239#endif
    238240
    239241RemoteAudioMediaStreamTrackRendererManager& GPUConnectionToWebProcess::audioTrackRendererManager()
     
    252254    return *m_sampleBufferDisplayLayerManager;
    253255}
    254 #endif
     256#endif //  PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
    255257
    256258#if PLATFORM(COCOA) && USE(LIBWEBRTC)
     
    370372        return true;
    371373    }
     374#if HAVE(AVASSETWRITERDELEGATE)
    372375    if (decoder.messageReceiverName() == Messages::RemoteMediaRecorderManager::messageReceiverName()) {
    373376        mediaRecorderManager().didReceiveMessageFromWebProcess(connection, decoder);
     
    378381        return true;
    379382    }
     383#endif // HAVE(AVASSETWRITERDELEGATE)
    380384    if (decoder.messageReceiverName() == Messages::RemoteAudioMediaStreamTrackRendererManager::messageReceiverName()) {
    381385        audioTrackRendererManager().didReceiveMessageFromWebProcess(connection, decoder);
     
    394398        return true;
    395399    }
    396 #endif
     400#endif // PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
    397401#if PLATFORM(COCOA) && USE(LIBWEBRTC)
    398402    if (decoder.messageReceiverName() == Messages::LibWebRTCCodecsProxy::messageReceiverName()) {
  • trunk/Source/WebKit/GPUProcess/GPUConnectionToWebProcess.h

    r262695 r262708  
    117117#if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
    118118    UserMediaCaptureManagerProxy& userMediaCaptureManagerProxy();
     119#if HAVE(AVASSETWRITERDELEGATE)
    119120    RemoteMediaRecorderManager& mediaRecorderManager();
     121#endif
    120122    RemoteAudioMediaStreamTrackRendererManager& audioTrackRendererManager();
    121123    RemoteSampleBufferDisplayLayerManager& sampleBufferDisplayLayerManager();
     
    167169#if PLATFORM(COCOA) && ENABLE(MEDIA_STREAM)
    168170    std::unique_ptr<UserMediaCaptureManagerProxy> m_userMediaCaptureManagerProxy;
     171#if HAVE(AVASSETWRITERDELEGATE)
    169172    std::unique_ptr<RemoteMediaRecorderManager> m_remoteMediaRecorderManager;
     173#endif
    170174    std::unique_ptr<RemoteAudioMediaStreamTrackRendererManager> m_audioTrackRendererManager;
    171175    std::unique_ptr<RemoteSampleBufferDisplayLayerManager> m_sampleBufferDisplayLayerManager;
  • trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorder.cpp

    r262663 r262708  
    2727#include "RemoteMediaRecorder.h"
    2828
    29 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
     29#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    3030
    3131#include "SharedRingBufferStorage.h"
     
    136136}
    137137
    138 #endif
     138#endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
  • trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorder.h

    r262663 r262708  
    2626#pragma once
    2727
    28 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
     28#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    2929
    3030#include "MediaRecorderIdentifier.h"
     
    8383}
    8484
    85 #endif
     85#endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
  • trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorder.messages.in

    r262663 r262708  
    2222# THE POSSIBILITY OF SUCH DAMAGE.
    2323
    24 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
     24#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    2525
    2626messages -> RemoteMediaRecorder NotRefCounted {
  • trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorderManager.cpp

    r262663 r262708  
    2727#include "RemoteMediaRecorderManager.h"
    2828
    29 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
     29#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    3030
    3131#include "DataReference.h"
  • trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorderManager.h

    r262663 r262708  
    2828#pragma once
    2929
    30 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
     30#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    3131
    3232#include "MediaRecorderIdentifier.h"
  • trunk/Source/WebKit/GPUProcess/webrtc/RemoteMediaRecorderManager.messages.in

    r262663 r262708  
    2222# THE POSSIBILITY OF SUCH DAMAGE.
    2323
    24 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
     24#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    2525
    2626messages -> RemoteMediaRecorderManager NotRefCounted {
  • trunk/Source/WebKit/GPUProcess/webrtc/RemoteSampleBufferDisplayLayerManager.h

    r262695 r262708  
    3131#include "RemoteSampleBufferDisplayLayerManagerMessagesReplies.h"
    3232#include "SampleBufferDisplayLayerIdentifier.h"
     33#include <WebCore/IntSize.h>
    3334#include <wtf/HashMap.h>
    3435
  • trunk/Source/WebKit/WebProcess/GPU/webrtc/MediaRecorderPrivate.cpp

    r262663 r262708  
    2727#include "MediaRecorderPrivate.h"
    2828
    29 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
     29#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    3030
    3131#include "DataReference.h"
     
    136136}
    137137
    138 #endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
     138#endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
  • trunk/Source/WebKit/WebProcess/GPU/webrtc/MediaRecorderPrivate.h

    r262663 r262708  
    2626#pragma once
    2727
    28 #if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
     28#if PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    2929
    3030#include "MediaRecorderIdentifier.h"
     
    7777}
    7878
    79 #endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM)
     79#endif // PLATFORM(COCOA) && ENABLE(GPU_PROCESS) && ENABLE(MEDIA_STREAM) && HAVE(AVASSETWRITERDELEGATE)
    8080
  • trunk/Source/WebKit/WebProcess/GPU/webrtc/MediaRecorderProvider.cpp

    r262663 r262708  
    3737std::unique_ptr<WebCore::MediaRecorderPrivate> MediaRecorderProvider::createMediaRecorderPrivate(MediaStreamPrivate& stream)
    3838{
    39 #if ENABLE(GPU_PROCESS)
     39#if ENABLE(GPU_PROCESS) && HAVE(AVASSETWRITERDELEGATE)
    4040    if (m_useGPUProcess)
    4141        return makeUnique<MediaRecorderPrivate>(stream);