Upload 5 second video intervals to a server

Date: 2013-03-10 20:32:18

Tags: ios avfoundation http-live-streaming avassetwriter

I have converted Apple's RosyWriter sample code to ARC and modern Objective-C. I have been reading about how people upload 5-10 second clips to a server, but I am not sure what to do in my captureOutput:didOutputSampleBuffer:fromConnection: mashup...

RosyWriter then writes the sample buffer like this:

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);

if (connection == _videoConnection) {
    CMTime timeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    [self calculateFramerateAtTimestamp:timeStamp];

    if (_videoDimensions.height == 0 && _videoDimensions.width == 0) 
        _videoDimensions = CMVideoFormatDescriptionGetDimensions(formatDescription);

    if (_videoType == 0) 
        _videoType = CMFormatDescriptionGetMediaSubType(formatDescription);
}

CFRetain(sampleBuffer);
CFRetain(formatDescription);

dispatch_async(movieWritingQueue, ^{
    if (_assetWriter) {
        BOOL wasReadyToRecord = (_readyToRecordAudio && _readyToRecordVideo);

        if (connection == _videoConnection) {
            if (!_readyToRecordVideo)
                _readyToRecordVideo = [self setupAssetWriterVideoInput:formatDescription];

            if (_readyToRecordAudio && _readyToRecordVideo)
                [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeVideo];

        }else if (connection == _audioConnection) {
            if (!_readyToRecordAudio)
                _readyToRecordAudio = [self setupAssetWriterAudioInput:formatDescription];

            if (_readyToRecordVideo && _readyToRecordAudio)
                [self writeSampleBuffer:sampleBuffer ofType:AVMediaTypeAudio];

        }

        BOOL isReadyToRecord = (_readyToRecordAudio && _readyToRecordVideo);

        if (!wasReadyToRecord && isReadyToRecord) {
            _recordingWillBeStarted = NO;
            _recording = YES;
            [_delegate recordingDidStart];
        }
    }

    CFRelease(sampleBuffer);
    CFRelease(formatDescription);
});
}
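
Roughly, the 5-second roll-over I have in mind would be driven from this path by watching the presentation timestamps. The following is only an untested sketch of that idea; _segmentStartTime (a CMTime ivar), startNewSegment and uploadSegmentAtURL: are hypothetical names, not anything from RosyWriter:

// Untested sketch: called on movieWritingQueue before appending a buffer.
// Once 5 seconds of media have gone into the current segment, finish the
// writer, hand the file off for upload, and prepare a fresh writer.
- (void)rollSegmentIfNeededForTimestamp:(CMTime)pts {
    if (CMTIME_IS_INVALID(_segmentStartTime)) {
        _segmentStartTime = pts;
        return;
    }

    if (CMTimeGetSeconds(CMTimeSubtract(pts, _segmentStartTime)) < 5.0 ||
        _assetWriter.status != AVAssetWriterStatusWriting)
        return;

    AVAssetWriter *finishedWriter = _assetWriter;
    NSURL *finishedURL = finishedWriter.outputURL;

    [finishedWriter finishWritingWithCompletionHandler:^{
        if (finishedWriter.status == AVAssetWriterStatusCompleted)
            [self uploadSegmentAtURL:finishedURL];   // hypothetical upload helper
    }];

    // startNewSegment would allocate a new AVAssetWriter pointing at a new
    // temporary file and reset _readyToRecordAudio/_readyToRecordVideo, so the
    // existing writeSampleBuffer:ofType: logic starts the new session itself
    // when the next buffer arrives (the new writer's status is back to Unknown).
    [self startNewSegment];
    _segmentStartTime = pts;
}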

Now my question is... should I be creating and swapping out the assetWriter in captureOutput:didOutputSampleBuffer:fromConnection:, or in writeSampleBuffer:ofType:, which currently looks like this:

-(void)writeSampleBuffer:(CMSampleBufferRef)sampleBuffer ofType:(NSString*)mediaType {
    if (_assetWriter.status == AVAssetWriterStatusUnknown) {
        if ([_assetWriter startWriting]) {
            [_assetWriter startSessionAtSourceTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
        }else {
            [self showError:[_assetWriter error] source:@"Write sample buffer"];
        }
    }

    if (_assetWriter.status == AVAssetWriterStatusWriting) {
        if (mediaType == AVMediaTypeVideo) {
            if (_videoInput.readyForMoreMediaData) {
                if (![_videoInput appendSampleBuffer:sampleBuffer]) {
                    [self showError:[_assetWriter error] source:@"set up video asset writer"];
                }
            }
        }else if (mediaType == AVMediaTypeAudio) {
            if (_audioInput.readyForMoreMediaData) {
                if (![_audioInput appendSampleBuffer:sampleBuffer]) {
                    [self showError:[_assetWriter error] source:@"set up audio asset writer"];
                }
            }
        }
    }
}

From what I have seen in ffmpeg-ios, it implements a custom writeSampleBuffer:ofType method on separate classes, one of which is a segmented encoder that cuts a new segment every 5 seconds... but how do I then upload those segments to the server?
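
As for the upload itself, I assume something along these lines would work once a segment file has been finished. This is only a sketch: the endpoint URL and Content-Type are placeholders, and uploadSegmentAtURL: is the same hypothetical helper mentioned above:

// Untested sketch: POST a finished segment file to a placeholder endpoint.
// The Content-Type should match the writer's fileType (QuickTime or MPEG-4).
- (void)uploadSegmentAtURL:(NSURL *)segmentURL {
    NSData *segmentData = [NSData dataWithContentsOfURL:segmentURL];
    if (!segmentData)
        return;

    NSMutableURLRequest *request =
        [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example.com/upload"]];
    request.HTTPMethod = @"POST";
    [request setValue:@"video/quicktime" forHTTPHeaderField:@"Content-Type"];
    request.HTTPBody = segmentData;

    [NSURLConnection sendAsynchronousRequest:request
                                       queue:[NSOperationQueue mainQueue]
                           completionHandler:^(NSURLResponse *response, NSData *data, NSError *error) {
        if (error) {
            NSLog(@"Segment upload failed: %@", error);
        } else {
            // The local copy can be removed once the server has accepted it.
            [[NSFileManager defaultManager] removeItemAtURL:segmentURL error:NULL];
        }
    }];
}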

0 Answers:

No answers