Changing AVCaptureDeviceInput causes AVAssetWriterStatusFailed

Posted: 2014-05-22 11:43:29

Tags: ios objective-c avfoundation avcapturesession avassetwriter

I am trying to switch the camera view between front and back, and that part works well. If a video is recorded with the Pause/Record option and the camera is never flipped, everything works fine. But if I flip the camera view in between, the video recorded afterwards is not saved and the writer fails with AVAssetWriterStatusFailed - The operation could not be completed. Can anyone help me find where I am going wrong? My code is below.
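
For context on how the failure shows up: the error is reported on the AVAssetWriter once writing is finished. A minimal sketch of that check (the _writer ivar name is an assumption, not taken from the code below):

AVAssetWriter *writer = _writer;
[writer finishWritingWithCompletionHandler:^{
    if (writer.status == AVAssetWriterStatusFailed) {
        // After flipping the camera mid-recording this logs
        // "The operation could not be completed".
        NSLog(@"Asset writer failed: %@", writer.error);
    }
}];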

Camera.m

- (void)flipCamera{
NSArray *inputs = _session.inputs;
for (AVCaptureDeviceInput *input in inputs) {
    AVCaptureDevice *device = input.device;
    if ([device hasMediaType:AVMediaTypeVideo]) {
        AVCaptureDevicePosition position = device.position;
        AVCaptureDevice *newCamera = nil;
        AVCaptureDeviceInput *newInput = nil;
        if (position == AVCaptureDevicePositionFront)
            newCamera = [self cameraWithPosition:AVCaptureDevicePositionBack];
        else
            newCamera = [self cameraWithPosition:AVCaptureDevicePositionFront];
        newInput = [AVCaptureDeviceInput deviceInputWithDevice:newCamera error:nil];
        // beginConfiguration ensures that pending changes are not applied immediately
        [_session beginConfiguration];
        [_session removeInput:input];
        [_session addInput:newInput];
        // Changes take effect once the outermost commitConfiguration is invoked.
        [_session commitConfiguration];
        break;
    }
}
for (AVCaptureDeviceInput *input in inputs) {
    AVCaptureDevice *device = input.device;
    if ([device hasMediaType:AVMediaTypeAudio]) {
        // audio input from the default microphone
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        AVCaptureDeviceInput *newInput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
        //            [_session addInput:micinput];
        // beginConfiguration ensures that pending changes are not applied immediately
        [_session beginConfiguration];
        [_session removeInput:input];
        [_session addInput:newInput];
        // Changes take effect once the outermost commitConfiguration is invoked.
        [_session commitConfiguration];
        break;
    }
}
}

- (AVCaptureDevice *)cameraWithPosition:(AVCaptureDevicePosition)position
{
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices)
    if (device.position == position)
        return device;
return nil;
}

- (void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
BOOL bVideo = YES;
@synchronized(self)
{
    if (!self.isCapturing  || self.isPaused)
    {
        return;
    }
    if (connection != _videoConnection)
    {
        bVideo = NO;
    }
    if ((_encoder == nil) && !bVideo)
    {
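        // Lazily create the encoder on the first audio buffer so the audio
        // format is known before the asset writer is configured.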
        CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
        [self setAudioFormat:fmt];
        NSString* filename = [NSString stringWithFormat:@"capture%d.mp4", _currentFile];
        NSString* path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
        _encoder = [VideoEncoder encoderForPath:path Height:_cy width:_cx channels:_channels samples:_samplerate];
    }
    if (_discont)
    {
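        // _discont is set when recording resumes after a pause: drop video
        // until an audio buffer arrives, then grow _timeOffset by the length
        // of the gap so timestamps handed to the encoder stay contiguous.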
        if (bVideo)
        {
            return;
        }
        _discont = NO;
        // calc adjustment
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime last = bVideo ? _lastVideo : _lastAudio;
        if (last.flags & kCMTimeFlags_Valid)
        {
            if (_timeOffset.flags & kCMTimeFlags_Valid)
            {
                pts = CMTimeSubtract(pts, _timeOffset);
            }
            CMTime offset = CMTimeSubtract(pts, last);
            NSLog(@"Setting offset from %s", bVideo?"video": "audio");
            NSLog(@"Adding %f to %f (pts %f)", ((double)offset.value)/offset.timescale, ((double)_timeOffset.value)/_timeOffset.timescale, ((double)pts.value/pts.timescale));
            // this stops us having to set a scale for _timeOffset before we see the first video time
            if (_timeOffset.value == 0)
            {
                _timeOffset = offset;
            }
            else
            {
                _timeOffset = CMTimeAdd(_timeOffset, offset);
            }
        }
        _lastVideo.flags = 0;
        _lastAudio.flags = 0;
    }
    // retain so that we can release either this or modified one
    CFRetain(sampleBuffer);
    if (_timeOffset.value > 0)
    {
        CFRelease(sampleBuffer);
        sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
    }
    // record most recent time so we know the length of the pause
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
    if (dur.value > 0)
    {
        pts = CMTimeAdd(pts, dur);
    }
    if (bVideo)
    {
        _lastVideo = pts;
    }
    else
    {
        _lastAudio = pts;
    }
}
// pass frame to encoder
[_encoder encodeFrame:sampleBuffer isVideo:bVideo];
CFRelease(sampleBuffer);
}
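
The adjustTime:by: method called above is not included in the question. A sketch of the common implementation of this timing-shift pattern, using CMSampleBufferCreateCopyWithNewTiming (illustrative only, not necessarily identical to the code actually used):

- (CMSampleBufferRef)adjustTime:(CMSampleBufferRef)sample by:(CMTime)offset
{
    CMItemCount count;
    // Ask how many timing entries the buffer carries, then fetch them all.
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, NULL, &count);
    CMSampleTimingInfo *info = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, info, &count);
    for (CMItemCount i = 0; i < count; i++)
    {
        // Shift each timestamp back by the accumulated pause duration.
        info[i].decodeTimeStamp = CMTimeSubtract(info[i].decodeTimeStamp, offset);
        info[i].presentationTimeStamp = CMTimeSubtract(info[i].presentationTimeStamp, offset);
    }
    CMSampleBufferRef adjusted;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample, count, info, &adjusted);
    free(info);
    // The returned buffer is owned by the caller, matching the CFRelease in captureOutput.
    return adjusted;
}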

Thanks in advance.

1 Answer:

Answer (score: 7):

The problem is with this line:

if (connection != _videoConnection)
    {
        bVideo = NO;
    }

When you switch cameras, a new videoConnection is created (I do not know exactly how). But if you change that line as follows, it works:

//if (connection != _videoConnection)
if ([connection.output connectionWithMediaType:AVMediaTypeVideo] == nil)
    {
        bVideo = NO;
    }
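
This check works because it asks the connection's owning output whether it carries video, instead of comparing against the cached _videoConnection object, which is replaced when the session's inputs are swapped. An alternative fix along the same lines is to refresh the cached connection right after flipping the camera; a sketch, assuming the video data output is kept in a _videoOutput ivar (not shown in the question):

// Call right after [_session commitConfiguration] in flipCamera so the
// original comparison against _videoConnection keeps working.
_videoConnection = [_videoOutput connectionWithMediaType:AVMediaTypeVideo];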