EXC_BAD_ACCESS KERN_INVALID_ADDRESS in my iOS app

Date: 2014-04-16 08:19:24

Tags: ios objective-c avfoundation

I am recording video in an iOS app, and sometimes (very unpredictably) it crashes during recording with EXC_BAD_ACCESS KERN_INVALID_ADDRESS. (Edit: the project uses ARC.)

Thread : Crashed: com.myapp.myapp
0  libobjc.A.dylib                0x3b1cc622 objc_msgSend + 1
1  com.myapp.myap                 0x00156faf -[Encoder encodeFrame:isVideo:] (Encoder.m:129)
2  com.myapp.myap                 0x001342ab -[CameraController captureOutput:didOutputSampleBuffer:fromConnection:] (CameraController.m:423)
3  AVFoundation                   0x2f918327 __74-[AVCaptureAudioDataOutput _AVCaptureAudioDataOutput_AudioDataBecameReady]_block_invoke + 282
4  libdispatch.dylib              0x3b6abd53 _dispatch_call_block_and_release + 10
5  libdispatch.dylib              0x3b6b0cbd _dispatch_queue_drain + 488
6  libdispatch.dylib              0x3b6adc6f _dispatch_queue_invoke + 42
7  libdispatch.dylib              0x3b6b15f1 _dispatch_root_queue_drain + 76
8  libdispatch.dylib              0x3b6b18dd _dispatch_worker_thread2 + 56
9  libsystem_pthread.dylib        0x3b7dcc17 _pthread_wqthread + 298

My instance variable declarations:

@interface CameraController () <AVCaptureVideoDataOutputSampleBufferDelegate, AVCaptureAudioDataOutputSampleBufferDelegate>
{
    AVCaptureSession* _session;
    AVCaptureVideoPreviewLayer* _preview;
    dispatch_queue_t _captureQueue;
    AVCaptureConnection* _audioConnection;
    AVCaptureConnection* _videoConnection;

    Encoder* _encoder;
    BOOL _isRecording;
    BOOL _isPaused;
    BOOL _discont;
    int _currentFile;
    CMTime _timeOffset;
    CMTime _lastVideo;
    CMTime _lastAudio;

    int _cx;
    int _cy;
    int _channels;
    Float64 _samplerate;
}
@end

Here is the call to [Encoder encodeFrame:isVideo:] (frame 1 in the trace) in context:

- (void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL bVideo = YES;

    @synchronized(self)
    {
        if (!self.isCapturing || self.isPaused)
        {
            return;
        }
        if ((connection != _videoConnection))
        {
            bVideo = NO;
        }
        if ((_encoder == nil) && !bVideo)
        {
            CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
            [self setAudioFormat:fmt];
            NSString* filename = [NSString stringWithFormat:@"capture%d.mp4", _currentFile];
            NSString* path = [NSTemporaryDirectory() stringByAppendingPathComponent:filename];
            _encoder = [VideoEncoder encoderForPath:path Height:_cy width:_cx channels:_channels samples:_samplerate];
        }
        if (_discont)
        {
            if (bVideo)
            {
                return;
            }
            _discont = NO;
            // calc adjustment
            CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTime last = bVideo ? _lastVideo : _lastAudio;
            if (last.flags & kCMTimeFlags_Valid)
            {
                if (_timeOffset.flags & kCMTimeFlags_Valid)
                {
                    pts = CMTimeSubtract(pts, _timeOffset);
                }
                CMTime offset = CMTimeSubtract(pts, last);
                NSLog(@"Setting offset from %s", bVideo ? "video" : "audio");
                NSLog(@"Adding %f to %f (pts %f)", ((double)offset.value)/offset.timescale, ((double)_timeOffset.value)/_timeOffset.timescale, ((double)pts.value/pts.timescale));

                // this stops us having to set a scale for _timeOffset before we see the first video time
                if (_timeOffset.value == 0)
                {
                    _timeOffset = offset;
                }
                else
                {
                    _timeOffset = CMTimeAdd(_timeOffset, offset);
                }
            }
            _lastVideo.flags = 0;
            _lastAudio.flags = 0;
        }

        // retain so that we can release either this or modified one
        CFRetain(sampleBuffer);

        if (_timeOffset.value > 0)
        {
            CFRelease(sampleBuffer);
            sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
        }

        // record most recent time so we know the length of the pause
        CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        CMTime dur = CMSampleBufferGetDuration(sampleBuffer);
        if (dur.value > 0)
        {
            pts = CMTimeAdd(pts, dur);
        }
        if (bVideo)
        {
            _lastVideo = pts;
        }
        else
        {
            _lastAudio = pts;
        }
    }

    // pass frame to encoder
    [_encoder encodeFrame:sampleBuffer isVideo:bVideo]; // This is line 129
    CFRelease(sampleBuffer);
}
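
For context, adjustTime:by: comes from the same sample code (the article linked below). A sketch of roughly what it does follows; the linked article has the authoritative version, so treat the details here as my approximation. The key point is that it builds the replacement buffer with CMSampleBufferCreateCopyWithNewTiming, which follows the Create rule, so the buffer it returns is owned (+1) by the caller:

- (CMSampleBufferRef) adjustTime:(CMSampleBufferRef)sample by:(CMTime)offset
{
    // Fetch the timing info for every sample in the buffer.
    CMItemCount count;
    CMSampleBufferGetSampleTimingInfoArray(sample, 0, nil, &count);
    CMSampleTimingInfo* pInfo = malloc(sizeof(CMSampleTimingInfo) * count);
    CMSampleBufferGetSampleTimingInfoArray(sample, count, pInfo, &count);

    // Shift every timestamp back by the accumulated pause offset.
    for (CMItemCount i = 0; i < count; i++)
    {
        pInfo[i].decodeTimeStamp = CMTimeSubtract(pInfo[i].decodeTimeStamp, offset);
        pInfo[i].presentationTimeStamp = CMTimeSubtract(pInfo[i].presentationTimeStamp, offset);
    }

    // Create rule: the copy returned here is owned (+1) by the caller.
    CMSampleBufferRef sout;
    CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault, sample, count, pInfo, &sout);
    free(pInfo);
    return sout;
}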

For the complete code in use, see http://www.gdcl.co.uk/2013/02/20/iPhone-Pause.html; I have used this control for video recording. I know this kind of problem is hard to track down, but where should I start debugging it? Thanks for your help.

1 Answer:

Answer 0 (score: 1)

In your method, you have the following...

CFRetain(sampleBuffer);

if (_timeOffset.value > 0)
{
    CFRelease(sampleBuffer);
    sampleBuffer = [self adjustTime:sampleBuffer by:_timeOffset];
}

Then at the very end you have another

CFRelease(sampleBuffer);

In the case where _timeOffset.value is greater than 0, are you over-releasing? Or do you retain the buffer somewhere else? Should you retain it again inside the if block?
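
Whether that final CFRelease is balanced depends entirely on what adjustTime:by: returns. If it builds the replacement with a Create/Copy-style Core Media call (e.g. CMSampleBufferCreateCopyWithNewTiming), the returned buffer is owned (+1) and the counts balance; if it can ever return the original buffer without retaining it, the final CFRelease is an over-release. Here is a sketch of the same flow with the ownership made explicit, assuming adjustTime:by: returns an owned buffer:

// Take ownership of the incoming buffer so every path below ends
// holding exactly one reference in sampleBuffer.
CFRetain(sampleBuffer);

if (_timeOffset.value > 0)
{
    // Build the adjusted copy *before* releasing the original, so the
    // buffer cannot be deallocated while adjustTime:by: is reading it.
    CMSampleBufferRef adjusted = [self adjustTime:sampleBuffer by:_timeOffset];
    CFRelease(sampleBuffer);
    sampleBuffer = adjusted;
}

// ... use sampleBuffer ...

CFRelease(sampleBuffer); // balances the CFRetain (or the Create) above

Note that your current order (CFRelease first, then the call to adjustTime:by:) hands a buffer you no longer own to adjustTime:by:; it usually survives because the capture pipeline still holds a reference, but it is fragile. If the retain counts turn out to be balanced, I would run with Zombie Objects enabled to see which object the crashing objc_msgSend is actually targeting.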