EXC_BAD_ACCESS when processing a sampleBuffer with OpenCV

Asked: 2017-03-07 20:12:54

Tags: c++ objective-c swift opencv avfoundation

I'm trying to do live camera processing with OpenCV. In the didOutputSampleBuffer method of AVCaptureVideoDataOutputSampleBufferDelegate I create a matrix from the sampleBuffer (that part works without any problem). But as soon as certain functions run, for example cv::GaussianBlur, the app crashes with "EXC_BAD_ACCESS code=1, address=0x10...". Do you know why?

cv::Mat matrix(bufferHeight, bufferWidth, CV_8UC4, baseAddress);

cv::GaussianBlur(matrix, matrix, cvSize(5,5), 0); // Crashes here

Edit:

The base address is computed as follows (this happens in Swift inside the didOutputSampleBuffer method, before the variables are handed over to Objective-C++):

    var pixelBuffer: CVImageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
    CVPixelBufferLockBaseAddress(pixelBuffer, CVPixelBufferLockFlags(rawValue: 0))

    var baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer)

Edit 2:

Value of baseAddress: 0x0000000107790000

Value of pixelBuffer:

<CVPixelBuffer 0x17413aae0 width=1920 height=1080 pixelFormat=420v iosurface=0x1700039e0 planes=2>
<Plane 0 width=1920 height=1080 bytesPerRow=1920>
<Plane 1 width=960 height=540 bytesPerRow=1920>
<attributes=<CFBasicHash 0x17426e640 [0x1b8d18bb8]>{type = immutable dict, count = 5,
entries =>
    0 : <CFString 0x1b300de28 [0x1b8d18bb8]>{contents = "PixelFormatDescription"} = <CFBasicHash 0x17426a080 [0x1b8d18bb8]>{type = immutable dict, count = 10,
entries =>
    0 : <CFString 0x1b300e088 [0x1b8d18bb8]>{contents = "Planes"} = (
        {
        BitsPerBlock = 8;
        BlackBlock = <10>;
        FillExtendedPixelsCallback = <00000000 00000000 b840aa95 01000000 00000000 00000000>;
    },
        {
        BitsPerBlock = 16;
        BlackBlock = <8080>;
        FillExtendedPixelsCallback = <00000000 00000000 443faa95 01000000 00000000 00000000>;
        HorizontalSubsampling = 2;
        VerticalSubsampling = 2;
    }
)
    2 : <CFString 0x1b300dd68 [0x1b8d18bb8]>{contents = "IOSurfaceOpenGLESFBOCompatibility"} = <CFBoolean 0x1b8d19110 [0x1b8d18bb8]>{value = true}
    3 : <CFString 0x1b300e228 [0x1b8d18bb8]>{contents = "ContainsYCbCr"} = <CFBoolean 0x1b8d19110 [0x1b8d18bb8]>{value = true}
    4 : <CFString 0x1b300dd48 [0x1b8d18bb8]>{contents = "IOSurfaceOpenGLESTextureCompatibility"} = <CFBoolean 0x1b8d19110 [0x1b8d18bb8]>{value = true}
    5 : <CFString 0x1b300e288 [0x1b8d18bb8]>{contents = "ComponentRange"} = <CFString 0x1b300e2a8 [0x1b8d18bb8]>{contents = "VideoRange"}
    6 : <CFString 0x1b300e008 [0x1b8d18bb8]>{contents = "PixelFormat"} = <CFNumber 0xb000000343230762 [0x1b8d18bb8]>{value = +875704438, type = kCFNumberSInt32Type}
    7 : <CFString 0x1b300dd28 [0x1b8d18bb8]>{contents = "IOSurfaceCoreAnimationCompatibility"} = <CFBoolean 0x1b8d19110 [0x1b8d18bb8]>{value = true}
    9 : <CFString 0x1b300e068 [0x1b8d18bb8]>{contents = "ContainsAlpha"} = <CFBoolean 0x1b8d19120 [0x1b8d18bb8]>{value = false}
    10 : <CFString 0x1b300e248 [0x1b8d18bb8]>{contents = "ContainsRGB"} = <CFBoolean 0x1b8d19120 [0x1b8d18bb8]>{value = false}
    11 : <CFString 0x1b300dd88 [0x1b8d18bb8]>{contents = "OpenGLESCompatibility"} = <CFBoolean 0x1b8d19110 [0x1b8d18bb8]>{value = true}
}

    2 : <CFString 0x1b300dbe8 [0x1b8d18bb8]>{contents = "ExtendedPixelsRight"} = <CFNumber 0xb000000000000002 [0x1b8d18bb8]>{value = +0, type = kCFNumberSInt32Type}
    3 : <CFString 0x1b300dbc8 [0x1b8d18bb8]>{contents = "ExtendedPixelsTop"} = <CFNumber 0xb000000000000002 [0x1b8d18bb8]>{value = +0, type = kCFNumberSInt32Type}
    4 : <CFString 0x1b300dba8 [0x1b8d18bb8]>{contents = "ExtendedPixelsLeft"} = <CFNumber 0xb000000000000002 [0x1b8d18bb8]>{value = +0, type = kCFNumberSInt32Type}
    5 : <CFString 0x1b300dc08 [0x1b8d18bb8]>{contents = "ExtendedPixelsBottom"} = <CFNumber 0xb000000000000082 [0x1b8d18bb8]>{value = +8, type = kCFNumberSInt32Type}
}
 propagatedAttachments=<CFBasicHash 0x17426e900 [0x1b8d18bb8]>{type = mutable dict, count = 4,
entries =>
    0 : <CFString 0x1b300d7c8 [0x1b8d18bb8]>{contents = "CVImageBufferYCbCrMatrix"} = <CFString 0x1b300d7e8 [0x1b8d18bb8]>{contents = "ITU_R_709_2"}
    1 : <CFString 0x1b300d928 [0x1b8d18bb8]>{contents = "CVImageBufferTransferFunction"} = <CFString 0x1b300d7e8 [0x1b8d18bb8]>{contents = "ITU_R_709_2"}
    2 : <CFString 0x1b3044fa0 [0x1b8d18bb8]>{contents = "MetadataDictionary"} = <CFBasicHash 0x170077840 [0x1b8d18bb8]>{type = mutable dict, count = 3,
entries =>
    0 : <CFString 0x1b304d100 [0x1b8d18bb8]>{contents = "SNR"} = <CFNumber 0x170036300 [0x1b8d18bb8]>{value = +28.30700356903138370512, type = kCFNumberFloat64Type}
    1 : <CFString 0x1b304b2e0 [0x1b8d18bb8]>{contents = "ExposureTime"} = <CFNumber 0x170033d00 [0x1b8d18bb8]>{value = +0.01000000000000000021, type = kCFNumberFloat64Type}
    2 : <CFString 0x1b304d0e0 [0x1b8d18bb8]>{contents = "SensorID"} = <CFNumber 0xb000000000002472 [0x1b8d18bb8]>{value = +583, type = kCFNumberSInt32Type}
}

    5 : <CFString 0x1b300d8a8 [0x1b8d18bb8]>{contents = "CVImageBufferColorPrimaries"} = <CFString 0x1b300d7e8 [0x1b8d18bb8]>{contents = "ITU_R_709_2"}
}
 nonPropagatedAttachments=<CFBasicHash 0x17426e8c0 [0x1b8d18bb8]>{type = mutable dict, count = 0,
entries =>
}

1 Answer:

Answer 0 (score: 1)

Ah, your video data is not 4-component RGBA (or similar) but "1.5"-component YUV. You should either do the blur in YUV or, more easily, switch the capture session to RGBA.

YUV is the default format, and the buffer has two "planes" in it.

Plane 0 is "Y", a 1920x1080 8-bit bitmap; plane 1 is "UV", a 960x540 16-bit bitmap (really two 960x540 8-bit bitmaps with U and V interleaved side by side; I'm not sure why they aren't split into 3 planes).

In any case, your code expects a 1920x1080 32-bit bitmap, so it runs off the end of the Y plane's memory.

If you want to switch to RGBA, then (I think; I can never remember which 4-component format iOS uses):

output.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable: kCVPixelFormatType_32BGRA]

If you're feeling adventurous, do the blur on the YUV data instead: it is 2.666666667 times smaller, and your code will run about 2.666667 times faster.