DJI Osmo Mobile video preview

Date: 2018-05-30 20:52:56

Tags: ios dji-sdk

I want to create a sample application for the DJI Osmo Mobile 2, but when I try to get the camera from the connected product it is always nil (the product is a DJIHandheld). How can I use the native camera instead? I tried mapping the CMSampleBuffer to an UnsafeMutablePointer&lt;UInt8&gt; in the AVCaptureVideoDataOutputSampleBufferDelegate method captureOutput, but the preview is always black.

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {

    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    // Lock the buffer's base address before touching the planes.
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)

    let lumaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0)
    let chromaBaseAddress = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1)

    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)

    let lumaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let chromaBytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1)
    let lumaBuffer = lumaBaseAddress?.assumingMemoryBound(to: UInt8.self)
    let chromaBuffer = chromaBaseAddress?.assumingMemoryBound(to: UInt8.self)

    // Convert the bi-planar YCbCr (NV12) frame to BGRA, pixel by pixel.
    // Iterating rows in the outer loop keeps memory access sequential.
    var rgbaImage = [UInt8](repeating: 0, count: 4 * width * height)
    for y in 0 ..< height {
        for x in 0 ..< width {
            let lumaIndex = x + y * lumaBytesPerRow
            let chromaIndex = (y / 2) * chromaBytesPerRow + (x / 2) * 2
            guard let yp = lumaBuffer?[lumaIndex],
                  let cb = chromaBuffer?[chromaIndex],
                  let cr = chromaBuffer?[chromaIndex + 1] else { continue }

            // Standard YCbCr -> RGB conversion (JFIF coefficients).
            let ri = Double(yp)                                + 1.402   * (Double(cr) - 128)
            let gi = Double(yp) - 0.34414 * (Double(cb) - 128) - 0.71414 * (Double(cr) - 128)
            let bi = Double(yp) + 1.772   * (Double(cb) - 128)

            let r = UInt8(min(max(ri, 0), 255))
            let g = UInt8(min(max(gi, 0), 255))
            let b = UInt8(min(max(bi, 0), 255))

            rgbaImage[(x + y * width) * 4]     = b
            rgbaImage[(x + y * width) * 4 + 1] = g
            rgbaImage[(x + y * width) * 4 + 2] = r
            rgbaImage[(x + y * width) * 4 + 3] = 255
        }
    }

    // Every lock must be balanced by an unlock with the same flags.
    CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly)

    let data = NSData(bytes: &rgbaImage, length: rgbaImage.count)
    let videoBuffer = UnsafeMutablePointer<UInt8>.allocate(capacity: data.length)
    data.getBytes(videoBuffer, length: data.length)
    VideoPreviewer.instance().push(videoBuffer, length: Int32(data.length))

}
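For context, the behavior described above can be confirmed by inspecting what the DJI Mobile SDK reports for the connected product. This is a hypothetical sketch, assuming the SDK's `DJISDKManager.product()` entry point; the function name `inspectConnectedProduct` is illustrative:

```swift
import DJISDK

// Sketch: inspect the product the SDK reports for an Osmo Mobile 2.
// The product is a DJIHandheld, and its `camera` property is nil,
// because the Osmo Mobile 2 gimbal carries no camera of its own.
func inspectConnectedProduct() {
    guard let product = DJISDKManager.product() else {
        print("No DJI product connected")
        return
    }
    if let handheld = product as? DJIHandheld {
        // Expected to print "camera: nil" for the Osmo Mobile 2.
        print("Handheld connected, camera: \(String(describing: handheld.camera))")
    }
}
```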

I don't know if this is the right approach.

PS: VideoPreviewer is based on ffmpeg.

1 Answer:

Answer 0 (score: 0)

The Osmo Mobile 2 does not come with its own camera, so the SDK will not return a camera instance; this differs from other Osmo models that do have one. You need to structure your code to talk to the iOS device's camera directly rather than going through the Osmo Mobile 2.
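A minimal sketch of talking to the phone's camera directly with AVFoundation: instead of converting sample buffers by hand, let an AVCaptureVideoPreviewLayer render the session's frames. The class name `PreviewViewController` is illustrative, not from the original post:

```swift
import AVFoundation
import UIKit

// Minimal native-camera preview for the device the gimbal holds.
final class PreviewViewController: UIViewController {
    private let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Attach the default back camera as the session's input.
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // The preview layer displays the session's video directly,
        // with no manual YCbCr-to-RGB conversion required.
        let previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)

        session.startRunning()
    }
}
```

If you still need per-frame access (e.g. to push frames elsewhere), you can add an AVCaptureVideoDataOutput to the same session alongside the preview layer.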