How to resolve IOAF code GPU errors when using ARKit 2 and Vision (VNDetectFaceRectanglesRequest) on iPhone XS

Time: 2019-01-04 11:18:26

Tags: ios swift computer-vision arkit

When running ARKit on an iPhone XS (iOS 12.1.2, Xcode 10.1), the app crashes/hangs with an error while running Vision code to detect face bounds.

The errors I'm getting are:

2019-01-04 03:03:03.155867-0800 ARKit Vision Demo[12969:3307770] Execution of the command buffer was aborted due to an error during execution. Caused GPU Timeout Error (IOAF code 2)
2019-01-04 03:03:03.155786-0800 ARKit Vision Demo[12969:3307850] Execution of the command buffer was aborted due to an error during execution. Discarded (victim of GPU error/recovery) (IOAF code 5)
[SceneKit] Error: display link thread seems stuck

This happens when running the following proof-of-concept code on the iPhone XS to reproduce the error (it occurs within seconds of launching the app): https://github.com/xta/ARKit-Vision-Demo

The relevant ViewController.swift contains the problematic methods:

func classifyCurrentImage() {
    // Bail out if no frame is queued for processing.
    guard let buffer = currentBuffer else { return }

    // Wrap the captured pixel buffer in a CIImage and hand it to Vision.
    let image = CIImage(cvPixelBuffer: buffer)
    let options: [VNImageOption: Any] = [:]
    let imageRequestHandler = VNImageRequestHandler(ciImage: image, orientation: self.imageOrientation, options: options)

    do {
        try imageRequestHandler.perform(self.requests)
    } catch {
        print(error)
    }
}

func handleFaces(request: VNRequest, error: Error?) {
    DispatchQueue.main.async {
        guard let results = request.results as? [VNFaceObservation] else { return }
        // TODO - something here with results
        print(results)

        // Release the frame so the next one can be processed.
        self.currentBuffer = nil
    }
}

What is the correct way to use Apple's ARKit + Vision with VNDetectFaceRectanglesRequest? Getting these cryptic IOAF code errors clearly isn't it.

Ideally, I'd also like to use VNTrackObjectRequest and VNSequenceRequestHandler to track the requests.

There is decent documentation online for using VNDetectFaceRectanglesRequest with Vision alone (without ARKit). Apple has a page on combining the two here (https://developer.apple.com/documentation/arkit/using_vision_in_real_time_with_arkit), but I still get the errors/crashes.

3 Answers:

Answer 0 (score: 1)

For anyone else going through this pain: I was just working through this exact error with VNDetectRectanglesRequest, and here is my solution:

It seems that using a CIImage:

let imageRequestHandler = VNImageRequestHandler(ciImage: image, orientation: self.imageOrientation, options: options)

caused Metal to hold on to a large number of internal functions in my memory graph.

I noticed that Apple's sample projects all used this instead:

let handler: VNImageRequestHandler! = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                                            orientation: orientation,
                                                            options: requestHandlerOptions)

Switching to cvPixelBuffer instead of CIImage fixed all of my random GPU timeout errors!

I used these functions to get the orientation (I'm using the back camera; depending on what you're doing, you may need to mirror for the front camera):

func exifOrientationForDeviceOrientation(_ deviceOrientation: UIDeviceOrientation) -> CGImagePropertyOrientation {

    switch deviceOrientation {
    case .portraitUpsideDown:
        return .right

    case .landscapeLeft:
        return .down

    case .landscapeRight:
        return .up

    default:
        return .left
    }
}

func exifOrientationForCurrentDeviceOrientation() -> CGImagePropertyOrientation {
    return exifOrientationForDeviceOrientation(UIDevice.current.orientation)
}

and the following for the options:

var requestHandlerOptions: [VNImageOption: AnyObject] = [:]
let cameraIntrinsicData = CMGetAttachment(pixelBuffer, key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix, attachmentModeOut: nil)
if cameraIntrinsicData != nil {
    requestHandlerOptions[VNImageOption.cameraIntrinsics] = cameraIntrinsicData
}
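
Putting those pieces together, a minimal sketch of the per-frame Vision call using the pixel-buffer initializer might look like this (the detectFaces(in:) name and the faceDetectionRequest property are placeholders I've assumed, not part of the original answer):

func detectFaces(in pixelBuffer: CVPixelBuffer) {
    var requestHandlerOptions: [VNImageOption: AnyObject] = [:]

    // Forward the camera intrinsics when the capture pipeline attached them to the buffer.
    if let cameraIntrinsicData = CMGetAttachment(pixelBuffer,
                                                 key: kCMSampleBufferAttachmentKey_CameraIntrinsicMatrix,
                                                 attachmentModeOut: nil) {
        requestHandlerOptions[VNImageOption.cameraIntrinsics] = cameraIntrinsicData
    }

    // Build the handler directly from the CVPixelBuffer instead of wrapping it in a CIImage.
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: exifOrientationForCurrentDeviceOrientation(),
                                        options: requestHandlerOptions)

    do {
        try handler.perform([faceDetectionRequest])
    } catch {
        print("Vision request failed: \(error)")
    }
}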

Hope this saves someone else the week it took me!

Answer 1 (score: 0)

You need to call the perform method asynchronously, just like in the link you shared. Try this code:

func classifyCurrentImage() {
    guard let buffer = currentBuffer else { return }

    let image = CIImage(cvPixelBuffer: buffer)
    let options: [VNImageOption: Any] = [:]
    let imageRequestHandler = VNImageRequestHandler(ciImage: image, orientation: self.imageOrientation, options: options)

    // Perform the Vision request off the main thread so it doesn't block rendering.
    DispatchQueue.global(qos: .userInteractive).async {
        do {
            try imageRequestHandler.perform(self.requests)
        } catch {
            print(error)
        }
    }
}
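
For context, the Apple article linked in the question pairs this kind of asynchronous perform with an ARSessionDelegate callback that only queues one frame at a time. A rough sketch of that pattern, using the property and method names from the question's code, could be:

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Only queue a new frame once the previous one has finished and tracking is stable.
    guard currentBuffer == nil, case .normal = frame.camera.trackingState else { return }

    // Hold on to the captured pixel buffer and start the Vision pass.
    currentBuffer = frame.capturedImage
    classifyCurrentImage()
}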

Answer 2 (score: 0)

Update: as far as I can tell, the problem was a retain cycle (or rather, a missing [weak self]) in my demo repository. In Apple's sample project, they correctly use [weak self] to avoid the retain cycle, and the ARKit + Vision app runs fine on the iPhone XS.
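
A minimal sketch of a request whose completion handler avoids that retain cycle (assuming a handleFaces(request:error:) method like the one in the question; the property name is illustrative) could look like:

// Capture self weakly so the Vision request does not keep the view controller alive.
lazy var faceDetectionRequest: VNDetectFaceRectanglesRequest = {
    return VNDetectFaceRectanglesRequest { [weak self] request, error in
        self?.handleFaces(request: request, error: error)
    }
}()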