EXC_BAD_ACCESS when getting pinned array handles with UnityARKitPlugin

Date: 2018-11-26 10:12:04

Tags: c# unity3d augmented-reality arkit

I am trying to use the Unity ARKit plugin to read the YUV camera frames into a pair of pinned byte arrays, using SetCapturePixelData with double buffering. To do this, I first have to pin the arrays in managed memory so that the GC cannot change their addresses, then pass those addresses to the native plugin code, which writes into them. To avoid write/read conflicts I double-buffer: on each new frame, writing alternates between two pairs of byte arrays, and I read from the pair that was written last.
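For reference, the basic pin-then-pass pattern I am describing looks like the following minimal sketch (the class name and buffer size are placeholders, not the real camera resolution):

using System;
using System.Runtime.InteropServices;

public static class PinningSketch
{
    public static void Example()
    {
        byte[] buffer = new byte[1280 * 720]; // placeholder size

        // Pin the array so the GC cannot relocate it while native code writes to it.
        GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
        try
        {
            IntPtr addr = handle.AddrOfPinnedObject(); // stable address to hand to native code
            // ... pass addr to the native plugin here ...
        }
        finally
        {
            handle.Free(); // unpin once the native side is done with the buffer
        }
    }
}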

On almost every compatible iOS device this works fine (iPhone 7, 7 Plus, X, and the 2017 / 5th-generation 9.7" iPad); but on the 2018 / 6th-generation iPad, immediately after pinning the arrays, I get EXC_BAD_ACCESS when trying to read the address from the pinned handle.

Here is a minimal viable MonoBehaviour:

using System;
using System.Runtime.InteropServices;
using n00dle;
using UnityEngine;
using UnityEngine.UI;
using UnityEngine.XR.iOS;

public class ExtractVideoFrameBytes : MonoBehaviour
{

    CameraImage image;

    private byte[] uvBytes;
    private byte[] yBytes;
    private byte[] uvBytes2;
    private byte[] yBytes2;
    private int width;
    private int height;

    private GCHandle m_PinnedYArray;
    private IntPtr m_PinnedYAddr;
    private GCHandle m_PinnedUVArray;
    private IntPtr m_PinnedUVAddr;

    private UnityARSessionNativeInterface iface;

    private bool texturesInitialised = false;

    private long currentFrameNumber = 0;

    // Subscribe to per-frame ARKit updates.
    void Start () {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent += UpdateFrame;
    }

    private void OnDestroy()
    {
        UnityARSessionNativeInterface.ARFrameUpdatedEvent -= UpdateFrame;
    }

    void UpdateFrame(UnityARCamera camera)
    {
        // Lazily allocate and pin the buffer pairs once the frame size is known.
        if (!texturesInitialised)
        {
            iface = UnityARSessionNativeInterface.GetARSessionNativeInterface();
            Debug.Log("INITIALISING");
            width = camera.videoParams.yWidth;
            height = camera.videoParams.yHeight;

            int numYBytes = camera.videoParams.yWidth * camera.videoParams.yHeight;
            int numUVBytes = camera.videoParams.yWidth * camera.videoParams.yHeight / 2; //quarter resolution, but two bytes per pixel

            yBytes = new byte[numYBytes];
            uvBytes = new byte[numUVBytes];
            yBytes2 = new byte[numYBytes];
            uvBytes2 = new byte[numUVBytes];

            // Pin the first buffer pair so the GC cannot relocate it.
            m_PinnedYArray = GCHandle.Alloc(yBytes, GCHandleType.Pinned);
            m_PinnedUVArray = GCHandle.Alloc(uvBytes, GCHandleType.Pinned);

            texturesInitialised = true;
        }

        if (TryGetImage(ref image))
        {
            Debug.Log("Got an image...");
        }
        else
        {
            Debug.LogError("No image :(");
        }
    }


    public bool TryGetImage(ref CameraImage cameraImage)
    {
#if !UNITY_EDITOR && UNITY_IOS
        ARTextureHandles handles = iface.GetARVideoTextureHandles();

        if (handles.TextureY == IntPtr.Zero || handles.TextureCbCr == IntPtr.Zero)
            return false;

        if (!texturesInitialised)
            return false;

        // Alternate between the two buffer pairs on each frame.
        long doubleBuffId = currentFrameNumber % 2;
        ++currentFrameNumber;

        // Unpin the previous frame's buffers, then pin this frame's write targets.
        m_PinnedYArray.Free();
        m_PinnedYArray = GCHandle.Alloc(doubleBuffId == 0 ? yBytes : yBytes2, GCHandleType.Pinned);
        m_PinnedYAddr = m_PinnedYArray.AddrOfPinnedObject(); // <-- EXC_BAD_ACCESS here on the 2018 iPad
        m_PinnedUVArray.Free();
        m_PinnedUVArray = GCHandle.Alloc(doubleBuffId == 0 ? uvBytes : uvBytes2, GCHandleType.Pinned);
        m_PinnedUVAddr = m_PinnedUVArray.AddrOfPinnedObject();

        // Tell Unity to write the NEXT frame into these buffers
        iface.SetCapturePixelData(true, m_PinnedYAddr, m_PinnedUVAddr);

        // Now, read off the other buffers
        cameraImage.y = (doubleBuffId == 0 ? yBytes2 : yBytes);
        cameraImage.uv = (doubleBuffId == 0 ? uvBytes2 : uvBytes);
        cameraImage.width = width;
        cameraImage.height = height;
#endif
        return true;
    }
}

I don't think there is anything particularly odd in the code above, but I'd be keen to hear if anyone can spot what I'm doing wrong. For what it's worth, this is a stripped-down version of the code used to retrieve camera frames in Unity's experimental AR interface.

Since this only happens on a single device, I suspect it is a bug in the underlying native code (perhaps an errant free() somewhere), so I have also logged it as an issue on the UnityARKitPlugin issue tracker; if I get a reply there, I will update or delete this question accordingly.

EDIT: The stack trace I see in Xcode looks like this: [screenshot of the Xcode stack trace]

0 Answers