Screen sharing with Oculus Quest and Agora

Date: 2021-04-15 12:43:06

Tags: c# unity3d agora.io oculus

I am using Agora.io in Unity to do screen sharing, and it works well when two desktop PCs are involved. Now I am trying to achieve the same thing with an Oculus Quest and a PC. The PC has a RawImage texture that should display the Oculus screen view. Unfortunately there is no input at all, only a black screen. Mind you, it works fine when two PCs, or even an Android phone, are connected — the screen view is displayed. It fails only when the Oculus Quest is connected. I have even granted the Oculus all the permissions required for this, but it still does not work.

Edit: I understand that I have to replace Screen.width and Screen.height with a custom render texture and attach it to the camera. I did that as well, but this time the output is empty even in desktop mode.
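For context, a minimal sketch of that setup step, assuming a camera reference named captureCamera and a fixed 1280x720 resolution (both names and values are illustrative, not from the original project):

```csharp
using UnityEngine;

public class RenderTextureSetup : MonoBehaviour {
    public Camera captureCamera;   // hypothetical reference, assigned in the Inspector
    RenderTexture captureRT;
    Texture2D captureTexture;

    void Start () {
        // Create an off-screen render texture and attach it to the camera,
        // replacing the Screen.width/Screen.height based capture.
        captureRT = new RenderTexture (1280, 720, 24);
        captureCamera.targetTexture = captureRT;
        // CPU-readable texture of the same size; RGBA32 is the safer
        // cross-platform choice for ReadPixels.
        captureTexture = new Texture2D (captureRT.width, captureRT.height,
            TextureFormat.RGBA32, false);
    }
}
```

With this in place, the capture coroutine would read from captureRT (via RenderTexture.active) rather than from the screen.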

using System;
using System.Collections;
using System.Collections.Generic;
using System.Globalization;
using System.Runtime.InteropServices;
using agora_gaming_rtc;
using UnityEngine;
using UnityEngine.UI;
public class ScreenShare : MonoBehaviour {
    Texture2D mTexture;
    Rect mRect;
    [SerializeField]
    private string appId = "Your_AppID";
    [SerializeField]
    private string channelName = "agora";
    public IRtcEngine mRtcEngine;
    int i = 100;
    void Start () {
        Debug.Log ("ScreenShare Activated");
        mRtcEngine = IRtcEngine.getEngine (appId);
        // enable log
        mRtcEngine.SetLogFilter (LOG_FILTER.DEBUG | LOG_FILTER.INFO | LOG_FILTER.WARNING | LOG_FILTER.ERROR | LOG_FILTER.CRITICAL);
        // set callbacks (optional)
        mRtcEngine.SetParameters ("{\"rtc.log_filter\": 65535}");
        //Configure the external video source
        mRtcEngine.SetExternalVideoSource (true, false);
        // Start video mode
        mRtcEngine.EnableVideo ();
        // allow camera output callback
        mRtcEngine.EnableVideoObserver ();
        // join channel
        mRtcEngine.JoinChannel (channelName, null, 0);
        //Create a rectangle with the width and height of the screen
        mRect = new Rect (0, 0, Screen.width, Screen.height);
        //Create a texture the size of the rectangle you just created
        mTexture = new Texture2D ((int) mRect.width, (int) mRect.height, TextureFormat.BGRA32, false);
    }
    void Update () {
        //Start the screenshare Coroutine
        StartCoroutine (shareScreen ());
    }
    //Screen Share
    IEnumerator shareScreen () {
        yield return new WaitForEndOfFrame ();
        //Read the Pixels inside the Rectangle
        mTexture.ReadPixels (mRect, 0, 0);
        //Apply the Pixels read from the rectangle to the texture
        mTexture.Apply ();
        // Get the raw texture data from the texture into a byte array
        byte[] bytes = mTexture.GetRawTextureData ();
        // Make enough space for the bytes array
        int size = Marshal.SizeOf (bytes[0]) * bytes.Length;
        // Check to see if there is an engine instance already created
        IRtcEngine rtc = IRtcEngine.QueryEngine ();
        //if the engine is present
        if (rtc != null) {
            //Create a new external video frame
            ExternalVideoFrame externalVideoFrame = new ExternalVideoFrame ();
            //Set the buffer type of the video frame
            externalVideoFrame.type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA;
            // Set the video pixel format
            externalVideoFrame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_BGRA;
            //apply raw data you are pulling from the rectangle you created earlier to the video frame
            externalVideoFrame.buffer = bytes;
            //Set the width of the video frame (in pixels)
            externalVideoFrame.stride = (int) mRect.width;
            //Set the height of the video frame
            externalVideoFrame.height = (int) mRect.height;
            //Remove pixels from the sides of the frame
            externalVideoFrame.cropLeft = 10;
            externalVideoFrame.cropTop = 10;
            externalVideoFrame.cropRight = 10;
            externalVideoFrame.cropBottom = 10;
            //Rotate the video frame (0, 90, 180, or 270)
            externalVideoFrame.rotation = 180;
            // use i as the video timestamp and increment it
            externalVideoFrame.timestamp = i++;
            //Push the external video frame with the frame we just created
            int a = rtc.PushVideoFrame (externalVideoFrame);
            Debug.Log (" pushVideoFrame =       " + a);
        }
    }
}

1 Answer:

Answer 0: (score: 0)

How are you managing the render texture? Is it linked to a camera? You should assign the render texture to a camera and pull the data from it. Here is an example from a different project where you can see how the render texture data is used.

另请注意,您正在关注一个过时的教程,其中 API 在 SDK 更新后略有变化。这个例子也是在这方面注明日期的。 Pixel 格式应使用 RGBA 而不是 BGRA 以实现跨平台兼容性。

externalVideoFrame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_RGBA;
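Putting the two suggestions together, a hedged sketch of what the capture coroutine could look like when reading from the camera's render texture and pushing RGBA frames — here captureRT, mTexture (created as TextureFormat.RGBA32), and the timestamp counter i are assumptions standing in for the asker's own fields:

```csharp
IEnumerator ShareRenderTexture () {
    yield return new WaitForEndOfFrame ();
    // Read from the camera's render texture instead of the screen.
    RenderTexture.active = captureRT;
    mTexture.ReadPixels (new Rect (0, 0, captureRT.width, captureRT.height), 0, 0);
    mTexture.Apply ();
    RenderTexture.active = null;
    // Reuse the engine instance if one exists.
    IRtcEngine rtc = IRtcEngine.QueryEngine ();
    if (rtc != null) {
        ExternalVideoFrame frame = new ExternalVideoFrame ();
        frame.type = ExternalVideoFrame.VIDEO_BUFFER_TYPE.VIDEO_BUFFER_RAW_DATA;
        // RGBA instead of BGRA, per the answer's cross-platform advice.
        frame.format = ExternalVideoFrame.VIDEO_PIXEL_FORMAT.VIDEO_PIXEL_RGBA;
        frame.buffer = mTexture.GetRawTextureData ();
        frame.stride = captureRT.width;
        frame.height = captureRT.height;
        frame.rotation = 0;
        frame.timestamp = i++;
        rtc.PushVideoFrame (frame);
    }
}
```

The dimensions come from the render texture rather than Screen.width/Screen.height, so the pushed frame matches what the camera actually rendered regardless of the headset's display resolution.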