I want to play a stereoscopic 360-degree video in virtual reality in Unity on Android. So far I have done some research: I have two cameras, one for each eye, each with a sphere around it. I also need a custom shader to render the image on the inside of the spheres. By setting the y-tiling to 0.5, the upper half of the image is shown on one sphere; with y-tiling 0.5 and y-offset 0.5, the lower half is shown on the other sphere. With this I can already display a correct 3D 360-degree image. The whole idea comes from this tutorial.
Now, for video, I need control over the playback speed, so it turned out I need the VideoPlayer from the new Unity 5.6 beta. My setup so far requires the video player to play the video on both spheres: one sphere playing the upper part (for one eye) and the other playing the lower part (for the other eye).
Here is my problem: I don't know how to get the video player to play the same video on two different materials (since they have different tiling values). Is there a way to do that?
I got a hint that I could use the same material and achieve the tiling effect via UVs, but I don't know how that works, and I haven't even gotten the video player to play a video on two objects that share the same material. I have a screenshot of that here. The right sphere just has the material videoMaterial; no tiling is set, since I would have to do that via UVs.
Which way should I go, and how? Am I on the right track?
Answer 0 (score: 3)
Am I on the right track?
Almost. You are currently using Renderer and Material instead of RenderTexture and Material.
Which way should I go, and how?
You need to use a RenderTexture. Basically, you render the video to a RenderTexture, then assign that texture to the materials of both spheres.

1. Create a RenderTexture and assign it to the VideoPlayer.

2. Create two materials for the spheres.

3. Set VideoPlayer.renderMode to VideoRenderMode.RenderTexture.

4. Assign the texture from the RenderTexture to both sphere materials.

5. Prepare and play the video.
The code below does exactly that. It should work out of the box. The only thing you need to do is modify the tiling and offset of each material as needed.
You should also comment out:
leftSphere = createSphere("LeftEye", new Vector3(-5f, 0f, 0f), new Vector3(4f, 4f, 4f));
rightSphere = createSphere("RightEye", new Vector3(5f, 0f, 0f), new Vector3(4f, 4f, 4f));
and then use a sphere imported from any 3D application instead. These lines are only for testing purposes, and playing video on Unity's built-in sphere is not a good idea, because the sphere does not have enough detail to make the video look smooth.
using UnityEngine;
using UnityEngine.Video;

public class StereoscopicVideoPlayer : MonoBehaviour
{
    RenderTexture renderTexture;

    Material leftSphereMat;
    Material rightSphereMat;

    public GameObject leftSphere;
    public GameObject rightSphere;

    private VideoPlayer videoPlayer;

    //Audio
    private AudioSource audioSource;

    void Start()
    {
        //Create Render Texture
        renderTexture = createRenderTexture();

        //Create Left and Right Sphere Materials
        leftSphereMat = createMaterial();
        rightSphereMat = createMaterial();

        //Create the Left and Right Spheres
        leftSphere = createSphere("LeftEye", new Vector3(-5f, 0f, 0f), new Vector3(4f, 4f, 4f));
        rightSphere = createSphere("RightEye", new Vector3(5f, 0f, 0f), new Vector3(4f, 4f, 4f));

        //Assign material to the Spheres
        leftSphere.GetComponent<MeshRenderer>().material = leftSphereMat;
        rightSphere.GetComponent<MeshRenderer>().material = rightSphereMat;

        //Add VideoPlayer to the GameObject
        videoPlayer = gameObject.AddComponent<VideoPlayer>();

        //Add AudioSource
        audioSource = gameObject.AddComponent<AudioSource>();

        //Disable Play on Awake for both Video and Audio
        videoPlayer.playOnAwake = false;
        audioSource.playOnAwake = false;

        //We want to play from url
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = "http://www.quirksmode.org/html5/videos/big_buck_bunny.mp4";

        //Set Audio Output to AudioSource
        videoPlayer.audioOutputMode = VideoAudioOutputMode.AudioSource;

        //Assign the Audio from Video to AudioSource to be played
        videoPlayer.EnableAudioTrack(0, true);
        videoPlayer.SetTargetAudioSource(0, audioSource);

        //Set the mode of output to be RenderTexture
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;

        //Set the RenderTexture to store the images to
        videoPlayer.targetTexture = renderTexture;

        //Set the Texture of both Spheres to the Texture from the RenderTexture
        assignTextureToSphere();

        //Prepare Video to prevent Buffering
        videoPlayer.Prepare();

        //Subscribe to prepareCompleted event
        videoPlayer.prepareCompleted += OnVideoPrepared;
    }

    RenderTexture createRenderTexture()
    {
        RenderTexture rd = new RenderTexture(1024, 1024, 16, RenderTextureFormat.ARGB32);
        rd.Create();
        return rd;
    }

    Material createMaterial()
    {
        return new Material(Shader.Find("Specular"));
    }

    void assignTextureToSphere()
    {
        //Set the Texture of both Spheres to the Texture from the RenderTexture
        leftSphereMat.mainTexture = renderTexture;
        rightSphereMat.mainTexture = renderTexture;
    }

    GameObject createSphere(string name, Vector3 spherePos, Vector3 sphereScale)
    {
        GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
        sphere.transform.position = spherePos;
        sphere.transform.localScale = sphereScale;
        sphere.name = name;
        return sphere;
    }

    void OnVideoPrepared(VideoPlayer source)
    {
        Debug.Log("Done Preparing Video");

        //Play Video
        videoPlayer.Play();

        //Play Sound
        audioSource.Play();

        //Change Play Speed
        if (videoPlayer.canSetPlaybackSpeed)
        {
            videoPlayer.playbackSpeed = 1f;
        }
    }
}
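The "modify the tiling and offset of each material" step could look like the following for the top-bottom layout described in the question. This is only a sketch, assuming the video stacks the left-eye view in the upper half and the right-eye view in the lower half; swap the offsets if your source uses the opposite convention. It uses Unity's `Material.mainTextureScale` and `Material.mainTextureOffset`, and would go right after the two materials are created in `Start`:

```csharp
// Sketch: per-eye UV tiling for a top-bottom stereoscopic video.
// Each material samples half of the shared RenderTexture's height.

// Assumed layout: left eye on top (scale y by 0.5, offset y by 0.5).
leftSphereMat.mainTextureScale = new Vector2(1f, 0.5f);
leftSphereMat.mainTextureOffset = new Vector2(0f, 0.5f);

// Right eye on the bottom (scale y by 0.5, no offset).
rightSphereMat.mainTextureScale = new Vector2(1f, 0.5f);
rightSphereMat.mainTextureOffset = new Vector2(0f, 0f);
```

Because both materials share the same RenderTexture, the VideoPlayer only decodes the video once; each sphere just samples a different half of it.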
There is also a Unity tutorial on how to do this with a special shader, but that did not work for me or several other people. I suggest you use the method above until such support is added to the VideoPlayer API.