Xamarin.Android and UrhoSharp: rotating the camera node based on the Android device orientation

Date: 2018-05-11 10:53:56

Tags: android xamarin camera rotation urhosharp

In my Xamarin.Android application I obtain X/Y/Z orientation data from the device's geomagnetic rotation vector composite sensor and process it with the SensorManager.GetOrientation() method. I would like to apply this orientation data to the Rotation property of the CameraNode in a UrhoSharp scene. In other words, I want to use the device's "orientation" sensor to control the scene camera.

Here is what I have done so far in the OnSensorChanged event handler:

// app -> an instance of Urho.SimpleApplication
public void OnSensorChanged(SensorEvent e) {
    if (e.Sensor.Type == SensorType.GeomagneticRotationVector) {
        var rm = new float[9];
        SensorManager.GetRotationMatrixFromVector(rm, e.Values.ToArray());
        var ov = new float[3];
        SensorManager.GetOrientation(rm, ov);
        app.Pitch = (Urho.MathHelper.RadiansToDegrees(ov[0]) + 360) % 360;      // map [-Pi...+Pi] to [0...360]
        app.Yaw = (Urho.MathHelper.RadiansToDegrees(ov[1]) + 360) % 360;        // map [-Pi/2...+Pi/2] to [0...360]
        app.CameraNode.Rotation = new Urho.Quaternion(app.Pitch, app.Yaw, 0);
    }
}

Unfortunately it does not work as expected; the camera always ends up facing the wrong direction. Any ideas?

3 Answers:

Answer 0 (score: 1)

OnSensorChanged should be:

if (e.Sensor == mRotationSensor)
{
    var rm = new float[9];
    SensorManager.GetRotationMatrixFromVector(rm, e.Values.ToArray());
    var ov = new float[3];
    SensorManager.GetOrientation(rm, ov);
    // ov[0] is the azimuth (yaw) and ov[1] is the pitch -- swapped compared to the question;
    // map [-Pi...+Pi] radians to [0...360] degrees
    app.Pitch = (Urho.MathHelper.RadiansToDegrees(ov[1]) + 360) % 360;
    app.Yaw = (Urho.MathHelper.RadiansToDegrees(ov[0]) + 360) % 360;
    Log.Error("pitch=", app.Pitch + "");
    Log.Error("yaw=", app.Yaw + "");
    app.CameraNode.Rotation = new Urho.Quaternion(app.Pitch, app.Yaw, 0);
}

You also need to set up the SensorManager in OnCreate:

mSensorManager = (SensorManager)GetSystemService(Context.SensorService);
mRotationSensor = mSensorManager.GetDefaultSensor(SensorType.RotationVector);
mSensorManager.RegisterListener(this, mRotationSensor, SensorDelay.Game);

And add these fields:

 private SensorManager mSensorManager;
 private Sensor mRotationSensor;

Finally, don't forget to implement the ISensorEventListener interface on your Activity.
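
Putting this together, a minimal sketch of an Activity wiring everything up could look like the following (the class name, the [Activity] attribute, and how the [app] instance is created are illustrative assumptions, not the actual demo code):

// Illustrative sketch only -- names and structure are assumptions, not the actual demo code.
using Android.App;
using Android.Content;
using Android.Hardware;
using Android.OS;

[Activity(Label = "SensorCameraDemo")]
public class MainActivity : Activity, ISensorEventListener
{
    private SensorManager mSensorManager;
    private Sensor mRotationSensor;
    private Urho.SimpleApplication app;   // assumed to be created elsewhere once the Urho surface is up

    protected override void OnCreate(Bundle savedInstanceState)
    {
        base.OnCreate(savedInstanceState);
        mSensorManager = (SensorManager)GetSystemService(Context.SensorService);
        mRotationSensor = mSensorManager.GetDefaultSensor(SensorType.RotationVector);
    }

    protected override void OnResume()
    {
        base.OnResume();
        // start receiving sensor updates while the Activity is in the foreground
        mSensorManager.RegisterListener(this, mRotationSensor, SensorDelay.Game);
    }

    protected override void OnPause()
    {
        base.OnPause();
        // stop updates to save battery while in the background
        mSensorManager.UnregisterListener(this);
    }

    public void OnAccuracyChanged(Sensor sensor, SensorStatus accuracy)
    {
        // not needed for this scenario
    }

    public void OnSensorChanged(SensorEvent e)
    {
        // handler body as shown above
    }
}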

I have put a demo on GitHub.

Answer 1 (score: 1)

I finally solved this with some research and with help from @joe.

Here is the final version of the method:

// [app] is an instance of Urho.SimpleApplication
public void OnSensorChanged(SensorEvent e) {
    if (e.Sensor.Type == SensorType.GeomagneticRotationVector) {
        var inR = new float[9];
        SensorManager.GetRotationMatrixFromVector(inR, e.Values.ToArray());
        var outR = new float[9];
        // we need to remap the coordinate system, since the Y and Z axes will be swapped when we pick up the device
        if (SensorManager.RemapCoordinateSystem(inR, Android.Hardware.Axis.X, Android.Hardware.Axis.Z, outR)) {
            var ov = new float[3];
            SensorManager.GetOrientation(outR, ov);
            try {
                app.Pitch = (MathHelper.RadiansToDegrees(ov[1]) + 360) % 360;
                app.Yaw = (MathHelper.RadiansToDegrees(ov[0]) + 360) % 360;
                app.CameraNode.Rotation = new Quaternion(app.Pitch, app.Yaw, 0);
            }
            catch (System.Exception ex) {
                // while Urho.SimpleApplication is not fully started, the [app] properties are not available
                System.Diagnostics.Trace.WriteLine(ex.Message);
            }
        }
    }
}
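
As a side note, the radians-to-degrees mapping used above could be pulled into a small helper; this is just a sketch of the same math, not part of the original answer:

// Hypothetical helper (not in the original answer): maps an angle in radians
// from [-Pi, +Pi] to degrees in the range [0, 360).
static float ToDegrees0To360(float radians)
{
    return (Urho.MathHelper.RadiansToDegrees(radians) + 360f) % 360f;
}

// which would make the assignments above read:
// app.Pitch = ToDegrees0To360(ov[1]);
// app.Yaw   = ToDegrees0To360(ov[0]);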

Answer 2 (score: 1)

Another possible solution, using quaternions directly:

public void OnSensorChanged(SensorEvent e) {
    if (e.Sensor.Type == SensorType.GeomagneticRotationVector) {
        var qv = new float[4];
        SensorManager.GetQuaternionFromVector(qv, e.Values.ToArray());
        try {
            // Android delivers the quaternion as [w, x, y, z]; reorder the components
            // (and negate one axis) to match Urho's coordinate system
            app.CameraNode.Rotation = new Quaternion(qv[1], -qv[3], qv[2], qv[0]);
            app.CameraNode.Pitch(90.0f);
            app.CameraNode.Roll(180.0f);
        }
        catch (System.Exception ex) {
            // the CameraNode is not available until Urho.SimpleApplication has fully started
            System.Diagnostics.Trace.WriteLine(ex.Message);
        }
    }
}
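
Note that this handler (like the one in Answer 1) filters on SensorType.GeomagneticRotationVector, so the sensor registered in OnCreate has to match. A minimal sketch, assuming the mSensorManager/mRotationSensor fields from Answer 0:

// Sketch only -- assumes the mSensorManager/mRotationSensor fields and OnCreate from Answer 0,
// but registers the geomagnetic rotation vector sensor instead of the generic rotation vector.
mRotationSensor = mSensorManager.GetDefaultSensor(SensorType.GeomagneticRotationVector);
mSensorManager.RegisterListener(this, mRotationSensor, SensorDelay.Game);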