Gyroscope issue with device orientation

Asked: 2013-01-14 01:39:56

Tags: android, accelerometer, android-sensors, gyroscope

I'm using this tutorial to get pitch and roll data from my device's gyroscope: http://www.thousand-thoughts.com/2012/03/android-sensor-fusion-tutorial/

All the readings are very accurate (the code from the tutorial applies a filter to remove gyro drift). Unfortunately, the code only works properly when my device is lying flat on a surface parallel to the ground. The ideal position for my app is with the top of the device pointing straight ahead, i.e. the device perpendicular to the ground with the screen facing the user. Whenever I hold the device in that position, the pitch value reads +90 degrees (as expected). What I want to do is make that position the 0-degree point (the initial position) for my device, so that when the device is upright in portrait mode with the screen facing the user, the pitch reading is 0 degrees.

I asked the author of the tutorial for help with this issue, and he replied:

"If you want to use the upright position as the initial position, you have to rotate your frame of reference accordingly. The simplest way is to rotate the resulting rotation matrix by -90 degrees about the x-axis. But you have to be careful about the point in the algorithm where you apply this rotation. Always remember that rotations are not commutative operations. To be more specific, I would have to look into the code again, since I haven't used it for a while."

I'm really confused and having trouble understanding how to rotate my frame of reference. I guess the bottom line is that I don't know how to rotate a matrix by -90 degrees about the x-axis. If someone could help me with this part, that would be great. Here's my code in case anyone wants to refer to it:

public class AttitudeDisplayIndicator extends SherlockActivity implements SensorEventListener {

private SensorManager mSensorManager = null;

// angular speeds from gyro
private float[] gyro = new float[3];

// rotation matrix from gyro data
private float[] gyroMatrix = new float[9];

// orientation angles from gyro matrix
private float[] gyroOrientation = new float[3];

// magnetic field vector
private float[] magnet = new float[3];

// accelerometer vector
private float[] accel = new float[3];

// orientation angles from accel and magnet
private float[] accMagOrientation = new float[3];

// final orientation angles from sensor fusion
private float[] fusedOrientation = new float[3];

// accelerometer and magnetometer based rotation matrix
private float[] rotationMatrix = new float[9];

public static final float EPSILON = 0.000000001f;
private static final float NS2S = 1.0f / 1000000000.0f;
private float timestamp;
private boolean initState = true;

public static final int TIME_CONSTANT = 30;
public static final float FILTER_COEFFICIENT = 0.98f;
private Timer fuseTimer = new Timer();

// The following members are only for displaying the sensor output.
public Handler mHandler;
DecimalFormat d = new DecimalFormat("#.##");

//ADI background image.
private ImageView adiBackground;

//ADI axes.
private ImageView adiAxes;

//ADI frame.
private ImageView adiFrame;

//Layout.
private RelativeLayout layout;

//Pitch and Roll TextViews.
private TextView pitchAngleText;
private TextView bankAngleText;

//Instantaneous output values from sensors as the device moves.
public static double pitch;
public static double roll;

//Matrix for rotating the ADI (roll).
Matrix mMatrix = new Matrix();

@Override
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_attitude_display_indicator);

    gyroOrientation[0] = 0.0f;
    gyroOrientation[1] = 0.0f;
    gyroOrientation[2] = 0.0f;

    // initialise gyroMatrix with identity matrix
    gyroMatrix[0] = 1.0f; gyroMatrix[1] = 0.0f; gyroMatrix[2] = 0.0f;
    gyroMatrix[3] = 0.0f; gyroMatrix[4] = 1.0f; gyroMatrix[5] = 0.0f;
    gyroMatrix[6] = 0.0f; gyroMatrix[7] = 0.0f; gyroMatrix[8] = 1.0f;

    // get sensorManager and initialise sensor listeners
    mSensorManager = (SensorManager) this.getSystemService(SENSOR_SERVICE);
    initListeners();

    // wait for one second until gyroscope and magnetometer/accelerometer
    // data is initialised then schedule the complementary filter task
    fuseTimer.scheduleAtFixedRate(new calculateFusedOrientationTask(),
                                  1000, TIME_CONSTANT);

    mHandler = new Handler();
    adiBackground = (ImageView) findViewById(R.id.adi_background);
    adiFrame = (ImageView) findViewById(R.id.adi_frame);
    adiAxes = (ImageView) findViewById(R.id.adi_axes);

    layout = (RelativeLayout) findViewById(R.id.adi_layout);
    new Color();
    layout.setBackgroundColor(Color.rgb(150, 150, 150));

    pitchAngleText = (TextView) findViewById(R.id.pitch_angle_text);
    bankAngleText = (TextView) findViewById(R.id.bank_angle_text);

}

// This function registers sensor listeners for the accelerometer, magnetometer and gyroscope.
public void initListeners(){
    mSensorManager.registerListener(this,
        mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_FASTEST);

    mSensorManager.registerListener(this,
        mSensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE),
        SensorManager.SENSOR_DELAY_FASTEST);

    mSensorManager.registerListener(this,
        mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
        SensorManager.SENSOR_DELAY_FASTEST);
}

//@Override
public void onAccuracyChanged(Sensor sensor, int accuracy) {    

}

//@Override
public void onSensorChanged(SensorEvent event) {
    switch(event.sensor.getType()) {
    case Sensor.TYPE_ACCELEROMETER:
        // copy new accelerometer data into accel array and calculate orientation
        System.arraycopy(event.values, 0, accel, 0, 3);
        calculateAccMagOrientation();
        break;

    case Sensor.TYPE_GYROSCOPE:
        // process gyro data
        gyroFunction(event);
        break;

    case Sensor.TYPE_MAGNETIC_FIELD:
        // copy new magnetometer data into magnet array
        System.arraycopy(event.values, 0, magnet, 0, 3);
        break;
    }
}

// calculates orientation angles from accelerometer and magnetometer output
    public void calculateAccMagOrientation() {
        if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
            SensorManager.getOrientation(rotationMatrix, accMagOrientation);
        }
    }

    // This function is borrowed from the Android reference
    // at http://developer.android.com/reference/android/hardware/SensorEvent.html#values
    // It calculates a rotation vector from the gyroscope angular speed values.
    private void getRotationVectorFromGyro(float[] gyroValues,
            float[] deltaRotationVector,
            float timeFactor)
    {
        float[] normValues = new float[3];

        // Calculate the angular speed of the sample
        float omegaMagnitude =
            (float)Math.sqrt(gyroValues[0] * gyroValues[0] +
                             gyroValues[1] * gyroValues[1] +
                             gyroValues[2] * gyroValues[2]);

        // Normalize the rotation vector if it's big enough to get the axis
        if(omegaMagnitude > EPSILON) {
            normValues[0] = gyroValues[0] / omegaMagnitude;
            normValues[1] = gyroValues[1] / omegaMagnitude;
            normValues[2] = gyroValues[2] / omegaMagnitude;
        }

        // Integrate around this axis with the angular speed by the timestep
        // in order to get a delta rotation from this sample over the timestep
        // We will convert this axis-angle representation of the delta rotation
        // into a quaternion before turning it into the rotation matrix.
        float thetaOverTwo = omegaMagnitude * timeFactor;
        float sinThetaOverTwo = (float)Math.sin(thetaOverTwo);
        float cosThetaOverTwo = (float)Math.cos(thetaOverTwo);
        deltaRotationVector[0] = sinThetaOverTwo * normValues[0];
        deltaRotationVector[1] = sinThetaOverTwo * normValues[1];
        deltaRotationVector[2] = sinThetaOverTwo * normValues[2];
        deltaRotationVector[3] = cosThetaOverTwo;
    }

    // This function performs the integration of the gyroscope data.
    // It writes the gyroscope based orientation into gyroOrientation.
    public void gyroFunction(SensorEvent event) {
        // don't start until first accelerometer/magnetometer orientation has been acquired
        if (accMagOrientation == null)
            return;

        // initialisation of the gyroscope based rotation matrix
        if(initState) {
            float[] initMatrix = new float[9];
            initMatrix = getRotationMatrixFromOrientation(accMagOrientation);
            float[] test = new float[3];
            SensorManager.getOrientation(initMatrix, test);
            gyroMatrix = matrixMultiplication(gyroMatrix, initMatrix);
            initState = false;
        }

        // copy the new gyro values into the gyro array
        // convert the raw gyro data into a rotation vector
        float[] deltaVector = new float[4];
        if(timestamp != 0) {
            final float dT = (event.timestamp - timestamp) * NS2S;
            System.arraycopy(event.values, 0, gyro, 0, 3);
            getRotationVectorFromGyro(gyro, deltaVector, dT / 2.0f);
        }

        // measurement done, save current time for next interval
        timestamp = event.timestamp;

        // convert rotation vector into rotation matrix
        float[] deltaMatrix = new float[9];
        SensorManager.getRotationMatrixFromVector(deltaMatrix, deltaVector);

        // apply the new rotation interval on the gyroscope based rotation matrix
        gyroMatrix = matrixMultiplication(gyroMatrix, deltaMatrix);

        // get the gyroscope based orientation from the rotation matrix
        SensorManager.getOrientation(gyroMatrix, gyroOrientation);
    }

    private float[] getRotationMatrixFromOrientation(float[] o) {
        float[] xM = new float[9];
        float[] yM = new float[9];
        float[] zM = new float[9];

        float sinX = (float)Math.sin(o[1]);
        float cosX = (float)Math.cos(o[1]);
        float sinY = (float)Math.sin(o[2]);
        float cosY = (float)Math.cos(o[2]);
        float sinZ = (float)Math.sin(o[0]);
        float cosZ = (float)Math.cos(o[0]);

        // rotation about x-axis (pitch)
        xM[0] = 1.0f; xM[1] = 0.0f; xM[2] = 0.0f;
        xM[3] = 0.0f; xM[4] = cosX; xM[5] = sinX;
        xM[6] = 0.0f; xM[7] = -sinX; xM[8] = cosX;

        // rotation about y-axis (roll)
        yM[0] = cosY; yM[1] = 0.0f; yM[2] = sinY;
        yM[3] = 0.0f; yM[4] = 1.0f; yM[5] = 0.0f;
        yM[6] = -sinY; yM[7] = 0.0f; yM[8] = cosY;

        // rotation about z-axis (azimuth)
        zM[0] = cosZ; zM[1] = sinZ; zM[2] = 0.0f;
        zM[3] = -sinZ; zM[4] = cosZ; zM[5] = 0.0f;
        zM[6] = 0.0f; zM[7] = 0.0f; zM[8] = 1.0f;

        // rotation order is y, x, z (roll, pitch, azimuth)
        float[] resultMatrix = matrixMultiplication(xM, yM);
        resultMatrix = matrixMultiplication(zM, resultMatrix);
        return resultMatrix;
    }

    // Multiplies two 3x3 matrices stored as row-major float[9] arrays (result = A * B).
    private float[] matrixMultiplication(float[] A, float[] B) {
        float[] result = new float[9];

        result[0] = A[0] * B[0] + A[1] * B[3] + A[2] * B[6];
        result[1] = A[0] * B[1] + A[1] * B[4] + A[2] * B[7];
        result[2] = A[0] * B[2] + A[1] * B[5] + A[2] * B[8];

        result[3] = A[3] * B[0] + A[4] * B[3] + A[5] * B[6];
        result[4] = A[3] * B[1] + A[4] * B[4] + A[5] * B[7];
        result[5] = A[3] * B[2] + A[4] * B[5] + A[5] * B[8];

        result[6] = A[6] * B[0] + A[7] * B[3] + A[8] * B[6];
        result[7] = A[6] * B[1] + A[7] * B[4] + A[8] * B[7];
        result[8] = A[6] * B[2] + A[7] * B[5] + A[8] * B[8];

        return result;
    }

    class calculateFusedOrientationTask extends TimerTask {
        public void run() {
            float oneMinusCoeff = 1.0f - FILTER_COEFFICIENT;

            /*
             * Fix for 179° <--> -179° transition problem:
             * Check whether one of the two orientation angles (gyro or accMag) is negative while the other one is positive.
             * If so, add 360° (2 * math.PI) to the negative value, perform the sensor fusion, and remove the 360° from the result
             * if it is greater than 180°. This stabilizes the output in positive-to-negative-transition cases.
             */

            // azimuth
            if (gyroOrientation[0] < -0.5 * Math.PI && accMagOrientation[0] > 0.0) {
                fusedOrientation[0] = (float) (FILTER_COEFFICIENT * (gyroOrientation[0] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[0]);
                fusedOrientation[0] -= (fusedOrientation[0] > Math.PI) ? 2.0 * Math.PI : 0;
            }
            else if (accMagOrientation[0] < -0.5 * Math.PI && gyroOrientation[0] > 0.0) {
                fusedOrientation[0] = (float) (FILTER_COEFFICIENT * gyroOrientation[0] + oneMinusCoeff * (accMagOrientation[0] + 2.0 * Math.PI));
                fusedOrientation[0] -= (fusedOrientation[0] > Math.PI)? 2.0 * Math.PI : 0;
            }
            else {
                fusedOrientation[0] = FILTER_COEFFICIENT * gyroOrientation[0] + oneMinusCoeff * accMagOrientation[0];
            }

            // pitch
            if (gyroOrientation[1] < -0.5 * Math.PI && accMagOrientation[1] > 0.0) {
                fusedOrientation[1] = (float) (FILTER_COEFFICIENT * (gyroOrientation[1] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[1]);
                fusedOrientation[1] -= (fusedOrientation[1] > Math.PI) ? 2.0 * Math.PI : 0;
            }
            else if (accMagOrientation[1] < -0.5 * Math.PI && gyroOrientation[1] > 0.0) {
                fusedOrientation[1] = (float) (FILTER_COEFFICIENT * gyroOrientation[1] + oneMinusCoeff * (accMagOrientation[1] + 2.0 * Math.PI));
                fusedOrientation[1] -= (fusedOrientation[1] > Math.PI)? 2.0 * Math.PI : 0;
            }
            else {
                fusedOrientation[1] = FILTER_COEFFICIENT * gyroOrientation[1] + oneMinusCoeff * accMagOrientation[1];
            }

            // roll
            if (gyroOrientation[2] < -0.5 * Math.PI && accMagOrientation[2] > 0.0) {
                fusedOrientation[2] = (float) (FILTER_COEFFICIENT * (gyroOrientation[2] + 2.0 * Math.PI) + oneMinusCoeff * accMagOrientation[2]);
                fusedOrientation[2] -= (fusedOrientation[2] > Math.PI) ? 2.0 * Math.PI : 0;
            }
            else if (accMagOrientation[2] < -0.5 * Math.PI && gyroOrientation[2] > 0.0) {
                fusedOrientation[2] = (float) (FILTER_COEFFICIENT * gyroOrientation[2] + oneMinusCoeff * (accMagOrientation[2] + 2.0 * Math.PI));
                fusedOrientation[2] -= (fusedOrientation[2] > Math.PI)? 2.0 * Math.PI : 0;
            }
            else {
                fusedOrientation[2] = FILTER_COEFFICIENT * gyroOrientation[2] + oneMinusCoeff * accMagOrientation[2];
            }

            // overwrite gyro matrix and orientation with fused orientation
            // to compensate gyro drift
            gyroMatrix = getRotationMatrixFromOrientation(fusedOrientation);
            System.arraycopy(fusedOrientation, 0, gyroOrientation, 0, 3);


            // update sensor output in GUI
            mHandler.post(updateOrientationDisplayTask);
        }
    }

Thanks in advance for your help!

1 Answer:

Answer (score: 9):

The theory...

I'm not sure of the format of your "reference frame" matrix, but rotations are usually applied with a matrix multiplication.

Basically, you would take your reference frame matrix and multiply it by a 90-degree rotation matrix.

Such matrices can be found on Wikipedia: Three-dimensional rotation matrices

Since your angle is 90 degrees, the sines and cosines collapse to 1 or 0, which you can plug directly into the matrix instead of computing them. For example, the matrix for a 90-degree counterclockwise rotation about the x-axis looks like this:

1   0   0
0   0   1
0  -1   0
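
For reference, in the row-major float[9] layout your code already uses for gyroMatrix and rotationMatrix, that same matrix could be written out like this (the constant name is just a placeholder):

// Placeholder name; the 90-degree counterclockwise x-axis rotation above,
// stored in the same row-major float[9] layout as the rest of the code.
private static final float[] ROT_X_CCW_90 = {
    1f,  0f,  0f,
    0f,  0f,  1f,
    0f, -1f,  0f
};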

Also, note that matrices like this operate on row vectors of x y z coordinates. So, for example, if a point in space is at (2, 5, 7) and you want to rotate it with the matrix above, you have to do the following:

|2 5 7| |1  0  0|
        |0  0  1|
        |0 -1  0|

which gives [2 -7 5]
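
If it helps to see that multiplication in code, here is a minimal sketch (the helper name is hypothetical) that reproduces the worked example using the same row-major float[9] convention as the rest of your code:

// Hypothetical helper, only to sanity-check the worked example above:
// multiplies a row vector [x y z] by a row-major 3x3 matrix (float[9]).
private static float[] rowVectorTimesMatrix(float[] v, float[] m) {
    return new float[] {
        v[0] * m[0] + v[1] * m[3] + v[2] * m[6],
        v[0] * m[1] + v[1] * m[4] + v[2] * m[7],
        v[0] * m[2] + v[1] * m[5] + v[2] * m[8]
    };
}

// rowVectorTimesMatrix(new float[]{2, 5, 7},
//                      new float[]{1, 0, 0,  0, 0, 1,  0, -1, 0})
// returns {2, -7, 5}, matching the result above.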

...applied to your code

I've had a quick look at your code, and it seems the modification you need involves the output of calculateAccMagOrientation(), since that's what initializes the device's orientation.

1: public void calculateAccMagOrientation() {
2:     if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
3:         SensorManager.getOrientation(rotationMatrix, accMagOrientation);
4:     }
5: }

On line 2 of the snippet above you obtain the initial rotationMatrix. Try multiplying rotationMatrix by a hand-crafted 90-degree rotation matrix before getOrientation is called on line 3. I think this will effectively re-align your reference orientation:

public void calculateAccMagOrientation() {
    if(SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
        rotationMatrix = matrixMultiplication(rotationMatrix, my90DegRotationMatrix);
        SensorManager.getOrientation(rotationMatrix, accMagOrientation);
    }
}

Note that, depending on how the angles work in Android, you might need to use the 90-degree clockwise rotation matrix instead of the counterclockwise one.
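
To make that concrete, my90DegRotationMatrix from the snippet above could be defined roughly as follows. This is only a sketch and uses the clockwise variant; swap the signs of the two non-zero off-diagonal entries if the counterclockwise one turns out to be correct for your device:

// Sketch only: 90-degree clockwise rotation about the x-axis, in the same
// row-major float[9] layout used elsewhere in the code.
private final float[] my90DegRotationMatrix = {
    1f,  0f,  0f,
    0f,  0f, -1f,
    0f,  1f,  0f
};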

Alternative solution

It just occurred to me: perhaps you could also simply subtract 90 from the final pitch result before displaying it?
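
For instance, assuming the pitch you display comes from fusedOrientation[1] (which SensorManager.getOrientation reports in radians), the adjustment could be as small as:

// Sketch only: shift the displayed pitch so the upright position reads 0 degrees.
double displayedPitch = Math.toDegrees(fusedOrientation[1]) - 90.0;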