I want to collect the magnetic field vector (i.e. x, y, z in microteslas) from the TYPE_MAGNETIC_FIELD position sensor and put it in the same coordinate system as the camera of the Frames I get from ARCore.
The magnetic field vector is in the sensor coordinate system; we need to get it into the camera coordinate system. I believe I can do this with the two pose APIs provided on every camera frame, the Android sensor pose and the camera pose (I prefer the NDK versions of the docs), used in processFrame() below.
Below, I compute the magnetic vector in the camera coordinate system (magneticVectorInCamera). When I test it (by moving a weak magnet in a circle around the phone and comparing against iOS's CLHeading raw x, y, z values), I don't get the values I expect. Any suggestions?
scene.addOnUpdateListener(frameTime -> processFrame(this.sceneView.getArFrame()));
public void processFrame(Frame frame) {
    if (frame.getCamera().getTrackingState() != TrackingState.TRACKING) {
        return;
    }
    // Get the magnetic vector in sensor that we stored in the onSensorChanged() delegate
    float[] magneticVectorInSensor = {x, y, z};
    // Get sensor to world
    Pose sensorToWorldPose = frame.getAndroidSensorPose();
    // Get world to camera
    Pose cameraToWorldPose = frame.getCamera().getPose();
    Pose worldToCameraPose = cameraToWorldPose.inverse();
    // Get sensor to camera
    Pose sensorToCameraPose = sensorToWorldPose.compose(worldToCameraPose);
    // Get the magnetic vector in camera coordinate space
    float[] magneticVectorInCamera = sensorToCameraPose.rotateVector(magneticVectorInSensor);
}
@Override
public void onSensorChanged(SensorEvent sensorEvent) {
    int sensorType = sensorEvent.sensor.getType();
    switch (sensorType) {
        case Sensor.TYPE_MAGNETIC_FIELD:
            mMagnetometerData = sensorEvent.values.clone();
            break;
        default:
            return;
    }
    x = mMagnetometerData[0];
    y = mMagnetometerData[1];
    z = mMagnetometerData[2];
}
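For reference, a minimal sketch of how the magnetometer listener above might be registered (not shown in the original post; it assumes the code lives in an Activity that implements SensorEventListener, and mSensorManager and the delay constant are illustrative choices):

// Illustrative registration sketch, assuming an Activity that implements SensorEventListener.
private SensorManager mSensorManager;

@Override
protected void onResume() {
    super.onResume();
    mSensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
    Sensor magnetometer = mSensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD);
    if (magnetometer != null) {
        // SENSOR_DELAY_GAME keeps magnetometer updates roughly in step with camera frames.
        mSensorManager.registerListener(this, magnetometer, SensorManager.SENSOR_DELAY_GAME);
    }
}

@Override
protected void onPause() {
    super.onPause();
    mSensorManager.unregisterListener(this);
}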
Here is an example set of log lines I get from this:
V/processFrame: magneticVectorInSensor: [-173.21014, -138.63983, 54.873657]
V/processFrame: sensorToWorldPose: t:[x:-1.010, y:-0.032, z:-0.651], q:[x:-0.28, y:-0.62, z:-0.21, w:0.71]
V/processFrame: cameraToWorldPose: t:[x:-0.941, y:0.034, z:-0.610], q:[x:-0.23, y:0.62, z:0.66, w:-0.35]
V/processFrame: worldToCameraPose: t:[x:-0.509, y:0.762, z:-0.647], q:[x:0.23, y:-0.62, z:-0.66, w:-0.35]
V/processFrame: sensorToCameraPose: t:[x:-0.114, y:0.105, z:-1.312], q:[x:0.54, y:-0.46, z:-0.08, w:-0.70]
V/processFrame: magneticVectorInCamera: [15.159668, 56.381603, 220.96408]
One thing that confuses me is why the sensorToCameraPose changes as I move the phone around:
sensorToCameraPose: t:[x:0.068, y:-0.014, z:0.083], q:[x:0.14, y:-0.65, z:-0.25, w:-0.70]
sensorToCameraPose: t:[x:0.071, y:-0.010, z:0.077], q:[x:0.11, y:-0.66, z:-0.23, w:-0.70]
sensorToCameraPose: t:[x:0.075, y:-0.007, z:0.070], q:[x:0.08, y:-0.68, z:-0.20, w:-0.70]
sensorToCameraPose: t:[x:0.080, y:-0.007, z:0.061], q:[x:0.05, y:-0.69, z:-0.18, w:-0.70]
sensorToCameraPose: t:[x:0.084, y:-0.008, z:0.052], q:[x:0.01, y:-0.69, z:-0.17, w:-0.70]
sensorToCameraPose: t:[x:0.091, y:-0.011, z:0.045], q:[x:-0.03, y:-0.69, z:-0.17, w:-0.70]
sensorToCameraPose: t:[x:0.094, y:-0.017, z:0.037], q:[x:-0.09, y:-0.69, z:-0.17, w:-0.70]
sensorToCameraPose: t:[x:0.098, y:-0.026, z:0.027], q:[x:-0.16, y:-0.67, z:-0.17, w:-0.70]
sensorToCameraPose: t:[x:0.100, y:-0.037, z:0.020], q:[x:-0.23, y:-0.65, z:-0.19, w:-0.70]
sensorToCameraPose: t:[x:0.098, y:-0.046, z:0.012], q:[x:-0.30, y:-0.62, z:-0.20, w:-0.70]
sensorToCameraPose: t:[x:0.096, y:-0.055, z:0.005], q:[x:-0.35, y:-0.59, z:-0.19, w:-0.70]
sensorToCameraPose: t:[x:0.092, y:-0.061, z:-0.003], q:[x:-0.41, y:-0.56, z:-0.18, w:-0.70]
sensorToCameraPose: t:[x:0.086, y:-0.066, z:-0.011], q:[x:-0.45, y:-0.52, z:-0.17, w:-0.70]
sensorToCameraPose: t:[x:0.080, y:-0.069, z:-0.018], q:[x:-0.49, y:-0.49, z:-0.16, w:-0.70]
sensorToCameraPose: t:[x:0.073, y:-0.071, z:-0.025], q:[x:-0.53, y:-0.45, z:-0.15, w:-0.70]
sensorToCameraPose: t:[x:0.065, y:-0.072, z:-0.031], q:[x:-0.56, y:-0.42, z:-0.13, w:-0.70]
sensorToCameraPose: t:[x:0.059, y:-0.072, z:-0.038], q:[x:-0.59, y:-0.38, z:-0.13, w:-0.70]
sensorToCameraPose: t:[x:0.053, y:-0.071, z:-0.042], q:[x:-0.61, y:-0.35, z:-0.12, w:-0.70]
sensorToCameraPose: t:[x:0.047, y:-0.069, z:-0.046], q:[x:-0.63, y:-0.32, z:-0.11, w:-0.70]
sensorToCameraPose: t:[x:0.041, y:-0.067, z:-0.048], q:[x:-0.64, y:-0.28, z:-0.10, w:-0.70]
sensorToCameraPose: t:[x:0.037, y:-0.064, z:-0.050], q:[x:-0.65, y:-0.26, z:-0.10, w:-0.70]
sensorToCameraPose: t:[x:0.032, y:-0.060, z:-0.052], q:[x:-0.67, y:-0.23, z:-0.09, w:-0.70]
sensorToCameraPose: t:[x:0.027, y:-0.057, z:-0.054], q:[x:-0.68, y:-0.20, z:-0.08, w:-0.70]
Note: there are some other questions about transforming the magnetic field vector into global coordinate space (i.e. this and this), but I haven't found anything for the camera coordinate space.
Answer 0 (score: 0):
There were two problems with the code above.
First, I was using compose incorrectly. To transform first by A and then by B, you do B.compose(A). With that fix, I started getting a consistent sensorToCameraPose.
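A minimal sketch of what that ordering means (the poses and point here are made-up values for illustration): transforming a point by B.compose(A) gives the same result as transforming by A first and then by B.

// Illustration only: B.compose(A) applies A first, then B.
Pose a = Pose.makeTranslation(1f, 0f, 0f);
Pose b = Pose.makeRotation(0f, 0f, (float) Math.sqrt(0.5), (float) Math.sqrt(0.5)); // +90° about Z
float[] p = {0f, 0f, 0f};
float[] composed = b.compose(a).transformPoint(p);         // transform by a, then by b
float[] stepwise = b.transformPoint(a.transformPoint(p));  // same result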
Second, after that fix, I was off by a 90° rotation between x and y. From u/inio on Reddit:
So, typically for phone form-factor devices there will be a 90° rotation between the camera coordinate system (defined as having +x pointing in the direction of the horizontal axis, usually the long axis, of the physical camera image) and the Android sensor coordinate system (which has +y pointing away from the Android navigation buttons and +x along the short axis of the device). The difference you're describing is an 88.8° rotation. Maybe you want the virtual camera pose? Source
I tested with getDisplayOrientedPose(). With it, I got what I expected while in portrait. But if I flipped to landscape, the coordinate system changed and I was 90° off again. So I did the rotation myself:
public void processFrame(Frame frame) {
    if (frame.getCamera().getTrackingState() != TrackingState.TRACKING) {
        return;
    }
    // Get the magnetic vector in sensor that we stored in the onSensorChanged() delegate
    float[] magneticVectorInSensor = {x, y, z};
    // Get sensor to world
    Pose sensorToWorldPose = frame.getAndroidSensorPose();
    // Get camera to world
    Pose cameraToWorldPose = frame.getCamera().getPose();
    // +90° rotation about Z
    // https://github.com/google-ar/arcore-android-sdk/issues/535#issuecomment-418845833
    Pose CAMERA_POSE_FIX = Pose.makeRotation(0, 0, (float) Math.sqrt(0.5f), (float) Math.sqrt(0.5f));
    Pose rotatedCameraToWorldPose = cameraToWorldPose.compose(CAMERA_POSE_FIX);
    // Get world to camera
    Pose worldToCameraPose = rotatedCameraToWorldPose.inverse();
    // Get sensor to camera
    Pose sensorToCameraPose = worldToCameraPose.compose(sensorToWorldPose);
    // Get the magnetic vector in camera coordinate space
    float[] magneticVectorInCamera = sensorToCameraPose.rotateVector(magneticVectorInSensor);
}
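A quick sanity check, sketched under the assumption that it runs at the end of processFrame(): rotateVector() applies only the rotation component of the pose, so the field magnitude in µT should come out the same in the sensor and camera frames.

// Illustrative check: the rotation should preserve the magnitude of the field (in µT).
float magnitudeInSensor = (float) Math.sqrt(x * x + y * y + z * z);
float magnitudeInCamera = (float) Math.sqrt(
        magneticVectorInCamera[0] * magneticVectorInCamera[0]
        + magneticVectorInCamera[1] * magneticVectorInCamera[1]
        + magneticVectorInCamera[2] * magneticVectorInCamera[2]);
Log.v("processFrame", "field magnitude sensor=" + magnitudeInSensor + " camera=" + magnitudeInCamera);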