I am building a piece of Glassware, and my activity displays an image. I have also managed to zoom the image using swipe gestures on the touchpad. Now I also need to scroll (pan) around the zoomed image.
So my question is: is there any way to turn head movement into some kind of gesture? I have seen the Google Glass darts crosshair example https://www.youtube.com/watch?v=pGhamZnj6V0 but, if I understand how it works, it uses some kind of browser/web technology. Still, the way the browser obtains that information might also help me.
Or should I switch from my ImageView to a WebView in my activity and try to load the image inside that control? If so, how would I deal with its zoom functionality?
Answer 0 (score: 1)
I don't believe the system fires gesture events when you move your head. However, you can implement your own using the accelerometer API available in Android. Something like this might work:
// Fields of an Activity that implements SensorEventListener
double lastX = 0;
double lastY = 0;

// Set up the accelerometer (e.g. in onCreate)
SensorManager manager = (SensorManager) getSystemService(SENSOR_SERVICE);
manager.registerListener(this,
        manager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_NORMAL);

@Override
public void onSensorChanged(SensorEvent event) {
    // Check for the correct sensor
    if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
        // Get current acceleration values
        double currentX = event.values[0];
        double currentY = event.values[1];
        // Calculate delta values
        double deltaX = currentX - lastX;
        double deltaY = currentY - lastY;
        // Remember the current values for the next event
        lastX = currentX;
        lastY = currentY;
        // Move the view!
        moveView(deltaX, deltaY);
    }
}

// 'view' is the view you want to pan, e.g. your ImageView
public void moveView(double deltaX, double deltaY) {
    view.setX(view.getX() + (float) deltaX);
    view.setY(view.getY() + (float) deltaY);
}
You will need to adapt this code to your setup, but it should give you an idea of how to get started.
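For example, wired into an activity it could look roughly like this; the class name ZoomActivity and the choice of onResume/onPause for registering and unregistering are assumptions, not part of the snippet above:
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class ZoomActivity extends Activity implements SensorEventListener {

    private SensorManager mSensorManager;

    @Override
    protected void onResume() {
        super.onResume();
        // Register for accelerometer updates only while the activity is visible
        mSensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
        mSensorManager.registerListener(this,
                mSensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        // Unregister to stop the sensor updates and save battery
        mSensorManager.unregisterListener(this);
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this example
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Handle the accelerometer deltas as in the snippet above
    }
}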
Answer 1 (score: 1)
I used the following library.
The library only supports down/left/right, but I added some code to make it recognize an upward gesture as well. Also, because I had to edit the code myself and could not build an updated library from the one I linked, I created a new package in my project and pasted all the required code there.
This is what my HeadGestureDetector looks like after adding some code:
import java.util.List;
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.util.Log;
import HeadGestureDetector.*;   // OnHeadGestureListener and Constants live in this package
public class HeadGestureDetector implements SensorEventListener {

    private static final int MATRIX_SIZE = 16;

    private float[] inR = new float[MATRIX_SIZE];
    private float[] outR = new float[MATRIX_SIZE];
    private float[] I = new float[MATRIX_SIZE];

    private float[] orientationValues = new float[3];
    private float[] magneticValues = new float[3];
    private float[] accelerometerValues = new float[3];
    private float[] orientationVelocity = new float[3];

    private SensorManager mSensorManager;
    private OnHeadGestureListener mListener;

    static enum State {
        IDLE, SHAKE_TO_RIGHT, SHAKE_BACK_TO_LEFT, SHAKE_TO_LEFT, SHAKE_BACK_TO_RIGHT, GO_DOWN, BACK_UP, GO_UP, BACK_DOWN
    }

    private State mState = State.IDLE;
    private long mLastStateChanged = -1;
    private static final long STATE_TIMEOUT_NSEC = 1000 * 1000 * 1000;

    public HeadGestureDetector(Context context) {
        mSensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
    }

    private static final int[] REQUIRED_SENSORS = { Sensor.TYPE_MAGNETIC_FIELD, Sensor.TYPE_ACCELEROMETER,
            Sensor.TYPE_GYROSCOPE };
    private static final int[] SENSOR_RATES = { SensorManager.SENSOR_DELAY_NORMAL, SensorManager.SENSOR_DELAY_NORMAL,
            SensorManager.SENSOR_DELAY_NORMAL };
    public void start() {
        for (int i = 0; i < REQUIRED_SENSORS.length; i++) {
            int sensorType = REQUIRED_SENSORS[i];
            List<Sensor> sensors = mSensorManager.getSensorList(sensorType);
            if (sensors.isEmpty()) {
                Log.w(Constants.TAG, "no sensor of type " + sensorType);
                continue;
            }
            Sensor sensor;
            if (sensors.size() > 1) {
                // Google Glass has two gyroscopes: "MPL Gyroscope" and "Corrected Gyroscope Sensor". Try the latter one.
                sensor = sensors.get(1);
            } else {
                sensor = sensors.get(0);
            }
            Log.d(Constants.TAG, "registered:" + sensor.getName());
            mSensorManager.registerListener(this, sensor, SENSOR_RATES[i]);
        }
    }

    public void stop() {
        mSensorManager.unregisterListener(this);
    }

    public void setOnHeadGestureListener(OnHeadGestureListener listener) {
        this.mListener = listener;
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not used
    }
    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
            // Log.w(Constants.TAG, "Unreliable event...");
        }
        int sensorType = event.sensor.getType();
        if (sensorType == Sensor.TYPE_MAGNETIC_FIELD) {
            magneticValues = event.values.clone();
            return;
        }
        if (sensorType == Sensor.TYPE_ACCELEROMETER) {
            accelerometerValues = event.values.clone();
            SensorManager.getRotationMatrix(inR, I, accelerometerValues, magneticValues);
            SensorManager.remapCoordinateSystem(inR, SensorManager.AXIS_X, SensorManager.AXIS_Z, outR);
            SensorManager.getOrientation(outR, orientationValues);
            return;
        }
        if (sensorType == Sensor.TYPE_GYROSCOPE) {
            if (event.accuracy == SensorManager.SENSOR_STATUS_UNRELIABLE) {
                // Log.w(Constants.TAG, "Unreliable gyroscope event...");
                // return;
            }
            orientationVelocity = event.values.clone();

            // state timeout check
            if (event.timestamp - mLastStateChanged > STATE_TIMEOUT_NSEC && mState != State.IDLE) {
                Log.d(Constants.TAG, "state timeouted");
                mLastStateChanged = event.timestamp;
                mState = State.IDLE;
            }

            // check if glass is put on
            if (!isPutOn(orientationValues, orientationVelocity)) {
                Log.d(Constants.TAG, "Looks like glass is off?");
            }

            int maxVelocityIndex = maxAbsIndex(orientationVelocity);
            if (isStable(orientationValues, orientationVelocity)) {
                // Log.d(Constants.TAG, "isStable");
            } else if (maxVelocityIndex == 0) {
                // Pitch axis: nodding down or looking up
                if (orientationVelocity[0] < -MIN_MOVE_ANGULAR_VELOCITY) {
                    if (mState == State.IDLE) {
                        // Log.d(Constants.TAG, "isNod");
                        mState = State.GO_DOWN;
                        mLastStateChanged = event.timestamp;
                        if (mListener != null) {
                            mListener.onNod();
                        }
                    }
                } else if (orientationVelocity[0] > MIN_MOVE_ANGULAR_VELOCITY) {
                    // Added: looking up triggers onHey()
                    if (mState == State.IDLE) {
                        mState = State.GO_UP;
                        mLastStateChanged = event.timestamp;
                        if (mListener != null) {
                            mListener.onHey();
                        }
                    }
                }
            } else if (maxVelocityIndex == 1) {
                // Yaw axis: shaking the head to the right or left
                if (orientationVelocity[1] < -MIN_MOVE_ANGULAR_VELOCITY) {
                    if (mState == State.IDLE) {
                        mState = State.SHAKE_TO_RIGHT;
                        mLastStateChanged = event.timestamp;
                        if (mListener != null) {
                            mListener.onShakeToRight();
                        }
                    }
                } else if (orientationVelocity[1] > MIN_MOVE_ANGULAR_VELOCITY) {
                    if (mState == State.IDLE) {
                        mState = State.SHAKE_TO_LEFT;
                        mLastStateChanged = event.timestamp;
                        if (mListener != null) {
                            mListener.onShakeToLeft();
                        }
                    }
                }
            }
        }
    }
    private static final float MIN_MOVE_ANGULAR_VELOCITY = 1.00F;
    private static final float MAX_STABLE_RADIAN = 0.10F;
    private static final float MAX_PUT_ON_PITCH_RADIAN = 0.45F;
    private static final float MAX_PUT_ON_ROLL_RADIAN = 0.75F;
    private static final float STABLE_ANGULAR_VELOCITY = 0.10F;

    private static boolean isStable(float[] orientationValues, float[] orientationVelocity) {
        if (Math.abs(orientationValues[1]) < MAX_STABLE_RADIAN
                && Math.abs(orientationVelocity[0]) < STABLE_ANGULAR_VELOCITY
                && Math.abs(orientationVelocity[1]) < STABLE_ANGULAR_VELOCITY
                && Math.abs(orientationVelocity[2]) < STABLE_ANGULAR_VELOCITY) {
            return true;
        }
        return false;
    }

    private static boolean isPutOn(float[] orientationValues, float[] orientationVelocity) {
        if (orientationValues[1] < MAX_PUT_ON_PITCH_RADIAN && Math.abs(orientationValues[2]) < MAX_PUT_ON_ROLL_RADIAN) {
            return true;
        }
        return false;
    }

    private static int maxAbsIndex(float[] array) {
        int n = array.length;
        float maxValue = Float.MIN_VALUE;
        int maxIndex = -1;
        for (int i = 0; i < n; i++) {
            float val = Math.abs(array[i]);
            if (val > maxValue) {
                maxValue = val;
                maxIndex = i;
            }
        }
        return maxIndex;
    }
}
My OnHeadGestureListener class:
public interface OnHeadGestureListener {
    void onHey();
    void onNod();
    void onShakeToLeft();
    void onShakeToRight();
}
And my Constants class:
public class Constants {
    public static final String TAG = "HeadGestureDetector";
}
Create these three classes in your project and you should be able to use head gestures.
To get it working, add implements OnHeadGestureListener to your activity class and declare a field private HeadGestureDetector mHeadGestureDetector, then initialize it (for example in onCreate):
mHeadGestureDetector = new HeadGestureDetector(this);
mHeadGestureDetector.setOnHeadGestureListener(this);
Finally, call mHeadGestureDetector.start(); in your onResume() and mHeadGestureDetector.stop(); in your onPause(), as shown in the sketch below.
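For reference, a minimal sketch of an activity wired up this way; the class name ImageActivity, the ImageView field, and the empty callback bodies are illustrative assumptions, only the HeadGestureDetector calls come from the steps above:
import android.app.Activity;
import android.os.Bundle;
import android.widget.ImageView;

public class ImageActivity extends Activity implements OnHeadGestureListener {

    private HeadGestureDetector mHeadGestureDetector;
    private ImageView mImageView;   // hypothetical view showing the zoomed image

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mImageView = new ImageView(this);
        setContentView(mImageView);

        // Create the detector and receive its callbacks in this activity
        mHeadGestureDetector = new HeadGestureDetector(this);
        mHeadGestureDetector.setOnHeadGestureListener(this);
    }

    @Override
    protected void onResume() {
        super.onResume();
        mHeadGestureDetector.start();   // start listening to the sensors
    }

    @Override
    protected void onPause() {
        super.onPause();
        mHeadGestureDetector.stop();    // stop listening when not visible
    }

    @Override
    public void onHey() { /* head moved up */ }

    @Override
    public void onNod() { /* head moved down */ }

    @Override
    public void onShakeToLeft() { /* head moved left */ }

    @Override
    public void onShakeToRight() { /* head moved right */ }
}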
I also wrote a piece of code for this detector that triggers touchpad gestures, so that when you look up, Glass thinks you swiped down on the touchpad. That code can be found here.
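If you only need to scroll the zoomed image (as in the original question) rather than simulate touchpad swipes, you could also react to the callbacks directly. This is just a sketch under assumptions not in the answer above: mImageView is the view showing the image, the 50-pixel step is an arbitrary example value, and if the zoom is implemented through the image matrix you would translate that matrix instead of calling scrollBy():
// Hypothetical callback implementations inside the activity that implements OnHeadGestureListener
@Override
public void onHey() {
    mImageView.scrollBy(0, -50);   // look up -> move the viewport up
}

@Override
public void onNod() {
    mImageView.scrollBy(0, 50);    // nod down -> move the viewport down
}

@Override
public void onShakeToLeft() {
    mImageView.scrollBy(-50, 0);   // head to the left -> move the viewport left
}

@Override
public void onShakeToRight() {
    mImageView.scrollBy(50, 0);    // head to the right -> move the viewport right
}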