My problem is this: I want a background service that grabs frames from the camera in real time so that I can analyze them. I have seen many similar topics here that supposedly address this, but none of them has actually worked in my case.
My first attempt was to create an Activity that started a Service. In the service I created a SurfaceView, obtained its holder, implemented the callback, and prepared the camera and everything else. Then, in the PreviewCallback, I could spawn a new thread to analyze the data I get from the PreviewCallback's onPreviewFrame method.
That worked fine while I had the service in the foreground, but when I opened another application (with the service still running in the background) I realized the preview was no longer there, so I could not grab frames from it.
Searching the internet, I found that I might be able to solve this with a SurfaceTexture. So I created an activity that starts my service, like this:
public class SurfaceTextureActivity extends Activity {

    public static TextureView mTextureView;
    public static Vibrator mVibrator;
    public static GLSurfaceView mGLView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        mGLView = new GLSurfaceView(this);
        mTextureView = new TextureView(this);
        setContentView(mTextureView);
        try {
            Intent intent = new Intent(SurfaceTextureActivity.this, RecorderService.class);
            intent.putExtra(RecorderService.INTENT_VIDEO_PATH, "/folder-path/");
            startService(intent);
            Log.i("ABC", "Start Service " + this.toString() + " + " + mTextureView.toString() + " + " + getWindowManager().toString());
        } catch (Exception e) {
            Log.i("ABC", "Exc SurfaceTextureActivity: " + e.getMessage());
        }
    }
}
Then I made RecorderService implement SurfaceTextureListener, so that I could open the camera, do the other preparation work, and then capture frames. My RecorderService currently looks like this:
public class RecorderService extends Service implements TextureView.SurfaceTextureListener, SurfaceTexture.OnFrameAvailableListener {

    private Camera mCamera = null;
    private TextureView mTextureView;
    private SurfaceTexture mSurfaceTexture;
    private float[] mTransformMatrix;
    private static IMotionDetection detector = null;
    public static Vibrator mVibrator;

    @Override
    public void onCreate() {
        try {
            mTextureView = SurfaceTextureActivity.mTextureView;
            mTextureView.setSurfaceTextureListener(this);
            Log.i("ABC", "onCreate");
            // startForeground(START_STICKY, new Notification()); - doesn't work
        } catch (Exception e) {
            Log.i("ABC", "onCreate exception " + e.getMessage());
            e.printStackTrace();
        }
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        // How do I obtain frames?!
        // SurfaceTextureActivity.mGLView.queueEvent(new Runnable() {
        //     @Override
        //     public void run() {
        //         mSurfaceTexture.updateTexImage();
        //     }
        // });
    }

    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
        mSurfaceTexture = surface;
        mSurfaceTexture.setOnFrameAvailableListener(this);
        mVibrator = (Vibrator) this.getSystemService(VIBRATOR_SERVICE);
        detector = new RgbMotionDetection();

        int cameraId = 0;
        Camera.CameraInfo info = new Camera.CameraInfo();
        for (cameraId = 0; cameraId < Camera.getNumberOfCameras(); cameraId++) {
            Camera.getCameraInfo(cameraId, info);
            if (info.facing == Camera.CameraInfo.CAMERA_FACING_FRONT)
                break;
        }
        mCamera = Camera.open(cameraId);

        Matrix transform = new Matrix();
        Camera.Size previewSize = mCamera.getParameters().getPreviewSize();
        int rotation = ((WindowManager) getSystemService(Context.WINDOW_SERVICE))
                .getDefaultDisplay().getRotation();
        Log.i("ABC", "onSurfaceTextureAvailable(): CameraOrientation(" + cameraId + ")" + info.orientation + " " + previewSize.width + "x" + previewSize.height + " Rotation=" + rotation);

        try {
            switch (rotation) {
                case Surface.ROTATION_0:
                    mCamera.setDisplayOrientation(90);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.height, previewSize.width, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.height / 2, 0);
                    break;
                case Surface.ROTATION_90:
                    mCamera.setDisplayOrientation(0);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.width, previewSize.height, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.width / 2, 0);
                    break;
                case Surface.ROTATION_180:
                    mCamera.setDisplayOrientation(270);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.height, previewSize.width, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.height / 2, 0);
                    break;
                case Surface.ROTATION_270:
                    mCamera.setDisplayOrientation(180);
                    mTextureView.setLayoutParams(new FrameLayout.LayoutParams(
                            previewSize.width, previewSize.height, Gravity.CENTER));
                    transform.setScale(-1, 1, previewSize.width / 2, 0);
                    break;
            }
            mCamera.setPreviewTexture(mSurfaceTexture);
            Log.i("ABC", "onSurfaceTextureAvailable(): Transform: " + transform.toString());
            mCamera.startPreview();
            // mTextureView.setVisibility(0);
            mCamera.setPreviewCallback(new PreviewCallback() {
                @Override
                public void onPreviewFrame(byte[] data, Camera camera) {
                    if (data == null) return;
                    Camera.Size size = mCamera.getParameters().getPreviewSize();
                    if (size == null) return;
                    // This is where I start my thread that analyzes images
                    DetectionThread thread = new DetectionThread(data, size.width, size.height);
                    thread.start();
                }
            });
        } catch (Exception t) {
            Log.i("ABC", "onSurfaceTextureAvailable Exception: " + t.getMessage());
        }
    }

    // Remaining overrides required by TextureView.SurfaceTextureListener and Service:
    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
        return true;
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture surface) { }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}
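As an aside, the four cases in the switch above encode one linear mapping from the Surface.ROTATION_* constants (the ints 0..3) to the degrees passed to setDisplayOrientation(). A plain-Java sketch of that mapping (the class and method names here are mine, purely illustrative):

```java
public class DisplayOrientation {
    // Surface.ROTATION_0..ROTATION_270 are the ints 0..3; the switch above
    // maps them to 90, 0, 270 and 180 degrees respectively. That is the
    // single formula (90 - rotation * 90) wrapped into the range 0..359.
    public static int fromRotation(int rotation) {
        return (90 - rotation * 90 + 360) % 360;
    }

    public static void main(String[] args) {
        for (int r = 0; r < 4; r++) {
            System.out.println("ROTATION_" + (r * 90) + " -> " + fromRotation(r));
        }
    }
}
```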
However, just like in the other cases, because my analysis thread is started inside onSurfaceTextureAvailable, which only runs while the texture is on screen, frame capture won't continue when I open another application.
Some ideas suggest it is possible, but I don't know how. One idea is that I could implement SurfaceTexture.OnFrameAvailableListener and, once a new frame is available, post a Runnable onto the render thread (with GLSurfaceView.queueEvent(..)) that calls SurfaceTexture.updateTexImage(). That is what I have tried (it's commented out in my code), but it doesn't work; if I do that, the app crashes.
What else can I do? I know this can work somehow, because I have seen it used in apps like SpyCameraOS (yes, I know it's open source, and I have looked through the code, but I couldn't build a working solution from it). I feel like I'm just missing one small piece somewhere, but I have no idea what I'm doing wrong. I've been at this for the past 3 days without success.
Help would be greatly appreciated.
Answer (score: 2)
To summarize the comments: direct the Camera's output to a SurfaceTexture that is not bound to a View object. A TextureView is destroyed when the activity is paused, releasing its SurfaceTexture, but if you create a separate SurfaceTexture (or detach the one from the TextureView), it will not be affected by Activity state changes. The texture can be rendered to an off-screen Surface, from which the pixels can be read.
Various examples of this can be found in Grafika.
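A rough sketch of what that looks like inside the service, assuming glTextureId names a GL_TEXTURE_EXTERNAL_OES texture on a live EGL context (the method and field names here are mine; this is the wiring, not a drop-in solution):

    // Create a SurfaceTexture that is NOT owned by any View, so it survives
    // Activity pause/destroy. The texture id must belong to a GL context you
    // control (or call detachFromGLContext() / attachToGLContext() as needed).
    private SurfaceTexture mDetachedTexture;
    private Camera mCamera;

    void startBackgroundPreview(int glTextureId) throws java.io.IOException {
        mDetachedTexture = new SurfaceTexture(glTextureId);
        mDetachedTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
            @Override
            public void onFrameAvailable(SurfaceTexture st) {
                // Called on an arbitrary thread; hand off to the thread that
                // owns the GL context before calling updateTexImage().
            }
        });
        mCamera = Camera.open();
        mCamera.setPreviewTexture(mDetachedTexture); // no View involved
        mCamera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera camera) {
                // NV21 frame data keeps arriving here even with no visible
                // preview, so the analysis thread can run in the background.
            }
        });
        mCamera.startPreview();
    }

Because the SurfaceTexture is created by the service rather than handed out by a TextureView, nothing is released when the activity goes away.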