I am trying to integrate the Camera2 API into my app. It works fine the first time and captures an image, but when I take a second picture the preview never comes back. I tested it on a Genymotion Nexus 5 emulator. The preview is not restored after the second capture, and I also get this error: java.lang.IllegalArgumentException: Surface had no valid native Surface ...
I followed these two examples: http://inducesmile.com/android/android-camera2-api-example-tutorial/?cid=519 and the GitHub Camera2Master project. Can anyone help me fix this error, and point me to a more detailed explanation of the Camera2 API?
package com.example.cameraapi;
import android.Manifest;
import android.content.Context;
import android.content.pm.PackageManager;
import android.graphics.ImageFormat;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CameraMetadata;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.TotalCaptureResult;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.media.Image;
import android.media.ImageReader;
import android.os.Bundle;
import android.os.Environment;
import android.os.Handler;
import android.os.HandlerThread;
import android.support.annotation.NonNull;
import android.support.v4.app.ActivityCompat;
import android.support.v7.app.AppCompatActivity;
import android.util.Log;
import android.util.Size;
import android.util.SparseIntArray;
import android.view.Surface;
import android.view.TextureView;
import android.view.View;
import android.widget.Button;
import android.widget.Toast;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
public class AndroidCameraApi extends AppCompatActivity {
private static final String TAG = "AndroidCameraApi";
private Button takePictureButton;
private TextureView textureView;
private static final SparseIntArray ORIENTATIONS = new SparseIntArray();
static {
ORIENTATIONS.append(Surface.ROTATION_0, 90);
ORIENTATIONS.append(Surface.ROTATION_90, 0);
ORIENTATIONS.append(Surface.ROTATION_180, 270);
ORIENTATIONS.append(Surface.ROTATION_270, 180);
}
private String cameraId;
protected CameraDevice cameraDevice;
protected CameraCaptureSession cameraCaptureSessions;
protected CaptureRequest captureRequest;
protected CaptureRequest.Builder captureRequestBuilder;
private Size imageDimension;
private ImageReader imageReader;
private File file;
private static final int REQUEST_CAMERA_PERMISSION = 200;
private boolean mFlashSupported;
private Handler mBackgroundHandler;
private HandlerThread mBackgroundThread;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_camera);
textureView = (TextureView) findViewById(R.id.textureView);
assert textureView != null;
textureView.setSurfaceTextureListener(textureListener);
takePictureButton = (Button) findViewById(R.id.btn_takepicture);
assert takePictureButton != null;
takePictureButton.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View v) {
takePicture();
}
});
}
TextureView.SurfaceTextureListener textureListener = new TextureView.SurfaceTextureListener() {
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
//open your camera here
openCamera();
}
@Override
public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) {
// Transform you image captured size according to the surface width and height
}
@Override
public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) {
return false;
}
@Override
public void onSurfaceTextureUpdated(SurfaceTexture surface) {
}
};
private final CameraDevice.StateCallback stateCallback = new CameraDevice.StateCallback() {
@Override
public void onOpened(CameraDevice camera) {
//This is called when the camera is open
Log.e(TAG, "onOpened");
cameraDevice = camera;
createCameraPreview();
}
@Override
public void onDisconnected(CameraDevice camera) {
cameraDevice.close();
}
@Override
public void onError(CameraDevice camera, int error) {
if(cameraDevice!=null)
cameraDevice.close();
cameraDevice = null;
}
};
final CameraCaptureSession.CaptureCallback captureCallbackListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
// Toast.makeText(AndroidCameraApi.this, "Saved:" + file, Toast.LENGTH_SHORT).show();
createCameraPreview();
}
};
protected void startBackgroundThread() {
mBackgroundThread = new HandlerThread("Camera Background");
mBackgroundThread.start();
mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
}
protected void stopBackgroundThread() {
mBackgroundThread.quitSafely();
try {
mBackgroundThread.join();
mBackgroundThread = null;
mBackgroundHandler = null;
} catch (InterruptedException e) {
e.printStackTrace();
}
}
protected void takePicture() {
if(null == cameraDevice) {
Log.e(TAG, "cameraDevice is null");
return;
}
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
try {
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraDevice.getId());
Size[] jpegSizes = null;
if (characteristics != null) {
jpegSizes = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP).getOutputSizes(ImageFormat.JPEG);
}
int width = 640;
int height = 480;
if (jpegSizes != null && 0 < jpegSizes.length) {
width = jpegSizes[0].getWidth();
height = jpegSizes[0].getHeight();
}
ImageReader reader = ImageReader.newInstance(width, height, ImageFormat.JPEG, 1);
List<Surface> outputSurfaces = new ArrayList<Surface>(2);
outputSurfaces.add(reader.getSurface());
outputSurfaces.add(new Surface(textureView.getSurfaceTexture()));
final CaptureRequest.Builder captureBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_STILL_CAPTURE);
captureBuilder.addTarget(reader.getSurface());
captureBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
// Orientation
int rotation = getWindowManager().getDefaultDisplay().getRotation();
captureBuilder.set(CaptureRequest.JPEG_ORIENTATION, ORIENTATIONS.get(rotation));
final File file = new File(Environment.getExternalStorageDirectory()+"/pic.jpg");
ImageReader.OnImageAvailableListener readerListener = new ImageReader.OnImageAvailableListener() {
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (image != null) {
image.close();
}
}
}
private void save(byte[] bytes) throws IOException {
OutputStream output = null;
try {
output = new FileOutputStream(file);
output.write(bytes);
} finally {
if (null != output) {
output.close();
}
}
}
};
reader.setOnImageAvailableListener(readerListener, mBackgroundHandler);
final CameraCaptureSession.CaptureCallback captureListener = new CameraCaptureSession.CaptureCallback() {
@Override
public void onCaptureCompleted(CameraCaptureSession session, CaptureRequest request, TotalCaptureResult result) {
super.onCaptureCompleted(session, request, result);
// Toast.makeText(AndroidCameraApi.this, "Saved:" + file, Toast.LENGTH_SHORT).show();
createCameraPreview();
}
};
cameraDevice.createCaptureSession(outputSurfaces, new CameraCaptureSession.StateCallback() {
@Override
public void onConfigured(CameraCaptureSession session) {
try {
session.capture(captureBuilder.build(), captureListener, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
@Override
public void onConfigureFailed(CameraCaptureSession session) {
}
}, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
protected void createCameraPreview() {
try {
SurfaceTexture texture = textureView.getSurfaceTexture();
assert texture != null;
texture.setDefaultBufferSize(imageDimension.getWidth(), imageDimension.getHeight());
Surface surface = new Surface(texture);
captureRequestBuilder = cameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
captureRequestBuilder.addTarget(surface);
cameraDevice.createCaptureSession(Arrays.asList(surface), new CameraCaptureSession.StateCallback(){
@Override
public void onConfigured(@NonNull CameraCaptureSession cameraCaptureSession) {
//The camera is already closed
if (null == cameraDevice) {
return;
}
// When the session is ready, we start displaying the preview.
cameraCaptureSessions = cameraCaptureSession;
updatePreview();
}
@Override
public void onConfigureFailed(@NonNull CameraCaptureSession cameraCaptureSession) {
Toast.makeText(AndroidCameraApi.this, "Configuration change", Toast.LENGTH_SHORT).show();
}
}, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private void openCamera() {
CameraManager manager = (CameraManager) getSystemService(Context.CAMERA_SERVICE);
Log.e(TAG, "is camera open");
try {
cameraId = manager.getCameraIdList()[0];
CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraId);
StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
assert map != null;
imageDimension = map.getOutputSizes(SurfaceTexture.class)[0];
// Add permission for camera and let user grant the permission
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED && ActivityCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE) != PackageManager.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(AndroidCameraApi.this, new String[]{Manifest.permission.CAMERA, Manifest.permission.WRITE_EXTERNAL_STORAGE}, REQUEST_CAMERA_PERMISSION);
return;
}
manager.openCamera(cameraId, stateCallback, null);
} catch (CameraAccessException e) {
e.printStackTrace();
}
Log.e(TAG, "openCamera X");
}
protected void updatePreview() {
if(null == cameraDevice) {
Log.e(TAG, "updatePreview error, return");
}
captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
try {
cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
private void closeCamera() {
if (null != cameraDevice) {
cameraDevice.close();
cameraDevice = null;
}
if (null != imageReader) {
imageReader.close();
imageReader = null;
}
}
@Override
public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
if (requestCode == REQUEST_CAMERA_PERMISSION) {
if (grantResults[0] == PackageManager.PERMISSION_DENIED) {
// close the app
Toast.makeText(AndroidCameraApi.this, "Sorry!!!, you can't use this app without granting permission", Toast.LENGTH_LONG).show();
finish();
}
}
}
@Override
protected void onResume() {
super.onResume();
Log.e(TAG, "onResume");
startBackgroundThread();
if (textureView.isAvailable()) {
openCamera();
} else {
textureView.setSurfaceTextureListener(textureListener);
}
}
@Override
protected void onPause() {
Log.e(TAG, "onPause");
//closeCamera();
stopBackgroundThread();
super.onPause();
}
}
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
>
<TextureView
android:id="@+id/textureView"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_above="@+id/btn_takepicture"
android:layout_alignParentTop="true"/>
<Button
android:id="@+id/btn_takepicture"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
android:layout_marginBottom="16dp"
android:layout_marginTop="16dp"
android:text="@string/take_picture" />
</RelativeLayout>
When I keep taking pictures I get the error shown below. After the second image the preview never comes back.
09-30 07:36:43.199 6252-6305/com.example.cameraapi E/Legacy-CameraDevice-JNI: LegacyCameraDevice_nativeGetSurfaceId: Could not retrieve native Surface from surface.
09-30 07:36:43.199 6252-6305/com.example.cameraapi E/AndroidRuntime: FATAL EXCEPTION: Thread-275
Process: com.example.cameraapi, PID: 6252
java.lang.IllegalArgumentException: Surface had no valid native Surface.
at android.hardware.camera2.legacy.LegacyCameraDevice.nativeGetSurfaceId(Native Method)
at android.hardware.camera2.legacy.LegacyCameraDevice.getSurfaceId(LegacyCameraDevice.java:658)
at android.hardware.camera2.legacy.LegacyCameraDevice.containsSurfaceId(LegacyCameraDevice.java:678)
at android.hardware.camera2.legacy.RequestThreadManager$2.onPictureTaken(RequestThreadManager.java:220)
at android.hardware.Camera$EventHandler.handleMessage(Camera.java:1092)
at android.os.Handler.dispatchMessage(Handler.java:102)
at android.os.Looper.loop(Looper.java:148)
at android.hardware.camera2.legacy.CameraDeviceUserShim$CameraLooper.run(CameraDeviceUserShim.java:136)
at java.lang.Thread.run(Thread.java:818)
09-30 07:36:43.373 6252-6252/com.example.cameraapi E/AndroidCameraApi: onPause
09-30 07:36:43.463 6252-6292/com.example.cameraapi E/Surface: getSlotFromBufferLocked: unknown buffer: 0xe05090e0
09-30 07:36:43.475 6252-6292/com.example.cameraapi D/OpenGLRenderer: endAllStagingAnimators on 0xe8b85100 (RippleDrawable) with handle 0xdf9d43d0
09-30 07:36:47.201 6252-6313/com.example.cameraapi E/RequestThread-0: Hit timeout for jpeg callback!
09-30 07:36:47.202 6252-6313/com.example.cameraapi I/CameraDeviceState: Legacy camera service transitioning to state IDLE
09-30 07:36:47.203 6252-6252/com.example.cameraapi W/MessageQueue: Handler (android.os.Handler) {ef17f10} sending message to a Handler on a dead thread
java.lang.IllegalStateException: Handler (android.os.Handler) {ef17f10} sending message to a Handler on a dead thread
at android.os.MessageQueue.enqueueMessage(MessageQueue.java:543)
at android.os.Handler.enqueueMessage(Handler.java:631)
at android.os.Handler.sendMessageAtTime(Handler.java:600)
at android.os.Handler.sendMessageDelayed(Handler.java:570)
at android.os.Handler.post(Handler.java:326)
at android.hardware.camera2.dispatch.HandlerDispatcher.dispatch(HandlerDispatcher.java:61)
at android.hardware.camera2.dispatch.MethodNameInvoker.invoke(MethodNameInvoker.java:88)
at android.hardware.camera2.dispatch.DuckTypingDispatcher.dispatch(DuckTypingDispatcher.java:53)
at android.hardware.camera2.dispatch.ArgumentReplacingDispatcher.dispatch(ArgumentReplacingDispatcher.java:74)
at android.hardware.camera2.dispatch.BroadcastDispatcher.dispatch(BroadcastDispatcher.java:54)
at android.hardware.camera2.dispatch.MethodNameInvoker.invoke(MethodNameInvoker.java:88)
at android.hardware.camera2.impl.CallbackProxies$DeviceCaptureCallbackProxy.onCaptureCompleted(CallbackProxies.java:121)
at android.hardware.camera2.impl.CameraDeviceImpl$CameraDeviceCallbacks$4.run(CameraDeviceImpl.java:1828)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:148)
at android.app.ActivityThread.main(ActivityThread.java:5417)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)
09-30 07:36:47.204 6252-6252/com.example.cameraapi W/MessageQueue: Handler (android.os.Handler) {ef17f10} sending message to a Handler on a dead thread
java.lang.IllegalStateException: Handler (android.os.Handler) {ef17f10} sending message to a Handler on a dead thread
at android.os.MessageQueue.enqueueMessage(MessageQueue.java:543)
at android.os.Handler.enqueueMessage(Handler.java:631)
at android.os.Handler.sendMessageAtTime(Handler.java:600)
at android.os.Handler.sendMessageDelayed(Handler.java:570)
at android.os.Handler.post(Handler.java:326)
at android.hardware.camera2.dispatch.HandlerDispatcher.dispatch(HandlerDispatcher.java:61)
at android.hardware.camera2.dispatch.MethodNameInvoker.invoke(MethodNameInvoker.java:88)
at android.hardware.camera2.dispatch.DuckTypingDispatcher.dispatch(DuckTypingDispatcher.java:53)
at android.hardware.camera2.dispatch.ArgumentReplacingDispatcher.dispatch(ArgumentReplacingDispatcher.java:74)
at android.hardware.camera2.dispatch.BroadcastDispatcher.dispatch(BroadcastDispatcher.java:54)
at android.hardware.camera2.dispatch.MethodNameInvoker.invoke(MethodNameInvoker.java:88)
at android.hardware.camera2.impl.CallbackProxies$DeviceCaptureCallbackProxy.onCaptureSequenceCompleted(CallbackProxies.java:133)
at android.hardware.camera2.impl.CameraDeviceImpl$10.run(CameraDeviceImpl.java:1588)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:148)
at android.app.ActivityThread.main(ActivityThread.java:5417)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)
[ 09-30 07:36:48.448 6252: 6257 D/ ]
HostConnection::get() New Host Connection established 0xe9952950, tid 6257
Answer 0 (score: 3)
I have seen this problem before; it happens because the thread the camera was using no longer exists. Debug into the Camera API and you will see it uses Handlers to invoke the necessary callbacks.
The culprit is right here:
sending message to a Handler on a dead thread
Your onPause() is killing the thread the Camera API is using, which produces that error message. That is not the real problem, though; the real issue is here:
@Override
protected void onPause() {
Log.e(TAG, "onPause");
//closeCamera();
stopBackgroundThread();
super.onPause();
}
The Camera2 API complains a lot if the surfaces are not configured with the right parameters. Since you are following that sample, you should probably initialize and configure the TextureView the way the linked example does. As far as I remember they extend the class and provide a helper method for the configuration, which would likely fix this.
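To give an idea of what that configuration looks like, here is a rough sketch of a transform helper along the lines of the official Camera2Basic sample. It assumes imageDimension holds the preview size chosen in openCamera(), that android.graphics.Matrix and android.graphics.RectF are imported, and that you call it from onSurfaceTextureAvailable() and onSurfaceTextureSizeChanged(). It is only a sketch of the idea, not the exact helper from the linked example.
private void configureTransform(int viewWidth, int viewHeight) {
    if (null == textureView || null == imageDimension) {
        return;
    }
    int rotation = getWindowManager().getDefaultDisplay().getRotation();
    Matrix matrix = new Matrix();
    RectF viewRect = new RectF(0, 0, viewWidth, viewHeight);
    // The camera buffer is landscape, so its width/height are swapped relative to the view
    RectF bufferRect = new RectF(0, 0, imageDimension.getHeight(), imageDimension.getWidth());
    float centerX = viewRect.centerX();
    float centerY = viewRect.centerY();
    if (Surface.ROTATION_90 == rotation || Surface.ROTATION_270 == rotation) {
        bufferRect.offset(centerX - bufferRect.centerX(), centerY - bufferRect.centerY());
        matrix.setRectToRect(viewRect, bufferRect, Matrix.ScaleToFit.FILL);
        float scale = Math.max(
                (float) viewHeight / imageDimension.getHeight(),
                (float) viewWidth / imageDimension.getWidth());
        matrix.postScale(scale, scale, centerX, centerY);
        matrix.postRotate(90 * (rotation - 2), centerX, centerY);
    } else if (Surface.ROTATION_180 == rotation) {
        matrix.postRotate(180, centerX, centerY);
    }
    textureView.setTransform(matrix);
}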
As a side note, you should call super before doing your own work.
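Putting that note together with releasing the camera, your onPause() might look roughly like this; it is just one way to arrange it, assuming you re-enable the closeCamera() call you commented out:
@Override
protected void onPause() {
    super.onPause();            // call through to super first, as suggested above
    Log.e(TAG, "onPause");
    closeCamera();              // release the CameraDevice before its handler thread goes away
    stopBackgroundThread();
}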
Answer 1 (score: 1)
I found two causes for this error:
1) For video I had picked CamcorderProfile.QUALITY_HIGH, which MediaRecorder does not always support, so I had to check both:
profileHQ = CamcorderProfile.get(Integer.parseInt(cameraDevice.getId()),
        CamcorderProfile.QUALITY_HIGH);
for (Size size : cameraConfig.getOutputSizes(MediaRecorder.class)) {
    if (size.getHeight() == profileHQ.videoFrameHeight && size.getWidth() == profileHQ.videoFrameWidth) {
        // you may use this profile
    }
}
2) For still capture I have a phone that produces huge images (5248x3936), which take a long time to save, and the next capture request interrupted the first one: the thread was shut down and the surface became unusable. Now I wait until the image has been saved before taking the next picture:
@Override
public void onImageAvailable(ImageReader reader) {
Image image = null;
try {
image = reader.acquireLatestImage();
ByteBuffer buffer = image.getPlanes()[0].getBuffer();
byte[] bytes = new byte[buffer.capacity()];
buffer.get(bytes);
save(bytes);
} catch (IOException e) {
throw new IllegalStateException("Unable to write image to buffer.", e);
} finally {
if (image != null) {
image.close();
}
try {
LOGGER.debug("Taking next picture ...");
if (!closeRequestReceived) takePicture();
} catch (CameraAccessException e) {
e.printStackTrace();
}
}
}
Answer 2 (score: 0)
I ran into the same error and it turned out to be an API-level problem. Try putting this at the very top of your class, and make sure only API 21+ devices can reach the activity.
@RequiresApi(api = Build.VERSION_CODES.LOLLIPOP) //<-- This line
public class YourClassName {
....
}
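If your minSdkVersion is below 21, you also need a runtime check before entering the activity. A minimal sketch, assuming android.os.Build and android.content.Intent are imported and with the Intent target as a placeholder for your own launcher code:
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.LOLLIPOP) {
    // Camera2 is available from API 21, so it is safe to open the camera activity
    startActivity(new Intent(this, AndroidCameraApi.class));
} else {
    Toast.makeText(this, "This feature needs Android 5.0 (API 21) or newer", Toast.LENGTH_SHORT).show();
}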
Answer 3 (score: 0)
A bit late, but I had exactly the same problem with the same app, and after reading @Clocker's answer I realized he was completely right. We all tend to forget that the second time the app runs (i.e. when it is brought back to the foreground) onPause() gets called right after onResume(), so it kills the thread... and all the later camera setup operations fail, especially the Surface. So just add some logic so that the teardown part of onPause() only runs when it should.
@Override
public void onPause() {
if(onResume_just_ran){super.onPause(); return;}
super.onPause();
closeCamera();
stopBackgroundThread();
}
The boolean onResume_just_ran is set to true in onResume(), so onPause() will not run stopBackgroundThread(). This works fine for me! Also, remember to set the boolean back to false somewhere in your program, so that when the activity really goes into onPause() on exit, the bottom of the method does run; in my program that happens elsewhere.
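A rough sketch of that flag, just to illustrate the idea; where you reset it back to false is up to your own flow, as noted above. Here, purely as an example, it is cleared once the preview is actually running:
private volatile boolean onResume_just_ran = false;

@Override
protected void onResume() {
    super.onResume();
    onResume_just_ran = true;    // protect the background thread from the spurious onPause()
    startBackgroundThread();
    if (textureView.isAvailable()) {
        openCamera();
    } else {
        textureView.setSurfaceTextureListener(textureListener);
    }
}

protected void updatePreview() {
    if (null == cameraDevice) {
        Log.e(TAG, "updatePreview error, return");
        return;
    }
    captureRequestBuilder.set(CaptureRequest.CONTROL_MODE, CameraMetadata.CONTROL_MODE_AUTO);
    try {
        cameraCaptureSessions.setRepeatingRequest(captureRequestBuilder.build(), null, mBackgroundHandler);
        onResume_just_ran = false;   // preview is up, so a later onPause() may tear things down normally
    } catch (CameraAccessException e) {
        e.printStackTrace();
    }
}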