I'm trying to turn CameraToMpegTest.java from http://bigflake.com/mediacodec/ into an Android project. I've run into some trouble; here is the error from logcat:
12-13 17:19:38.074: E/AndroidRuntime(4929): FATAL EXCEPTION: main
12-13 17:19:38.074: E/AndroidRuntime(4929): java.lang.RuntimeException: Unable to resume activity {com.brendon.cameratompeg/com.brendon.cameratompeg.CameraToMpeg}: java.lang.IllegalStateException: Can't stop due to wrong state.
12-13 17:19:38.074: E/AndroidRuntime(4929): at android.app.ActivityThread.performResumeActivity(ActivityThread.java:2918)
12-13 17:19:38.074: E/AndroidRuntime(4929): at android.app.ActivityThread.handleResumeActivity(ActivityThread.java:2947)
12-13 17:19:38.074: E/AndroidRuntime(4929): at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:2394)
The code fails at "mStManager.awaitNewImage();", which is inside my onResume() function. Logcat says "Camera frame wait timed out". mStManager is an instance of the SurfaceTextureManager class, and the "Camera frame wait timed out" message comes from its awaitNewImage() function; I've included that class further down in this post.
Here is part of my code (the onCreate() and onResume() functions):
@Override
protected void onCreate(Bundle savedInstanceState) {
    // arbitrary but popular values
    int encWidth = 640;
    int encHeight = 480;
    int encBitRate = 6000000;      // Mbps
    Log.d(TAG, MIME_TYPE + " output " + encWidth + "x" + encHeight + " @" + encBitRate);

    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_camera_to_mpeg);

    prepareCamera(encWidth, encHeight);
    prepareEncoder(encWidth, encHeight, encBitRate);
    mInputSurface.makeCurrent();
    prepareSurfaceTexture();

    mCamera.startPreview();
}
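prepareCamera(), prepareEncoder() and prepareSurfaceTexture() are copied from the bigflake sample. The one that matters for this error is prepareSurfaceTexture(), which hooks the camera preview up to the SurfaceTextureManager shown further down. For context, it looks roughly like this in the sample (paraphrased, my copy should be equivalent):

private void prepareSurfaceTexture() {
    mStManager = new SurfaceTextureManager();
    SurfaceTexture st = mStManager.getSurfaceTexture();
    try {
        // The camera delivers preview frames into this SurfaceTexture;
        // its onFrameAvailable() callback is what wakes up awaitNewImage().
        mCamera.setPreviewTexture(st);
    } catch (IOException ioe) {
        throw new RuntimeException("setPreviewTexture failed", ioe);
    }
}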
@Override
public void onResume() {
    try {
        long startWhen = System.nanoTime();
        long desiredEnd = startWhen + DURATION_SEC * 1000000000L;
        SurfaceTexture st = mStManager.getSurfaceTexture();
        int frameCount = 0;

        while (System.nanoTime() < desiredEnd) {
            // Feed any pending encoder output into the muxer.
            drainEncoder(false);

            // Switch up the colors every 15 frames. Besides demonstrating the use of
            // fragment shaders for video editing, this provides a visual indication of
            // the frame rate: if the camera is capturing at 15fps, the colors will change
            // once per second.
            if ((frameCount % 15) == 0) {
                String fragmentShader = null;
                if ((frameCount & 0x01) != 0) {
                    fragmentShader = SWAPPED_FRAGMENT_SHADER;
                }
                mStManager.changeFragmentShader(fragmentShader);
            }
            frameCount++;

            // Acquire a new frame of input, and render it to the Surface. If we had a
            // GLSurfaceView we could switch EGL contexts and call drawImage() a second
            // time to render it on screen. The texture can be shared between contexts by
            // passing the GLSurfaceView's EGLContext as eglCreateContext()'s share_context
            // argument.
            mStManager.awaitNewImage();
            mStManager.drawImage();

            // Set the presentation time stamp from the SurfaceTexture's time stamp. This
            // will be used by MediaMuxer to set the PTS in the video.
            if (VERBOSE) {
                Log.d(TAG, "present: " +
                        ((st.getTimestamp() - startWhen) / 1000000.0) + "ms");
            }
            mInputSurface.setPresentationTime(st.getTimestamp());

            // Submit it to the encoder. The eglSwapBuffers call will block if the input
            // is full, which would be bad if it stayed full until we dequeued an output
            // buffer (which we can't do, since we're stuck here). So long as we fully drain
            // the encoder before supplying additional input, the system guarantees that we
            // can supply another frame without blocking.
            if (VERBOSE) Log.d(TAG, "sending frame to encoder");
            mInputSurface.swapBuffers();
        }

        // send end-of-stream to encoder, and drain remaining output
        drainEncoder(true);
    } catch (Exception e) {
        Log.d(TAG, e.getMessage());
        // release everything we grabbed
        releaseCamera();
        releaseEncoder();
        releaseSurfaceTexture();
    }
}
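The release helpers in the catch block are also taken from the sample. I suspect the "Can't stop due to wrong state" line in the logcat is a follow-on error: awaitNewImage() times out, the catch block runs, and releaseEncoder() then calls stop() on a codec/muxer that never produced any data, and that exception escapes onResume(). For reference, the sample's releaseEncoder() is roughly this (paraphrased, details may differ from my copy):

private void releaseEncoder() {
    if (VERBOSE) Log.d(TAG, "releasing encoder objects");
    if (mEncoder != null) {
        // MediaCodec.stop() can throw IllegalStateException when the codec is in the wrong state
        mEncoder.stop();
        mEncoder.release();
        mEncoder = null;
    }
    if (mInputSurface != null) {
        mInputSurface.release();
        mInputSurface = null;
    }
    if (mMuxer != null) {
        // MediaMuxer.stop() throws IllegalStateException if start() was never called
        mMuxer.stop();
        mMuxer.release();
        mMuxer = null;
    }
}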
And here is the class from my code that the error comes from:
private static class SurfaceTextureManager
        implements SurfaceTexture.OnFrameAvailableListener {
    private SurfaceTexture mSurfaceTexture;
    private CameraToMpeg.STextureRender mTextureRender;

    private Object mFrameSyncObject = new Object();     // guards mFrameAvailable
    private boolean mFrameAvailable;

    /**
     * Creates instances of TextureRender and SurfaceTexture.
     */
    public SurfaceTextureManager() {
        mTextureRender = new CameraToMpeg.STextureRender();
        mTextureRender.surfaceCreated();

        if (VERBOSE) Log.d(TAG, "textureID=" + mTextureRender.getTextureId());
        mSurfaceTexture = new SurfaceTexture(mTextureRender.getTextureId());

        // This doesn't work if this object is created on the thread that CTS started for
        // these test cases.
        //
        // The CTS-created thread has a Looper, and the SurfaceTexture constructor will
        // create a Handler that uses it. The "frame available" message is delivered
        // there, but since we're not a Looper-based thread we'll never see it. For
        // this to do anything useful, OutputSurface must be created on a thread without
        // a Looper, so that SurfaceTexture uses the main application Looper instead.
        //
        // Java language note: passing "this" out of a constructor is generally unwise,
        // but we should be able to get away with it here.
        mSurfaceTexture.setOnFrameAvailableListener(this);
    }

    public void release() {
        // this causes a bunch of warnings that appear harmless but might confuse someone:
        //  W BufferQueue: [unnamed-3997-2] cancelBuffer: BufferQueue has been abandoned!
        //mSurfaceTexture.release();
        mTextureRender = null;
        mSurfaceTexture = null;
    }

    /**
     * Returns the SurfaceTexture.
     */
    public SurfaceTexture getSurfaceTexture() {
        return mSurfaceTexture;
    }

    /**
     * Replaces the fragment shader.
     */
    public void changeFragmentShader(String fragmentShader) {
        mTextureRender.changeFragmentShader(fragmentShader);
    }

    /**
     * Latches the next buffer into the texture. Must be called from the thread that created
     * the OutputSurface object.
     */
    public void awaitNewImage() {
        final int TIMEOUT_MS = 2500;

        synchronized (mFrameSyncObject) {
            while (!mFrameAvailable) {
                try {
                    // Wait for onFrameAvailable() to signal us. Use a timeout to avoid
                    // stalling the test if it doesn't arrive.
                    mFrameSyncObject.wait(TIMEOUT_MS);
                    if (!mFrameAvailable) {
                        // TODO: if "spurious wakeup", continue while loop
                        throw new RuntimeException("Camera frame wait timed out");
                    }
                } catch (InterruptedException ie) {
                    // shouldn't happen
                    throw new RuntimeException(ie);
                }
            }
            mFrameAvailable = false;
        }

        // Latch the data.
        mTextureRender.checkGlError("before updateTexImage");
        mSurfaceTexture.updateTexImage();
    }

    /**
     * Draws the data from SurfaceTexture onto the current EGL surface.
     */
    public void drawImage() {
        mTextureRender.drawFrame(mSurfaceTexture);
    }

    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        if (VERBOSE) Log.d(TAG, "new frame available");
        synchronized (mFrameSyncObject) {
            if (mFrameAvailable) {
                throw new RuntimeException("mFrameAvailable already set, frame could be dropped");
            }
            mFrameAvailable = true;
            mFrameSyncObject.notifyAll();
        }
    }
}
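One difference from the original that might be relevant: the constructor comment above says the object has to be created on a thread without a Looper (and awaitNewImage() called from that same thread), so that the "frame available" message gets delivered on the main application Looper. The original test does this by running the whole encode loop on a plain worker thread, roughly like this (paraphrased from the bigflake code):

private static class CameraToMpegWrapper implements Runnable {
    private Throwable mThrowable;
    private CameraToMpegTest mTest;

    private CameraToMpegWrapper(CameraToMpegTest test) {
        mTest = test;
    }

    @Override
    public void run() {
        try {
            // the whole capture/encode loop runs here, off the Looper thread
            mTest.encodeCameraToMpeg();
        } catch (Throwable th) {
            mThrowable = th;
        }
    }

    /** Runs the test on a new thread and re-throws anything it threw. */
    public static void runTest(CameraToMpegTest obj) throws Throwable {
        CameraToMpegWrapper wrapper = new CameraToMpegWrapper(obj);
        Thread th = new Thread(wrapper, "codec test");
        th.start();
        th.join();
        if (wrapper.mThrowable != null) {
            throw wrapper.mThrowable;
        }
    }
}

My version runs the loop directly in onResume() on the UI thread instead, and I'm not sure whether that difference is what causes the timeout.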
Does anyone have any ideas? Thanks!