Getting an error on Android 8 when using SCameraCaptureSession on Samsung devices

Date: 2018-03-28 11:32:51

标签: android samsung-mobile illegalargumentexception

I am trying to capture video using the SCameraCaptureSession class. When I call its setRepeatingRequest function (described here), I get the following error:

java.lang.IllegalArgumentException: CaptureRequest contains unconfigured Input/Output Surface!

As far as I can tell, the problem is something going wrong with the MediaRecorder's Surface object. However, it works fine on Android versions earlier than 8, and the crash only happens on Samsung devices running Android 8. Googling turns up nothing useful about this crash, so I assume it is fairly new...

Does anyone have any information on this? How can I get the MediaRecorder's surface to work properly on the devices I mentioned?

Important note: capturing video works perfectly on any Android version prior to 8.

5 Answers:

Answer 0 (score: 3):

There seems to be a problem with the surface configuration coming from MediaRecorder. It should work if you pass in a custom persistent surface instead:

  1. Create the surface by calling MediaCodec.createPersistentInputSurface()

  2. Pass it to the recorder with mediaRecorder.setInputSurface(yourSurface);

  3. Once you have stopped using this surface, call yourSurface.release().

  Note: if you decide to use this approach, do not use mediaRecorder.getSurface(). A minimal sketch of these steps appears after the references below.

    References:

    MediaRecorder:MediaRecorder - Android Docs

    MediaCodec:MediaCodec - Android Docs
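
Putting the steps above together, a minimal sketch could look like this (the field names and the surrounding MediaRecorder configuration are assumptions, not taken from the answer):

// Create a persistent input surface and hand it to MediaRecorder before prepare().
Surface persistentSurface = MediaCodec.createPersistentInputSurface();

MediaRecorder mediaRecorder = new MediaRecorder();
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);
// ... configure audio source, output format, encoders, output file, etc. ...
mediaRecorder.setInputSurface(persistentSurface); // instead of using mediaRecorder.getSurface()
mediaRecorder.prepare();

// Add 'persistentSurface' to the capture session's surface list and as a
// CaptureRequest target, in place of the surface from getSurface().

// Once the surface is no longer used (after stopping and releasing the recorder):
persistentSurface.release();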

Answer 1 (score: 1):

I had the same exception and managed to solve it in my case. The root cause for me was that I was re-creating the preview Surface. Once I changed the code to reuse the Surface instead of re-creating it, the exception disappeared.

My code also ran fine before Android 8.0.

My camera initialization is as follows.

CameraDevice mCameraDevice;
CameraCaptureSession mCameraCaptureSession;
CaptureRequest mCaptureRequest;
Surface mTextureViewSurface;

public void updateCameraState(boolean run) {
    if (run) {
        if (mTextureView == null || !mTextureView.isAvailable()) {
            // wait until mTextureView is available
            // then call updateCameraState() again via SurfaceTextureListener

            return;
        }
        if (mCameraDevice == null) {
            // open camera and wait until mCameraDevice is obtained.
            // then call updateCameraState() again via CameraDevice.StateCallback

            mCameraManager.openCamera(...);
            return;
        }
        if (mCameraCaptureSession == null) {
            // createCaptureSession and wait until mCameraCaptureSession is obtained.
            // then call updateCameraState() again via CameraCaptureSession.StateCallback

            mTextureViewSurface = new Surface(texture);
            List<Surface> surfaces = Arrays.asList(mTextureViewSurface, mImageReader.getSurface());
            mCameraDevice.createCaptureSession(surfaces, mSessionStateCallback, sHandler);
            return;
        }
        if (mCaptureRequest == null) {
            CaptureRequest.Builder builder = mCameraCaptureSession.getDevice().createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            /* Put some values into builder */

            // *************************************************************************
            // POINT: in my old code, the Surface was re-created here
            // *************************************************************************
            // Surface surface = new Surface(texture);
            // builder.addTarget(surface);

            builder.addTarget(mTextureViewSurface);
            mCameraCaptureSession.setRepeatingRequest(builder.build(), mCaptureCallback, sHandler);
        }
        // fin
    } else {
        if (mCaptureRequest != null) {
            mCaptureRequest = null;
        }
        // *************************************************************************
        // POINT: I am not sure whether release() is needed, but I add it here.
        // *************************************************************************
        if (mTextureViewSurface != null) {
            mTextureViewSurface.release();
            mTextureViewSurface = null;
        }
        if (mCameraCaptureSession != null) {
            mCameraCaptureSession.close();
            mCameraCaptureSession = null;
        }
        if (mCameraDevice != null) {
            mCameraDevice.close();
            mCameraDevice = null;
        }
    }
}

Answer 2 (score: 0):

I had the same symptom. I solved it by using the SurfaceView class together with the old android.hardware.camera API instead of android.hardware.camera2.

Here is the part of the code that solved it:

@Override
public void onPreviewFrame(byte[] data, Camera camera) {

    // encoding data
    encoding(data) ;
}


/**
 * byte data encoding
 * @param data
 */
private void encoding (byte[] data) {
    // needed for API levels below 21
    ByteBuffer[] inputBuffers = this.mediaCodec.getInputBuffers();
    ByteBuffer[] outputBuffers = this.mediaCodec.getOutputBuffers();

    int inputBufferIndex = this.mediaCodec.dequeueInputBuffer(TIMEOUT_USEC/* wait time, negative value means infinite */);
    // if an input buffer is available for writing data
    if (inputBufferIndex >= 0) {
        // data is null (last chunk of data)
        int length = 0, flags = MediaCodec.BUFFER_FLAG_END_OF_STREAM;

        if (data != null) {
            ByteBuffer inputBuffer = null;
            if (CameraUtils.isCamera2()) inputBuffer = this.mediaCodec.getInputBuffer(inputBufferIndex);
            else inputBuffer = inputBuffers[inputBufferIndex];

            inputBuffer.clear();
            inputBuffer.put(data);

            length = data.length;
            flags = 0;
        }
        /*
         - index : the index number returned by dequeueInputBuffer.
         - offset : usually 0, but you can specify where the data you filled into the buffer starts.
         - size : the size of the data filled into the buffer.
         - presentationTimeUs : for decoding, the time (in microseconds) at which the data should be played.
         - flags : indicates whether the buffer contains codec config data (BUFFER_FLAG_CODEC_CONFIG) or is the last chunk of data (BUFFER_FLAG_END_OF_STREAM).
            Usually 0 is passed; to signal the last chunk of data, pass BUFFER_FLAG_END_OF_STREAM.
         */
        this.mediaCodec.queueInputBuffer(inputBufferIndex, 0, length, computePresentationTimeNsec(), flags);
    }

    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int outputBufferIndex = this.mediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC/* wait time, negative value means infinite */);
    switch (outputBufferIndex) {
        /*
         MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED
         - The output buffers have changed (this happens once).
         - From API 21 (Lollipop) this is @deprecated and no longer needed, but on earlier APIs it is essential: when it is returned, the ByteBuffer[] array obtained at the start has changed.
         */
        case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
            Log.i(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
            outputBuffers = this.mediaCodec.getOutputBuffers();
            break;
         /*MediaCodec.INFO_OUTPUT_FORMAT_CHANGED
         - Remember the MediaFormat created at the beginning? This notifies you that that MediaFormat has changed.
         - This case is mainly relevant for encoders; decoders rarely need it.
         */
        case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
            if (this.isMuxerStart) throw new RuntimeException("Format changed twice");
            Log.d(TAG, "INFO_OUTPUT_FORMAT_CHANGED format : " + this.mediaCodec.getOutputFormat());

            this.trackId = this.mediaMuxer.addTrack(this.mediaCodec.getOutputFormat());
            this.mediaMuxer.start();
            this.isMuxerStart = true;
            break;
         /*MediaCodec.INFO_TRY_AGAIN_LATER
         - If this is returned, it can simply be ignored.
         */
        case MediaCodec.INFO_TRY_AGAIN_LATER:
            break;
         /*outputBufferIndex >= 0
         - This is the case where the actual output data arrives.
         */
        default:
            while (outputBufferIndex >= 0 && this.mediaCodec != null && this.mediaMuxer != null) {
                ByteBuffer outputBuffer = null;
                if (CameraUtils.isCamera2()) outputBuffer = this.mediaCodec.getOutputBuffer(outputBufferIndex);
                else outputBuffer = outputBuffers[outputBufferIndex];

                // null exception
                if (outputBuffer == null)
                    throw new RuntimeException("EncoderOutputBuffer " + outputBuffer + " was NULL");

                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // The codec config data was pulled out and fed to the muxer when we got
                    // the INFO_OUTPUT_FORMAT_CHANGED status.  Ignore it.
                    bufferInfo.size = 0;
                }

                if (bufferInfo.size != 0) {
                    if (!this.isMuxerStart) throw new RuntimeException("MediaMuxer hasn't started");

                    // write the frame's timestamp
                    bufferInfo.presentationTimeUs = computePresentationTimeNsec();
                    this.prevTime = bufferInfo.presentationTimeUs;
                    this.mediaMuxer.writeSampleData(this.trackId, outputBuffer, bufferInfo);
                }
                this.mediaCodec.releaseOutputBuffer(outputBufferIndex, false/* true is surface init */);
                outputBufferIndex = this.mediaCodec.dequeueOutputBuffer(bufferInfo, TIMEOUT_USEC/* wait time, negative value means infinite */);

                // end of frame
                if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    // release
                    releaseRecorder();
                    // saving complete
                    onCompleteEncoding(recordPath);
                    stopEncodingThread();
                    return;
                }
            }
            break;
    }
}
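
For context, here is a hypothetical sketch of how the legacy camera preview can be wired to a SurfaceView so that onPreviewFrame() above receives frames (the class and parameter names are assumptions, not part of the original answer):

import android.hardware.Camera;
import android.view.SurfaceHolder;
import java.io.IOException;

// Hypothetical helper: open the deprecated android.hardware.Camera, attach it to the
// SurfaceView's holder, and register the PreviewCallback whose onPreviewFrame() calls encoding().
@SuppressWarnings("deprecation")
final class LegacyPreviewStarter {
    static Camera start(SurfaceHolder holder, Camera.PreviewCallback callback) throws IOException {
        Camera camera = Camera.open();        // deprecated API, but avoids camera2/SCamera entirely
        camera.setPreviewDisplay(holder);     // render the preview into the SurfaceView
        camera.setPreviewCallback(callback);  // delivers NV21 byte[] frames to onPreviewFrame()
        camera.startPreview();
        return camera;
    }
}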

Answer 3 (score: 0):

I ran into the same problem as you. It only worked once I used the SCameraProcessor from the Samsung SCamera SDK.

I recommend downloading the sample APK they provide and studying it carefully, but here is a snippet with the main parts you need to pay extra attention to:

Setup:

SCamera sCamera = new SCamera();
sCamera.initialize(this);
...
SCameraProcessorManager processorManager = sCamera.getSCameraProcessorManager();
SCameraEffectProcessor processor = processorManager
    .createProcessor(SCameraProcessorManager.PROCESSOR_TYPE_EFFECT);
...
processor.initialize();
...
// Carry out the opening process of the camera device here.
...
processor.setOutputSurface(outputSurface);
Surface cameraSurface = processor.getInputSurface();

// 'cameraSurface' above must then be added as a target to your
// SCaptureRequest.Builder and included in the list of surfaces to be
// configured when calling SCameraDevice.createCaptureSession().

Start recording:

// After setting up your MediaRecorder object...
processor.setRecordingSurface(mediaRecorder.getSurface());
mediaRecorder.start();
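
For completeness, the MediaRecorder setup that the comment above refers to could look roughly like this (a sketch with placeholder values, not taken from the original answer):

MediaRecorder mediaRecorder = new MediaRecorder();
mediaRecorder.setAudioSource(MediaRecorder.AudioSource.MIC);
mediaRecorder.setVideoSource(MediaRecorder.VideoSource.SURFACE);   // the recorder will expose a Surface
mediaRecorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
mediaRecorder.setOutputFile(outputFilePath);                       // placeholder path variable
mediaRecorder.setVideoEncodingBitRate(10_000_000);
mediaRecorder.setVideoFrameRate(30);
mediaRecorder.setVideoSize(1920, 1080);                            // pick a size the device supports
mediaRecorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
mediaRecorder.setAudioEncoder(MediaRecorder.AudioEncoder.AAC);
mediaRecorder.prepare();
// Then: processor.setRecordingSurface(mediaRecorder.getSurface()); mediaRecorder.start();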

Stop recording:

processor.setRecordingSurface(null);
mediaRecorder.stop();
mediaRecorder.reset();


I hope this helps!

Answer 4 (score: 0):

If you are using the androidx camera library, I recommend updating to its latest version (1.0.0-alpha03), which, as described in the changelog, should fix this issue.

https://developer.android.com/jetpack/androidx/releases/camera#1.0.0-alpha03

  • Fixed the unconfigured Input/Output Surface crash when rapidly opening/closing or binding/unbinding