libstreaming errors: "The decoder input buffer is not big enough" and "The decoder did not decode anything"

Date: 2014-06-09 20:17:39

Tags: android video-streaming android-camera rtsp android-mediarecorder

I am trying to use the libstreaming library from here: https://github.com/fyhertz/libstreaming

I am following example 2: https://github.com/fyhertz/libstreaming-examples

I am trying to get this streaming library working on a Galaxy Nexus.

If I use a small resolution (new VideoQuality(128, 96, 20, 500000)), I get an error saying the decoder did not decode anything:

06-09 19:59:31.531: D/libEGL(8198): loaded /vendor/lib/egl/libEGL_POWERVR_SGX540_120.so
06-09 19:59:31.539: D/libEGL(8198): loaded /vendor/lib/egl/libGLESv1_CM_POWERVR_SGX540_120.so
06-09 19:59:31.539: D/libEGL(8198): loaded /vendor/lib/egl/libGLESv2_POWERVR_SGX540_120.so
06-09 19:59:31.632: D/OpenGLRenderer(8198): Enabling debug mode 0
06-09 19:59:33.773: D/MainActivity(8198): Start
06-09 19:59:33.773: D/MainActivity(8198): Found mSurfaceView: net.majorkernelpanic.streaming.gl.SurfaceView{420285e0 V.E..... ........ 32,32-688,910 #7f080001 app:id/surface}
06-09 19:59:33.789: I/dalvikvm(8198): Could not find method android.media.MediaCodec.createInputSurface, referenced from method net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodecMethod2
06-09 19:59:33.789: W/dalvikvm(8198): VFY: unable to resolve virtual method 377: Landroid/media/MediaCodec;.createInputSurface ()Landroid/view/Surface;
06-09 19:59:33.789: D/dalvikvm(8198): VFY: replacing opcode 0x6e at 0x005e
06-09 19:59:33.789: I/MediaStream(8198): Phone supports the MediaCoded API
06-09 19:59:33.843: D/dalvikvm(8198): GC_CONCURRENT freed 65K, 2% free 9075K/9168K, paused 4ms+2ms, total 34ms
06-09 19:59:33.843: D/dalvikvm(8198): WAIT_FOR_CONCURRENT_GC blocked 15ms
06-09 19:59:34.750: V/VideoQuality(8198): Supported resolutions: 1920x1080, 1280x720, 960x720, 800x480, 720x576, 720x480, 768x576, 640x480, 320x240, 352x288, 240x160, 176x144, 128x96
06-09 19:59:34.750: V/VideoQuality(8198): Supported frame rates: 15-15fps, 15-30fps, 24-30fps
06-09 19:59:35.140: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.171: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.179: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.211: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.242: W/ACodec(8198): Use baseline profile instead of 8 for AVC recording
06-09 19:59:35.242: I/ACodec(8198): setupVideoEncoder succeeded
06-09 19:59:35.515: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.515: E/OMXNodeInstance(8198): OMX_GetExtensionIndex failed
06-09 19:59:36.359: D/dalvikvm(8198): GC_CONCURRENT freed 156K, 3% free 9356K/9552K, paused 4ms+5ms, total 25ms
06-09 19:59:38.531: W/System.err(8198): java.lang.RuntimeException: The decoder did not decode anything.
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.decode(EncoderDebugger.java:799)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:246)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:115)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.video.H264Stream.testMediaCodecAPI(H264Stream.java:132)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.video.H264Stream.testH264(H264Stream.java:119)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.video.H264Stream.configure(H264Stream.java:111)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.Session.syncConfigure(Session.java:395)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.Session$3.run(Session.java:371)
06-09 19:59:38.539: W/System.err(8198):     at android.os.Handler.handleCallback(Handler.java:725)
06-09 19:59:38.539: W/System.err(8198):     at android.os.Handler.dispatchMessage(Handler.java:92)
06-09 19:59:38.546: W/System.err(8198):     at android.os.Looper.loop(Looper.java:137)
06-09 19:59:38.546: W/System.err(8198):     at android.os.HandlerThread.run(HandlerThread.java:60)

If I try a larger resolution (new VideoQuality(640, 480, 20, 500000)), it complains that the decoder input buffer is not big enough:

06-09 19:51:51.054: D/libEGL(8096): loaded /vendor/lib/egl/libEGL_POWERVR_SGX540_120.so
06-09 19:51:51.062: D/libEGL(8096): loaded /vendor/lib/egl/libGLESv1_CM_POWERVR_SGX540_120.so
06-09 19:51:51.070: D/libEGL(8096): loaded /vendor/lib/egl/libGLESv2_POWERVR_SGX540_120.so
06-09 19:51:51.164: D/OpenGLRenderer(8096): Enabling debug mode 0
06-09 19:51:53.054: D/MainActivity(8096): Start
06-09 19:51:53.054: D/MainActivity(8096): Found mSurfaceView: net.majorkernelpanic.streaming.gl.SurfaceView{42031b00 V.E..... ........ 32,32-688,910 #7f080001 app:id/surface}
06-09 19:51:53.062: I/dalvikvm(8096): Could not find method android.media.MediaCodec.createInputSurface, referenced from method net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodecMethod2
06-09 19:51:53.062: W/dalvikvm(8096): VFY: unable to resolve virtual method 377: Landroid/media/MediaCodec;.createInputSurface ()Landroid/view/Surface;
06-09 19:51:53.062: D/dalvikvm(8096): VFY: replacing opcode 0x6e at 0x005e
06-09 19:51:53.070: I/MediaStream(8096): Phone supports the MediaCoded API
06-09 19:51:53.132: D/dalvikvm(8096): GC_CONCURRENT freed 103K, 2% free 9038K/9168K, paused 4ms+3ms, total 42ms
06-09 19:51:53.132: D/dalvikvm(8096): WAIT_FOR_CONCURRENT_GC blocked 28ms
06-09 19:51:54.039: V/VideoQuality(8096): Supported resolutions: 1920x1080, 1280x720, 960x720, 800x480, 720x576, 720x480, 768x576, 640x480, 320x240, 352x288, 240x160, 176x144, 128x96
06-09 19:51:54.039: V/VideoQuality(8096): Supported frame rates: 15-15fps, 15-30fps, 24-30fps
06-09 19:51:54.468: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.500: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.515: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.554: D/dalvikvm(8096): GC_FOR_ALLOC freed 106K, 2% free 9210K/9344K, paused 18ms, total 18ms
06-09 19:51:54.554: I/dalvikvm-heap(8096): Grow heap (frag case) to 9.458MB for 460816-byte allocation
06-09 19:51:54.578: D/dalvikvm(8096): GC_FOR_ALLOC freed 0K, 2% free 9660K/9796K, paused 22ms, total 22ms
06-09 19:51:54.593: D/dalvikvm(8096): GC_CONCURRENT freed <1K, 2% free 9660K/9796K, paused 3ms+2ms, total 20ms
06-09 19:51:54.656: D/dalvikvm(8096): GC_FOR_ALLOC freed <1K, 2% free 9660K/9796K, paused 13ms, total 13ms
06-09 19:51:54.656: I/dalvikvm-heap(8096): Grow heap (frag case) to 9.897MB for 460816-byte allocation
06-09 19:51:54.671: D/dalvikvm(8096): GC_FOR_ALLOC freed 0K, 2% free 10110K/10248K, paused 16ms, total 16ms
06-09 19:51:54.679: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.687: D/dalvikvm(8096): GC_CONCURRENT freed <1K, 2% free 10110K/10248K, paused 2ms+1ms, total 13ms
06-09 19:51:54.703: W/ACodec(8096): Use baseline profile instead of 8 for AVC recording
06-09 19:51:54.703: I/ACodec(8096): setupVideoEncoder succeeded
06-09 19:51:55.257: D/dalvikvm(8096): GC_CONCURRENT freed 2K, 1% free 10501K/10576K, paused 4ms+2ms, total 32ms
06-09 19:51:55.359: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:55.359: E/OMXNodeInstance(8096): OMX_GetExtensionIndex failed
06-09 19:51:56.187: W/System.err(8096): java.lang.IllegalStateException: The decoder input buffer is not big enough (nal=91280, capacity=65536).
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.check(EncoderDebugger.java:838)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.decode(EncoderDebugger.java:753)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:246)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:115)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.video.H264Stream.testMediaCodecAPI(H264Stream.java:132)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.video.H264Stream.testH264(H264Stream.java:119)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.video.H264Stream.configure(H264Stream.java:111)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.Session.syncConfigure(Session.java:395)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.Session$3.run(Session.java:371)
06-09 19:51:56.187: W/System.err(8096):     at android.os.Handler.handleCallback(Handler.java:725)
06-09 19:51:56.187: W/System.err(8096):     at android.os.Handler.dispatchMessage(Handler.java:92)
06-09 19:51:56.187: W/System.err(8096):     at android.os.Looper.loop(Looper.java:137)
06-09 19:51:56.187: W/System.err(8096):     at android.os.HandlerThread.run(HandlerThread.java:60)

I have tried dozens of different combinations of resolution, frame rate, and bitrate. Everything I try ends in either "The decoder did not decode anything." or "The decoder input buffer is not big enough."

Has anyone gotten this library to work out of the box? What is causing these errors, and what is the fix? If my search results are any indication, I seem to be the only person in the world running into this. I would appreciate any insight!

Here is the code from my MainActivity.java:

package com.cornet.cornetspydroid2;

import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.audio.AudioQuality;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.video.VideoQuality;
import android.app.Activity;
import android.app.Fragment;
import android.content.pm.ActivityInfo;
import android.os.Bundle;
import android.util.Log;
import android.view.LayoutInflater;
import android.view.Menu;
import android.view.MenuItem;
import android.view.SurfaceHolder;
import android.view.View;
import android.view.ViewGroup;
import android.view.WindowManager;

public class MainActivity extends Activity implements Session.Callback, SurfaceHolder.Callback {

    private static final String TAG = "MainActivity";

    private static final String ip = "10.3.1.204";
    private static final VideoQuality VIDEO_QUALITY = new VideoQuality(128,96,20,500000);

    private Session mSession;
    private SurfaceView mSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {

        super.onCreate(savedInstanceState);

        if (savedInstanceState == null) {
            getFragmentManager().beginTransaction().add(R.id.container, new PlaceholderFragment()).commit();
        }

        setContentView(R.layout.activity_main);

        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

    }

    public void start(View view) {

        if (mSession != null && mSession.isStreaming()) {
            Log.d(TAG, "Already streaming!");
            return;
        }

        Log.d(TAG, "Start");

        mSurfaceView = (SurfaceView)findViewById(R.id.surface);

        mSession = SessionBuilder.getInstance()
            .setCallback(this)
            .setSurfaceView(mSurfaceView)
            .setPreviewOrientation(90)
            .setContext(getApplicationContext())
            .setAudioEncoder(SessionBuilder.AUDIO_NONE)
            .setAudioQuality(new AudioQuality(16000, 32000))
            .setVideoEncoder(SessionBuilder.VIDEO_H264)
            .setVideoQuality(VIDEO_QUALITY)
            .setDestination(ip)
        .build();

        mSurfaceView.getHolder().addCallback(this);

        if (!mSession.isStreaming()) {
            mSession.configure();
        }

    }

    public void stop(View view) {

        Log.d(TAG, "Stop");

        if (mSession != null) {
            mSession.stop();
        }

        if (mSurfaceView != null) {
            mSurfaceView.getHolder().removeCallback(this);
        }

    }

    @Override
    public void onDestroy() {

        super.onDestroy();

        if (mSession != null) {
            mSession.release();
        }

    }

    @Override
    public void onPreviewStarted() {
        Log.d(TAG,"Preview started.");
    }

    @Override
    public void onSessionConfigured() {
        Log.d(TAG,"Preview configured.");
        // Once the stream is configured, you can get an SDP-formatted session description
        // that you can send to the receiver of the stream.
        // For example, to receive the stream in VLC, store the session description in a .sdp file
        // and open it with VLC while streaming.
        Log.d(TAG, mSession.getSessionDescription());
        mSession.start();
    }

    @Override
    public void onSessionStarted() {
        Log.d(TAG,"Session started.");
    }

    @Override
    public void onBitrateUpdate(long bitrate) {
        Log.d(TAG,"Bitrate: "+bitrate);
    }

    @Override
    public void onSessionError(int message, int streamType, Exception e) {
        if (e != null) {
            Log.e(TAG, e.getMessage(), e);
        }
    }

    @Override
    public void onSessionStopped() {
        Log.d(TAG,"Session stopped.");
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mSession.startPreview();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mSession.stop();
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {

        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle action bar item clicks here. The action bar will
        // automatically handle clicks on the Home/Up button, so long
        // as you specify a parent activity in AndroidManifest.xml.
        int id = item.getItemId();
        if (id == R.id.action_settings) {
            return true;
        }
        return super.onOptionsItemSelected(item);
    }

    /**
     * A placeholder fragment containing a simple view.
     */
    public static class PlaceholderFragment extends Fragment {

        public PlaceholderFragment() {
        }

        @Override
        public View onCreateView(LayoutInflater inflater, ViewGroup container,
                Bundle savedInstanceState) {
            View rootView = inflater.inflate(R.layout.fragment_main, container,
                    false);
            return rootView;
        }
    }

}

UPDATE: The MediaStream class in this library has a static initializer that looks up a class named "android.media.MediaCodec". When I force it to use sSuggestedMode = MODE_MEDIARECORDER_API instead of MediaCodec, there are no errors no matter which resolution I pick, and Wireshark sees packets streaming out of the phone. For some reason, though, VLC cannot play the stream (udp/h264://@10.3.1.204:16420). This seems to indicate that the resolution I pick is not the problem, at least not directly.

The error occurs in the Session.syncConfigure() call (it never even reaches Session.start()). It configures the audio stream successfully, but the Stream.configure() call on the video stream fails. The syncConfigure() call eventually reaches H264Stream.testMediaCodecAPI(), which calls EncoderDebugger.debug(). That debug() method throws the two original errors: either the input buffer is not big enough, or the decoder did not decode anything.

Something possibly revealing (included in the original logs above): at startup I always get a debug message from the "dalvikvm" tag: "Could not find method android.media.MediaCodec.createInputSurface, referenced from method net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodecMethod2". Right after that log entry, there is a warning, also from the "dalvikvm" tag: "VFY: unable to resolve virtual method 377: Landroid/media/MediaCodec;.createInputSurface ()Landroid/view/Surface;". Could this be related? Why can the Class.forName() call in MediaStream find the MediaCodec class, yet later there is a warning that a documented method on MediaCodec (createInputSurface) cannot be found? My AndroidManifest.xml files (in both the main project and the libstreaming library project) specify min SDK 16 and target SDK 19. The MediaCodec class was added in API level 16, so I should not be getting these warnings. Does this indicate a misconfiguration on my part? Could these warnings be related to the problem I am seeing?

4 Answers:

Answer 0 (score: 4)

Actually, createInputSurface() is not mandatory for libstreaming; it is only needed when using the mode MODE_MEDIACODEC_API_2. libstreaming should work on Android 4.1 and 4.2, as long as the MediaCodec API behaves properly on the phone.

Explanation

When using libstreaming on Android 4.1 and 4.2, you will indeed see the VM complain in the logs that createInputSurface() does not exist. This does not crash the app, because the method is never actually called (unless you somehow force MODE_MEDIACODEC_API_2).
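The difference between "the class exists" and "the method exists" is easy to demonstrate with plain reflection — a hedged sketch, runnable on any JVM (on a desktop JVM the class is simply absent; on an API 16/17 device the class loads but the method lookup fails):

```java
// Probes for MediaCodec.createInputSurface the same way libstreaming's
// MediaStream static initializer probes for the MediaCodec class itself.
// Dalvik's verifier warns about an unresolved method at class-load time,
// but the app only crashes if that method is actually invoked.
public class CodecProbe {

    public static boolean hasCreateInputSurface() {
        try {
            Class<?> codec = Class.forName("android.media.MediaCodec");
            codec.getMethod("createInputSurface");
            return true;  // API 18+ device: class and method both present
        } catch (ClassNotFoundException e) {
            return false; // pre-API-16 device (or a desktop JVM)
        } catch (NoSuchMethodException e) {
            return false; // API 16/17: class present, method missing
        }
    }

    public static void main(String[] args) {
        System.out.println(hasCreateInputSurface());
    }
}
```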

Now let me explain the "decoder input buffer is not big enough" error.

When libstreaming is used with MODE_MEDIACODEC_API at a resolution that has never been used before on the user's phone, it first tries to find out whether at least one encoder accessible through the MediaCodec API works properly at that resolution. To do so, it tries to encode and then decode a short test video with every encoder/decoder pair available on the phone. The error you mention occurs when a decoder fails to decode the H.264 stream produced by an encoder.
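In code terms, the capacity error from the 640x480 run is just a size check failing before the decoder is ever fed — a simplified, hypothetical reconstruction (not the actual EncoderDebugger source), using the nal=91280 / capacity=65536 values from the log:

```java
import java.nio.ByteBuffer;

// Simplified stand-in for the capacity check in EncoderDebugger: if the
// encoder emits a NAL unit larger than the decoder's input buffer, the
// test cannot even submit the data, so it aborts with this exception.
public class BufferCheck {

    static void checkFits(int nalLength, ByteBuffer decoderInput) {
        if (nalLength > decoderInput.capacity()) {
            throw new IllegalStateException("The decoder input buffer is not big enough "
                    + "(nal=" + nalLength + ", capacity=" + decoderInput.capacity() + ").");
        }
    }

    public static void main(String[] args) {
        ByteBuffer input = ByteBuffer.allocate(65536); // capacity from the log
        try {
            checkFits(91280, input);                   // NAL size from the log
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```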

If that test finishes without finding a working encoder/decoder pair, the phone is considered not to support that resolution, and libstreaming then tries to fall back to MODE_MEDIARECORDER_API.

To understand what these "modes" really are, just read the main page of the project, where I explain all of this.

Important things to know about that test

The result of the test is stored in a SharedPreference, so if you want to rerun it, you can clear the app's cache, or change the boolean DEBUG to true in EncoderDebugger.java. The test also runs again if the Android version changes (after an update, for example).

The reason for this test is that the MediaCodec API is awful, as you probably already know if you have tried to use it.

(The test itself is implemented in the EncoderDebugger class, which you can look at on GitHub.)

So what is happening on the OP's phone?

Well, his phone fails the test on Android 4.2, but passes it on Android 4.3. The MediaCodec API was patched on the phone somewhere in between.

There may still be a way to improve the test so that it works on a Galaxy Nexus running Android 4.2. For example, it currently only supports the following color formats:

  • COLOR_FormatYUV420SemiPlanar
  • COLOR_FormatYUV420PackedSemiPlanar
  • COLOR_TI_FormatYUV420PackedSemiPlanar
  • COLOR_FormatYUV420Planar
  • COLOR_FormatYUV420PackedPlanar

(Disclaimer: I wrote the lib.)

Answer 1 (score: 1)

The Galaxy Nexus has issues with the video resolutions it reports as supported. I never tried 128x96, and I no longer have access to the phone to check. I did try 320x240, and it was broken on this device. 640x480 did work, but it may not be happy at 20 FPS. I suggest you try 15 FPS:

private static final VideoQuality VIDEO_QUALITY = new VideoQuality(640, 480, 15, 500000);

Answer 2 (score: 1)

Finally found the problem! My Galaxy Nexus runs Android 4.2.2. Although the MediaCodec class exists in 4.2.2, the createInputSurface() method was not added until 4.3 (API level 18). I tested with a 4.3 device and it works.

Do not use this library on 4.1.x or 4.2.x devices unless you force sSuggestedMode = MODE_MEDIARECORDER_API in the MediaStream static initializer.
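A simple runtime gate captures this rule — a sketch only; on a device you would pass Build.VERSION.SDK_INT instead of a literal (18 is the API level of Android 4.3):

```java
public class ApiGate {

    // createInputSurface() arrived in API 18 (Android 4.3). The MediaCodec
    // class itself has existed since API 16, which is why Class.forName()
    // succeeds on 4.2 while the surface-input path still fails there.
    static boolean canUseMediaCodecSurfaceInput(int sdkInt) {
        return sdkInt >= 18;
    }

    public static void main(String[] args) {
        System.out.println(canUseMediaCodecSurfaceInput(17)); // Android 4.2.2: false
        System.out.println(canUseMediaCodecSurfaceInput(18)); // Android 4.3: true
    }
}
```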

Answer 3 (score: 0)

From my own experience:

private static final VideoQuality VIDEO_QUALITY = new VideoQuality(352, 288, 30, 300000);

works fine, without any issues.