How to play multiple video files simultaneously in one layout, side by side in different views, in Android

Date: 2012-03-23 06:13:12

Tags: android android-ndk surfaceview android-videoview android-mediaplayer

In Android, I have created a layout with three SurfaceViews side by side, and I want to play a video file in each of them simultaneously with a separate MediaPlayer. The problem I am facing is that all three cannot play the video at the same time; one or two of them stop displaying. The problem remains even if I use VideoView directly instead of the MediaPlayer class. Can anyone please help? What is the problem? It gives the error "surface creation failed, native error". I have tried different combinations, such as one file in three different views and three files in three different views, but the problem is still not solved. Some replies on other sites say that it depends on the kernel version. If it depends on the kernel version, please point me to any Android documentation on the Android site that says so. Or, if it can be played, please give me the code steps. Here is the error log:

04-10 19:23:37.995: E/ANDROID_DRM_TEST(2573): Client::notify In
04-10 19:23:37.995: V/AudioPolicyManager(2573): startOutput() output 1, stream 3,  session 131
04-10 19:23:37.995: V/AudioPolicyManager(2573): getDeviceForStrategy() from cache strategy 0, device 2
04-10 19:23:37.995: V/AudioPolicyManager(2573): getNewDevice() selected device 2
04-10 19:23:37.995: V/AudioPolicyManager(2573): setOutputDevice() output 1 device 2 delayMs 0
04-10 19:23:37.995: V/AudioPolicyManager(2573): setOutputDevice() setting same device 2 or null device for output 1
04-10 19:23:37.995: I/AudioFlinger(2573): start output streamType (0, 3) for 1
04-10 19:23:37.995: D/AudioHardwareYamaha(2573): AudioStreamOut::setParameters(keyValuePairs="start_output_streamtype=3")
04-10 19:23:38.010: W/SEC_Overlay(2689): overlay_setPosition(0) 0,0,200,397 => 0,0,200,397
04-10 19:23:38.010: I/SEC_Overlay(2689): overlay_setParameter param[4]=4
04-10 19:23:38.010: D/SEC_Overlay(2689): dst width, height have changed [w= 200, h= 397] -> [w=200, h= 397]
04-10 19:23:38.010: I/SEC_Overlay(2689): Nothing to do!
04-10 19:23:38.090: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync()  VIDEO ROTATION 0
04-10 19:23:38.090: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync()  VIDEO RENDERER 1
04-10 19:23:38.090: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.090: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.090: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.195: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.195: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.195: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.230: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync()  VIDEO ROTATION 0
04-10 19:23:38.230: E/VideoMIO(2573): AndroidSurfaceOutput::setParametersSync()  VIDEO RENDERER 1
04-10 19:23:38.230: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.230: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.230: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.295: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.295: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.295: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.330: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.330: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.330: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.395: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.395: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.395: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.435: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.435: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.435: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.495: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48
04-10 19:23:38.495: E/SEC_Overlay(2689): Error - overlays already in use
04-10 19:23:38.495: D/VideoMIO(2573): Overlay create failed - retrying
04-10 19:23:38.535: D/SEC_Overlay(2689): overlay_createOverlay:IN w=128 h=96 format=48

5 Answers:

Answer 0 (score: 34)

You haven't given much detail about what exactly you've tried and which areas are problematic, so I just did a small test to see whether I could reproduce any of what you're describing.

I don't have any conclusive findings, but I can at least confirm that my Galaxy Nexus (Android 4.0.2) is able to play three videos simultaneously without any problem. On the other hand, an old Samsung Galaxy Spica (Android 2.1-update1) I had lying around only plays one file at a time; it always appears to be the first SurfaceView.

I investigated different API levels further by setting up emulators for Android 3.0, 2.3.3 and 2.2. All of these platforms appear to handle playback of multiple video files onto different surface views just fine. I did one final test with an emulator running 2.1-update1, and interestingly, unlike the actual phone, the test case had no problems there either; I did notice the layout was rendered slightly differently, though.

This behaviour makes me suspect that there isn't really a software limitation on what you're after, but that support for simultaneous playback of multiple video files depends on the hardware. Support for this scenario will therefore differ from device to device. From an empirical point of view, I definitely think it would be interesting to test this hypothesis on more physical devices.

For reference, some details about the implementation:

  1. I set up two slightly different implementations: one based on a single Activity housing three MediaPlayer instances, and one in which they are split up over three separate fragments, each with its own MediaPlayer object. (By the way, I didn't find any playback differences between the two implementations.)
  2. A single 3gp file (thanks for that, Apple), located in the assets folder, was used for playback with all players.
  3. The code for both implementations is attached below and is largely based on Google's MediaPlayerDemo_Video sample implementation - I did strip out some code that isn't required for the actual test. The result is by no means complete or suitable for use in production apps.

    Activity-based implementation:

    public class MultipleVideoPlayActivity extends Activity implements
        OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener, OnVideoSizeChangedListener, SurfaceHolder.Callback {
    
        private static final String TAG = "MediaPlayer";
        private static final int[] SURFACE_RES_IDS = { R.id.video_1_surfaceview, R.id.video_2_surfaceview, R.id.video_3_surfaceview };
    
        private MediaPlayer[] mMediaPlayers = new MediaPlayer[SURFACE_RES_IDS.length];
        private SurfaceView[] mSurfaceViews = new SurfaceView[SURFACE_RES_IDS.length];
        private SurfaceHolder[] mSurfaceHolders = new SurfaceHolder[SURFACE_RES_IDS.length];
        private boolean[] mSizeKnown = new boolean[SURFACE_RES_IDS.length];
        private boolean[] mVideoReady = new boolean[SURFACE_RES_IDS.length];
    
        @Override public void onCreate(Bundle icicle) {
            super.onCreate(icicle);
            setContentView(R.layout.multi_videos_layout);
    
            // create surface holders
            for (int i=0; i<mSurfaceViews.length; i++) {
                mSurfaceViews[i] = (SurfaceView) findViewById(SURFACE_RES_IDS[i]);
                mSurfaceHolders[i] = mSurfaceViews[i].getHolder();
                mSurfaceHolders[i].addCallback(this);
                mSurfaceHolders[i].setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
            }
        }
    
        public void onBufferingUpdate(MediaPlayer player, int percent) {
            Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onBufferingUpdate percent: " + percent);
        }
    
        public void onCompletion(MediaPlayer player) {
            Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onCompletion called");
        }
    
        public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
            Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): onVideoSizeChanged called");
            if (width == 0 || height == 0) {
                Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
                return;
            }
    
            int index = indexOf(player);
            if (index == -1) return; // sanity check; should never happen
            mSizeKnown[index] = true;
            if (mVideoReady[index] && mSizeKnown[index]) {
                startVideoPlayback(player);
            }
        }
    
        public void onPrepared(MediaPlayer player) {
            Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onPrepared called");
    
            int index = indexOf(player);
            if (index == -1) return; // sanity check; should never happen
            mVideoReady[index] = true;
            if (mVideoReady[index] && mSizeKnown[index]) {
                startVideoPlayback(player);
            }
        }
    
        public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
            Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceChanged called");
        }
    
        public void surfaceDestroyed(SurfaceHolder holder) {
            Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceDestroyed called");
        }
    
    
        public void surfaceCreated(SurfaceHolder holder) {
            Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceCreated called");
    
            int index = indexOf(holder);
            if (index == -1) return; // sanity check; should never happen
            try { 
                mMediaPlayers[index] = new MediaPlayer();
                AssetFileDescriptor afd = getAssets().openFd("sample.3gp");
                mMediaPlayers[index].setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength()); 
                mMediaPlayers[index].setDisplay(mSurfaceHolders[index]);
                mMediaPlayers[index].prepare();
                mMediaPlayers[index].setOnBufferingUpdateListener(this);
                mMediaPlayers[index].setOnCompletionListener(this);
                mMediaPlayers[index].setOnPreparedListener(this);
                mMediaPlayers[index].setOnVideoSizeChangedListener(this);
                mMediaPlayers[index].setAudioStreamType(AudioManager.STREAM_MUSIC);
            }
            catch (Exception e) { e.printStackTrace(); }
        }
    
        @Override protected void onPause() {
            super.onPause();
            releaseMediaPlayers();
        }
    
        @Override protected void onDestroy() {
            super.onDestroy();
            releaseMediaPlayers();
        }
    
        private void releaseMediaPlayers() {
            for (int i=0; i<mMediaPlayers.length; i++) {
                if (mMediaPlayers[i] != null) {
                    mMediaPlayers[i].release();
                    mMediaPlayers[i] = null;
                }
            }
        }
    
    
        private void startVideoPlayback(MediaPlayer player) {
            Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): startVideoPlayback");
            player.start();
        }
    
        private int indexOf(MediaPlayer player) {
            for (int i=0; i<mMediaPlayers.length; i++) if (mMediaPlayers[i] == player) return i;
            return -1;  
        }
    
        private int indexOf(SurfaceHolder holder) {
            for (int i=0; i<mSurfaceHolders.length; i++) if (mSurfaceHolders[i] == holder) return i;
            return -1;  
        }
    }
    

    R.layout.multi_videos_layout:

    <?xml version="1.0" encoding="utf-8"?>
    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="match_parent" android:layout_height="match_parent"
        android:orientation="vertical">
    
        <SurfaceView android:id="@+id/video_1_surfaceview"
            android:layout_width="fill_parent" android:layout_height="0dp"
            android:layout_weight="1" />
    
        <SurfaceView android:id="@+id/video_2_surfaceview"
            android:layout_width="fill_parent" android:layout_height="0dp"
            android:layout_weight="1" />
    
        <SurfaceView android:id="@+id/video_3_surfaceview"
            android:layout_width="fill_parent" android:layout_height="0dp"
            android:layout_weight="1" />
    
    </LinearLayout>
    

    Fragment-based implementation:

    public class MultipleVideoPlayFragmentActivity extends FragmentActivity {
    
        private static final String TAG = "MediaPlayer";
    
        @Override public void onCreate(Bundle icicle) {
            super.onCreate(icicle);
            setContentView(R.layout.multi_videos_activity_layout);
        }
    
        public static class VideoFragment extends Fragment implements
            OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener, OnVideoSizeChangedListener, SurfaceHolder.Callback {
    
            private MediaPlayer mMediaPlayer;
            private SurfaceView mSurfaceView;
            private SurfaceHolder mSurfaceHolder;
            private boolean mSizeKnown;
            private boolean mVideoReady;
    
            @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
                return inflater.inflate(R.layout.multi_videos_fragment_layout, container, false);
            }
    
            @Override public void onActivityCreated(Bundle savedInstanceState) {
                super.onActivityCreated(savedInstanceState);
                mSurfaceView = (SurfaceView) getView().findViewById(R.id.video_surfaceview);
                mSurfaceHolder = mSurfaceView.getHolder();
                mSurfaceHolder.addCallback(this);
                mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
            }
    
            public void onBufferingUpdate(MediaPlayer player, int percent) {
                Log.d(TAG, "onBufferingUpdate percent: " + percent);
            }
    
            public void onCompletion(MediaPlayer player) {
                Log.d(TAG, "onCompletion called");
            }
    
            public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
                Log.v(TAG, "onVideoSizeChanged called");
                if (width == 0 || height == 0) {
                    Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
                    return;
                }
    
                mSizeKnown = true;
                if (mVideoReady && mSizeKnown) {
                    startVideoPlayback();
                }
            }
    
            public void onPrepared(MediaPlayer player) {
                Log.d(TAG, "onPrepared called");
    
                mVideoReady = true;
                if (mVideoReady && mSizeKnown) {
                    startVideoPlayback();
                }
            }
    
            public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
                Log.d(TAG, "surfaceChanged called");
            }
    
            public void surfaceDestroyed(SurfaceHolder holder) {
                Log.d(TAG, "surfaceDestroyed called");
            }
    
            public void surfaceCreated(SurfaceHolder holder) {
                Log.d(TAG, "surfaceCreated called");
    
                try { 
                    mMediaPlayer = new MediaPlayer();
                    AssetFileDescriptor afd = getActivity().getAssets().openFd("sample.3gp");
                    mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength()); 
                    mMediaPlayer.setDisplay(mSurfaceHolder);
                    mMediaPlayer.prepare();
                    mMediaPlayer.setOnBufferingUpdateListener(this);
                    mMediaPlayer.setOnCompletionListener(this);
                    mMediaPlayer.setOnPreparedListener(this);
                    mMediaPlayer.setOnVideoSizeChangedListener(this);
                    mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
                }
                catch (Exception e) { e.printStackTrace(); }
            }
    
            @Override public void onPause() {
                super.onPause();
                releaseMediaPlayer();
            }
    
            @Override public void onDestroy() {
                super.onDestroy();
                releaseMediaPlayer();
            }
    
            private void releaseMediaPlayer() {
                if (mMediaPlayer != null) {
                    mMediaPlayer.release();
                    mMediaPlayer = null;
                }
            }
    
            private void startVideoPlayback() {
                Log.v(TAG, "startVideoPlayback");
                mMediaPlayer.start();
            }
        }
    }
    

    R.layout.multi_videos_activity_layout:

    <?xml version="1.0" encoding="utf-8"?>
    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:layout_width="match_parent" android:layout_height="match_parent"
        android:orientation="vertical">
    
        <fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
            android:id="@+id/video_1_fragment" android:layout_width="fill_parent"
            android:layout_height="0dp" android:layout_weight="1" />
    
        <fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
            android:id="@+id/video_2_fragment" android:layout_width="fill_parent"
            android:layout_height="0dp" android:layout_weight="1" />
    
        <fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
            android:id="@+id/video_3_fragment" android:layout_width="fill_parent"
            android:layout_height="0dp" android:layout_weight="1" />
    
    </LinearLayout>
    

    R.layout.multi_videos_fragment_layout:

    <?xml version="1.0" encoding="utf-8"?>
    <SurfaceView xmlns:android="http://schemas.android.com/apk/res/android"
        android:id="@+id/video_surfaceview" android:layout_width="fill_parent"
        android:layout_height="fill_parent" />
    

    Update: Although it has been around for a while now, I think it is worth pointing out that Google's Grafika project demonstrates a 'double decode' feature, which "decodes two video streams simultaneously to two TextureViews". Not sure how well it extends to more than two video files, but it is nevertheless relevant to the original question.
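
    Purely as an illustration (this is not Grafika's own code), here is a minimal sketch of what rendering a MediaPlayer into a TextureView can look like; the sample.3gp asset name is the same assumption as in the snippets above, and the class name and wiring are my own:

    import java.io.IOException;

    import android.app.Activity;
    import android.content.res.AssetFileDescriptor;
    import android.graphics.SurfaceTexture;
    import android.media.MediaPlayer;
    import android.os.Bundle;
    import android.view.Surface;
    import android.view.TextureView;

    public class TextureVideoActivity extends Activity implements TextureView.SurfaceTextureListener {

        private MediaPlayer mMediaPlayer;

        @Override public void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            TextureView textureView = new TextureView(this);
            textureView.setSurfaceTextureListener(this);
            setContentView(textureView);
        }

        public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
            try {
                mMediaPlayer = new MediaPlayer();
                AssetFileDescriptor afd = getAssets().openFd("sample.3gp"); // assumed asset, as above
                mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
                // render into the TextureView's SurfaceTexture instead of a SurfaceHolder
                mMediaPlayer.setSurface(new Surface(surfaceTexture));
                mMediaPlayer.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                    public void onPrepared(MediaPlayer player) { player.start(); }
                });
                mMediaPlayer.prepareAsync();
            }
            catch (IOException e) { e.printStackTrace(); }
        }

        public void onSurfaceTextureSizeChanged(SurfaceTexture surfaceTexture, int width, int height) { }

        public boolean onSurfaceTextureDestroyed(SurfaceTexture surfaceTexture) {
            if (mMediaPlayer != null) {
                mMediaPlayer.release();
                mMediaPlayer = null;
            }
            return true;
        }

        public void onSurfaceTextureUpdated(SurfaceTexture surfaceTexture) { }
    }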

Answer 1 (score: 5)

Check out this code, it works....

video1 = (VideoView) findViewById(R.id.myvideoview);
video1.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.sample));
video1.setMediaController(new MediaController(this));
video1.requestFocus();

video2 = (VideoView) findViewById(R.id.myvideview);
video2.setVideoURI(Uri.parse("android.resource://" + getPackageName() + "/" + R.raw.sample1));
video2.setMediaController(new MediaController(this));
video2.requestFocus();

Thread view1 = new Thread(new Runnable() {

    @Override
    public void run() {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_DISPLAY);
        video1.start();
    }
});

Thread view2 = new Thread(new Runnable() {

    @Override
    public void run() {
        android.os.Process.setThreadPriority(android.os.Process.THREAD_PRIORITY_DISPLAY);
        video2.start();
    }
});

// playback only begins once the threads are actually started
view1.start();
view2.start();

But it depends on whether your device supports multiple VideoViews. If it does not support them, it will give you the error This video can not be played Error (1, -110).
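
As a side note, you can catch that failure in code instead of relying on the dialog: VideoView forwards the underlying MediaPlayer error through setOnErrorListener. The listener below is only a sketch of mine, reusing the video2 view from the snippet above; error (1, -110) arrives as what=1 (MEDIA_ERROR_UNKNOWN) with extra=-110, and android.util.Log is used for output:

video2.setOnErrorListener(new MediaPlayer.OnErrorListener() {

    @Override
    public boolean onError(MediaPlayer player, int what, int extra) {
        // Error (1, -110) shows up here as what=1 with extra=-110
        Log.e("MultiVideo", "video2 failed: what=" + what + " extra=" + extra);
        return true; // suppress the default "Can't play this video" dialog
    }
});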

Answer 2 (score: 0)

You haven't provided any code samples.

In my experience I have found that you can do this with Fragments (at least on the devices I have used). Remember that there is a Fragment support library for older devices.

So basically you put a LinearLayout or something similar in place of the VideoViews, then use a Fragment transaction to replace the LinearLayouts with Fragments that contain a VideoView.
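
A minimal sketch of that idea, assuming the v4 support library, a layout with one empty placeholder container per video, and two raw video resources; the layout id, container ids and resource names below are placeholders of my own, not taken from the answer:

import android.media.MediaPlayer;
import android.net.Uri;
import android.os.Bundle;
import android.support.v4.app.Fragment;
import android.support.v4.app.FragmentActivity;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.VideoView;

public class FragmentVideoActivity extends FragmentActivity {

    @Override public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.fragment_containers_layout); // two empty LinearLayout placeholders

        // swap each placeholder for a fragment that hosts its own VideoView
        getSupportFragmentManager().beginTransaction()
                .replace(R.id.video_1_container, VideoPlayerFragment.newInstance(R.raw.sample1))
                .replace(R.id.video_2_container, VideoPlayerFragment.newInstance(R.raw.sample2))
                .commit();
    }

    public static class VideoPlayerFragment extends Fragment {
        private static final String ARG_RAW_RES_ID = "raw_res_id";

        public static VideoPlayerFragment newInstance(int rawResId) {
            VideoPlayerFragment fragment = new VideoPlayerFragment();
            Bundle args = new Bundle();
            args.putInt(ARG_RAW_RES_ID, rawResId);
            fragment.setArguments(args);
            return fragment;
        }

        @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
            VideoView videoView = new VideoView(getActivity());
            int rawResId = getArguments().getInt(ARG_RAW_RES_ID);
            videoView.setVideoURI(Uri.parse("android.resource://" + getActivity().getPackageName() + "/" + rawResId));
            // start playback as soon as the underlying MediaPlayer is prepared
            videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                public void onPrepared(MediaPlayer player) { player.start(); }
            });
            return videoView;
        }
    }
}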

Answer 3 (score: 0)

Try this

public class CustomPictureActivity extends Activity {
    /** Called when the activity is first created. */
    VideoView vd1, vd2;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        vd1 = (VideoView) findViewById(R.id.v1);
        vd2 = (VideoView) findViewById(R.id.v2);

        vd1.setVideoURI(Uri.parse("/mnt/sdcard/file.mp4"));
        vd1.setMediaController(new MediaController(this));
        vd1.requestFocus();
        vd1.start();

        vd2.setVideoURI(Uri.parse("/mnt/sdcard/android.mp4"));
        vd2.setMediaController(new MediaController(this));
        vd2.requestFocus();
        vd2.start();
    }
}

And the xml should be like this

<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:orientation="horizontal" >

    <VideoView
        android:id="@+id/v1"
        android:layout_width="0dp"
        android:layout_height="wrap_content"
        android:layout_weight="0.5" />

    <VideoView
        android:id="@+id/v2"
        android:layout_width="0dp"
        android:layout_height="wrap_content"
        android:layout_weight="0.5" />

</LinearLayout>

Answer 4 (score: 0)

I found a solution for this. Just replace /system/build.prop with the following build.prop:

build.prop

# begin build properties
# autogenerated by buildinfo.sh
ro.build.id=GINGERBREAD
ro.build.display.id=GINGERBREAD.EG14
ro.build.version.incremental=EG14
ro.build.version.sdk=10
ro.build.version.codename=REL
ro.build.version.release=2.3.4
ro.build.date=Thu Jul 14 12:16:01 KST 2011
ro.build.date.utc=1310613361
ro.build.type=user
ro.build.user=se.infra
ro.build.host=SEI-28
ro.build.tags=release-keys
ro.product.model=SHW-M250S
ro.product.brand=samsung
ro.product.name=SHW-M250S
ro.product.device=SHW-M250S
ro.product.board=SHW-M250S
ro.product.cpu.abi=armeabi-v7a
# Samsung Specific Properties
ro.build.PDA=M250S.EG14.1208
ro.build.hidden_ver=M250S.EG14.1208
ro.build.changelist=380592
ro.product.cpu.abi2=armeabi
ro.product.manufacturer=samsung
ro.product.locale.language=ko
ro.product.locale.region=KR
ro.wifi.channels=
ro.board.platform=s5pc210
# ro.build.product is obsolete; use ro.product.device
ro.build.product=SHW-M250S
# Do not try to parse ro.build.description or .fingerprint
ro.build.description=SHW-M250S-user 2.3.4 GINGERBREAD EG14 release-keys
ro.build.fingerprint=samsung/SHW-M250S/SHW-M250S:2.3.4/GINGERBREAD/EG14:user/release-keys
# Samsung Specific Properties
ro.build.PDA=M250S.EG14.1208
ro.build.hidden_ver=M250S.EG14.1208
ro.build.changelist=380592
ro.build.fota_ver=SSNT11GINGEREG14
ro.tether.denied=false
ro.flash.resolution=1080
# end build properties
#
# system.prop for asop5000
#

rild.libpath=/system/lib/libsec-ril.so
rild.libargs=-d /dev/ttyS0
ro.sf.lcd_density=240
dalvik.vm.heapsize=64m

# Samsung USB default mode
persist.service.usb.setting=2

#
# ADDITIONAL_BUILD_PROPERTIES
#
ro.setupwizard.mode=OPTIONAL
ro.com.google.gmsversion=2.3_r4
media.stagefright.enable-player=true
media.stagefright.enable-meta=true
media.stagefright.enable-scan=true
media.stagefright.enable-http=true
media.stagefright.enable-rtsp=true
ro.com.google.clientidbase=android-samsung
ro.com.google.clientidbase.ms=android-skt-kr
ro.com.google.clientidbase.am=android-skt-kr
ro.com.google.clientidbase.gmm=android-samsung
ro.com.google.clientidbase.yt=android-samsung
ro.url.legal=http://www.google.com/intl/%s/mobile/android/basic/phone-legal.html
ro.url.legal.android_privacy=http://www.google.com/intl/%s/mobile/android/basic/privacy.html
ro.com.google.locationfeatures=1
keyguard.no_require_sim=true
ro.config.ringtone=Over_the_horizon.ogg
ro.config.notification_sound=Sherbet.ogg
ro.config.alarm_alert=Good_Morning.ogg
ro.config.media_sound=Over_the_horizon.ogg
ro.opengles.version=131072
ro.csc.sales_code=MSK
ro.secdevenc=true
ro.wtldatapassword=true
net.bt.name=Android
dalvik.vm.stack-trace-file=/data/anr/traces.txt

First connect the Samsung Galaxy S-II via USB, then type the following at the command prompt to remount the system partition:

cmd:> adb remount

Then replace the file and reboot the device:

cmd:> adb shell
#reboot

I noticed that by default this device uses the OpenCORE framework instead of libstagefright, and OpenCORE has some problems, which is why the native error was being thrown. libstagefright, however, is already implemented in Android 2.3. Looking at the original build.prop file, Stagefright was disabled; this change enables the libstagefright framework and the features it supports. You can also play MPEG-2 TS files, and it supports playing multiple video files simultaneously without any problem. Try it and enjoy.