I'm trying to implement a list view whose items are videos. I'm using this project to display video on a TextureView; it uses MediaPlayer underneath. It fails (most of the time) when two videos are being loaded at the same time.
The error I get is:
TextureVideoView error. File or network related operation errors.
MediaPlayer: error (1, -2147479551)
This also happens when the files are loaded from disk.
In the error-handling code I try to set the URL again. Then I mostly get
E/BufferQueueProducer: [unnamed-30578-12] disconnect(P): connected to another API (cur=0 req=3)
errors. What I don't understand is that setting some arbitrary video from the network works, while retrying the same URL fails.
So in the OnErrorListener:
textureView.setVideo(item.getUriMp4(),MediaFensterPlayerController.DEFAULT_VIDEO_START);
will fail, but:
textureView.setVideo("http://different.video" ... )
works fine.
It is not a problem with specific files either, because scrolling through different video files fails, and sometimes the ones that failed will work the next time around, and so on.
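To illustrate, the retry happens in a standard MediaPlayer.OnErrorListener. Roughly (simplified, since the fenster view manages its MediaPlayer internally, so the exact hook differs slightly):

// Assumes access to the underlying MediaPlayer; the fenster view wraps it internally.
mediaPlayer.setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override
    public boolean onError(MediaPlayer mp, int what, int extra) {
        // what == 1 is MEDIA_ERROR_UNKNOWN; extra is the -2147479551 shown above.
        // Retrying the same item fails...
        textureView.setVideo(item.getUriMp4(), MediaFensterPlayerController.DEFAULT_VIDEO_START);
        // ...while an arbitrary different URL, e.g. textureView.setVideo("http://different.video", ...), works.
        return true; // mark the error as handled
    }
});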
I also tried the MediaCodec and MediaExtractor combination instead of the MediaPlayer approach, but I ran into a device specific platform bug.
Any hints? Any suggestions?
Thanks,
Watt
Answer 0 (score: 2)
You could try this instead of that library; it is taken from Google's samples on GitHub (the Grafika project):
Decodes two video streams simultaneously to two TextureViews.
One key feature is that the video decoders do not stop when the Activity is restarted due to an orientation change. This is intended to simulate playback of a real-time video stream. If the Activity is pausing because it has "finished" (meaning we are leaving the Activity for a non-trivial amount of time), the video decoders are shut down.
TODO: consider shutting down when the screen is turned off, to preserve battery.
Java:
DoubleDecodeActivity.java
public class DoubleDecodeActivity extends Activity {
    private static final String TAG = MainActivity.TAG;

    private static final int VIDEO_COUNT = 2;   // How many videos to play simultaneously.

    // Must be static storage so they'll survive Activity restart.
    private static boolean sVideoRunning = false;
    private static VideoBlob[] sBlob = new VideoBlob[VIDEO_COUNT];

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_double_decode);

        if (!sVideoRunning) {
            sBlob[0] = new VideoBlob((TextureView) findViewById(R.id.double1_texture_view),
                    ContentManager.MOVIE_SLIDERS, 0);
            sBlob[1] = new VideoBlob((TextureView) findViewById(R.id.double2_texture_view),
                    ContentManager.MOVIE_EIGHT_RECTS, 1);
            sVideoRunning = true;
        } else {
            sBlob[0].recreateView((TextureView) findViewById(R.id.double1_texture_view));
            sBlob[1].recreateView((TextureView) findViewById(R.id.double2_texture_view));
        }
    }

    @Override
    protected void onPause() {
        super.onPause();

        boolean finishing = isFinishing();
        Log.d(TAG, "isFinishing: " + finishing);
        for (int i = 0; i < VIDEO_COUNT; i++) {
            if (finishing) {
                sBlob[i].stopPlayback();
                sBlob[i] = null;
            }
        }
        sVideoRunning = !finishing;
        Log.d(TAG, "onPause complete");
    }

    /**
     * Video playback blob.
     * <p>
     * Encapsulates the video decoder and playback surface.
     * <p>
     * We want to avoid tearing down and recreating the video decoder on orientation changes,
     * because it can be expensive to do so. That means keeping the decoder's output Surface
     * around, which means keeping the SurfaceTexture around.
     * <p>
     * It's possible that the orientation change will cause the UI thread's EGL context to be
     * torn down and recreated (the app framework docs don't seem to make any guarantees here),
     * so we need to detach the SurfaceTexture from EGL on destroy, and reattach it when
     * the new SurfaceTexture becomes available. Happily, TextureView does this for us.
     */
    private static class VideoBlob implements TextureView.SurfaceTextureListener {
        private final String LTAG;
        private TextureView mTextureView;
        private int mMovieTag;

        private SurfaceTexture mSavedSurfaceTexture;
        private PlayMovieThread mPlayThread;
        private SpeedControlCallback mCallback;

        /**
         * Constructs the VideoBlob.
         *
         * @param view The TextureView object we want to draw into.
         * @param movieTag Which movie to play.
         * @param ordinal The blob's ordinal (only used for log messages).
         */
        public VideoBlob(TextureView view, int movieTag, int ordinal) {
            LTAG = TAG + ordinal;
            Log.d(LTAG, "VideoBlob: tag=" + movieTag + " view=" + view);
            mMovieTag = movieTag;

            mCallback = new SpeedControlCallback();

            recreateView(view);
        }

        /**
         * Performs partial construction. The VideoBlob is already created, but the Activity
         * was recreated, so we need to update our view.
         */
        public void recreateView(TextureView view) {
            Log.d(LTAG, "recreateView: " + view);
            mTextureView = view;
            mTextureView.setSurfaceTextureListener(this);
            if (mSavedSurfaceTexture != null) {
                Log.d(LTAG, "using saved st=" + mSavedSurfaceTexture);
                view.setSurfaceTexture(mSavedSurfaceTexture);
            }
        }

        /**
         * Stop playback and shut everything down.
         */
        public void stopPlayback() {
            Log.d(LTAG, "stopPlayback");
            mPlayThread.requestStop();
            // TODO: wait for the playback thread to stop so we don't kill the Surface
            //       before the video stops

            // We don't need this any more, so null it out. This also serves as a signal
            // to let onSurfaceTextureDestroyed() know that it can tell TextureView to
            // free the SurfaceTexture.
            mSavedSurfaceTexture = null;
        }

        @Override
        public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
            Log.d(LTAG, "onSurfaceTextureAvailable size=" + width + "x" + height + ", st=" + st);

            // If this is our first time though, we're going to use the SurfaceTexture that
            // the TextureView provided. If not, we're going to replace the current one with
            // the original.
            if (mSavedSurfaceTexture == null) {
                mSavedSurfaceTexture = st;

                File sliders = ContentManager.getInstance().getPath(mMovieTag);
                mPlayThread = new PlayMovieThread(sliders, new Surface(st), mCallback);
            } else {
                // Can't do it here in Android <= 4.4. The TextureView doesn't add a
                // listener on the new SurfaceTexture, so it never sees any updates.
                // Needs to happen from activity onCreate() -- see recreateView().
                //Log.d(LTAG, "using saved st=" + mSavedSurfaceTexture);
                //mTextureView.setSurfaceTexture(mSavedSurfaceTexture);
            }
        }

        @Override
        public void onSurfaceTextureSizeChanged(SurfaceTexture st, int width, int height) {
            Log.d(LTAG, "onSurfaceTextureSizeChanged size=" + width + "x" + height + ", st=" + st);
        }

        @Override
        public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
            Log.d(LTAG, "onSurfaceTextureDestroyed st=" + st);
            // The SurfaceTexture is already detached from the EGL context at this point, so
            // we don't need to do that.
            //
            // The saved SurfaceTexture will be null if we're shutting down, so we want to
            // return "true" in that case (indicating that TextureView can release the ST).
            return (mSavedSurfaceTexture == null);
        }

        @Override
        public void onSurfaceTextureUpdated(SurfaceTexture st) {
            //Log.d(TAG, "onSurfaceTextureUpdated st=" + st);
        }
    }

    /**
     * Thread object that plays a movie from a file to a surface.
     * <p>
     * Currently loops until told to stop.
     */
    private static class PlayMovieThread extends Thread {
        private final File mFile;
        private final Surface mSurface;
        private final SpeedControlCallback mCallback;
        private MoviePlayer mMoviePlayer;

        /**
         * Creates thread and starts execution.
         * <p>
         * The object takes ownership of the Surface, and will access it from the new thread.
         * When playback completes, the Surface will be released.
         */
        public PlayMovieThread(File file, Surface surface, SpeedControlCallback callback) {
            mFile = file;
            mSurface = surface;
            mCallback = callback;

            start();
        }

        /**
         * Asks MoviePlayer to halt playback. Returns without waiting for playback to halt.
         * <p>
         * Call from UI thread.
         */
        public void requestStop() {
            mMoviePlayer.requestStop();
        }

        @Override
        public void run() {
            try {
                mMoviePlayer = new MoviePlayer(mFile, mSurface, mCallback);
                mMoviePlayer.setLoopMode(true);
                mMoviePlayer.play();
            } catch (IOException ioe) {
                Log.e(TAG, "movie playback failed", ioe);
            } finally {
                mSurface.release();
                Log.d(TAG, "PlayMovieThread stopping");
            }
        }
    }
}
XML:
activity_double_decode.xml
<?xml version="1.0" encoding="utf-8"?>
<!-- portrait layout -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="fill_parent"
    android:layout_height="fill_parent"
    android:baselineAligned="false"
    android:orientation="vertical" >

    <LinearLayout
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:orientation="horizontal"
        android:layout_weight="1"
        android:layout_marginBottom="8dp" >

        <TextureView
            android:id="@+id/double1_texture_view"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content" />

    </LinearLayout>

    <LinearLayout
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:orientation="horizontal"
        android:layout_weight="1" >

        <TextureView
            android:id="@+id/double2_texture_view"
            android:layout_width="wrap_content"
            android:layout_height="wrap_content" />

    </LinearLayout>

</LinearLayout>
Answer 1 (score: 2)
Add all of your video paths to an array or ArrayList and set MediaPlayer.setOnCompletionListener. When the current media finishes playing, that callback is invoked; from there, create a new MediaPlayer instance, hand it the next media, and call start().
I'm only describing the logic; I hope it is useful.
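A minimal sketch of that idea, playing the paths back to back on one Surface. The class and method names (SequentialPlayer, playNext) are illustrative, and the Surface is assumed to come from your TextureView or SurfaceView:

import android.media.MediaPlayer;
import android.util.Log;
import android.view.Surface;

import java.io.IOException;
import java.util.List;

/** Plays a list of video paths one after another on a single Surface (illustrative sketch). */
public class SequentialPlayer {
    private final List<String> videoPaths;
    private final Surface surface;
    private int currentIndex = 0;

    public SequentialPlayer(List<String> videoPaths, Surface surface) {
        this.videoPaths = videoPaths;
        this.surface = surface;
    }

    public void playNext() {
        if (currentIndex >= videoPaths.size()) {
            return; // nothing left to play
        }
        final MediaPlayer player = new MediaPlayer();
        player.setSurface(surface);
        try {
            player.setDataSource(videoPaths.get(currentIndex));
            player.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
                @Override
                public void onPrepared(MediaPlayer mp) {
                    mp.start(); // begin playback once prepared
                }
            });
            player.setOnCompletionListener(new MediaPlayer.OnCompletionListener() {
                @Override
                public void onCompletion(MediaPlayer mp) {
                    mp.release();   // free the finished player
                    currentIndex++; // advance to the next path
                    playNext();     // and start it
                }
            });
            player.prepareAsync();
        } catch (IOException ioe) {
            Log.e("SequentialPlayer", "setDataSource failed", ioe);
        }
    }
}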
Answer 2 (score: 1)
Use a VideoView instead of the ListView; it may work. Take a look here: http://developer.android.com/reference/android/widget/VideoView.html
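A bare-bones example of what using VideoView looks like (the URL is a placeholder, and SingleVideoActivity is just an illustrative name):

import android.app.Activity;
import android.media.MediaPlayer;
import android.net.Uri;
import android.os.Bundle;
import android.widget.MediaController;
import android.widget.VideoView;

public class SingleVideoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        final VideoView videoView = new VideoView(this);
        setContentView(videoView);
        videoView.setMediaController(new MediaController(this));
        videoView.setVideoURI(Uri.parse("http://example.com/sample.mp4")); // placeholder URL
        videoView.setOnPreparedListener(new MediaPlayer.OnPreparedListener() {
            @Override
            public void onPrepared(MediaPlayer mp) {
                videoView.start(); // start once the video is prepared
            }
        });
    }
}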
Answer 3 (score: 1)
This question already has several answers here: stackoverflow.com/questions/31532893/i-want-to-display-multiple-video-in-listview-using-video-but-not-able-to-do-this. Unless your question is different or more specific, this thread will be flagged as a duplicate.
Answer 4 (score: 0)
Current solution: I suggest using JavaCV/OpenCV to play several videos at once from Java. It supports a wide range of formats.
Tutorial - http://ganeshtiwaridotcomdotnp.blogspot.co.nz/search/label/OpenCV-JavaCV
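For instance, a minimal desktop-Java sketch with JavaCV's FFmpegFrameGrabber, assuming a recent org.bytedeco:javacv dependency. CanvasFrame is Swing-based, so on Android you would instead convert the grabbed frames to bitmaps:

import org.bytedeco.javacv.CanvasFrame;
import org.bytedeco.javacv.FFmpegFrameGrabber;
import org.bytedeco.javacv.Frame;

/** Plays one video file in its own window; start several instances for multiple videos. */
public class JavaCvPlayer implements Runnable {
    private final String path;

    public JavaCvPlayer(String path) {
        this.path = path;
    }

    @Override
    public void run() {
        CanvasFrame canvas = new CanvasFrame(path);
        FFmpegFrameGrabber grabber = new FFmpegFrameGrabber(path);
        try {
            grabber.start();
            long frameDelayMs = (long) (1000 / Math.max(grabber.getFrameRate(), 1.0));
            Frame frame;
            while (canvas.isVisible() && (frame = grabber.grabImage()) != null) {
                canvas.showImage(frame);    // draw the decoded frame
                Thread.sleep(frameDelayMs); // crude pacing to the source frame rate
            }
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            try { grabber.stop(); } catch (Exception ignored) { }
            canvas.dispose();
        }
    }

    public static void main(String[] args) {
        // Each video gets its own decoder thread and window.
        new Thread(new JavaCvPlayer("video1.mp4")).start();
        new Thread(new JavaCvPlayer("video2.mp4")).start();
    }
}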
JavaFX can also play some .MP4 video formats.
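For completeness, a minimal JavaFX sketch (the URL is a placeholder; JavaFX generally handles H.264/AAC in MP4):

import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.layout.StackPane;
import javafx.scene.media.Media;
import javafx.scene.media.MediaPlayer;
import javafx.scene.media.MediaView;
import javafx.stage.Stage;

public class FxVideoDemo extends Application {
    @Override
    public void start(Stage stage) {
        // Placeholder URL; H.264/AAC in MP4 is the most widely supported combination.
        Media media = new Media("http://example.com/sample.mp4");
        MediaPlayer player = new MediaPlayer(media);
        player.setAutoPlay(true);
        stage.setScene(new Scene(new StackPane(new MediaView(player)), 640, 360));
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}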
Old solution: JMF could also play multiple videos simultaneously, but it is outdated and no longer maintained.