I am trying to generate a thumbnail from a video. I can do that, but I only need a single thumbnail per video, and instead I am getting several images taken at different times in the video. I use the following code to generate the thumbnails. Please suggest how to modify it so that it grabs just one thumbnail, from the middle of the video. Here is the code I am using (I used Xuggler):
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

import javax.imageio.ImageIO;

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.MediaListenerAdapter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.mediatool.event.IVideoPictureEvent;
import com.xuggle.xuggler.Global;

public class Main {

    public static final double SECONDS_BETWEEN_FRAMES = 10;

    private static final String inputFilename = "D:\\k\\Knock On Wood Lesson.flv";
    private static final String outputFilePrefix = "D:\\pix\\";

    // The video stream index, used to ensure we display frames from one and
    // only one video stream from the media container.
    private static int mVideoStreamIndex = -1;

    // Time of last frame write
    private static long mLastPtsWrite = Global.NO_PTS;

    public static final long MICRO_SECONDS_BETWEEN_FRAMES =
            (long) (Global.DEFAULT_PTS_PER_SECOND * SECONDS_BETWEEN_FRAMES);

    public static void main(String[] args) {
        IMediaReader mediaReader = ToolFactory.makeReader(inputFilename);

        // stipulate that we want BufferedImages created in BGR 24bit color space
        mediaReader.setBufferedImageTypeToGenerate(BufferedImage.TYPE_3BYTE_BGR);

        mediaReader.addListener(new ImageSnapListener());

        // read out the contents of the media file and
        // dispatch events to the attached listener
        while (mediaReader.readPacket() == null);
    }

    private static class ImageSnapListener extends MediaListenerAdapter {

        public void onVideoPicture(IVideoPictureEvent event) {

            if (event.getStreamIndex() != mVideoStreamIndex) {
                // if the selected video stream id is not yet set, go ahead and
                // select this lucky video stream
                if (mVideoStreamIndex == -1) {
                    mVideoStreamIndex = event.getStreamIndex();
                } else {
                    // no need to show frames from this video stream
                    return;
                }
            }

            // if uninitialized, back date mLastPtsWrite to get the very first frame
            if (mLastPtsWrite == Global.NO_PTS) {
                mLastPtsWrite = event.getTimeStamp() - MICRO_SECONDS_BETWEEN_FRAMES;
            }

            // if it's time to write the next frame
            if (event.getTimeStamp() - mLastPtsWrite >= MICRO_SECONDS_BETWEEN_FRAMES) {

                String outputFilename = dumpImageToFile(event.getImage());

                // indicate file written
                double seconds = ((double) event.getTimeStamp())
                        / Global.DEFAULT_PTS_PER_SECOND;
                System.out.printf("at elapsed time of %6.3f seconds wrote: %s\n",
                        seconds, outputFilename);

                // update last write time
                mLastPtsWrite += MICRO_SECONDS_BETWEEN_FRAMES;
            }
        }

        private String dumpImageToFile(BufferedImage image) {
            try {
                String outputFilename = outputFilePrefix
                        + System.currentTimeMillis() + ".png";
                ImageIO.write(image, "png", new File(outputFilename));
                return outputFilename;
            } catch (IOException e) {
                e.printStackTrace();
                return null;
            }
        }
    }
}
Answer 0 (score: 2)
You can do it like this: open the container, work out the midpoint from the container's duration, seek to a key frame near that point, and save the first complete picture you decode.
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

import javax.imageio.ImageIO;

import com.xuggle.xuggler.ICodec;
import com.xuggle.xuggler.IContainer;
import com.xuggle.xuggler.IPacket;
import com.xuggle.xuggler.IPixelFormat;
import com.xuggle.xuggler.IRational;
import com.xuggle.xuggler.IStream;
import com.xuggle.xuggler.IStreamCoder;
import com.xuggle.xuggler.IVideoPicture;
import com.xuggle.xuggler.IVideoResampler;
import com.xuggle.xuggler.Utils;

public class ThumbsGenerator {

    private static void processFrame(IVideoPicture picture, BufferedImage image) {
        try {
            // name of the thumbnail picture
            File file = new File("C:\\snapshot\\thumbnailpic.png");
            ImageIO.write(image, "png", file);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    @SuppressWarnings("deprecation")
    public static void main(String[] args) throws NumberFormatException, IOException {
        String filename = "your_video.mp4";
        if (!IVideoResampler.isSupported(IVideoResampler.Feature.FEATURE_COLORSPACECONVERSION))
            throw new RuntimeException("you must install the GPL version of Xuggler (with IVideoResampler support) for this demo to work");

        IContainer container = IContainer.make();
        if (container.open(filename, IContainer.Type.READ, null) < 0)
            throw new IllegalArgumentException("could not open file: " + filename);

        // time of thumbnail: the middle of the video, in seconds
        long seconds = container.getDuration() / (1000000 * 2);

        int numStreams = container.getNumStreams();

        // and iterate through the streams to find the first video stream
        int videoStreamId = -1;
        IStreamCoder videoCoder = null;
        for (int i = 0; i < numStreams; i++) {
            // find the stream object
            IStream stream = container.getStream(i);
            // get the pre-configured decoder that can decode this stream
            IStreamCoder coder = stream.getStreamCoder();
            if (coder.getCodecType() == ICodec.Type.CODEC_TYPE_VIDEO) {
                videoStreamId = i;
                videoCoder = coder;
                break;
            }
        }
        if (videoStreamId == -1)
            throw new RuntimeException(
                    "could not find video stream in container: " + filename);
        if (videoCoder.open() < 0)
            throw new RuntimeException(
                    "could not open video decoder for container: " + filename);

        IVideoResampler resampler = null;
        if (videoCoder.getPixelType() != IPixelFormat.Type.BGR24) {
            resampler = IVideoResampler.make(videoCoder.getWidth(),
                    videoCoder.getHeight(), IPixelFormat.Type.BGR24,
                    videoCoder.getWidth(), videoCoder.getHeight(),
                    videoCoder.getPixelType());
            if (resampler == null)
                throw new RuntimeException(
                        "could not create color space resampler for: " + filename);
        }

        IPacket packet = IPacket.make();
        IRational timeBase = container.getStream(videoStreamId).getTimeBase();
        System.out.println("Timebase " + timeBase.toString());

        // convert the midpoint (in seconds) into the stream's time base units
        long timeStampOffset = (timeBase.getDenominator() / timeBase.getNumerator()) * seconds;
        System.out.println("TimeStampOffset " + timeStampOffset);

        long target = container.getStartTime() + timeStampOffset;
        container.seekKeyFrame(videoStreamId, target, 0);

        boolean isFinished = false;
        while (container.readNextPacket(packet) >= 0 && !isFinished) {
            if (packet.getStreamIndex() == videoStreamId) {
                IVideoPicture picture = IVideoPicture.make(videoCoder.getPixelType(),
                        videoCoder.getWidth(), videoCoder.getHeight());
                int offset = 0;
                while (offset < packet.getSize()) {
                    int bytesDecoded = videoCoder.decodeVideo(picture, packet, offset);
                    if (bytesDecoded < 0) {
                        System.err.println("WARNING!!! got no data decoding " +
                                "video in one packet");
                        break; // skip this packet rather than looping forever
                    }
                    offset += bytesDecoded;

                    // once we have a complete picture from the decoder, convert and save it
                    if (picture.isComplete()) {
                        IVideoPicture newPic = picture;
                        if (resampler != null) {
                            newPic = IVideoPicture.make(resampler.getOutputPixelFormat(),
                                    picture.getWidth(), picture.getHeight());
                            if (resampler.resample(newPic, picture) < 0)
                                throw new RuntimeException(
                                        "could not resample video from: " + filename);
                        }
                        if (newPic.getPixelType() != IPixelFormat.Type.BGR24)
                            throw new RuntimeException(
                                    "could not decode video as BGR 24 bit data in: " + filename);
                        BufferedImage javaImage = Utils.videoPictureToImage(newPic);
                        processFrame(newPic, javaImage);
                        isFinished = true;
                    }
                }
            }
        }

        if (videoCoder != null) {
            videoCoder.close();
            videoCoder = null;
        }
        if (container != null) {
            container.close();
            container = null;
        }
    }
}
Answer 1 (score: 1)
I know this is an old question, but I came across the same tutorial code today while playing with Xuggler. The reason you get multiple thumbnails is this line:
public static final double SECONDS_BETWEEN_FRAMES = 10;
This variable specifies the number of seconds between calls to dumpImageToFile. As a result, frame thumbnails are written at 0.00 s, 10.00 s, 20.00 s, and so on, because of this check:
if (event.getTimeStamp() - mLastPtsWrite >= MICRO_SECONDS_BETWEEN_FRAMES)
To get a single thumbnail from the middle of the video, you can use a bit more Xuggler functionality (which I found in a JavaCodeGeeks tutorial) to work out the video's duration first, and then change the code in ImageSnapListener so that exactly one frame is written, the first one whose IVideoPictureEvent timestamp passes the calculated midpoint, as sketched below.
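Here is a minimal sketch of that change, reusing the file paths from the question. I have not tested it against your exact file: the class names MiddleThumbnail and SingleSnapListener are just placeholders, and it assumes IContainer.getDuration() reports a usable duration in microseconds for your FLV (it can return Global.NO_PTS for some files, in which case you would need another way to find the midpoint).

import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

import javax.imageio.ImageIO;

import com.xuggle.mediatool.IMediaReader;
import com.xuggle.mediatool.MediaListenerAdapter;
import com.xuggle.mediatool.ToolFactory;
import com.xuggle.mediatool.event.IVideoPictureEvent;
import com.xuggle.xuggler.IContainer;

public class MiddleThumbnail {

    private static final String inputFilename = "D:\\k\\Knock On Wood Lesson.flv";
    private static final String outputFilePrefix = "D:\\pix\\";

    public static void main(String[] args) {
        // First pass: open the container only to read the duration (in microseconds).
        IContainer container = IContainer.make();
        if (container.open(inputFilename, IContainer.Type.READ, null) < 0)
            throw new IllegalArgumentException("could not open file: " + inputFilename);
        long midpointMicroseconds = container.getDuration() / 2; // assumes the duration is known
        container.close();

        // Second pass: decode the video and snap the first frame at or after the midpoint.
        IMediaReader mediaReader = ToolFactory.makeReader(inputFilename);
        mediaReader.setBufferedImageTypeToGenerate(BufferedImage.TYPE_3BYTE_BGR);
        mediaReader.addListener(new SingleSnapListener(midpointMicroseconds));
        while (mediaReader.readPacket() == null);
    }

    private static class SingleSnapListener extends MediaListenerAdapter {

        private final long midpointMicroseconds;
        private boolean written = false;

        SingleSnapListener(long midpointMicroseconds) {
            this.midpointMicroseconds = midpointMicroseconds;
        }

        @Override
        public void onVideoPicture(IVideoPictureEvent event) {
            // write exactly one image: the first frame whose timestamp
            // (in microseconds) has passed the calculated midpoint
            if (written || event.getTimeStamp() < midpointMicroseconds) {
                return;
            }
            try {
                String outputFilename = outputFilePrefix + "middle-thumbnail.png";
                ImageIO.write(event.getImage(), "png", new File(outputFilename));
                written = true;
                System.out.println("wrote middle-of-video thumbnail: " + outputFilename);
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}

The reader keeps dispatching events until the end of the file; if your videos are long you could also break out of the readPacket() loop once the thumbnail has been written.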
I hope that helps anyone who stumbles across this question.