Can someone recommend a Java library that would let me create videos programmatically? Specifically, it would do the following:

Can anyone recommend anything? For the image/sound mixing I would even settle for a series of frames, where for each frame I have to supply the raw bytes of the uncompressed sound data associated with that frame.

P.S. It doesn't even have to be a "third-party library" if the Java Media Framework has calls to achieve the above, but from my sketchy memory I get the feeling it doesn't.
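For the raw-audio part of the requirement, the JDK alone can at least wrap uncompressed PCM bytes in a WAV container via `javax.sound.sampled`; a muxing library can then combine that with a video track. A minimal sketch (the sample rate, buffer contents, and file name are illustrative, not from any answer below):

```java
import java.io.ByteArrayInputStream;
import java.io.File;
import java.io.IOException;

import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

public class RawPcmToWav {
    public static void main(String[] args) throws IOException {
        // One second of silence: 44.1 kHz, 16-bit, mono, signed, little-endian PCM.
        byte[] pcm = new byte[44100 * 2];
        AudioFormat format = new AudioFormat(44100f, 16, 1, true, false);
        // Frame count = total bytes / bytes per frame.
        long frames = pcm.length / format.getFrameSize();
        try (AudioInputStream ais =
                new AudioInputStream(new ByteArrayInputStream(pcm), format, frames)) {
            AudioSystem.write(ais, AudioFileFormat.Type.WAVE, new File("silence.wav"));
        }
        System.out.println("wrote silence.wav");
    }
}
```

The same call works for any raw byte stream, so per-frame audio chunks could be concatenated into one PCM buffer first.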
Answer 0 (score: 5)
I have successfully done items 1, 2 and 4 from your requirements list in pure Java using the code referenced below. It's worth a look, and you might work out how to include #3.
http://www.randelshofer.ch/blog/2010/10/writing-quicktime-movies-in-pure-java/
Answer 1 (score: 4)
I found a tool called ffmpeg that can convert multimedia files from one format to another. ffmpeg includes a filter library called libavfilter, the replacement for vhook, which allows video/audio to be modified or examined between the decoder and the encoder. I think it should be possible to feed in raw frames and generate a video. I looked into Java implementations of ffmpeg and found a page titled "Getting Started with FFMPEG-JAVA", a JAVA wrapper around FFMPEG using JNA.
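Independent of any JNA wrapper, the simplest integration is often to shell out to the ffmpeg binary itself. A hedged sketch (it assumes ffmpeg is on the PATH and a numbered PNG sequence exists; the file names are illustrative):

```java
import java.util.Arrays;
import java.util.List;

public class FfmpegCommand {
    // Builds the argument list for encoding a numbered PNG sequence to H.264/MP4.
    static List<String> buildCommand(String pattern, String out, int fps) {
        return Arrays.asList(
                "ffmpeg",
                "-framerate", String.valueOf(fps), // input frame rate
                "-i", pattern,                     // e.g. frame%04d.png
                "-c:v", "libx264",                 // H.264 encoder
                "-pix_fmt", "yuv420p",             // widest player compatibility
                out);
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = buildCommand("frame%04d.png", "out.mp4", 25);
        System.out.println(String.join(" ", cmd));
        // To actually run it (requires ffmpeg installed):
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```

Keeping the command as a `List<String>` avoids shell-quoting problems with `%`-patterns and paths containing spaces.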
Answer 2 (score: 2)
You could try a pure-Java codec library called JCodec. It has a very basic H.264 (AVC) encoder and an MP4 muxer. Below is complete sample code taken from their samples — TranscodeMain.
```java
private static void png2avc(String pattern, String out) throws IOException {
    FileChannel sink = null;
    try {
        sink = new FileOutputStream(new File(out)).getChannel();
        H264Encoder encoder = new H264Encoder();
        // Convert RGB input to the 4:2:0 chroma layout H.264 expects.
        RgbToYuv420 transform = new RgbToYuv420(0, 0);
        int encoded = 0;
        for (int i = 0; i < 10000; i++) {
            File nextImg = new File(String.format(pattern, i));
            if (!nextImg.exists())
                continue;
            BufferedImage rgb = ImageIO.read(nextImg);
            Picture yuv = Picture.create(rgb.getWidth(), rgb.getHeight(), ColorSpace.YUV420);
            transform.transform(AWTUtil.fromBufferedImage(rgb), yuv);
            ByteBuffer buf = ByteBuffer.allocate(rgb.getWidth() * rgb.getHeight() * 3);
            // Encode one frame into a raw H.264 stream.
            ByteBuffer ff = encoder.encodeFrame(buf, yuv);
            sink.write(ff);
            encoded++;
        }
        if (encoded == 0) {
            System.out.println("Image sequence not found");
        }
    } finally {
        if (sink != null)
            sink.close();
    }
}
```
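The `pattern` argument above is a `String.format` template for numbered PNG files. Such a sequence can be produced with nothing but the JDK's 2D API — a sketch under the assumption that frames are drawn procedurally (dimensions and file names are illustrative):

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;

import javax.imageio.ImageIO;

public class FrameGenerator {
    public static void main(String[] args) throws IOException {
        int width = 320, height = 240, frames = 10;
        for (int i = 0; i < frames; i++) {
            BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
            Graphics2D g = img.createGraphics();
            g.setColor(Color.BLACK);
            g.fillRect(0, 0, width, height);
            // A white square moving left to right across the frame.
            g.setColor(Color.WHITE);
            g.fillRect(i * 20, height / 2 - 10, 20, 20);
            g.dispose();
            ImageIO.write(img, "PNG", new File(String.format("frame%08d.png", i)));
        }
        System.out.println("wrote " + frames + " frames");
    }
}
```

`BufferedImage` works headless, so this needs no display server.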
This example is more complex and shows actual muxing of the encoded frames into an MP4 file:
```java
private static void prores2avc(String in, String out, ProresDecoder decoder, RateControl rc) throws IOException {
    SeekableByteChannel sink = null;
    SeekableByteChannel source = null;
    try {
        sink = writableFileChannel(out);
        source = readableFileChannel(in);
        MP4Demuxer demux = new MP4Demuxer(source);
        MP4Muxer muxer = new MP4Muxer(sink, Brand.MOV);
        // ProRes is 4:2:2 10-bit; the H.264 encoder wants 4:2:0.
        Transform transform = new Yuv422pToYuv420p(0, 2);
        H264Encoder encoder = new H264Encoder(rc);
        MP4DemuxerTrack inTrack = demux.getVideoTrack();
        CompressedTrack outTrack = muxer.addTrackForCompressed(TrackType.VIDEO, (int) inTrack.getTimescale());
        VideoSampleEntry ine = (VideoSampleEntry) inTrack.getSampleEntries()[0];
        Picture target1 = Picture.create(ine.getWidth(), ine.getHeight(), ColorSpace.YUV422_10);
        Picture target2 = null;
        ByteBuffer _out = ByteBuffer.allocate(ine.getWidth() * ine.getHeight() * 6);
        ArrayList<ByteBuffer> spsList = new ArrayList<ByteBuffer>();
        ArrayList<ByteBuffer> ppsList = new ArrayList<ByteBuffer>();
        Packet inFrame;
        int totalFrames = (int) inTrack.getFrameCount();
        long start = System.currentTimeMillis();
        // Transcode at most 100 frames.
        for (int i = 0; (inFrame = inTrack.getFrames(1)) != null && i < 100; i++) {
            Picture dec = decoder.decodeFrame(inFrame.getData(), target1.getData());
            if (target2 == null) {
                target2 = Picture.create(dec.getWidth(), dec.getHeight(), ColorSpace.YUV420);
            }
            transform.transform(dec, target2);
            _out.clear();
            ByteBuffer result = encoder.encodeFrame(_out, target2);
            if (rc instanceof ConstantRateControl) {
                // Frame size is fixed under constant rate control; clamp to it.
                int mbWidth = (dec.getWidth() + 15) >> 4;
                int mbHeight = (dec.getHeight() + 15) >> 4;
                result.limit(((ConstantRateControl) rc).calcFrameSize(mbWidth * mbHeight));
            }
            spsList.clear();
            ppsList.clear();
            // Strip SPS/PPS from the stream; MP4 keeps them in the sample entry.
            H264Utils.encodeMOVPacket(result, spsList, ppsList);
            outTrack.addFrame(new MP4Packet((MP4Packet) inFrame, result));
            if (i > 0 && i % 100 == 0) { // guard against division by zero on the first frame
                long elapse = System.currentTimeMillis() - start;
                System.out.println((i * 100 / totalFrames) + "%, " + (i * 1000 / elapse) + "fps");
            }
        }
        outTrack.addSampleEntry(H264Utils.createMOVSampleEntry(spsList, ppsList));
        muxer.writeHeader();
    } finally {
        if (sink != null)
            sink.close();
        if (source != null)
            source.close();
    }
}
```
Answer 3 (score: 0)

Why not use FFMPEG?

There seems to be a Java wrapper:

http://fmj-sf.net/ffmpeg-java/getting_started.php

Here is an example of how to compile various media sources into one video with FFMPEG:

http://howto-pages.org/ffmpeg/#multiple
And finally, the docs:
Answer 4 (score: -1)