So, after getting some suggestions for Java packages that can create video files, I decided to try modifying the JPEGImagesToMovie file that ships as a JMF sample. I managed to get everything compiling, and it seems like it should run. The new class takes a Vector of BufferedImages, and instead of reading each frame from a file and converting it to a byte array as before, it converts each BufferedImage to a byte array directly. The problem is that the Processor never configures (it locks up, or something similar). Can anyone spot an obvious flaw that would cause this?
I'm also still completely open to suggestions for a better/simpler framework.
Edit: here is the code where I actually changed something, cleaned up a bit here for readability.

class ImageDataSource extends PullBufferDataSource {
    ImageSourceStream streams[];

    ImageDataSource(int width, int height, int frameRate, Vector images) {
        streams = new ImageSourceStream[1];
        streams[0] = new ImageSourceStream(width, height, frameRate, images);
    }
    /**
     * The source stream to go along with ImageDataSource.
     */
    class ImageSourceStream implements PullBufferStream {

        Vector images;
        int width, height;
        VideoFormat format;
        int nextImage = 0; // index of the next image to be read.
        boolean ended = false;

        public ImageSourceStream(int width, int height, int frameRate, Vector images) {
            this.width = width;
            this.height = height;
            this.images = images;
            format = new VideoFormat(null,
                                     new Dimension(width, height),
                                     Format.NOT_SPECIFIED,
                                     Format.byteArray,
                                     (float) frameRate);
        }
        /**
         * This is called from the Processor to read a frame worth
         * of video data.
         */
        public void read(Buffer buf) throws IOException {
            // Check if we've finished all the frames.
            if (nextImage >= images.size()) {
                // We are done. Set EndOfMedia.
                System.err.println("Done reading all images.");
                buf.setEOM(true);
                buf.setOffset(0);
                buf.setLength(0);
                ended = true;
                return;
            }
            BufferedImage image = (BufferedImage) images.elementAt(nextImage);
            nextImage++;

            // Convert the image directly to a byte array (no file I/O)
            // and hand the fresh array to the buffer. Reusing the array
            // from buf.getData() here would discard the converted frame.
            byte data[] = ImageToByteArray.convertToBytes(image);
            buf.setData(data);
            buf.setOffset(0);
            buf.setLength(data.length);
            buf.setFormat(format);
            buf.setFlags(buf.getFlags() | Buffer.FLAG_KEY_FRAME);
        }
One thing I'm not sure about is what encoding to use for the VideoFormat.
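For context, the ImageToByteArray.convertToBytes helper used in read() isn't shown above; a minimal sketch of one way it could work (my assumption, encoding each frame as JPEG in memory with javax.imageio) would be:

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.imageio.ImageIO;

// Hypothetical implementation of the convertToBytes helper:
// encode a BufferedImage as a JPEG byte array entirely in memory,
// without ever touching a file on disk.
class ImageToByteArray {
    static byte[] convertToBytes(BufferedImage image) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        // ImageIO.write returns false if no writer handles the format.
        if (!ImageIO.write(image, "jpeg", out)) {
            throw new IOException("No JPEG writer available");
        }
        return out.toByteArray();
    }
}
```

If the helper produces JPEG bytes like this, the VideoFormat encoding would presumably need to say so: the original JPEGImagesToMovie sample passes VideoFormat.JPEG as the first constructor argument rather than null, which may be related to the Processor refusing to configure.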