Java RenderedImage implementation with a custom raw DataBuffer

Date: 2015-08-25 22:19:43

Tags: java image image-processing

I receive 16-bit grayscale images from a device. The images are delivered in an uncompressed raw format. Here is an 8-byte example of how a 2x2 image is laid out in this format (MSB first):

21 27           33 F6          28 F3           27 F2
-----           -----          -----           -----
pixel 0,0 (x,y) pixel 1,0      pixel 0,1       pixel 1,1

I need to compress the images using the Kakadu JPEG2000 library, which exposes a Java ImageWriter implementation. The ImageWriter.write method expects a RenderedImage as input, and I currently create a BufferedImage from the raw image data with the following code:

int[] rasterData = new int[width * height];
int rawBufferOffset = 0;
for (int i = 0; i < rasterData.length; i++) {
    // Combine two consecutive bytes into one 16-bit sample (second byte is the high byte)
    rasterData[i] = ((int) rawBuffer[rawBufferOffset + 1] << 8) | ((int) rawBuffer[rawBufferOffset] & 0xFF);
    rawBufferOffset += 2;
}
BufferedImage image = new BufferedImage(width, height, BufferedImage.TYPE_USHORT_GRAY);
image.getRaster().setPixels(0, 0, width, height, rasterData);

The code works, but it is clearly not the optimal way to do this conversion. I am considering writing a RenderedImage implementation that uses rawBuffer directly as the source of the image's raster data. Can anyone suggest how to do that, or propose another approach for this conversion?
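
For reference, the compression step itself would look roughly like the sketch below. This is only a minimal outline, not Kakadu-specific code: the format name "jpeg2000" and the output file name are assumptions, and the exact lookup name depends on how the Kakadu plugin registers itself with ImageIO.

// Look up a JPEG 2000 writer; the format name is an assumption
Iterator<ImageWriter> writers = ImageIO.getImageWritersByFormatName("jpeg2000");
if (!writers.hasNext()) {
    throw new IllegalStateException("No JPEG 2000 ImageWriter registered");
}
ImageWriter writer = writers.next();
try (ImageOutputStream output = ImageIO.createImageOutputStream(new File("image.jp2"))) {
    writer.setOutput(output);
    writer.write(image); // image is the RenderedImage/BufferedImage built from the raw buffer
} finally {
    writer.dispose();
}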

1 Answer:

Answer 0 (score: 0):

The most straightforward approach is probably to use a ByteBuffer to swap the byte order, and to create a new short array to hold the pixel data.

Then wrap the (short) pixel data in a DataBufferUShort. Create a matching WritableRaster and ColorModel, and finally create a BufferedImage from them. This image should be identical to the one created by the code above (BufferedImage.TYPE_USHORT_GRAY), but slightly faster to create, because the pixels are copied only once (rather than twice as in your code).

int w = 2;
int h = 2;
int stride = 1;

byte[] rawBytes = {0x21, 0x27, 0x33, (byte) 0xF6, 0x28, (byte) 0xF3, (byte) 0x27, (byte) 0xF2};
short[] rawShorts = new short[rawBytes.length / 2];

// Swap byte order: interpret the raw bytes as little-endian 16-bit values
ByteBuffer.wrap(rawBytes)
        .order(ByteOrder.LITTLE_ENDIAN)
        .asShortBuffer()
        .get(rawShorts);

// Wrap the shorts in a DataBuffer and create a matching interleaved Raster
DataBuffer dataBuffer = new DataBufferUShort(rawShorts, rawShorts.length);
WritableRaster raster = Raster.createInterleavedRaster(dataBuffer, w, h, w * stride, stride, new int[]{0}, null);
ColorModel colorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_GRAY), false, false, Transparency.OPAQUE, DataBuffer.TYPE_USHORT);

BufferedImage image = new BufferedImage(colorModel, raster, colorModel.isAlphaPremultiplied(), null);
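
As a quick sanity check (my addition, not part of the original answer), you can read a pixel back from the resulting image; with the little-endian interpretation above, the first two bytes 21 27 decode to 0x2721:

// Reads band 0 of pixel (0,0) through the USHORT raster
int sample = image.getRaster().getSample(0, 0, 0);
System.out.printf("pixel (0,0) = 0x%04X%n", sample); // prints 0x2721 for the example bytes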

Another approach, slightly more complex but probably faster (because you don't copy the backing pixel array at all), is to create a custom SampleModel that works on the little-endian byte data directly but exposes it as TYPE_USHORT. This will create a TYPE_CUSTOM image.

int w = 2, h = 2, stride = 2;
byte[] rawBytes = {0x21, 0x27, 0x33, (byte) 0xF6, 0x28, (byte) 0xF3, (byte) 0x27, (byte) 0xF2};
DataBuffer dataBuffer = new DataBufferByte(rawBytes, rawBytes.length);

SampleModel sampleModel = new ComponentSampleModel(DataBuffer.TYPE_USHORT, w, h, stride, w * stride, new int[] {0}) {
    @Override
    public Object getDataElements(int x, int y, Object obj, DataBuffer data) {
        if ((x < 0) || (y < 0) || (x >= width) || (y >= height)) {
            throw new ArrayIndexOutOfBoundsException("Coordinate out of bounds!");
        }

        // Simplified, as we only support TYPE_USHORT
        int numDataElems = getNumDataElements();
        int pixelOffset = y * scanlineStride + x * pixelStride;

        short[] sdata;

        if (obj == null) {
            sdata = new short[numDataElems];
        }
        else {
            sdata = (short[]) obj;
        }

        // Assemble each 16-bit sample from two bytes (second byte is the high byte)
        for (int i = 0; i < numDataElems; i++) {
            sdata[i] = (short) (data.getElem(bankIndices[i], pixelOffset + bandOffsets[i] + 1) << 8
                    | data.getElem(bankIndices[i], pixelOffset + bandOffsets[i]));
        }

        return sdata;
    }
};
ColorModel colorModel = new ComponentColorModel(ColorSpace.getInstance(ColorSpace.CS_GRAY), false, false, Transparency.OPAQUE, DataBuffer.TYPE_USHORT);
WritableRaster raster = Raster.createWritableRaster(sampleModel, dataBuffer, null);

BufferedImage image = new BufferedImage(colorModel, raster, colorModel.isAlphaPremultiplied(), null);
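
To check that the custom SampleModel decodes the same values (again my addition, not part of the original answer), read a pixel through getDataElements, which is the method overridden above; note that Raster.getSample would bypass the override and return a single byte:

// getDataElements goes through the overridden method and assembles the full 16-bit value
short[] pixel = (short[]) image.getRaster().getDataElements(0, 0, null);
System.out.printf("pixel (0,0) = 0x%04X%n", pixel[0] & 0xFFFF); // 0x2721 for the example bytes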

I really don't see a reason to create a RenderedImage subclass for this.