Sending AVPackets over the network

Date: 2019-09-26 18:00:59

Tags: java ffmpeg

I am generating AVPackets with an encoder using ffmpeg, and now I want to send them over UDP to another computer and display them there. The problem is that I don't know how to convert the packets to bytes and back. I tried to copy the packet with this method:

AVPacket newPacket = avcodec.av_packet_alloc();


ByteBuffer byteBuffer = packet.buf().buffer().asByteBuffer();
int bufferSize = byteBuffer.capacity();
byte bytes[] = new byte[bufferSize];
byteBuffer.get(bytes);
AVBufferRef newBufferRef = avutil.av_buffer_alloc(bufferSize);
newBufferRef.data(new BytePointer(bytes));
newPacket.buf(newBufferRef);


ByteBuffer dataBuffer = packet.data().asByteBuffer();
int dataSize = dataBuffer.capacity();
byte dataBytes[] = new byte[dataSize];
dataBuffer.get(dataBytes);
BytePointer dataPointer = new BytePointer(dataBytes);
newPacket.data(dataPointer);


newPacket.dts(packet.dts());
newPacket.duration(packet.duration());
newPacket.flags(packet.flags());
newPacket.pos(packet.pos());
newPacket.pts(packet.pts());
newPacket.side_data_elems(0);
newPacket.size(packet.size());
newPacket.stream_index(packet.stream_index());


videoPlayer.sendPacket(newPacket);

This gives me the following errors:

[h264 @ 0000018951be8440] Invalid NAL unit size (3290676 > 77).
[h264 @ 0000018951be8440] Error splitting the input into NAL units.
[h264 @ 0000018951bf6480] Invalid NAL unit size (15305314 > 163).
[h264 @ 0000018951bf6480] Error splitting the input into NAL units.

The problem is newPacket.data(). When I set it directly with newPacket.data(packet.data()), it works. Also, packet.data().asByteBuffer().capacity() returns 1, and packet.data().capacity() returns 0.

This is how I create the decoder:

private void startUnsafe() throws Exception
    {
        int result;

        convertContext = null;
        codec = null;
        codecContext = null;
        AVFrame = null;
        RGBAVFrame = null;
        frame = new Frame();

        codec = avcodec_find_decoder(codecID);
        if(codec == null)
        {
            throw new Exception("Unable to find decoder");
        }

        codecContext = avcodec_alloc_context3(codec);
        if(codecContext == null)
        {
            releaseUnsafe();
            throw new Exception("Unable to alloc codec context!");
        }

        AVCodecParameters para = avcodec_parameters_alloc();
        para.bit_rate(streamBitrate);
        para.width(streamWidth);
        para.height(streamHeight);
        para.codec_id(codecID);
        para.codec_type(AVMEDIA_TYPE_VIDEO);
        try
        {
            byte extradataByte[] = Files.readAllBytes(new File("extradata.byte").toPath());
            para.extradata(new BytePointer(extradataByte));
            para.extradata_size(extradataByte.length);
        }
        catch (IOException e1)
        {
            e1.printStackTrace();
            throw new Exception("extradata file not available");
        }

        result = avcodec_parameters_to_context(codecContext, para);
        if(result < 0)
        {
            throw new Exception("Unable to copy parameters to context! [" + result + "]");
        }

        codecContext.thread_count(0);

        result = avcodec_open2(codecContext, codec, new AVDictionary());
        if(result < 0)
        {
            releaseUnsafe();
            throw new Exception("Unable to open codec context![" + result + "]");
        }

        AVFrame = av_frame_alloc();
        if(AVFrame == null)
        {
            releaseUnsafe();
            throw new Exception("Unable to alloc AVFrame!");
        }

        RGBAVFrame = av_frame_alloc();
        if(RGBAVFrame == null)
        {
            releaseUnsafe();
            throw new Exception("Unable to alloc AVFrame!");
        }
        initRGBAVFrame();

        TimerTask task = new TimerTask() {

            @Override
            public void run()
            {
                timerTask();
            }
        };
        timer = new Timer();
        timer.scheduleAtFixedRate(task, 0, (long) (1000/streamFramerateDouble));

        window.setVisible(true);
    }

The file extradata.byte contains some bytes I grabbed from another video, because without them it doesn't work either.
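For reference, here is a minimal sketch of how the matching extradata could be dumped on the encoding side instead of reusing bytes from another video. encoderContext stands for the AVCodecContext the packets are encoded with and is an assumption, as is the file name; Files and File are the same classes already imported in the decoder code above.

// Dump the encoder's extradata (for H.264 this is where the SPS/PPS live) so the
// decoder can load bytes that actually match the stream. 'encoderContext' is assumed.
int extradataSize = encoderContext.extradata_size();
byte extradataBytes[] = new byte[extradataSize];
encoderContext.extradata().get(extradataBytes);
Files.write(new File("extradata.byte").toPath(), extradataBytes);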

Edit:

package org.stratostream.streaming;

import java.nio.ByteBuffer;


import org.bytedeco.javacpp.BytePointer;
import org.bytedeco.javacpp.Pointer;
import org.bytedeco.javacpp.avcodec;
import org.bytedeco.javacpp.avutil;
import org.bytedeco.javacpp.avcodec.AVPacket;
import org.bytedeco.javacpp.avcodec.AVPacketSideData;


public class PacketIO {
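    // Byte layout produced by toByte() and expected by toPacket():
    // [0..3] offset of the side data block (also the end of the packet data),
    // [4..7] side_data_elems, [8..11] side data type, [12..19] dts,
    // [20..27] pts, [28..31] flags, [32..] packet data, then the side data bytes.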


    public static final int SIDE_DATA_FIELD = 0;
    public static final int SIDE_ELEMENTS_FIELD = 4;
    public static final int SIDE_TYPE_FIELD = 8;
    public static final int DTS_FIELD = 12;
    public static final int PTS_FIELD = 20;
    public static final int FLAGS_FIELD = 28;
    public static final int DATA_OFFSET = 32;

    public static byte[] toByte(AVPacket packet) throws Exception
    {
        int dataSize = packet.size();
        ByteBuffer dataBuffer = packet.data().capacity(dataSize).asByteBuffer();
        byte dataBytes[] = new byte[dataSize];
        dataBuffer.get(dataBytes);

        AVPacketSideData sideData = packet.side_data();
        int sideSize = sideData.size();
        ByteBuffer sideBuffer = sideData.data().capacity(sideSize).asByteBuffer();
        byte sideBytes[] = new byte[sideSize];
        sideBuffer.get(sideBytes);

        int sideOffset = DATA_OFFSET + dataSize;
        int resultSize = sideOffset + sideSize;
        byte resultBytes[] = new byte[resultSize];
        System.arraycopy(dataBytes, 0, resultBytes, DATA_OFFSET, dataSize);
        System.arraycopy(sideBytes, 0, resultBytes, sideOffset, sideSize);
        resultBytes[SIDE_DATA_FIELD] = (byte) (sideOffset >>> 24);
        resultBytes[SIDE_DATA_FIELD+1] = (byte) (sideOffset >>> 16);
        resultBytes[SIDE_DATA_FIELD+2] = (byte) (sideOffset >>> 8);
        resultBytes[SIDE_DATA_FIELD+3] = (byte) (sideOffset >>> 0);

        int sideType = sideData.type();
        intToByte(resultBytes, SIDE_TYPE_FIELD, sideType);

        int sideElements = packet.side_data_elems();
        intToByte(resultBytes, SIDE_ELEMENTS_FIELD, sideElements);

        long dts = packet.dts();
        longToByte(resultBytes, DTS_FIELD, dts);

        long pts = packet.pts();
        longToByte(resultBytes, PTS_FIELD, pts);

        int flags = packet.flags();
        intToByte(resultBytes, FLAGS_FIELD, flags);

        return resultBytes;
    }

    public static AVPacket toPacket(byte bytes[]) throws Exception
    {
        AVPacket packet = avcodec.av_packet_alloc();

        int sideOffset = byteToInt(bytes, SIDE_DATA_FIELD);
        int sideElements = byteToInt(bytes, SIDE_ELEMENTS_FIELD);
        int sideType = byteToInt(bytes, SIDE_TYPE_FIELD);
        int dataSize = sideOffset - DATA_OFFSET;
        int sideSize = bytes.length - sideOffset;

        long pts = byteToLong(bytes, PTS_FIELD);
        long dts = byteToLong(bytes, DTS_FIELD);
        int flags = byteToInt(bytes, FLAGS_FIELD);

        packet.pts(pts);
        packet.dts(dts);
        packet.flags(flags);


        Pointer newDataPointer =  avutil.av_malloc(bytes.length);
        BytePointer dataPointer = new BytePointer(newDataPointer);
        byte dataBytes[] = new byte[dataSize];
        System.arraycopy(bytes, DATA_OFFSET, dataBytes, 0, dataSize);
        dataPointer.put(dataBytes);
        packet.data(dataPointer);
        packet.size(dataSize);

        Pointer newSidePointer = avutil.av_malloc(sideSize);
        BytePointer sidePointer = new BytePointer(newSidePointer);
        byte sideBytes[] = new byte[sideSize];
        System.arraycopy(bytes, sideOffset, sideBytes, 0, sideSize);
        sidePointer.put(sideBytes);
        AVPacketSideData sideData = new AVPacketSideData();
        sideData.data(sidePointer);
        sideData.type(sideType);
        sideData.size(sideSize);
        //packet.side_data(sideData);
        //packet.side_data_elems(sideElements);

        return packet;
    }

    private static void intToByte(byte[] bytes, int offset, int value)
    {
        bytes[offset] = (byte) (value >>> 24);
        bytes[offset+1] = (byte) (value >>> 16);
        bytes[offset+2] = (byte) (value >>> 8);
        bytes[offset+3] = (byte) (value >>> 0);
    }

    private static void longToByte(byte[] bytes, int offset, long value)
    {
        bytes[offset] = (byte) (value >>> 56);
        bytes[offset+1] = (byte) (value >>> 48);
        bytes[offset+2] = (byte) (value >>> 40);
        bytes[offset+3] = (byte) (value >>> 32);
        bytes[offset+4] = (byte) (value >>> 24);
        bytes[offset+5] = (byte) (value >>> 16);
        bytes[offset+6] = (byte) (value >>> 8);
        bytes[offset+7] = (byte) (value >>> 0);
    }

    private static int byteToInt(byte[] bytes, int offset)
    {
        return (bytes[offset]<<24)&0xff000000|(bytes[offset+1]<<16)&0x00ff0000|(bytes[offset+2]<<8)&0x0000ff00|(bytes[offset+3]<<0)&0x000000ff;
    }

    private static long byteToLong(byte[] bytes, int offset)
    {
        // widen each byte to long before shifting so the upper four bytes are not lost
        return ((long)(bytes[offset]&0xff)<<56)|((long)(bytes[offset+1]&0xff)<<48)|((long)(bytes[offset+2]&0xff)<<40)|((long)(bytes[offset+3]&0xff)<<32)|((long)(bytes[offset+4]&0xff)<<24)|((long)(bytes[offset+5]&0xff)<<16)|((long)(bytes[offset+6]&0xff)<<8)|((long)(bytes[offset+7])&0xff);
    }

}

Now I have a class that works fine when I use it within the same program, but when I send the bytes over the network I get wrong output, and these errors are printed to the console:

[h264 @ 00000242442acc40] Missing reference picture, default is 72646
[h264 @ 000002424089de00] Missing reference picture, default is 72646
[h264 @ 000002424089e3c0] mmco: unref short failure
[h264 @ 000002424081a580] reference picture missing during reorder
[h264 @ 000002424081a580] Missing reference picture, default is 72652
[h264 @ 000002424082c400] mmco: unref short failure
[h264 @ 000002424082c400] co located POCs unavailable
[h264 @ 000002424082c9c0] co located POCs unavailable
[h264 @ 00000242442acc40] co located POCs unavailable
[h264 @ 000002424089de00] mmco: unref short failure

I think that is because I don't set the side data fields, but when I try to set them, the decoder crashes on the second packet.

The output looks like this: Decoder Output

2 Answers:

Answer 0 (score: 0):

The call AVPacket.data() returns a BytePointer, which is a wrapper around a native C++ pointer. There is no capacity information associated with the pointer, which means you need to set the capacity manually. You probably need something like this:

int packetSize = packet.size();
ByteBuffer dataBuffer = packet.data().capacity(packetSize).asByteBuffer();
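
Once the capacity is set, the bytes can be copied out of the buffer as usual, for example (a small usage sketch continuing from the two lines above):

byte dataBytes[] = new byte[packetSize];
dataBuffer.get(dataBytes);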

Answer 1 (score: 0):

I found a solution that works for me.

AVPacket to bytes:

    int dataSize = packet.size();
    ByteBuffer dataBuffer = packet.data().capacity(dataSize).asByteBuffer();
    byte dataBytes[] = new byte[dataSize];
    dataBuffer.get(dataBytes);

    AVPacketSideData sideData = packet.side_data();
    int sideSize = sideData.size();
    ByteBuffer sideBuffer = sideData.data().capacity(sideSize).asByteBuffer();
    byte sideBytes[] = new byte[sideSize];
    sideBuffer.get(sideBytes);

    int sideType = sideData.type();
    int sideElements = packet.side_data_elems();
    long dts = packet.dts();
    long pts = packet.pts();
    int flags = packet.flags();

Bytes back to an AVPacket:

    packet = avcodec.av_packet_alloc();
    packet.pts(pts);
    packet.dts(dts);
    packet.flags(flags);

    Pointer newDataPointer =  avutil.av_malloc(dataSize);
    BytePointer dataPointer = new BytePointer(newDataPointer);
    dataPointer.put(dataBytes);
    packet.data(dataPointer);
    packet.size(dataSize);

    Pointer newSidePointer = avutil.av_malloc(sideSize);
    BytePointer sidePointer = new BytePointer(newSidePointer);
    sidePointer.put(sideBytes);

    BytePointer sideDataPointer = avcodec.av_packet_new_side_data(packet, sideType, sideSize);
    AVPacketSideData sideData = new AVPacketSideData(sideDataPointer);
    sideData.data(sidePointer);
    sideData.type(sideType);
    sideData.size(sideSize);
    packet.side_data(sideData);
    packet.side_data_elems(sideElements);
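
To actually move the serialized bytes between the two machines, as asked in the question, a plain DatagramSocket can be used. This is only a rough sketch and not part of the original answer: PacketIO.toByte/toPacket are the helpers from the edit above, while the address, port, and buffer size are placeholder assumptions. Note that a single UDP datagram is limited to roughly 64 KB, so larger packets would need to be split up.

    // Sender side (requires java.net.DatagramSocket/DatagramPacket/InetAddress; exception handling omitted)
    byte sendBytes[] = PacketIO.toByte(packet);
    DatagramSocket sendSocket = new DatagramSocket();
    InetAddress receiver = InetAddress.getByName("192.168.0.2"); // assumed receiver address
    sendSocket.send(new DatagramPacket(sendBytes, sendBytes.length, receiver, 9999));

    // Receiver side: read one datagram, trim it to its actual length, and rebuild the packet
    DatagramSocket receiveSocket = new DatagramSocket(9999);
    byte receiveBuffer[] = new byte[65535];
    DatagramPacket datagram = new DatagramPacket(receiveBuffer, receiveBuffer.length);
    receiveSocket.receive(datagram);
    byte receivedBytes[] = java.util.Arrays.copyOf(receiveBuffer, datagram.getLength());
    AVPacket receivedPacket = PacketIO.toPacket(receivedBytes);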