Converting a C header function to a Delphi function to pass YUV images

Asked: 2014-05-26 15:33:22

Tags: delphi header-files delphi-xe4 yuv

I need to translate the C header of an image-processing SDK that contains this function:

#define SDK_API __declspec(dllimport)
SDK_API   BOOL   WINAPI SetSourceYUVJ420(HANDLE Display, BYTE **YUV420P, int *LineSize, int srcWidth, int srcHeight);

This function is used to pass YUVJ420 frames to the SDK.

In my code, the frame is stored in a PAVPicture record (defined in the FFVCL component library), which contains:

data: array[0..7] of pbyte;
linesize: array[0..7] of integer;

After decoding a video frame, FFVCL fires an event in which the frame is available as APicture of type PAVPicture.

I translated the function like this:

function  SetSourceYUVJ420(Display: UIntPtr; YUV420P: Pointer; LineSize: Pointer; srcWidth, srcHeight: integer): boolean stdcall; external 'SDK.DLL' name '_SetSourceYUVJ420@20';

and I call it like this:

SetSourceYUVJ420(Display, @APicture.data[0], @APicture.linesize[0], W, H);

..but it seems I am getting some of the pointer/address levels wrong.

The SDK documentation is in Chinglish and out of date. I do have a C sample that uses FFmpeg for decoding; when a frame is finished, parts of pFrame (of type AVFrame) are passed to the SDK:

unsigned WINAPI MYTest() {
   AVFormatContext *pFormatCtx;
   unsigned int    i, videoStream;
   AVCodecContext  *pCodecCtx;
   AVCodec         *pCodec;
   AVFrame         *pFrame = NULL;
   AVPacket        packet;
   int             frameFinished;

   av_register_all();

   pFormatCtx =  avformat_alloc_context();
   avformat_open_input(&pFormatCtx, "..\\..\\images\\Test.mp4",NULL,NULL);
   av_find_stream_info(pFormatCtx);

   videoStream=-1;
   for (i=0; i < pFormatCtx->nb_streams; i++) {
      if (pFormatCtx->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
         videoStream = i;
         break;
      }
   }

   // Get a pointer to the codec context for the video stream
   pCodecCtx = pFormatCtx->streams[videoStream]->codec;
   pCodec = avcodec_find_decoder(pCodecCtx->codec_id);
   // Open codec
   avcodec_open2(pCodecCtx, pCodec, NULL);

   // Allocate video frame
   pFrame = av_frame_alloc();

   // Read frames and save first five frames to disk
   i = 0;

   BOOL fSetBuffer = FALSE;
   BOOL f = FALSE;
   while (av_read_frame(pFormatCtx, &packet) >= 0 && !fTerminate) {
      // Is this a packet from the video stream?
      if(packet.stream_index==videoStream) {
         // Decode video frame
          avcodec_decode_video2(pCodecCtx, pFrame, &frameFinished, &packet);

          if ( frameFinished ) {
            // Tell the SDK that a new video frame is incoming, in YUV420 format
             SetSourceYUVJ420(gDisplay, pFrame->data, pFrame->linesize, pCodecCtx->width, pCodecCtx->height);
         }
    }

     // Free the packet that was allocated by av_read_frame
     av_free_packet(&packet);
  }
  // Free the YUV frame
  av_free(pFrame);
  // Close the codec
  avcodec_close(pCodecCtx);
  // Close the video file
  avformat_close_input(&pFormatCtx);

  return TRUE;
}

This is the declaration of AVFrame (taken from FFmpeg's frame.h):

typedef struct AVFrame {
#define AV_NUM_DATA_POINTERS 8
    /**
     * pointer to the picture/channel planes.
     * This might be different from the first allocated byte
     *
     * Some decoders access areas outside 0,0 - width,height, please
     * see avcodec_align_dimensions2(). Some filters and swscale can read
     * up to 16 bytes beyond the planes, if these filters are to be used,
     * then 16 extra bytes must be allocated.
     */
    uint8_t *data[AV_NUM_DATA_POINTERS];

    /**
     * For video, size in bytes of each picture line.
     * For audio, size in bytes of each plane.
     *
     * For audio, only linesize[0] may be set. For planar audio, each channel
     * plane must be the same size.
     *
     * For video the linesizes should be multiples of the CPUs alignment
     * preference, this is 16 or 32 for modern desktop CPUs.
     * Some code requires such alignment other code can be slower without
     * correct alignment, for yet other it makes no difference.
     *
     * @note The linesize may be larger than the size of usable data -- there
     * may be extra padding present for performance reasons.
     */
    int linesize[AV_NUM_DATA_POINTERS];
<SNIP>
} AVFrame;

Questions:

  • How to correctly translate this C function declaration
  • How to use the provided FFVCL type PAVPicture
  • How to call this function

1 answer:

Answer 0 (score: 1)

There is a lot of missing detail here. I will do my best to steer you in the right direction, but please do not take this answer as definitive. In order to write interop code you really need a complete and clear specification of the binary interface. You may have to do some digging to get to the bottom of that.

First of all, let's look at the HANDLE type. This might be the Win32 HANDLE type, in which case it would translate to THandle. More likely it is a type defined by the library in question. Perhaps it is best translated as Pointer or NativeInt. I am going to assume the latter.

The BYTE** parameter is an array of BYTE*. This appears to be allocated by the caller. You would probably translate it as the Delphi PPByte, a pointer to a pointer to Byte.

The next parameter is LineSize, of type int*. This is an array of int. So the literal translation would be PInteger, a pointer to Integer.

The final two parameters are plain integers.
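To make the pointer levels concrete, here is a minimal sketch (PInteger is predeclared in the System unit; declare PPByte yourself only if your Delphi version does not already provide it). The names in this sketch are illustrative, not part of the SDK:

type
  PPByte = ^PByte;  // corresponds to C's BYTE**; skip if your RTL already declares it

procedure PointerLevelSketch;
var
  Data: array[0..7] of PByte;       // plane pointers, same shape as AVFrame.data
  LineSize: array[0..7] of Integer; // per-plane strides, same shape as AVFrame.linesize
  Planes: PPByte;
  Strides: PInteger;
begin
  Planes  := @Data[0];      // address of the first PByte element acts as a BYTE**
  Strides := @LineSize[0];  // address of the first Integer element acts as an int*
end;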

So the function would be declared like this (the decorated name _SetSourceYUVJ420@20 indicates stdcall with 20 bytes of parameters, i.e. five 4-byte arguments in a 32-bit build):

function SetSourceYUVJ420(
  Display: NativeInt; 
  YUV420P: PPByte; 
  LineSize: PInteger; 
  srcWidth: Integer;
  srcHeight: integer
): LongBool; stdcall; external 'SDK.DLL' name '_SetSourceYUVJ420@20';

You will also need to translate the struct. It goes something like this:

type
  TAVFrame = record
    data: array [0..7] of PByte;
    linesize: array [0..7] of Integer;
  end;
  PAVFrame = ^TAVFrame;

Obviously your code needs to get hold of the display handle. I do not know how you do that; presumably you already know. Likewise, you need to create a frame by calling av_frame_alloc. Again, I can only assume that you already know how to do that.
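For completeness, a minimal sketch of what importing av_frame_alloc could look like; the DLL name 'avutil-52.dll' is an assumption and depends on the FFmpeg build you deploy, and if you work with FFVCL you would normally use the frame handed to you by its decode event instead:

// Assumed import; check the avutil DLL name shipped with your FFmpeg build
function av_frame_alloc: PAVFrame; cdecl;
  external 'avutil-52.dll' name 'av_frame_alloc';

Note that av_frame_alloc only allocates the AVFrame structure itself; the picture buffers are filled in by the decoder.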

So, assuming you have correctly initialised the following variables:

var
  Display: NativeInt;
  Frame: PAVFrame;

then the call would look like this:

if not SetSourceYUVJ420(Display, @Frame.data[0], @Frame.linesize[0], W, H) then
  .... handle error
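Since the FFVCL PAVPicture record exposes the same data/linesize layout, the call from inside your frame event should look the same. A sketch, assuming the event gives you APicture: PAVPicture plus the frame width and height W and H, and that Display already holds the SDK display handle (those surrounding names are yours, not mine):

// APicture, W and H come from your FFVCL decode event; Display from the SDK
if not SetSourceYUVJ420(Display, PPByte(@APicture.data[0]),
                        PInteger(@APicture.linesize[0]), W, H) then
  .... handle error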

Judging from the C code you have shown, a lot has to happen before this function can be called, and you need to get all of it right for the call to work. Which is to say that if your code does not work, the problem may well lie in code we cannot see, code that is not included in your question.