Encoding raw YUV420P to h264 with AVCodec on iOS

Date: 2012-10-30 00:18:52

Tags: ios encoding ffmpeg

I am trying to encode individual YUV420P images, collected from a CMSampleBuffer, into an AVPacket so that I can send h264 video over the network with RTMP.

The code sample posted below appears to run without error: avcodec_encode_video2 returns 0 (success), but got_output is also 0 (the AVPacket is empty).

Does anyone have experience encoding video on iOS devices and know what I might be doing wrong?

- (void) captureOutput:(AVCaptureOutput *)captureOutput
 didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
        fromConnection:(AVCaptureConnection *)connection {

  // sampleBuffer now contains an individual frame of raw video
  CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

  CVPixelBufferLockBaseAddress(pixelBuffer, 0);

  // access the data
  int width = CVPixelBufferGetWidth(pixelBuffer);
  int height = CVPixelBufferGetHeight(pixelBuffer);
  int bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
  unsigned char *rawPixelBase = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);


  // Convert the raw pixel base to h.264 format
  AVCodec *codec = 0;
  AVCodecContext *context = 0;
  AVFrame *frame = 0;
  AVPacket packet;

  //avcodec_init();
  avcodec_register_all(); // NOTE: registering and opening the codec really belongs in one-time setup, not in this per-frame callback
  codec = avcodec_find_encoder(AV_CODEC_ID_H264);

  if (codec == 0) {
    NSLog(@"Codec not found!!");
    return;
  }

  context = avcodec_alloc_context3(codec);

  if (!context) {
    NSLog(@"Context no bueno.");
    return;
  }

  // Bit rate
  context->bit_rate = 400000; // HARD CODE
  context->bit_rate_tolerance = 10;
  // Resolution
  context->width = width;
  context->height = height;
  // Frames Per Second
  context->time_base = (AVRational) {1,25};
  context->gop_size = 1;
  //context->max_b_frames = 1;
  context->pix_fmt = PIX_FMT_YUV420P;

  // Open the codec
  if (avcodec_open2(context, codec, 0) < 0) {
    NSLog(@"Unable to open codec");
    return;
  }


  // Create the frame
  frame = avcodec_alloc_frame();
  if (!frame) {
    NSLog(@"Unable to alloc frame");
    return;
  }
  frame->format = context->pix_fmt;
  frame->width = context->width;
  frame->height = context->height;


  // NOTE: this assumes rawPixelBase points at one contiguous planar YUV420P
  // buffer, but only the base address of plane 0 (luma) was fetched above;
  // camera CVPixelBuffers are typically bi-planar NV12, so the chroma data
  // would need to be converted/copied separately.
  avpicture_fill((AVPicture *) frame, rawPixelBase, context->pix_fmt, frame->width, frame->height);

  // NOTE: frame->pts is never set here; the encoder generally needs a
  // monotonically increasing pts before it will emit packets.
  int got_output = 0;
  av_init_packet(&packet);
  packet.data = 0; // let the encoder allocate the payload
  packet.size = 0;
  if (avcodec_encode_video2(context, &packet, frame, &got_output) < 0) {
    NSLog(@"Error encoding frame");
    return;
  }

  // Unlock the pixel data
  CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
  // Send the data over the network (only if the encoder produced output)
  if (got_output) {
    [self uploadData:[NSData dataWithBytes:packet.data length:packet.size] toRTMP:self.rtmp_OutVideoStream];
  }
}

Note: this code is known to leak memory, because I am not freeing the dynamically allocated objects.

Update

I updated my code to use @pogorskiy's method. I only try to upload the frame if got_output returns 1, and I flush the encoder's buffer once I am done encoding video frames.
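Regarding the update: with the old avcodec_encode_video2 API, x264 buffers frames internally (lookahead, B-frames), so got_output can legitimately stay 0 for the first few calls; the buffered packets come out only when the encoder is drained with NULL frames. A hedged sketch of that flush loop, assuming `context` is the already-opened encoder context from the code above:

```c
// Drain the encoder: after the last real frame, keep calling
// avcodec_encode_video2 with a NULL frame until got_output stays 0.
AVPacket packet;
int got_output = 1;
while (got_output) {
    av_init_packet(&packet);
    packet.data = NULL;  // let the encoder allocate the payload
    packet.size = 0;
    if (avcodec_encode_video2(context, &packet, NULL, &got_output) < 0)
        break;
    if (got_output) {
        // ... send packet.data / packet.size over RTMP ...
        av_free_packet(&packet);
    }
}
```

This fragment depends on the same FFmpeg version the question uses and is not runnable on its own.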

0 Answers:

No answers yet