I'm building an iOS video streaming chat application, and the library I'm using requires me to send the video data by passing the YUV (or, I guess, YCbCr) planes separately.
I have the delegate set up, but I'm not sure how to pull the individual YUV planes out of the CMSampleBufferRef. Most of the Apple guides I've seen deal with capturing video frames into UIImages.
Stream format setup:
- (BOOL)setupWithError:(NSError **)error
{
    AVCaptureDevice *videoCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoCaptureDevice error:error];
    if (!videoInput) {
        return NO;
    }
    [self.captureSession addInput:videoInput];

    self.processingQueue = dispatch_queue_create("abcdefghijk", NULL);

    [self.dataOutput setAlwaysDiscardsLateVideoFrames:YES];
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
    [self.dataOutput setVideoSettings:[NSDictionary dictionaryWithObject:value
                                                                   forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
    [self.dataOutput setSampleBufferDelegate:self queue:self.processingQueue];

    return YES;
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (!imageBuffer) {
        return;
    }

    uint16_t width = CVPixelBufferGetWidth(imageBuffer);
    uint16_t height = CVPixelBufferGetHeight(imageBuffer);

    uint8_t yPlane[??] = ???
    uint8_t uPlane[?] = ???
    uint8_t vPlane[?] = ???

    [self.library sendVideoFrametoFriend:self.friendNumber
                                   width:width
                                  height:height
                                  yPlane:yPlane
                                  uPlane:uPlane
                                  vPlane:vPlane
                                   error:nil];
}
Does anyone have any examples or links that could help with this?

Update: According to {{3}}, there should be more Y elements than U/V elements. The library also confirms this, as below; a small worked example of the sizes follows the list:
* Y - plane should be of size: height * width
* U - plane should be of size: (height/2) * (width/2)
* V - plane should be of size: (height/2) * (width/2)
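To make those requirements concrete, here is a minimal size calculation for a 4:2:0 frame; the 640x480 dimensions are purely illustrative, not taken from the question:

// Expected plane sizes for a 4:2:0 frame; 640x480 is just an example.
size_t width  = 640;
size_t height = 480;
size_t ySize  = width * height;             // 640 * 480 = 307200 bytes
size_t uSize  = (width / 2) * (height / 2); // 320 * 240 =  76800 bytes
size_t vSize  = (width / 2) * (height / 2); // 320 * 240 =  76800 bytes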
Answer 0 (score: 3)
Update: I have since read up on how a YUV buffer is written, and this is how you read it back out. I also make sure I don't malloc on every frame.
Have fun! ;)
// pixelBuffer is the CVImageBufferRef from the sample buffer (imageBuffer above).
// Lock it before reading any plane data.
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

//int bufferWidth = CVPixelBufferGetWidth(pixelBuffer);
int yHeight  = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
int uvHeight = (int)CVPixelBufferGetHeightOfPlane(pixelBuffer, 1);
int yWidth   = (int)CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
int uvWidth  = (int)CVPixelBufferGetWidthOfPlane(pixelBuffer, 1);
int ybpr  = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
int uvbpr = (int)CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 1);

int ysize  = yHeight * ybpr;
int uvsize = uvHeight * uvbpr;
// Allocate the destination planes once and reuse them on every frame.
static unsigned char *ypane;
if (!ypane)
    ypane = (unsigned char *)malloc(ysize);

static unsigned char *upane;
if (!upane)
    upane = (unsigned char *)malloc(uvsize);

static unsigned char *vpane;
if (!vpane)
    vpane = (unsigned char *)malloc(uvsize);

// The base addresses come from the pixel buffer, not from the destination panes.
// (Reading three separate planes assumes a fully planar pixel format; see the
// note below for the bi-planar format used in the question.)
unsigned char *yBase = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
unsigned char *uBase = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
unsigned char *vBase = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 2);
// Copy row by row so the destination planes end up tightly packed
// (indexed by width rather than by bytes-per-row).
for (int y = 0; y < yHeight; y++)
{
    for (int x = 0; x < yWidth; x++)
    {
        ypane[y * yWidth + x] = yBase[y * ybpr + x];
    }
}

for (int y = 0; y < uvHeight; y++)
{
    for (int x = 0; x < uvWidth; x++)
    {
        upane[y * uvWidth + x] = uBase[y * uvbpr + x];
        vpane[y * uvWidth + x] = vBase[y * uvbpr + x];
    }
}

CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
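Note that the copy above pulls U and V from planes 1 and 2, which only works with a fully planar pixel format such as kCVPixelFormatType_420YpCbCr8Planar. With kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, as configured in the question, there are only two planes and plane 1 interleaves Cb and Cr, so the chroma samples have to be de-interleaved. A minimal sketch, assuming pixelBuffer is still locked and upane/vpane are allocated as above:

// Bi-planar case: plane 1 holds interleaved CbCr pairs (Cb, Cr, Cb, Cr, ...).
unsigned char *uvBase = (unsigned char *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
for (int y = 0; y < uvHeight; y++)
{
    for (int x = 0; x < uvWidth; x++)
    {
        upane[y * uvWidth + x] = uvBase[y * uvbpr + 2 * x];     // Cb (U)
        vpane[y * uvWidth + x] = uvBase[y * uvbpr + 2 * x + 1]; // Cr (V)
    }
}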