I'm recording live video in an iOS app. On another StackOverflow page (Link) I found that a vImage_Buffer can be used to process my frames.
The problem is that I don't know how to get a CVPixelBufferRef back out of the resulting vImage_Buffer.
Here is the code given in the other post:
NSInteger cropX0 = 100,
cropY0 = 100,
cropHeight = 100,
cropWidth = 100,
outWidth = 480,
outHeight = 480;
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer,0);
void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
vImage_Buffer inBuff;
inBuff.height = cropHeight;
inBuff.width = cropWidth;
inBuff.rowBytes = bytesPerRow;
size_t startpos = cropY0 * bytesPerRow + 4 * cropX0; // 4 bytes per BGRA pixel
inBuff.data = (unsigned char *)baseAddress + startpos; // cast first: no arithmetic on void *
unsigned char *outImg = (unsigned char *)malloc(4 * outWidth * outHeight);
vImage_Buffer outBuff = {outImg, outHeight, outWidth, 4 * outWidth}; // {data, height, width, rowBytes}
vImage_Error err = vImageScale_ARGB8888(&inBuff, &outBuff, NULL, 0);
if (err != kvImageNoError) NSLog(@" error %ld", err);
Now I need to convert outBuff into a CVPixelBufferRef. I think I need to use vImageBuffer_CopyToCVPixelBuffer, but I'm not sure how.
My first attempt fails with an EXC_BAD_ACCESS:
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
CVPixelBufferRef pixelBuffer;
CVPixelBufferCreate(kCFAllocatorSystemDefault, 480, 480, kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);
CVPixelBufferLockBaseAddress( pixelBuffer, 0 );
vImage_CGImageFormat format = {
.bitsPerComponent = 8,
.bitsPerPixel = 32,
.bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst, //BGRX8888
.colorSpace = NULL, //sRGB
};
vImageBuffer_CopyToCVPixelBuffer(&outBuff,
&format,
pixelBuffer,
NULL,
NULL,
kvImageNoFlags); // Here is the crash!
CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );
Any ideas?
Answer 0 (score: 1)
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                         [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                         [NSNumber numberWithInt:480], kCVPixelBufferWidthKey,
                         [NSNumber numberWithInt:480], kCVPixelBufferHeightKey,
                         nil];
CVPixelBufferRef pixbuffer = NULL;
// Row bytes must match outImg (4 * outWidth), not the source frame's bytesPerRow.
CVReturn status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault, 480, 480, kCVPixelFormatType_32BGRA, outImg, 4 * outWidth, NULL, NULL, (__bridge CFDictionaryRef)options, &pixbuffer);
You should create a brand-new pixel buffer like this.
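One caveat worth noting here: CVPixelBufferCreateWithBytes wraps outImg rather than copying it, so the malloc'd memory must stay valid for the lifetime of pixbuffer. A minimal sketch (my addition, using a hypothetical releaseOutImg callback) that hands ownership to the pixel buffer so Core Video frees the memory for you:

static void releaseOutImg(void *releaseRefCon, const void *baseAddress) {
    free((void *)baseAddress); // baseAddress is the malloc'd outImg
}

CVReturn status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault, 480, 480,
                                               kCVPixelFormatType_32BGRA,
                                               outImg, 4 * outWidth,
                                               releaseOutImg, NULL, // called when pixbuffer is destroyed
                                               (__bridge CFDictionaryRef)options,
                                               &pixbuffer);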
Answer 1 (score: 0)
Some notes on vImage with regard to your problem (things may differ from case to case):
CVPixelBufferCreateWithBytes will not work with vImageBuffer_CopyToCVPixelBuffer(), because you need to copy the vImage_Buffer data into a "clean" or "empty" CVPixelBuffer.
No locking/unlocking is needed, but make sure you know when to lock and when not to lock pixel buffers.
Your inBuff vImage_Buffer only needs to be initialized from the pixel buffer data, not set up manually (unless you know how to use CGContexts etc. to initialize the pixel grid). Use vImageBuffer_InitWithCVPixelBuffer(); see the sketch after these notes.
vImageScale_ARGB8888 scales the entire CVPixel data to a smaller/larger rectangle. It does not scale a portion/crop of one buffer into another buffer.
When you use vImageBuffer_CopyToCVPixelBuffer(), the vImageCVImageFormatRef and the vImage_CGImageFormat must be filled in correctly.
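A minimal sketch of that initialization (my addition, assuming the asker's 32BGRA camera frames; imageBuffer is the CVImageBufferRef from the question):

vImage_CGImageFormat srcFormat = {
    .bitsPerComponent = 8,
    .bitsPerPixel = 32,
    .bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst, // BGRX8888
    .colorSpace = NULL, // NULL is treated as sRGB
};
vImage_Buffer sourceBuffer;
// Allocates sourceBuffer.data and copies/converts the frame into it;
// no manual base-address arithmetic needed.
vImage_Error initErr = vImageBuffer_InitWithCVPixelBuffer(&sourceBuffer,
                                                          &srcFormat,
                                                          imageBuffer,
                                                          NULL, // derive the CV format from the buffer's attachments (may fail if they're missing)
                                                          NULL, // no background color
                                                          kvImagePrintDiagnosticsToConsole);
if (initErr != kvImageNoError) NSLog(@"vImageBuffer_InitWithCVPixelBuffer error %ld", initErr);
// ... use sourceBuffer, then free(sourceBuffer.data) when finished.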
CGColorSpaceRef dstColorSpace = CGColorSpaceCreateWithName(kCGColorSpaceITUR_709);
vImage_CGImageFormat format = {
    .bitsPerComponent = 16,
    .bitsPerPixel = 64,
    .bitmapInfo = (CGBitmapInfo)kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder16Big,
    .colorSpace = dstColorSpace
};
vImageCVImageFormatRef vformat = vImageCVImageFormat_Create(kCVPixelFormatType_4444AYpCbCr16,
                                                            kvImage_ARGBToYpCbCrMatrix_ITU_R_709_2,
                                                            kCVImageBufferChromaLocation_Center,
                                                            format.colorSpace,
                                                            0);
CVPixelBufferRef destBuffer = NULL;
CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                      480,
                                      480,
                                      kCVPixelFormatType_4444AYpCbCr16,
                                      NULL,
                                      &destBuffer);
NSParameterAssert(status == kCVReturnSuccess && destBuffer != NULL);
// sourceBuffer is the vImage_Buffer initialized from the camera frame.
vImage_Error err = vImageBuffer_CopyToCVPixelBuffer(&sourceBuffer, &format, destBuffer, vformat, NULL, kvImagePrintDiagnosticsToConsole);
(Note: these are the settings for 64-bit ProRes with alpha; adjust for 32-bit.)
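For the asker's 32-bit BGRA output, a hedged adaptation of the above (my sketch, reusing sourceBuffer from the earlier snippet; the RGB-to-YpCbCr matrix should be irrelevant for an RGB destination but is kept for safety):

CGColorSpaceRef srgb = CGColorSpaceCreateWithName(kCGColorSpaceSRGB);
vImage_CGImageFormat bgraFormat = {
    .bitsPerComponent = 8,
    .bitsPerPixel = 32,
    .bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst, // BGRX8888
    .colorSpace = srgb,
};
vImageCVImageFormatRef bgraCVFormat = vImageCVImageFormat_Create(kCVPixelFormatType_32BGRA,
                                                                 kvImage_ARGBToYpCbCrMatrix_ITU_R_709_2, // unused for RGB formats
                                                                 kCVImageBufferChromaLocation_Center,
                                                                 bgraFormat.colorSpace,
                                                                 0);
CVPixelBufferRef bgraDest = NULL;
CVReturn cvStatus = CVPixelBufferCreate(kCFAllocatorDefault, 480, 480,
                                        kCVPixelFormatType_32BGRA, NULL, &bgraDest);
NSParameterAssert(cvStatus == kCVReturnSuccess && bgraDest != NULL);
vImage_Error copyErr = vImageBuffer_CopyToCVPixelBuffer(&sourceBuffer, &bgraFormat, bgraDest,
                                                        bgraCVFormat, NULL, kvImagePrintDiagnosticsToConsole);
if (copyErr != kvImageNoError) NSLog(@"copy error %ld", copyErr);
vImageCVImageFormat_Release(bgraCVFormat);
CGColorSpaceRelease(srgb);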