I am struggling with AVSampleBufferDisplayLayer on iOS. I want to use this layer to display a CVPixelBuffer, but I cannot get it to work on an actual iOS device. In my sample app I try to display a solid-color pixel buffer with the following code:
@implementation ViewController {
    AVSampleBufferDisplayLayer *videoLayer;
}

- (void)viewDidLoad {
    [super viewDidLoad];

    videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
    videoLayer.frame = CGRectMake(50, 50, 300, 300);
    videoLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:videoLayer];
}
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self startVideo];
}

- (void)startVideo {
    [self drawPixelBuffer];
    [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(drawPixelBuffer) userInfo:nil repeats:YES];
}
- (void)drawPixelBuffer {
    int imageSize = 100;
    static const uint8_t pixel[] = {0x00, 0xAA, 0xFF, 0xFF}; // BGRA

    // Build a solid-color frame in memory.
    NSMutableData *frame = [NSMutableData data];
    for (int i = 0; i < imageSize * imageSize; i++) {
        [frame appendBytes:pixel length:4];
    }

    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreateWithBytes(NULL, imageSize, imageSize, kCVPixelFormatType_32BGRA,
                                 (void *)[frame bytes], imageSize * 4, NULL, NULL, NULL, &pixelBuffer);

    CMSampleBufferRef sampleBuffer = [self sampleBufferFromPixelBuffer:pixelBuffer];
    if (sampleBuffer) {
        [videoLayer enqueueSampleBuffer:sampleBuffer];
        CFRelease(sampleBuffer);
    }
    CVPixelBufferRelease(pixelBuffer);
}
- (CMSampleBufferRef)sampleBufferFromPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    CMSampleBufferRef sampleBuffer = NULL;
    CMVideoFormatDescriptionRef formatDesc = NULL;

    OSStatus err = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDesc);
    if (err != noErr) {
        return NULL;
    }

    CMSampleTimingInfo sampleTimingInfo = kCMTimingInfoInvalid;
    err = CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault, pixelBuffer, formatDesc, &sampleTimingInfo, &sampleBuffer);
    CFRelease(formatDesc);
    if (err != noErr) {
        return NULL;
    }

    // No timing information is attached, so mark the frame for immediate display.
    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
    CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);

    return sampleBuffer;
}
@end
This works without any problem in the iOS Simulator, but it does not work on a real device (nothing is rendered). The layer's error property is always nil, and its status always equals AVQueuedSampleBufferRenderingStatusRendering.
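For context, the kind of check I am describing looks something like this (just a sketch; the logging is only illustrative):

// Illustrative check of the layer state after enqueueing a frame.
if (videoLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
    NSLog(@"display layer failed: %@", videoLayer.error);
} else {
    // On the device this always reports "rendering" with a nil error, yet nothing is drawn.
    NSLog(@"status = %ld, error = %@", (long)videoLayer.status, videoLayer.error);
}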
Thanks for your help.
Answer 0 (score: 4)
The graphics implementation in the simulator is more robust and often gets away with things that do not work on the device. There are two common reasons for this:
1. You are mapping that buffer directly via CVPixelBufferCreateWithBytes. Try again using CVPixelBufferCreate, with the kCVPixelBufferIOSurfacePropertiesKey attribute set to an empty dictionary:
CVPixelBufferCreate(
    NULL,
    imageSize,
    imageSize,
    kCVPixelFormatType_32BGRA,
    (__bridge CFDictionaryRef)@{
        (id)kCVPixelBufferIOSurfacePropertiesKey : @{}
    },
    &pixelBuffer);

CVPixelBufferLockBaseAddress(pixelBuffer, 0);
void *bytes = CVPixelBufferGetBaseAddress(pixelBuffer);
// Write the image data directly to that address
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
In fact, whatever is generating those pixels should write directly into the CVPixelBufferRef whenever possible.
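For example, a fill routine along these lines writes straight into the buffer and respects its bytes-per-row, which can be larger than width * 4 (just a sketch; solidColorPixelBuffer is a name I made up, and the BGRA color matches the question's example):

static CVPixelBufferRef solidColorPixelBuffer(size_t width, size_t height) {
    CVPixelBufferRef pixelBuffer = NULL;
    NSDictionary *attributes = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
    CVReturn rc = CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                                      kCVPixelFormatType_32BGRA,
                                      (__bridge CFDictionaryRef)attributes,
                                      &pixelBuffer);
    if (rc != kCVReturnSuccess) {
        return NULL;
    }

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer); // may be padded beyond width * 4
    for (size_t y = 0; y < height; y++) {
        uint8_t *row = base + y * bytesPerRow;
        for (size_t x = 0; x < width; x++) {
            row[x * 4 + 0] = 0x00; // B
            row[x * 4 + 1] = 0xAA; // G
            row[x * 4 + 2] = 0xFF; // R
            row[x * 4 + 3] = 0xFF; // A
        }
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    return pixelBuffer; // caller releases with CVPixelBufferRelease
}

The result can then go through the same sampleBufferFromPixelBuffer: path and be enqueued exactly as before.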
2. The pixel format is not supported on the device. That seems very unlikely for kCVPixelFormatType_32BGRA, but I have seen the simulator accept other formats, such as kCVPixelFormatType_422YpCbCr8, that the device would not. In those cases you have to convert to a compatible format first, or implement a custom renderer (OpenGL, Metal, etc.).
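If you really do end up with an incompatible format, one simple way to convert is to go through Core Image (a rough sketch; sourcePixelBuffer is a placeholder, and in a real pipeline you would reuse the CIContext and the destination buffer instead of recreating them per frame):

// Requires CoreImage: #import <CoreImage/CoreImage.h>
CIImage *image = [CIImage imageWithCVPixelBuffer:sourcePixelBuffer];
CIContext *context = [CIContext contextWithOptions:nil]; // reuse this in real code

CVPixelBufferRef bgraBuffer = NULL;
NSDictionary *attributes = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
CVPixelBufferCreate(kCFAllocatorDefault,
                    CVPixelBufferGetWidth(sourcePixelBuffer),
                    CVPixelBufferGetHeight(sourcePixelBuffer),
                    kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)attributes,
                    &bgraBuffer);

// Draw the source frame into the BGRA buffer; it can then be enqueued as usual.
[context render:image toCVPixelBuffer:bgraBuffer];

Whether that is fast enough depends on your frame rate; for heavy use, a vImage or Metal conversion is usually the better choice.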