Can VideoToolbox decode H264 Annex B natively? Error code -8969 BadData

Asked: 2016-08-09 14:37:33

Tags: macos avfoundation h.264 decoding video-toolbox

My goal is to mirror an iDevice's screen to OSX with as little lag as possible.

As far as I know, there are two ways to do this:

  1. AirPlay Mirroring (e.g. Reflector)
  2. CoreMediaIO via Lightning (e.g. QuickTime Recording)

I have chosen to pursue the second method, because (to my knowledge) connected iDevices can be recognized as DAL devices automatically after a one-time setup.

The main resource on how to do this is this blog post: https://nadavrub.wordpress.com/2015/07/06/macos-media-capture-using-coremediaio/

That blog goes very deep into how to use CoreMediaIO; however, it seems you can work with AVFoundation once the connected iDevice has been recognized as an AVCaptureDevice.

This question: How to mirror iOS screen via USB? has a posted solution on how to grab each frame of the H264 (Annex B) muxed data stream provided by the iDevice.

However, my problem is that VideoToolbox will not decode correctly (error code -8969, BadData), even though there is no difference in the code:

    vtDecompressionDuctDecodeSingleFrame signalled err=-8969 (err) (VTVideoDecoderDecodeFrame returned error) at /SourceCache/CoreMedia_frameworks/CoreMedia-1562.240/Sources/VideoToolbox/VTDecompressionSession.c line 3241
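
For context, the call that reports this error is VTDecompressionSessionDecodeFrame. Below is a minimal sketch of that call site, assuming a session and sample buffer created elsewhere; it is not the exact code from the linked question, and the function and variable names are mine:

    @import Foundation;
    @import VideoToolbox;

    // Minimal sketch (not the linked question's exact code): feeding one
    // CMSampleBuffer to an already-created VTDecompressionSessionRef.
    // -8969 (kVTVideoDecoderBadDataErr) is the status that comes back here.
    static OSStatus decodeFrame(VTDecompressionSessionRef session,
                                CMSampleBufferRef sampleBuffer) {
        VTDecodeInfoFlags infoFlags = 0;
        OSStatus status = VTDecompressionSessionDecodeFrame(
            session,
            sampleBuffer,
            kVTDecodeFrame_EnableAsynchronousDecompression,
            NULL,            // sourceFrameRefCon, unused in this sketch
            &infoFlags);
        if (status != 0) {   // 0 == noErr
            NSLog(@"VTDecompressionSessionDecodeFrame failed: %d", (int)status);
        }
        return status;
    }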

Full code:

    #import "ViewController.h"
    
    @import CoreMediaIO;
    @import AVFoundation;
    @import AppKit;
    
    @implementation ViewController
    
    AVCaptureSession *session;
    AVCaptureDeviceInput *newVideoDeviceInput;
    AVCaptureVideoDataOutput *videoDataOutput;
    
    - (void)viewDidLoad {
        [super viewDidLoad];
    }
    
    - (instancetype)initWithCoder:(NSCoder *)coder
    {
        self = [super initWithCoder:coder];
        if (self) {
            // Allow iOS Devices Discovery
            CMIOObjectPropertyAddress prop =
            { kCMIOHardwarePropertyAllowScreenCaptureDevices,
                kCMIOObjectPropertyScopeGlobal,
                kCMIOObjectPropertyElementMaster };
            UInt32 allow = 1;
            CMIOObjectSetPropertyData( kCMIOObjectSystemObject,
                                      &prop, 0, NULL,
                                      sizeof(allow), &allow );
    
            // Get devices
            NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeMuxed];
            BOOL deviceAttached = false;
            for (int i = 0; i < [devices count]; i++) {
                AVCaptureDevice *device = devices[i];
                if ([[device uniqueID] isEqualToString:@"b48defcadf92f300baf5821923f7b3e2e9fb3947"]) {
                    deviceAttached = true;
                    [self startSession:device];
                    break;
                }
            }
    
        }
        return self;
    }
    
    - (void) deviceConnected:(AVCaptureDevice *)device {
        if ([[device uniqueID] isEqualToString:@"b48defcadf92f300baf5821923f7b3e2e9fb3947"]) {
            [self startSession:device];
        }
    }
    
    - (void) startSession:(AVCaptureDevice *)device {
    
        // Init capturing session
        session = [[AVCaptureSession alloc] init];
    
        // Star session configuration
        [session beginConfiguration];
    
        // Add session input
        NSError *error;
        newVideoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (newVideoDeviceInput == nil) {
            dispatch_async(dispatch_get_main_queue(), ^(void) {
                NSLog(@"%@", error);
            });
        } else {
            [session addInput:newVideoDeviceInput];
        }
    
        // Add session output
        videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (id)kCVPixelBufferPixelFormatTypeKey];
    
        dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
    
        [videoDataOutput setSampleBufferDelegate:self queue:videoQueue];
        [session addOutput:videoDataOutput];
    
        // Finish session configuration
        [session commitConfiguration];
    
        // Start the session
        [session startRunning];
    }
    
    #pragma mark - AVCaptureVideoDataOutputSampleBufferDelegate
    
    - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
        //NSImage *resultNSImage = [self imageFromSampleBuffer:sampleBuffer];
    
        //self.imageView.image = [self nsImageFromSampleBuffer:sampleBuffer];
        self.imageView.image = [[NSImage alloc] initWithData:imageToBuffer(sampleBuffer)];
    }    
    
    NSData* imageToBuffer( CMSampleBufferRef source) {
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(source);
        CVPixelBufferLockBaseAddress(imageBuffer,0);
    
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
        size_t height = CVPixelBufferGetHeight(imageBuffer);
        void *src_buff = CVPixelBufferGetBaseAddress(imageBuffer);
    
        NSData *data = [NSData dataWithBytes:src_buff length:bytesPerRow * height];
    
        CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
        return data;
    }  
    

1 Answer:

Answer 0 (score: 1):

No, you have to remove the Annex B start codes and replace them with size values; the format is the same as in MP4 (AVCC).
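
In other words, VideoToolbox expects the MP4/AVCC layout, where every NAL unit is preceded by a big-endian length field rather than an Annex B start code. Below is a minimal sketch of that conversion, assuming each buffer holds exactly one NAL unit with a 4-byte 00 00 00 01 start code; the function name is illustrative and not taken from the question or the linked post:

    @import Foundation;

    // Sketch: replace a leading 4-byte Annex B start code (00 00 00 01) with
    // a 4-byte big-endian NAL unit length, which is what VideoToolbox expects
    // when the format description uses 4-byte length fields.
    // Assumes the input holds exactly one NAL unit with a 4-byte start code.
    static NSData *AVCCFromAnnexB(NSData *annexBNalu) {
        if (annexBNalu.length <= 4) {
            return nil;   // nothing but (or less than) a start code
        }
        uint32_t nalLength = (uint32_t)(annexBNalu.length - 4);     // payload size without the start code
        uint32_t bigEndianLength = CFSwapInt32HostToBig(nalLength); // AVCC lengths are big-endian

        NSMutableData *avcc = [annexBNalu mutableCopy];
        [avcc replaceBytesInRange:NSMakeRange(0, 4)
                        withBytes:&bigEndianLength
                           length:4];
        return avcc;
    }

In this layout the SPS and PPS NAL units are also not left in the stream: they are passed to CMVideoFormatDescriptionCreateFromH264ParameterSets, and the resulting format description is used both to create the VTDecompressionSession and to build the CMSampleBuffers fed to it.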