Screen recording from a GLKView

Time: 2014-03-25 15:08:13

Tags: objective-c opengl-es

I am trying to record the screen of a GLKView while the user interacts with it. The video file is written and has the correct length, but it shows only a black screen.

I have subclassed GLKView and added a pan gesture recognizer to it; whenever the user does something, I draw points on my view (it is more complicated than that, but you get the idea).

Here is how I initialize the video:

    NSError *error = nil;

    NSURL *url = [NSURL fileURLWithPath:@"/Users/Dimillian/Documents/DEV/movie.mp4"];
    [[NSFileManager defaultManager] removeItemAtURL:url error:nil];
    // AVFileTypeMPEG4 matches the .mp4 extension; AVFileTypeAppleM4V is meant for .m4v files.
    self.assetWriter = [[AVAssetWriter alloc] initWithURL:url fileType:AVFileTypeMPEG4 error:&error];
    if (error != nil)
    {
        NSLog(@"Error: %@", error);
    }


    NSMutableDictionary * outputSettings = [[NSMutableDictionary alloc] init];
    [outputSettings setObject: AVVideoCodecH264 forKey: AVVideoCodecKey];
    [outputSettings setObject: [NSNumber numberWithInt: 954] forKey: AVVideoWidthKey];
    [outputSettings setObject: [NSNumber numberWithInt: 608] forKey: AVVideoHeightKey];


    self.assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
    self.assetWriterVideoInput.expectsMediaDataInRealTime = YES;

    // You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                           [NSNumber numberWithInt:954], kCVPixelBufferWidthKey,
                                                           [NSNumber numberWithInt:608], kCVPixelBufferHeightKey,
                                                           nil];

    self.assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:
                             self.assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

    [self.assetWriter addInput:self.assetWriterVideoInput];

    self.startTime = [NSDate date];
    self.lastTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:self.startTime],120);

    [self.assetWriter startWriting];
    [self.assetWriter startSessionAtSourceTime:kCMTimeZero];
}
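As the comment in the setup code notes, glReadPixels returns RGBA while the pixel buffer adaptor is configured for kCVPixelFormatType_32BGRA. If no color-swizzling shader is in place, the red and blue channels end up swapped. A minimal CPU-side fallback, sketched in plain C (`rgba_to_bgra` is a hypothetical helper, not part of the code above):

```c
#include <stdint.h>
#include <stddef.h>

/* Swap the R and B bytes of each 4-byte pixel so the RGBA data that
   glReadPixels produces matches the BGRA layout the writer expects. */
void rgba_to_bgra(uint8_t *pixels, size_t pixelCount)
{
    for (size_t i = 0; i < pixelCount; i++) {
        uint8_t r = pixels[4 * i + 0];
        pixels[4 * i + 0] = pixels[4 * i + 2]; /* B moves into byte 0 */
        pixels[4 * i + 2] = r;                 /* R moves into byte 2 */
    }
}
```

A shader-based swizzle (as in Brad Larson's approach) avoids this per-frame CPU cost, but the copy above is an easy way to rule out a channel-order problem.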

Now here is a short version of my recognizer:

- (void)pan:(UIPanGestureRecognizer *)p {
   // Prepare vertex to be added on screen according to user input
   [self setNeedsDisplay];
}

And here is my drawRect: method:

- (void)drawRect:(CGRect)rect
{
    glClearColor(1, 1, 1, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    [effect prepareToDraw];

    //removed code about vertex drawing

    [self capturePixels];
}

And finally my capturePixels method:

- (void)capturePixels
{
    glFinish();

    CVPixelBufferRef pixel_buffer = NULL;

    CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, self.assetWriterPixelBufferInput.pixelBufferPool, &pixel_buffer);
    if ((pixel_buffer == NULL) || (status != kCVReturnSuccess))
    {
        NSLog(@"%d", status);
        NSLog(@"VIDEO FAILED");
        return;
    }
    else
    {
        CVPixelBufferLockBaseAddress(pixel_buffer, 0);
        glReadPixels(0, 0, 954, 608, GL_RGBA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(pixel_buffer));
    }

    CMTime currentTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:self.startTime],120);

    // Appending two consecutive frames with the same presentation time aborts
    // the recording, so drop this frame if time has not advanced.
    if (CMTimeCompare(currentTime, self.lastTime) <= 0)
    {
        CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);
        CVPixelBufferRelease(pixel_buffer);
        return;
    }

    if(![self.assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:currentTime])
    {
        NSLog(@"Problem appending pixel buffer at time: %lld", currentTime.value);
    }
    else
    {
        NSLog(@"%@", pixel_buffer);
        NSLog(@"Recorded pixel buffer at time: %lld", currentTime.value);
        self.lastTime = currentTime;
    }
    CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);

    CVPixelBufferRelease(pixel_buffer);
}
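One thing worth checking when frames come out wrong: CVPixelBufferGetBytesPerRow is often larger than width * 4, because Core Video pads rows for alignment (for a width of 954, the row stride is typically padded past 954 * 4 = 3816 bytes). Writing the glReadPixels output straight to the base address then misaligns every row. A sketch of the row-by-row copy, assuming glReadPixels first fills a tightly packed scratch buffer (`copy_packed_rows` is a hypothetical helper, not part of the code above):

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Copy a tightly packed image (width * 4 bytes per row, as glReadPixels
   writes it) into a destination whose rows are padded out to bytesPerRow,
   as CVPixelBuffer rows usually are. */
void copy_packed_rows(const uint8_t *packed, uint8_t *strided,
                      size_t width, size_t height, size_t bytesPerRow)
{
    size_t packedRow = width * 4;
    for (size_t y = 0; y < height; y++) {
        memcpy(strided + y * bytesPerRow, packed + y * packedRow, packedRow);
    }
}
```

In capturePixels this would mean reading into a malloc'd scratch buffer and then copying into CVPixelBufferGetBaseAddress(pixel_buffer) with the stride reported by CVPixelBufferGetBytesPerRow(pixel_buffer).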

I also have another method that closes the video input:

- (void)tearDownGL
{
    NSLog(@"Tear down");

    [self.assetWriterVideoInput markAsFinished];
    [self.assetWriter endSessionAtSourceTime:self.lastTime];
    [self.assetWriter finishWritingWithCompletionHandler:^{
        NSLog(@"finish video");
    }];

    [EAGLContext setCurrentContext:context];

    glDeleteBuffers(1, &vertexBuffer);
    glDeleteVertexArraysOES(1, &vertexArray);

    effect = nil;

    glFinish();

    if ([EAGLContext currentContext] == context) {
        [EAGLContext setCurrentContext:nil];
    }
    context = nil;
}
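Note that finishWritingWithCompletionHandler: returns immediately and completes asynchronously, while the GL teardown above runs right away; the writer's final status is also worth logging. A minimal sketch, assuming tearDownGL can run off the main queue (the semaphore is illustrative, not from the original code):

```objc
dispatch_semaphore_t writerDone = dispatch_semaphore_create(0);
[self.assetWriter finishWritingWithCompletionHandler:^{
    // AVAssetWriterStatusCompleted means the file was actually written.
    NSLog(@"finish video, status: %ld", (long)self.assetWriter.status);
    dispatch_semaphore_signal(writerDone);
}];
// Block until the writer has drained its frames before tearing down GL state.
// (Do not wait like this on the main queue.)
dispatch_semaphore_wait(writerDone, DISPATCH_TIME_FOREVER);
```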

It seems to work, since I get no errors, and at the end the video has the correct length, but it is just black... I am not an OpenGL expert; it is only a small part of my iOS application that I want to learn, and I am working at it. Thanks to the post from @BradLarson (OpenGL ES 2.0 to Video on iPad/iPhone) I have been able to make progress, but now I am really stuck.

0 Answers