Exporting an OpenGL ES video

Time: 2012-04-15 22:31:03

Tags: xcode ipad opengl-es screen-grab

It's great that Xcode can capture OpenGL ES frames from the iPad! I'd like to extend this and capture an entire OpenGL ES movie of my application. Is there a way to do that? If it isn't possible with Xcode, how can I do it without too much effort and without big changes to my code? Thank you very much!

1 Answer:

Answer 0 (score: 2)

I use a very simple technique that only takes a few lines of code.

You can capture each OpenGL frame into a UIImage with the following code:

- (UIImage*)captureScreen {

    NSInteger dataLength = framebufferWidth * framebufferHeight * 4;

    // Allocate array.
    GLuint *buffer = (GLuint *) malloc(dataLength);
    GLuint *resultsBuffer = (GLuint *)malloc(dataLength);
    // Read data
    glReadPixels(0, 0, framebufferWidth, framebufferHeight, GL_RGBA, GL_UNSIGNED_BYTE, buffer);

    // Flip vertically (glReadPixels returns rows bottom-to-top)
    for(int y = 0; y < framebufferHeight; y++) {
        for(int x = 0; x < framebufferWidth; x++) {
            resultsBuffer[x + y * framebufferWidth] = buffer[x + (framebufferHeight - 1 - y) * framebufferWidth];
        }
    }

    free(buffer);

    // make data provider with data.
    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, resultsBuffer, dataLength, releaseScreenshotData);

    // prep the ingredients
    const int bitsPerComponent = 8;
    const int bitsPerPixel = 4 * bitsPerComponent;
    const int bytesPerRow = 4 * framebufferWidth;
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo bitmapInfo = kCGImageAlphaNoneSkipLast | kCGBitmapByteOrderDefault; // RGBA bytes from glReadPixels; the alpha byte is ignored
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

    // make the cgimage
    CGImageRef imageRef = CGImageCreate(framebufferWidth, framebufferHeight, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent);
    CGColorSpaceRelease(colorSpaceRef);
    CGDataProviderRelease(provider);

    // then make the UIImage from that
    UIImage *image = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    return image;
}
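
The CGDataProviderCreateWithData call above hands ownership of resultsBuffer to the data provider through a releaseScreenshotData callback, which is referenced but not shown in the answer. A minimal sketch of that callback, matching the standard CGDataProviderReleaseDataCallback signature, could look like this:

void releaseScreenshotData(void *info, const void *data, size_t size) {
    // Free the pixel buffer (resultsBuffer) once Core Graphics no longer needs it.
    free((void *)data);
}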

Then you capture each frame in your main loop:

- (void)onTimer {

    // Compute and render new frame
    [self update];

    // Recording
    if (recordingMode == RecordingModeMovie) {

        recordingFrameNum++;

        // Save frame
        UIImage *image = [self captureScreen];
        NSString *fileName = [NSString stringWithFormat:@"%d.jpg", (int)recordingFrameNum];
        [UIImageJPEGRepresentation(image, 1.0) writeToFile:[basePath stringByAppendingPathComponent:fileName] atomically:NO];
    }
}

In the end you'll have a large number of JPEG files, which can easily be turned into a movie with Time Lapse Assembler.

If you want a nice 30 FPS movie, hard-code your simulation time step to 1/30.0 of a second per frame.
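
A minimal sketch of such a fixed time step, assuming a hypothetical stepSimulationBy: helper that advances your simulation (the method and property names below are illustrative, not from the answer):

static const NSTimeInterval kFrameDuration = 1.0 / 30.0;

- (void)update {
    if (recordingMode == RecordingModeMovie) {
        // While recording, advance the simulation by exactly one movie frame,
        // regardless of how long the real frame took (writing JPEGs is slow).
        [self stepSimulationBy:kFrameDuration];
    } else {
        // Normal interactive mode: advance by the real elapsed time.
        [self stepSimulationBy:[self timeSinceLastFrame]];
    }
    [self render];
}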