I'm trying to create a video from images using OpenGL ES and ffmpeg, but on the iPad (4.3) I get a crash in glReadPixels:
-(NSData *) glToUIImage {
    int numberOfComponents = NUMBER_OF_COMPONENTS; // 4
    int width = PICTURE_WIDTH;
    int height = PICTURE_HEIGHT;
    NSInteger myDataLength = width * height * numberOfComponents;
    NSMutableData *buffer = [NSMutableData dataWithLength:myDataLength];
    [self checkForGLError];
    GLenum type = NUMBER_OF_COMPONENTS == 3 ? GL_RGB : GL_RGBA; // RGBA
    glReadPixels(0, 0, width, height, type, GL_UNSIGNED_BYTE, [buffer mutableBytes]); // EXC_BAD_ACCESS here
    return buffer;
}
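For context, the raw RGBA bytes read above get fed to ffmpeg, but they can also be wrapped into a UIImage (which is what the method name suggests). Here is a minimal sketch of that wrapping using standard CoreGraphics calls; the helper name and the assumption of tightly packed 4-byte RGBA pixels are mine, not from the original post:
// Hypothetical helper (not from the original post): wraps RGBA8 pixel data
// from glReadPixels into a UIImage. Note that glReadPixels returns rows
// bottom-up, so the result is vertically flipped unless the caller compensates.
static UIImage *UIImageFromRGBAData(NSData *data, int width, int height) {
    CGDataProviderRef provider = CGDataProviderCreateWithCFData((CFDataRef)data);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGImageRef imageRef = CGImageCreate(width, height,
                                        8,          // bits per component
                                        32,         // bits per pixel (RGBA)
                                        4 * width,  // bytes per row
                                        colorSpace,
                                        kCGBitmapByteOrderDefault | kCGImageAlphaLast,
                                        provider, NULL, NO, kCGRenderingIntentDefault);
    UIImage *image = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    CGColorSpaceRelease(colorSpace);
    CGDataProviderRelease(provider);
    return image;
}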
It works on the iPhone 4 (4.3) and the iPod Touch, but fails on the iPhone 3G (3.0) and the iPad (4.3). Can you help me figure out what's wrong?
Also, on the iPhone 3G (3.0) and the iPad (4.3) the resulting video has problems: the first 5-20 frames contain garbage. Maybe an optimization issue, or something architecture-specific? One quick check is sketched right below.
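One thing worth ruling out for the garbage frames (my own suggestion, not part of the original question) is that the pixels are read back before the GPU has finished rendering the frame. Draining the pipeline before the read makes that easy to test:
// Diagnostic sketch (assumption, not from the original post): glFinish()
// blocks until all submitted GL commands have completed. If the first frames
// come out clean with this in place, the readback was racing the renderer.
glFinish();
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, [buffer mutableBytes]);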
EDIT: stack trace:
#0 0x33be3964 in void BlockNxN<64ul, 16ul, 1, BLOCK_CONVERTER_NULL_32>(unsigned long, int, int, unsigned long, int, int, unsigned int, unsigned int, unsigned int, unsigned int) ()
#1 0x33be1c76 in glrBIFDetile ()
#2 0x33b586b2 in sgxGetImage(SGXImageReadParams const*) ()
#3 0x33b50d38 in gldReadPixels ()
#4 0x31813e16 in glReadPixels_Exec ()
#5 0x31e3c518 in glReadPixels ()
Answer 0 (score: 2)
I figured it out!
I had been wrestling with this problem for about two weeks.
You have to call
[(EAGLView *)eagleView presentFramebuffer];
before glReadPixels(). In other words, the colorRenderbuffer must be bound before the pixels are read. The final method:
-(NSData *) glToUIImage {
    int numberOfComponents = NUMBER_OF_COMPONENTS;
    int width = PICTURE_WIDTH;
    int height = PICTURE_HEIGHT;
    NSInteger myDataLength = width * height * numberOfComponents;
    NSMutableData *buffer = [NSMutableData dataWithLength:myDataLength];
    // Bind the color renderbuffer so glReadPixels has a valid source to read from
    glBindRenderbuffer(GL_RENDERBUFFER_OES, [((EAGLView *)eagleView) colorRenderbuffer]);
    [self checkForGLError];
    glPixelStorei(GL_PACK_ALIGNMENT, 4); // force 4-byte row alignment
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, [buffer mutableBytes]);
    return buffer;
}
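Why does this work? In the standard Apple EAGLView template that this code appears to follow, presentFramebuffer binds the color renderbuffer before presenting it, which is exactly the binding glReadPixels needs. A rough sketch of that method, with ivar and method names assumed from the template rather than taken from the original post:
- (BOOL)presentFramebuffer
{
    BOOL success = NO;
    if (context) {
        [EAGLContext setCurrentContext:context];
        // Binding the color renderbuffer here is the side effect that makes a
        // subsequent glReadPixels read from the right buffer.
        glBindRenderbuffer(GL_RENDERBUFFER_OES, colorRenderbuffer);
        success = [context presentRenderbuffer:GL_RENDERBUFFER_OES];
    }
    return success;
}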