I want to do some image processing on the iPhone, based on Apple's GLImageProcessing sample. Writing to and then reading from an offscreen FBO works on the simulator, but not on the device?

Ultimately, what I want to do is load an image into a texture, perform one or more of the operations from the sample code (hue, saturation, brightness, etc.), and then read the resulting image back out for later processing/saving. For the most part this never needs to touch the screen, so I figured an FBO might be the way to go. To start with, I've put together a small example that creates an offscreen FBO, draws into it, and then reads the data back out as an image. I was thrilled when it worked perfectly in the simulator, then deflated when I realized I was getting nothing but a black screen on the actual device.

Disclaimer: my OpenGL is old enough that I've had a fair learning curve getting to OpenGL ES, and I've never done much with textures. I do know that the device handles framebuffer access differently from the simulator (a mandatory offscreen FBO plus a swap on the device, direct framebuffer access on the simulator), but I haven't been able to find what I'm doing wrong, even after fairly extensive searching.

Any suggestions?
    // set up the offscreen FBO sizes
    int renderBufferWidth = 1280;
    int renderBufferHeight = 720;

    // now the FBO
    GLuint fbo = 0;
    glGenFramebuffersOES(1, &fbo);
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, fbo);

    GLuint renderBuffer = 0;
    glGenRenderbuffersOES(1, &renderBuffer);
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, renderBuffer);
    glRenderbufferStorageOES(GL_RENDERBUFFER_OES,
                             GL_RGBA8_OES,
                             renderBufferWidth,
                             renderBufferHeight);
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES,
                                 GL_COLOR_ATTACHMENT0_OES,
                                 GL_RENDERBUFFER_OES,
                                 renderBuffer);

    GLenum status = glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES);
    if (status != GL_FRAMEBUFFER_COMPLETE_OES) {
        NSLog(@"Problem with OpenGL framebuffer after specifying color render buffer: %x", status);
    }

    // throw in a test drawing
    glClearColor(0.5f, 0.5f, 0.5f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    static const GLfloat triangleVertices[] = {
        -0.5f, -0.33f,
         0.5f, -0.33f,
        -0.5f,  0.33f
    };
    static const GLfloat triangleColors[] = {
        1.0, 0.0, 0.0, 0.5,
        0.0, 1.0, 0.0, 0.5,
        0.0, 0.0, 1.0, 0.5
    };

    GLint backingWidth = 320;
    GLint backingHeight = 480;

    NSLog(@"setting up view/model matrices");
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glVertexPointer(2, GL_FLOAT, 0, triangleVertices);
    glEnableClientState(GL_VERTEX_ARRAY);
    glColorPointer(4, GL_FLOAT, 0, triangleColors);
    glEnableClientState(GL_COLOR_ARRAY);

    // draw the triangle
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 3);

    // Extract the resulting rendering as an image
    int samplesPerPixel = 4; // R, G, B and A
    int rowBytes = samplesPerPixel * renderBufferWidth;
    char *bufferData = (char *)malloc(rowBytes * renderBufferHeight);
    if (bufferData == NULL) {
        NSLog(@"Unable to allocate buffer for image extraction.");
    }

    // works on simulator with GL_BGRA, but not on device
    glReadPixels(0, 0, renderBufferWidth,
                 renderBufferHeight,
                 GL_BGRA,
                 GL_UNSIGNED_BYTE, bufferData);
    NSLog(@"reading pixels from framebuffer");

    // Flip it vertically - images read from OpenGL buffers are upside-down
    char *flippedBuffer = (char *)malloc(rowBytes * renderBufferHeight);
    if (flippedBuffer == NULL) {
        NSLog(@"Unable to allocate flipped buffer for corrected image.");
    }
    for (int i = 0; i < renderBufferHeight; i++) {
        bcopy(bufferData + i * rowBytes,
              flippedBuffer + (renderBufferHeight - i - 1) * rowBytes,
              rowBytes);
    }

    // unbind my FBO
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, 0);

    // Output the image to a file
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    int bitsPerComponent = 8;
    CGBitmapInfo bitmapInfo = kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Host;
    CGContextRef contextRef = CGBitmapContextCreate(flippedBuffer,
                                                    renderBufferWidth,
                                                    renderBufferHeight,
                                                    bitsPerComponent,
                                                    rowBytes, colorSpace, bitmapInfo);
    if (contextRef == nil) {
        NSLog(@"Unable to create CGContextRef.");
    }

    CGImageRef imageRef = CGBitmapContextCreateImage(contextRef);
    if (imageRef == nil) {
        NSLog(@"Unable to create CGImageRef.");
    } else {
        if (savedImage == NO) {
            UIImage *myImage = [UIImage imageWithCGImage:imageRef];
            UIImageWriteToSavedPhotosAlbum(myImage, nil, nil, nil);
            savedImage = YES;
        }
    }
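The vertical flip in the listing above is independent of OpenGL, so it can be pulled out into a small plain-C helper and tested on its own (the name `flip_rows` is mine, not from the sample):

```c
#include <string.h>

/* Copy src into dst with the row order reversed, so the top row of src
   becomes the bottom row of dst.  rowBytes = bytes per row, height =
   number of rows.  src and dst must not overlap. */
void flip_rows(const unsigned char *src, unsigned char *dst,
               int rowBytes, int height)
{
    for (int i = 0; i < height; i++) {
        memcpy(dst + (height - 1 - i) * rowBytes,
               src + i * rowBytes,
               rowBytes);
    }
}
```

This is equivalent to the `bcopy` loop above; `memcpy` is the portable standard-library spelling of the same copy.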
EDIT:

The answer, of course, is that the pixel format should be GL_RGBA, not GL_BGRA:

    // GL_BGRA worked on the simulator, but the device requires GL_RGBA
    glReadPixels(0, 0, renderBufferWidth,
                 renderBufferHeight,
                 GL_RGBA,
                 GL_UNSIGNED_BYTE, bufferData);
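One caveat with this fix: the `kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Host` combination in the listing describes BGRA byte order on little-endian iOS hardware, so after switching the read format to GL_RGBA the red and blue channels may come out swapped in the saved image. One option is a CPU-side channel swap after `glReadPixels` (a sketch only; `swap_red_blue` is a hypothetical helper, not part of the sample):

```c
/* In-place R<->B channel swap for 4-bytes-per-pixel data,
   converting RGBA to BGRA (or BGRA back to RGBA). */
void swap_red_blue(unsigned char *pixels, int pixelCount)
{
    for (int i = 0; i < pixelCount; i++) {
        unsigned char tmp = pixels[4 * i + 0];
        pixels[4 * i + 0] = pixels[4 * i + 2];
        pixels[4 * i + 2] = tmp;
    }
}
```

Alternatively, OpenGL ES lets you query `GL_IMPLEMENTATION_COLOR_READ_FORMAT_OES` and `GL_IMPLEMENTATION_COLOR_READ_TYPE_OES` with `glGetIntegerv` to discover the one extra format/type pair `glReadPixels` supports beyond the always-available GL_RGBA / GL_UNSIGNED_BYTE, rather than hard-coding a guess.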
Did you make any progress with this? I'm looking for something similar: http://stackoverflow.com/questions/4412587/how-can-i-access-the-raw-pixel-data-of-an-opengl-es-2-off-screen-render-buffer – akaru 2010-12-10 19:39:46

Andrew, does your edit mean that the change solved your problem? If so, you should add it as an answer and accept it. But what interests me most: did you ever finish your image-processing library based on GLImageProcessing? – Palimondo 2011-02-23 02:06:20