2013-06-05

glReadPixels returns zeros with multisampling

I'm writing an OpenGL application for iOS and need to take an in-app screenshot of the rendered scene. Everything works fine as long as I don't use multisampling. But when I turn multisampling on, glReadPixels no longer returns correct data (the scene itself is drawn correctly, and the graphics quality is much better with multisampling).

I have checked a bunch of similar questions on SO and elsewhere, but none of them solves my problem, because I am already doing everything they propose:

  1. I take the screenshot after the buffers have been resolved, but before the renderbuffer is presented.
  2. glReadPixels does not return an error.
  3. I even tried setting kEAGLDrawablePropertyRetainedBacking to YES and taking the screenshot after the buffer is presented; that does not work either.
  4. I am using the OpenGL ES 1.x rendering API (the context is initialized with kEAGLRenderingAPIOpenGLES1).

Basically I have no idea what could be wrong; posting a question on SO is my last resort.

Here is the relevant source code.

Creating the framebuffers:

- (BOOL)createFramebuffer 
{ 

    glGenFramebuffersOES(1, &viewFramebuffer); 
    glGenRenderbuffersOES(1, &viewRenderbuffer); 

    glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer); 
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer); 
    [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:(CAEAGLLayer*)self.layer]; 
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, viewRenderbuffer); 

    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth); 
    glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight); 

    // Multisample support 

    glGenFramebuffersOES(1, &sampleFramebuffer); 
    glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer); 

    glGenRenderbuffersOES(1, &sampleColorRenderbuffer); 
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleColorRenderbuffer); 
    glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_RGBA8_OES, backingWidth, backingHeight); 
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES, GL_RENDERBUFFER_OES, sampleColorRenderbuffer); 

    glGenRenderbuffersOES(1, &sampleDepthRenderbuffer); 
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, sampleDepthRenderbuffer); 
    glRenderbufferStorageMultisampleAPPLE(GL_RENDERBUFFER_OES, 4, GL_DEPTH_COMPONENT16_OES, backingWidth, backingHeight); 
    glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, sampleDepthRenderbuffer); 

    // End of multisample support 

    if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES) { 
        NSLog(@"failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES)); 
        return NO; 
    } 

    return YES; 
} 
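Not shown in the question, but implied by the setup above: each frame has to be rendered into sampleFramebuffer (not viewFramebuffer) for the multisampled storage to actually be used. A minimal sketch of that per-frame binding, using the variable names from the code above:

```objc
// Per-frame: draw into the multisampled FBO; it is resolved into
// viewFramebuffer afterwards (see the resolve code further down).
glBindFramebufferOES(GL_FRAMEBUFFER_OES, sampleFramebuffer);
glViewport(0, 0, backingWidth, backingHeight);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
// ... issue the scene's draw calls here ...
```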

Resolving the buffers and taking the snapshot:

    glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, viewFramebuffer); 
    glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer); 
    glResolveMultisampleFramebufferAPPLE(); 
    [self checkGlError]; 

    //glFinish(); 

    if (capture) 
     captureImage = [self snapshot:self];  

    const GLenum discards[] = { GL_COLOR_ATTACHMENT0_OES, GL_DEPTH_ATTACHMENT_OES }; 
    glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE, 2, discards); 

    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);  

    [context presentRenderbuffer:GL_RENDERBUFFER_OES];  

Snapshot method (basically copied from Apple's docs):

- (UIImage*)snapshot:(UIView*)eaglview 
{ 

    // Bind the color renderbuffer used to render the OpenGL ES view 
    // If your application only creates a single color renderbuffer which is already bound at this point, 
    // this call is redundant, but it is needed if you're dealing with multiple renderbuffers. 
    // Note, replace "_colorRenderbuffer" with the actual name of the renderbuffer object defined in your class.  
    glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer); 


    NSInteger x = 0, y = 0, width = backingWidth, height = backingHeight; 
    NSInteger dataLength = width * height * 4; 
    GLubyte *data = (GLubyte*)malloc(dataLength * sizeof(GLubyte)); 

    // Read pixel data from the framebuffer 
    glPixelStorei(GL_PACK_ALIGNMENT, 4); 
    [self checkGlError]; 
    glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data); 
    [self checkGlError]; 

    // Create a CGImage with the pixel data 
    // If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel 
    // otherwise, use kCGImageAlphaPremultipliedLast 
    CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL); 
    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB(); 
    CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace, kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast, 
           ref, NULL, true, kCGRenderingIntentDefault); 

    // OpenGL ES measures data in PIXELS 
    // Create a graphics context with the target size measured in POINTS 
    NSInteger widthInPoints, heightInPoints; 
    if (NULL != UIGraphicsBeginImageContextWithOptions) { 
     // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration 
     // Set the scale parameter to your OpenGL ES view's contentScaleFactor 
     // so that you get a high-resolution snapshot when its value is greater than 1.0 
     CGFloat scale = eaglview.contentScaleFactor; 
     widthInPoints = width/scale; 
     heightInPoints = height/scale; 
     UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale); 
    } 
    else { 
     // On iOS prior to 4, fall back to use UIGraphicsBeginImageContext 
     widthInPoints = width; 
     heightInPoints = height; 
     UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints)); 
    } 

    CGContextRef cgcontext = UIGraphicsGetCurrentContext(); 

    // UIKit coordinate system is upside down to GL/Quartz coordinate system 
    // Flip the CGImage by rendering it to the flipped bitmap context 
    // The size of the destination area is measured in POINTS 
    CGContextSetBlendMode(cgcontext, kCGBlendModeCopy); 
    CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref); 

    // Retrieve the UIImage from the current context 
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext(); 

    UIGraphicsEndImageContext(); 

    // Clean up 
    free(data); 
    CFRelease(ref); 
    CFRelease(colorspace); 
    CGImageRelease(iref); 

    return image; 
} 
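The flipped CGContextDrawImage above is what turns GL's bottom-up rows into UIKit's top-down orientation. For reference, the same flip can also be done on the raw pixel buffer before creating the CGImage; here is a plain C sketch (the helper `flip_rows_rgba` is hypothetical, not part of the original code):

```c
#include <stdlib.h>
#include <string.h>

/* Flip an RGBA pixel buffer vertically in place.
   glReadPixels returns rows bottom-up; UIKit expects them top-down. */
void flip_rows_rgba(unsigned char *pixels, int width, int height)
{
    size_t stride = (size_t)width * 4;   /* 4 bytes per RGBA pixel */
    unsigned char *tmp = malloc(stride);
    for (int y = 0; y < height / 2; y++) {
        unsigned char *top    = pixels + (size_t)y * stride;
        unsigned char *bottom = pixels + (size_t)(height - 1 - y) * stride;
        memcpy(tmp, top, stride);       /* swap row y with row (height-1-y) */
        memcpy(top, bottom, stride);
        memcpy(bottom, tmp, stride);
    }
    free(tmp);
}
```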

Answer


You resolve the multisampled buffer as usual by binding viewFramebuffer as the draw framebuffer and sampleFramebuffer as the read framebuffer and then doing a glResolveMultisampleFramebufferAPPLE. But did you remember to bind viewFramebuffer as the read framebuffer (glBindFramebuffer(GL_READ_FRAMEBUFFER, viewFramebuffer)) afterwards, before the glReadPixels? glReadPixels always reads from the currently bound read framebuffer, and if you don't change this binding after the multisample resolve, that is still the multisampled framebuffer and not the default one.
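A sketch of the corrected resolve-and-read sequence under that assumption (using the OES/APPLE names from the question's ES 1.x code):

```objc
glBindFramebufferOES(GL_DRAW_FRAMEBUFFER_APPLE, viewFramebuffer);
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, sampleFramebuffer);
glResolveMultisampleFramebufferAPPLE();

// The resolve leaves sampleFramebuffer bound for reading; rebind the
// resolved (single-sample) framebuffer before reading its pixels.
glBindFramebufferOES(GL_READ_FRAMEBUFFER_APPLE, viewFramebuffer);

if (capture)
    captureImage = [self snapshot:self];  // snapshot: calls glReadPixels
```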

I also find your glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer) calls rather irritating, since they don't really do anything meaningful here; the current renderbuffer binding is only relevant for functions that operate on renderbuffers (practically only glRenderbufferStorage). (It may be, though, that ES does something meaningful with it and that the binding is needed for [context presentRenderbuffer:GL_RENDERBUFFER_OES] to work.) Still, perhaps you assumed that this binding also controls the buffer glReadPixels reads from, but that is not the case; it always reads from the framebuffer currently bound to GL_READ_FRAMEBUFFER.


Thanks for your answer. I'll try it when I get back to my computer in about 12 hours, and if it solves my problem I'll accept your answer. I don't bind a framebuffer after the multisample resolve, so your answer makes sense; for some reason I assumed glResolveMultisampleFramebufferAPPLE would do this automatically. (I searched for documentation on this method, with no luck.) About glBindRenderbufferOES you are probably right, but all the code above is more or less copy-pasted from Apple's examples, so I just wanted to play it safe :) – Kovasandra


This answer solved my problem. Thank you very much again :) – Kovasandra