2012-02-27 53 views
10

I'm having a hard time getting a UIImage snapshot of an Isgl3d-managed view. No matter what I try, I just end up with a black square. How do I capture Isgl3d output as an image?

I have a working camera view and a 3D model in my view. I've tried grabbing the image using both the buffer approach and a regular screen capture, but nothing has worked.

Does anyone have source code where they successfully take a picture of an Isgl3d view?

Answers

5

Here is Apple's official explanation and code for snapshotting a GL view to a UIImage (accounting for the Retina display, flipped coordinates, etc.), which I've been using successfully. It's not iSGL3D-specific, of course, but as long as you can get the right context and framebuffer bound, it should do the right thing. (Per the notes on that page, the snapshot must be taken before -presentRenderbuffer: is called so that the rendered content is still valid.)

https://developer.apple.com/library/ios/#qa/qa1704/_index.html

I have only a rough familiarity with the iSGL3D library, and it doesn't look like there are obvious hooks that let you render the scene without presenting it (or render it to an offscreen buffer first). The place you may need to step in is the -finalizeRender method of the Isgl3dGLContext subclass you're using, just before the -presentRenderbuffer call. The context is an internal framework class here, so you'd probably need to change the library a bit to set up (say) a delegate from the context back out to the view and the director, ultimately asking your app to take any action it wants before the "present" call. At that point you can run your screenshot code if you like, or do nothing if you don't.

3

Is this what you're looking for?

This takes a screenshot from the current context and framebuffer and saves it to the photo album.

If you don't want to save to the album, just grab the final UIImage instead.

Also remember to call it after you've finished drawing, but before swapping buffers.

Additionally, if you're using MSAA, it must be called after glResolveMultisampleFramebufferAPPLE and after the new buffer is bound.

#ifdef AUTOSCREENSHOT 

// callback for CGDataProviderCreateWithData 
void releaseData(void *info, const void *data, size_t dataSize) { 
    free((void*)data); 
} 

// callback for UIImageWriteToSavedPhotosAlbum 
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo { 
    NSLog(@"Save finished"); 
    [image release]; 
} 

-(void)saveCurrentScreenToPhotoAlbum { 
    // screenSize (in points) and retina (scale factor) are ivars of this class
    int height = (int)screenSize.y * retina; 
    int width = (int)screenSize.x * retina; 

    NSInteger myDataLength = width * height * 4; 
    GLubyte *buffer = (GLubyte *) malloc(myDataLength); 
    GLubyte *buffer2 = (GLubyte *) malloc(myDataLength); 
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, buffer); 

    // glReadPixels returns rows bottom-up; flip them for CGImage (top-down)
    for (int y = 0; y < height; y++) { 
        for (int x = 0; x < width * 4; x++) { 
            buffer2[(height - 1 - y) * width * 4 + x] = buffer[y * 4 * width + x]; 
        } 
    } 
    free(buffer); 

    CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer2, myDataLength, releaseData); 
    int bitsPerComponent = 8; 
    int bitsPerPixel = 32; 
    int bytesPerRow = 4 * width; 
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB(); 
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault; 
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault; 
    CGImageRef imageRef = CGImageCreate(width, height, bitsPerComponent, bitsPerPixel, bytesPerRow, colorSpaceRef, bitmapInfo, provider, NULL, NO, renderingIntent); 

    CGColorSpaceRelease(colorSpaceRef); 
    CGDataProviderRelease(provider); 

    UIImage *image = [[UIImage alloc] initWithCGImage:imageRef]; 
    CGImageRelease(imageRef); 

    UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), nil); 
} 

#endif 

I use this code to save timed screenshots while playtesting, so I have good material ready when it's time to submit to the App Store.

3

I'm successfully using this snippet in one of my apps to take OpenGL screenshots.

enum { 
    red, 
    green, 
    blue, 
    alpha 
}; 

- (UIImage *)glToUIImage { 
    CGSize glSize = self.glView.bounds.size; 
    NSInteger bufDataLen = glSize.width * glSize.height * 4; 

    // Allocate array and read pixels into it. 
    GLubyte *buffer = (GLubyte *)malloc(bufDataLen); 
    glReadPixels(0, 0, glSize.width, glSize.height, GL_RGBA, GL_UNSIGNED_BYTE, buffer); 

    // We need to flip the image 
    NSUInteger maxRow = (NSInteger)glSize.height - 1; 
    NSUInteger bytesPerRow = (NSInteger)glSize.width * 4; 

    GLubyte *buffer2 = (GLubyte *)malloc(bufDataLen); 
    for (int y = maxRow; y >= 0; y--) { 
        for (int x = 0; x < bytesPerRow; x += 4) { 
            NSUInteger c0 = y * bytesPerRow + x; 
            NSUInteger c1 = (maxRow - y) * bytesPerRow + x; 
            buffer2[c0+red] = buffer[c1+red]; 
            buffer2[c0+green] = buffer[c1+green]; 
            buffer2[c0+blue] = buffer[c1+blue]; 
            buffer2[c0+alpha] = buffer[c1+alpha]; 
        } 
    } 
    free(buffer); 

    // Make data provider with data 
    CFDataRef imageData = CFDataCreate(NULL, buffer2, bufDataLen); 
    free(buffer2); 

    CGDataProviderRef provider = CGDataProviderCreateWithCFData(imageData); 
    CFRelease(imageData); 

    // Bitmap format 
    int bitsPerComponent = 8; 
    int bitsPerPixel = 32; 
    CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB(); 
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast; 
    CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault; 

    // Create the CGImage 
    CGImageRef imageRef = CGImageCreate(glSize.width, 
             glSize.height, 
             bitsPerComponent, 
             bitsPerPixel, 
             bytesPerRow, 
             colorSpaceRef, 
             bitmapInfo, 
             provider, 
             NULL, 
             NO, 
             renderingIntent); 

    // Clean up 
    CGColorSpaceRelease(colorSpaceRef); 
    CGDataProviderRelease(provider); 

    // Convert to UIImage 
    UIImage *image = [[UIImage alloc] initWithCGImage:imageRef]; 
    CGImageRelease(imageRef); 

    return [image autorelease]; 
} 

Make sure the framebuffer is bound before doing this, like so:

glBindFramebufferOES(GL_FRAMEBUFFER_OES, myFrameBuffer); 
glViewport(0, 0, myBackingWidth, myBackingHeight); 

And call -glToUIImage before presenting the framebuffer!

For more information, Apple provides sample code for taking screenshots from OpenGL.


Looks useful, but where do your red, green, and blue values come from? – 2012-03-06 15:19:09


It's just an enum naming the byte offsets (0, 1, 2, 3); I forgot to include it here. Snippet updated. – 2012-03-06 16:28:06

2

I came up with this possible solution. You have to modify the isgl3d library a bit.

The steps are:

1.

Create a delegate for Isgl3dGLContext1:

In Isgl3dGLContext1.h:

@protocol ScreenShooterDelegate; 

#import <OpenGLES/ES1/gl.h> 
#import <OpenGLES/ES1/glext.h> 
#import "Isgl3dGLContext.h" 

@interface Isgl3dGLContext1 : Isgl3dGLContext { 

    NSObject<ScreenShooterDelegate>* __unsafe_unretained delegate; 

    GLuint _colorRenderBuffer; 
@private 
    EAGLContext * _context; 

    // The OpenGL names for the framebuffer and renderbuffer used to render to this view 
    GLuint _defaultFrameBuffer; 


    GLuint _depthAndStencilRenderBuffer; 
    GLuint _depthRenderBuffer; 
    GLuint _stencilRenderBuffer; 

    // OpenGL MSAA buffers 
    GLuint _msaaFrameBuffer; 
    GLuint _msaaColorRenderBuffer; 

    GLuint _msaaDepthAndStencilRenderBuffer; 
    GLuint _msaaDepthRenderBuffer; 
    GLuint _msaaStencilRenderBuffer; 
} 

- (id) initWithLayer:(CAEAGLLayer *) layer; 
@property (assign) NSObject<ScreenShooterDelegate>* delegate; 
@property BOOL takePicture; 
@property GLuint colorRenderBuffer; 

@end 

@protocol ScreenShooterDelegate 


@optional 

- (void)takePicture; 

@end 

2.

Add this code to Isgl3dGLContext1.m:

@synthesize takePicture; 
@synthesize colorRenderBuffer = _colorRenderBuffer; 

Then, before the line [_context presentRenderbuffer:GL_RENDERBUFFER_OES]; in -(void)finalizeRender:, add:

if (takePicture) { 
    takePicture = NO; 
    if ([delegate respondsToSelector:@selector(takePicture)]) { 
        [delegate takePicture]; 
    } 
} 

3. Put this code in the class that should take the screenshot:

In Class.h, add <ScreenShooterDelegate>.

In your method in Class.m:

[Isgl3dDirector sharedInstance].antiAliasingEnabled = NO; 

Photos3DAppDelegate *appDelegate = (Photos3DAppDelegate *)[[UIApplication sharedApplication] delegate]; 
[appDelegate.inOutSceneView showSphere]; 

Isgl3dEAGLView* eaglview=(Isgl3dEAGLView*)[[Isgl3dDirector sharedInstance] openGLView]; 
Isgl3dGLContext1 * _glContext=(Isgl3dGLContext1*)[eaglview glContext]; 
_glContext.delegate=self; 
_glContext.takePicture=YES; 

In the method -(void)takePicture {}, put Apple's code, and at the end of the method add [Isgl3dDirector sharedInstance].antiAliasingEnabled = YES; (if you use it).

//https://developer.apple.com/library/ios/#qa/qa1704/_index.html 

-(void)takePicture{ 


NSLog(@"Creating Foto"); 

GLint backingWidth, backingHeight; 

Isgl3dEAGLView* eaglview=(Isgl3dEAGLView*)[[Isgl3dDirector sharedInstance] openGLView]; 
//Isgl3dGLContext1 * _glContext=(Isgl3dGLContext1*)[eaglview glContext]; 
//glBindRenderbufferOES(GL_RENDERBUFFER_OES, _glContext.colorRenderBuffer); 

glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth); 
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight); 

NSInteger x = 0, y = 0, width = backingWidth, height = backingHeight; 
NSInteger dataLength = width * height * 4; 
GLubyte *data = (GLubyte*)malloc(dataLength * sizeof(GLubyte)); 

// Read pixel data from the framebuffer 
glPixelStorei(GL_PACK_ALIGNMENT, 4); 
glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data); 

// Create a CGImage with the pixel data 
// If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel 
// otherwise, use kCGImageAlphaPremultipliedLast 
CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL); 
CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB(); 
CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace, kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast, 
           ref, NULL, true, kCGRenderingIntentDefault); 

// OpenGL ES measures data in PIXELS 
// Create a graphics context with the target size measured in POINTS 
NSInteger widthInPoints, heightInPoints; 
if (NULL != UIGraphicsBeginImageContextWithOptions) { 
    // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration 
    // Set the scale parameter to your OpenGL ES view's contentScaleFactor 
    // so that you get a high-resolution snapshot when its value is greater than 1.0 
    CGFloat scale = eaglview.contentScaleFactor; 
    widthInPoints = width/scale; 
    heightInPoints = height/scale; 
    UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale); 
} 
else { 
    // On iOS prior to 4, fall back to use UIGraphicsBeginImageContext 
    widthInPoints = width; 
    heightInPoints = height; 
    UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints)); 
} 

CGContextRef cgcontext = UIGraphicsGetCurrentContext(); 

// UIKit coordinate system is upside down to GL/Quartz coordinate system 
// Flip the CGImage by rendering it to the flipped bitmap context 
// The size of the destination area is measured in POINTS 
CGContextSetBlendMode(cgcontext, kCGBlendModeCopy); 
CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref); 

// Retrieve the UIImage from the current context 
UIImage *image = UIGraphicsGetImageFromCurrentImageContext(); 

UIGraphicsEndImageContext(); 

// Clean up 
free(data); 
CFRelease(ref); 
CFRelease(colorspace); 
CGImageRelease(iref); 

UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil); 

[Isgl3dDirector sharedInstance].antiAliasingEnabled = YES; 
} 

Note: For me it worked just by commenting out glBindRenderbufferOES(GL_RENDERBUFFER_OES, _colorRenderbuffer);. In your case, you can do these same steps with Isgl3dGLContext2 instead of Isgl3dGLContext1.