Objective-C: most efficient way to load a texture into OpenGL

Currently I load textures in iOS using Image I/O and extract their image data with Core Graphics. Then I can send the image data to OpenGL like this:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texture->width, texture->height, 0, GL_RGBA, GL_UNSIGNED_BYTE, texture->imageData);
The problem is that the Core Graphics part is really slow, and I have to set up and use Core Graphics just to extract the image data... I don't want to display the image on screen. There must be a more efficient way to extract image data on iOS...

Here is my code:
...
myTexRef = CGImageSourceCreateWithURL((__bridge CFURLRef)url, myOptions);
...
MyTexture2D* texture;
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
void *imageData = malloc(tileSize.width * tileSize.height * 4);
CGContextRef imgContext = CGBitmapContextCreate(imageData, tileSize.width, tileSize.height, 8, 4 * tileSize.width, colorSpace, kCGImageAlphaNoneSkipLast | kCGBitmapByteOrder32Big);
CGColorSpaceRelease(colorSpace);
CGContextClearRect(imgContext, CGRectMake(0, 0, tileSize.width, tileSize.height));
CGContextTranslateCTM(imgContext, 0, 0);
...
CGImageRef tiledImage = CGImageCreateWithImageInRect (imageRef, tileArea);
CGRect drawRect = CGRectMake(0, 0, tileSize.width, tileSize.height);
// *** THIS CALL IS REALLY EXPENSIVE!
CGContextDrawImage(imgContext, drawRect, tiledImage);
CGImageRelease(tiledImage);
// MyTexture2D takes ownership of imageData and is responsible for freeing it
texture = new MyTexture2D(tileSize.width, tileSize.height, imageData);
CGContextRelease(imgContext);
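For what it's worth, when the decoded image's pixel layout already matches what you upload (32-bit RGBA, tightly packed rows), you can read the pixels straight out of the `CGImage`'s data provider and skip the expensive `CGContextDrawImage` pass entirely. This is only a sketch built on the `imageRef`/`tileArea` variables from the code above, and the format check is essential — if the layout doesn't match, you still need the bitmap-context redraw:

```objc
CGImageRef tiledImage = CGImageCreateWithImageInRect(imageRef, tileArea);

// Check the decoded pixel format first; the fast path only works when it
// already matches the glTexImage2D parameters (RGBA8888, no row padding).
size_t bitsPerPixel = CGImageGetBitsPerPixel(tiledImage);   // expect 32
size_t bytesPerRow  = CGImageGetBytesPerRow(tiledImage);
size_t width        = CGImageGetWidth(tiledImage);
size_t height       = CGImageGetHeight(tiledImage);

if (bitsPerPixel == 32 && bytesPerRow == width * 4) {
    // Copy the already-decoded pixels directly from the data provider --
    // no CGBitmapContext and no CGContextDrawImage needed.
    CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(tiledImage));
    const UInt8 *bytes = CFDataGetBytePtr(pixelData);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
                 (GLsizei)width, (GLsizei)height,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, bytes);
    CFRelease(pixelData);
}
CGImageRelease(tiledImage);
```

Alternatively, if you don't need the tiling step, `GLKTextureLoader` (GLKit, iOS 5+) loads a file or URL straight into a GL texture name for you.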
Your code seems to do more than just load image data (tiling?). If you want help optimizing it, you should post the complete code. – 2013-09-02 20:39:08