OpenGL ES to video in iOS (rendering to a texture with the iOS 5 texture cache)

You know Apple's sample code for the CameraRipple effect? Well, I'm trying to record the camera output to a file after OpenGL has done all the cool water-ripple effects to it.
I've done it with glReadPixels, where I read all the pixels into a void * buffer, create a CVPixelBufferRef, and append it to an AVAssetWriterInputPixelBufferAdaptor, but it's too slow because glReadPixels takes a lot of time. I found out that using an FBO and a texture cache you can do the same thing, but faster. Here is my code in the drawInRect method that Apple uses:
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)_context, NULL, &coreVideoTextureCashe);
if (err)
{
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
}

CFDictionaryRef empty; // empty value for attr value.
CFMutableDictionaryRef attrs2;
empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                           NULL,
                           NULL,
                           0,
                           &kCFTypeDictionaryKeyCallBacks,
                           &kCFTypeDictionaryValueCallBacks);
attrs2 = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                   1,
                                   &kCFTypeDictionaryKeyCallBacks,
                                   &kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs2,
                     kCVPixelBufferIOSurfacePropertiesKey,
                     empty);

//CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);
CVPixelBufferRef pixiel_bufer4e = NULL;
CVPixelBufferCreate(kCFAllocatorDefault,
                    (int)_screenWidth,
                    (int)_screenHeight,
                    kCVPixelFormatType_32BGRA,
                    attrs2,
                    &pixiel_bufer4e);

CVOpenGLESTextureRef renderTexture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                             coreVideoTextureCashe, pixiel_bufer4e,
                                             NULL, // texture attributes
                                             GL_TEXTURE_2D,
                                             GL_RGBA, // opengl format
                                             (int)_screenWidth,
                                             (int)_screenHeight,
                                             GL_BGRA, // native iOS format
                                             GL_UNSIGNED_BYTE,
                                             0,
                                             &renderTexture);
CFRelease(attrs2);
CFRelease(empty);

glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);

CVPixelBufferLockBaseAddress(pixiel_bufer4e, 0);
if ([pixelAdapter appendPixelBuffer:pixiel_bufer4e withPresentationTime:currentTime]) {
    float result = currentTime.value;
    NSLog(@"appended frame, current time: %f", result);
    currentTime = CMTimeAdd(currentTime, frameLength);
}
CVPixelBufferUnlockBaseAddress(pixiel_bufer4e, 0);
CVPixelBufferRelease(pixiel_bufer4e);
CFRelease(renderTexture);
CFRelease(coreVideoTextureCashe);
It records a video, and it's quite fast, but the video is just black. I think the textureCacheRef is not correct, or I'm filling it in wrong.
As an update, here is another way I've tried. I must be missing something. In viewDidLoad, after I set up the OpenGL context, I do this:
CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)_context, NULL, &coreVideoTextureCashe);
if (err)
{
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err);
}

// creates the pixel buffer
pixel_buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(NULL, [pixelAdapter pixelBufferPool], &pixel_buffer);

CVOpenGLESTextureRef renderTexture;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, coreVideoTextureCashe, pixel_buffer,
                                             NULL, // texture attributes
                                             GL_TEXTURE_2D,
                                             GL_RGBA, // opengl format
                                             (int)screenWidth,
                                             (int)screenHeight,
                                             GL_BGRA, // native iOS format
                                             GL_UNSIGNED_BYTE,
                                             0,
                                             &renderTexture);

glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);
Then in drawInRect: I do this:
if (isRecording && writerInput.readyForMoreMediaData) {
    CVPixelBufferLockBaseAddress(pixel_buffer, 0);
    if ([pixelAdapter appendPixelBuffer:pixel_buffer withPresentationTime:currentTime]) {
        currentTime = CMTimeAdd(currentTime, frameLength);
    }
    CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);
    CVPixelBufferRelease(pixel_buffer);
}
However, it crashes with a bad_access on renderTexture, which is not nil but 0x000000001.
UPDATE
Below is the code with which I actually managed to pull out a video file, but there are some green and red flashes in it. I use the BGRA pixelFormatType.
Here I create the texture cache:
CVReturn err2 = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)_context, NULL, &coreVideoTextureCashe);
if (err2)
{
    NSLog(@"Error at CVOpenGLESTextureCacheCreate %d", err2);
    return;
}
Then in drawInRect I call this:
if (isRecording && writerInput.readyForMoreMediaData) {
    [self cleanUpTextures];

    CFDictionaryRef empty; // empty value for attr value.
    CFMutableDictionaryRef attrs2;
    empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                               NULL,
                               NULL,
                               0,
                               &kCFTypeDictionaryKeyCallBacks,
                               &kCFTypeDictionaryValueCallBacks);
    attrs2 = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                       1,
                                       &kCFTypeDictionaryKeyCallBacks,
                                       &kCFTypeDictionaryValueCallBacks);
    CFDictionarySetValue(attrs2,
                         kCVPixelBufferIOSurfacePropertiesKey,
                         empty);

    //CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);
    CVPixelBufferRef pixiel_bufer4e = NULL;
    CVPixelBufferCreate(kCFAllocatorDefault,
                        (int)_screenWidth,
                        (int)_screenHeight,
                        kCVPixelFormatType_32BGRA,
                        attrs2,
                        &pixiel_bufer4e);

    CVOpenGLESTextureRef renderTexture;
    CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                 coreVideoTextureCashe, pixiel_bufer4e,
                                                 NULL, // texture attributes
                                                 GL_TEXTURE_2D,
                                                 GL_RGBA, // opengl format
                                                 (int)_screenWidth,
                                                 (int)_screenHeight,
                                                 GL_BGRA, // native iOS format
                                                 GL_UNSIGNED_BYTE,
                                                 0,
                                                 &renderTexture);
    CFRelease(attrs2);
    CFRelease(empty);

    glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);

    CVPixelBufferLockBaseAddress(pixiel_bufer4e, 0);
    if ([pixelAdapter appendPixelBuffer:pixiel_bufer4e withPresentationTime:currentTime]) {
        float result = currentTime.value;
        NSLog(@"appended frame, current time: %f", result);
        currentTime = CMTimeAdd(currentTime, frameLength);
    }
    CVPixelBufferUnlockBaseAddress(pixiel_bufer4e, 0);
    CVPixelBufferRelease(pixiel_bufer4e);
    CFRelease(renderTexture);
    // CFRelease(coreVideoTextureCashe);
}
I know there is a lot I could optimize here by not doing all of this every frame, but I wanted to get it working first. In cleanUpTextures I flush the texture cache with:
CVOpenGLESTextureCacheFlush(coreVideoTextureCashe, 0);
Something is probably wrong with the RGBA stuff, or I don't know, but it seems it's still getting the wrong kind of cache.
Please provide a screenshot of the memory warning or the crash you're receiving. – Dayan 2012-07-30 12:52:16
When I press the record button, it calls startWriting on my AssetWriter, freezes for a second, and a memory warning appears. – user1562826 2012-07-30 13:10:48
Were you able to record video with this method while also displaying the content on screen at the same time? – 2013-09-10 06:52:58