2012-07-30

OpenGL ES to video in iOS (rendering to a texture with the iOS 5 texture cache)

Do you know the sample code of Apple's CameraRipple effect? Well, I am trying to record the camera output to a file after OpenGL has done all the cool water effects.

I have already done it with glReadPixels: I read all the pixels into a void* buffer, create a CVPixelBufferRef, and append it to an AVAssetWriterInputPixelBufferAdaptor, but it is too slow because glReadPixels takes a lot of time. I found out that using an FBO and a texture cache you can do the same thing, but faster. Here is my code in the drawInRect method that Apple uses:
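For reference, the slow glReadPixels path described above looks roughly like this (a minimal sketch of my approach, not exact code; it assumes `pixelAdapter`, `currentTime`, `frameLength`, and the screen dimensions are set up elsewhere, and that the EXT_read_format_bgra extension is available for the GL_BGRA readback):

```objc
// Slow baseline: copy the rendered frame back to the CPU every frame.
GLint width = (GLint)_screenWidth, height = (GLint)_screenHeight;
void *pixelData = malloc(width * height * 4);
glReadPixels(0, 0, width, height, GL_BGRA, GL_UNSIGNED_BYTE, pixelData);

// Wrap the raw bytes in a pixel buffer (no copy) and hand it to the writer.
CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreateWithBytes(kCFAllocatorDefault, width, height,
                             kCVPixelFormatType_32BGRA, pixelData,
                             width * 4, NULL, NULL, NULL, &pixelBuffer);

if ([pixelAdapter appendPixelBuffer:pixelBuffer
               withPresentationTime:currentTime]) {
    currentTime = CMTimeAdd(currentTime, frameLength);
}

CVPixelBufferRelease(pixelBuffer);
free(pixelData); // safe only after the buffer is released, since it wraps these bytes
```

The glReadPixels call stalls the GPU pipeline and the malloc/copy happens on every frame, which is why this path is so slow.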

CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)_context, NULL, &coreVideoTextureCashe); 
if (err) 
{ 
    NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err); 
} 


CFDictionaryRef empty; // empty value for attr value. 
CFMutableDictionaryRef attrs2; 
empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary 
          NULL, 
          NULL, 
          0, 
          &kCFTypeDictionaryKeyCallBacks, 
          &kCFTypeDictionaryValueCallBacks); 
attrs2 = CFDictionaryCreateMutable(kCFAllocatorDefault, 
            1, 
            &kCFTypeDictionaryKeyCallBacks, 
            &kCFTypeDictionaryValueCallBacks); 

CFDictionarySetValue(attrs2, 
        kCVPixelBufferIOSurfacePropertiesKey, 
        empty); 

//CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget); 
CVPixelBufferRef pixiel_bufer4e = NULL; 

CVPixelBufferCreate(kCFAllocatorDefault, 
        (int)_screenWidth, 
        (int)_screenHeight, 
        kCVPixelFormatType_32BGRA, 
        attrs2, 
        &pixiel_bufer4e); 
CVOpenGLESTextureRef renderTexture; 
CVOpenGLESTextureCacheCreateTextureFromImage (kCFAllocatorDefault, 
               coreVideoTextureCashe, pixiel_bufer4e, 
               NULL, // texture attributes 
               GL_TEXTURE_2D, 
               GL_RGBA, // opengl format 
               (int)_screenWidth, 
               (int)_screenHeight, 
               GL_BGRA, // native iOS format 
               GL_UNSIGNED_BYTE, 
               0, 
               &renderTexture); 
CFRelease(attrs2); 
CFRelease(empty); 
glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture)); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); 

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0); 

CVPixelBufferLockBaseAddress(pixiel_bufer4e, 0); 

if([pixelAdapter appendPixelBuffer:pixiel_bufer4e withPresentationTime:currentTime]) { 
       float result = currentTime.value; 
       NSLog(@"current time is: %f", result); 
       currentTime = CMTimeAdd(currentTime, frameLength); 
     } 

CVPixelBufferUnlockBaseAddress(pixiel_bufer4e, 0); 
CVPixelBufferRelease(pixiel_bufer4e); 
CFRelease(renderTexture); 
CFRelease(coreVideoTextureCashe); 

It records a video, and it is pretty fast, but the video is just black. I think the textureCacheRef is not correct, or I am filling it wrong.

As an update, here is another way I have tried. I must be missing something. In viewDidLoad, after I set up the OpenGL context, I do this:

CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)_context, NULL, &coreVideoTextureCashe); 

    if (err) 
    { 
     NSAssert(NO, @"Error at CVOpenGLESTextureCacheCreate %d", err); 
    } 

    // creates the pixel buffer 

    pixel_buffer = NULL; 
    CVPixelBufferPoolCreatePixelBuffer (NULL, [pixelAdapter pixelBufferPool], &pixel_buffer); 

    CVOpenGLESTextureRef renderTexture; 
    CVOpenGLESTextureCacheCreateTextureFromImage (kCFAllocatorDefault, coreVideoTextureCashe, pixel_buffer, 
                NULL, // texture attributes 
                GL_TEXTURE_2D, 
                GL_RGBA, // opengl format 
                (int)screenWidth, 
                (int)screenHeight, 
                GL_BGRA, // native iOS format 
                GL_UNSIGNED_BYTE, 
                0, 
                &renderTexture); 

    glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture)); 
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); 

    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0); 

Then in drawInRect: I do this:

if (isRecording && writerInput.readyForMoreMediaData) { 
    CVPixelBufferLockBaseAddress(pixel_buffer, 0); 

    if([pixelAdapter appendPixelBuffer:pixel_buffer withPresentationTime:currentTime]) { 
     currentTime = CMTimeAdd(currentTime, frameLength); 
    } 
    CVPixelBufferUnlockBaseAddress(pixel_buffer, 0); 
    CVPixelBufferRelease(pixel_buffer); 
} 

However, it crashes with bad_access on renderTexture, which is not nil but 0x000000001.

UPDATE

With the code below I actually managed to pull out a video file, but there are some green and red flashes in it. I am using the BGRA pixelFormatType.

Here is where I create the texture cache:

CVReturn err2 = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, (__bridge void *)_context, NULL, &coreVideoTextureCashe); 
if (err2) 
{ 
    NSLog(@"Error at CVOpenGLESTextureCacheCreate %d", err2); 
    return; 
} 

Then in drawInRect I call this:

if (isRecording && writerInput.readyForMoreMediaData) { 
    [self cleanUpTextures]; 



    CFDictionaryRef empty; // empty value for attr value. 
    CFMutableDictionaryRef attrs2; 
    empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary 
          NULL, 
          NULL, 
          0, 
          &kCFTypeDictionaryKeyCallBacks, 
          &kCFTypeDictionaryValueCallBacks); 
    attrs2 = CFDictionaryCreateMutable(kCFAllocatorDefault, 
            1, 
            &kCFTypeDictionaryKeyCallBacks, 
            &kCFTypeDictionaryValueCallBacks); 

    CFDictionarySetValue(attrs2, 
        kCVPixelBufferIOSurfacePropertiesKey, 
        empty); 

//CVPixelBufferPoolCreatePixelBuffer (NULL, [assetWriterPixelBufferInput pixelBufferPool], &renderTarget); 
    CVPixelBufferRef pixiel_bufer4e = NULL; 

    CVPixelBufferCreate(kCFAllocatorDefault, 
        (int)_screenWidth, 
        (int)_screenHeight, 
        kCVPixelFormatType_32BGRA, 
        attrs2, 
        &pixiel_bufer4e); 
    CVOpenGLESTextureRef renderTexture; 
    CVOpenGLESTextureCacheCreateTextureFromImage (kCFAllocatorDefault, 
               coreVideoTextureCashe, pixiel_bufer4e, 
               NULL, // texture attributes 
               GL_TEXTURE_2D, 
               GL_RGBA, // opengl format 
               (int)_screenWidth, 
               (int)_screenHeight, 
               GL_BGRA, // native iOS format 
               GL_UNSIGNED_BYTE, 
               0, 
               &renderTexture); 
    CFRelease(attrs2); 
    CFRelease(empty); 
    glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture)); 
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE); 
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE); 

    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0); 

    CVPixelBufferLockBaseAddress(pixiel_bufer4e, 0); 

    if([pixelAdapter appendPixelBuffer:pixiel_bufer4e withPresentationTime:currentTime]) { 
     float result = currentTime.value; 
     NSLog(@"current time is: %f", result); 
     currentTime = CMTimeAdd(currentTime, frameLength); 
    } 

    CVPixelBufferUnlockBaseAddress(pixiel_bufer4e, 0); 
    CVPixelBufferRelease(pixiel_bufer4e); 
    CFRelease(renderTexture); 
    // CFRelease(coreVideoTextureCashe); 
} 

I know there is a lot I could optimize here and things I shouldn't be doing, but I just want to get it working first. In cleanUpTextures I flush the texture cache with:

CVOpenGLESTextureCacheFlush(coreVideoTextureCashe, 0); 

Something may be wrong with the RGBA stuff, or I don't know, but it seems it is still getting the cache somehow wrong.

Please provide a screenshot of the memory warning or crash you are getting. – Dayan 2012-07-30 12:52:16

When I press the record button, it calls my AssetWriter's startWriting, it freezes for a second, and I get a memory warning. – user1562826 2012-07-30 13:10:48

Were you able to record the video and display the content on screen at the same time with this method? – 2013-09-10 06:52:58

Answer

This is not the approach I would use for recording video. You are creating a new pixel buffer for every rendered frame, which will be slow, and you never release it, so it's no surprise you are getting memory warnings.

Instead, follow what I describe in this answer. I create a pixel buffer for the cached texture once, assign that texture to the FBO I'm rendering to, and then append that pixel buffer using the AVAssetWriter's pixel buffer input on every frame. It is far faster to use a single pixel buffer than to recreate one every frame. You also want to leave that pixel buffer associated with your FBO's texture target, rather than associating it on every frame.
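Condensed into code, that approach might look like the sketch below (a rough outline, not my exact implementation; it assumes an `assetWriterPixelBufferInput` adaptor whose pool is available, a `textureCache` created as in your question, and `width`/`height`/`currentTime`/`frameLength` defined elsewhere; error checks omitted):

```objc
// --- One-time setup: one pixel buffer, one cache texture, bound to the FBO once ---
CVPixelBufferRef renderTarget = NULL;
CVOpenGLESTextureRef renderTexture = NULL;

// Pull the buffer from the writer input's pool so its attributes match the writer.
CVPixelBufferPoolCreatePixelBuffer(NULL,
    [assetWriterPixelBufferInput pixelBufferPool], &renderTarget);

// Create a texture backed by that pixel buffer's memory.
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
    textureCache, renderTarget, NULL,
    GL_TEXTURE_2D, GL_RGBA,
    (GLsizei)width, (GLsizei)height,
    GL_BGRA, GL_UNSIGNED_BYTE, 0, &renderTexture);

glBindTexture(CVOpenGLESTextureGetTarget(renderTexture),
              CVOpenGLESTextureGetName(renderTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
    GL_TEXTURE_2D, CVOpenGLESTextureGetName(renderTexture), 0);

// --- Per frame, after rendering into the FBO: no copy, just append ---
glFinish(); // make sure the GPU has finished writing the frame before encoding it
CVPixelBufferLockBaseAddress(renderTarget, 0);
[assetWriterPixelBufferInput appendPixelBuffer:renderTarget
                          withPresentationTime:currentTime];
CVPixelBufferUnlockBaseAddress(renderTarget, 0);
currentTime = CMTimeAdd(currentTime, frameLength);
```

Because the FBO renders straight into the pixel buffer's IOSurface-backed memory, there is no readback at all; only the setup block runs once, and the per-frame work is just the append.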

I encapsulate this recording code within the GPUImageMovieWriter class in my open source GPUImage framework, if you want to see how it works in practice. As I indicate in the answer linked above, recording this way leads to very fast encoding.

OK, this is what I do in drawInRect, and it is fast, but it records a black video. I think the textureCacheRef is empty or not correct, not sure. I updated the question. – user1562826 2012-07-31 14:11:02

@user1562826 - In the future, feel free to update your original question with new information. I've done that for you here. You aren't trying to access this pixel buffer and its bound texture on a different thread from the one you render on, are you? Accessing an OpenGL ES context simultaneously from multiple threads can cause crashes. – 2012-07-31 16:23:14

No, I am using only one thread. I managed to get a video file, but I still think the texture cache is not pulled correctly from the OpenGL context, because in the video there is a line across two corners of the screen, it is a bit red above the line and a bit green below it, and there are these strange flashes. I updated my question. Thanks for your help! – user1562826 2012-08-01 12:46:12