2010-10-02

AVCapture appendSampleBuffer

This one is driving me crazy. I've searched everywhere and tried everything I can think of.

I'm making an iPhone app using AVFoundation, specifically AVCapture, to record video with the iPhone camera.

I need a custom image overlaid on top of the video feed and included in the recorded video.

So far I have the AVCapture session set up: I can display the feed, access the frames, save a frame as a UIImage, and merge the overlay image onto it. I then convert this new UIImage into a CVPixelBufferRef. And to double-check that the bufferRef is working, I converted it back into a UIImage, and it still displays the image fine.

The trouble starts when I try to convert the CVPixelBufferRef into a CMSampleBufferRef to append to the AVCaptureSession's assetWriterInput. The CMSampleBufferRef always comes back NULL when I try to create it.

Here is my - (void)captureOutput method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    UIImage *botImage = [self imageFromSampleBuffer:sampleBuffer];
    UIImage *wheel = [self imageFromView:wheelView];

    UIImage *finalImage = [self overlaidImage:botImage :wheel];
    //[previewImage setImage:finalImage]; <- works -- the image is being merged into one UIImage

    CVPixelBufferRef pixelBuffer = NULL;
    CGImageRef cgImage = CGImageCreateCopy(finalImage.CGImage);
    CFDataRef image = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
    int status = CVPixelBufferCreateWithBytes(NULL,
                                              self.view.bounds.size.width,
                                              self.view.bounds.size.height,
                                              kCVPixelFormatType_32BGRA,
                                              (void *)CFDataGetBytePtr(image),
                                              CGImageGetBytesPerRow(cgImage),
                                              NULL,
                                              0,
                                              NULL,
                                              &pixelBuffer);
    if (status == 0) {
        OSStatus result = 0;

        CMVideoFormatDescriptionRef videoInfo = NULL;
        result = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoInfo);
        NSParameterAssert(result == 0 && videoInfo != NULL);

        CMSampleBufferRef myBuffer = NULL;
        result = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault,
                                                    pixelBuffer, true, NULL, NULL,
                                                    videoInfo, NULL, &myBuffer);
        NSParameterAssert(result == 0 && myBuffer != NULL); // always null :S

        NSLog(@"Trying to append");

        if (!CMSampleBufferDataIsReady(myBuffer)) {
            NSLog(@"sampleBuffer data is not ready");
            return;
        }

        if (![assetWriterInput isReadyForMoreMediaData]) {
            NSLog(@"Not ready for data :(");
            return;
        }

        if (![assetWriterInput appendSampleBuffer:myBuffer]) {
            NSLog(@"Failed to append pixel buffer");
        }
    }
}

Another solution I keep hearing about is an AVAssetWriterInputPixelBufferAdaptor, which people are using to eliminate the need for the messy CMSampleBufferRef wrapping. However, I've searched Stack Overflow and Apple's developer forums and documentation, and I can't find a clear description or example of how to set one up or how to use it. If anyone has a working example of it, could you please show me, or help me work out the problem above? I've been working on this non-stop for a week and I'm at my wit's end.

Let me know if you need any other information.

Thanks in advance,

Michael

Sorry for the missing code formatting; it looked fine in the preview :S – 2010-10-02 15:20:56

Answers


You need an AVAssetWriterInputPixelBufferAdaptor. Here is the code to create one:

// Create dictionary for pixel buffer adaptor 
NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey, nil]; 

// Create pixel buffer adaptor 
m_pixelsBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc] initWithAssetWriterInput:assetWriterInput sourcePixelBufferAttributes:bufferAttributes]; 

And the code to use it:

// If ready to have more media data 
if (m_pixelsBufferAdaptor.assetWriterInput.readyForMoreMediaData) { 
    // Create a pixel buffer 
    CVPixelBufferRef pixelsBuffer = NULL; 
    CVPixelBufferPoolCreatePixelBuffer(NULL, m_pixelsBufferAdaptor.pixelBufferPool, &pixelsBuffer); 

    // Lock pixel buffer address 
    CVPixelBufferLockBaseAddress(pixelsBuffer, 0); 

    // Create your function to set your pixels data in the buffer (in your case, fill with your finalImage data) 
    [self yourFunctionToPutDataInPixelBuffer:CVPixelBufferGetBaseAddress(pixelsBuffer)]; 

    // Unlock pixel buffer address 
    CVPixelBufferUnlockBaseAddress(pixelsBuffer, 0); 

    // Append the pixel buffer (compute currentFrameTime as needed; the simplest way is to
    // start the frame time at 0 and increment it by one frame's duration, i.e. the inverse
    // of your frame rate, each time you write a frame)
    [m_pixelsBufferAdaptor appendPixelBuffer:pixelsBuffer withPresentationTime:currentFrameTime]; 

    // Release pixel buffer 
    CVPixelBufferRelease(pixelsBuffer); 
} 

And don't forget to release your pixelsBufferAdaptor.

Does this actually work? I tried creating the pixel buffer manually, and I tried using the pixel buffer pool. When I tried the pixel buffer pool as defined above, it worked in the simulator, but the pixel buffer pool was never allocated when running on the device. – 2011-06-04 19:51:00


I did this using CMSampleBufferCreateForImageBuffer():

OSStatus ret = 0; 
CMSampleBufferRef sample = NULL; 
CMVideoFormatDescriptionRef videoInfo = NULL; 
CMSampleTimingInfo timingInfo = kCMTimingInfoInvalid; 
timingInfo.presentationTimeStamp = pts; 
timingInfo.duration = duration; 

ret = CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixel, &videoInfo); 
if (ret != 0) { 
    NSLog(@"CMVideoFormatDescriptionCreateForImageBuffer failed! %d", (int)ret); 
    goto done; 
} 
ret = CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixel, true, NULL, NULL, 
             videoInfo, &timingInfo, &sample); 
if (ret != 0) { 
    NSLog(@"CMSampleBufferCreateForImageBuffer failed! %d", (int)ret); 
    goto done; 
} 
Could you specify how to set timingInfo? I see "pts" and "duration"; are those constants? Or what are they... how do you set them? – omarojo 2015-10-19 10:49:00

In my case it doesn't work... the error is empty... but when I get the CMVideoFormatDescriptionRef... it returns nil. – 2017-05-18 07:14:42
