2016-04-29

Uncompressed UIImage from AVCaptureStillImageOutput

This is the camera configuration I have tried so far:

AVCaptureSession *session = [[AVCaptureSession alloc] init]; 
    [session setSessionPreset:AVCaptureSessionPresetInputPriority]; 

    AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack]; 

    NSError *errorVideo; 

    AVCaptureDeviceFormat *deviceFormat = nil; 
    for (AVCaptureDeviceFormat *format in videoDevice.formats) { 
     CMVideoDimensions dim = CMVideoFormatDescriptionGetDimensions(format.formatDescription); 

     if (dim.width == 2592 && dim.height == 1936) { 
      deviceFormat = format; 
      break; 
     } 
    } 

    [videoDevice lockForConfiguration:&errorVideo]; 
    if (deviceFormat) { 
     videoDevice.activeFormat = deviceFormat; 

     if ([videoDevice isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure]) { 
      [videoDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure]; 
     } 

     if ([videoDevice isAutoFocusRangeRestrictionSupported]) { 
      [videoDevice setAutoFocusRangeRestriction:AVCaptureAutoFocusRangeRestrictionFar]; 
     } 
    } 
    [videoDevice unlockForConfiguration]; 

    AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&errorVideo]; 

    if ([session canAddInput:videoDeviceInput]) { 
     [session addInput:videoDeviceInput]; 
    } 

    AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init]; 

    if ([session canAddOutput:stillImageOutput]) { 
     [stillImageOutput setOutputSettings:@{(id)kCVPixelBufferPixelFormatTypeKey:@(kCVPixelFormatType_32BGRA)}]; 
     [session addOutput:stillImageOutput]; 
    } 

This is how I am trying to get the UIImage from the CMSampleBuffer:

[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) { 

     if (imageDataSampleBuffer && !error) { 
      dispatch_async(dispatch_get_main_queue(), ^{ 
       UIImage *image = [self imageFromSampleBuffer:imageDataSampleBuffer]; 
      }); 
     } 
    }]; 

This is Apple's sample code:

- (UIImage *) imageFromSampleBuffer:(CMSampleBufferRef) sampleBuffer{ 
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer); 

CVPixelBufferLockBaseAddress(imageBuffer, 0); 

void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer); 

// Get the number of bytes per row for the pixel buffer 
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
// Get the pixel buffer width and height 
size_t width = CVPixelBufferGetWidth(imageBuffer); 
size_t height = CVPixelBufferGetHeight(imageBuffer); 



// Create a device-dependent RGB color space 
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

// Create a bitmap graphics context with the sample buffer data 
CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, 
              bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
// Create a Quartz image from the pixel data in the bitmap graphics context 
CGImageRef quartzImage = CGBitmapContextCreateImage(context); 
// Unlock the pixel buffer 
CVPixelBufferUnlockBaseAddress(imageBuffer,0); 


// Free up the context and color space 
CGContextRelease(context); 
CGColorSpaceRelease(colorSpace); 

// Create an image object from the Quartz image 
UIImage *image = [UIImage imageWithCGImage:quartzImage]; 

// Release the Quartz image 
CGImageRelease(quartzImage); 

return (image); 
} 

But the image is always nil. After some debugging, I found that CMSampleBufferGetImageBuffer(sampleBuffer) always returns NULL.

Can anyone help?

Are you sure you set up all of the inputs and everything correctly? – SeanLintern88

I added a trimmed-down version of the code containing all of the configuration for the camera and the output. – Bogus

Answer


This is because the CMSampleBufferRef must be processed immediately: it is released very quickly and efficiently.

Here is my code for generating the image:

let connection = imageFileOutput.connectionWithMediaType(AVMediaTypeVideo) 

if connection != nil { 
    imageFileOutput.captureStillImageAsynchronouslyFromConnection(connection) { [weak self] (buffer, err) -> Void in 
     if CMSampleBufferIsValid(buffer) { 
      let imageDataJpeg = self?.imageFromSampleBuffer(buffer) 
     } else { 
      print(err) 
     } 
    } 
} 

As you can see, I turn it into an image while still inside the scope of this function. Once it is an image, I send it off for processing.
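The same fix in the question's Objective-C, as a sketch: do the conversion synchronously inside the completion handler, and only dispatch the resulting UIImage to the main queue. The `imageFromSampleBuffer:` helper is the one from the question; the `imageView` property is a hypothetical destination for illustration.

```objectivec
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                              completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    if (!sampleBuffer || error) {
        NSLog(@"Capture failed: %@", error);
        return;
    }
    // Convert here, synchronously, while the sample buffer is still valid.
    // The buffer may be reclaimed once this handler returns, so it must not
    // be captured by a block that runs later on another queue.
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        // Only the UIImage (which owns its own backing store) crosses queues.
        self.imageView.image = image; // hypothetical destination
    });
}];
```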

Just change jpegStillImageNSDataRepresentation to the method I provided. I know this isn't directly related, but it may help posterity. The "uncompressed" part of the title is what particularly stumped me. Thanks again! – Bogus

When you say uncompressed, do you mean you want the highest resolution? If so, could you change the session preset to AVCaptureSessionPresetPhoto? – SeanLintern88

From the Apple docs: _"On iOS, the only currently supported keys are AVVideoCodecKey and kCVPixelBufferPixelFormatTypeKey. The keys are mutually exclusive; only one may be present. The recommended values are kCMVideoCodecType_JPEG for AVVideoCodecKey, and kCVPixelFormatType_420YpCbCr8BiPlanarFullRange and kCVPixelFormatType_32BGRA for kCVPixelBufferPixelFormatTypeKey."_ If one wants a raw, non-JPEG-encoded sample buffer, kCVPixelBufferPixelFormatTypeKey should be used. – Bogus