2016-12-01 91 views

On OS X I use AVFoundation to capture still images from a USB camera. Everything works, but the captured image comes out darker than the live video preview.

Device capture configuration:

-(BOOL)prepareCapture {
    captureSession = [[AVCaptureSession alloc] init];
    NSError *error;

    imageOutput = [[AVCaptureStillImageOutput alloc] init];
    // kCVPixelFormatType_32BGRA is the Core Video constant; the deprecated
    // QuickTime constant k32BGRAPixelFormat has the same value.
    NSNumber *pixelFormat = [NSNumber numberWithInt:kCVPixelFormatType_32BGRA];
    [imageOutput setOutputSettings:[NSDictionary dictionaryWithObject:pixelFormat
                                                               forKey:(id)kCVPixelBufferPixelFormatTypeKey]];

    videoOutput = [[AVCaptureMovieFileOutput alloc] init];

    AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:MyVideoDevice error:&error];
    if (videoInput) {
        [captureSession beginConfiguration];
        [captureSession addInput:videoInput];
        [captureSession setSessionPreset:AVCaptureSessionPresetHigh];
        //[captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
        [captureSession addOutput:imageOutput];
        [captureSession addOutput:videoOutput];
        [captureSession commitConfiguration];
    }
    else {
        // Handle the failure.
        return NO;
    }
    return YES;
}
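As an aside (not related to the darkness issue): `-addInput:` and `-addOutput:` raise an exception if the session cannot accept them. A more defensive sketch of the same configuration block, using the session's own `canAddInput:`/`canAddOutput:` checks, might look like this:

```objc
// Defensive variant of the configuration block above: ask the session
// whether each input/output can be added before adding it.
[captureSession beginConfiguration];
if ([captureSession canAddInput:videoInput]) {
    [captureSession addInput:videoInput];
}
[captureSession setSessionPreset:AVCaptureSessionPresetHigh];
if ([captureSession canAddOutput:imageOutput]) {
    [captureSession addOutput:imageOutput];
}
if ([captureSession canAddOutput:videoOutput]) {
    [captureSession addOutput:videoOutput];
}
[captureSession commitConfiguration];
```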

Adding the view for the live preview:

-(void)settingPreview:(NSView *)View {
    // Attach the preview to the session
    previewView = View;
    CALayer *previewViewLayer = [previewView layer];
    [previewViewLayer setBackgroundColor:CGColorGetConstantColor(kCGColorBlack)];
    AVCaptureVideoPreviewLayer *newPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
    [newPreviewLayer setFrame:[previewViewLayer bounds]];
    [newPreviewLayer setAutoresizingMask:kCALayerWidthSizable | kCALayerHeightSizable];
    [previewViewLayer addSublayer:newPreviewLayer];
    //[self setPreviewLayer:newPreviewLayer];
    [captureSession startRunning];
}
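One optional tweak, unrelated to the brightness problem: the preview layer defaults to `AVLayerVideoGravityResizeAspect` (letterboxed). If the preview should fill the hosting view, the gravity can be changed inside `settingPreview:` before the sublayer is added, e.g.:

```objc
// Optional: make the preview fill the hosting view (crops instead of
// letterboxing). Goes before [previewViewLayer addSublayer:newPreviewLayer].
[newPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
```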

Code to capture the image:

-(void)captureImage {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in imageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }
    [imageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
                                             completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        CFDictionaryRef exifAttachments =
            CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
        if (exifAttachments) {
            // Do something with the attachments.
        }
        // Continue as appropriate.
        // IMG is a global NSImage
        IMG = [self imageFromSampleBuffer:imageSampleBuffer];
        [[self delegate] imageReady:IMG];
    }];
}

Creating an NSImage from the sample buffer data; I think the problem is here:

- (NSImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get the CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
                                                 bytesPerRow, colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    //UIImage *image = [UIImage imageWithCGImage:quartzImage];
    NSImage *image = [[NSImage alloc] initWithCGImage:quartzImage size:NSZeroSize];
    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

I didn't find the cause; in the completion handler I now do this:

// Continue as appropriate.
//IMG = [self imageFromSampleBuffer:imageSampleBuffer];
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);
if (imageBuffer) {
    CVBufferRetain(imageBuffer);
    NSCIImageRep *imageRep = [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
    IMG = [[NSImage alloc] initWithSize:[imageRep size]];
    [IMG addRepresentation:imageRep];
    CVBufferRelease(imageBuffer);
}

Nothing jumps out from a quick look through your code... but one reason AVCapture images can come out dark is that the camera needs some time to auto-adjust focus, exposure, etc. Could it be that you call your `captureImage` method immediately after the `settingPreview` method that starts the capture session running? – rickster


The code is mostly from the Apple samples. **settingPreview is called when the program starts.** The point is that the saved image is slightly darker; the difference from the live feed is small, as if I had lowered the brightness a bit. I think the problem occurs in the conversion from **CMSampleBufferRef** to **NSImage** – Mex
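For completeness, if rickster's guess had applied (capturing before auto-exposure settles), a simple hypothetical mitigation would be to delay the first still capture for a moment after the session starts, e.g. with `dispatch_after`:

```objc
// Hypothetical mitigation for the auto-exposure theory: wait ~1 second
// after startRunning before taking the first still, so the camera's
// auto-exposure and focus have time to settle.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    [self captureImage];
});
```

In this case the delay would not have helped, since the preview session was already long-running when the capture was taken.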

Answer


Solution found

The problem was in imageFromSampleBuffer. I used this code instead and the picture is perfect.
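Assembling the snippet from the comments above into a complete replacement for `imageFromSampleBuffer:` gives roughly the following sketch. Wrapping the `CVImageBuffer` in a `CIImage`/`NSCIImageRep` hands the pixel data to AppKit directly, instead of re-drawing it through a `CGBitmapContext` with a guessed color space and alpha setting, which is where the brightness shift appeared to come from.

```objc
// CIImage-based conversion, assembled from the snippet in the comments.
- (NSImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (!imageBuffer) {
        return nil;
    }
    CVBufferRetain(imageBuffer);
    NSCIImageRep *imageRep =
        [NSCIImageRep imageRepWithCIImage:[CIImage imageWithCVImageBuffer:imageBuffer]];
    NSImage *image = [[NSImage alloc] initWithSize:[imageRep size]];
    [image addRepresentation:imageRep];
    CVBufferRelease(imageBuffer);
    return image;
}
```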