If you're trying to manipulate the pixels, you can implement the following method in the class you assign as the AVCaptureVideoDataOutputSampleBufferDelegate:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);

    // kCVReturnSuccess is zero
    if (CVPixelBufferLockBaseAddress(pb, 0) != kCVReturnSuccess) {
        NSLog(@"Error: could not lock pixel buffer");
        return;
    }

    size_t bufferHeight = CVPixelBufferGetHeight(pb);
    size_t bufferWidth  = CVPixelBufferGetWidth(pb);
    size_t bytesPerRow  = CVPixelBufferGetBytesPerRow(pb);
    unsigned char *rowBase = CVPixelBufferGetBaseAddress(pb);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL) {
        NSLog(@"Error: could not create color space");
        CVPixelBufferUnlockBaseAddress(pb, 0);
        return;
    }

    // Create a bitmap graphics context with the sample buffer data.
    // The bitmap info must match the output's pixel format; e.g. with the
    // common 32BGRA camera format you would pass
    // kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst instead.
    CGContextRef context = CGBitmapContextCreate(rowBase, bufferWidth, bufferHeight,
                                                 8, bytesPerRow, colorSpace,
                                                 kCGImageAlphaNone);

    // Create a Quartz image from the pixel data in the bitmap graphics context.
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);
    UIImage *currentImage = [UIImage imageWithCGImage:quartzImage];

    // Free up the image, context, and color space.
    CGImageRelease(quartzImage);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    if (CVPixelBufferUnlockBaseAddress(pb, 0) != kCVReturnSuccess) {
        NSLog(@"Error: could not unlock pixel buffer");
    }

    // currentImage can now be handed off to a UIImageView.
}
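For this delegate method to be called at all, the delegate has to be registered on an AVCaptureVideoDataOutput attached to the session. A minimal sketch, assuming a `captureSession` that is already configured (the queue name is illustrative):

```objc
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
// Ask for BGRA frames so the pixel layout is predictable.
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                               @(kCVPixelFormatType_32BGRA) };
// Sample buffers are delivered on this serial queue, not the main thread.
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:videoQueue];
if ([captureSession canAddOutput:videoOutput]) {
    [captureSession addOutput:videoOutput];
}
```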
Then hook that image up to a UIImageView in your view controller. Look into the kCGImageAlphaNone flag; what you need there will depend on what you are doing.
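Since the delegate callback runs on the queue you registered, you have to hop to the main thread before touching the UIImageView. A sketch, assuming an `imageView` outlet on the view controller:

```objc
// Inside captureOutput:didOutputSampleBuffer:fromConnection:,
// after currentImage has been created.
dispatch_async(dispatch_get_main_queue(), ^{
    self.imageView.image = currentImage;
});
```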
Why don't you want to use AVCaptureVideoPreviewLayer? – 2013-02-15 20:32:32
I just want to evaluate all the other options. – ijason03 2013-02-15 20:48:12