How to set the frame rate and slow down the AVCapture didOutputSampleBuffer delegate on iPhone

I want to slow down the frame rate of the video device on an iPhone 4S so that the didOutputSampleBuffer delegate gets called less often. This is to improve performance, since I process every frame and need a large frame for the detail.
I tried to do this with the following code when setting up my AVSession:
AVCaptureConnection *conn = [self.output connectionWithMediaType:AVMediaTypeVideo];
[conn setVideoMinFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
[conn setVideoMaxFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
But this has no effect: I can change CAPTURE_FRAMES_PER_SECOND from 1 to 60 and see no difference in performance and no slowdown of the video capture. Why doesn't this work? How can I slow down the capture frame rate of the video device?
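For reference, on iOS 7 and later the frame rate is configured on the AVCaptureDevice itself rather than on the connection. A minimal sketch of that approach, assuming the same `CAPTURE_FRAMES_PER_SECOND` constant and that `self.device` has already been obtained:

```objc
// iOS 7+: frame rate is set on the capture device, not the connection.
// The device must be locked before its configuration can be changed.
NSError *error = nil;
if ([self.device lockForConfiguration:&error]) {
    self.device.activeVideoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
    self.device.activeVideoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
    [self.device unlockForConfiguration];
} else {
    NSLog(@"Could not lock device for configuration: %@", error);
}
```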
I set up my session using the code below:
// Define the devices and the session and the settings
self.session = [[AVCaptureSession alloc] init];
//self.session.sessionPreset = AVCaptureSessionPresetPhoto;
//self.session.sessionPreset = AVCaptureSessionPresetHigh;
self.session.sessionPreset = AVCaptureSessionPreset1280x720;
self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
self.input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
// Add the video frame output
self.output = [[AVCaptureVideoDataOutput alloc] init];
[self.output setAlwaysDiscardsLateVideoFrames:YES];
self.output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
forKey:(id)kCVPixelBufferPixelFormatTypeKey];
// A dispatch queue to get frames
dispatch_queue_t queue;
queue = dispatch_queue_create("frame_queue", NULL);
// Setup the frame rate
AVCaptureConnection *conn = [self.output connectionWithMediaType:AVMediaTypeVideo];
[conn setVideoMinFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
[conn setVideoMaxFrameDuration:CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND)];
// Setup input and output and set the delegate to self
[self.output setSampleBufferDelegate:self queue:queue];
[self.session addInput:self.input];
[self.session addOutput:self.output];
// Start the session
[self.session startRunning];
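One likely reason the frame-duration calls above have no effect: `connectionWithMediaType:` is called before the output has been added to the session, and at that point the output has no connections, so `conn` is nil and the messages are silently ignored. A sketch of the tail of the setup with the connection configured after `addOutput:`, guarded by the supported-checks (this uses the iOS 5/6 connection-level API from the question, which was later deprecated):

```objc
[self.session addInput:self.input];
[self.session addOutput:self.output];

// The connection only exists once the output belongs to the session.
AVCaptureConnection *conn = [self.output connectionWithMediaType:AVMediaTypeVideo];
if (conn.isVideoMinFrameDurationSupported) {
    conn.videoMinFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
}
if (conn.isVideoMaxFrameDurationSupported) {
    conn.videoMaxFrameDuration = CMTimeMake(1, CAPTURE_FRAMES_PER_SECOND);
}

[self.session startRunning];
```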
I capture frames using the didOutputSampleBuffer delegate with the following implementation:
// The delegate method where we get our image data frames from
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
fromConnection:(AVCaptureConnection *)connection
{
// Extract a UIImage
CVPixelBufferRef pixel_buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixel_buffer];
// Capture the image
CGImageRef ref = [self.context createCGImage:ciImage fromRect:ciImage.extent];
// This sets the captured image orientation correctly
UIImage *image = [UIImage imageWithCGImage:ref scale:1.0 orientation:UIImageOrientationLeft];
// Release the CGImage
CGImageRelease(ref);
// Update the UI on the main thread but throttle the processing
[self performSelectorOnMainThread:@selector(updateUIWithCapturedImageAndProcessWithImage:) withObject:image waitUntilDone:YES];
}
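Independently of the frame-rate settings, `performSelectorOnMainThread:` with `waitUntilDone:YES` blocks the capture queue until the main thread finishes, which can itself throttle delivery and cause late frames to be discarded. If that stalling is not intended, a sketch that hands the image to the main thread asynchronously instead (using the same `updateUIWithCapturedImageAndProcessWithImage:` method from the question):

```objc
// Hand the frame to the main thread without blocking the capture queue.
dispatch_async(dispatch_get_main_queue(), ^{
    [self updateUIWithCapturedImageAndProcessWithImage:image];
});
```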
Great. If this works, please mark it as the correct answer. Thanks – Spectravideo328 2013-02-25 17:37:08