
I am building an iOS native extension in which I want to implement a barcode scanner (an iOS video camera native extension).

I followed the AVCam example and tried it in a native app (a full Xcode project), where it works fine.

Now I want to use this code from a Flex Mobile project. I have been able to create the ANE, add it to the Flex Mobile project, and call the ANE's functions.
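
For reference, the native side of the ANE is wired up roughly like the sketch below. The names used here (ASStartScanning, CameraViewController, g_scannerController) are only placeholders, not the exact ones from my project; it just shows the minimal FlashRuntimeExtensions glue that lets the Flex side call into the capture code:

#import "FlashRuntimeExtensions.h"
#import "CameraViewController.h"   // placeholder for the controller that owns initCapture/startScanning

// Placeholder: the view controller that owns the AVCaptureSession.
static CameraViewController *g_scannerController = nil;

// Called from ActionScript via context.call("startScanning").
FREObject ASStartScanning(FREContext ctx, void *funcData, uint32_t argc, FREObject argv[])
{
    if (g_scannerController == nil) {
        g_scannerController = [[CameraViewController alloc] init];
        [g_scannerController initCapture];
    }
    [g_scannerController startScanning];
    return NULL;
}

// Maps the ActionScript function name to the native implementation.
void ScannerContextInitializer(void *extData, const uint8_t *ctxType, FREContext ctx,
                               uint32_t *numFunctionsToSet, const FRENamedFunction **functionsToSet)
{
    static FRENamedFunction functions[] = {
        { (const uint8_t *)"startScanning", NULL, &ASStartScanning }
    };
    *numFunctionsToSet = sizeof(functions) / sizeof(FRENamedFunction);
    *functionsToSet = functions;
}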

It seems to work, but my problem is that I cannot see what the camera is viewing. I mean, I have a method that I call to start the camera and initialize the capture, and I have also implemented the captureOutput delegate. The strangest thing is that when I run my app I can see the logs from initCapture and captureOutput, as if the app were capturing data, but on the iPad I cannot see the camera preview.

This is part of the code I am using:

- (void)initCapture 
{ 
    NSLog(@"camera view capture init"); 
    /*We set up the input*/
    self.device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]; 

    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil]; 
    /*We set up the output*/
    captureOutput = [[AVCaptureVideoDataOutput alloc] init]; 
    // If the queue is blocked when new frames are captured, those frames will be automatically dropped 
    captureOutput.alwaysDiscardsLateVideoFrames = YES; 
    //captureOutput.minFrameDuration = CMTimeMake(1, 10); Uncomment it to specify a minimum duration for each video frame 
    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()]; 
    // Set the video output to store frames as 420YpCbCr8 bi-planar
    // (it is supposed to be faster than BGRA)
    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;

    //************************Note this line 
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]; 

    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key]; 
    [captureOutput setVideoSettings:videoSettings]; 

    //And we create a capture session 
    self.captureSession = [[AVCaptureSession alloc] init]; 
    //We add input and output 
    [self.captureSession addInput:captureInput]; 
    [self.captureSession addOutput:captureOutput]; 


    if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720])
    {
        NSLog(@"camera view Set preview port to 1280X720");
        self.captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
    }
    // Fall back to 640x480 if 1280x720 is not supported on the device
    else if ([self.captureSession canSetSessionPreset:AVCaptureSessionPreset640x480])
    {
        NSLog(@"camera view Set preview port to 640X480");
        self.captureSession.sessionPreset = AVCaptureSessionPreset640x480;
    }


    /*We add the preview layer*/ 

    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: self.captureSession]; 

    if ([self.prevLayer respondsToSelector:@selector(connection)]) 
     self.prevLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft; 
    else 
     self.prevLayer.orientation = AVCaptureVideoOrientationLandscapeLeft; 

    self.prevLayer.frame = CGRectMake(150, 0, 700, 700); 
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspect; 
    [self.view.layer addSublayer: self.prevLayer]; 
} 

- (void) startScanning { 
    NSLog(@"camera view start scanning"); 
    self.state = LAUNCHING_CAMERA; 
    [self.captureSession startRunning]; 
    self.prevLayer.hidden = NO; 
    self.state = CAMERA; 
} 

#pragma mark AVCaptureSession delegate 

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
     fromConnection:(AVCaptureConnection *)connection 
{ 
    NSLog(@"camera view Capture output"); 
} 
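
For completeness, once the preview issue is sorted out the captureOutput delegate will need to pull the pixel data out of the sample buffer for the barcode decoder; right now it only logs. A minimal sketch of that step, assuming the 420YpCbCr8 bi-planar format configured in initCapture (the decoder call itself is omitted):

// Inside captureOutput:didOutputSampleBuffer:fromConnection:
CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);

// Plane 0 of a 420YpCbCr8 bi-planar buffer is the luma (grayscale) plane,
// which is what most barcode decoders work on.
uint8_t *luma   = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0);
size_t width    = CVPixelBufferGetWidthOfPlane(imageBuffer, 0);
size_t height   = CVPixelBufferGetHeightOfPlane(imageBuffer, 0);
size_t rowBytes = CVPixelBufferGetBytesPerRowOfPlane(imageBuffer, 0);

// ... hand luma/width/height/rowBytes to the barcode decoder here ...

CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly);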

How should I fix this?

Thanks a lot.

Answer


I think I have solved it.

Instead of:

[self.view.layer addSublayer: self.prevLayer]; 

I put:

UIViewController *mainController = [UIApplication sharedApplication].keyWindow.rootViewController; 
[mainController.view.layer addSublayer: self.prevLayer]; 

Now I can see the camera in my Flex application. (Presumably the controller's own view is never added to the AIR application's window when it runs inside an ANE, so the preview layer was being attached to a view that never appeared on screen.)
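
As a follow-up, when scanning stops the preview layer has to be detached from that root view controller again. A small sketch of a matching stopScanning method (this is not in the original code, just the obvious counterpart to startScanning):

- (void) stopScanning {
    NSLog(@"camera view stop scanning");
    [self.captureSession stopRunning];
    // The preview layer now lives in the Flex app's root view controller,
    // so remove it from there rather than from self.view.
    [self.prevLayer removeFromSuperlayer];
    self.prevLayer.hidden = YES;
}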