
Taking a snapshot of an AVCaptureVideoPreviewLayer's view

I am using WebRTC to build a video chat between two users. I want to take a snapshot of the localView view, which shows one of the two people.

This is my class with the configureLocalPreview method, which connects the video streams to the UIViews:

@IBOutlet var remoteView: RTCEAGLVideoView!
@IBOutlet var localView: UIView!

var captureSession: AVCaptureSession?
var videoSource: RTCAVFoundationVideoSource?
var videoTrack: RTCVideoTrack?
var previewLayer: AVCaptureVideoPreviewLayer!

func configureLocalPreview() {
    self.videoTrack = self.signaling.localMediaStream.videoTracks.first as? RTCVideoTrack
    self.videoSource = self.videoTrack?.source as? RTCAVFoundationVideoSource
    self.captureSession = self.videoSource?.captureSession

    self.previewLayer = AVCaptureVideoPreviewLayer(session: self.captureSession)
    self.previewLayer.frame = self.localView.bounds
    self.localView.layer.addSublayer(self.previewLayer)
    self.localView.isUserInteractionEnabled = true
    //self.localView.layer.position = CGPoint(x: 100, y: 100)
}

At the point where I want to take the snapshot, I call:

self.localView.pb_takeSnapshot() 

pb_takeSnapshot comes from a UIView extension I found in another post. It is defined like this:

extension UIView {
    func pb_takeSnapshot() -> UIImage {
        UIGraphicsBeginImageContextWithOptions(bounds.size, false, UIScreen.main.scale)

        drawHierarchy(in: self.bounds, afterScreenUpdates: true)

        let image = UIGraphicsGetImageFromCurrentImageContext()!
        UIGraphicsEndImageContext()
        return image
    }
}

When I inspect the image in the Xcode debugger, it is completely green, and the person that I can see on the iPhone's screen (inside that view) is not in it:

[screenshot of the snapshot]

What could be the reason the person is not visible? Is it simply not possible to take a snapshot of a stream? Thanks for taking a look!

Answers


Because AVCaptureVideoPreviewLayer is implemented as an OpenGL layer, you cannot capture it with a regular CoreGraphics context. I suggest accessing the raw frame data instead.

Add an AVCaptureVideoDataOutput with a delegate:

previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) 

let captureVideoOutput = AVCaptureVideoDataOutput() 
captureVideoOutput.setSampleBufferDelegate(self, queue: DispatchQueue.main) 
captureSession?.addOutput(captureVideoOutput) 

previewLayer.frame = localView.bounds 
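
One assumption worth making explicit (it is not stated in the original answer): the sample-buffer conversion shown further down expects 32-bit BGRA frames, so it is safest to request that pixel format on the output:

// Assumption: request BGRA frames so the CGContext-based conversion below succeeds.
captureVideoOutput.videoSettings = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]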

Make your controller (or whatever object you prefer) conform to AVCaptureVideoDataOutputSampleBufferDelegate, as sketched below.
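
A minimal sketch of the conformance; the class name is a placeholder, not from the original answer:

// Placeholder name; conform whichever object owns the capture session.
class VideoChatViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Capture setup, the flag below, and the delegate callback live here.
}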

Declare a shouldCaptureFrame variable and set it whenever you need to take a picture:

var shouldCaptureFrame: Bool = false 
... 
func takeSnapshot() { 
    shouldCaptureFrame = true 
} 

Then implement didOutputSampleBuffer from the delegate:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    if !shouldCaptureFrame {
        return
    }

    let image = UIImage.from(sampleBuffer: sampleBuffer)
    shouldCaptureFrame = false
}
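
Since the delegate was registered with DispatchQueue.main above, the image can be handed to UIKit straight from the callback; with a background queue you would hop to the main queue first. A sketch of the hand-off, where snapshotHandler is an assumed property and not part of the original answer:

// Assumed property: var snapshotHandler: ((UIImage?) -> Void)?
// Safe here because the delegate queue is DispatchQueue.main; otherwise
// wrap this call in DispatchQueue.main.async.
snapshotHandler?(image)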

Finally, the UIImage extension with the from(sampleBuffer:) function:

extension UIImage {

    static func from(sampleBuffer: CMSampleBuffer) -> UIImage? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
            return nil
        }
        CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
        let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        // A 32-bit BGRA buffer needs both a byte order and an alpha setting;
        // with the byte order alone, CGContext creation fails and returns nil.
        let context = CGContext(
            data: baseAddress,
            width: CVPixelBufferGetWidth(imageBuffer),
            height: CVPixelBufferGetHeight(imageBuffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(imageBuffer),
            space: colorSpace,
            bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue
        )
        let quartzImage = context?.makeImage()
        CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

        if let quartzImage = quartzImage {
            return UIImage(cgImage: quartzImage)
        }

        return nil
    }

}

You should create your localView as an RTCEAGLVideoView instead of a UIView. I use the same for my localView and was able to take a snapshot with the very snippet mentioned in your post.

Here is sample code that starts your camera and shows the local preview:

class ViewController: UIViewController, RTCEAGLVideoViewDelegate {

    var captureSession: AVCaptureSession?
    var previewLayer: AVCaptureVideoPreviewLayer?
    var peerConnectionFactory: RTCPeerConnectionFactory!
    var videoSource: RTCAVFoundationVideoSource!
    var localTrack: RTCVideoTrack!

    @IBOutlet var myView: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        /*myView = UIView(frame: CGRect(x: 0,
                                        y: 0,
                                        width: UIScreen.main.bounds.size.width,
                                        height: UIScreen.main.bounds.size.height))*/
        startCamera()
        // Do any additional setup after loading the view, typically from a nib.
    }

    fileprivate func startCamera() {
        peerConnectionFactory = RTCPeerConnectionFactory()
        RTCInitializeSSL()
        RTCSetupInternalTracer()
        RTCSetMinDebugLogLevel(RTCLoggingSeverity.info)

        videoSource = peerConnectionFactory.avFoundationVideoSource(with: nil)
        localTrack = peerConnectionFactory.videoTrack(with: videoSource, trackId: "ARDAMSv0")

        let localScaleX = CGFloat(1.0)
        let localView = RTCEAGLVideoView(frame: self.view.bounds)
        self.view.insertSubview(localView, at: 1)
        localView.frame = self.view.bounds
        localView.transform = CGAffineTransform(scaleX: localScaleX, y: 1)

        localTrack.add(localView)
    }

    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }

    override func viewDidAppear(_ animated: Bool) {
        //previewLayer?.frame.size = myView.frame.size
    }

    func videoView(_ videoView: RTCEAGLVideoView, didChangeVideoSize size: CGSize) {
        print("Inside didChangeVideoSize")
    }

}
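
Assuming you keep the RTCEAGLVideoView in a property (above it is only a local constant inside startCamera, so this is an assumption), the pb_takeSnapshot() extension from the question can then be reused as-is:

// Assumes the view was stored in a property, e.g. var localVideoView: RTCEAGLVideoView!
let snapshot = localVideoView.pb_takeSnapshot()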
Thanks for your answer. I'm a bit confused about what the actual snippet should look like, because I've already tried so many different versions of the code. – Linus

@Linus I've updated the answer with a sample snippet that you can use to start the camera and view the local preview. –


For WebRTC video views you should use RTCEAGLVideoView. For more details, take a look at the WebRTC example apps here: AppRTC App.
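
A minimal sketch of that setup, assuming remoteTrack is the RTCVideoTrack obtained from the remote media stream:

// RTCEAGLVideoView conforms to RTCVideoRenderer, so a track can render into it.
let remoteVideoView = RTCEAGLVideoView(frame: view.bounds)
remoteVideoView.delegate = self   // receive didChangeVideoSize callbacks
view.addSubview(remoteVideoView)
remoteTrack.add(remoteVideoView)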