Recording video with AVCaptureVideoDataOutput in Swift 3

After spending a long time trying to solve this problem without success, I decided to ask here.

We use AVCaptureVideoDataOutput to get live pixel data from the camera, which we process in the captureOutput callback. But we also want to record video from this same data, and we would like to know whether such a recording can be compressed as much as a video recorded with AVCaptureMovieFileOutput.
To be clear, we have no problem recording with AVCaptureMovieFileOutput. But AVCaptureMovieFileOutput and AVCaptureVideoDataOutput cannot be used at the same time.
You can find our captureOutput function below:
    func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!

        // Lock the pixel buffer before touching its memory.
        CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
        let baseAddress = CVPixelBufferGetBaseAddressOfPlane(imageBuffer, 0)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
        videoWidth = CVPixelBufferGetWidth(imageBuffer)
        videoHeight = CVPixelBufferGetHeight(imageBuffer)

        // Wrap the raw pixel data in a CGContext so we can make a CGImage from it.
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        let context = CGContext(data: baseAddress, width: videoWidth, height: videoHeight, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)
        let imageRef = context!.makeImage()
        CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))

        let data = imageRef!.dataProvider!.data as! NSData
        let pixels = data.bytes.assumingMemoryBound(to: UInt8.self)

        /* What we do with the pixel data is irrelevant to the question, so the rest of the code is omitted to keep it simple. */
    }
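For context, our current understanding is that writing sample buffers from AVCaptureVideoDataOutput into a compressed movie file would go through AVAssetWriter with an H.264 AVAssetWriterInput. The following is only a sketch of that route, not code we have verified; outputURL, videoWidth, and videoHeight are placeholders assumed to be defined elsewhere:

```swift
import AVFoundation

var assetWriter: AVAssetWriter!
var writerInput: AVAssetWriterInput!

// Configure an AVAssetWriter that compresses incoming frames to H.264.
func setUpWriter(outputURL: URL, width: Int, height: Int) throws {
    assetWriter = try AVAssetWriter(outputURL: outputURL, fileType: AVFileTypeQuickTimeMovie)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
    writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: settings)
    writerInput.expectsMediaDataInRealTime = true
    assetWriter.add(writerInput)
}

// Would be called from captureOutput for every frame.
func append(_ sampleBuffer: CMSampleBuffer) {
    if assetWriter.status == .unknown {
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
    }
    if writerInput.isReadyForMoreMediaData {
        writerInput.append(sampleBuffer)
    }
}

// Called once when recording stops.
func finishRecording() {
    writerInput.markAsFinished()
    assetWriter.finishWriting {
        // The compressed movie should now exist at outputURL.
    }
}
```

Whether this gives compression comparable to AVCaptureMovieFileOutput is exactly what we are unsure about.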