iOS: taking multiple screenshots from a video

2015-04-03

I have an NSURL containing a video, and I want to capture a frame of that video ten times per second. My code can capture an image from my player, but I can't get it to capture 10 frames per second. Instead, it returns the same initial frame of the video, the correct number of times. Here is what I have:

AVAsset *asset = [AVAsset assetWithURL:videoUrl];
CMTime vidLength = asset.duration;
float seconds = CMTimeGetSeconds(vidLength);
int frameCount = 0;

for (float i = 0; i < seconds; i += 0.1) {
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    CMTime time = CMTimeMake(i, 10);
    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    NSString *filename = [NSString stringWithFormat:@"Documents/frame_%d.png", frameCount];
    NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    [UIImagePNGRepresentation(thumbnail) writeToFile:pngPath atomically:YES];

    frameCount++;
}

But instead of getting the frame at the current time of my video, I only get the initial frame.

How can I capture a frame of the video 10 times per second?

Thanks for your help :)

Answers


With CMTimeMake(A, B) you store a rational number, an exact fraction of A/B seconds, and the first parameter of this function takes an int value. For a 20-second video, your loop would capture a non-zero time only in its last iterations (at time ((int)19.9)/10 = 1.9 seconds). Use the CMTimeMakeWithSeconds(i, NSEC_PER_SEC) function to fix this.


You get the initial frame because you are trying to create the CMTime from a float value:

CMTimeMake(i, 10)

Since the first parameter of the CMTimeMake function takes an int64_t value, your float is truncated to an integer and you get incorrect results.

Let's change your code a bit.

1) First, find the total number of frames you need to capture from the video. You wrote that you need 10 frames per second, so:

int requiredFramesCount = seconds * 10; 

2) Next, find the value that will advance your CMTime on each step:

int64_t step = vidLength.value/requiredFramesCount; 

3) Finally, set requestedTimeToleranceBefore and requestedTimeToleranceAfter to kCMTimeZero so that frames are captured at the exact requested times:

imageGenerator.requestedTimeToleranceAfter = kCMTimeZero; 
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero; 

Here is how your code would look:

CMTime vidLength = asset.duration;
float seconds = CMTimeGetSeconds(vidLength);

int requiredFramesCount = seconds * 10;
int64_t step = vidLength.value / requiredFramesCount;

// Create the generator once, outside the loop, and request exact times
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
imageGenerator.requestedTimeToleranceAfter = kCMTimeZero;
imageGenerator.requestedTimeToleranceBefore = kCMTimeZero;

int64_t value = 0;

for (int i = 0; i < requiredFramesCount; i++) {
    // Build the time in the asset's own timescale, using integer arithmetic only
    CMTime time = CMTimeMake(value, vidLength.timescale);

    CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
    UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);

    NSString *filename = [NSString stringWithFormat:@"Documents/frame_%d.png", i];
    NSString *pngPath = [NSHomeDirectory() stringByAppendingPathComponent:filename];
    [UIImagePNGRepresentation(thumbnail) writeToFile:pngPath atomically:YES];

    value += step;
}

Same question using Swift: http://stackoverflow.com/questions/32286320/grab-frames-from-video-using-swift/32297251 – arpo 2015-08-30 14:13:39