I am recording short video clips (about a second or so, possibly in different orientations from either the front or back camera). I then try to merge them using AVAssetExportSession. I basically build a composition and a videoComposition with the proper transforms and the audio & video tracks. exportAsynchronouslyWithCompletionHandler fails when there are multiple video files (code = -11820).

The problem is that on iOS 5 it fails if you have more than 4 video clips, and on iOS 6 the limit seems to be 16 clips.

This seems really puzzling to me. Is AVAssetExportSession doing something weird, or does it have some undocumented limit on the number of clips passed to it? Here are some excerpts from my code:

-(void)exportVideo 
{ 
    AVMutableComposition *composition = video.composition; 
    AVMutableVideoComposition *videoComposition = video.videoComposition; 
    NSString * presetName = AVAssetExportPresetMediumQuality; 

    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:presetName]; 
    self.exportSession = _assetExport; 

    videoComposition.renderSize = CGSizeMake(640, 480); 
    _assetExport.videoComposition = videoComposition; 

    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent: @"export.mov"]; 
    NSURL *exportUrl = [NSURL fileURLWithPath:exportPath]; 

    // Delete a previously exported file if it exists 
    if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) 
        [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil]; 

    _assetExport.outputFileType = AVFileTypeQuickTimeMovie; 
    _assetExport.outputURL = exportUrl; 
    _assetExport.shouldOptimizeForNetworkUse = YES; 

    [_assetExport exportAsynchronouslyWithCompletionHandler:^{ 
        switch (_assetExport.status) 
        { 
            case AVAssetExportSessionStatusCompleted: 
                NSLog(@"Completed exporting!"); 
                break; 
            case AVAssetExportSessionStatusFailed: 
                NSLog(@"Failed: %@", _assetExport.error.description); 
                break; 
            case AVAssetExportSessionStatusCancelled: 
                NSLog(@"Canceled: %@", _assetExport.error); 
                break; 
            default: 
                break; 
        } 
    }]; 
} 

And here is how the composition is built:

-(void)setVideoAndExport 
{ 
    video = nil; 
    video = [[VideoComposition alloc] initVideoTracks]; 

    CMTime localTimeline = kCMTimeZero; 

    // Create the composition of all videofiles 
    for (NSURL *url in outputFileUrlArray) { 
     AVAsset *asset = [[AVURLAsset alloc] initWithURL:url options:nil]; 
     [video setVideo:url at:localTimeline]; 
     localTimeline = CMTimeAdd(localTimeline, asset.duration); // Increment the timeline 
    } 
    [self exportVideo]; 
} 

And here is the meat of the VideoComposition class:

-(id)initVideoTracks 
{ 
    if((self = [super init])) 
    { 
     composition = [[AVMutableComposition alloc] init]; 
     [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
     mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction]; 
     instructions = [[NSMutableArray alloc] init]; 
     videoComposition = [AVMutableVideoComposition videoComposition]; 
    } 
    return self; 
} 


-(void)setVideo:(NSURL *)url at:(CMTime)to 
{ 
    asset = [[AVURLAsset alloc] initWithURL:url options:nil]; 

    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 

    // Each clip gets its own mutable video and audio track in the composition 
    AVMutableCompositionTrack *compositionTrackVideo = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
    [compositionTrackVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetTrack atTime:to error:nil]; 

    AVMutableCompositionTrack *compositionTrackAudio = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid]; 
    [compositionTrackAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:to error:nil]; 

    // Extend the single instruction to cover the timeline up to the end of this clip 
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeAdd(to, asset.duration)); 

    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrackVideo]; 

    // Apply the clip's orientation transform, then hide its layer once it ends 
    [layerInstruction setTransform:assetTrack.preferredTransform atTime:kCMTimeZero]; 
    [layerInstruction setOpacity:0.0 atTime:CMTimeAdd(to, asset.duration)]; 
    [instructions addObject:layerInstruction]; 

    mainInstruction.layerInstructions = instructions; 
    videoComposition.instructions = [NSArray arrayWithObject:mainInstruction]; 
    videoComposition.frameDuration = CMTimeMake(1, 30); 
} 

Answers


I've run into a similar problem. I managed to fix it by inserting assets into the composition itself, instead of inserting each source track into its own mutable track. So, in your setVideo, instead of this line of code:

[compositionTrackVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:assetTrack atTime:to error:nil]; 

try this:

[composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofAsset:asset atTime:to error:nil]; 
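
For reference, here is a minimal, untested sketch of how setVideo from the question might look with this approach (it assumes the same composition ivar; note that the orientation transforms and layer instructions would then have to target the composition's own tracks, which is exactly the open question in the comments below):

-(void)setVideo:(NSURL *)url at:(CMTime)to 
{ 
    asset = [[AVURLAsset alloc] initWithURL:url options:nil]; 

    // One call inserts all of the asset's tracks; the composition creates 
    // (or reuses) compatible mutable tracks itself, so the per-clip 
    // video/audio track bookkeeping goes away. 
    NSError *error = nil; 
    if (![composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) 
                              ofAsset:asset 
                               atTime:to 
                                error:&error]) { 
        NSLog(@"Insert failed: %@", error); 
    } 
} 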

Nice, I think I'll have to try it. I've avoided the problem by always building the composition after each recording (in the background), so there are never more than two videos to merge at once. – Karvapallo


I can't really get this to work. Maybe the transforms need to be applied differently (or they won't work at all this way). Would you happen to have some sample code lying around? – Karvapallo


We recently rewrote the composition-building code in our project using AVCompositionTrackSegment. We create an array of track segments, validate them, and assign them to the segments property of the AVMutableCompositionTrack. We do a lot of time stretching/compression (changing video speed), and this approach gives a more precise composition. – Jeepston
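
For illustration, a rough, untested sketch of the segment-based approach Jeepston describes might look like this (clipURLs is a hypothetical NSArray of file URLs, and each clip is assumed to have a video track):

#import <AVFoundation/AVFoundation.h> 

static AVMutableComposition *CompositionFromClips(NSArray *clipURLs) 
{ 
    AVMutableComposition *comp = [AVMutableComposition composition]; 
    AVMutableCompositionTrack *videoTrack = [comp addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 

    NSMutableArray *segments = [NSMutableArray array]; 
    CMTime cursor = kCMTimeZero; 
    for (NSURL *url in clipURLs) { 
        AVURLAsset *clip = [AVURLAsset URLAssetWithURL:url options:nil]; 
        AVAssetTrack *srcTrack = [[clip tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; 

        // A target duration that differs from the source duration is what 
        // time-stretches or compresses the clip (the speed changes above). 
        CMTimeRange source = CMTimeRangeMake(kCMTimeZero, clip.duration); 
        CMTimeRange target = CMTimeRangeMake(cursor, clip.duration); 

        [segments addObject:[AVCompositionTrackSegment compositionTrackSegmentWithURL:url 
                                                                              trackID:srcTrack.trackID 
                                                                      sourceTimeRange:source 
                                                                      targetTimeRange:target]]; 
        cursor = CMTimeAdd(cursor, clip.duration); 
    } 

    // Validate the segments before assigning them to the track's segments property 
    NSError *error = nil; 
    if ([videoTrack validateTrackSegments:segments error:&error]) { 
        videoTrack.segments = segments; 
    } else { 
        NSLog(@"Invalid track segments: %@", error); 
    } 
    return comp; 
} 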


In the end I also contacted Apple about this issue, and they responded:

"This is a known condition. You are hitting the decoder limit set in AVFoundation."

They also asked me to file a bug report about the issue, because the error message AVAssetExportSession gives is vague and misleading. So I filed a bug report with Apple, complaining about the bad error message.

So, these limits in AVAssetExportSession are confirmed: the decoder limit is 4 in iOS 5, and it was raised to 16 in iOS 6. The main problem is that the error reported by AVAssetExportSession is poor: it only reports -11820 "Cannot Complete Export" instead of actually telling us that we hit the limit.
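
Given those confirmed numbers, one defensive workaround (in the spirit of the per-recording merging mentioned in the comments above) is to cap the clip count per export and merge in batches through intermediate files. The version check below is an assumption on my part, not an official API for querying the limit:

#import <UIKit/UIKit.h> 

// Hypothetical helper: a safe clip count per export, based on the decoder 
// limits confirmed above (4 on iOS 5, 16 on iOS 6). Clips beyond this can be 
// exported in batches to intermediate files and those files merged afterwards. 
static NSUInteger MaxClipsPerExport(void) 
{ 
    NSString *version = [[UIDevice currentDevice] systemVersion]; 
    BOOL atLeastIOS6 = [version compare:@"6.0" options:NSNumericSearch] != NSOrderedAscending; 
    return atLeastIOS6 ? 16 : 4; 
} 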
