2014-11-24

iOS: Audio missing from the exported video

I am trying to export a recorded video. The export itself succeeds, but the audio is missing from the final exported video. So I searched around and added the following code to bring in the audio:

if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0)
{
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
}
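Note that this call passes error:nil, so any failure of the insertion itself is silently discarded. A minimal sketch that captures and logs it instead (videoTrack and videoAsset are the identifiers from the snippet above; insertError is added here for illustration):

    NSError *insertError = nil;
    BOOL inserted = [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                         atTime:kCMTimeZero
                                          error:&insertError];
    if (!inserted) {
        // insertTimeRange:ofTrack:atTime:error: returns NO when the insertion fails
        NSLog(@"audio insert failed: %@", insertError);
    }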

But after adding that audio-insertion code I can no longer save the video. I get this error:

"session.status 4 error Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x17027e140 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}"

- (void)exportDidFinish:(AVAssetExportSession *)session {
    NSLog(@"session.status %ld error %@", session.status, session.error);
}

Here is the code I use to export the video. Do you have any ideas on how to export the video together with its audio? Thanks!

- (void)getVideoOutput {
    exportInProgress = YES;
    NSLog(@"videoOutputFileUrl %@", videoOutputFileUrl);
    AVAsset *videoAsset = [AVAsset assetWithURL:videoOutputFileUrl];
    NSLog(@"videoAsset %@", videoAsset);

    // 1 - Early exit if there's no video file selected
    NSLog(@"video asset %@", videoAsset);
    if (!videoAsset) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Please Load a Video Asset First"
                                                       delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
        return;
    }

    // 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

    // 3 - Video track
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];

    /* getting an error AVAssetExportSessionStatusFailed
    if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0)
    {
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                             atTime:kCMTimeZero error:nil];
    }*/

    // 3.1 - Create AVMutableVideoCompositionInstruction
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);

    // 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
    BOOL isVideoAssetPortrait_ = NO;
    CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationRight;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationLeft;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        videoAssetOrientation_ = UIImageOrientationUp;
    }
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        videoAssetOrientation_ = UIImageOrientationDown;
    }
    [videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
    [videolayerInstruction setOpacity:0.0 atTime:videoAsset.duration];

    // 3.3 - Add instructions
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction, nil];

    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];

    CGSize naturalSize;
    if (isVideoAssetPortrait_) {
        naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
    } else {
        naturalSize = videoAssetTrack.naturalSize;
    }

    float renderWidth, renderHeight;
    renderWidth = naturalSize.width;
    renderHeight = naturalSize.height;
    mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);

    int totalSeconds = (int)CMTimeGetSeconds(videoAsset.duration);
    [self applyVideoEffectsToComposition:mainCompositionInst size:naturalSize videoDuration:totalSeconds];

    // 4 - Get path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"FinalVideo-%d.mov", arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];

    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = mainCompositionInst;

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        //dispatch_async(dispatch_get_main_queue(), ^{
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self exportDidFinish:exporter];
        });
    }];
}

Answer


I don't know if this helps, but here is how I did it in one of my projects:

  1. Prepare the final composition

    AVMutableComposition *composition = [[AVMutableComposition alloc] init]; 
    
  2. Prepare the video track

    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid]; 
    
  3. Prepare the audio track

    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid]; 
    
  4. Insert the video track from the asset

    AVAssetTrack *video = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject]; 
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:video atTime:kCMTimeZero error:&error]; 
    
  5. Insert the audio data from the asset into the audio track

    AVAssetTrack *audio = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject]; 
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:audio atTime:kCMTimeZero error:&error]; 
    
  6. Then you can add instructions to process your video and/or audio data

  7. Finally, you should be able to export with the following (a consolidated sketch of all the steps follows this list):

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality]; 
    [exporter exportAsynchronouslyWithCompletionHandler:^{ /* code when the export is complete */ }]; 
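Putting the steps above together, a minimal sketch as a single method; the method name exportAsset:toURL:, the outputURL parameter and the error logging are my own additions for illustration, not part of the original answer:

    // Sketch: compose video + audio from a recorded asset into separate tracks, then export
    - (void)exportAsset:(AVAsset *)asset toURL:(NSURL *)outputURL {
        NSError *error = nil;
        AVMutableComposition *composition = [[AVMutableComposition alloc] init];

        // Separate composition tracks for video and audio
        AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                         preferredTrackID:kCMPersistentTrackID_Invalid];
        AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                         preferredTrackID:kCMPersistentTrackID_Invalid];

        // Copy the source video and (if present) audio into their own tracks
        AVAssetTrack *video = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
        AVAssetTrack *audio = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:video atTime:kCMTimeZero error:&error];
        if (audio) {
            [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:audio atTime:kCMTimeZero error:&error];
        }

        // Export the composition
        AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition
                                                                          presetName:AVAssetExportPresetMediumQuality];
        exporter.outputURL = outputURL;
        exporter.outputFileType = AVFileTypeQuickTimeMovie;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            NSLog(@"export status %ld error %@", (long)exporter.status, exporter.error);
        }];
    }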
    

Also, check that you are actually recording the audio correctly.
The first time the camera is used, iOS should ask for permission to use the microphone. If you granted it, double-check your device settings.
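One way to check the microphone permission in code is AVAudioSession's record-permission request; this is a suggestion on top of the answer, not something the question's code already does:

    // Requires <AVFoundation/AVFoundation.h>
    [[AVAudioSession sharedInstance] requestRecordPermission:^(BOOL granted) {
        if (!granted) {
            // Without microphone access the recording will contain no audio track at all
            NSLog(@"Microphone access denied");
        }
    }];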

Another option is to retrieve the original asset through Xcode's Window > Devices window.
Select your device and export its data to your computer, then find the recorded asset and open it with VLC. Press Cmd+I to check whether the stream contains both an audio and a video track.
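You can also do the same check in code, without VLC, by counting the tracks of the recorded file; a sketch reusing videoOutputFileUrl from the question:

    AVAsset *recorded = [AVAsset assetWithURL:videoOutputFileUrl];
    // If the audio count is 0, the recording itself never captured any audio
    NSLog(@"video tracks: %lu, audio tracks: %lu",
          (unsigned long)[recorded tracksWithMediaType:AVMediaTypeVideo].count,
          (unsigned long)[recorded tracksWithMediaType:AVMediaTypeAudio].count);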


Thanks for your help. I figured it out: I was inserting the audio into the video track. It works fine after creating a separate audio track for it. – 2014-11-24 16:55:59
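For completeness, the change described in that comment amounts to giving the audio its own composition track instead of reusing videoTrack; a sketch of what that would look like inside the question's getVideoOutput method:

    // Add a dedicated audio track to the composition instead of inserting audio into videoTrack
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
        NSError *audioError = nil;
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                             atTime:kCMTimeZero
                              error:&audioError];
        if (audioError) {
            NSLog(@"audio insert failed: %@", audioError);
        }
    }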