2012-02-22

I need to send video from an iPhone to a server in real time. I create a capture session and use AVCaptureMovieFileOutput. How can I stream video from an iOS device to a server?

NSError *error = nil;

captureSession = [[AVCaptureSession alloc] init];
// find and attach devices
AVCaptureDevice *muxedDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeMuxed];
if (muxedDevice) {
    NSLog(@"got muxedDevice");
    AVCaptureDeviceInput *muxedInput = [AVCaptureDeviceInput deviceInputWithDevice:muxedDevice
                                                                             error:&error];
    if (muxedInput) {
        [captureSession addInput:muxedInput];
    }
} else {
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice) {
        NSLog(@"got videoDevice");
        AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice
                                                                                 error:&error];
        if (videoInput) {
            [captureSession addInput:videoInput];
        }
    }
    AVCaptureDevice *audioDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    if (audioDevice) {
        NSLog(@"got audioDevice");
        AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice
                                                                                 error:&error];
        if (audioInput) {
            [captureSession addInput:audioInput];
        }
    }
}

// create a preview layer from the session and add it to UI 
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession]; 
previewLayer.frame = view.layer.bounds; 
previewLayer.videoGravity = AVLayerVideoGravityResizeAspect; 
previewLayer.orientation = AVCaptureVideoOrientationPortrait; 
[view.layer addSublayer:previewLayer]; 

// create capture file output 

captureMovieOutput = [[AVCaptureMovieFileOutput alloc] init]; 
if (! captureMovieURL) { 
    captureMoviePath = [[self getMoviePathWithName:MOVIE_FILE_NAME] retain]; 
    captureMovieURL = [[NSURL alloc] initFileURLWithPath:captureMoviePath]; 
} 
NSLog (@"recording to %@", captureMovieURL); 
[captureSession addOutput:captureMovieOutput]; 

I use AVAssetExportSession to cut out clips with a duration of 10 seconds.

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:captureMovieURL
                                        options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
                                                                            forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];

AVMutableComposition *composition = [AVMutableComposition composition]; 

CMTime endTime;
CMTime duration = CMTimeMake(6000, 600); // 10 seconds at a timescale of 600
if (asset.duration.value - startFragment.value < 6000) {
    // less than 10 seconds left: the final segment ends at the asset's end
    endTime = asset.duration;
} else {
    endTime = CMTimeMake(startFragment.value + 6000, 600);
}
CMTimeRange editRange = CMTimeRangeMake(startFragment, duration);
startFragment = CMTimeMake(endTime.value, 600); // next window starts where this one ended
NSError *editError = nil;
// and add it into the composition

[composition insertTimeRange:editRange ofAsset:asset atTime:composition.duration error:&editError];

AVAssetExportSession *exportSession = [AVAssetExportSession exportSessionWithAsset:composition
                                                                        presetName:AVAssetExportPresetPassthrough];
exportSession.shouldOptimizeForNetworkUse = YES;
NSString *name = [NSString stringWithFormat:MOVUE_SEGMENT_NAME, countMovies];
NSString *path = [NSString stringWithFormat:@"file://localhost%@", [self getMoviePathWithName:name]];
NSURL *url = [NSURL URLWithString:path];
NSLog(@"urlsegment = %@", url);
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.outputURL = url;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (AVAssetExportSessionStatusCompleted == exportSession.status) {
        countMovies++;
        NSLog(@"AVAssetExportSessionStatusCompleted");
    } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
        NSLog(@"AVAssetExportSessionStatusFailed: %@", [exportSession.error localizedDescription]);
    } else {
        NSLog(@"Export Session Status: %d", exportSession.status);
    }
}];

I send the video to the server when the export session status is completed, but it is very slow: producing a 10-second movie and sending it to the server takes about 15 seconds, and making the clips shorter than 10 seconds changes nothing. How can I fix this? What is the best way to do it, and what is better to use for streaming video to a server?

Answers


Use ffmpeg for the encoding; it may perform better than AVAssetExportSession. But encoding with ffmpeg is harder to set up than AVAssetExportSession.
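As a sketch of the ffmpeg route, its segment muxer can cut a recording into fixed-length MP4 chunks without re-encoding, which avoids the export delay entirely (the file names here are illustrative, not from the question):

```shell
# Split capture.mov into ~10-second MP4 segments without re-encoding.
# -c copy remuxes the existing streams instead of transcoding;
# -reset_timestamps 1 makes each segment start at t=0 so it can be
# uploaded and played back independently.
ffmpeg -i capture.mov \
       -c copy \
       -f segment \
       -segment_time 10 \
       -reset_timestamps 1 \
       segment_%03d.mp4
```

Each `segment_NNN.mp4` can then be uploaded as soon as it is closed, instead of waiting for a full composition-and-export pass.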
