Memory warning when creating an mp4 video from an array of images

I am trying to create a video file from image files. I put the names of the image files into an NSArray. When the number of image files is large (more than 80 or 100), I get memory warnings and sometimes the app crashes. Here is my code:

-(void)writeImageAsMovie:(NSArray *)images toPath:(NSString*)path size:(CGSize)size duration:(int)duration 
{ 

    NSError *error = nil; 

    videoWriter = [[AVAssetWriter alloc] initWithURL: 
        [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie 
               error:&error]; 


    NSParameterAssert(videoWriter); 

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
            AVVideoCodecH264, AVVideoCodecKey, 
            [NSNumber numberWithInt:size.width], AVVideoWidthKey, 
            [NSNumber numberWithInt:size.height], AVVideoHeightKey, 
            nil]; 
    AVAssetWriterInput* writerInput = [AVAssetWriterInput 
             assetWriterInputWithMediaType:AVMediaTypeVideo 
             outputSettings:videoSettings] ; 




    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor 
                assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput 
                sourcePixelBufferAttributes:nil]; 


    NSParameterAssert(writerInput); 
    NSParameterAssert([videoWriter canAddInput:writerInput]); 
    [videoWriter addInput:writerInput]; 


    //Start a session: 
    [videoWriter startWriting]; 
    [videoWriter startSessionAtSourceTime:CMTimeMake(0, 1000)]; 

    CVPixelBufferRef buffer = NULL; 

    //convert uiimage to CGImage. 

    //Write samples: 
    for (int i=0; i<images.count ; i++) { 

     UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfFile:[[images objectAtIndex:i] objectForKey:@"image"]]]; 
     int time = [[[images objectAtIndex:i] objectForKey:@"time"] intValue]; 
     buffer = [self pixelBufferFromCGImage:image.CGImage]; 
     while(! adaptor.assetWriterInput.readyForMoreMediaData); 
     [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(time,1000)]; 
     image=nil; 
    } 

    while(!adaptor.assetWriterInput.readyForMoreMediaData); 


    //Finish the session: 
    [writerInput markAsFinished]; 

    [videoWriter finishWritingWithCompletionHandler:^(){ 
     NSLog (@"finished writing %d",images.count); 
    }]; 

    NSLog(@"%d",[videoWriter status]); 
    while([videoWriter status] != AVAssetWriterStatusFailed && [videoWriter status] != AVAssetWriterStatusCompleted) { 
     NSLog(@"Status: %d", [videoWriter status]); 
     sleep(1); 
    } 
    NSLog(@"%d",[videoWriter status]); 
    NSString *tmpdir = NSTemporaryDirectory(); 
    NSString *mydir = [tmpdir stringByAppendingPathComponent:@"vidimages"]; 
    [[NSFileManager defaultManager] removeItemAtPath:mydir error:nil]; 

}

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image
{
    CGFloat screenWidth = [[UIScreen mainScreen] bounds].size.width;

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
            [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
            [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
            nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, screenWidth,
              screenWidth, kCVPixelFormatType_32BGRA, (__bridge CFDictionaryRef) options,
              &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context1 = CGBitmapContextCreate(pxdata, screenWidth,
                screenWidth, 8, 4*screenWidth, rgbColorSpace,
                kCGImageAlphaNoneSkipLast);
    NSParameterAssert(context1);
    CGContextConcatCTM(context1, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context1, CGRectMake(0, 0, screenWidth,
              screenWidth), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context1);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

It will use as much memory as possible; that's why you're getting the warnings and/or the crashes. It isn't so much a problem with your code as with the system. Virtual memory can sometimes help, and it can also make the system more unstable (fail more often). – 2014-10-03 04:58:02

Answer


It looks like boatloads of autoreleased UIImage and NSData objects are being allocated here:

for (int i=0; i<images.count ; i++) { 

    UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfFile:[[images objectAtIndex:i] objectForKey:@"image"]]]; 
    int time = [[[images objectAtIndex:i] objectForKey:@"time"] intValue]; 
    buffer = [self pixelBufferFromCGImage:image.CGImage]; 
    while(! adaptor.assetWriterInput.readyForMoreMediaData); 
    [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(time,1000)]; 
    image=nil; 
} 

To get those autoreleased objects released on each pass through the loop, and to stop memory usage from ramping up rapidly, add an autorelease pool, i.e.:

for (int i=0; i<images.count ; i++) { 
    @autoreleasepool { 
     UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfFile:[[images objectAtIndex:i] objectForKey:@"image"]]]; 
     int time = [[[images objectAtIndex:i] objectForKey:@"time"] intValue]; 
     buffer = [self pixelBufferFromCGImage:image.CGImage]; 
     while(! adaptor.assetWriterInput.readyForMoreMediaData); 
     [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(time,1000)]; 
     image=nil; 
    } 
} 

Have a look at Apple's documentation on autorelease pools if you're unfamiliar with them; yes, they are still relevant with ARC.
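As a minimal, standalone sketch of that point (not the asker's code; the file paths are purely hypothetical), an inner @autoreleasepool drains at the end of every iteration, so an autoreleased object such as the NSData returned by dataWithContentsOfFile: is released immediately instead of piling up until the surrounding pool drains:

    #import <Foundation/Foundation.h>

    int main(int argc, char *argv[])
    {
        @autoreleasepool {
            // Hypothetical file names, purely for illustration.
            NSArray *paths = @[@"/tmp/frame0.png", @"/tmp/frame1.png"];

            for (NSString *path in paths) {
                @autoreleasepool {
                    // dataWithContentsOfFile: returns an autoreleased object.
                    NSData *data = [NSData dataWithContentsOfFile:path];
                    NSLog(@"read %lu bytes from %@", (unsigned long)data.length, path);
                } // the inner pool drains here, so peak memory stays low
            }
        }
        return 0;
    }

The same pattern applies inside the frame-writing loop above: everything created through a convenience constructor during one iteration is cleaned up before the next one starts.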
