2012-03-27 63 views

Memory problem while reading video frames on iPhone

I'm running into memory problems while reading video frames from an existing video selected from the iPhone library. At first I added the UIImage frames themselves to an array, but I figured that array grew too large for memory after a while, so instead I now save the UIImages to the Documents folder and add only the image paths to the array. However, I still get the same memory warning, even while checking allocations with Instruments. Total allocated memory never goes above 2.5 MB, and no leaks are found either. Can anyone think of anything?

-(void)addFrame:(UIImage *)image 
{ 
    NSString *imgPath = [NSString stringWithFormat:@"%@/Analysis%d-%d.png", docFolder, currentIndex, framesArray.count];  
    [UIImagePNGRepresentation(image) writeToFile:imgPath atomically:YES]; 
    [framesArray addObject:imgPath];  
    frameCount++;  
} 

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info 
{ 
    [picker dismissModalViewControllerAnimated:YES]; 
    [framesArray removeAllObjects];  
    frameCount = 0;   

    // incoming video 
    NSURL *videoURL = [info valueForKey:UIImagePickerControllerMediaURL]; 
    //NSLog(@"Video : %@", videoURL); 

    // AVURLAsset to read input movie (i.e. mov recorded to local storage) 
    NSDictionary *inputOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey]; 
    AVURLAsset *inputAsset = [[AVURLAsset alloc] initWithURL:videoURL options:inputOptions];  

    // Load the input asset tracks information 
    [inputAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler: ^{   

     NSError *error = nil; 
     nrFrames = CMTimeGetSeconds([inputAsset duration]) * 30; 
     NSLog(@"Total frames = %d", nrFrames); 

     // Check status of "tracks", make sure they were loaded  
     AVKeyValueStatus tracksStatus = [inputAsset statusOfValueForKey:@"tracks" error:&error]; 
     if (tracksStatus != AVKeyValueStatusLoaded)
      // failed to load 
      return;   

     /* Read video samples from input asset video track */ 
     AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:inputAsset error:&error]; 

     NSMutableDictionary *outputSettings = [NSMutableDictionary dictionary]; 
     [outputSettings setObject: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey: (NSString*)kCVPixelBufferPixelFormatTypeKey]; 
     AVAssetReaderTrackOutput *readerVideoTrackOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:[[inputAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] outputSettings:outputSettings]; 


     // Assign the tracks to the reader and start to read 
     [reader addOutput:readerVideoTrackOutput]; 
     if ([reader startReading] == NO) { 
      // Handle error 
      NSLog(@"Error reading"); 
     } 

     NSAutoreleasePool *pool = [NSAutoreleasePool new]; 
     while (reader.status == AVAssetReaderStatusReading) 
     {    
      if(!memoryProblem) 
      { 
       CMSampleBufferRef sampleBufferRef = [readerVideoTrackOutput copyNextSampleBuffer]; 
       if (sampleBufferRef) 
       { 
        CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBufferRef); 
        /*Lock the image buffer*/ 
        CVPixelBufferLockBaseAddress(imageBuffer,0); 
        /*Get information about the image*/ 
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer); 
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer); 
        size_t width = CVPixelBufferGetWidth(imageBuffer); 
        size_t height = CVPixelBufferGetHeight(imageBuffer); 

        /*Create a CGImageRef from the CVImageBufferRef*/ 
        CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 
        CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst); 
        CGImageRef newImage = CGBitmapContextCreateImage(newContext); 

        /*Unlock the image buffer only after the bitmap has been copied out of it*/ 
        CVPixelBufferUnlockBaseAddress(imageBuffer,0); 

        /*We release some components*/ 
        CGContextRelease(newContext); 
        CGColorSpaceRelease(colorSpace); 

        UIImage *image= [UIImage imageWithCGImage:newImage scale:[UIScreen mainScreen].scale orientation:UIImageOrientationRight];   
        //[self addFrame:image]; 
        [self performSelectorOnMainThread:@selector(addFrame:) withObject:image waitUntilDone:YES]; 

        /*We release the CGImageRef*/ 
        CGImageRelease(newImage);      

        CMSampleBufferInvalidate(sampleBufferRef); 
        CFRelease(sampleBufferRef); 
        sampleBufferRef = NULL; 
       } 
      } 
      else 
      {     
       break; 
      }    
     } 
     [pool release]; 

     NSLog(@"Finished");   
    }]; 
} 

Answer

Try one thing.

Move the NSAutoreleasePool into the while loop, and drain it inside the loop.

It would then look like this:

while (reader.status == AVAssetReaderStatusReading) 
{    
    NSAutoreleasePool *pool = [NSAutoreleasePool new]; 

    ..... 

    [pool drain]; 
} 
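
With newer compilers (LLVM 3.0 and later), the same per-iteration drain can also be written with an @autoreleasepool block, which works under both manual reference counting and ARC. A sketch of the loop from the question, assuming the same reader and readerVideoTrackOutput variables:

```objc
while (reader.status == AVAssetReaderStatusReading)
{
    // One pool per iteration: the autoreleased objects created while
    // handling a frame (the UIImage, the PNG NSData, the path NSString)
    // are freed before the next frame is decoded, so memory stays flat.
    @autoreleasepool
    {
        CMSampleBufferRef sampleBufferRef = [readerVideoTrackOutput copyNextSampleBuffer];
        if (sampleBufferRef)
        {
            // ... convert the buffer to a UIImage and save it as before ...
            CMSampleBufferInvalidate(sampleBufferRef);
            CFRelease(sampleBufferRef);
        }
    }
}
```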
You're a genius, it works! I'm curious though: how did you figure that out? Why does putting the autorelease pool inside the while loop make it work? – 2012-03-27 12:06:56

If it is outside the loop, the pool is only drained once, when the loop ends. But memory accumulates before the loop finishes, and the app crashes. If it is inside, the autoreleased objects are released on every iteration. – Ilanchezhian 2012-03-27 12:11:12

Hmm, of course, that seems logical now. Anyway, thanks a lot! – 2012-03-27 12:20:56