2010-10-22 64 views
9

I have to download a big file (1xx MB) with PHP. How can I download a big file in PHP with low memory usage?

How can I download it to a temporary file without wasting memory (RAM)?

When I use

$something = file_get_contents('http://somehost.example/file.zip'); 
file_put_contents('myfile.zip', $something); 

I need as much memory as the size of the file.

Maybe it's possible to download it some other way?

For example, in parts (e.g. 1024 bytes at a time): write a part to disk, download the next part, and repeat until the whole file has been downloaded?

+0

http://stackoverflow.com/questions/3697748/fastest-way-to-serve-a-file-using-php – 2011-11-18 16:44:59

+0

Possible duplicate of [Downloading a large file using curl](http://stackoverflow.com/questions/6409462/downloading-a-large-file-using-curl) – dynamic 2013-02-12 15:44:07

Answers

22

Copy the file one small chunk at a time:

/** 
* Copy remote file over HTTP one small chunk at a time. 
* 
* @param $infile The full URL to the remote file 
* @param $outfile The path where to save the file 
*/ 
function copyfile_chunked($infile, $outfile) { 
    $chunksize = 10 * (1024 * 1024); // 10 Megs 

    /** 
    * parse_url breaks a URL apart into its parts, i.e. host, path, 
    * query string, etc. 
    */ 
    $parts = parse_url($infile); 
    $i_handle = fsockopen($parts['host'], 80, $errno, $errstr, 5); 
    $o_handle = fopen($outfile, 'wb'); 

    if ($i_handle == false || $o_handle == false) { 
     return false; 
    } 

    if (!empty($parts['query'])) { 
     $parts['path'] .= '?' . $parts['query']; 
    } 

    /** 
    * Send the request to the server for the file 
    */ 
    $request = "GET {$parts['path']} HTTP/1.1\r\n"; 
    $request .= "Host: {$parts['host']}\r\n"; 
    $request .= "User-Agent: Mozilla/5.0\r\n"; 
    $request .= "Keep-Alive: 115\r\n"; 
    $request .= "Connection: keep-alive\r\n\r\n"; 
    fwrite($i_handle, $request); 

    /** 
    * Now read the headers from the remote server. We'll need 
    * to get the content length. 
    */ 
    $headers = array(); 
    while(!feof($i_handle)) { 
     $line = fgets($i_handle); 
     if ($line == "\r\n") break; 
     $headers[] = $line; 
    } 

    /** 
    * Look for the Content-Length header, and get the size 
    * of the remote file. 
    */ 
    $length = 0; 
    foreach($headers as $header) { 
     if (stripos($header, 'Content-Length:') === 0) { 
      $length = (int)trim(substr($header, strlen('Content-Length:'))); 
      break; 
     } 
    } 

    /** 
    * Start reading in the remote file, and writing it to the 
    * local file one chunk at a time. 
    */ 
    $cnt = 0; 
    while(!feof($i_handle)) { 
     $buf = ''; 
     $buf = fread($i_handle, $chunksize); 
     $bytes = fwrite($o_handle, $buf); 
     if ($bytes === false) { 
      return false; 
     } 
     $cnt += $bytes; 

     /** 
     * We're done reading when we've reached the content length. 
     */ 
     if ($cnt >= $length) break; 
    } 

    fclose($i_handle); 
    fclose($o_handle); 
    return $cnt; 
} 

Adjust the $chunksize variable to your needs; it determines how large a piece of the file is read at a time. This has only been lightly tested, and it could easily break for any number of reasons.

Usage:

copyfile_chunked('http://somesite.com/somefile.jpg', '/local/path/somefile.jpg'); 
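
Since the function returns false on failure and the number of bytes copied on success, you may want to check the return value, for example:

$copied = copyfile_chunked('http://somesite.com/somefile.jpg', '/local/path/somefile.jpg'); 
if ($copied === false) { 
    echo "Copy failed\n"; 
} else { 
    echo "Copied $copied bytes\n"; 
} 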
+0

The code looks good, but it may not be possible for users to open remote files this way. Maybe you have similar code that uses fsockopen? – marc 2010-10-23 14:29:09

+1

It should work if the allow_url_fopen directive is enabled in PHP. But I'll update my example to show socket usage. – mellowsoon 2010-10-23 17:57:15

+0

Thanks a lot. It works very well ;) – marc 2010-10-24 13:26:17

6

You can shell out to wget using exec(), which results in the lowest memory usage:

<?php 
exec("wget -o outputfilename.tar.gz http://pathtofile/file.tar.gz") 
?> 
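
If the URL or the output filename comes from user input, it's safer to escape both before building the command; a small sketch (the variables here are just placeholders):

<?php 
$url = 'http://pathtofile/file.tar.gz';  // placeholder URL 
$dest = 'outputfilename.tar.gz';         // placeholder output path 
exec('wget -O ' . escapeshellarg($dest) . ' ' . escapeshellarg($url)); 
?> 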

You can also try using fopen(), fread() and fwrite(). That way you only ever download x bytes into memory at a time.
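
A minimal sketch of that approach, assuming allow_url_fopen is enabled (the URL, local filename, and 8192-byte chunk size are placeholders):

$chunksize = 8192; // bytes held in memory at any one time 
$remote = fopen('http://somehost.example/file.zip', 'rb'); // remote file as a read stream 
$local = fopen('file.zip', 'wb');                          // local file as a write stream 

if ($remote === false || $local === false) { 
    die('Could not open one of the files'); 
} 

while (!feof($remote)) { 
    // Read one chunk, write it to disk, then reuse the buffer on the next pass. 
    fwrite($local, fread($remote, $chunksize)); 
} 

fclose($remote); 
fclose($local); 

stream_copy_to_stream($remote, $local) would do effectively the same chunked copy in a single call, if you prefer that.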