2017-08-26

I need a Python SFTP client to download files from an SFTP server, and I started with Paramiko. Small files in the KB range download fine, but when I try to download a 600 MB file, it hangs indefinitely after downloading about 20 MB. I can't figure out what the problem is; increasing the window size didn't help either. Any help would be greatly appreciated!

import paramiko

# Read connection settings from the config section
host = config.getsafe(section, "host")
username = config.getsafe(section, "username")
port = config.getsafe(section, "port")
remote_dir = config.getsafe(section, "remote_dir")
download_dir = config.getsafe(section, "download_dir")
archive_dir = config.getsafe(section, "archive_dir") if config.has_option(section, "archive_dir") else None
password = config.getsafe(section, "password") if config.has_option(section, "password") else None
file_pattern = config.getsafe(section, "file_pattern") if config.has_option(section, "file_pattern") else "*"
passphrase = config.getsafe(section, "passphrase") if config.has_option(section, "passphrase") else None
gnupg_home = config.getsafe(section, "gnupg_home") if config.has_option(section, "gnupg_home") else None

# Connect and open an SFTP session
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname=host, port=int(port), username=username, password=password)

sftp = ssh.open_sftp()
sftp.sshclient = ssh

# Hangs after roughly 20 MB on large files
sftp.get("/SFTP/PL_DEV/test.dat", "C:/import/download/test.dat")
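For reference, paramiko's SFTPClient.get() also accepts a progress callback, which can show exactly how far the transfer gets before it stalls. A minimal sketch, reusing the sftp object from the snippet above:

def progress(transferred, total):
    # paramiko invokes this after each chunk with (bytes transferred so far, total bytes)
    print("%d of %d bytes" % (transferred, total))

sftp.get("/SFTP/PL_DEV/test.dat", "C:/import/download/test.dat", callback=progress)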

Do a packet capture (with Wireshark) to find out what is actually happening. Also try downloading the same file with a standalone SFTP client to see whether that works. –


A standalone SFTP client like FileZilla works perfectly – Ram

Answer


I did two things to solve a similar problem:

  1. Increase the window size - you said you already tried this; for me it helped go from a few tens of MB to about half a GB, but no further (see the sketch after the code block below for passing the window size directly when creating the SFTP client).

  2. Effectively disable rekeying - this may have security implications, but it helped me get a file of over a GB from an odd Windows SFTP server:

    import paramiko

    with paramiko.Transport((_SFTP['host'], 22)) as transport:
        # SFTP FIXES
        transport.default_window_size = paramiko.common.MAX_WINDOW_SIZE
        transport.packetizer.REKEY_BYTES = pow(2, 40)  # 1TB max, this is a security degradation!
        transport.packetizer.REKEY_PACKETS = pow(2, 40)  # 1TB max, this is a security degradation!
        # /SFTP FIXES

        transport.connect(username=_SFTP['user'], password=_SFTP['password'])
        with paramiko.SFTPClient.from_transport(transport) as sftp:
            listdir = sftp.listdir()
            # ...
            sftp.get(remotepath=filename, localpath=localpath)
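
As an alternative to setting default_window_size on the Transport (item 1 above), the window size can also be passed when creating the SFTP client. A minimal sketch, assuming the same _SFTP dict, filename, and localpath as in the code above:

    import paramiko

    with paramiko.Transport((_SFTP['host'], 22)) as transport:
        transport.connect(username=_SFTP['user'], password=_SFTP['password'])
        # window_size is an optional keyword argument of SFTPClient.from_transport
        with paramiko.SFTPClient.from_transport(
                transport, window_size=paramiko.common.MAX_WINDOW_SIZE) as sftp:
            sftp.get(remotepath=filename, localpath=localpath)

Note that this only covers the window size; if rekeying is the bottleneck, the REKEY settings from the snippet above still need to be applied on the transport.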
    

Let me add that with this approach the transfer still slowed down after ~2 GB :( –