
I am successfully authenticating with AWS and uploading files using the 'put_object' method on the Bucket object. Now I want to use the multipart API to do the same for large files. I found the accepted answers in these questions: How to save S3 object to a file using boto3 and Python Boto3 AWS multipart upload syntax.

But when I try to implement it, I get an 'unknown method' error. What am I doing wrong? My code is below. Thanks!

## Get an AWS Session
self.awsSession = Session(aws_access_key_id=accessKey,
                          aws_secret_access_key=secretKey,
                          aws_session_token=session_token,
                          region_name=region_type)

...

# Upload the file to S3
s3 = self.awsSession.resource('s3')
s3.Bucket('prodbucket').put_object(Key=fileToUpload, Body=data)  # WORKS
#s3.Bucket('prodbucket').upload_file(dataFileName, 'prodbucket', fileToUpload)  # DOESN'T WORK
#s3.upload_file(dataFileName, 'prodbucket', fileToUpload)  # DOESN'T WORK

Have you seen the new high-level file upload interface added to boto3? See https://boto3.readthedocs.org/en/latest/reference/customizations/s3.html#module-boto3.s3.transfer for details; it makes multipart uploads much easier. – garnaat
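As a minimal sketch of that high-level transfer interface (the client setup, bucket name, key, and file path here are placeholder assumptions, not from the original post):

import boto3
from boto3.s3.transfer import S3Transfer

# Placeholder client; reuse your existing authenticated session instead.
client = boto3.client('s3')
transfer = S3Transfer(client)

# upload_file switches to a multipart upload automatically for large files.
transfer.upload_file('/tmp/myfile.tar.gz', 'prodbucket', 'myfile.tar.gz')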

Answers


The upload_file method hasn't been ported to the bucket resource yet. For now you need to use the client object directly:

client = self.awsSession.client('s3') 
client.upload_file(...) 
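For illustration, a fleshed-out version of that call, reusing the variable names from the question plus an optional TransferConfig (the 8 MB threshold is an assumed value; boto3 applies sensible multipart defaults without it):

from boto3.s3.transfer import TransferConfig

# Assumed threshold: switch to multipart for files larger than 8 MB.
config = TransferConfig(multipart_threshold=8 * 1024 * 1024)

client = self.awsSession.client('s3')
client.upload_file(dataFileName, 'prodbucket', fileToUpload, Config=config)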

Thanks, this worked! – PhilBot


The Libcloud S3 wrapper transparently handles splitting and uploading all the parts for you.

Use the upload_object_via_stream method to do this:

from libcloud.storage.types import Provider
from libcloud.storage.providers import get_driver

# Path to a very large file you want to upload
FILE_PATH = '/home/user/myfile.tar.gz'

cls = get_driver(Provider.S3)
driver = cls('api key', 'api secret key')

container = driver.get_container(container_name='my-backups-12345')

# This method blocks until all the parts have been uploaded.
extra = {'content_type': 'application/octet-stream'}

with open(FILE_PATH, 'rb') as iterator:
    obj = driver.upload_object_via_stream(iterator=iterator,
                                          container=container,
                                          object_name='backup.tar.gz',
                                          extra=extra)

For the official documentation on S3 multipart uploads, see the AWS Official Blog.
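If you need the low-level multipart API that these helpers wrap, a minimal sketch with the boto3 client follows; the bucket, key, path, and 8 MB part size are assumptions for illustration:

import boto3

client = boto3.client('s3')
bucket, key, path = 'prodbucket', 'backup.tar.gz', '/home/user/myfile.tar.gz'  # hypothetical

# 1. Start the multipart upload and remember its id.
upload_id = client.create_multipart_upload(Bucket=bucket, Key=key)['UploadId']

parts = []
try:
    with open(path, 'rb') as f:
        part_number = 1
        while True:
            chunk = f.read(8 * 1024 * 1024)  # every part except the last must be >= 5 MB
            if not chunk:
                break
            # 2. Upload each part and keep its ETag for the completion manifest.
            resp = client.upload_part(Bucket=bucket, Key=key, UploadId=upload_id,
                                      PartNumber=part_number, Body=chunk)
            parts.append({'PartNumber': part_number, 'ETag': resp['ETag']})
            part_number += 1
    # 3. Ask S3 to assemble the parts into the final object.
    client.complete_multipart_upload(Bucket=bucket, Key=key, UploadId=upload_id,
                                     MultipartUpload={'Parts': parts})
except Exception:
    # Abort on failure so abandoned parts don't accrue storage charges.
    client.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=upload_id)
    raise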