I want to copy a file in an s3 bucket using python. How can I upload a file to a directory in an S3 bucket using boto?
For example: I have bucket name = test. And in the bucket, I have 2 folders named "dump" & "input". Now I want to copy a file from a local directory to the S3 "dump" folder using python... Can anyone help me?
Try this...

    import boto
    import boto.s3
    import sys
    from boto.s3.key import Key

    AWS_ACCESS_KEY_ID = ''
    AWS_SECRET_ACCESS_KEY = ''

    bucket_name = AWS_ACCESS_KEY_ID.lower() + '-dump'
    conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
    bucket = conn.create_bucket(bucket_name,
        location=boto.s3.connection.Location.DEFAULT)

    testfile = "replace this with an actual filename"
    print('Uploading %s to Amazon S3 bucket %s' % (testfile, bucket_name))

    def percent_cb(complete, total):
        sys.stdout.write('.')
        sys.stdout.flush()

    k = Key(bucket)
    k.key = 'my test file'
    k.set_contents_from_filename(testfile, cb=percent_cb, num_cb=10)
[Update] I am not a pythonist, so thanks for the heads-up about the import statements. Also, I would not recommend placing credentials inside your own source code. If you are running this inside AWS, use IAM credentials with instance profiles (http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2_instance-profiles.html), and to keep the same behaviour in your dev/test environment, use something like Hologram from AdRoll (https://github.com/AdRoll/hologram).
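As a minimal sketch of that advice (the helper name here is mine, not from the answer above): both boto and boto3 read the standard AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables on their own, so the credentials never need to appear in source code:

```python
import os

def get_aws_credentials():
    # boto and boto3 pick up these exact environment variable names
    # automatically; fetching them explicitly like this is only needed
    # if you want to pass them to a constructor yourself.
    return (os.environ.get("AWS_ACCESS_KEY_ID"),
            os.environ.get("AWS_SECRET_ACCESS_KEY"))
```

With the variables exported (or an instance profile attached), `boto.connect_s3()` can then be called with no arguments at all.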
I used this, and it is very simple to implement:
import tinys3
conn = tinys3.Connection('S3_ACCESS_KEY','S3_SECRET_KEY',tls=True)
f = open('some_file.zip','rb')
conn.upload('some_file.zip',f,'my_bucket')
I don't think this works for large files. I had to use this instead: http://docs.pythonboto.org/en/latest/s3_tut.html#storing-large-data – wordsforthewise 2016-10-12 03:10:49
That also led me to this fix: https://github.com/boto/boto/issues/2207#issuecomment-60682869 and this: http://stackoverflow.com/questions/5396932/why-are-no-amazon-s3-authentication-handlers-ready – wordsforthewise 2016-10-12 03:36:25
Since the tinys3 project has been abandoned, you should not use it. https://github.com/smore-inc/tinys3/issues/45 – 2018-01-27 12:24:46
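On the large-file point above: the linked boto tutorial handles big objects with multipart uploads, where the file is sent in parts of at least 5 MB (only the last part may be smaller). A sketch of the part-splitting side of that, with the actual boto calls assumed from the tutorial rather than shown here:

```python
import io

PART_SIZE = 5 * 1024 * 1024  # S3's minimum multipart part size

def iter_parts(fileobj, part_size=PART_SIZE):
    # Yield (part_number, chunk) pairs; part numbers start at 1,
    # which is what S3's multipart API expects.
    part_number = 1
    while True:
        chunk = fileobj.read(part_size)
        if not chunk:
            break
        yield part_number, chunk
        part_number += 1
```

With boto, each chunk would then go through `bucket.initiate_multipart_upload(...)`, `mp.upload_part_from_file(...)`, and `mp.complete_upload()` as the tutorial shows; boto3's `upload_file` does this splitting for you automatically.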
No need to make it so complicated:
    import boto
    import boto.s3.key

    s3_connection = boto.connect_s3()
    bucket = s3_connection.get_bucket('your bucket name')
    key = boto.s3.key.Key(bucket, 'some_file.zip')
    with open('some_file.zip', 'rb') as f:  # binary mode, or the upload can corrupt the archive
        key.send_file(f)
    from boto3.s3.transfer import S3Transfer
    import boto3

    # have all the variables populated which are required below
    client = boto3.client('s3', aws_access_key_id=access_key,
                          aws_secret_access_key=secret_key)
    transfer = S3Transfer(client)
    transfer.upload_file(filepath, bucket_name, folder_name + "/" + filename)
What is filepath, and what is folder_name + filename? It's confusing – colintobing 2017-08-03 02:41:22
@colintobing filepath is the path of the file on the cluster, and folder_name/filename is the naming convention you want inside the s3 bucket – 2017-08-29 11:47:51
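To make the comment above concrete: S3 has no real directories, so the "folder" is just a prefix on the key name. A small helper (the function name is mine, not from the answer) that builds such a key:

```python
import posixpath

def s3_key_for(folder_name, filename):
    # posixpath keeps forward slashes even on Windows, which is what
    # S3 key names use regardless of the local OS.
    return posixpath.join(folder_name, filename)
```

So `transfer.upload_file(filepath, bucket_name, s3_key_for('dump', 'upload.txt'))` would store the object under `dump/upload.txt` in the bucket.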
Wow, why are there 50 ways to do this... – 2018-01-24 10:33:21
This will also work:
    import os
    import boto
    import boto.s3.connection
    from boto.s3.key import Key

    try:
        conn = boto.s3.connect_to_region('us-east-1',
            aws_access_key_id='AWS-Access-Key',
            aws_secret_access_key='AWS-Secrete-Key',
            # host='s3-website-us-east-1.amazonaws.com',
            # is_secure=True,  # uncomment if you are not using ssl
            calling_format=boto.s3.connection.OrdinaryCallingFormat(),
        )

        bucket = conn.get_bucket('YourBucketName')
        key_name = 'FileToUpload'
        path = 'images/holiday'  # directory under which the file should get uploaded
        full_key_name = os.path.join(path, key_name)
        k = bucket.new_key(full_key_name)
        k.set_contents_from_filename(key_name)

    except Exception as e:
        print(str(e))
        print("error")
    import boto
    from boto.s3.key import Key

    AWS_ACCESS_KEY_ID = ''
    AWS_SECRET_ACCESS_KEY = ''
    END_POINT = ''  # eg. us-east-1
    S3_HOST = ''    # eg. s3.us-east-1.amazonaws.com
    BUCKET_NAME = 'test'
    FILENAME = 'upload.txt'
    UPLOADED_FILENAME = 'dumps/upload.txt'
    # include folders in file path. If it doesn't exist, it will be created

    s3 = boto.s3.connect_to_region(END_POINT,
                                   aws_access_key_id=AWS_ACCESS_KEY_ID,
                                   aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
                                   host=S3_HOST)

    bucket = s3.get_bucket(BUCKET_NAME)
    k = Key(bucket)
    k.key = UPLOADED_FILENAME
    k.set_contents_from_filename(FILENAME)
import boto3
s3 = boto3.resource('s3')
BUCKET = "test"
s3.Bucket(BUCKET).upload_file("your/local/file", "dump/file")
Can you explain this line? s3.Bucket(BUCKET).upload_file("your/local/file", "dump/file") – venkat 2018-03-06 12:44:39
@venkat "your/local/file" is a filepath such as "/home/file.txt" on the computer using python/boto, and "dump/file" is a key name to store the file under in the S3 Bucket. See: http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Bucket.upload_file – 2018-03-06 22:16:56
I would avoid the multiple import lines; it's not Pythonic. Move the import lines to the top, and for boto you can use from boto.s3.connection import S3Connection; conn = S3Connection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY); bucket = conn.create_bucket(bucketname...); bucket.new_key(keyname,...).set_contents_from_filename.... – cgseller 2015-06-29 22:51:15