How can I see what is inside an S3 bucket with boto3? (i.e., do an "ls")?
Doing the following:
import boto3
s3 = boto3.resource('s3')
my_bucket = s3.Bucket('some/path/')
returns:
s3.Bucket(name='some/path/')
How do I see its contents?
Current answer
First, create an S3 client object:
s3_client = boto3.client('s3')
Next, create variables to hold the bucket name and the folder. Note the trailing slash "/" after the folder name:
bucket_name = 'my-bucket'
folder = 'some-folder/'
Next, call s3_client.list_objects_v2 to get the metadata of the objects under that folder:
response = s3_client.list_objects_v2(
    Bucket=bucket_name,
    Prefix=folder
)
Finally, using each object's metadata, you can fetch the S3 object itself by calling the s3_client.get_object function:
for object_metadata in response['Contents']:
    object_key = object_metadata['Key']
    response = s3_client.get_object(
        Bucket=bucket_name,
        Key=object_key
    )
    object_body = response['Body'].read()
    print(object_body)
As you can see, the raw object content is returned as bytes by response['Body'].read(); decode it (e.g. object_body.decode('utf-8')) if you need a string.
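If you only want an "ls"-style view of the immediate "subfolders" and files under a prefix (rather than every key recursively), you can also pass a Delimiter. A minimal sketch reusing the s3_client, bucket_name and folder variables from above:
response = s3_client.list_objects_v2(
    Bucket=bucket_name,
    Prefix=folder,
    Delimiter='/'
)

# "Subfolders" directly under the prefix
for common_prefix in response.get('CommonPrefixes', []):
    print(common_prefix['Prefix'])

# Objects directly under the prefix
for object_metadata in response.get('Contents', []):
    print(object_metadata['Key'])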
Other answers
You can also do it this way (here s3 is a boto3 client, i.e. boto3.client('s3'), and s3_bucket_path is the bucket name):
csv_files = s3.list_objects_v2(Bucket=s3_bucket_path)
for obj in csv_files['Contents']:
    key = obj['Key']
Here is the solution:
import boto3

s3 = boto3.resource('s3')
BUCKET_NAME = 'Your S3 Bucket Name'
allFiles = s3.Bucket(BUCKET_NAME).objects.all()
for file in allFiles:
    print(file.key)
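As a side note, the objects collection paginates behind the scenes, so .all() keeps going past the 1000-key limit of a single list call. A small sketch (the bucket name is a placeholder) showing how to narrow or cap the listing with the same resource API:
import boto3

s3 = boto3.resource('s3')
bucket = s3.Bucket('Your S3 Bucket Name')  # placeholder bucket name

# Only keys under a given prefix
for obj in bucket.objects.filter(Prefix='some-folder/'):
    print(obj.key)

# Stop after the first 100 keys
for obj in bucket.objects.limit(100):
    print(obj.key)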
To handle large key listings (i.e. when the directory listing contains more than 1000 items), I used the following to accumulate the key values (i.e. file names) across multiple listings (thanks to Amelio above for the first line). The code is for Python 3 and is written as a function so that the return statements work:
from boto3 import client

def list_bucket_keys(bucket_name="my_bucket", prefix="my_key/sub_key/lots_o_files"):
    s3_conn = client('s3')  # type: BaseClient ## again assumes boto.cfg setup, assume AWS S3
    s3_result = s3_conn.list_objects_v2(Bucket=bucket_name, Prefix=prefix, Delimiter="/")

    if 'Contents' not in s3_result:
        # print(s3_result)
        return []

    file_list = []
    for key in s3_result['Contents']:
        file_list.append(key['Key'])
    print(f"List count = {len(file_list)}")

    while s3_result['IsTruncated']:
        continuation_key = s3_result['NextContinuationToken']
        s3_result = s3_conn.list_objects_v2(Bucket=bucket_name, Prefix=prefix, Delimiter="/", ContinuationToken=continuation_key)
        for key in s3_result['Contents']:
            file_list.append(key['Key'])
        print(f"List count = {len(file_list)}")

    return file_list
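Alternatively, boto3's built-in paginator handles the continuation token for you. A minimal sketch of the same listing (the bucket name and prefix are the same placeholders as above):
from boto3 import client

s3_conn = client('s3')
paginator = s3_conn.get_paginator('list_objects_v2')

file_list = []
# Each page is one list_objects_v2 response; the paginator follows the continuation token.
for page in paginator.paginate(Bucket="my_bucket", Prefix="my_key/sub_key/lots_o_files", Delimiter="/"):
    for obj in page.get('Contents', []):
        file_list.append(obj['Key'])
print(f"List count = {len(file_list)}")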
Running an AWS CLI command from a Lambda function is also a good option:
import subprocess
import logging

logger = logging.getLogger()
logger.setLevel(logging.INFO)

def run_command(command):
    command_list = command.split(' ')
    try:
        logger.info("Running shell command: \"{}\"".format(command))
        result = subprocess.run(command_list, stdout=subprocess.PIPE)
        logger.info("Command output:\n---\n{}\n---".format(result.stdout.decode('UTF-8')))
    except Exception as e:
        logger.error("Exception: {}".format(e))
        return False
    return True

def lambda_handler(event, context):
    run_command('/opt/aws s3 ls s3://bucket-name')
If you want to pass the ACCESS and SECRET keys explicitly (which you should not do, because it is insecure):
from boto3.session import Session

ACCESS_KEY = 'your_access_key'
SECRET_KEY = 'your_secret_key'

session = Session(aws_access_key_id=ACCESS_KEY,
                  aws_secret_access_key=SECRET_KEY)
s3 = session.resource('s3')
your_bucket = s3.Bucket('your_bucket')

for s3_file in your_bucket.objects.all():
    print(s3_file.key)
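A safer alternative is to keep the credentials out of the code entirely (e.g. in ~/.aws/credentials) and select them by profile. A small sketch, assuming a hypothetical profile named my-profile:
from boto3.session import Session

# Credentials come from ~/.aws/credentials or environment variables,
# so nothing sensitive is hard-coded.
session = Session(profile_name='my-profile')
s3 = session.resource('s3')

for s3_file in s3.Bucket('your_bucket').objects.all():
    print(s3_file.key)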