s3-wrapper is a wrapper around the S3-related functionality of AWS's boto3 package.
First, install the library:
pip install s3-wrapper
Next, set up credentials (e.g. in ~/.aws/credentials):
[default]
aws_access_key_id = YOUR_KEY
aws_secret_access_key = YOUR_SECRET
You should set the following values in environment variables:
- AWS_PROFILE_NAME: AWS profile name
- S3_BUCKET_NAME (Optional): Default bucket name
If S3_BUCKET_NAME is not set in environment variables, you must set the default bucket before using any utilities:
s3 = S3Utils()
s3.set_default_bucket('test_bucket')
You can use python-dotenv to load environment variables.
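For instance (a minimal stdlib-only sketch; the variable names mirror the ones above, and python-dotenv's load_dotenv() would simply populate os.environ first):

```python
import os

# If you use python-dotenv, call load_dotenv() here to populate os.environ.
# Read the configuration described above, with a fallback for the optional value.
profile = os.environ.get('AWS_PROFILE_NAME', 'default')
bucket = os.environ.get('S3_BUCKET_NAME')  # None means: call set_default_bucket() later
```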
Sets the default bucket for s3-related operations. Usage:
s3.set_default_bucket('bucket_name')
Assigns a new key to an object inside a bucket. The process involves creating a new object, copying the old object to the new one, and deleting the old object. Usage:
s3.move_object('directory/subdirectory1/file.json', 'directory/subdirectory2/file.json')
If you want to perform this operation on a bucket other than the default, use:
s3.move_object('directory/subdirectory1/file.json', 'directory/subdirectory2/file.json', 'bucket_name')
Copies the content of a source object to a new object key inside a bucket. Usage:
s3.copy_object('new_object_key', 'src_object_key')
If you want to perform this operation on a bucket other than the default, use:
s3.copy_object('new_object_key', 'src_object_key', 'bucket_name')
Creates a new object inside a bucket and sets its content/body. Usage:
import json
data = {
'message': 'Hello world',
'created_at': '2020-06-03 05:36:00'
}
formatted_data = json.dumps(data)
s3.create_object('key', formatted_data)
If you want to perform this operation on a bucket other than the default, use:
s3.create_object('key', formatted_data, 'bucket_name')
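As a sanity check (stdlib only, no S3 access needed), the json.dumps call above yields a plain string, and the stored body round-trips back to the original dict:

```python
import json

data = {
    'message': 'Hello world',
    'created_at': '2020-06-03 05:36:00'
}
formatted_data = json.dumps(data)

# The object body is just a string; parsing it recovers the original data.
assert isinstance(formatted_data, str)
assert json.loads(formatted_data) == data
```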
Uploads a file on disk storage as an object on S3. Usage:
import os

file_path = os.path.join('/tmp', 'subdirectory', 'response.json')
s3.upload_file('file_key', file_path)
If you want to perform this operation on a bucket other than the default, use:
file_path = os.path.join('/tmp', 'subdirectory', 'response.json')
s3.upload_file('file_key', file_path, 'bucket_name')
Deletes an object from a bucket on S3. Usage:
s3.delete_object('key')
If you want to perform this operation on a bucket other than the default, use:
s3.delete_object('key', 'bucket_name')
Deletes objects matching the supplied keys from a bucket. Usage:
s3.delete_objects(['key1', 'key2', 'key3'])
If you want to perform this operation on a bucket other than the default, use:
s3.delete_objects(['key1', 'key2', 'key3'], 'bucket_name')
Finds files/objects matching the given prefix. This is helpful if you want to get objects in a specific (hypothetical) directory. Usage:
s3.find_files_with_prefix('directory/subdirectory/prefix')
If you want to perform this operation on a bucket other than the default, use:
s3.find_files_with_prefix('directory/subdirectory/prefix', 'bucket_name')
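To illustrate the idea (hypothetical keys, no S3 access needed): S3 keys form a flat namespace, so a prefix search is effectively a string-prefix match that simulates directories:

```python
# Hypothetical object keys in a bucket.
keys = [
    'directory/subdirectory/report.json',
    'directory/subdirectory/report.csv',
    'directory/other/report.json',
]

# A prefix search keeps only the keys "inside" the simulated directory.
matches = [k for k in keys if k.startswith('directory/subdirectory/')]
```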
Returns True if a file exists in a given bucket. Usage:
exists = s3.file_exists('object_key')
If you want to perform this operation on a bucket other than the default, use:
exists = s3.file_exists('object_key', 'bucket_name')
Generates a presigned URL for an object in a bucket; the URL expires after the given number of seconds. Usage:
url = s3.generate_presigned_url('object_key', 3600)
If you want to perform this operation on a bucket other than the default, use:
url = s3.generate_presigned_url('object_key', 3600, 'bucket_name')
Downloads a file from a bucket to a given path on disk. Usage:
s3.download_file('object_key', '/home/directory/path.json')
If you want to perform this operation on a bucket other than the default, use:
s3.download_file('object_key', '/home/directory/path.json', 'bucket_name')