
s3hive

A tool built on top of boto3 that allows you to easily manage your S3 buckets.


Overview

s3hive is a Python tool that provides a high-level interface for working with S3 buckets. With it, you can easily perform common operations such as creating and deleting buckets, listing buckets and their contents, and uploading, downloading, and deleting files.

This tool uses the popular boto3 library to interact with the S3 API, making it simple and intuitive to use.

s3hive is designed to be easy to use, with a simple and consistent API that abstracts away many of the complexities of working with S3 buckets. Whether you're a seasoned developer or just getting started, s3hive can help you streamline your S3 operations and save time.

Features

  • Create a new S3 bucket
  • Delete an existing S3 bucket
  • Generate a presigned URL to share an S3 object
  • List all S3 buckets
  • Upload files to an S3 bucket
  • Download files from an S3 bucket
  • List files in an S3 bucket
  • Delete files from an S3 bucket

This tool is a thin wrapper around the boto3 library that provides a simple interface for managing your S3 buckets.

Getting Started

Installation

You can install s3hive using pip:

$ pip install s3hive

Usage

Here's an example of how to use s3hive to list all your S3 buckets:

import os

import s3hive as s3

# Read the endpoint and credentials from the environment.
ENDPOINT_URL = os.environ.get('ENDPOINT_URL')
REGION = os.environ.get('REGION')
AWS_ACCESS_KEY_ID = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')

# A Bucket instance wraps a boto3 S3 client for a single endpoint.
bucket = s3.Bucket(
    endpoint_url=ENDPOINT_URL,
    region=REGION,
    aws_access_key_id=AWS_ACCESS_KEY_ID,
    aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
)

# Returns a list of dictionaries with each bucket's name and creation date.
buckets = bucket.list_buckets()

print(buckets)

# Output:
# [{
#      'name': 'bucket1',
#      'creation_date': datetime.datetime(2020, 5, 1, 12, 0, 0, tzinfo=tzutc())
# }]
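
The same Bucket instance exposes the other operations listed under Methods below. A minimal sketch continuing from the example above; the bucket name 'bucket1' and the file paths are illustrative, not part of the library:

# Upload a local file; if key is omitted, the file name is used as the object key.
bucket.upload('bucket1', './report.pdf', key='reports/report.pdf')

# Generate a presigned URL for the object, valid for one hour.
url = bucket.create_presigned_url('bucket1', 'reports/report.pdf', expiration=3600)
print(url)

# Download the object into the current directory.
bucket.download('bucket1', 'reports/report.pdf', local_dir='.')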

For more examples and detailed documentation, please visit our GitHub repository.

Methods

_get_client()
    Get the S3 client. Returns a boto3 client object for the S3 service.

create_bucket(bucket: str, acl: str = "private")
    Create an S3 bucket in the configured region. bucket is the name of the bucket to create, and acl is the access control list to apply. Returns True if the bucket was created successfully, or raises an exception if an error occurs.

delete_bucket(bucket: str)
    Delete an S3 bucket. bucket is the name of the bucket to delete. Returns True if the bucket was deleted successfully, or raises an exception if an error occurs.

list_buckets(names_only: bool = False)
    List all buckets in the S3 account. If names_only is True, return only the bucket names; otherwise, return a list of dictionaries, each containing a bucket's name and creation date. Raises an exception if an error occurs.

list_objects(bucket: str, keys_only: bool = False)
    List all objects in the specified bucket. If keys_only is True, return only the object keys; otherwise, return a list of dictionaries, each containing an object's key, size, and last modified date. Raises an exception if an error occurs.

create_presigned_url(bucket: str, key: str, expiration: int = 3600)
    Generate a presigned URL to share an S3 object. bucket is the name of the bucket containing the object, key is the object key, and expiration is the time in seconds for which the URL remains valid. Returns the presigned URL as a string, or raises an exception if an error occurs.

upload(bucket: str, file_name: str, key: str = None, extraArgs: dict = None, filesize: int = None)
    Upload an object to an S3 bucket. bucket is the name of the bucket to upload to, file_name is the path to the file to upload, and key is the S3 object name (if not specified, file_name is used). extraArgs is a dictionary of extra arguments that may be passed to the S3 API. Returns True if the file was uploaded successfully, or raises an exception if an error occurs.

download(bucket: str, key: str, local_dir: str = ROOT_DIR)
    Download an object from an S3 bucket to a local directory. bucket is the name of the bucket containing the object, key is the S3 object key, and local_dir is the local directory to download the file to (if local_dir is not provided, the object is stored in the root folder). Returns True if the file was downloaded successfully, or raises an exception if an error occurs.

delete(bucket: str, key: str)
    Delete an object from an S3 bucket. bucket is the name of the bucket containing the object, and key is the object key. Returns True if the object was deleted successfully, or raises an exception if an error occurs.
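
Taken together, these methods cover a bucket's full lifecycle. A minimal sketch, assuming valid credentials are set in the environment; the bucket name 'my-new-bucket' and the file 'data.csv' are illustrative:

import os

import s3hive as s3

bucket = s3.Bucket(
    endpoint_url=os.environ.get('ENDPOINT_URL'),
    region=os.environ.get('REGION'),
    aws_access_key_id=os.environ.get('AWS_ACCESS_KEY_ID'),
    aws_secret_access_key=os.environ.get('AWS_SECRET_ACCESS_KEY'),
)

# Create a private bucket and upload a file into it.
bucket.create_bucket('my-new-bucket', acl='private')
bucket.upload('my-new-bucket', 'data.csv', key='data/data.csv')

# List the keys we just wrote (expected: ['data/data.csv']).
print(bucket.list_objects('my-new-bucket', keys_only=True))

# Clean up: delete the object first, then the bucket itself.
bucket.delete('my-new-bucket', key='data/data.csv')
bucket.delete_bucket('my-new-bucket')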

License

s3hive is licensed under the MIT License.
