
CloudSpark is a powerful Python package designed to simplify the management of AWS S3 and Lambda services. Whether you're working on the frontend or backend, CloudSpark provides an intuitive interface to generate presigned URLs and handle file uploads seamlessly.

Project description

CloudSpark

cloudspark is a Python package that provides a convenient way to manage AWS S3 buckets and generate presigned URLs using boto3. It supports bucket creation, CORS management, policy setting, and presigned URL generation.

Features

  • S3 Management: Effortlessly manage your S3 buckets, objects, and file uploads with built-in methods.

  • Presigned URL Generation: Generate secure presigned URLs for your S3 objects, enabling users to upload files directly from the frontend without exposing your credentials.

  • Seamless Integration: Designed to work smoothly with both frontend and backend applications, making file uploads and cloud function management more accessible.

This repository contains several components. For more details on each component, refer to the respective README files.

Installation

You can install cloudspark via pip. Make sure you have boto3 installed as well.

pip install cloudspark 

Usage

Importing the Library

from cloudspark import S3Connection

Initializing the Connection

Create an instance of S3Connection by providing your AWS credentials and region name:

s3_conn = S3Connection(access_key='YOUR_ACCESS_KEY',
                       secret_access_key='YOUR_SECRET_ACCESS_KEY',
                       region_name='YOUR_REGION_NAME')
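Hard-coding credentials as above is convenient for examples but risky in real code. One hedged alternative is to read them from environment variables; the variable names below follow the common AWS convention and are an assumption, not something CloudSpark itself requires:

```python
import os

# Sketch: pull credentials from the environment instead of hard-coding them.
# AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY / AWS_DEFAULT_REGION are the
# conventional AWS variable names (assumed here, not mandated by cloudspark).
access_key = os.environ.get('AWS_ACCESS_KEY_ID', '')
secret_access_key = os.environ.get('AWS_SECRET_ACCESS_KEY', '')
region_name = os.environ.get('AWS_DEFAULT_REGION', 'us-east-1')

# These values can then be passed to S3Connection(...) as shown above.
```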

Connecting to a Bucket

Establish a connection to an S3 bucket:

s3_bucketclient = s3_conn.connect(bucket_name='your-bucket-name')

Returns an S3 client instance.

Getting the Current Client Instance

Return the S3 client instance held by the connection:

s3_bucketclient = s3_conn.get_instance()

Creating a Bucket

Create an S3 bucket with the given name:

s3_client = s3_conn.connect()
s3_bucketclient = s3_conn.create_s3bucket(bucket_name='new-bucket-name')

Set Bucket CORS

Set the CORS configuration for the connected bucket:

# If CORSRules is None, a default rule allowing all origins and methods is applied.
s3_conn.set_bucket_cors()

# CORSRules: a list of dictionaries containing CORS rules.
cors_rules = [
    {
        'AllowedHeaders': ['*'],
        'AllowedMethods': ['GET', 'POST'],
        'AllowedOrigins': ['*'],
        'ExposeHeaders': [],
        'MaxAgeSeconds': 3000
    }
]

s3_conn.set_bucket_cors(CORSRules=cors_rules)

Get Bucket CORS

Retrieve the CORS configuration for the connected bucket:

cors_config = s3_conn.get_bucket_cors()

Delete Bucket CORS

Delete the CORS configuration from the connected bucket:

s3_conn.delete_bucket_cors()

Set Bucket Policy

Set or update the bucket policy for the connected bucket:

# If no policy is given, a default public read policy is applied.
s3_conn.set_bucket_policy()

# The policy may be a JSON string or a dictionary.
bucket_name = 'your-bucket-name'
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{bucket_name}/*"
        }
    ]
}

s3_conn.set_bucket_policy(bucket_policy=policy)
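Since the policy is accepted as either a dictionary or a JSON string, the dictionary form can be serialized with the standard library when the string form is preferred. A minimal sketch (the policy below is illustrative only, not a recommended configuration):

```python
import json

# Minimal illustrative policy; a real one would carry Statement entries
# like the public-read example above.
policy = {"Version": "2012-10-17", "Statement": []}

# json.dumps produces the JSON-string form of the same policy.
policy_json = json.dumps(policy)
```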

Delete Bucket Policy

Delete the bucket policy from the connected bucket:

s3_conn.delete_bucket_policy()

List User Policies

List inline policies for an IAM user:

policies = s3_conn.list_user_policies(UserName='user-name')

Block or Allow Public Access

Block or allow public access to the bucket:

s3_conn.public_access(block=True)  # Block public access

s3_conn.public_access(block=False) # Allow public access

Generate Presigned Create URL

Generate a presigned URL for creating an object:

response = s3_conn.presigned_create_url(
    object_name='object_name',
    params={'key': 'value'},
    fields={'field': 'value'},
    conditions=[{'condition': 'value'}],
    expiration=3600
)

  • object_name: The name of the object to be created in the S3 bucket.

  • params: (Optional) Additional request parameters to include in the presigned URL.

  • fields: (Optional) Pre-filled form fields to include in the presigned URL.

  • conditions: (Optional) Conditions to include in the presigned URL.

  • expiration: (Optional) Time in seconds for which the presigned URL remains valid. Defaults to 3600 seconds (1 hour).
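The exact shape of the returned value is not documented here; assuming it matches boto3's generate_presigned_post output (a dict with 'url' and 'fields' keys), the upload form can be assembled client-side. S3 expects every presigned field to precede the file content in the POST body, so this sketch keeps the fields first and appends the file last:

```python
# Sketch under the assumption that `response` looks like
# {'url': 'https://...', 'fields': {'key': ..., 'policy': ..., ...}}.
def build_post_form(response, file_bytes):
    # Copy the presigned fields, then add the file part last: S3 rejects
    # POSTs where form fields appear after the file content.
    form = dict(response['fields'])
    form['file'] = file_bytes
    return response['url'], form

# Illustrative response dict, not real presigned data.
url, form = build_post_form(
    {'url': 'https://your-bucket-name.s3.amazonaws.com',
     'fields': {'key': 'object_name', 'policy': 'base64-policy'}},
    b'file bytes')
```

An HTTP client (or a browser form) would then POST `form` to `url` as multipart/form-data.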

Generate Presigned Get URL

Generate a presigned URL for accessing an object:

response = s3_conn.presigned_get_url(object_name='object_name', expiration=3600)

Upload an Object

Uploads a file to the connected S3 bucket:

response = s3_conn.upload_object(file=file, key_name="object_name")

# example:
with open('file_name', "rb") as file_obj:
    s3_conn.upload_object(file=file_obj, key_name="object_name")

  • file: Bytes of the file to upload.

  • key_name: S3 object name (e.g., 'folder/filename.txt').

Get an Object

Retrieves an object from the connected S3 bucket and returns the object metadata:

key_object = s3_conn.get_object(key_name="object_name")

Delete an Object

Deletes an object from the connected S3 bucket:

key_object = s3_conn.delete_object(key_name="object_name")

List Objects

Lists objects in the connected S3 bucket:

key_object = s3_conn.get_objects()
key_object = s3_conn.get_objects(only_objects=True)
key_object = s3_conn.get_objects(only_keys=True)

  • only_objects: If True, returns a list of object metadata (excluding keys).

  • only_keys: If True, returns a list of object keys.
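When only_keys=True yields plain key strings, further narrowing can be done client-side with an ordinary list comprehension. A sketch with illustrative keys (not real bucket contents):

```python
# Illustrative keys, e.g. what get_objects(only_keys=True) might return.
keys = ['logs/2024/a.txt', 'logs/2024/b.txt', 'images/c.png']

# Keep only the objects under the logs/ "folder" (key prefix).
log_keys = [k for k in keys if k.startswith('logs/')]
```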


Download files


Source Distribution

cloudspark-1.0.10.tar.gz (10.0 kB)

Built Distribution

cloudspark-1.0.10-py3-none-any.whl (9.2 kB)

File details

Details for the file cloudspark-1.0.10.tar.gz.

File metadata

  • Download URL: cloudspark-1.0.10.tar.gz
  • Upload date:
  • Size: 10.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.4

File hashes

Hashes for cloudspark-1.0.10.tar.gz
Algorithm Hash digest
SHA256 793dbd8c3696d06f8ef899c73e1b519c94440a56012f4ebe4d1bab2de3d2200c
MD5 ac7633b6f565e59b74525d5952dfb457
BLAKE2b-256 ddd0e3e8569343983462d599758d045ebc1cf2771b6c3559232aafd9fc5b7d4c


File details

Details for the file cloudspark-1.0.10-py3-none-any.whl.

File metadata

  • Download URL: cloudspark-1.0.10-py3-none-any.whl
  • Upload date:
  • Size: 9.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.4

File hashes

Hashes for cloudspark-1.0.10-py3-none-any.whl
Algorithm Hash digest
SHA256 fa159684b14635b79238e8cb213f347094fb6ce6bed1f3b311e318c780c70a84
MD5 b791bbab8a9c9e1f0f2d5868dedf7d04
BLAKE2b-256 041545871ebe6cb8f3772633c9645a701a9f3073c7c104082cf7bec1a2e2b570

