handtruck

Async S3 client using httpx & anyio. A simple module for putting and getting objects from Amazon S3-compatible endpoints.

Installation

pip install handtruck

Usage

from http import HTTPStatus

from httpx import AsyncClient
from handtruck import S3Client


client = S3Client(
    url="http://s3-url",
    client=AsyncClient(),
    access_key_id="key-id",
    secret_access_key="hackme",
    region="us-east-1"
)

# Upload str object to bucket "bucket" and key "str"
resp = await client.put("bucket/str", "hello, world")
assert resp.status_code == HTTPStatus.OK

# Upload bytes object to bucket "bucket" and key "bytes"
resp = await client.put("bucket/bytes", b"hello, world")
assert resp.status_code == HTTPStatus.OK

# Upload an AsyncIterable to bucket "bucket" and key "iterable"
async def gen():
    yield b'some bytes'

resp = await client.put("bucket/iterable", gen())
assert resp.status_code == HTTPStatus.OK

# Upload file to bucket "bucket" and key "file"
resp = await client.put_file("bucket/file", "/path_to_file")
assert resp.status_code == HTTPStatus.OK

# Check object exists using bucket+key
resp = await client.head("bucket/key")
assert resp.status_code == HTTPStatus.OK

# Get object by bucket+key
resp = await client.get("bucket/key")
data = resp.content

# Delete object using bucket+key
resp = await client.delete("bucket/key")
assert resp.status_code == HTTPStatus.NO_CONTENT

# List objects by prefix
async for result in client.list_objects_v2("bucket/", prefix="prefix"):
    # Each result is a list of metadata objects, one for each object
    # stored in the bucket under the given prefix.
    do_work(result)
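
These snippets use await at the top level, so in a real script they need to run inside a coroutine. A minimal sketch of a runnable entry point (using asyncio.run; anyio.run works the same way), reusing the client configuration from above:

import asyncio
from http import HTTPStatus

from httpx import AsyncClient
from handtruck import S3Client


async def main():
    # AsyncClient is an async context manager, so it is closed cleanly on exit
    async with AsyncClient() as http_client:
        client = S3Client(
            url="http://s3-url",
            client=http_client,
            access_key_id="key-id",
            secret_access_key="hackme",
            region="us-east-1",
        )
        resp = await client.put("bucket/str", "hello, world")
        assert resp.status_code == HTTPStatus.OK


asyncio.run(main())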

The bucket may be specified as a subdomain or as part of the object name:

import httpx
from handtruck import S3Client


client = S3Client(url="http://bucket.your-s3-host",
                  client=httpx.AsyncClient())
resp = await client.put("key", gen())
...

client = S3Client(url="http://your-s3-host",
                  client=httpx.AsyncClient())
resp = await client.put("bucket/key", gen())
...

client = S3Client(url="http://your-s3-host/bucket",
                  client=httpx.AsyncClient())
resp = await client.put("key", gen())
...

Auth may be specified with keyword arguments or in the URL:

import httpx
from handtruck import S3Client

client_credentials_as_kw = S3Client(
    url="http://your-s3-host",
    access_key_id="key_id",
    secret_access_key="access_key",
    client=httpx.AsyncClient(),
)

client_credentials_in_url = S3Client(
    url="http://key_id:access_key@your-s3-host",
    client=httpx.AsyncClient(),
)

Credentials

By default, S3Client collects credentials from every available source, in order: keyword arguments such as access_key_id= and secret_access_key=, then the username and password embedded in the url argument, then environment variables, and finally the config file.

You can pass credentials explicitly using the handtruck.credentials module.
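
For example, a minimal sketch of passing credentials explicitly; the StaticCredentials name and the credentials= keyword are assumptions for illustration, not an API confirmed by this README:

import httpx
from handtruck import S3Client
# NOTE: class name assumed for illustration; check handtruck.credentials
# for the names it actually exports.
from handtruck.credentials import StaticCredentials

credentials = StaticCredentials(
    access_key_id="key_id",
    secret_access_key="access_key",
)

# NOTE: the credentials= keyword is likewise an assumption.
client = S3Client(
    url="http://your-s3-host",
    credentials=credentials,
    client=httpx.AsyncClient(),
)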

Parallel Transfer

Parallel transfer is fully supported: multipart upload is used for uploads and the Range header for downloads.

import httpx
from handtruck import S3Client


client = S3Client(url="http://your-s3-host", client=httpx.AsyncClient())

# Upload a local file using multipart upload with 8 parallel workers
await client.put_file_multipart(
    "test/bigfile.csv",
    headers={
        "Content-Type": "text/csv",
    },
    workers_count=8,
)

# Download dump/bigfile.csv to a local file using Range requests
# with 8 parallel workers
await client.get_file_parallel(
    "dump/bigfile.csv",
    "/home/user/bigfile.csv",
    workers_count=8,
)

