noriastore
Configurable object storage client for S3-compatible providers, with first-class support for AWS S3 and Cloudflare R2.
The package gives you a single Python API for:
- storing objects
- reading object metadata
- deleting objects
- creating presigned upload and download URLs
- generating public URLs
- customizing key resolution and URL generation
- normalizing storage failures into one error type
Install
pip install noriastore
Python requirement: >=3.11
Main Exports
from noriastore import (
DEFAULT_DOWNLOAD_EXPIRES_IN,
DEFAULT_R2_REGION,
DEFAULT_S3_REGION,
DEFAULT_UPLOAD_EXPIRES_IN,
MAX_PRESIGN_EXPIRES_IN,
DeleteObjectResult,
HeadObjectResult,
PresignedRequest,
PutObjectResult,
ResolvedStoragePublicUrlInput,
StorageClient,
StorageError,
StorageObjectDescriptor,
StorageOperationContext,
create_storage_client,
join_storage_key,
)
Quick Start
from noriastore import StorageClient
storage = StorageClient(
bucket="documents",
region="eu-west-1",
key_prefix="tenant-a",
public_base_url="https://cdn.example.com",
)
result = storage.put_object(
key=["invoices", "march-2026.pdf"],
body=b"hello",
content_type="application/pdf",
metadata={"source": "admin"},
)
upload = storage.create_presigned_upload_url(
key=["uploads", "avatar.png"],
content_type="image/png",
)
Constructor
storage = StorageClient(
bucket="documents",
provider="s3",
region="eu-west-1",
endpoint=None,
account_id=None,
credentials=None,
public_base_url="https://cdn.example.com",
key_prefix="tenant-a",
force_path_style=None,
url_style=None,
default_metadata={"source": "api"},
default_tags={"project": "noria"},
default_content_type="application/octet-stream",
default_cache_control=None,
default_content_disposition=None,
default_content_encoding=None,
default_content_language=None,
default_upload_expires_in=900,
default_download_expires_in=3600,
client=None,
presign_url=None,
s3_client_config=None,
resolve_key=None,
build_public_url=None,
)
Constructor Options
- bucket: required default bucket
- provider: "s3" or "r2", default "s3"
- region: optional; defaults depend on provider
- endpoint: optional explicit S3-compatible endpoint
- account_id: optional; used to derive the R2 endpoint when provider="r2"
- credentials: optional mapping with access_key_id, secret_access_key, session_token
- public_base_url: optional base URL used when building public URLs
- key_prefix: optional prefix prepended to every key
- force_path_style: legacy-style addressing switch
- url_style: explicit addressing style, "path" or "virtual-hosted"
- default_metadata: default object metadata merged into uploads
- default_tags: default object tags merged into uploads
- default_content_type, default_cache_control, default_content_disposition, default_content_encoding, default_content_language: default content headers applied to uploads
- default_upload_expires_in: default 900
- default_download_expires_in: default 3600
- client: optional prebuilt boto S3 client
- presign_url: optional custom presign function
- s3_client_config: optional boto client options
- resolve_key: optional hook for custom key rewriting
- build_public_url: optional hook for custom public URL generation
There is also a convenience alias:
from noriastore import create_storage_client
storage = create_storage_client(bucket="documents")
Provider Defaults
AWS S3 Defaults
- provider="s3"
- default region: us-east-1
- default URL style: virtual-hosted
Cloudflare R2 Defaults
- provider="r2"
- default region: auto
- default URL style: path
- if account_id is set and endpoint is not, the endpoint becomes https://{account_id}.r2.cloudflarestorage.com
Key Normalization
Keys can be passed as:
- a plain string
- a list or tuple of string segments
- nested sequences of string segments
Use join_storage_key() when you want the same normalization outside the client:
from noriastore import join_storage_key
key = join_storage_key(" invoices/ ", ["2026", "/march/"], "statement.pdf")
# invoices/2026/march/statement.pdf
Normalization rules:
- strips surrounding whitespace from each segment
- strips leading and trailing / from each segment
- joins segments with /
- ignores non-string nested values
- raises TypeError when the final key is empty for an operation that requires one
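The rules above can be sketched as a standalone helper. This is an illustrative reimplementation for clarity, not the packaged join_storage_key:

```python
def join_key(*parts):
    """Illustrative sketch of the key-normalization rules described above."""
    segments = []

    def walk(part):
        if isinstance(part, str):
            cleaned = part.strip().strip("/")
            if cleaned:
                segments.append(cleaned)
        elif isinstance(part, (list, tuple)):
            for item in part:
                walk(item)
        # non-string, non-sequence values are ignored

    for part in parts:
        walk(part)
    return "/".join(segments)

print(join_key(" invoices/ ", ["2026", "/march/"], "statement.pdf"))
# invoices/2026/march/statement.pdf
```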
Operations
put_object()
result = storage.put_object(
key="exports/data.json",
body='{"ok": true}',
metadata={"source": "dashboard"},
tags={"env": "prod", "kind": "report"},
content_type="application/json",
cache_control="public, max-age=300",
content_disposition="inline",
content_encoding="gzip",
content_language="en",
content_md5=None,
expires=None,
public_url=True,
command_input=None,
)
put_object() returns a PutObjectResult with:
- bucket
- key
- provider
- public_url
- etag
- version_id
- checksum_crc32
- checksum_crc32c
- checksum_sha1
- checksum_sha256
Behavior:
- key_prefix is applied before the request is built
- resolve_key runs after prefixing
- metadata and tags merge defaults with per-call values
- explicit method arguments override command_input
- command_input overrides constructor defaults
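As a rough model of that override order (an assumption inferred from the list above, not the library's actual internals):

```python
def resolve_upload_fields(defaults, command_input, explicit):
    """Merge upload inputs with precedence: defaults < command_input < explicit."""
    merged = dict(defaults)
    merged.update(command_input or {})
    # explicit method arguments win, but unset (None) arguments do not override
    merged.update({k: v for k, v in explicit.items() if v is not None})
    return merged

fields = resolve_upload_fields(
    defaults={"ContentType": "application/octet-stream"},
    command_input={"ContentType": "text/plain", "ACL": "public-read"},
    explicit={"ContentType": "application/json"},
)
```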
head_object()
result = storage.head_object(
key="images/logo.png",
not_found="null",
public_url=True,
command_input=None,
)
head_object() returns HeadObjectResult | None.
HeadObjectResult includes:
- bucket
- key
- provider
- public_url
- exists
- etag
- version_id
- last_modified
- expires_at
- content_length
- content_type
- cache_control
- content_disposition
- content_encoding
- content_language
- metadata
- raw
not_found behavior:
- not_found="null" returns None
- not_found="error" raises StorageError
object_exists()
exists = storage.object_exists(key="images/logo.png")
This is a convenience wrapper over head_object(..., not_found="null", public_url=False).
delete_object()
result = storage.delete_object(
key="private/report.pdf",
public_url=False,
command_input=None,
)
Returns DeleteObjectResult with:
- bucket
- key
- provider
- public_url
- version_id
- delete_marker
- raw
create_presigned_upload_url()
request = storage.create_presigned_upload_url(
key="avatars/user-1.png",
expires_in=600,
metadata={"uploadedBy": "admin"},
tags={"kind": "avatar"},
content_type="image/png",
command_input={"ACL": "public-read"},
)
Returns PresignedRequest with:
- bucket
- key
- provider
- public_url
- method
- url
- headers
- expires_in
- expires_at
Upload requests always return method == "PUT".
The returned headers include any upload headers that the signed request expects, including:
- standard content headers
- x-amz-meta-* metadata headers
- ACL and encryption headers from command_input
- checksum headers from command_input
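To send the signed upload with only the standard library, you might build an HTTP PUT from the returned fields. A sketch; the presigned dict below is a hypothetical stand-in for a real PresignedRequest:

```python
from urllib import request as urlreq

def build_upload_request(presigned, body: bytes) -> urlreq.Request:
    """Turn a presigned upload description into a urllib Request (not sent here)."""
    req = urlreq.Request(presigned["url"], data=body, method=presigned["method"])
    for name, value in presigned["headers"].items():
        req.add_header(name, value)
    return req

# Hypothetical values standing in for a real PresignedRequest:
presigned = {
    "method": "PUT",
    "url": "https://bucket.s3.amazonaws.com/avatars/user-1.png?X-Amz-Signature=...",
    "headers": {"Content-Type": "image/png", "x-amz-meta-uploadedBy": "admin"},
}
req = build_upload_request(presigned, b"\x89PNG...")
# urllib.request.urlopen(req) would perform the actual upload
```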
create_presigned_download_url()
request = storage.create_presigned_download_url(
key=["reports", "march report.pdf"],
expires_in=60,
)
Returns a PresignedRequest with:
- method == "GET"
- empty headers
- expires_in and expires_at
create_public_url()
url = storage.create_public_url("images/logo.png")
Public URL generation uses this precedence:
1. build_public_url hook
2. public_base_url
3. explicit endpoint
4. built-in AWS S3 URL rules
5. built-in R2 URL rules when enough information exists
If a storage operation sets public_url=True but URL generation fails, the operation still succeeds and returns public_url=None.
Calling create_public_url() directly is stricter: failures are wrapped as StorageError.
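The precedence can be modeled with a simplified resolver. This is an illustration of the fallback order, not the library's code, and it omits the built-in provider rules:

```python
def resolve_public_url(key, *, build_public_url=None, public_base_url=None, endpoint=None):
    """Simplified precedence: hook first, then base URL, then endpoint.
    Built-in S3/R2 rules (omitted here) would come after these."""
    if build_public_url is not None:
        return build_public_url(key)
    if public_base_url is not None:
        return f"{public_base_url.rstrip('/')}/{key}"
    if endpoint is not None:
        return f"{endpoint.rstrip('/')}/{key}"
    return None  # operations degrade to public_url=None at this point

url = resolve_public_url("images/logo.png", public_base_url="https://cdn.example.com/")
# https://cdn.example.com/images/logo.png
```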
Customization Hooks
resolve_key
Use resolve_key to transform every normalized key before requests are sent:
from noriastore import StorageClient
storage = StorageClient(
bucket="documents",
key_prefix=["tenant-a", "uploads"],
resolve_key=lambda key, ctx: f"v1/{key}",
)
The hook receives:
- key: the normalized key, after key_prefix has been applied
- ctx: a StorageOperationContext
Available StorageOperationContext fields:
- operation
- bucket
- provider
build_public_url
Use build_public_url when the built-in URL rules do not match your CDN or proxy layout:
storage = StorageClient(
bucket="assets",
build_public_url=lambda resolved: (
f"https://cdn.example.com/{resolved.provider}/{resolved.bucket}/{resolved.key}"
),
)
The hook receives ResolvedStoragePublicUrlInput with:
- bucket
- key
- provider
- region
- endpoint
- account_id
- url_style
- public_base_url
client
Inject a custom boto client when you want full control over transport, credentials, or tests:
storage = StorageClient(bucket="assets", client=my_boto_client)
presign_url
Inject a custom presigner when you need to route signing through your own code:
storage = StorageClient(
bucket="assets",
presign_url=lambda client, operation, params, expires_in: "https://signed.example.com",
client=my_boto_client,
)
s3_client_config
Pass boto session.client("s3", ...) options through s3_client_config.
Use this for lower-level client options such as retry configuration, endpoint options, or a custom botocore.config.Config.
Metadata, Tags, and Command Overrides
Per-call upload inputs are merged with this precedence (lowest to highest):
1. constructor defaults
2. command_input
3. explicit method arguments
Examples:
- default_metadata={"source": "api"} merged with metadata={"source": "dashboard"} gives {"source": "dashboard"}
- default_tags={"project": "noria"} merged with tags={"env": "prod"} includes both
- content_type="image/png" overrides both command_input["ContentType"] and default_content_type
Tags are URL encoded into the S3 Tagging request field.
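In Python terms, that encoding is plain query-string encoding of the tag mapping (a sketch of the general idea, not the package's exact code path):

```python
from urllib.parse import urlencode

tags = {"env": "prod", "kind": "report"}
tagging = urlencode(tags)  # value suitable for the S3 "Tagging" request field
print(tagging)  # env=prod&kind=report
```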
Public URL Rules
public_base_url
storage = StorageClient(
bucket="assets",
key_prefix="tenant-a",
public_base_url="https://cdn.example.com/base/",
)
storage.create_public_url(["documents", "report.pdf"])
# https://cdn.example.com/base/tenant-a/documents/report.pdf
Explicit Endpoint
Path style:
storage = StorageClient(
bucket="assets",
endpoint="https://objects.example.com/root/",
url_style="path",
)
Produces:
https://objects.example.com/root/assets/images/logo.png
Virtual-hosted style:
storage = StorageClient(
bucket="assets",
endpoint="https://objects.example.com/root/",
url_style="virtual-hosted",
)
Produces:
https://assets.objects.example.com/root/images/logo.png
AWS S3 Built-in URLs
us-east-1 virtual-hosted:
https://bucket.s3.amazonaws.com/key
Regional path style:
https://s3.eu-west-1.amazonaws.com/bucket/key
Regional virtual-hosted:
https://bucket.s3.eu-west-1.amazonaws.com/key
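These patterns can be summarized in a small helper; this is an illustration of the URL shapes listed above, not the package's implementation:

```python
def s3_builtin_url(bucket, key, region="us-east-1", style="virtual-hosted"):
    """Reproduce the built-in AWS S3 URL patterns listed above."""
    # us-east-1 uses the legacy global host; other regions are region-qualified
    host = "s3.amazonaws.com" if region == "us-east-1" else f"s3.{region}.amazonaws.com"
    if style == "virtual-hosted":
        return f"https://{bucket}.{host}/{key}"
    return f"https://{host}/{bucket}/{key}"
```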
Cloudflare R2 Built-in URLs
R2 public URL generation requires one of:
- public_base_url
- endpoint
- account_id
- build_public_url
If none of those exist, create_public_url() raises a wrapped error.
Error Handling
All normalized failures raise StorageError.
StorageError fields:
- code
- operation
- provider
- bucket
- key
- status_code
- retryable
- details
- cause
Error codes by operation:
- putObject -> STORAGE_PUT_FAILED
- headObject -> STORAGE_HEAD_FAILED
- deleteObject -> STORAGE_DELETE_FAILED
- createPresignedUploadUrl -> STORAGE_PRESIGN_UPLOAD_FAILED
- createPresignedDownloadUrl -> STORAGE_PRESIGN_DOWNLOAD_FAILED
- createPublicUrl -> STORAGE_PUBLIC_URL_FAILED
Retry behavior:
- retryable=True when status is None, 429, or >= 500
- retryable=False for typical client errors such as 400 and 403
Existing StorageError instances are passed through without being wrapped again.
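A caller-side retry loop driven by the retryable flag might look like this. A sketch: StorageErrorStub and flaky_put are local stand-ins for a real StorageError and storage call, not noriastore objects:

```python
import time

class StorageErrorStub(Exception):
    """Local stand-in for noriastore.StorageError, exposing only `retryable`."""
    def __init__(self, message, retryable):
        super().__init__(message)
        self.retryable = retryable

def call_with_retries(fn, attempts=3, delay=0.0):
    """Retry only retryable failures, re-raising on the final attempt."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except StorageErrorStub as exc:
            if not exc.retryable or attempt == attempts:
                raise
            time.sleep(delay)

failures = [StorageErrorStub("503 from provider", retryable=True)]

def flaky_put():
    if failures:
        raise failures.pop()
    return "ok"

result = call_with_retries(flaky_put)  # "ok" after one retried failure
```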
Expiry Rules
Constants:
- DEFAULT_UPLOAD_EXPIRES_IN = 900
- DEFAULT_DOWNLOAD_EXPIRES_IN = 3600
- MAX_PRESIGN_EXPIRES_IN = 604800
- DEFAULT_S3_REGION = "us-east-1"
- DEFAULT_R2_REGION = "auto"
Validation:
- expiry values must be positive integers
- expiry values must not exceed 604800 seconds
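The checks amount to comparing against MAX_PRESIGN_EXPIRES_IN. A sketch of equivalent caller-side validation; the package's own validator and exception types may differ:

```python
MAX_PRESIGN_EXPIRES_IN = 604_800  # 7 days, per the constants above

def check_expires_in(value):
    """Reject expiry values the client would refuse (illustrative only)."""
    # bool is a subclass of int, so exclude it explicitly
    if not isinstance(value, int) or isinstance(value, bool) or value <= 0:
        raise ValueError("expires_in must be a positive integer")
    if value > MAX_PRESIGN_EXPIRES_IN:
        raise ValueError("expires_in must not exceed 604800 seconds")
    return value
```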
Common Usage Patterns
AWS S3
storage = StorageClient(
bucket="media",
provider="s3",
region="eu-west-1",
)
Cloudflare R2
storage = StorageClient(
bucket="assets",
provider="r2",
account_id="acc-123",
)
Fixed CDN URL Base
storage = StorageClient(
bucket="assets",
public_base_url="https://cdn.example.com",
)
Test Double Client
class MockClient:
def put_object(self, **kwargs): ...
def head_object(self, **kwargs): ...
def delete_object(self, **kwargs): ...
def generate_presigned_url(self, operation_name, *, Params, ExpiresIn): ...
storage = StorageClient(bucket="assets", client=MockClient())
Notes and Caveats
provideris a free string internally, but the built-in defaults and URL rules are only defined fors3andr2- direct
create_public_url()calls fail fast when configuration is incomplete - regular object operations degrade gracefully to
public_url=Noneif public URL generation fails - non-string nested key parts are ignored during key normalization
- datetime metadata such as
LastModifiedandExpiresare normalized to UTC ISO strings when present
Development
Run tests:
uv sync --extra dev
uv run pytest
Run lint:
uv run ruff check .