umarise-s3

Anchor every S3 upload to Bitcoin. Async, with a retry queue and a reconciliation ledger. Works with any S3-compatible backend.

pip install umarise-s3

Quick start

from umarise_s3 import AnchoredS3Client

# Works with Akave, AWS S3, MinIO, GCS, etc.
s3 = AnchoredS3Client(
    endpoint_url="https://o3.akave.xyz",  # or any S3-compatible endpoint
    aws_access_key_id="your_key",
    aws_secret_access_key="your_secret",
)

# Every upload is automatically anchored (async — zero latency overhead)
s3.upload_file("model.pt", "my-bucket", "models/model.pt")
with open("data.csv", "rb") as f:
    s3.put_object(Bucket="my-bucket", Key="data.csv", Body=f.read())

What happens

  1. File is uploaded to your S3 backend (unchanged)
  2. SHA-256 hash is computed locally (bytes never leave your machine)
  3. Hash is submitted async to Umarise Core API (fire-and-forget)
  4. origin_id is stored as S3 object metadata (x-amz-meta-umarise-origin-id)
  5. Every event is logged to a local reconciliation ledger (~/.umarise/ledger.db)
  6. Within ~12 hours, the proof is confirmed on the Bitcoin blockchain
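Step 2, the local SHA-256, can be sketched as follows (the function name and chunk size here are illustrative, not the library's internals):

```python
import hashlib

def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in streamed chunks so large objects are never
    loaded fully into memory; only the digest ever leaves this function."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

Because only this hex digest is submitted to the API, the file bytes themselves stay on your machine.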

v0.2.0 — Production-ready

Async anchoring (fire-and-forget)

Uploads return immediately. Anchoring happens in a background thread with automatic retry (3 attempts, exponential backoff).

# Default: async (zero latency overhead)
s3 = AnchoredS3Client(endpoint_url="...")

# Legacy: synchronous (v0.1.0 behavior)
s3 = AnchoredS3Client(endpoint_url="...", sync=True)
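The retry policy described above (3 attempts, exponential backoff) can be sketched as a standalone loop; the delay values below are assumptions for illustration, not the library's actual settings:

```python
import time

def anchor_with_retry(submit, attempts: int = 3, base_delay: float = 0.5):
    """Call submit() up to `attempts` times, doubling the delay after
    each failure (exponential backoff). Re-raises on the final failure."""
    for i in range(attempts):
        try:
            return submit()
        except Exception:
            if i == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** i))  # 0.5s, 1s, ...
```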

Reconciliation ledger

Every anchor event is logged locally in SQLite. If metadata writes fail, the link between file and proof is never lost.

# Check status
print(s3.ledger.stats())
# {'total': 42, 'anchored': 40, 'metadata_pending': 1, 'failed': 1}

# Find entries where metadata write failed
failed = s3.ledger.get_failed_metadata()

# Look up a specific object
records = s3.ledger.lookup("my-bucket", "models/model.pt")

# Custom ledger location
s3 = AnchoredS3Client(ledger_path="/var/log/umarise/ledger.db")

Graceful shutdown

# Wait for all pending anchors to complete before exit
s3.flush(timeout=10.0)
print(f"Pending: {s3.pending_anchors}")

S3 metadata written

Metadata key           Value
umarise-origin-id      UUID
umarise-proof-status   pending or anchored
umarise-hash           SHA-256 hex
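You can read these keys back with any boto3-style client; a minimal sketch (the helper name is ours, and note that S3 clients return custom metadata with the x-amz-meta- prefix already stripped):

```python
def read_anchor_metadata(s3_client, bucket: str, key: str) -> dict:
    """Fetch the umarise-* metadata for an object via a boto3-style
    head_object call and return it under friendlier names."""
    resp = s3_client.head_object(Bucket=bucket, Key=key)
    meta = resp.get("Metadata", {})
    return {
        "origin_id": meta.get("umarise-origin-id"),
        "proof_status": meta.get("umarise-proof-status"),
        "sha256": meta.get("umarise-hash"),
    }
```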

Environment

export UMARISE_API_KEY=um_your_key
export UMARISE_LEDGER_PATH=/custom/path/ledger.db  # optional

Or pass api_key= and ledger_path= to AnchoredS3Client().

Compatible backends

  • Akave Cloud (S3-compatible, decentralized)
  • AWS S3
  • MinIO
  • Google Cloud Storage (S3-compatible mode)
  • Any S3-compatible endpoint

Verify independently

The proof is yours: you can verify it without Umarise.
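One check that needs nothing from Umarise is recomputing the file's SHA-256 and comparing it to the anchored hash (for example, the umarise-hash object metadata). A minimal sketch, with a function name of our choosing:

```python
import hashlib

def matches_anchored_hash(path: str, anchored_hex: str) -> bool:
    """Recompute the local file's SHA-256 and compare it to the hex
    digest that was anchored. No network call is involved."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest() == anchored_hex.lower()
```

Verifying the Bitcoin confirmation itself additionally requires the proof data from step 6; the hash comparison above is the part you can always run offline.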

License

Unlicense — Public Domain

