Universal file storage interface for Python


Blackhole

ActiveStorage, but for Python

A universal file storage adapter for the major cloud storage services (AWS S3, Google Cloud Storage, Azure Blob Storage) that also works with local storage. Optionally persists file records in a SQL database via SQLModel.

TODOs

  • storage providers
    • local
    • aws
    • gcp
    • azure
    • minio
  • aiofiles for local adapter
  • tests
  • get settings from yaml file (pydantic-settings)
  • SQLModel / SQL database integration
    • save adapter type when storing record
    • store file's hashsum
    • use the ETag as the hashsum and add a unique constraint on that column (creating the column as well)
    • remove filename unique constraint
    • migrations for further schema changes
  • pluggable store abstraction (SQL, extensible to Redis, MongoDB, etc.)
  • middlewares (pre/post)
  • put_later - background job uploading/downloading
  • asset management web interface
  • monitoring
  • error tracking
  • logging
  • big files upload/download (streaming)
  • AI agent tool integrations
    • OpenAI tool integration
    • Anthropic tool integration
    • Google tool integration
    • LangChain tool integration
    • ...
  • ...
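The ETag-as-hashsum item above can be sketched with the stdlib: for a single-part S3 upload, the ETag is the hex MD5 digest of the object body (multipart uploads use a different scheme), so a locally computed digest can be compared against it. The function name below is illustrative, not part of the package:

```python
import hashlib


def etag_style_hashsum(blob: bytes) -> str:
    """Hex MD5 digest of the blob — equals the S3 ETag for a
    single-part (non-multipart) upload."""
    return hashlib.md5(blob).hexdigest()


# e.g. a unique constraint on this value would deduplicate identical blobs
print(etag_style_hashsum(b"hello world"))
```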

Installation

# core only
pip install blackhole-io

# with SQL store support (SQLModel + asyncpg + aiosqlite)
pip install "blackhole-io[sql]"

# with CLI
pip install "blackhole-io[cli]"

# with Anthropic tool-use support
pip install "blackhole-io[anthropic]"

# everything
pip install "blackhole-io[sql,cli,anthropic]"

Init

from blackhole_io import Blackhole
from blackhole_io.configs.s3 import S3Config

config = S3Config(
    access_key="...",
    secret_key="...",
    region="us-east-1",
    bucket="...",
)
bh = Blackhole(config=config)

File operations

file = ... # str path | bytes | BytesIO | starlette UploadFile

filename = await bh.put(file)

if await bh.exists(filename):
    bh_file = await bh.get(filename)
    print(bh_file.filename)
    print(bh_file.blob)  # bytes

    await bh.delete(filename)

AI agent tool use — Anthropic

import anthropic
from blackhole_io import Blackhole
from blackhole_io.configs.s3 import S3Config
from blackhole_io.tools.anthropic import BlackholeAnthropicTools

bh = Blackhole(config=S3Config(...))
bh_tools = BlackholeAnthropicTools(bh)
client = anthropic.AsyncAnthropic()

# Pass tool definitions to the model
response = await client.messages.create(
    model="claude-opus-4-6",
    max_tokens=1024,
    tools=bh_tools.definitions(),   # blackhole_put / get / exists / delete
    messages=[{"role": "user", "content": "Store this report for me ..."}],
)

# Handle tool calls in the agentic loop
tool_results = await bh_tools.handle_response(response.content)

See examples/anthropic_agent_loop.py for a full working loop.

SQL store — persist a record on every upload

Pass an existing async engine (reuse your app's connection pool) or a DSN to let Blackhole create one:

from sqlalchemy.ext.asyncio import create_async_engine
from blackhole_io import Blackhole
from blackhole_io.store.sql_store import SQLStore

# reuse an existing engine
engine = create_async_engine("postgresql+asyncpg://user:pass@localhost/mydb")
store = SQLStore(engine=engine)

# or create from DSN
store = SQLStore(dsn="postgresql+asyncpg://user:pass@localhost/mydb")

bh = Blackhole(config=config, store=store)

# uploads the file AND inserts a FileRecord row
filename = await bh.put(upload_file, extra_metadata={"user_id": 42})

FileRecord table (blackhole_files):

| column | type |
| --- | --- |
| id | integer PK |
| filename | text, unique |
| content_type | text |
| size | integer |
| created_at | datetime |
| extra_metadata | JSON |

Query records directly:

record = await store.get(filename)
print(record.extra_metadata)

await store.delete(filename)

Store via YAML config

Add a store section to config/blackhole.yaml — no code changes needed:

adapter: local
directory: /tmp/uploads

store:
  type: sql
  dsn: sqlite+aiosqlite:///blackhole.db

bh = Blackhole()  # auto-discovers config/blackhole.yaml including the store

Create tables (CLI)

# from DSN
blackhole create-tables --dsn postgresql+asyncpg://user:pass@localhost/mydb

# from YAML config
blackhole create-tables --config config/blackhole.yaml

FastAPI

from contextlib import asynccontextmanager
from fastapi import FastAPI, UploadFile
from sqlalchemy.ext.asyncio import create_async_engine
from blackhole_io import Blackhole
from blackhole_io.store.sql_store import SQLStore

engine = create_async_engine("postgresql+asyncpg://user:pass@localhost/mydb")
store = SQLStore(engine=engine)
bh = Blackhole(config=config, store=store)

@asynccontextmanager
async def lifespan(app: FastAPI):
    await store.create_tables()
    yield

app = FastAPI(lifespan=lifespan)

@app.post("/upload/")
async def upload(files: list[UploadFile]):
    filenames = await bh.put_all(files)
    return {"filenames": filenames}

Custom store backend

Implement AbstractStore to use any backend (Redis, MongoDB, etc.):

from blackhole_io.store.abstract import AbstractStore
from blackhole_io.store.models import FileRecord, FileRecordInput

class RedisStore(AbstractStore):
    async def save(self, record: FileRecordInput) -> FileRecord: ...
    async def get(self, filename: str) -> FileRecord | None: ...
    async def delete(self, filename: str) -> None: ...
    async def create_tables(self) -> None: ...  # no-op for Redis

bh = Blackhole(config=config, store=RedisStore(...))
