remote-store
One simple API for file storage. Local, S3, SFTP, Azure. Same methods, swappable backends, zero reinvention.
Beta software. The core API is stable, but minor versions may still contain breaking changes before 1.0. See the changelog for what's new, and open an issue if something breaks.
remote-store gives you one simple API to read, write, list, and delete files.
The same methods work whether your files live on disk, in S3, on an SFTP server,
or anywhere else. You just swap the backend config.
That's the whole trick.
Who is this for?

- Citizen developers -- analysts, scientists, and domain experts who write Python but shouldn't need to learn boto3, paramiko, or cloud-specific SDKs just to read and write files.
- Platform teams -- engineers who set up the infrastructure and want to hand their colleagues a simple, safe API that can't be misused.
- Anyone tired of rewriting storage glue -- if you've wrapped S3 or SFTP access more than once, this is that wrapper, tested and maintained.

The library was born from enabling citizen-developer teams: the config is immutable so non-experts can't accidentally break state, errors are clear instead of raw SDK tracebacks, and streaming just works without tuning buffer sizes.
Reads and writes stream by default, so large files just work.
Under the hood, each backend delegates to the library you'd pick anyway
(boto3, paramiko, azure-storage-file-datalake, …). This package doesn't
reinvent file I/O. It just gives every backend the same simple front door.
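The streaming contract described above can be illustrated with plain file objects. This is a sketch of the chunked-copy pattern, using stdlib BytesIO stand-ins for the BinaryIO handles a store's read() returns; the CHUNK constant is hypothetical, not a library setting:

```python
import io
from typing import BinaryIO

CHUNK = 64 * 1024  # hypothetical buffer size; the library tunes this for you


def copy_stream(src: BinaryIO, dst: BinaryIO) -> int:
    """Copy src to dst in fixed-size chunks, never holding the full file in memory."""
    total = 0
    while chunk := src.read(CHUNK):
        dst.write(chunk)
        total += len(chunk)
    return total


# Stand-ins for a streaming source (e.g. store.read("big.bin")) and a destination:
src = io.BytesIO(b"x" * 200_000)
dst = io.BytesIO()
copied = copy_stream(src, dst)
print(copied)  # 200000
```

Memory use stays bounded by the chunk size regardless of file size, which is why large files "just work".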
What you get

- One Store, many backends: local fs, S3, SFTP, Azure Blob, more to come
- Just the basics: read, write, list, delete, exists. No magic, no surprises
- Battle-tested I/O under the hood: backends wrap boto3, paramiko, etc.
- Swappable via config: switch backends without touching application code
- Streaming by default: reads and writes handle large files without blowing up memory
- Atomic writes where the backend supports it
- PyArrow ecosystem interop: use any Store as a pyarrow.fs.FileSystem -- works with Parquet, Pandas, Polars, DuckDB, and dataset discovery out of the box
- Zero runtime dependencies: the core package installs nothing; backend extras pull in only what they need
- Typed & tested: strict mypy, spec-driven test suite
Installation
Install from PyPI:
pip install remote-store
Backends that need extra dependencies use extras:
pip install "remote-store[s3]" # Amazon S3 / MinIO
pip install "remote-store[s3-pyarrow]" # S3 with PyArrow (high-throughput)
pip install "remote-store[sftp]" # SFTP / SSH
pip install "remote-store[azure]" # Azure Blob / ADLS Gen2
pip install "remote-store[arrow]" # PyArrow filesystem adapter
Quick Start
```python
import tempfile

from remote_store import BackendConfig, RegistryConfig, Registry, StoreProfile

with tempfile.TemporaryDirectory() as tmp:
    config = RegistryConfig(
        backends={"local": BackendConfig(type="local", options={"root": tmp})},
        stores={"data": StoreProfile(backend="local", root_path="data")},
    )
    with Registry(config) as registry:
        store = registry.get_store("data")
        store.write("hello.txt", b"Hello, world!")
        content = store.read_bytes("hello.txt")
        print(content)  # b'Hello, world!'
```
Switch to S3 by changing the config. The rest of the code stays the same:
```python
config = RegistryConfig(
    backends={"s3": BackendConfig(type="s3", options={"bucket": "my-bucket"})},
    stores={"data": StoreProfile(backend="s3", root_path="data")},
)
```
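One pattern this enables is selecting the backend from the environment while application code stays backend-agnostic. A minimal sketch, building a plain dict in the shape the Configuration section below shows for from_dict; the STORAGE_BACKEND and STORAGE_BUCKET variables are hypothetical names for this example, not part of the library:

```python
import os


def build_config_dict(local_root: str = "/data") -> dict:
    """Return a config dict in the shape RegistryConfig.from_dict accepts.

    STORAGE_BACKEND / STORAGE_BUCKET are hypothetical env vars for this sketch.
    """
    if os.environ.get("STORAGE_BACKEND") == "s3":
        backend = {
            "type": "s3",
            "options": {"bucket": os.environ.get("STORAGE_BUCKET", "my-bucket")},
        }
    else:
        backend = {"type": "local", "options": {"root": local_root}}
    return {
        "backends": {"main": backend},
        "stores": {"data": {"backend": "main", "root_path": "data"}},
    }


cfg = build_config_dict()
# Pass cfg to RegistryConfig.from_dict(cfg); everything downstream stays unchanged.
```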
Configuration
Configuration is declarative and immutable. Build it from Python objects or parse it from a dict (e.g. loaded from TOML/JSON):
from remote_store import RegistryConfig
```python
config = RegistryConfig.from_dict({
    "backends": {
        "local": {"type": "local", "options": {"root": "/data"}},
    },
    "stores": {
        "uploads": {"backend": "local", "root_path": "uploads"},
        "reports": {"backend": "local", "root_path": "reports"},
    },
})
```
Store API
Read & write
| Method | Description |
|---|---|
| read(path) | Streaming read (BinaryIO) |
| read_bytes(path) | Full content as bytes |
| write(path, content) | Write bytes or binary stream |
| write_atomic(path, content) | Write via temp file + rename |
Browse & inspect
| Method | Description |
|---|---|
| list_files(path, pattern=…) | Iterate FileInfo, optional name filter |
| list_folders(path) | Iterate subfolder names |
| glob(pattern) | Native glob (capability-gated) |
| exists(path) | Check if a file or folder exists |
| is_file(path) / is_folder(path) | Type checks |
| get_file_info(path) | File metadata (FileInfo) |
| get_folder_info(path) | Folder metadata (FolderInfo) |
Manage
| Method | Description |
|---|---|
| delete(path) | Delete a file |
| delete_folder(path) | Delete a folder |
| move(src, dst) | Move or rename |
| copy(src, dst) | Copy a file |
Utility
| Method | Description |
|---|---|
| child(subpath) | Return a child store scoped to a subfolder |
| supports(capability) | Check if the backend supports a capability |
| to_key(path) | Convert native/absolute path to store-relative key |
| unwrap(type_hint) | Get backend's native handle (e.g., pyarrow.fs.FileSystem) |
| close() | Close the underlying backend |
All write/move/copy methods accept overwrite=True to replace existing files.
For full details, see the API reference.
Supported Backends
| Backend | Status | Extra |
|---|---|---|
| Local filesystem | Built-in | |
| Memory (in-process) | Built-in | |
| Amazon S3 / MinIO | Built-in | remote-store[s3] |
| S3 (PyArrow) | Built-in | remote-store[s3-pyarrow] |
| SFTP / SSH | Built-in | remote-store[sftp] |
| Azure Blob / ADLS | Built-in | remote-store[azure] |
Detailed configuration guides for each backend are in guides/backends/.
Extensions
| Extension | Extra | Description |
|---|---|---|
| PyArrow adapter | remote-store[arrow] | Use any Store as a pyarrow.fs.FileSystem for Parquet, datasets, Pandas, Polars, DuckDB (guide) |
| Batch operations | (none) | Bulk delete, copy, and exists with error aggregation (guide) |
| Transfer operations | (none) | Upload, download, and cross-store transfer with streaming and progress (guide) |
Examples
Runnable scripts in examples/:
Core -- run locally, no external services needed:
| Script | What it shows |
|---|---|
| quickstart.py | Minimal config, write, read |
| file_operations.py | Full Store API: read, write, delete, move, copy, list, metadata, type checks, capabilities, to_key |
| streaming_io.py | Streaming writes and reads with BytesIO |
| atomic_writes.py | Atomic writes and overwrite semantics |
| configuration.py | Config-as-code, from_dict(), multiple stores, S3/SFTP backend configs |
| error_handling.py | Catching NotFound, AlreadyExists, etc. |
| memory_backend.py | In-process memory backend for testing and caching |
| store_child.py | Runtime sub-scoping with Store.child() |
| pyarrow_adapter.py | PyArrow filesystem adapter: Parquet, datasets |
| batch_operations.py | Bulk delete, copy, exists with error aggregation |
| transfer_operations.py | Upload, download, cross-store transfer with progress |
Backend -- require a running service and credentials (examples/backends/):
| Script | What it shows |
|---|---|
| s3_backend.py | S3 / MinIO: config, two stores, virtual folders |
| s3_pyarrow_backend.py | High-throughput S3 via PyArrow C++ + escape hatch |
| sftp_backend.py | SSH/SFTP: config, host key policies, unwrap() |
| azure_backend.py | Azure Blob / ADLS Gen2: config, auth methods, unwrap() |
Interactive Jupyter notebooks are available in examples/notebooks/.
Known Limitations

- Sync only -- all operations are synchronous. For async frameworks, wrap calls with asyncio.to_thread().
- Glob -- list_files(pattern=…) and ext.glob.glob_files() work on all backends. Native Store.glob() is supported by the Local, S3, S3-PyArrow, and Azure backends.
- PyArrow adapter -- Phase 1 (Tier 2/3 reads and writes) is complete. Phase 2 native fast-path reads are deferred. See the backlog for details.
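The asyncio.to_thread() wrapping mentioned above can look like the sketch below. The blocking store call is simulated here with a hypothetical stand-in function, since the pattern is the point:

```python
import asyncio
import time


def read_bytes_blocking(path: str) -> bytes:
    """Hypothetical stand-in for a synchronous store.read_bytes(path) call."""
    time.sleep(0.05)  # simulate blocking I/O
    return f"contents of {path}".encode()


async def read_many(paths: list[str]) -> list[bytes]:
    # Each blocking call runs in the default thread pool,
    # so the event loop stays free and the reads overlap.
    return await asyncio.gather(
        *(asyncio.to_thread(read_bytes_blocking, p) for p in paths)
    )


results = asyncio.run(read_many(["a.txt", "b.txt"]))
print(results[0])  # b'contents of a.txt'
```

Because the store is synchronous internally, throughput is bounded by the thread pool; for heavy fan-out, pass an explicit limit via a semaphore.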
Contributing
See CONTRIBUTING.md for the spec-driven development workflow, code style, and how to add new backends.
Security
To report a vulnerability, please use GitHub Security Advisories instead of opening a public issue. See SECURITY.md for details.
License
MIT
File details
Details for the file remote_store-0.12.0.tar.gz.
File metadata
- Download URL: remote_store-0.12.0.tar.gz
- Upload date:
- Size: 687.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | f69fa73f5bf7632f9cf469560d5685b36af28cc16673353128d80eb6bfdcffaf |
| MD5 | 5965470ed56b27ee7bbded67d9fdd669 |
| BLAKE2b-256 | 5f6f794e4f208eced94e3a80fbe38ac523d69c075e8f32c76d009bd20d758a7e |
Provenance

The following attestation bundle was made for remote_store-0.12.0.tar.gz:

Publisher: publish.yml on haalfi/remote-store

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: remote_store-0.12.0.tar.gz
- Subject digest: f69fa73f5bf7632f9cf469560d5685b36af28cc16673353128d80eb6bfdcffaf
- Sigstore transparency entry: 1006692207
- Sigstore integration time:
- Permalink: haalfi/remote-store@31cec739483e9b8fd7c509688c5eeadec632eef8
- Branch / Tag: refs/tags/v0.12.0
- Owner: https://github.com/haalfi
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@31cec739483e9b8fd7c509688c5eeadec632eef8
- Trigger Event: release
File details
Details for the file remote_store-0.12.0-py3-none-any.whl.
File metadata
- Download URL: remote_store-0.12.0-py3-none-any.whl
- Upload date:
- Size: 60.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.1.0 CPython/3.13.7
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | f1b8722d1fa9ef442895e2a8bcb02bfd1d804749be372df99038331ba88dc45e |
| MD5 | cccab0772e151d9e26e7765cbf35bbe5 |
| BLAKE2b-256 | b756d5a783e84fe4fed25acf1f1679fbe9d9cac04a475a76df28828d5251b5f7 |
Provenance

The following attestation bundle was made for remote_store-0.12.0-py3-none-any.whl:

Publisher: publish.yml on haalfi/remote-store

- Statement type: https://in-toto.io/Statement/v1
- Predicate type: https://docs.pypi.org/attestations/publish/v1
- Subject name: remote_store-0.12.0-py3-none-any.whl
- Subject digest: f1b8722d1fa9ef442895e2a8bcb02bfd1d804749be372df99038331ba88dc45e
- Sigstore transparency entry: 1006692208
- Sigstore integration time:
- Permalink: haalfi/remote-store@31cec739483e9b8fd7c509688c5eeadec632eef8
- Branch / Tag: refs/tags/v0.12.0
- Owner: https://github.com/haalfi
- Access: public
- Token Issuer: https://token.actions.githubusercontent.com
- Runner Environment: github-hosted
- Publication workflow: publish.yml@31cec739483e9b8fd7c509688c5eeadec632eef8
- Trigger Event: release