
Data ingestion, caching, and parquet materialization for the Refua drug discovery ecosystem.


refua-data

refua-data is the Refua data layer for drug discovery. It provides a curated dataset catalog, intelligent local caching, and parquet materialization optimized for downstream modeling and campaign workflows.

What it provides

  • A built-in catalog of useful drug-discovery datasets.
  • Dataset-aware download pipeline with cache reuse and metadata tracking.
  • Pluggable cache backend architecture (filesystem cache by default).
  • API dataset ingestion for paginated JSON endpoints (for example ChEMBL and UniProt).
  • HTTP conditional refresh support (ETag / Last-Modified) when enabled.
  • Incremental parquet materialization (chunked processing + partitioned parquet parts).
  • CLI for listing, fetching, and materializing datasets.
  • Source health checks via validate-sources for CI and environment diagnostics.
  • Rich dataset metadata snapshots (description + usage notes) persisted in cache metadata.
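The incremental materialization feature above can be illustrated with a minimal sketch of how chunked processing might split a raw tabular source into numbered parquet parts. The chunking and `part-*.parquet` naming mirror the cache layout described below; the actual writer (e.g. pyarrow), column handling, and function names are illustrative assumptions, not the package's real internals.

```python
import csv
import io

def iter_chunks(rows, chunk_size):
    """Yield lists of at most chunk_size rows from an iterator."""
    chunk = []
    for row in rows:
        chunk.append(row)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk

def materialize_parts(csv_text, chunk_size=2):
    """Return (part_name, rows) pairs; a real implementation would write
    each chunk to <parquet_dir>/part-NNNNN.parquet instead."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [(f"part-{i:05d}.parquet", rows)
            for i, rows in enumerate(iter_chunks(reader, chunk_size))]

raw = "smiles,mw\nCCO,46.07\nCCN,45.08\nCCC,44.10\n"
parts = materialize_parts(raw, chunk_size=2)
```

Processing one chunk at a time keeps memory bounded regardless of source size, which is the point of the partitioned-parts design.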

Included datasets

The default catalog includes local-file/HTTP datasets as well as API presets useful in drug discovery, covering ZINC, ChEMBL, and UniProt sources.

  1. zinc15_250k (ZINC)
  2. zinc15_tranche_druglike_instock (ZINC tranche)
  3. zinc15_tranche_druglike_agent (ZINC tranche)
  4. zinc15_tranche_druglike_wait_ok (ZINC tranche)
  5. zinc15_tranche_druglike_boutique (ZINC tranche)
  6. zinc15_tranche_druglike_annotated (ZINC tranche)
  7. tox21
  8. bbbp
  9. bace
  10. clintox
  11. sider
  12. hiv
  13. muv
  14. esol
  15. freesolv
  16. lipophilicity
  17. pcba
  18. chembl_activity_ki_human
  19. chembl_activity_ic50_human
  20. chembl_assays_binding_human
  21. chembl_targets_human_single_protein
  22. chembl_molecules_phase3plus
  23. uniprot_human_reviewed
  24. uniprot_human_kinases
  25. uniprot_human_gpcr
  26. uniprot_human_ion_channels
  27. uniprot_human_transporters

Most of these are distributed through MoleculeNet/DeepChem mirrors and retain their upstream licensing terms. ChEMBL and UniProt presets are fetched through their public REST APIs and cached locally as JSONL. ZINC tranche presets aggregate multiple tranche files per dataset (drug-like molecular-weight bins B-K, logP bins A-K, and reactivity classes A/B/C/E) into a single cached tabular source during fetch.
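As a rough sketch of what "aggregating multiple tranche files" means, the 2D tranche grid implied by the bins above can be enumerated as molecular-weight letter times logP letter. The two-letter code scheme is an assumption based on ZINC's tranche naming; the exact codes refua-data requests may differ.

```python
from itertools import product

# MW bins B..K and logP bins A..K, per the bins named in the text.
mw_bins = [chr(c) for c in range(ord("B"), ord("K") + 1)]
logp_bins = [chr(c) for c in range(ord("A"), ord("K") + 1)]

# One two-letter tranche code per (MW, logP) cell of the grid.
tranches = [mw + logp for mw, logp in product(mw_bins, logp_bins)]
```

Each code would map to one remote tranche file; the fetch step concatenates them into a single cached table.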

Install

cd refua-data
pip install -e .

CLI quickstart

List datasets:

refua-data list

Validate all dataset sources:

refua-data validate-sources

Validate a subset and fail CI on probe failures:

refua-data validate-sources chembl_activity_ki_human uniprot_human_kinases --fail-on-error

JSON output for automation:

refua-data validate-sources --json --fail-on-error

For datasets with multiple mirrors, source validation succeeds when at least one configured source is reachable. Failed fallback attempts are included in the result details.
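The mirror-fallback semantics above can be sketched as follows. The function and result keys here are illustrative, not the actual `validate-sources` implementation; the probe callable is injected so the logic is testable without network access.

```python
def probe_sources(urls, fetch):
    """Succeed if any configured source is reachable, in order.

    `fetch` is any callable returning an HTTP status for a URL or raising
    on failure; failed fallback attempts are kept in the result details.
    """
    failures = []
    for url in urls:
        try:
            status = fetch(url)
            return {"ok": True, "source": url, "status": status,
                    "failed_fallbacks": failures}
        except Exception as exc:
            failures.append({"source": url, "error": str(exc)})
    return {"ok": False, "failed_fallbacks": failures}

# Illustrative run: the first mirror is down, the second answers.
def fake_fetch(url):
    if "mirror-a" in url:
        raise OSError("connection refused")
    return 200

result = probe_sources(
    ["https://mirror-a.example/ds.csv", "https://mirror-b.example/ds.csv"],
    fake_fetch,
)
```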

Fetch raw data with cache:

refua-data fetch zinc15_250k

Fetch API-based presets:

refua-data fetch chembl_activity_ki_human
refua-data fetch uniprot_human_kinases

Materialize parquet:

refua-data materialize zinc15_250k

Refresh against remote metadata:

refua-data fetch zinc15_250k --refresh

For API datasets, --refresh re-runs the API query, sending conditional headers with the first page request when cached validators are available.
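Conditional refresh works by replaying the cached ETag / Last-Modified values as request validators, letting the server answer 304 Not Modified instead of resending the payload. A minimal stdlib sketch, assuming illustrative metadata key names (the real cache metadata schema may differ):

```python
from urllib.request import Request

def conditional_request(url, cached_meta):
    """Build a GET carrying If-None-Match / If-Modified-Since validators
    taken from prior cache metadata."""
    req = Request(url)
    if cached_meta.get("etag"):
        req.add_header("If-None-Match", cached_meta["etag"])
    if cached_meta.get("last_modified"):
        req.add_header("If-Modified-Since", cached_meta["last_modified"])
    return req

req = conditional_request(
    "https://www.ebi.ac.uk/chembl/api/data/activity.json",
    {"etag": '"abc123"', "last_modified": "Tue, 01 Oct 2024 00:00:00 GMT"},
)
```

On a 304 response the caller keeps the cached JSONL; on a 200 it re-downloads and updates the stored validators.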

Cache layout

By default, the cache root is:

  • ~/.cache/refua-data

Override with:

  • REFUA_DATA_HOME=/custom/path

Layout:

  • raw/<dataset>/<version>/...: downloaded source files
  • _meta/raw/<dataset>/<version>/...json: raw metadata (ETag, sha256, API request signature, rows/pages, dataset description/usage metadata)
  • parquet/<dataset>/<version>/part-*.parquet: materialized parquet parts
  • _meta/parquet/<dataset>/<version>/manifest.json: parquet manifest metadata with dataset snapshot
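The path resolution above can be sketched in a few lines. The helper names and the version-string format are illustrative assumptions; only the directory layout and the REFUA_DATA_HOME override come from the layout description.

```python
import os
from pathlib import Path

def cache_root():
    """Resolve the cache root, honoring the REFUA_DATA_HOME override."""
    override = os.environ.get("REFUA_DATA_HOME")
    return Path(override) if override else Path.home() / ".cache" / "refua-data"

def raw_dir(dataset, version):
    """Directory holding downloaded source files for one dataset version."""
    return cache_root() / "raw" / dataset / version

def parquet_dir(dataset, version):
    """Directory holding materialized part-*.parquet files."""
    return cache_root() / "parquet" / dataset / version
```

Reading the environment variable at call time (rather than import time) means the override takes effect even if it is set after the module is loaded.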

Python API

from refua_data import DatasetManager

manager = DatasetManager()
manager.fetch("zinc15_250k")               # download raw source into the cache
manager.fetch("chembl_activity_ki_human")  # API preset: paginated JSONL ingest
result = manager.materialize("zinc15_250k")
print(result.parquet_dir)                  # directory of part-*.parquet files

DataCache is the default cache backend. You can pass a custom backend object that implements the same interface (ensure, raw_file, raw_meta, parquet_dir, parquet_manifest, read_json, write_json) to make storage pluggable.
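A toy in-memory backend shows the shape of that interface. The method signatures below are assumptions inferred from the method names listed above; check DataCache for the authoritative ones before implementing a real backend.

```python
import json
from pathlib import Path

class InMemoryCache:
    """Illustrative cache backend implementing the interface named above."""

    def __init__(self):
        self._json = {}

    def ensure(self):
        pass  # nothing to create for an in-memory store

    def raw_file(self, dataset, version, name):
        return Path("mem") / "raw" / dataset / version / name

    def raw_meta(self, dataset, version):
        return Path("mem") / "_meta" / "raw" / dataset / version

    def parquet_dir(self, dataset, version):
        return Path("mem") / "parquet" / dataset / version

    def parquet_manifest(self, dataset, version):
        return self.parquet_dir(dataset, version) / "manifest.json"

    def read_json(self, key):
        return self._json.get(str(key))

    def write_json(self, key, payload):
        # Round-trip through json to enforce JSON-serializable payloads.
        self._json[str(key)] = json.loads(json.dumps(payload))
```

Keeping paths and JSON persistence behind one object is what lets storage be swapped (local disk, object store, in-memory for tests) without touching the manager.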

Licensing notes

  • refua-data package code is MIT licensed.
  • Dataset content licenses are dataset-specific and controlled by upstream providers.
  • Always verify dataset licensing and allowed use before redistribution or commercial deployment.
