
eolas-data

Python client for the eolas.fyi statistical data API — 717+ datasets across NZ, Australia, OECD, and more, served as tidy pandas DataFrames (or polars / geopandas if you prefer).

pip install eolas-data

Quickstart

from eolas_data import Client

client = Client("your_api_key")   # or set EOLAS_API_KEY in env

# Generic
df = client.get("nz_cpi", start="2020-01-01")

# Source-specific (sets the `eolas_source` metadata)
df = client.statsnz("nz_cpi")
df = client.oecd("nz_gdp_production_annual")

# Discovery
all_datasets = client.list()
nz_only      = client.list("Stats NZ")
meta         = client.info("nz_cpi")

Get an API key at https://eolas.fyi/signup. Free plan is 10 requests/month; Starter is 100; Pro is unlimited.

Command-line interface

pip install eolas-data[cli] adds an eolas command for browsing, fetching, and scheduling — useful for shell scripts, cron jobs, and AI-agent workflows. Output auto-detects piping: rich tables in a terminal, newline-delimited JSON when stdout is piped.

# one-time setup
eolas auth set-key
eolas health

# discover
eolas datasets list --source "Stats NZ"
eolas datasets list --search cpi --json | jq '.[].name'
eolas datasets info nz_cpi
eolas datasets preview nz_cpi --limit 5

# fetch (verb matches the Python lib's client.get())
eolas get nz_cpi --format csv > cpi.csv
eolas get nz_cpi --start 2020-01-01 --format json | jq '.[].value'
eolas get sa2_2023 --format parquet --out sa2.parquet
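The piping auto-detection above follows a standard pattern: check whether stdout is attached to a terminal. A minimal sketch of the idea in Python (not the actual eolas implementation; `emit` and the row shape are illustrative, and the real CLI renders rich tables rather than padded columns):

```python
import json
import sys

def emit(rows, stream=sys.stdout):
    """Pretty table for humans, newline-delimited JSON for pipes.

    `rows` is a list of dicts with identical keys.
    """
    if stream.isatty():  # interactive terminal: aligned columns
        cols = list(rows[0])
        widths = {c: max(len(c), *(len(str(r[c])) for r in rows)) for c in cols}
        stream.write("  ".join(c.ljust(widths[c]) for c in cols) + "\n")
        for r in rows:
            stream.write("  ".join(str(r[c]).ljust(widths[c]) for c in cols) + "\n")
    else:  # piped: one JSON object per line, jq-friendly
        for r in rows:
            stream.write(json.dumps(r) + "\n")
```

This distinction is what makes `eolas datasets list ... | jq '.[].name'` work without an explicit flag.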

Scheduling

Set up recurring fetches without touching crontab or Task Scheduler syntax. Works on Linux and macOS (via cron) and on Windows (via Task Scheduler).

eolas schedule add nz_cpi --daily   --out ~/data/cpi.csv
eolas schedule add nz_gdp --weekly  --out ~/data/gdp.csv
eolas schedule add nzd_usd --cron "0 */6 * * *" --out ~/data/fx.csv   # POSIX only

eolas schedule list
eolas schedule remove nz_cpi

Daily is the default interval. A pre-flight check refuses to install a schedule unless your API key is configured; otherwise the job would fail silently on every run.
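The pre-flight check amounts to resolving a key before writing any job entry, and aborting with the auth exit code if none is found. A hypothetical sketch of that pattern (function names are illustrative, and the real CLI also consults the key stored by `eolas auth set-key`, which this sketch omits):

```python
import os

def resolve_api_key():
    """Return a configured API key, or None.

    Checks EOLAS_API_KEY first, then the legacy VS_API_KEY.
    """
    return os.environ.get("EOLAS_API_KEY") or os.environ.get("VS_API_KEY")

def preflight_or_die():
    """Abort with the auth exit code (2) before installing a schedule."""
    if not resolve_api_key():
        print("error: no API key configured; run `eolas auth set-key` first")
        raise SystemExit(2)
```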

Integrations (Enterprise plan)

Generate ready-to-run connector configs for popular data-pipeline tools — eolas becomes a one-command source for Meltano, Fivetran, or Azure Data Factory.

eolas integrate meltano             --datasets nz_cpi,nz_gdp --output ./my-pipeline/
eolas integrate fivetran            --datasets nz_cpi
eolas integrate azure-data-factory  --datasets nz_cpi,nz_gdp

The generated directory has everything needed to plug into your destination warehouse: meltano.yml, fivetran.yml, or ADF JSON resources, plus a README.md walking through the rest of the setup. Non-Enterprise users see a clear upgrade pointer; the gating is enforced server-side, so it cannot be bypassed client-side.

Exit codes

Distinct exit codes per error class, for shell scripts and agents:

| Code | Meaning |
| ---- | ------- |
| 0    | Success |
| 1    | Generic error |
| 2    | Auth (AuthenticationError, including Enterprise-gate 403) |
| 3    | Rate limit hit |
| 4    | Dataset / resource not found |
| 5    | Other API error |
| 64   | Bad usage (mirrors sysexits.h) |
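In scripts these codes make failure handling precise, e.g. backing off only on rate limits while letting other errors propagate. A hypothetical wrapper using the documented code 3, shown with a stand-in command so the snippet runs anywhere (in real use `cmd` would be an `eolas get ...` invocation):

```python
import subprocess
import sys
import time

RATE_LIMITED = 3  # documented exit code for rate-limit errors

def run_with_backoff(cmd, retries=3, base_delay=1.0):
    """Run `cmd`, retrying with exponential backoff while it exits with code 3.

    Returns the final exit code; non-rate-limit failures return immediately.
    """
    for attempt in range(retries):
        proc = subprocess.run(cmd)
        if proc.returncode != RATE_LIMITED:
            return proc.returncode
        time.sleep(base_delay * 2 ** attempt)
    return RATE_LIMITED
```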

Geospatial

Datasets with a geometry_wkt column auto-convert to geopandas.GeoDataFrame if geopandas is installed:

pip install eolas-data[geo]
gdf = client.get("nz_addresses")                  # GeoDataFrame
df  = client.get("nz_addresses", as_geo=False)    # plain DataFrame, WKT preserved

Polars

pip install eolas-data[polars]
df = client.get("nz_cpi", engine="polars")

Plotting

pip install eolas-data[plot]
df = client.statsnz("nz_cpi")
df.plot_dataset()

Type stubs

Dataset names are exposed as a Literal so IDEs autocomplete the catalog:

from eolas_data import Client

client = Client()
client.get("nz_")    # autocomplete shows nz_cpi, nz_gdp_production_annual, ...

The list is regenerated from the live API at release time. Passing a name not in the snapshot still works at runtime — the type hint just won't autocomplete it. Catalog snapshot date is exposed as eolas_data._dataset_names.CATALOG_SNAPSHOT_DATE.
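A common way to get this behaviour is a Literal union with a plain str fallback, so type checkers accept any string while IDEs still surface the known names as completions. A sketch under that assumption (a two-name snapshot for illustration; the real stubs list the full catalog and may be structured differently):

```python
from typing import Literal, Union

# Illustrative snapshot of known dataset names.
DatasetName = Literal["nz_cpi", "nz_gdp_production_annual"]

def get(name: Union[DatasetName, str]) -> str:
    """IDEs offer the Literal members as completions; any str is still valid."""
    return name
```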

Migrating from vswarehouse

The previous package name was vswarehouse. Direct equivalents:

| vswarehouse | eolas_data |
| ----------- | ---------- |
| `from vswarehouse import Client, VSeries` | `from eolas_data import Client, Dataset` |
| `df.vs_name`, `df.vs_source` | `df.eolas_name`, `df.eolas_source` |
| `df.plot_series()` | `df.plot_dataset()` |
| `VS_API_KEY` env var | `EOLAS_API_KEY` (legacy `VS_API_KEY` still honoured) |

The API surface is otherwise identical. The default base URL is now https://api.eolas.fyi; the old https://api.virtus-solutions.io still works via a 301 redirect, but serves the legacy endpoint shape.

Releasing

See docs/clients.md in the eolas data repo for the tagged-release flow and PyPI token rotation.

Before each release, run `python -m eolas_data._regen_names` to refresh the dataset name stubs from the live API, commit the change, then tag and push.
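The regeneration step boils down to fetching the catalog and rendering a stub module. A hypothetical sketch of such a generator (the real _regen_names internals may differ; here the catalog arrives as a plain list instead of an API call):

```python
import datetime

def render_stub(names):
    """Render a _dataset_names.py-style module from a list of dataset names."""
    today = datetime.date.today().isoformat()
    lines = [
        "from typing import Literal",
        "",
        f'CATALOG_SNAPSHOT_DATE = "{today}"',
        "",
        "DatasetName = Literal[",
    ]
    lines += [f'    "{n}",' for n in sorted(names)]
    lines.append("]")
    return "\n".join(lines) + "\n"
```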

License

MIT

