
Python package for offline access to restart datasets

Project description

vega_datasets


A Python package for offline access to vega datasets.

This package has several goals:

  • Provide straightforward access in Python to the datasets made available at vega-datasets.
  • Return the results in the form of a pandas DataFrame.
  • Bundle datasets with the package wherever dataset size and/or license constraints allow, so that they can be loaded without a web connection.

Currently the package bundles a selection of datasets locally, and falls back to HTTP requests for the others.

Installation

vega_datasets is compatible with Python 3.5 or newer. Install with:

$ pip install vega_datasets

Usage

The main object in this library is data:

>>> from vega_datasets import data

It exposes an attribute for each available dataset, loading from the local copy when one is bundled. For example, here is the well-known iris dataset:

>>> df = data.iris()
>>> df.head()
   petalLength  petalWidth  sepalLength  sepalWidth species
0          1.4         0.2          5.1         3.5  setosa
1          1.4         0.2          4.9         3.0  setosa
2          1.3         0.2          4.7         3.2  setosa
3          1.5         0.2          4.6         3.1  setosa
4          1.4         0.2          5.0         3.6  setosa
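Because the return value is an ordinary pandas DataFrame, the usual pandas operations apply directly. A small illustration, using the five rows printed above as a literal frame so it runs even without the package installed:

```python
import pandas as pd

# The first five rows of the iris dataset, exactly as printed above.
df = pd.DataFrame({
    "petalLength": [1.4, 1.4, 1.3, 1.5, 1.4],
    "petalWidth": [0.2, 0.2, 0.2, 0.2, 0.2],
    "sepalLength": [5.1, 4.9, 4.7, 4.6, 5.0],
    "sepalWidth": [3.5, 3.0, 3.2, 3.1, 3.6],
    "species": ["setosa"] * 5,
})

# Ordinary pandas operations apply directly, e.g. per-species means.
means = df.groupby("species").mean()
print(means)
```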

If you're curious about the source data, you can access the URL for any of the available datasets:

>>> data.iris.url
'https://cdn.jsdelivr.net/npm/vega-datasets@v1.29.0/data/iris.json'
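These URLs follow the jsDelivr CDN layout shown above. A minimal sketch of assembling one by hand; note that the `.json` extension is an assumption (iris happens to be stored as JSON, but other datasets are CSV or TSV), so prefer the `url` property in real code:

```python
# Sketch: how such a CDN URL is assembled, assuming the
# vega-datasets@v1.29.0 layout shown above. The ".json" extension is an
# assumption -- it is correct for iris, but other datasets use .csv or
# .tsv, so the package's own `url` property is the reliable source.
CDN_BASE = "https://cdn.jsdelivr.net/npm/vega-datasets@v1.29.0/data"

def dataset_url(name: str, ext: str = "json") -> str:
    return f"{CDN_BASE}/{name}.{ext}"

print(dataset_url("iris"))
```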

For datasets bundled with the package, you can also find their location on disk:

>>> data.iris.filepath
'/lib/python3.6/site-packages/vega_datasets/data/iris.json'

Available Datasets

To list all the available datasets, use list_datasets:

>>> data.list_datasets()
['7zip', 'airports', 'anscombe', 'barley', 'birdstrikes', 'budget', 'budgets', 'burtin', 'cars', 'climate', 'co2-concentration', 'countries', 'crimea', 'disasters', 'driving', 'earthquakes', 'ffox', 'flare', 'flare-dependencies', 'flights-10k', 'flights-200k', 'flights-20k', 'flights-2k', 'flights-3m', 'flights-5k', 'flights-airport', 'gapminder', 'gapminder-health-income', 'gimp', 'github', 'graticule', 'income', 'iris', 'jobs', 'londonBoroughs', 'londonCentroids', 'londonTubeLines', 'lookup_groups', 'lookup_people', 'miserables', 'monarchs', 'movies', 'normal-2d', 'obesity', 'points', 'population', 'population_engineers_hurricanes', 'seattle-temps', 'seattle-weather', 'sf-temps', 'sp500', 'stocks', 'udistrict', 'unemployment', 'unemployment-across-industries', 'us-10m', 'us-employment', 'us-state-capitals', 'weather', 'weball26', 'wheat', 'world-110m', 'zipcodes']
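Many of these names contain hyphens, which are not legal in a Python attribute. In vega_datasets such datasets are accessed by replacing the hyphen with an underscore (e.g. data.flights_2k() for 'flights-2k'); the mapping can be sketched as:

```python
def attribute_name(dataset_name: str) -> str:
    """Map a dataset name to the attribute used on the `data` object.

    Hyphens cannot appear in Python identifiers, so hyphenated dataset
    names are accessed with underscores instead, e.g. data.flights_2k()
    for the 'flights-2k' dataset.
    """
    return dataset_name.replace("-", "_")
```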

To list local datasets (i.e. those that are bundled with the package and can be used without a web connection), use the local_data object instead:

>>> from vega_datasets import local_data
>>> local_data.list_datasets()
['airports', 'anscombe', 'barley', 'burtin', 'cars', 'crimea', 'driving', 'iowa-electricity', 'iris', 'seattle-temps', 'seattle-weather', 'sf-temps', 'stocks', 'us-employment', 'wheat']
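Datasets that appear in the full list but not the local one require a web connection at load time. Using abbreviated copies of the two lists above (real code would call data.list_datasets() and local_data.list_datasets() directly), the remote-only names fall out of a simple set difference:

```python
# Abbreviated copies of the two lists printed above.
all_datasets = {"airports", "cars", "iris", "movies", "zipcodes"}
local_datasets = {"airports", "cars", "iris"}

# Datasets not bundled locally must be fetched over HTTP.
remote_only = sorted(all_datasets - local_datasets)
print(remote_only)
```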

We plan to add more local datasets in the future, subject to size and licensing constraints. See the local datasets issue if you would like to help with this.

Dataset Information

If you want more information about any dataset, you can use the description property:

>>> data.iris.description
'This classic dataset contains lengths and widths of petals and sepals for 150 iris flowers, drawn from three species. It was introduced by R.A. Fisher in 1936 [1]_.'

This information is also part of the data.iris docstring. Descriptions are not yet included for all the datasets in the package; we hope to add more in the future.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

restart_datasets-0.3.tar.gz (6.0 MB)


Built Distribution

restart_datasets-0.3-py3-none-any.whl (6.0 MB)


File details

Details for the file restart_datasets-0.3.tar.gz.

File metadata

  • Download URL: restart_datasets-0.3.tar.gz
  • Upload date:
  • Size: 6.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.8.5

File hashes

Hashes for restart_datasets-0.3.tar.gz
  • SHA256: b00ef53e95b793ff6c926d680092a022a5974f9c87efd6b16498df141d685c69
  • MD5: 7e8301488eb77f011b431f87f7fc713a
  • BLAKE2b-256: bb11cef34a32cb5533059821994483692c360a4fc00f63660b4a95dc9018364d


File details

Details for the file restart_datasets-0.3-py3-none-any.whl.

File metadata

  • Download URL: restart_datasets-0.3-py3-none-any.whl
  • Upload date:
  • Size: 6.0 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.8.5

File hashes

Hashes for restart_datasets-0.3-py3-none-any.whl
  • SHA256: 8557308503873880347a1d467e1223d4b557f285b3723bd9f4737f0686290a78
  • MD5: e4f8c798bf106c6c680233db0ef6dc90
  • BLAKE2b-256: 52d2dbff61d5807fa472a79cda84c40f063c7d02e01e89c98b6105c02d01e47e

