Python package for offline access to restart datasets

Project description

vega_datasets

A Python package for offline access to vega datasets.

This package has several goals:

  • Provide straightforward access in Python to the datasets made available at vega-datasets.
  • Return the results in the form of a Pandas DataFrame.
  • Bundle datasets with the package wherever size and license constraints allow, so that they can be loaded without a web connection.

Currently the package bundles more than a dozen datasets, and falls back to HTTP requests for the others.

Installation

vega_datasets is compatible with Python 3.5 or newer. Install with:

$ pip install vega_datasets

Usage

The main object in this library is data:

>>> from vega_datasets import data

It has an attribute for each available dataset, which loads the data locally when bundled and over the web otherwise. For example, here is the well-known iris dataset:

>>> df = data.iris()
>>> df.head()
   petalLength  petalWidth  sepalLength  sepalWidth species
0          1.4         0.2          5.1         3.5  setosa
1          1.4         0.2          4.9         3.0  setosa
2          1.3         0.2          4.7         3.2  setosa
3          1.5         0.2          4.6         3.1  setosa
4          1.4         0.2          5.0         3.6  setosa

If you're curious about the source data, you can access the URL for any of the available datasets:

>>> data.iris.url
'https://cdn.jsdelivr.net/npm/vega-datasets@v1.29.0/data/iris.json'

For datasets bundled with the package, you can also find their location on disk:

>>> data.iris.filepath
'/lib/python3.6/site-packages/vega_datasets/data/iris.json'

Available Datasets

To list all the available datasets, use list_datasets:

>>> data.list_datasets()
['7zip', 'airports', 'anscombe', 'barley', 'birdstrikes', 'budget', 'budgets', 'burtin', 'cars', 'climate', 'co2-concentration', 'countries', 'crimea', 'disasters', 'driving', 'earthquakes', 'ffox', 'flare', 'flare-dependencies', 'flights-10k', 'flights-200k', 'flights-20k', 'flights-2k', 'flights-3m', 'flights-5k', 'flights-airport', 'gapminder', 'gapminder-health-income', 'gimp', 'github', 'graticule', 'income', 'iris', 'jobs', 'londonBoroughs', 'londonCentroids', 'londonTubeLines', 'lookup_groups', 'lookup_people', 'miserables', 'monarchs', 'movies', 'normal-2d', 'obesity', 'points', 'population', 'population_engineers_hurricanes', 'seattle-temps', 'seattle-weather', 'sf-temps', 'sp500', 'stocks', 'udistrict', 'unemployment', 'unemployment-across-industries', 'us-10m', 'us-employment', 'us-state-capitals', 'weather', 'weball26', 'wheat', 'world-110m', 'zipcodes']

To list local datasets (i.e. those that are bundled with the package and can be used without a web connection), use the local_data object instead:

>>> from vega_datasets import local_data
>>> local_data.list_datasets()
['airports', 'anscombe', 'barley', 'burtin', 'cars', 'crimea', 'driving', 'iowa-electricity', 'iris', 'seattle-temps', 'seattle-weather', 'sf-temps', 'stocks', 'us-employment', 'wheat']

We plan to add more local datasets in the future, subject to size and licensing constraints. See the local datasets issue if you would like to help with this.

Dataset Information

If you want more information about any dataset, you can use the description property:

>>> data.iris.description
'This classic dataset contains lengths and widths of petals and sepals for 150 iris flowers, drawn from three species. It was introduced by R.A. Fisher in 1936 [1]_.'

This information is also part of the data.iris docstring. Descriptions are not yet included for all the datasets in the package; we hope to add more in the future.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

restart_datasets-0.1.tar.gz (155.8 kB view details)

Uploaded Source

Built Distribution

restart_datasets-0.1-py3-none-any.whl (168.5 kB view details)

Uploaded Python 3

File details

Details for the file restart_datasets-0.1.tar.gz.

File metadata

  • Download URL: restart_datasets-0.1.tar.gz
  • Upload date:
  • Size: 155.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.8.5

File hashes

Hashes for restart_datasets-0.1.tar.gz:

  • SHA256: 29af35a3142af1998649c4fc6e1c8017678cdeab0f24c4ec7d6656c0ca2cd814
  • MD5: 3c5ad013f97e0430c6cd8db4d810c15f
  • BLAKE2b-256: 3bafa77474571877ffaaa269597bbc6e9f9e218230d5122f908ab77ea9d5d0c7

File details

Details for the file restart_datasets-0.1-py3-none-any.whl.

File metadata

  • Download URL: restart_datasets-0.1-py3-none-any.whl
  • Upload date:
  • Size: 168.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.2.1 requests-toolbelt/0.9.1 tqdm/4.49.0 CPython/3.8.5

File hashes

Hashes for restart_datasets-0.1-py3-none-any.whl:

  • SHA256: 0370e31b2a72a248bd219dbd04446202a168b4601cfe6eed71e98c85b3ba95d9
  • MD5: 84f246a7222d82276d9ca76fd9cd005d
  • BLAKE2b-256: 940d4d4d478318c797d93374a622e61fa82075ea7e91d04d894ba2871bb3aa56
