
Python interface to CDAS data via REST API

Project description

AI.CDAS: python interface to CDAS data

This library provides access to the CDAS database from Python in a simple and fluid way through the CDAS REST API. It fetches the data in either CDF (Common Data Format) or ASCII format and returns it as dictionaries of numpy arrays.

The full documentation is available at aicdas.rtfd.io.

Getting started

Dependencies

Extra dependencies (at least one of the following)

Installation

Starting from version 1.2.0, AI.CDAS officially supports only Python 3, so make sure that you have a working installation of it.

Assuming the above requirement is satisfied, install the package with the Python package manager:

$ pip install ai.cdas
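
To check that the install worked, a quick sanity check (not part of the original instructions) is simply to import the package:

from ai import cdas  # the import succeeding means the package is installed
print(cdas)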

Known issues

The NASA CDAS REST API endpoint currently does not support IPv6 addressing. However, newer Linux distributions (for example, Ubuntu 16.04) are set up to prefer IPv6 over IPv4 by default. This can cause unnecessary delays in communicating with the server and polling for data. If you experience such delays, this may be the cause on your system. Here is how it can be cured on Ubuntu 16.04:

$ sudoedit /etc/gai.conf
# Uncomment the line
# precedence ::ffff:0:0/96  100

Now your machine will try IPv4 before IPv6. For other distros, refer to their respective docs.
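
If you want to see which address family your resolver prefers, a minimal check with the standard socket module (assuming the CDAS host is cdaweb.gsfc.nasa.gov, which is not stated above) is:

import socket

# Print the resolved addresses in the order the resolver prefers them;
# AF_INET6 entries listed before AF_INET ones mean IPv6 is tried first.
for family, _, _, _, sockaddr in socket.getaddrinfo('cdaweb.gsfc.nasa.gov', 443):
    print(family.name, sockaddr[0])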

Examples

Example 1: Retrieving observatory groups and associated instruments which measure plasma and solar wind:

from ai import cdas
import json # for pretty output

obsGroupsAndInstruments = cdas.get_observatory_groups_and_instruments(
    'istp_public',
    instrumentType='Plasma and Solar Wind'
)
print(json.dumps(obsGroupsAndInstruments, indent=4))

Example 2: Getting STEREO-A datasets using regular expressions for dataset id and label:

from ai import cdas
import json # for pretty output

datasets = cdas.get_datasets(
    'istp_public',
    idPattern='STA.*',
    labelPattern='.*STEREO.*'
)
print(json.dumps(datasets, indent=4))

Example 3: Fetching the list of variables in one of the STEREO datasets:

from ai import cdas
import json # for pretty output

variables = cdas.get_variables('istp_public', 'STA_L1_MAGB_RTN')
print(json.dumps(variables, indent=4))

Example 4: This snippet gets magnetic field data from the STEREO-A spacecraft for one hour on 01.01.2010 and plots it (requires matplotlib):

from ai import cdas
from datetime import datetime
from matplotlib import pyplot as plt

data = cdas.get_data(
    'sp_phys',
    'STA_L1_MAG_RTN',
    datetime(2010, 1, 1),
    datetime(2010, 1, 1, 0, 59, 59),
    ['BFIELD']
)
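# Note: with the default ASCII transfer the returned keys differ from the
# requested 'BFIELD' (here 'EPOCH' and 'BTOTAL'); see the note in Example 5
# about checking the returned variable names.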
plt.plot(data['EPOCH'], data['BTOTAL'])
plt.show()

Example 5: This snippet gets the same magnetic field data from the STEREO-A spacecraft for one hour on 01.01.2010 and plots it (requires matplotlib), but this time the data are downloaded in CDF format. CDF is binary and results in much smaller files and hence faster downloads. For this to work you need the NASA CDF library on your machine and spacepy installed afterwards:

from ai import cdas
from datetime import datetime
from matplotlib import pyplot as plt

data = cdas.get_data(
    'sp_phys',
    'STA_L1_MAG_RTN',
    datetime(2010, 1, 1),
    datetime(2010, 1, 1, 0, 59, 59),
    ['BFIELD'],
    cdf=True # download data in CDF format
)
# Note that the variable identifiers are different from those in the
# previous example. This is often the case with CDAS data. You should
# check the variable names by printing out the `data` dictionary.
plt.plot(data['Epoch'], data['BFIELD'][:, 3])
plt.show()
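
A quick way to see which variable names a request actually returned (a small addition, not part of the original example) is to print the dictionary keys:

print(list(data.keys()))      # e.g. the epoch and field variables of this dataset
print(data['BFIELD'].shape)   # the field variable is a 2D numpy array here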

Example 6: This snippet gets magnetic field data from the STEREO-A spacecraft for one hour on 01.01.2010 and saves it to a cache directory. The next time the same data are requested they are taken from the cache instead of being downloaded again:

import os
from ai import cdas
from datetime import datetime

# For the sake of example we are using your current working
# directory as a cache directory
cache_dir = os.getcwd()
cdas.set_cache(True, cache_dir)
# this data is downloaded from CDAS
data = cdas.get_data(
    'sp_phys',
    'STA_L1_MAG_RTN',
    datetime(2010, 1, 1),
    datetime(2010, 1, 1, 0, 59, 59),
    ['BFIELD']
)
# this data is taken from cache
data = cdas.get_data(
    'sp_phys',
    'STA_L1_MAG_RTN',
    datetime(2010, 1, 1),
    datetime(2010, 1, 1, 0, 59, 59),
    ['BFIELD']
)
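
To confirm that something was actually cached, you can list the cache directory afterwards (a quick check, not part of the original example):

print(os.listdir(cache_dir))  # should now contain the cached data file(s)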


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ai.cdas-1.2.3.tar.gz (6.1 kB)

Uploaded Source

Built Distribution

ai.cdas-1.2.3-py3-none-any.whl (5.9 kB)

Uploaded Python 3

File details

Details for the file ai.cdas-1.2.3.tar.gz.

File metadata

  • Download URL: ai.cdas-1.2.3.tar.gz
  • Upload date:
  • Size: 6.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.37.0 CPython/3.6.9

File hashes

Hashes for ai.cdas-1.2.3.tar.gz
  • SHA256: 1e93c3f6f667496a6b265ab3df0f8a72c7b2aefe12ee70590584f62460fe36ca
  • MD5: 0e4b477adfd83497af44c9b24b879873
  • BLAKE2b-256: 8ab3536cc3c794353f13a405c3ede03d93efe2b920076babb1efbf6e3327068a

See more details on using hashes here.

File details

Details for the file ai.cdas-1.2.3-py3-none-any.whl.

File metadata

  • Download URL: ai.cdas-1.2.3-py3-none-any.whl
  • Upload date:
  • Size: 5.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/2.0.0 pkginfo/1.5.0.1 requests/2.22.0 setuptools/40.6.2 requests-toolbelt/0.9.1 tqdm/4.37.0 CPython/3.6.9

File hashes

Hashes for ai.cdas-1.2.3-py3-none-any.whl
  • SHA256: 62ef31a7afc807158bd9d944aeb7f5f59293798b232116130e45c3e9a78b26e0
  • MD5: e0921482ace8662de90c72a12e63cddf
  • BLAKE2b-256: 0696c970242f219b764ebb093b0896733227584f7a81e02d2f8e9b8eaaf9d889

See more details on using hashes here.
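
If you want to verify a downloaded file against the digests listed above, a minimal sketch using Python's standard hashlib (assuming the file sits in the current directory) is:

import hashlib

# Compute the SHA256 digest of the downloaded wheel and compare it to the
# value listed above.
with open('ai.cdas-1.2.3-py3-none-any.whl', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()
print(digest == '62ef31a7afc807158bd9d944aeb7f5f59293798b232116130e45c3e9a78b26e0')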
