OEP Client

A client-side tool for the OpenEnergy Platform (OEP).
This tool tries to make data sharing with the OEP as easy as possible. Common tasks are:
- creating a table
- uploading data
- updating a table's metadata
- downloading data
- retrieving a table's metadata
- deleting a table (that you created)
You can also always just use the API (TODO: link to documentation) directly if your tasks are more complex.
Notes for Windows Users
All the example commands below use python3, because we need Python 3. Under Windows, the command is most likely python.exe or just python.
Installation
Install the package oep-client from the Python package index with pip:
python3 -m pip install --upgrade oep-client
Test
There is a short test script that creates a table on the platform, uploads data and metadata, downloads them again and finally deletes the table. You need to be registered on the OEP platform and have a valid API token.
You can run it either directly from the command prompt
python3 -m oep_client.test API_TOKEN
or in an interactive python environment
>>> from oep_client import testscript
>>> testscript('API_TOKEN')
TODO: example output if everything is ok
Data and Metadata
Supported filetypes for input data are: xlsx, csv, json.
Metadata must be a json file that complies with the metadata specification of the OEP (TODO: link).
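For illustration only, a metadata file is a json document along these lines. This is a minimal, hypothetical fragment; the actual required fields and structure are defined by the OEP metadata specification linked above:

```json
{
  "name": "TABLE_NAME",
  "title": "A human-readable title",
  "description": "What the table contains",
  "resources": []
}
```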
Usage
All tasks can be executed either directly via the command line script (CLI) oep-client that comes with this package, or in a python environment.
The CLI is very handy for standardized tasks as it requires just one command line, but is somewhat limited when, for instance, your input data is not in a very specific format. To see the available command line options, use
oep-client --help
In a python environment, you have more flexibility to prepare / clean your data before uploading it.
Creating a table

Tables can be created with the oep-client CLI (see oep-client --help) or via the REST-API directly.

Using the REST-API with Python

This section describes how to upload data to the OEP using Python and the REST-API.
Create and upload data table(s)

- The REST-API can be used with any language that can make HTTP(S) requests.
- Most requests require you to add an authorization header: Authorization: Token API_TOKEN, where you substitute API_TOKEN with your token. You can find your token in your user profile on the OEP under Your Security Information.
- All requests (and most responses) use json data as payload. The payload is the actual data content of the request.
- An example is provided below. For it, we use Python and the requests package. All requests will use a requests session with the authorization header.
import requests

# substitute API_TOKEN with your token (see above)
API_URL = 'https://openenergy-platform.org/api/v0'
session = requests.Session()
session.headers = {'Authorization': 'Token %s' % API_TOKEN}
- The requests in the following sections use roughly the same pattern:
- Prepare your request payload as a json object
- Prepare your request url
- Send your request using the correct verb (get, post, put, delete)
- Check if the request was successful
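The steps above can be sketched as two tiny helpers (the names are ours, not part of any library; the url layout matches the examples that follow):

```python
API_URL = 'https://openenergy-platform.org/api/v0'

def table_url(table_name, schema='model_draft'):
    # build the request url for a table (step 2 of the pattern)
    return API_URL + '/schema/' + schema + '/tables/' + table_name

def auth_header(token):
    # header carried by the session on every request (step 1 setup)
    return {'Authorization': 'Token %s' % token}
```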
Create a new table

- You will at first create the tables in the model_draft schema. After a successful review, the table will be moved to the final target schema.
- You need to specify the name of the new table (TABLE_NAME), which should be a valid postgresql table name, without spaces, ideally only containing lower case letters, numbers and underscores.
- You also need to specify names and data types of your columns, which must also be valid postgresql data types.
# prepare request payload
data = {'query': {
'columns': [
{
'name': 'id',
'data_type': 'bigserial'
},
# add more columns here
],
'constraints': [
{'constraint_type': 'PRIMARY KEY', 'constraint_parameter': 'id'}
]
}}
# prepare api url
url = API_URL + '/schema/model_draft/tables/' + TABLE_NAME
# make request using PUT and check response
res = session.put(url, json=data)
res.raise_for_status() # check: throws exception if not successful
Upload data

- To upload data, you must first load it into a json structure: a list representing data rows, each of which is a dictionary mapping column names to values.
- In the example, we will use pandas to read data from an Excel workbook (WORKBOOK, WORKSHEET) into a data frame, which we will then convert into a json object. Please note that this step will most likely require some modification to accommodate the specifics of your input data.
- In addition, at the end, you need to load your data into the specified json structure.
- After that, the data can be uploaded by making a request to the API:
import json

import pandas as pd

# load data into dataframe, convert into json
df = pd.read_excel(WORKBOOK, WORKSHEET)
records = df.to_json(orient='records')
records = json.loads(records)
# prepare request payload
data = {'query': records}
# prepare api url
url = API_URL + '/schema/model_draft/tables/' + TABLE_NAME + '/rows/new'
# make request
res = session.post(url, json=data)
res.raise_for_status() # check
- You can repeat this if you want to upload your data in multiple batches.
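Splitting the rows into batches can be sketched like this (batches is our helper, not part of the client):

```python
def batches(records, size=1000):
    # split the list of row dicts into chunks of at most `size` rows
    for i in range(0, len(records), size):
        yield records[i:i + size]

# hypothetical usage, with session and url as defined above:
# for chunk in batches(records):
#     res = session.post(url, json={'query': chunk})
#     res.raise_for_status()
```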
Starting over: Deleting your table
- While the table is still in the model draft, you can always delete the table and start over:
# prepare api url
url = API_URL + '/schema/model_draft/tables/' + TABLE_NAME
# make request
res = session.delete(url)
res.raise_for_status() # check
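Downloading data is listed as a common task above, although the original examples do not show it; a sketch might look like this (the GET /rows endpoint is an assumption mirroring the upload url, and the function names are ours):

```python
API_URL = 'https://openenergy-platform.org/api/v0'

def rows_url(table_name):
    # assumed endpoint: the upload url without the trailing '/new'
    return API_URL + '/schema/model_draft/tables/' + table_name + '/rows'

def download_rows(session, table_name):
    # returns the table content as a list of row dicts
    res = session.get(rows_url(table_name))
    res.raise_for_status()  # check
    return res.json()
```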