dgpost: datagram post-processing toolkit
A set of tools to post-process raw instrument data in yadg's datagram format, and tabulated data imported into pd.DataFrames.
Capabilities:
dgpost is intended to be used as part of your data processing pipeline, and works best with series of timestamped data.
Write a recipe in YAML, and post-process your data from yadg.datagrams or pd.DataFrames in a reproducible fashion, while keeping provenance information and without touching the original data files. Post-process your data into pre-defined figures for your reports, or simply export your collated pd.DataFrame into one of the several supported formats!
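Once a recipe file exists, the whole pipeline can also be driven from Python rather than the command line. The sketch below is an assumption-laden illustration only: the name of the entry point (a top-level run() helper) and the file name "recipe.yaml" are placeholders, and the actual entry point and recipe schema are described in the project documentation.

    import dgpost

    # ASSUMPTION: dgpost exposes a run() helper that takes the path to a
    # YAML recipe; check the dgpost documentation for the actual entry point.
    dgpost.run("recipe.yaml")  # executes the load -> extract -> transform -> save steps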
Use dgpost in your Jupyter notebooks by importing it as a Python package: import dgpost.utils to access the top-level functions for loading, extracting and exporting data; or import dgpost.transform to access the library of validated transform functions.
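As a quick example of interactive use, the snippet below relies only on the two imports named above and simply lists what they expose; no further function names are assumed here:

    import dgpost.utils
    import dgpost.transform

    # list the top-level helpers for loading, extracting and exporting data
    print([name for name in dir(dgpost.utils) if not name.startswith("_")])

    # list the available libraries of validated transform functions
    print([name for name in dir(dgpost.transform) if not name.startswith("_")])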
Features:
dgpost can load data from multiple file formats, extract data from those files into pd.DataFrames and automatically interpolate the datapoints along the time axis (generally the index of the pd.DataFrame) as necessary, transform the created tables using functions from the built-in library, plot data from those tables using its matplotlib interface, and save the tables into several output formats.
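The time-axis alignment mentioned above can be pictured with plain pandas; this is not dgpost's internal code, only an illustration of index-based interpolation between two tables sampled at different rates, using made-up numbers:

    import pandas as pd

    # two tables indexed by timestamp (in seconds), sampled at different rates
    flow = pd.DataFrame({"flow": [1.0, 1.2, 1.1]}, index=[0.0, 60.0, 120.0])
    temp = pd.DataFrame({"T": [300.0, 310.0]}, index=[30.0, 90.0])

    # put T onto the flow table's time axis by index-based linear interpolation
    union = flow.index.union(temp.index)
    aligned = temp.reindex(union).interpolate(method="index").reindex(flow.index)
    print(flow.join(aligned))
    # index 0.0 has no earlier T value, so it stays NaN; 60.0 -> 305.0; 120.0 -> 310.0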
Of course, dgpost is fully unit-aware and supports values with uncertainties, using pint.Quantity and uncertainties.ufloat under the hood.
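The two underlying libraries can also be combined directly; the snippet below uses only pint and uncertainties, with made-up numbers, to show how unit-aware values with uncertainties behave:

    import pint
    from uncertainties import ufloat

    ureg = pint.UnitRegistry()

    # a concentration and a volume, each with an absolute uncertainty and units
    conc = ureg.Quantity(ufloat(0.25, 0.01), "mol/l")
    vol = ureg.Quantity(ufloat(0.50, 0.005), "l")

    # arithmetic propagates both the units and the uncertainties
    amount = (conc * vol).to("mmol")
    print(amount)  # roughly 125 +/- 5 millimole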
For a further overview of features, see the project documentation.