Tools for managing pastas projects

Project description


pastastore

This module contains a tool to manage Pastas timeseries and models in a database.

Storing timeseries and models in a database gives the user a simple way to manage Pastas projects, with the added bonus of allowing the user to pick up where they left off, without having to (re)load everything into memory.

The connection to database/disk/memory is managed by a connector object. Currently, four connectors are included. The first implementation is an in-memory connector. The other three store data on disk or in a database. The PasConnector implementation writes human-readable JSON files to disk. The ArcticConnector and PystoreConnector implementations are designed to have fast read/write operations, while also compressing the stored data.

  • In-memory: uses dictionaries to hold timeseries and Pastas models in memory. Does not require any additional packages to use.

  • Pastas: uses Pastas write and read methods to store data as JSON files on disk. Does not require any additional packages to use.

  • Arctic: a timeseries/dataframe database that sits atop MongoDB. Arctic supports pandas.DataFrames.

  • PyStore: a datastore (inspired by Arctic) created for storing pandas DataFrames (especially timeseries) on disk. Data is stored using fastparquet and compressed with Snappy.

Installation

Install the module by typing pip install pastastore.

To install in development mode, clone the repository and install by typing pip install -e . from the module root directory.

Please note that there are external dependencies when using connectors based on pystore or arctic. These dependencies are not automatically installed (see the Dependencies section below)!

Usage

The following snippets show typical usage. The general idea is to first define the connector object and then pass that connector to PastaStore.

Using in-memory dictionaries

This works out of the box after installing with pip without installing any additional Python dependencies or external software.

import pastastore as pst

# define connector
conn = pst.DictConnector("my_connector")

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)

Using Pastas read/load methods

Store data on disk as JSON files (with .pas extension) using Pastas read and load methods. This works out of the box after installing with pip without installing any additional Python dependencies or external software.

import pastastore as pst

# define connector
path = "./data/pas"
conn = pst.PasConnector("my_connector")

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)
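
Because the PasConnector writes plain JSON files to disk, a later session can reconnect to the same directory and pick up where it left off. The sketch below illustrates that idea; the add_oseries call is an assumption (only get_oseries appears elsewhere in this description), so check the pastastore documentation for the exact method names and signatures.

import pandas as pd
import pastastore as pst

# first session: write an observation series to disk
path = "./data/pas"
conn = pst.PasConnector("my_connector", path)
store = pst.PastaStore("my_project", conn)

oseries = pd.Series(
    [1.0, 1.1, 1.2],
    index=pd.date_range("2020-01-01", periods=3, freq="D"),
)
store.add_oseries(oseries, "my_oseries")  # assumed method name

# later session: reconnect to the same directory and read the series back
conn = pst.PasConnector("my_connector", path)
store = pst.PastaStore("my_project", conn)
series = store.get_oseries("my_oseries")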

Using Arctic

Store data in MongoDB using Arctic. Only works if there is an instance of MongoDB running somewhere.

import pastastore as pst

# define arctic connector
connstr = "mongodb://localhost:27017/"  # local instance of mongodb
conn = pst.ArcticConnector("my_connector", connstr)

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)

Using Pystore

Store data on disk as parquet files using compression. Only works if python-snappy and pystore are installed.

import pastastore as pst

# define pystore connector
path = "./data/pystore"  # path to a directory
conn = pst.PystoreConnector("my_connector", path)

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)

The database read/write/delete methods can be accessed through the connector object. For convenience, the most common methods are also registered on the store object. For example,

series = store.conn.get_oseries("my_oseries")

is equivalent to:

series = store.get_oseries("my_oseries")
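
Writing and deleting work the same way. The snippet below is a minimal sketch of a full round trip using the in-memory connector; the add_oseries and del_oseries names are assumptions modelled on the get_oseries call above, so verify them against the pastastore documentation.

import pandas as pd
import pastastore as pst

conn = pst.DictConnector("my_connector")
store = pst.PastaStore("my_project", conn)

# write a series (add_oseries is an assumed method name)
oseries = pd.Series(
    [1.0, 1.1, 1.2],
    index=pd.date_range("2020-01-01", periods=3, freq="D"),
)
store.add_oseries(oseries, "my_oseries")

# read it back, either via the connector or via the store
series = store.conn.get_oseries("my_oseries")
series = store.get_oseries("my_oseries")

# delete the series again (del_oseries is an assumed method name)
store.del_oseries("my_oseries")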

Dependencies

This module has several dependencies (depending on which connector is used):

If using DictConnector or PasConnector:

  • No additional dependencies are required.

If using ArcticConnector:

  • Arctic requires MongoDB, e.g. install the Community edition (Windows, macOS).

  • OR, if you wish to use Docker for running MongoDB, see the installation instructions here.

If using PystoreConnector:

  • PyStore requires the pystore and python-snappy packages; these are not installed automatically.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

pastastore-0.4.0.tar.gz (24.0 kB)

Uploaded Source

Built Distribution

pastastore-0.4.0-py3-none-any.whl (25.2 kB)

Uploaded Python 3

File details

Details for the file pastastore-0.4.0.tar.gz.

File metadata

  • Download URL: pastastore-0.4.0.tar.gz
  • Upload date:
  • Size: 24.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.0 importlib_metadata/3.7.3 packaging/20.9 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.9.2

File hashes

Hashes for pastastore-0.4.0.tar.gz
Algorithm Hash digest
SHA256 9a2d80a2e4cb98bdd798dcfb95499f250940ebcd29c2c6fc4579b7b905c6b640
MD5 3d6886ceac5dd7fb6ff7284abcfb0dba
BLAKE2b-256 f4b178e5937f890dfff23b8b82db93e04017520b848c86d9adca9e59ed4a65f3

See more details on using hashes here.

File details

Details for the file pastastore-0.4.0-py3-none-any.whl.

File metadata

  • Download URL: pastastore-0.4.0-py3-none-any.whl
  • Upload date:
  • Size: 25.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.0 importlib_metadata/3.7.3 packaging/20.9 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.59.0 CPython/3.9.2

File hashes

Hashes for pastastore-0.4.0-py3-none-any.whl
Algorithm Hash digest
SHA256 e0eac78b63a5427cad6547fcb857658b042f61a2ddd347709e25cd67ffa6bd69
MD5 231b50dce7c840b22a235babbba20835
BLAKE2b-256 e81c1ba125472bdccc76634637c602e09361f61284a58652cf31f547f09bfc30

See more details on using hashes here.
