Tools for managing Pastas timeseries models

pastastore

This module contains a tool to manage Pastas timeseries and models in a database.

Storing timeseries and models in a database gives the user a simple way to manage Pastas projects, with the added bonus of allowing the user to pick up where they left off without having to (re)load everything into memory.

The connection to database/disk/memory is managed by a connector object. Currently, four connectors are included. The first implementation is an in-memory connector. The other three store data on disk or in a database. The PasConnector implementation writes human-readable JSON files to disk. The ArcticConnector and PystoreConnector implementations are designed to have fast read/write operations, while also compressing the stored data.

  • In-memory: uses dictionaries to hold timeseries and pastas Models in-memory. Does not require any additional packages to use.

  • Pastas: uses Pastas write and read methods to store data as JSON files on disk. Does not require any additional packages to use.

  • Arctic is a timeseries/dataframe database that sits atop MongoDB. Arctic supports pandas.DataFrames.

  • PyStore is a datastore (inspired by Arctic) created for storing pandas dataframes (especially timeseries) on disk. Data is stored using fastparquet and compressed with Snappy.
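The connector idea above can be illustrated with a minimal sketch: a named object that holds timeseries in plain dictionaries, as the in-memory connector does. The class and method names below are hypothetical simplifications for illustration, not the actual pastastore API.

```python
# Hypothetical sketch of the in-memory connector idea: a named store
# keeping timeseries and models in plain dictionaries.
class InMemoryStore:
    def __init__(self, name):
        self.name = name
        self._oseries = {}  # observed timeseries, keyed by name
        self._models = {}   # model definitions, keyed by name

    def add_oseries(self, name, series):
        self._oseries[name] = series

    def get_oseries(self, name):
        return self._oseries[name]

store = InMemoryStore("my_connector")
store.add_oseries("well_1", [1.2, 1.3, 1.1])
print(store.get_oseries("well_1"))  # → [1.2, 1.3, 1.1]
```

The real connectors share this interface but differ in where the data ends up: memory, JSON files, MongoDB, or compressed parquet files.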

Installation

Install the module by typing pip install pastastore.

For installing in development mode, clone the repository and install by typing pip install -e . from the module root directory.

For plotting background maps, the contextily and pyproj packages are required. For a full install, including optional dependencies for plotting and labeling data on maps, use pip install pastastore[full] (or pip install .[full]) on MacOS or Linux. Windows users are asked to install rasterio themselves, since it often cannot be installed using pip; rasterio is a dependency of contextily. Windows users can install pastastore with the optional labeling package adjustText using pip install pastastore[adjusttext] (or pip install .[adjusttext]).

The pystore and arctic connectors have external dependencies that are not installed automatically; see the Connector Dependencies section below for installation instructions.

Usage

The following snippets show typical usage. The general idea is to first define the connector object. The next step is to pass that connector to PastaStore.

Using in-memory dictionaries

This works out of the box after installing with pip without installing any additional Python dependencies or external software.

import pastastore as pst

# define connector
conn = pst.DictConnector("my_connector")

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)

Using Pastas read/load methods

Store data on disk as JSON files (with .pas extension) using Pastas read and load methods. This works out of the box after installing with pip without installing any additional Python dependencies or external software.

import pastastore as pst

# define connector
path = "./data/pas"
conn = pst.PasConnector("my_connector", path)

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)
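Conceptually, the JSON-on-disk approach amounts to serializing each series to its own file and reading it back on demand. The helper functions below are a hypothetical stdlib-only sketch of that idea, not the pastastore implementation (only the .pas extension is taken from the description above).

```python
import json
import tempfile
from pathlib import Path

# Hypothetical sketch of a JSON-on-disk store, illustrating the idea
# behind PasConnector: one human-readable file per timeseries.
def save_series(directory, name, data):
    path = Path(directory) / f"{name}.pas"  # pastastore uses the .pas extension
    path.write_text(json.dumps(data))
    return path

def load_series(directory, name):
    path = Path(directory) / f"{name}.pas"
    return json.loads(path.read_text())

with tempfile.TemporaryDirectory() as d:
    save_series(d, "well_1", {"2020-01-01": 1.2, "2020-01-02": 1.3})
    print(load_series(d, "well_1"))  # → {'2020-01-01': 1.2, '2020-01-02': 1.3}
```

Because the files are plain JSON, they remain readable and editable outside of Python, which is the main appeal of this connector.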

Using Arctic

Store data in MongoDB using Arctic. Only works if there is an instance of MongoDB running somewhere.

import pastastore as pst

# define arctic connector
connstr = "mongodb://localhost:27017/"  # local instance of mongodb
conn = pst.ArcticConnector("my_connector", connstr)

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)

Using Pystore

Store data on disk as parquet files using compression. Only works if python-snappy and pystore are installed.

import pastastore as pst

# define pystore connector
path = "./data/pystore"  # path to a directory
conn = pst.PystoreConnector("my_connector", path)

# create project for managing Pastas data and models
store = pst.PastaStore("my_project", conn)

The database read/write/delete methods can be accessed through the reference to the connector object. For easy access, the most common methods are also registered on the store object, e.g.

series = store.conn.get_oseries("my_oseries")

is equivalent to:

series = store.get_oseries("my_oseries")
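The "registered methods" pattern behind this equivalence can be sketched as simple delegation: the store binds selected connector methods onto itself at construction time. The classes below are a hypothetical illustration, not the actual pastastore implementation.

```python
# Hypothetical sketch of registering a connector's common methods
# directly on the store object for convenient access.
class Connector:
    def __init__(self):
        self._data = {"my_oseries": [1.0, 2.0]}

    def get_oseries(self, name):
        return self._data[name]

class Store:
    def __init__(self, name, conn):
        self.name = name
        self.conn = conn
        # register the most common connector methods on the store
        self.get_oseries = conn.get_oseries

store = Store("my_project", Connector())
# both calls reach the same underlying connector method
assert store.get_oseries("my_oseries") == store.conn.get_oseries("my_oseries")
```

This keeps the connector as the single source of truth while sparing the user the extra `.conn` attribute lookup for everyday operations.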

Connector Dependencies

This module has several dependencies (depending on which connector is used):

If using DictConnector or PasConnector:

  • No additional dependencies are required.

If using ArcticConnector:

  • Arctic requires MongoDB, e.g. install the Community edition (Windows, MacOS).

  • OR, if you wish to use Docker for running MongoDB see the installation instructions here.

If using PystoreConnector:

  • PyStore requires the python-snappy and pystore packages to be installed.

