
ligo-scald

SCalable Analytics for LIGO/Virgo/Kagra Data


Version: 0.8.5
Web: https://docs.ligo.org/gstlal-visualisation/ligo-scald
Source: http://software.igwn.org/lscsoft/source/ligo-scald-0.8.5.tar.gz

ligo-scald is a dynamic data visualization and monitoring tool for gravitational-wave data.

Features:

  • A web-based dashboard for visualizing and exploring real-time and historical data.
  • Streaming timeseries, heatmap, and 'latest' visualizations.
  • Utilities for storing and aggregating timeseries data, accessible via an HTTP API.
  • Nagios monitoring based on thresholds or job heartbeats.
  • Full integration with InfluxDB, a timeseries database, as a data backend.
  • A mock database to serve fake data for testing purposes.

ligo-scald also provides a command line interface for various tasks:

  • scald serve: serve data and dynamic HTML pages
  • scald deploy: deploy a web application on the LDG
  • scald mock: start up a mock database that generates data based on HTTP requests
  • scald report: generate offline HTML reports

Installation

With pip:

pip install ligo-scald

With conda:

conda install -c conda-forge ligo-scald

CLI usage:

Serve data locally:

scald serve -c /path/to/config.yml

Mock a database to serve HTTP requests and return fake data:

scald mock

Deploy a CGI-based web application on the LIGO Data Grid to serve the dashboard and data requests:

scald deploy -c /path/to/config.yml -o /path/to/public_html -n web_application_name

A full list of commands can be viewed with:

scald --help

A list of all command-line options can be displayed for any command:

scald serve --help

Store data in InfluxDB using the Aggregator class:

from ligo.scald.io import influx

# instantiate the aggregator
aggregator = influx.Aggregator(hostname='influx.hostname', port=8086, db='your_database')

# register a measurement schema (how data is stored in the backend)
measurement = 'my_meas'
columns = ('column1', 'column2')
column_key = 'column1'
tags = ('tag1', 'tag2')
tag_key = 'tag2'

aggregator.register_schema(measurement, columns, column_key, tags, tag_key)

# store and aggregate data; the dictionary keys below are
# (tag1, tag2) value pairs identifying each series

# option 1: store data in row form
row_1 = {'time': 1234567890, 'fields': {'column1': 1.2, 'column2': 0.3}}
row_2 = {'time': 1234567890.5, 'fields': {'column1': 0.3, 'column2': 0.4}}

row_3 = {'time': 1234567890, 'fields': {'column1': 2.3, 'column2': 1.1}}
row_4 = {'time': 1234567890.5, 'fields': {'column1': 0.1, 'column2': 2.3}}

rows = {('001', 'andrew'): [row_1, row_2], ('002', 'parce'): [row_3, row_4]}

aggregator.store_rows(measurement, rows)

# option 2: store data in column form
cols_1 = {
    'time': [1234567890, 1234567890.5],
    'fields': {'column1': [1.2, 0.3], 'column2': [0.3, 0.4]}
}
cols_2 = {
    'time': [1234567890, 1234567890.5],
    'fields': {'column1': [2.3, 0.1], 'column2': [1.1, 2.3]}
}
cols = {('001', 'andrew'): cols_1, ('002', 'parce'): cols_2}

aggregator.store_columns(measurement, cols)
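The two forms carry the same data, so either can be produced from the other. As an illustration (this helper is not part of ligo-scald), rows can be converted to the column form like so:

```python
def rows_to_columns(rows):
    """Convert a list of row dicts ({'time': ..., 'fields': {...}})
    into the equivalent column-form dict."""
    cols = {'time': [], 'fields': {}}
    for row in rows:
        cols['time'].append(row['time'])
        for name, value in row['fields'].items():
            cols['fields'].setdefault(name, []).append(value)
    return cols

row_1 = {'time': 1234567890, 'fields': {'column1': 1.2, 'column2': 0.3}}
row_2 = {'time': 1234567890.5, 'fields': {'column1': 0.3, 'column2': 0.4}}

print(rows_to_columns([row_1, row_2]))
# {'time': [1234567890, 1234567890.5], 'fields': {'column1': [1.2, 0.3], 'column2': [0.3, 0.4]}}
```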

Using HTTPS

An HTTPS connection to the database backend can be enabled in the configuration by specifying auth and https, e.g.:

backends:
  default:
    backend: influxdb
    db: <db_name>
    hostname: <host_name>
    port: <port>
    auth: true
    https: true

Scald will need database authentication credentials, which can be provided in a .netrc file inside the Scald config directory.
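For reference, a standard .netrc entry has this shape (the host, login, and password values below are placeholders):

```
machine <host_name>
login <username>
password <password>
```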

For the HTTPS connection, Scald will need an SSL cert, whose path is specified as an environment variable, e.g.:

SCALD_SSL_CA_CERT="/etc/pki/tls/certs/<cert_name>"

Quickstart

In two separate terminals, first run:

scald mock

Then

scald serve -c /path/to/example_config.yml

With the above two processes running, you should be able to navigate to localhost:8080 in your web browser.
