
Stingray


Station statistics gather and dump utility, which continuously writes station statistics to the local S3-based object storage.

Installation

pip install .
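
The project is also published on PyPI as lofar-stingray (the release described on this page is 1.0.1), so it can likewise be installed directly from there:

pip install lofar-stingray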

Usage

To forward (copy) statistics packets, metadata, or matrices from one place to another, use the following command:

l2ss-stingray-forward <source> <destination> --datatype=packet|json

The following locations are supported as source and destination:

  • file:<path>: read from or write to a file on disk,
  • tcp://<host>:<port>: receive from or write to a TCP server,
  • udp://<host>:<port>: receive on or write to a UDP server,
  • s3://<host>/<bucket>/<path>: write to an S3 store as JSON (destination only),
  • zmq+tcp://<host>:<port>/<topic>: subscribe to a ZMQ server and topic (source only).
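
For example, to capture incoming (binary) statistics packets from a UDP port into a file on disk (the port and filename below are illustrative, not taken from this page):

l2ss-stingray-forward udp://0.0.0.0:5000 file:packets.bin --datatype=packet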

The packet datatype is used to process (binary) statistics packets from SDP, and the json datatype is used to process lines of JSON, the encoding used for metadata and matrices.

To convert statistics packets into matrices and publish those using ZMQ, use the following command:

l2ss-stingray-publish <station> <antennafield> <type> <source>

To extract a set of matrices from disk, annotate them with metadata, and write them as HDF5 files, use:

l2ss-stingray-extract <station> <antennafield> <type> <from> <to> <source> <destination>
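
A hypothetical invocation might look as follows; the station, antenna field, and type match the example below, but the timestamp format and the source/destination paths are illustrative assumptions, as this page does not document them:

l2ss-stingray-extract cs123 hba xst 2024-01-01T00:00:00 2024-01-01T01:00:00 file:matrices.txt file:xst.h5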

Example

The following commands, when started in parallel in the order listed, will convert the XST packets in tests/xst-packets.bin to JSON matrices in xst-matrices.txt:

# start converter & publisher
l2ss-stingray-publish cs123 hba xst udp://0:5000

# catch output of publisher
l2ss-stingray-forward -d json 'zmq+tcp://localhost:6001/xst?content_type=application/json' file:xst-matrices.txt

# provide input to converter
l2ss-stingray-forward file:tests/xst-packets.bin udp://127.0.0.1:5000
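
Since the json datatype writes one JSON document per line, a quick way to inspect the first converted matrix is:

# pretty-print the first matrix written by the forwarder
head -n 1 xst-matrices.txt | python -m json.tool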

Contributing

To contribute, please create a feature branch and a "Draft" merge request. Upon completion, the merge request should be marked as ready and a reviewer should be assigned.

Verify your changes locally and be sure to add tests. Local verification is done through tox:

pip install tox

With tox, the same jobs that run on the CI/CD pipeline can be run locally. These include unit tests and linting.

tox
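
Individual environments can also be run on their own. Which environments are defined depends on the project's tox configuration, which this page does not list; they can be enumerated with:

# list the configured environments (tox 4 syntax; use "tox -l" with tox 3)
tox list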

To automatically apply most suggested linting changes, execute:

tox -e format

License

This project is licensed under the Apache License, Version 2.0.
