
LOFAR station statistics gather and dump utility; writes station statistics continuously to the local S3-based object storage.

Project description

Stingray



Installation

pip install .

Usage

To forward (copy) statistics packets, metadata, or matrices from one place to another, use the following command:

l2ss-stingray-forward <source> <destination> --datatype=packet|json

These locations are supported for source and destination:

  • file:<path>: read/write from a file on disk,
  • tcp://<host>:<port>: receive from/write to a TCP server,
  • udp://<host>:<port>: receive on/write to a UDP server,
  • s3://<host>/<bucket>/<path>: write to an S3 store as JSON (destination only),
  • zmq+tcp://<host>:<port>/<topic>: subscribe to a ZMQ server and topic (source only).

The packet datatype processes (binary) statistics packets from SDP, while the json datatype processes lines of JSON, the encoding used for metadata and matrices.
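As an illustration of the json datatype, each message is one JSON object per line (newline-delimited JSON). The field names in this sketch are invented for illustration and are not Stingray's actual schema:

```python
# Hypothetical illustration of line-delimited JSON as used for the
# "json" datatype: one standalone JSON object per line. The field
# names ("timestamp", "data") are assumptions for this sketch only.
import io
import json

sample = io.StringIO(
    '{"timestamp": "2024-01-01T00:00:00Z", "data": [[1, 2], [3, 4]]}\n'
    '{"timestamp": "2024-01-01T00:00:01Z", "data": [[5, 6], [7, 8]]}\n'
)

# Parse each non-empty line as one independent JSON document.
records = [json.loads(line) for line in sample if line.strip()]
print(len(records))        # 2
print(records[1]["data"])  # [[5, 6], [7, 8]]
```

Because every line is a complete document, a forwarder can stream, split, or append such data without any framing beyond the newline.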

To convert statistics packets into matrices and publish those using ZMQ, use the following command:

l2ss-stingray-publish <station> <antennafield> <type> <source>

To extract a set of matrices from disk, annotate them with metadata, and write them as HDF5 files, use:

l2ss-stingray-extract <station> <antennafield> <type> <from> <to> <source> <destination>

Example

The following commands, when started in parallel in the order listed, will convert the XST packets in tests/xst-packets.bin to JSON matrices in xst-matrices.txt:

# start converter & publisher
l2ss-stingray-publish cs123 hba xst udp://0:5000

# catch output of publisher
l2ss-stingray-forward -d json 'zmq+tcp://localhost:6001/xst?content_type=application/json' file:xst-matrices.txt

# provide input to converter
l2ss-stingray-forward file:tests/xst-packets.bin udp://127.0.0.1:5000
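The zmq+tcp source URL in the second command bundles the ZMQ endpoint, the subscription topic, and a content_type query parameter into a single string. A rough sketch of how such a URL decomposes (the parsing shown is illustrative; Stingray's own handling may differ):

```python
# Illustrative decomposition of a zmq+tcp source URL; not Stingray's
# actual parsing code.
from urllib.parse import urlparse, parse_qs

url = "zmq+tcp://localhost:6001/xst?content_type=application/json"
parts = urlparse(url)

endpoint = f"tcp://{parts.hostname}:{parts.port}"  # address a ZMQ socket would connect to
topic = parts.path.lstrip("/")                     # subscription topic
params = parse_qs(parts.query)                     # extra options, e.g. content_type

print(endpoint, topic, params["content_type"][0])
# tcp://localhost:6001 xst application/json
```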

Contributing

To contribute, please create a feature branch and a "Draft" merge request. Upon completion, the merge request should be marked as ready and a reviewer should be assigned.

Verify your changes locally and be sure to add tests. Local verification is done through tox, which you can install with pip:

pip install tox

With tox you can run the same jobs as the CI/CD pipeline, including unit tests and linting:

tox

To automatically apply most suggested linting changes, execute:

tox -e format

License

This project is licensed under the Apache License, Version 2.0.

Download files


Source Distribution

lofar_stingray-1.0.2.tar.gz (108.3 kB view details)


Built Distribution


lofar_stingray-1.0.2-py3-none-any.whl (66.5 kB view details)


File details

Details for the file lofar_stingray-1.0.2.tar.gz.

File metadata

  • Download URL: lofar_stingray-1.0.2.tar.gz
  • Upload date:
  • Size: 108.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.13

File hashes

Hashes for lofar_stingray-1.0.2.tar.gz:

  • SHA256: 1818586b7a399e56caaed9e694e90fe447c85cad21e9d7bb2717d051e26ce4dc
  • MD5: 3f41ca5990eda059ce8afcef0e896288
  • BLAKE2b-256: 17cb01f3ee367bcdfda2e8a09299ed031497001524abfe36ec00581895a7e495


File details

Details for the file lofar_stingray-1.0.2-py3-none-any.whl.

File metadata

  • Download URL: lofar_stingray-1.0.2-py3-none-any.whl
  • Upload date:
  • Size: 66.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.13.13

File hashes

Hashes for lofar_stingray-1.0.2-py3-none-any.whl:

  • SHA256: 2e361dc90b4274367b4739096c1803d93b527bf025ffb98d6d05b54e804880d1
  • MD5: fc221c267156259780209f9c273980b8
  • BLAKE2b-256: 74ad1b32ee7a8ac98d5675cc9ebd5e39c877608d54f1b4bbad509fe6cd00932d

