Project description

NeXus Streamer

Streams event data and metadata from a NeXus file into Kafka, mimicking data acquisition from a live instrument. This facilitates testing software which consumes these data.

This Python implementation is intended to replace a C++ implementation (https://github.com/ess-dmsc/NeXus-Streamer) and should require much less effort to maintain.

Installation

Python 3.7 or higher is required. https://www.python.org/downloads/

To install from PyPI, run

pip install nexus-streamer

or, to install with conda (not possible on Windows, because the confluent-kafka package does not support Windows), run

conda install -c conda-forge -c ess-dmsc nexus-streamer

and check that the installation was successful by running

nexus_streamer --help

On Windows you may need to add your Python environment's Scripts directory to PATH for the command to work.

Usage

usage: nexus_streamer [-h]
                      [--graylog-logger-address GRAYLOG_LOGGER_ADDRESS]
                      [--log-file LOG_FILE] [-c CONFIG_FILE]
                      [-v {Trace,Debug,Warning,Error,Critical}] -f
                      FILENAME [--json-description JSON_DESCRIPTION] -b
                      BROKER -i INSTRUMENT [-s] [-z] [--isis-file]
                      [-e FAKE_EVENTS_PER_PULSE] [-d DET_SPEC_MAP]

NeXus Streamer

optional arguments:
  -h, --help            show this help message and exit
  --graylog-logger-address GRAYLOG_LOGGER_ADDRESS
                        <host:port> Log to Graylog [env var:
                        GRAYLOG_LOGGER_ADDRESS]
  --log-file LOG_FILE   Log filename [env var: LOG_FILE]
  -c CONFIG_FILE, --config-file CONFIG_FILE
                        Read configuration from an ini file [env var:
                        CONFIG_FILE]
  -v {Trace,Debug,Warning,Error,Critical}, --verbosity {Trace,Debug,Warning,Error,Critical}
                        Set logging level [env var: VERBOSITY]
  -f FILENAME, --filename FILENAME
                        NeXus file to stream data from [env var: FILENAME]
  --json-description JSON_DESCRIPTION
                        If provided use this JSON template instead of
                        generating one from the NeXus file [env var:
                        JSON_FILENAME]
  -b BROKER, --broker BROKER
                        <host[:port]> Kafka broker to forward data into [env
                        var: BROKER]
  -i INSTRUMENT, --instrument INSTRUMENT
                        Used as prefix for topic names [env var: INSTRUMENT]
  -s, --slow            Stream data into Kafka at approx realistic rate (uses
                        timestamps from file) [env var: SLOW]
  -z, --single-run      Publish only a single run (otherwise repeats until
                        interrupted) [env var: SINGLE_RUN]
  --isis-file           Include ISIS-specific data in event data messages and
                        detector-spectrum map if found in file [env var:
                        ISIS_FILE]
  -e FAKE_EVENTS_PER_PULSE, --fake-events-per-pulse FAKE_EVENTS_PER_PULSE
                        Generates this number of fake events per pulse, per
                        event data group, instead of publishing real data
                        from the file [env var: FAKE_EVENTS]
  -d DET_SPEC_MAP, --det-spec-map DET_SPEC_MAP
                        Full path of a detector-spectrum map file which may 
                        be required for files from ISIS [env var: DET_SPEC_MAP]

Args that start with '--' (eg. --graylog-logger-address) can also be set in a
config file (specified via -c). Config file syntax allows: key=value,
flag=true, stuff=[a,b,c] (for details, see syntax at https://goo.gl/R74nmi).
If an arg is specified in more than one place, then commandline values
override environment variables which override config file values which
override defaults.
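
For example, a hypothetical invocation might look like the following (the file path, broker address and instrument name are placeholder values):

nexus_streamer --filename /data/example_run.nxs --broker localhost:9092 --instrument TEST --single-run

The same options could instead be kept in an ini file (here a hypothetical streamer.ini) and passed with -c:

filename = /data/example_run.nxs
broker = localhost:9092
instrument = TEST
single-run = true

nexus_streamer -c streamer.ini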

If --fake-events-per-pulse is used, each generated fake event comprises a random detector id, selected from the detector's ids, and a random time-of-flight between 10 and 10000 milliseconds. The intention is to provide a specified quantity of data for performance testing of consuming applications.
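
As an illustrative sketch of that scheme (not the tool's actual implementation; numpy is assumed):

import numpy as np

# Sketch of the fake event generation described above: random detector
# ids drawn from the detector's ids, and random time-of-flight values
# between 10 and 10000 milliseconds.
def fake_pulse_events(detector_ids, events_per_pulse):
    rng = np.random.default_rng()
    event_id = rng.choice(detector_ids, size=events_per_pulse)
    time_of_flight_ms = rng.uniform(10.0, 10000.0, size=events_per_pulse)
    return event_id, time_of_flight_ms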

Minimum requirements of the file

The NeXus file used must have an NXentry group containing a start_time dataset which holds the run start time as an ISO 8601 string.

NXevent_data and NXlog groups will be found wherever they are in the file and streamed to Kafka. All time and value datasets must have a units attribute.

If --fake-events-per-pulse is used then each NXevent_data group must be inside an NXdetector group which has a detector_number dataset.
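
As a rough sketch, a file meeting these minimum requirements could be written with h5py as follows (group names and data values are placeholders; the event and log field names follow the NXevent_data and NXlog base classes):

import h5py
import numpy as np

with h5py.File("minimal.nxs", "w") as f:
    # NXentry with the required ISO 8601 run start time
    entry = f.create_group("entry")
    entry.attrs["NX_class"] = "NXentry"
    entry.create_dataset("start_time", data="2021-05-01T10:00:00Z")

    # NXlog: time and value datasets must each have a units attribute
    log = entry.create_group("temperature")
    log.attrs["NX_class"] = "NXlog"
    log_time = log.create_dataset("time", data=np.array([0.0, 1.0, 2.0]))
    log_time.attrs["units"] = "s"
    log_value = log.create_dataset("value", data=np.array([291.0, 292.0, 291.5]))
    log_value.attrs["units"] = "K"

    # NXevent_data inside an NXdetector with a detector_number dataset,
    # as required when --fake-events-per-pulse is used
    detector = entry.create_group("detector_1")
    detector.attrs["NX_class"] = "NXdetector"
    detector.create_dataset("detector_number", data=np.arange(1, 101))
    events = detector.create_group("events")
    events.attrs["NX_class"] = "NXevent_data"
    events.create_dataset("event_id", data=np.array([1, 2, 3], dtype=np.int32))
    tof = events.create_dataset("event_time_offset", data=np.array([1.0e5, 2.0e5, 3.0e5]))
    tof.attrs["units"] = "ns"
    pulse_time = events.create_dataset("event_time_zero", data=np.array([0.0]))
    pulse_time.attrs["units"] = "ns"
    events.create_dataset("event_index", data=np.array([0], dtype=np.uint64))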

Developer information

See README-dev.md

Download files

Download the file for your platform.

Source Distribution

nexus-streamer-0.2.0.tar.gz (18.2 kB)


Built Distribution

nexus_streamer-0.2.0-py3-none-any.whl (21.6 kB)


File details

Details for the file nexus-streamer-0.2.0.tar.gz.

File metadata

  • Download URL: nexus-streamer-0.2.0.tar.gz
  • Size: 18.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.5

File hashes

Hashes for nexus-streamer-0.2.0.tar.gz:

  • SHA256: ae4fc3b3c411d3a79b56aa6cba873d63785cacd109e3d31395ff25099798c280
  • MD5: abceaca38e340b12b6dfacfc3ef1a9bf
  • BLAKE2b-256: 5c6e8645f8715ae99c42dde8d95052f697c2c4e65929aa03d3c565cc23113a24

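To verify a downloaded file against the SHA256 digest above, on Linux you can, for example, run

sha256sum nexus-streamer-0.2.0.tar.gz

and compare the output with the value listed.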

File details

Details for the file nexus_streamer-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: nexus_streamer-0.2.0-py3-none-any.whl
  • Size: 21.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.5

File hashes

Hashes for nexus_streamer-0.2.0-py3-none-any.whl:

  • SHA256: bf1968550c04b95fea64679438e901dc734fc3f5da13b40cb17ae42058cc5d03
  • MD5: c97c33490bbdf1a2271f8fa93a875482
  • BLAKE2b-256: 087846873edf3374b840408e256015c51d65420bd08ff1124c0de3d0c321885a

