
NeXus Streamer

Streams event data and metadata from a NeXus file into Kafka, mimicking data acquisition from a live instrument. This facilitates testing software which consumes these data.

This Python implementation is intended to replace a C++ implementation (https://github.com/ess-dmsc/NeXus-Streamer) and should require much less effort to maintain.

Installation

Python 3.7 or higher is required (https://www.python.org/downloads/).

To install from PyPI, run

pip install nexus-streamer

or to install with conda (not available on Windows, because the confluent-kafka package does not support Windows):

conda install -c conda-forge -c ess-dmsc nexus-streamer

and check that the installation was successful by running

nexus_streamer --help

On Windows you may need to add your Python environment's Scripts directory to PATH for the command to work.

Usage

usage: nexus_streamer [-h]
                      [--graylog-logger-address GRAYLOG_LOGGER_ADDRESS]
                      [--log-file LOG_FILE] [-c CONFIG_FILE]
                      [-v {Trace,Debug,Warning,Error,Critical}] -f
                      FILENAME [--json-description JSON_DESCRIPTION] -b
                      BROKER -i INSTRUMENT [-s] [-z] [--isis-file]
                      [-e FAKE_EVENTS_PER_PULSE] [-d DET_SPEC_MAP]

NeXus Streamer

optional arguments:
  -h, --help            show this help message and exit
  --graylog-logger-address GRAYLOG_LOGGER_ADDRESS
                        <host:port> Log to Graylog [env var:
                        GRAYLOG_LOGGER_ADDRESS]
  --log-file LOG_FILE   Log filename [env var: LOG_FILE]
  -c CONFIG_FILE, --config-file CONFIG_FILE
                        Read configuration from an ini file [env var:
                        CONFIG_FILE]
  -v {Trace,Debug,Warning,Error,Critical}, --verbosity {Trace,Debug,Warning,Error,Critical}
                        Set logging level [env var: VERBOSITY]
  -f FILENAME, --filename FILENAME
                        NeXus file to stream data from [env var: FILENAME]
  --json-description JSON_DESCRIPTION
                        If provided use this JSON template instead of
                        generating one from the NeXus file [env var:
                        JSON_FILENAME]
  -b BROKER, --broker BROKER
                        <host[:port]> Kafka broker to forward data into [env
                        var: BROKER]
  -i INSTRUMENT, --instrument INSTRUMENT
                        Used as prefix for topic names [env var: INSTRUMENT]
  -s, --slow            Stream data into Kafka at approx realistic rate (uses
                        timestamps from file) [env var: SLOW]
  -z, --single-run      Publish only a single run (otherwise repeats until
                        interrupted) [env var: SINGLE_RUN]
  --isis-file           Include ISIS-specific data in event data messages and
                        detector-spectrum map if found in file [env var:
                        ISIS_FILE]
  -e FAKE_EVENTS_PER_PULSE, --fake-events-per-pulse FAKE_EVENTS_PER_PULSE
                        Generates this number of fake events per pulse per
                        NXevent_data group instead of publishing real data
                        from the file [env var: FAKE_EVENTS]
  -d DET_SPEC_MAP, --det-spec-map DET_SPEC_MAP
                        Full path of a detector-spectrum map file which may 
                        be required for files from ISIS [env var: DET_SPEC_MAP]

Args that start with '--' (eg. --graylog-logger-address) can also be set in a
config file (specified via -c). Config file syntax allows: key=value,
flag=true, stuff=[a,b,c] (for details, see syntax at https://goo.gl/R74nmi).
If an arg is specified in more than one place, then commandline values
override environment variables which override config file values which
override defaults.
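
For example, to stream a file to a local Kafka broker (the file path, broker address and instrument name here are placeholders):

nexus_streamer -f /data/my_run.nxs -b localhost:9092 -i TEST

The same options can be read from an ini file, following the key=value syntax described above. A minimal sketch, assuming a file named streamer_config.ini with placeholder values:

filename = /data/my_run.nxs
broker = localhost:9092
instrument = TEST
single-run = true

nexus_streamer -c streamer_config.ini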

Each fake event generated when --fake-events-per-pulse is used comprises a random detector id, selected from the detector's ids, and a random time-of-flight between 10 and 10000 milliseconds. The intention is to provide a specified quantity of data for performance testing of consuming applications.
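
To illustrate, here is a minimal sketch (not the streamer's actual implementation) of how one pulse of fake events could be generated for a hypothetical 100-pixel detector:

import numpy as np

rng = np.random.default_rng()

detector_ids = np.arange(1, 101)  # hypothetical detector_number values
fake_events_per_pulse = 1000      # value given to --fake-events-per-pulse

# Each fake event pairs a random detector id, drawn from the detector's
# ids, with a random time-of-flight between 10 and 10000 milliseconds
event_id = rng.choice(detector_ids, size=fake_events_per_pulse)
event_time_offset_ms = rng.uniform(10.0, 10000.0, size=fake_events_per_pulse)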

Minimum requirements of the file

The NeXus file used must have an NXentry group containing a start_time dataset, which gives the run start time as an ISO 8601 string.

NXevent_data and NXlog groups will be found wherever they are in the file and streamed to Kafka. All time and value datasets must have a units attribute.

If --fake-events-per-pulse is used then each NXevent_data group must be in an NXdetector group with a detector_number dataset.
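
For reference, a minimal file meeting these requirements could be written with h5py as sketched below; every name in it other than start_time, detector_number, units and the NX_class attributes is an arbitrary choice:

import h5py
import numpy as np

with h5py.File("minimal.nxs", "w") as f:
    entry = f.create_group("entry")
    entry.attrs["NX_class"] = "NXentry"
    # Run start time as an ISO 8601 string
    entry.create_dataset("start_time", data="2021-05-01T10:00:00Z")

    # An NXlog group; its time and value datasets must have units attributes
    log = entry.create_group("sample_temperature")
    log.attrs["NX_class"] = "NXlog"
    log_time = log.create_dataset("time", data=np.array([0.0, 1.0, 2.0]))
    log_time.attrs["units"] = "s"
    log_value = log.create_dataset("value", data=np.array([291.3, 292.1, 292.8]))
    log_value.attrs["units"] = "K"

    # An NXevent_data group inside an NXdetector with a detector_number
    # dataset, as required when --fake-events-per-pulse is used
    det = entry.create_group("detector_1")
    det.attrs["NX_class"] = "NXdetector"
    det.create_dataset("detector_number", data=np.arange(1, 101))
    events = det.create_group("detector_1_events")
    events.attrs["NX_class"] = "NXevent_data"
    events.create_dataset("event_id", data=np.array([3, 42, 99], dtype=np.uint32))
    tof = events.create_dataset("event_time_offset",
                                data=np.array([1200.0, 3500.0, 9000.0]))
    tof.attrs["units"] = "us"
    pulse = events.create_dataset("event_time_zero", data=np.array([0.0]))
    pulse.attrs["units"] = "ns"
    events.create_dataset("event_index", data=np.array([0], dtype=np.uint64))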

Developer information

See README-dev.md

Download files

Download the file for your platform.

Source Distribution

nexus-streamer-0.2.1.tar.gz (18.6 kB)

Built Distribution

nexus_streamer-0.2.1-py3-none-any.whl (21.8 kB)

File details

Details for the file nexus-streamer-0.2.1.tar.gz.

File metadata

  • Download URL: nexus-streamer-0.2.1.tar.gz
  • Size: 18.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.5

File hashes

Hashes for nexus-streamer-0.2.1.tar.gz

Algorithm    Hash digest
SHA256       e6da050530dafc38630d6bb67f2962794931fff31eb72b2e0731ff964ed0fd8b
MD5          f0d53c11a47c5b9ea89ca1a151ccb8de
BLAKE2b-256  879421f440a9a063801502e4a686d2994a4f6b482116f7d8c64c55d36a1d6a05

File details

Details for the file nexus_streamer-0.2.1-py3-none-any.whl.

File metadata

  • Download URL: nexus_streamer-0.2.1-py3-none-any.whl
  • Size: 21.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.60.0 CPython/3.8.5

File hashes

Hashes for nexus_streamer-0.2.1-py3-none-any.whl

Algorithm    Hash digest
SHA256       84b5499ef275ccced73879ac17695ba06d0efbe64ff43b2dfa00ae414f089b4b
MD5          d5aa62090f253336c1993570c4ff8c87
BLAKE2b-256  260f9a9ae3c16d2bb10887e39b584f3efb3ece6a49e9f23f13b2b91d171c1d3c
