LArPix DAQ system


larpix-daq


LArPix DAQ is the data acquisition system for LArPix. It handles the data flow between the "data boards" and offline storage and includes data monitoring and operator control functionality built on the xylem DAQ framework.

LArPix DAQ consists of a set of scripts which are responsible for individual parts of the DAQ system's functionality, as well as an operator interface API which can be run in an interactive python session or used as a basis for a more sophisticated interactive program. The scripts can be run from the same or from different computers, as long as the IP addresses of the various computers are known.

System states

The system can be in one of three states: READY, RUN, and STOP. The state is controlled through the Operator object using the methods prepare_run (transition to READY), begin_run (transition to RUN), and end_run (transition to STOP).

  • STOP: Default state on startup. No component expects to receive data.
  • READY: Components should prepare to receive data. Data may arrive at the component before the instruction to transition to the RUN state (though this is expected to be rare). The component should treat that data as if it were received in the RUN state.
  • RUN: Components should expect to receive data. Data should not be produced in any other state.

To mark the start and end of a run in the data flow, the producer.py script produces INFO messages with contents "Beginning run" and "Ending run", respectively.
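The state machine above can be sketched as a small model. This is illustrative only: it mirrors the documented prepare_run/begin_run/end_run transitions but is not the larpix-daq implementation, and the class and method names here are invented for the sketch.

```python
# Toy model of the three DAQ states (STOP, READY, RUN) and the
# Operator transitions that move between them. Illustrative only;
# not the larpix-daq code.
class RunStateModel:
    # method name -> (required current state, resulting state)
    TRANSITIONS = {
        "prepare_run": ("STOP", "READY"),
        "begin_run": ("READY", "RUN"),
        "end_run": ("RUN", "STOP"),
    }

    def __init__(self):
        self.state = "STOP"  # default state on startup

    def apply(self, method):
        required, target = self.TRANSITIONS[method]
        if self.state != required:
            raise RuntimeError(
                f"{method} is only valid from {required}, not {self.state}")
        self.state = target
        return self.state
```

Walking the model through a full run (prepare_run, begin_run, end_run) visits READY and RUN and returns to STOP; calling a method out of order raises an error, matching the transition rules above.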

Operator

The LArPix DAQ Operator module provides the interface into the DAQ core for all DAQ operations.

Operator methods interact with the DAQ core to accomplish the desired behavior. For the simplest interactions, a single request-response exchange occurs and the result is returned. For most interactions, however, a single request produces multiple responses, for example an immediate acknowledgement of receipt followed by the eventual result. (TODO: unify this interface.) The methods implementing these interactions return generator iterators rather than values, so a typical call looks like

o = Operator()
final_responses = []
for response in o.run_routine('example'):
    print(response)
    # interact with response object within loop
# When the loop ends, the last response received is still available
# in the `response` variable
final_responses.append(response)
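To make the generator pattern concrete, here is a toy stand-in for a multi-response method: it yields an immediate acknowledgement and then the eventual result, and is consumed the same way as the Operator example above. The message field names ("status", "routine", "result") are hypothetical, not the actual larpix-daq response format.

```python
# Toy multi-response routine: yields an immediate acknowledgement,
# then the eventual result. Field names are illustrative only.
def run_routine(name):
    yield {"status": "received", "routine": name}  # immediate ack
    # ... the DAQ core would do the actual work here ...
    yield {"status": "done", "routine": name, "result": "ok"}

# Consume it like the Operator example: the loop variable keeps the
# final response after the loop ends.
for response in run_routine("example"):
    print(response)
final_response = response
```

Because the method is a generator, no work happens until iteration begins, and a caller that stops iterating early simply never sees the later responses.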

