
Hextof Offline Analyzer


hextof-processor

This code is used to analyze data measured at FLASH using the HEXTOF (high energy X-ray time of flight) instrument. The HEXTOF uses a delay line detector (DLD) to measure the position and arrival time of single electron events.

The analysis of the data is based on "clean tables" of single events as dask dataframes. There are two dataframes generated in the data readout process. The main dataframe dd contains all detected electrons and can be binned according to the needs of the experiment. The second dataframe ddMicrobunches contains the FEL pulses and is commonly used for normalization.

The class DldProcessor contains the dask dataframes as well as the methods to perform binning in a parallelized fashion.

The DldFlashDataframeCreator class subclasses DldProcessor and is used for creating the dataframes from the hdf5 files generated by the DAQ system.
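The single-event-table workflow described above can be illustrated with a small self-contained sketch using numpy and pandas (the column names, table sizes, and the histogram-based binning below are illustrative assumptions for this sketch, not the actual hextof-processor API):

```python
import numpy as np
import pandas as pd

# Illustrative single-event table: one row per detected electron.
# (Column names here are examples, not the exact hextof-processor schema.)
rng = np.random.default_rng(0)
n_events = 10_000
dd = pd.DataFrame({
    'dldPosX': rng.uniform(0, 100, n_events),    # detector x position
    'dldPosY': rng.uniform(0, 100, n_events),    # detector y position
    'dldTime': rng.uniform(0, 500, n_events),    # electron arrival time
    'microbunchId': rng.integers(0, 50, n_events),
})

# Second table: one row per FEL pulse, used for normalization.
ddMicrobunches = pd.DataFrame({'microbunchId': np.arange(50)})

# Bin the events in (x, y) -- this is the kind of operation that
# DldProcessor performs in parallel over dask partitions.
counts, _, _ = np.histogram2d(dd['dldPosX'], dd['dldPosY'], bins=(10, 10))

# Normalize the binned image by the number of FEL pulses.
image = counts / len(ddMicrobunches)
print(image.shape)  # (10, 10)
```

In the real package the tables are dask dataframes rather than pandas ones, so the binning scales to datasets that do not fit in memory.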

The data obtained from the DAQ system is read through the pah package provided by FLASH, which is accessible on Bitbucket at https://stash.desy.de/projects/CS . The location of the downloaded repo must be set in the SETTINGS.ini file, under PAH_MODULE_DIR. This will be contained in the processor object as processor.PAH_MODULE_DIR. See the PAH package section below for more details.

Installation

In this section we will walk you through all you need to get up and running with the hextof-processor.

1. Python

If you don't have Python on your local machine yet, we suggest starting with Anaconda or Miniconda. Details about how to install can be found here.

2. Install hextof-processor

Download the package by cloning the repository to a local folder.

$ git clone https://github.com/momentoscope/hextof-processor.git

2.1 Virtual environment

Create a clean new environment (we strongly suggest always doing so!).

If you are using conda:

$ conda create --name hextof-env python=3.7 anaconda ipykernel

Now, to activate your new environment (Windows):

$ conda activate hextof-env

If you are using Linux:

$ source activate hextof-env

2.2 Virtual environment in Jupyter Notebooks

To add the newly created environment to the Jupyter Notebook kernel list, install your new kernel:

(hextof-env)$ python -m ipykernel install --user --name=hextof-env

3. Local Setup

Now that your environment is ready, you can set up hextof-processor. You will need to install all the requirements and compile the Cython modules.

3.1 Cython in Windows

To run under Windows, the Cython package requires a C compiler. hextof-processor can also work without one, but it is significantly slower. The easiest way to get Cython running is to install the Visual Studio 2019 developer environment. In Build tools, install the C++ build tools and ensure the latest versions of MSVCv142 - VS 2019 C++ x64/x86 build tools and Windows 10 SDK are checked. Also, make sure the setuptools package is up to date:

(hextof-env)$ conda install setuptools

3.2 Run setup.py

You can now install all requirements by running setup.py. In your local repository folder (where you ran git clone), run the following:

(hextof-env)$ python setup.py build_ext --inplace

3.3 Initialize settings

Finally, you need to initialize your local settings. This can be done by running InitializeSettings.py in the same repository folder:

(hextof-env)$ python InitializeSettings.py

This will create a file called SETTINGS.ini in the local repository folder. It is used to store the local settings as well as calibration values (this will change in the future) and other options.

3.4 Setting up local paths

In order to make sure your folders are in the right place, open this file and modify the paths in the [paths] section.

  • data_raw_dir - location where the raw h5 files from FLASH are stored
  • data_h5_dir - storage of binned hdf5 files
  • data_parquet_dir - where the Apache Parquet data files generated from the single-event tables are stored (we suggest using an SSD for this folder, since it greatly improves binning performance)
  • data_results_dir - folder where results (figures and binned arrays) are saved
  • pah_module_dir - path to where the PAH package was installed; not always needed, see next section

If you are installing on Maxwell, we suggest setting the following paths:

[paths]
data_raw_dir =     /asap3/flash/gpfs/pg2/YYYY/data/xxxxxxxx/raw/
data_h5_dir =      /asap3/flash/gpfs/pg2/YYYY/data/xxxxxxxx/processed/
data_parquet_dir = /asap3/flash/gpfs/pg2/YYYY/data/xxxxxxxx/processed/parquet/
data_results_dir = /asap3/flash/gpfs/pg2/YYYY/data/xxxxxxxx/processed/*USER_NAME*/binned/

Where YYYY is the current year and xxxxxxxx is the beamtime number.
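Once the paths are edited, a short script can verify that each entry in the [paths] section points to an existing directory (check_settings_paths is a hypothetical helper written for illustration, not part of hextof-processor):

```python
import configparser
import os

def check_settings_paths(settings_file='SETTINGS.ini'):
    """Report, for each entry in the [paths] section of the given
    settings file, whether that path is an existing directory."""
    config = configparser.ConfigParser()
    config.read(settings_file)
    return {key: os.path.isdir(path) for key, path in config['paths'].items()}
```

Running it from the repository folder after editing SETTINGS.ini returns a dict such as {'data_raw_dir': True, ...}, which makes a mistyped path easy to spot before the first binning run.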

4. Further requirements

Here is a list of packages which need to be installed in order to use all the features available in this package.

4.1 ipywidgets

For interactive control of parameters in Jupyter notebooks:

$ conda install ipywidgets

4.2 The PAH package

In order to read the raw hdf5 files from the FLASH DAQ, hextof-processor makes use of the beamtimedaqaccess package. This can be installed through conda if you are located inside the DESY network (e.g. on the Maxwell cluster). For external use, it is required to clone the PAH repository.

Install PAH inside DESY network

Taken from the PAH documentation:

Add the BeamtimeDaqAccess Anaconda channel to your configuration (within the DESY network):

$ conda config --add channels http://doocspkgs.desy.de/pub/flashconda

then, after activating your virtual environment, install the package:

$ conda install beamtimedaqaccess

To test if the installation was successful, try:

 (hextof-env)$ ipython

  In [1]: import beamtimedaqaccess
  In [2]: daq = beamtimedaqaccess.accessHdf("/path/to/root/directory/of/hdf/files")

Install PAH externally

Clone the repository in a folder of your choice

  $ cd work
  $ git clone https://stash.desy.de/scm/cs/pah.git
  $ cd PAH

Open the SETTINGS.ini file in the main hextof-processor repository and set pah_module_dir to the location where you cloned the PAH repository. For example, if you installed PAH under C:/code/, it would look like this:

[paths]
pah_module_dir = "C:/code/PAH/"

5. Test your installation

In order to test your local installation, we have provided a series of tutorial Jupyter Notebooks. You can find all the relevant material in the tutorial folder in the main repository. We suggest starting with Tutorial_01 - Binning calibration and saving data.ipynb.

Documentation

The documentation of the package can be found here.

Examples are available as Jupyter Notebooks. Some example data is provided together with the examples. More compatible data is being collected and will soon be added to online open-access repositories.

Citation and acknowledgments

If you use this software, please consider citing the two papers referenced in the project repository.
