LArCV (Version 3)
C++ I/O and preprocessing package for sparse neutrino data, with HDF5 for I/O and Python bindings.
A software framework for image (2D) and volumetric (3D) data processing, with APIs to interface with open-source deep neural network software, written in C++ with extensive Python support. Originally developed for analyzing data from time projection chambers (TPCs), it has since been generalized into a tool for handling 2D-projected images and 3D-voxelized data. LArCV is particularly suitable for sparse data processing.
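To see why sparse processing pays off for this kind of data, consider that a TPC volume is mostly empty: storing only the non-zero voxels as (index, value) pairs is far cheaper than a dense array. The snippet below is an illustrative sketch of that idea in plain Python; the names and layout are ours, not the LArCV API.

```python
# Illustrative sketch (not the LArCV API): store a 3D voxel volume
# sparsely as (flat index -> value) pairs instead of a dense array.

def flat_index(x, y, z, ny, nz):
    """Flatten an (x, y, z) voxel coordinate into a single index."""
    return (x * ny + y) * nz + z

# A 100^3 volume with only three non-zero voxels.
nx = ny = nz = 100
sparse = {
    flat_index(1, 2, 3, ny, nz): 0.5,
    flat_index(10, 20, 30, ny, nz): 1.2,
    flat_index(99, 99, 99, ny, nz): 0.1,
}

dense_count = nx * ny * nz        # 1,000,000 entries if stored densely
sparse_count = len(sparse)        # only 3 entries stored sparsely
print(dense_count, sparse_count)  # 1000000 3
```

The same trade-off holds in 2D for projected images: the sparser the occupancy, the larger the savings in both storage and I/O bandwidth.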
- SWIG (for Python bindings)
- HDF5 (for I/O)
- CMake (for building)
- scikit-build (for installation)
- pytest (for continuous integration)
SWIG, HDF5, and CMake can all be installed with package managers; conda also works.
To install the requirements on Ubuntu:

```bash
sudo apt-get install cmake libhdf5-serial-dev python-dev
pip install numpy scikit-build pytest
```
To install the requirements on macOS (MacPorts):

```bash
sudo port install cmake hdf5 swig
pip install numpy scikit-build pytest
```
On other systems, you can try conda, which has been shown to work on many Linux distributions. Official conda channels that manage the requirements and prerequisites are in progress.
- Clone & build:

```bash
git clone https://github.com/DeepLearnPhysics/larcv3.git
cd larcv3
python setup.py build [-j 12]
python setup.py install [--user]
```
That's it. To use the built larcv from a different process or shell, nothing extra is needed: as long as you are using the same Python, it will just work.
If you want it to be even easier, and you have the dependencies installed, you can run `pip install -e larcv3 [--user]` from the directory above the repository.
larcv3 works on macOS and many flavors of Linux. It has never been tested on Windows, as far as I know. If you try to install it and need help, please open an issue.
Larcv is predominantly used as an I/O framework and data-preprocessing tool for machine learning and deep learning. It has run on many systems and in many scenarios, and ships a suite of test cases covering serialization, read-back, threaded I/O tools, and distributed I/O tools.
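The serialization tests follow the usual round-trip pattern: write events to a file, read them back, and compare. Below is a minimal sketch of that pattern in plain Python, using `pickle` and a temporary file purely as stand-ins for larcv's HDF5 serialization; the real tests exercise the larcv API itself.

```python
# Generic round-trip test pattern (pickle stands in here for larcv's HDF5 I/O).
import os
import pickle
import tempfile

def write_events(path, events):
    """Serialize a list of events to disk."""
    with open(path, "wb") as f:
        pickle.dump(events, f)

def read_events(path):
    """Read events back from disk."""
    with open(path, "rb") as f:
        return pickle.load(f)

# Round trip: what we read back must equal what we wrote.
events = [{"id": i, "voxels": [(i, 0.5 * i)]} for i in range(4)]
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "events.pkl")
    write_events(path, events)
    roundtrip_ok = read_events(path) == events
print(roundtrip_ok)  # True
```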
Larcv has run on some of the biggest systems in the world, including Summit (ORNL) and Theta (ANL). It has been used for distributed I/O of sparse, non-uniform data across hundreds of CPUs/GPUs with good performance.
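Distributed readers typically shard the dataset so that every rank reads a disjoint slice of entries and together they cover the whole file. The sketch below shows one common scheme, round-robin assignment by rank; it is an illustration of the idea, not larcv's actual partitioning code.

```python
# Illustrative round-robin sharding of dataset entries across MPI-style ranks.

def entries_for_rank(n_entries, rank, n_ranks):
    """Return the entry indices this rank should read (round-robin)."""
    return list(range(rank, n_entries, n_ranks))

# 10 entries split across 4 ranks: every entry is read exactly once.
shards = [entries_for_rank(10, r, 4) for r in range(4)]
print(shards)  # [[0, 4, 8], [1, 5, 9], [2, 6], [3, 7]]

all_read = sorted(i for shard in shards for i in shard)
print(all_read == list(range(10)))  # True
```

Round-robin keeps per-rank loads balanced to within one entry even when entry sizes vary, which matters for the sparse, non-uniform data described above.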
If you would like to benchmark larcv for your application, you are welcome to use the larcv3 open dataset (more info at deeplearnphysics.org). For help, open an issue or contact the authors directly.
| Filename, size | File type | Python version |
|---|---|---|
| larcv-3.2.2.tar.gz (278.7 kB) | Source | None |