Harvest and combine chains from different data sets
Project description
CombineHarvesterFlow
General installation instructions
CombineHarvesterFlow uses jax and flowjax. The following installation instructions install CombineHarvesterFlow together with all of its required packages:
conda create -n jax python==3.10
conda activate jax
Follow the jax install instructions here for CPU or GPU, depending on your local hardware. These instructions can fail on some machines, in which case we recommend the conda-forge installation:
conda install -c conda-forge jaxlib
conda install -c conda-forge jax
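As an optional sanity check, you can confirm that jax imports and reports the backend you expect; the exact version printed will depend on what conda resolved:
python -c "import jax; print(jax.__version__, jax.default_backend())"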
Once jax is installed run:
git clone https://github.com/pltaylor16/CombineHarvesterFlow.git
cd CombineHarvesterFlow
pip install .
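To confirm the install completed, you can query pip for the package metadata:
pip show CombineHarvesterFlow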
Training the normalizing flows can be slower on CPUs (this is not normally a problem in low dimensions, e.g. n < 7), so we recommend using GPUs if possible.
Perlmutter (NERSC) installation instructions
The process for installing CombineHarvesterFlow with GPU support on Perlmutter (NERSC) is slightly more involved. First, install jax following the NERSC documentation:
module load cudatoolkit/12.2
module load cudnn/8.9.3_cuda12
module load python
# Create a new conda environment
conda create -n jax python=3.10 pip numpy scipy
# Activate the environment before using pip to install JAX
conda activate jax
# Install a compatible wheel
pip install --no-cache-dir "jax==0.4.23" "jaxlib[cuda12_cudnn89]==0.4.23" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
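Before moving on, it is worth checking that this jaxlib build can actually see a GPU; on a node with a GPU attached, the device list should report CUDA devices rather than falling back to CPU:
python -c "import jax; print(jax.devices())"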
See the NERSC documentation for more details on how to find a working version of jax. After that, install CombineHarvesterFlow:
git clone https://github.com/pltaylor16/CombineHarvesterFlow.git
cd CombineHarvesterFlow
pip install .
If you want to run CombineHarvesterFlow in a notebook, you will need to set up a helper script that automatically loads the CUDA modules. Detailed instructions for this can be found here and are summarized below. First, set up the Jupyter kernel:
pip install ipykernel
python -m ipykernel install --user --name jax --display-name Jax
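You can verify that the new kernel was registered, and see where its files live, with:
jupyter kernelspec list
# the output should include a 'jax' entry under $HOME/.local/share/jupyter/kernels/jax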
Then create a helper script for the jupyter kernel:
touch $HOME/.local/share/jupyter/kernels/jax/kernel-helper.sh
After that, add the following lines to that new script:
#!/bin/bash
module load cudatoolkit/12.2
module load cudnn/8.9.3_cuda12
module load python
conda activate jax
exec "$@"
and make the script executable:
chmod u+x $HOME/.local/share/jupyter/kernels/jax/kernel-helper.sh
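If you want to double-check the permissions, the user-execute bit should now be set:
ls -l $HOME/.local/share/jupyter/kernels/jax/kernel-helper.sh
# the owner permissions should include an 'x', e.g. -rwxr--r--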
Finally, modify the kernel JSON file at $HOME/.local/share/jupyter/kernels/jax/kernel.json so that the helper script runs automatically when the kernel starts (if opening the file from the JupyterHub interface, right-click on it and select Open With -> Editor). The file should look something like this (the "{resource_dir}/kernel-helper.sh" line is new):
{
  "argv": [
    "{resource_dir}/kernel-helper.sh",
    "python",
    "-m",
    "ipykernel_launcher",
    "-f",
    "{connection_file}"
  ],
  "display_name": "Jax",
  "language": "python",
  "metadata": {
    "debugger": true
  }
}
For detailed instructions, see the NERSC jax documentation and the NERSC kernel customization documentation.
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: combineharvesterflow-1.0.0.tar.gz
Built Distribution: CombineHarvesterFlow-1.0.0-py3-none-any.whl
File details
Details for the file combineharvesterflow-1.0.0.tar.gz.
File metadata
- Download URL: combineharvesterflow-1.0.0.tar.gz
- Upload date:
- Size: 20.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | df57604966d170d1c2374deb88d6eace634df9deea26ccce028a47d1dba14ea8
MD5 | d47cee4d2cf9d70a2a362b3fa37577ed
BLAKE2b-256 | bc16e36a70f7df9bfb1c392f7d9e168d2e9e7d91c5c57811f910c19206048034
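If you download the source distribution by hand, you can check it against the SHA256 digest above before installing (sha256sum is standard on Linux; on macOS, shasum -a 256 gives the same digest):
sha256sum combineharvesterflow-1.0.0.tar.gz
# expected: df57604966d170d1c2374deb88d6eace634df9deea26ccce028a47d1dba14ea8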
File details
Details for the file CombineHarvesterFlow-1.0.0-py3-none-any.whl.
File metadata
- Download URL: CombineHarvesterFlow-1.0.0-py3-none-any.whl
- Upload date:
- Size: 19.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.12.3
File hashes
Algorithm | Hash digest
---|---
SHA256 | ca53e51c577da581f9dd40fef76a9b9782e10506a8c992194c5a5e6028f1ba86
MD5 | 6827a9e47242178c8bce52e76d0b119d
BLAKE2b-256 | 3345f1f6a3daae215a4e58989d0f0d5e928041ed7c1a067ad4f364748afe3355