Jarvis-HEP: likelihood-driven HEP scanning framework
Jarvis-HEP (Just a Really Viable Interface to Suites for High Energy Physics) is an open-source, modular Python framework for likelihood-based parameter scanning and global fits in high-energy physics phenomenology.
The project focuses on robust sampling strategies, nuisance-parameter handling, and scalable asynchronous workflows, with an emphasis on profile likelihood–oriented studies commonly encountered in modern collider phenomenology.
Motivation
Modern HEP analyses often suffer from:
- Extremely sparse viable regions in high-dimensional parameter spaces
- Expensive external likelihood evaluations (collider simulations, Higgs constraints, flavour physics, etc.)
- Non-trivial treatment of nuisance parameters
- Poor reproducibility and opaque analysis pipelines
Jarvis-HEP is designed to address these issues by providing:
- A unified orchestration layer for sampling, likelihoods, and external tools
- Explicit separation between parameters of interest and nuisance parameters
- Flexible, engineering-oriented solutions such as profile likelihood–based inference
- Transparent data management and diagnostic logging
Key Features
Sampling & Exploration
- Multiple sampling strategies (random, grid-like, adaptive, nested-style, and custom algorithms)
- Designed for highly constrained and fine-tuned parameter spaces
- Iterator-style, point-level sampling (not generation-locked)
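The iterator-style idea can be sketched in a few lines of plain Python (names and bounds here are hypothetical, not the actual Jarvis-HEP API): points are proposed and consumed one at a time, so expensive evaluations can start immediately instead of waiting for a whole generation to be built.

```python
import random

def random_sampler(bounds, n_points, seed=0):
    """Yield one point at a time (iterator-style), rather than
    building a full generation before any evaluation can start."""
    rng = random.Random(seed)
    for _ in range(n_points):
        yield {name: rng.uniform(lo, hi) for name, (lo, hi) in bounds.items()}

# Hypothetical 2D parameter space
bounds = {"m0": (100.0, 2000.0), "tanb": (2.0, 60.0)}

for point in random_sampler(bounds, n_points=3):
    # Each point can be dispatched to an expensive likelihood right away
    print(point)
```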
Likelihood & Nuisance Parameters
- Native support for profile likelihood construction
- Dedicated nuisance-parameter samplers decoupled from main exploration
- Fast evaluation paths for nuisance optimization
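The profiling idea behind these features can be illustrated with a toy model (a made-up Gaussian likelihood, not Jarvis-HEP's implementation): for each parameter of interest θ, the nuisance ν is maximized out by a cheap dedicated inner scan, and only the profiled value is kept.

```python
def log_like(theta, nu):
    """Toy log-likelihood: a parameter of interest theta plus a
    Gaussian-constrained nuisance nu (both invented for illustration)."""
    measurement = 5.0
    prediction = theta + nu            # nuisance shifts the prediction
    return -0.5 * (measurement - prediction) ** 2 - 0.5 * (nu / 0.5) ** 2

def profile_log_like(theta, nu_grid):
    """Profile out the nuisance with a fast inner scan, mirroring the
    'dedicated nuisance sampler decoupled from main exploration' idea."""
    return max(log_like(theta, nu) for nu in nu_grid)

nu_grid = [i * 0.01 - 2.0 for i in range(401)]    # cheap scan over nu
best_theta = max((t * 0.1 for t in range(100)),
                 key=lambda t: profile_log_like(t, nu_grid))
print(best_theta)
```

The nuisance constraint pulls ν toward zero, so the profiled likelihood peaks near θ = 5.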
Architecture
- Pure Python implementation
- Asynchronous execution of expensive external programs
- Modular factory-based design (samplers, likelihoods, IO, monitors)
- YAML-driven configuration for reproducibility
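Asynchronous execution of expensive external programs can be sketched with `asyncio` subprocesses (a generic pattern, not Jarvis-HEP's internal scheduler; `echo` stands in for a real spectrum or likelihood calculator):

```python
import asyncio

async def run_tool(cmd):
    """Launch one external program and capture its output without
    blocking the other evaluations."""
    proc = await asyncio.create_subprocess_exec(
        *cmd, stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE
    )
    out, _ = await proc.communicate()
    return proc.returncode, out.decode().strip()

async def main():
    # Stand-ins for expensive external calculators, launched concurrently
    jobs = [run_tool(["echo", f"point-{i}"]) for i in range(3)]
    return await asyncio.gather(*jobs)

results = asyncio.run(main())
for code, text in results:
    print(code, text)
```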
Data Handling & Diagnostics
- HDF5-based structured output
- Schema-driven CSV flattening for structured observables (samples.schema.json)
- Explicit resource and file-handle monitoring
- Detailed logging via loguru
- Designed to survive partial failures (no silent data loss)
Visualization
- Dynamic sampling visualizations (under active development)
- Designed to interface with the standalone plotting package JarvisPLOT
Example visualization of an adaptive sampling procedure:
👉 https://github.com/Pengxuan-Zhu-Phys/Jarvis-HEP/blob/master/simu/sample_dynamic_viz.gif
📘 Full documentation and tutorials are hosted on a dedicated documentation site:
👉 https://pengxuan-zhu-phys.github.io/Jarvis-Docs/
Installation
Jarvis-HEP is a pure Python project.
Install from PyPI
python3 -m pip install Jarvis-HEP
Dependency installation is handled by pip from the package metadata at install time; the runtime no longer provides a separate dependency-installer CLI mode.
After installation, run directly:
Jarvis --help
Install from source (developer mode)
python3 -m pip install -e .
Release helper (maintainer)
./jhrel 1.0.1 --dry
./jhrel 1.0.1 --testpypi
./jhrel 1.0.1 --reinstall
Create a standalone project workspace
Use --mkproject to create a fresh Jarvis project directory:
Jarvis --mkproject PROJECT_NAME
This creates:
- PROJECT_NAME/bin for YAML cards
- PROJECT_NAME/Library for source libraries
- PROJECT_NAME/Workshop for workflow files
- PROJECT_NAME/Result for outputs
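When a workspace needs to be reproduced by hand, the same layout is a few lines of standard-library Python (an illustrative sketch, not the Jarvis implementation):

```python
from pathlib import Path
import tempfile

def make_workspace(root):
    """Create the four standard Jarvis project subdirectories."""
    root = Path(root)
    for sub in ("bin", "Library", "Workshop", "Result"):
        (root / sub).mkdir(parents=True, exist_ok=True)
    return sorted(p.name for p in root.iterdir())

demo = Path(tempfile.mkdtemp()) / "PROJECT_NAME"
print(make_workspace(demo))   # ['Library', 'Result', 'Workshop', 'bin']
```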
Running
Jarvis-HEP can now be launched without changing into the source root:
Jarvis /path/to/project/bin/task.yaml
Or via module mode:
python -m jarvishep /path/to/project/bin/task.yaml
Path Markers
- &J/... resolves to the runtime task root (auto-inferred from the YAML location, typically the parent of bin/)
- &SRC/... resolves to the Jarvis-HEP source tree (internal cards/schema/logo)
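Marker resolution can be pictured roughly like this (a simplified sketch with made-up roots; the real resolver lives inside Jarvis-HEP):

```python
from pathlib import Path

def resolve_marker(path, task_root, src_root):
    """Expand the &J/ and &SRC/ prefixes against their respective roots."""
    if path.startswith("&J/"):
        return str(Path(task_root) / path[len("&J/"):])
    if path.startswith("&SRC/"):
        return str(Path(src_root) / path[len("&SRC/"):])
    return path  # plain paths pass through unchanged

# Hypothetical roots: task root is the parent of bin/ holding the YAML card
task_root = "/path/to/project"
src_root = "/opt/jarvis-hep"
print(resolve_marker("&J/Workshop/run1", task_root, src_root))
```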
Observable Schema And CSV Flattening
Jarvis-HEP stores scan data in HDF5 and exports CSV through a user-editable schema file:
- raw records: .../DATABASE/samples.N.hdf5
- schema rules: .../DATABASE/samples.schema.json
- CSV export: .../DATABASE/samples.N.csv
samples.schema.json records each column's type metadata and flatten rules.
You can edit flatten rules and regenerate CSV without rerunning sampling:
Jarvis /path/to/task.yaml --convert
Supported column flatten modes:
- scalar: scalar-first export (structured values fall back to JSON cell text)
- json: write structured value as one JSON string cell
- split: expand structured value to multiple CSV columns (supports name_map rename mapping)
- drop: omit column from CSV export
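The four modes can be sketched in plain Python (an illustrative re-implementation, not Jarvis-HEP's converter; the observable names are invented):

```python
import json

def flatten_column(name, value, mode, name_map=None):
    """Return {csv_column: cell} for one observable under a flatten mode."""
    if mode == "drop":
        return {}
    if mode == "json":
        return {name: json.dumps(value)}
    if mode == "split" and isinstance(value, dict):
        name_map = name_map or {}
        return {name_map.get(k, f"{name}_{k}"): v for k, v in value.items()}
    # "scalar": keep scalars, fall back to JSON text for structured values
    if isinstance(value, (dict, list)):
        return {name: json.dumps(value)}
    return {name: value}

row = flatten_column("masses", {"h": 125.1, "H": 800.0}, "split")
print(row)   # {'masses_h': 125.1, 'masses_H': 800.0}
```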
Detailed schema fields and worked examples are covered on the documentation site linked above.
Contributing
Contributions are welcome in all forms, including feature proposals, documentation improvements, bug reports, and bug fixes.
Please refer to CONTRIBUTING.md for guidelines on how to get started.
License
Jarvis-HEP is released under the MIT License.
See the LICENSE file for details.
Acknowledgements
The author thanks Yang Zhang and Liangliang Shang for helpful discussions during the development of this project.
References
- Exploring supersymmetry with machine learning
  Jie Ren, Lei Wu, Jin Min Yang, Jun Zhao
  Nuclear Physics B, 2019
  arXiv:1708.06615