RIFT parameter estimation pipeline. Note: the branch currently used is temp-RIT-Tides-port_python3_restructure_package (which will become master shortly).
research-projects-RIT (temp-RIT-Tides and descendants)
Repository for the rapid_PE / RIFT code, developed at RIT (forked long ago from work at UWM/CGCA).
Installation
Please see INSTALL.md
Science
If you are using this code for a production analysis, please contact us to make sure you follow the instructions included here! Also, make sure you cite the relevant rapid_pe/RIFT papers
- Pankow et al 2015
- Lange et al 2018 (RIFT)
- Wysocki et al 2019 (RIFT-GPU)
- Wofford et al 2022 (RIFT-Update)
- O'Shaughnessy et al in prep (RIFT-FinerNet)
When preparing your work, please cite:
- the relevant rapid_pe/RIFT papers:
- Pankow et al 2015
- Lange et al 2018 (RIFT) paper
- If you are using the surrogate-basis approach, please cite the O'Shaughnessy, Blackman, Field 2017 paper
- If you are using a GPU-optimized version, please acknowledge the Wysocki, O'Shaughnessy, Fong, Lange paper
- If you are using a non-lalsuite waveform interface, please acknowledge
- gwsurrogate interface: (a) O'Shaughnessy, Blackman, Field 2017 paper; (b) F. Shaik et al (in prep)
- TEOBResumS interface (original): Lange et al 2018 RIFT paper
- NR interface (parts inside ILE): Lange et al 2017, PRD 96, 104041, NR comparison methods paper
- NRSur7dq2 interface: Lange et al 2018 (RIFT) paper, and ...
- gwsignal interface: O'Shaughnessy et al in prep ('finer net')
- TEOBResumS eccentric interface: Iglesias et al
- TEOBResumS hyperbolic interface: Henshaw, Lange et al
- If you are using an updated Monte Carlo integration package, please acknowledge the authors; papers will be prepared soon:
- GMM integrator: Elizabeth Champion; see the original repo; implemented via MonteCarloEnsemble and mcsamplerEnsemble; please cite Ristic et al https://arxiv.org/abs/2105.07013 (a schematic sketch of the underlying adaptive-mixture idea follows this list)
- GPU MC integrator: Wysocki, O'Shaughnessy; cite Wofford et al https://dcc.ligo.org/P2200059
- AV integrator: O'Shaughnessy et al in prep ('finer net'), based on Tiwari et al VARAHA
- Portfolio integrator: ditto
- Oracles: ditto
- If you are using a distance-marginalized likelihood, please acknowledge
- Distance marginalization: Soichiro Morisaki, Dan Wysocki; see Wofford et al https://dcc.ligo.org/P2200059 (the marginalization integral is written out schematically after this list)
- If you are using a special coordinate system
- If you are using calibration marginalization, please acknowledge Ethan Payne (PRD 122004 (2020)), with RIFT integration provided by Jacob Lange and Daniel Williams
- If you are using the "hyperpipeline" for EOS inference or other projects, please acknowledge Kedia et al (arXiv:2405.17326)
- If you are using LISA-RIFT, please acknowledge Jan et al
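As background for the GMM-based integrators mentioned above: they adapt the Monte Carlo proposal by fitting a Gaussian mixture to the highest-weight samples and drawing the next batch from that mixture. The sketch below is a minimal, self-contained illustration of that idea on a toy two-dimensional integrand using numpy/scipy/scikit-learn. It is not the RIFT mcsamplerEnsemble API; the integrand, component count, and all tuning choices are illustrative assumptions only.

```python
# Minimal sketch of adaptive importance sampling with a Gaussian-mixture proposal.
# Illustration only -- not the RIFT mcsamplerEnsemble API.
import numpy as np
from scipy import stats
from sklearn.mixture import GaussianMixture

def integrand(x):
    # Toy "likelihood": a narrow 2D Gaussian bump near (0.3, 0.7); integrates to ~1.
    return stats.multivariate_normal(mean=[0.3, 0.7], cov=0.01 * np.eye(2)).pdf(x)

rng = np.random.default_rng(0)
n_samples, n_iterations = 2000, 5

# Start with a uniform proposal on the unit square.
samples = rng.uniform(size=(n_samples, 2))
proposal_pdf = np.ones(n_samples)

estimate = None
for _ in range(n_iterations):
    weights = integrand(samples) / proposal_pdf      # importance weights
    estimate = weights.mean()                        # current integral estimate
    # Adapt the proposal: resample points in proportion to their weights,
    # then fit a Gaussian mixture to the resampled points.
    idx = rng.choice(n_samples, size=n_samples, p=weights / weights.sum())
    gmm = GaussianMixture(n_components=2, reg_covar=1e-6, random_state=0)
    gmm.fit(samples[idx])
    # Draw the next batch from the adapted proposal.
    samples, _ = gmm.sample(n_samples)
    proposal_pdf = np.exp(gmm.score_samples(samples))

print("integral estimate:", estimate)  # approaches 1 for this toy integrand
```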
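Similarly, for the distance-marginalized likelihood mentioned above, the marginalization amounts to integrating the likelihood, at fixed values of the remaining parameters, against a distance prior. Schematically (generic symbols, not specific RIFT variable names):

$$
\mathcal{L}_{\rm marg}(\theta) \,=\, \int \mathcal{L}(\theta, D_L)\, p(D_L)\, \mathrm{d}D_L
$$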
Author lists: opt-in model
Several aspects of this code are very actively developed. We encourage close collaboration with the lead developers (O'Shaughnessy and Lange) to produce the best possible results, particularly given the comparatively rapid changes to the interface and pipeline in the past, and those planned for the future as the user and developer base expands.
We expect to make the final developed code widely available and free to use, in a release-based distribution model, but we're not there yet. To simplify discussions about author lists and etiquette, we have adopted the following simple model, to be revised four times per year (1/15, 4/15, 7/15, 10/15):
- Free to use: Any code commit older than 3 years from the date an analysis commences is free to use for any scientific work.
- Opt-in: Any more recent use should offer (opt-in) authorship to the lead developers, as well as to developers who contributed significantly to the features used in the version of the code adopted in an analysis. Loosely speaking, the newer the features you use, the more proactive you should be in contacting the relevant developers, to ensure all authors are suitably engaged with the final product. This policy refers only to commits in this repository, not to resources and code maintained elsewhere or by other developers (e.g., NR surrogates), who presumably maintain their own policies.
The following authors should be contacted:
- O'Shaughnessy and Lange: Iterative pipeline, fitting and posterior generation code, external interfaces (EOB, surrogates)
- Field, O'Shaughnessy, Blackman: Surrogate basis method
- Wysocki, O'Shaughnessy, Fong, Lange: GPU optimizations
- ...
Relationship to rapid_pe
RIFT and rapid_pe are two forks of the original implementation presented in Pankow et al. 2015.
RIFT and rapid_pe are now disseminated in a common git repository, https://git.ligo.org/rapidpe-rift/, and share common code (i.e., the integrate_likelihood_extrinsic_batchmode executable).
Version numbers
Short term: roughly major.minor.feature_upgrade.internal_rc_candidates. The 4th number is upgraded every few major bugfixes or moves; the 3rd number is upgraded when we add a feature (for example, 0.0.17.0rc1 denotes feature upgrade 17 and internal candidate 0rc1). We hope to eventually reach version 0.1 for production automated unsupervised analysis during O4.
Medium term: a major API change or two that we are considering for how users specify workflows should bring version 0.2.
Long term: version 1 will reduce the dependency on hardcoded parameter names and allow more flexibility in how inference is done.