
Differentiable cosmological emulators


CosmoPower-JAX



CosmoPower-JAX is an extension of the CosmoPower framework that emulates cosmological power spectra in a differentiable way. With CosmoPower-JAX you can efficiently run Hamiltonian Monte Carlo with hundreds of parameters (for example, nuisance parameters describing systematic effects) on CPUs and GPUs, in a fraction of the time required by traditional methods. We provide some examples of how to use the neural emulators below, and more applications in our paper. You can also have a look at our poster presented at the ML-IAP/CCA-2023 conference, which includes a video on how to use CosmoPower-JAX.

Of course, with CosmoPower-JAX you can also obtain efficient and differentiable predictions of cosmological power spectra. We show how to achieve this in less than 5 lines of code below.

Installation

To install CosmoPower-JAX, you can simply use pip:

pip install cosmopower-jax

We recommend doing it in a fresh conda environment, to avoid clashes (e.g. conda create -n cpj python=3.9 && conda activate cpj).

Alternatively, you can:

git clone https://github.com/dpiras/cosmopower-jax.git
cd cosmopower-jax
pip install . 

The latter will also give you access to a Jupyter notebook with some examples.

Usage & example

After the installation, getting a cosmological power spectrum prediction is as simple as (e.g. for the CMB temperature power spectrum):

import numpy as np
from cosmopower_jax.cosmopower_jax import CosmoPowerJAX as CPJ
# omega_b, omega_cdm, h, tau, n_s, ln10^10A_s
cosmo_params = np.array([0.025, 0.11, 0.68, 0.1, 0.97, 3.1])
emulator = CPJ(probe='cmb_tt')
emulator_predictions = emulator.predict(cosmo_params)

Similarly, we can compute derivatives of the emulator output with respect to the input parameters:

emulator_derivatives = emulator.derivative(cosmo_params)
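Because the emulator is written in JAX, its predictions compose with any JAX transformation. The pattern is sketched below with a hypothetical stand-in for `emulator.predict` (so the snippet runs without a trained model); it assumes only that JAX is installed:

```python
import jax
import jax.numpy as jnp

# Hypothetical stand-in for emulator.predict: any differentiable
# function mapping parameters to a "spectrum" works the same way.
def fake_predict(params):
    return jnp.exp(-params[0]) * jnp.arange(1.0, 6.0) ** params[1]

cosmo_params = jnp.array([0.3, 1.5])

# Jacobian of the spectrum with respect to the input parameters
# (conceptually what emulator.derivative returns for the real emulator).
jacobian = jax.jacfwd(fake_predict)(cosmo_params)
# jacobian.shape == (5, 2): one row per output bin, one column per parameter

# Gradient of any scalar summary, e.g. a log-posterior term for HMC:
loss = lambda p: jnp.sum(fake_predict(p) ** 2)
grad = jax.grad(loss)(cosmo_params)
# grad.shape == (2,)
```

The same composition works with `jax.jit` and `jax.vmap`, which is what makes gradient-based samplers over many parameters efficient.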

Rather than passing an array, you can also pass a dictionary, following the original CosmoPower syntax:

cosmo_params = {'omega_b': [0.025],
                'omega_cdm': [0.11],
                'h': [0.68],
                'tau_reio': [0.1],
                'n_s': [0.97],
                'ln10^{10}A_s': [3.1],
                }
emulator = CPJ(probe='cmb_tt')
emulator_predictions = emulator.predict(cosmo_params)
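The two input styles are equivalent: the dictionary values correspond, in the order expected by the trained network, to the entries of the array in the first example. A rough sketch of that correspondence (the helper below is illustrative only, not the library's actual code):

```python
import numpy as np

cosmo_dict = {'omega_b': [0.025],
              'omega_cdm': [0.11],
              'h': [0.68],
              'tau_reio': [0.1],
              'n_s': [0.97],
              'ln10^{10}A_s': [3.1]}

# Illustrative helper: stack the dictionary values column-wise,
# giving one row per set of cosmological parameters.
def dict_to_array(d, keys):
    return np.stack([np.asarray(d[k]) for k in keys], axis=-1)

params = dict_to_array(cosmo_dict, list(cosmo_dict))
# params.shape == (1, 6); params[0] matches the array example above
```

Lists with more than one value per key would produce a batch of parameter sets, one row each.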

We also support reusing original CosmoPower models, which you can now use in JAX without retraining. In that case, you should:

   git clone https://github.com/dpiras/cosmopower-jax.git
   cd cosmopower-jax

and move your trained model's .pkl file(s) into the folder cosmopower_jax/trained_models. At this point:

  • if you can call your models from the cosmopower-jax folder you are in, you should be good to go;
  • otherwise, run first pip install ., and then you should be able to call your custom models from anywhere.

To finally call a custom model, you can run:

from cosmopower_jax.cosmopower_jax import CosmoPowerJAX as CPJ
emulator_custom = CPJ(probe='custom_log', filename='<custom_filename>.pkl')

where <custom_filename>.pkl is the filename (only, with no path) of your custom model, and custom_log indicates that your model was trained on log-spectra, so all predictions are returned raised to the power of 10. Alternatively, you can pass custom_pca to obtain predictions for a model trained with PCAplusNN. In either case, the parameter dictionary should contain the parameter keys corresponding to your trained model. We also allow the full filepath of the trained model to be indicated: in this case, do not specify filename, and indicate the full filepath including the suffix.
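To make the custom_log convention concrete: the network is trained on the log10 of the spectrum, and CosmoPower-JAX applies the exponentiation for you before returning predictions. A toy illustration of that final step (stand-in numbers, not a real model output):

```python
import numpy as np

# Suppose the network's raw output is the log10 of the spectrum...
raw_network_output = np.array([3.2, 3.0, 2.7])

# ...then a probe='custom_log' emulator returns 10**output, i.e. the
# spectrum itself; this is the only transformation implied by the name.
spectrum = 10.0 ** raw_network_output
# spectrum ≈ [1584.9, 1000.0, 501.2]
```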

We provide a full walkthrough and all instructions in the accompanying Jupyter notebook, and we describe CosmoPower-JAX in detail in the release paper. We currently do not provide the code to train a neural-network model in JAX; if you would like to re-train a JAX-based neural network on different data, raise an issue or contact Davide Piras.

Note if you are using TensorFlow>=2.14

If you are reusing a model trained with CosmoPower and have TensorFlow version 2.14 or higher, you might get an error when trying to load the model, even in CosmoPower-JAX. This is a known issue. In this case, run the convert_tf214.py script available in this repository to convert your .pkl file into a different, NumPy-based format that CosmoPower-JAX can read. The conversion is needed only once per .pkl file; make sure you run pip install . after the conversion, and everything else remains unchanged.

Contributing and contacts

Feel free to fork this repository to work on it; otherwise, please raise an issue or contact Davide Piras.

Citation

If you use CosmoPower-JAX in your work, please cite both papers as follows:

@article{Piras23,
         author = {{Piras}, Davide and {Spurio Mancini}, Alessio},
         title = "{CosmoPower-JAX: high-dimensional Bayesian inference 
         with differentiable cosmological emulators}",
         journal = {The Open Journal of Astrophysics},
         keywords = {Astrophysics - Cosmology and Nongalactic Astrophysics, 
         Astrophysics - Instrumentation and Methods for Astrophysics, 
         Computer Science - Machine Learning},
         year = 2023,
         month = jul,
         volume = {6},
         eid = {20},
         pages = {20},
         doi = {10.21105/astro.2305.06347},
         archivePrefix = {arXiv},
         eprint = {2305.06347},
         primaryClass = {astro-ph.CO}
         }

@article{SpurioMancini2022,
         title={CosmoPower: emulating cosmological power spectra for 
         accelerated Bayesian inference from next-generation surveys},
         volume={511},
         ISSN={1365-2966},
         url={http://dx.doi.org/10.1093/mnras/stac064},
         DOI={10.1093/mnras/stac064},
         number={2},
         journal={Monthly Notices of the Royal Astronomical Society},
         publisher={Oxford University Press (OUP)},
         author={Spurio Mancini, Alessio and Piras, Davide and 
         Alsing, Justin and Joachimi, Benjamin and Hobson, Michael P},
         year={2022},
         month={Jan},
         pages={1771–1788}
         }

License

CosmoPower-JAX is released under the GPL-3 license (see LICENSE), subject to a non-commercial use condition (see LICENSE_EXT).

 CosmoPower-JAX     
 Copyright (C) 2023 Davide Piras & contributors

 This program is released under the GPL-3 license (see LICENSE), 
 subject to a non-commercial use condition (see LICENSE_EXT).

 This program is distributed in the hope that it will be useful,
 but WITHOUT ANY WARRANTY; without even the implied warranty of
 MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.


File details

Details for the file cosmopower_jax-0.5.5.tar.gz.

File metadata

  • Download URL: cosmopower_jax-0.5.5.tar.gz
  • Upload date:
  • Size: 81.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for cosmopower_jax-0.5.5.tar.gz
Algorithm Hash digest
SHA256 b14a43917fa1c4b038fabfe21652c29fc67191a7e630f12e2c5eaf23038d4f6b
MD5 621c3443ed1a29b25e75c71ff9f7f2b2
BLAKE2b-256 020461d401d73f4d32dae012a70ab45a5799209d07c810bd0b4b9edf814f7292

See more details on using hashes here.

