Machine Learning-accelerated Bayesian inference
Overview • Documentation • Installation • Getting Started • Training • Trained Models • Likelihoods • Support • Citation
Overview
CosmoPower is a library for Machine Learning-accelerated Bayesian inference. While the emphasis is on building algorithms to accelerate Bayesian inference in cosmology, the interdisciplinary nature of the methodologies implemented in the package allows for their application across a wide range of scientific fields. The ultimate goal of CosmoPower is to solve inverse problems in science, by developing Bayesian inference pipelines that leverage the computational power of Machine Learning to accelerate the inference process. This approach represents a principled application of Machine Learning to scientific research, with the Machine Learning component embedded within a rigorous framework for uncertainty quantification.
In cosmology, CosmoPower aims to become a fully differentiable library for cosmological analyses. Currently, CosmoPower provides neural network emulators of matter and Cosmic Microwave Background power spectra. These emulators can be used to replace Boltzmann codes such as CAMB or CLASS in cosmological inference pipelines, to source the power spectra needed for two-point statistics analyses. This provides orders-of-magnitude acceleration of the inference pipeline and integrates naturally with efficient techniques for sampling very high-dimensional parameter spaces. The power spectra emulators implemented in CosmoPower, first presented in its release paper, have been applied to the analysis of real cosmological data from experiments, and have been tested against the accuracy requirements for the analysis of next-generation cosmological surveys.
CosmoPower is written entirely in Python. Neural networks are implemented using the TensorFlow library.
Documentation
Comprehensive documentation is available here.
Installation
We recommend installing CosmoPower within a Conda virtual environment. For example, to create and activate an environment called cp_env, use:

```bash
conda create -n cp_env python=3.7 pip && conda activate cp_env
```

Once inside the environment, you can install CosmoPower:
- from PyPI

  ```bash
  pip install cosmopower
  ```

  To test the installation, you can use

  ```bash
  python3 -c 'import cosmopower as cp'
  ```

  If you do not have a GPU on your machine, you will see a warning message about it, which you can safely ignore.

- from source

  ```bash
  git clone https://github.com/alessiospuriomancini/cosmopower
  cd cosmopower
  pip install .
  ```

  To test the installation, you can use

  ```bash
  pytest
  ```
Getting Started
CosmoPower currently provides two ways to emulate power spectra, implemented in the classes cosmopower_NN and cosmopower_PCAplusNN:

| cosmopower_NN | cosmopower_PCAplusNN |
| --- | --- |
| a neural network mapping cosmological parameters directly to (log)-power spectra | a neural network mapping cosmological parameters to coefficients of a Principal Component Analysis (PCA) of the (log)-power spectra |
Below you can find minimal working examples that use CosmoPower pre-trained models from the code release paper, shared in the trained_models folder (see the Trained Models section for details), to predict power spectra for a given set of input parameters. You need to clone the repository and replace /path/to/cosmopower with the location of the cloned repository to make these examples work. Further examples are available as demo notebooks in the getting_started_notebooks folder, for both cosmopower_NN and cosmopower_PCAplusNN.
Note that, whenever possible, we recommend working with models trained on log-power spectra, to reduce the dynamic range. Both cosmopower_NN and cosmopower_PCAplusNN have methods to provide predictions (cf. cp_pca_nn.predictions_np in the example below) as well as "10^predictions" (cf. cp_nn.ten_to_predictions_np in the example below).
Using cosmopower_NN:

```python
import cosmopower as cp

# load pre-trained NN model: maps cosmological parameters to CMB TT log-C_ell
cp_nn = cp.cosmopower_NN(restore=True,
                         restore_filename='/path/to/cosmopower'
                                          + '/cosmopower/trained_models/CP_paper/CMB/cmb_TT_NN')

# create a dict of cosmological parameters
params = {'omega_b': [0.0225],
          'omega_cdm': [0.113],
          'h': [0.7],
          'tau_reio': [0.055],
          'n_s': [0.96],
          'ln10^{10}A_s': [3.07],
          }

# predictions (= forward pass through the network) -> 10^predictions
spectra = cp_nn.ten_to_predictions_np(params)
```

Using cosmopower_PCAplusNN:

```python
import cosmopower as cp

# load pre-trained PCA+NN model: maps cosmological parameters to CMB TE C_ell
cp_pca_nn = cp.cosmopower_PCAplusNN(restore=True,
                                    restore_filename='/path/to/cosmopower'
                                                     + '/cosmopower/trained_models/CP_paper/CMB/cmb_TE_PCAplusNN')

# create a dict of cosmological parameters
params = {'omega_b': [0.0225],
          'omega_cdm': [0.113],
          'h': [0.7],
          'tau_reio': [0.055],
          'n_s': [0.96],
          'ln10^{10}A_s': [3.07],
          }

# predictions (= forward pass through the network)
spectra = cp_pca_nn.predictions_np(params)
```
Note that the suffix _np of the predictions_np and ten_to_predictions_np functions refers to their implementation using NumPy. These functions are best suited to standard analysis pipelines fully implemented in Python and typically run on Central Processing Units (CPUs). For pipelines built with the TensorFlow library, which are highly optimised to run on Graphics Processing Units (GPUs), we recommend using the corresponding _tf functions (i.e. predictions_tf and ten_to_predictions_tf), available in both cosmopower_NN and cosmopower_PCAplusNN (see Likelihoods for further details and examples).
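As an illustration, the sketch below shows how the _tf functions could be called from inside a TensorFlow pipeline. It assumes the _tf methods accept the same parameter dictionary as their _np counterparts (here with TensorFlow tensors as values); treat it as a sketch under that assumption, not a verbatim excerpt from the package.

```python
import tensorflow as tf
import cosmopower as cp

# restore the same pre-trained TT emulator used in the example above
cp_nn = cp.cosmopower_NN(restore=True,
                         restore_filename='/path/to/cosmopower'
                                          + '/cosmopower/trained_models/CP_paper/CMB/cmb_TT_NN')

# parameter dictionary with TensorFlow tensors as values
# (assumption: same dict structure as in the _np example above)
params_tf = {'omega_b': tf.constant([0.0225]),
             'omega_cdm': tf.constant([0.113]),
             'h': tf.constant([0.7]),
             'tau_reio': tf.constant([0.055]),
             'n_s': tf.constant([0.96]),
             'ln10^{10}A_s': tf.constant([3.07]),
             }

# forward pass entirely in TensorFlow -> 10^predictions,
# so the computation stays on the GPU and remains differentiable
spectra_tf = cp_nn.ten_to_predictions_tf(params_tf)
```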
Training
The training_notebooks folder contains example notebooks showing how to train the CosmoPower emulators. These notebooks implement emulation of CMB temperature (TT) and lensing potential power spectra as practical examples; the procedure is completely analogous for the matter power spectrum.
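For orientation, here is a minimal training sketch in the spirit of those notebooks. The constructor arguments and the train call follow the pattern used in the training notebooks, but the parameter names, placeholder data, and training settings below are illustrative assumptions; please refer to the notebooks for the exact, tested setup.

```python
import numpy as np
import cosmopower as cp

# placeholder training set (random numbers, for illustration only):
# parameter samples and corresponding log-power spectra
n_samples = 10000
ell_range = np.arange(2, 2509)
model_parameters = ['omega_b', 'omega_cdm', 'h', 'tau_reio', 'n_s', 'ln10^{10}A_s']
training_parameters = {name: np.random.uniform(size=n_samples) for name in model_parameters}
training_log_spectra = np.random.normal(size=(n_samples, len(ell_range)))

# instantiate the direct NN emulator
cp_nn = cp.cosmopower_NN(parameters=model_parameters,
                         modes=ell_range,
                         n_hidden=[512, 512, 512, 512],
                         verbose=True,
                         )

# train and save the model (hypothetical output filename)
cp_nn.train(training_parameters=training_parameters,
            training_features=training_log_spectra,
            filename_saved_model='cmb_TT_NN_custom',
            validation_split=0.1,
            learning_rates=[1e-2, 1e-3, 1e-4, 1e-5, 1e-6],
            batch_sizes=[1024, 1024, 1024, 1024, 1024],
            gradient_accumulation_steps=[1, 1, 1, 1, 1],
            patience_values=[100, 100, 100, 100, 100],
            max_epochs=[1000, 1000, 1000, 1000, 1000],
            )
```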
Trained Models
Trained models are available in the trained_models folder. The folder contains all of the emulators used in the CosmoPower release paper; as new models are trained, they will be shared in this folder, along with a description and BibTeX entry of the relevant paper to be cited when using these models. Please consider sharing your own model in this folder with a pull request!
Please refer to the README file within the trained_models folder for all of the details on the models contained there.
Likelihoods
The likelihoods folder contains examples of likelihood codes sourcing power spectra from CosmoPower. Some of these likelihoods are written in pure TensorFlow, hence they can be run with highly optimised TensorFlow-based samplers, such as the ones from TensorFlow Probability. Being written entirely in TensorFlow, these codes can be massively accelerated by running on Graphics or Tensor Processing Units. We recommend the use of the predictions_tf and ten_to_predictions_tf functions within these pipelines, to compute (log)-power spectra predictions for input parameters. The likelihoods_notebooks folder contains an example of how to run a pure-TensorFlow likelihood, the Planck-lite 2018 TTTEEE likelihood.
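To make the pattern concrete, the sketch below shows how a Gaussian log-likelihood written in TensorFlow could source its theory spectra from a CosmoPower emulator. The data vector, covariance, and overall structure are hypothetical placeholders, not the actual likelihood codes shipped in the likelihoods folder.

```python
import tensorflow as tf
import cosmopower as cp

# restore the TT emulator from the Getting Started example
cp_nn = cp.cosmopower_NN(restore=True,
                         restore_filename='/path/to/cosmopower'
                                          + '/cosmopower/trained_models/CP_paper/CMB/cmb_TT_NN')

@tf.function
def log_likelihood(params_tf, data_cl, inv_cov):
    """Schematic Gaussian log-likelihood sourcing C_ell from the emulator.

    params_tf : dict of TensorFlow tensors (batched cosmological parameters)
    data_cl   : observed spectrum, shape (n_ell,)              -- placeholder
    inv_cov   : inverse data covariance, shape (n_ell, n_ell)  -- placeholder
    """
    theory_cl = cp_nn.ten_to_predictions_tf(params_tf)   # shape (n_batch, n_ell)
    residual = theory_cl - data_cl
    chi2 = tf.einsum('bi,ij,bj->b', residual, inv_cov, residual)
    return -0.5 * chi2
```

Because every operation here is a TensorFlow op, the whole likelihood can be compiled, differentiated, and handed to a TensorFlow Probability sampler without leaving the GPU.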
Contributing, Support, Community
For bugs and feature requests consider using the issue tracker.
Contributions to the code via pull requests are most welcome!
For general support, please send an email to a dot spuriomancini at ucl dot ac dot uk, or post on GitHub Discussions.
Users of CosmoPower are strongly encouraged to join the GitHub Discussions forum to follow the latest news on the code, as well as to discuss all things Machine Learning / Bayesian inference in cosmology!
Citation
If you use CosmoPower at any point in your work, please cite its release paper:
```bibtex
@article{SpurioMancini2022,
    title = {CosmoPower: emulating cosmological power spectra for accelerated Bayesian inference from next-generation surveys},
    volume = {511},
    ISSN = {1365-2966},
    url = {http://dx.doi.org/10.1093/mnras/stac064},
    DOI = {10.1093/mnras/stac064},
    number = {2},
    journal = {Monthly Notices of the Royal Astronomical Society},
    publisher = {Oxford University Press (OUP)},
    author = {Spurio Mancini, Alessio and Piras, Davide and Alsing, Justin and Joachimi, Benjamin and Hobson, Michael P},
    year = {2022},
    month = {Jan},
    pages = {1771--1788}
}
```
If you use a specific likelihood or trained model, then in addition to the release paper please also cite the relevant papers (always listed in the corresponding directory).
License
CosmoPower is released under the GPL-3 license (see LICENSE), subject to a non-commercial use condition (see LICENSE_EXT).

CosmoPower Copyright (C) 2021 A. Spurio Mancini & contributors

This program is released under the GPL-3 license (see LICENSE), subject to a non-commercial use condition (see LICENSE_EXT). This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.