Probabilistic modeling and statistical inference in TensorFlow
TensorFlow Probability
TensorFlow Probability is a library for probabilistic reasoning and statistical analysis in TensorFlow. As part of the TensorFlow ecosystem, TensorFlow Probability provides integration of probabilistic methods with deep networks, gradient-based inference via automatic differentiation, and scalability to large datasets and models via hardware acceleration (e.g., GPUs) and distributed computation.
Our probabilistic machine learning tools are structured as follows.
Layer 0: TensorFlow. Numerical operations. In particular, the LinearOperator class enables matrix-free implementations that can exploit special structure (diagonal, low-rank, etc.) for efficient computation. It is built and maintained by the TensorFlow Probability team and is now part of tf.linalg in core TF.
Layer 1: Statistical Building Blocks
- Distributions (tfp.distributions): A large collection of probability distributions and related statistics with batch and broadcasting semantics. See the Distributions Tutorial.
- Bijectors (tfp.bijectors): Reversible and composable transformations of random variables. Bijectors provide a rich class of transformed distributions, from classical examples like the log-normal distribution to sophisticated deep learning models such as masked autoregressive flows.
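The change-of-variables rule that bijectors implement can be sketched without TFP at all. The following NumPy snippet (a minimal illustration, not TFP's API) derives the log-normal density by pushing a standard normal through exp, which is the classical transformed-distribution example mentioned above:

```python
import numpy as np

def normal_logpdf(z, loc=0.0, scale=1.0):
    """Log-density of a univariate normal distribution."""
    return -0.5 * np.log(2 * np.pi * scale**2) - 0.5 * ((z - loc) / scale) ** 2

def lognormal_logpdf(x):
    """Density of X = exp(Z), Z ~ Normal(0, 1), by change of variables:
    log p_X(x) = log p_Z(log x) - log|dx/dz| = log p_Z(log x) - log x."""
    return normal_logpdf(np.log(x)) - np.log(x)

print(lognormal_logpdf(2.0))
```

A bijector bundles exactly these three pieces: the forward map, its inverse, and the log-determinant of the Jacobian used in the density correction.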
Layer 2: Model Building
- Edward2 (tfp.edward2): A probabilistic programming language for specifying flexible probabilistic models as programs. See the Edward2 README.md.
- Probabilistic Layers (tfp.layers): Neural network layers with uncertainty over the functions they represent, extending TensorFlow Layers.
- Trainable Distributions (tfp.trainable_distributions): Probability distributions parameterized by a single Tensor, making it easy to build neural nets that output probability distributions.
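The idea behind trainable distributions (a single unconstrained parameter vector mapped to valid distribution parameters) can be sketched in plain NumPy. The `unpack` and `neg_log_likelihood` helpers below are hypothetical illustrations, not TFP's API:

```python
import numpy as np

def unpack(params):
    """Map an unconstrained 2-vector to (loc, scale); softplus keeps scale > 0."""
    loc = params[0]
    scale = np.log1p(np.exp(params[1]))  # softplus
    return loc, scale

def neg_log_likelihood(params, data):
    """Negative log-likelihood of data under Normal(loc, scale)."""
    loc, scale = unpack(params)
    return np.sum(0.5 * np.log(2 * np.pi * scale**2) + 0.5 * ((data - loc) / scale) ** 2)

data = np.array([0.1, -0.2, 0.3])
print(neg_log_likelihood(np.array([0.0, 0.0]), data))  # a neural net would output `params`
```

In the TFP setting, a network's final layer emits the parameter tensor and the loss is the distribution's negative log-probability of the observations.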
Layer 3: Probabilistic Inference
- Markov chain Monte Carlo (tfp.mcmc): Algorithms for approximating integrals via sampling. Includes Hamiltonian Monte Carlo, random-walk Metropolis-Hastings, and the ability to build custom transition kernels.
- Variational Inference (tfp.vi): Algorithms for approximating integrals via optimization.
- Optimizers (tfp.optimizer): Stochastic optimization methods, extending TensorFlow Optimizers. Includes Stochastic Gradient Langevin Dynamics.
- Monte Carlo (tfp.monte_carlo): Tools for computing Monte Carlo expectations.
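As a rough sketch of what tfp.mcmc automates, here is random-walk Metropolis-Hastings in plain NumPy targeting a standard normal. This is a minimal illustration, not TFP's transition-kernel API:

```python
import numpy as np

def random_walk_metropolis(log_prob, init, steps, step_size=1.0, seed=0):
    """Random-walk Metropolis-Hastings: propose x' = x + noise and accept
    with probability min(1, p(x') / p(x))."""
    rng = np.random.default_rng(seed)
    x = init
    samples = []
    for _ in range(steps):
        proposal = x + step_size * rng.normal()
        if np.log(rng.uniform()) < log_prob(proposal) - log_prob(x):
            x = proposal
        samples.append(x)
    return np.array(samples)

# Unnormalized log-density of a standard normal; MCMC only needs density ratios.
samples = random_walk_metropolis(lambda x: -0.5 * x**2, init=3.0, steps=5000)
print(samples.mean(), samples.var())
```

The sample mean and variance approach 0 and 1 as the chain mixes; TFP's kernels implement the same accept/reject loop with vectorized chains and gradient-based proposals such as HMC.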
TensorFlow Probability is under active development. Interfaces may change at any time.
Examples
See tensorflow_probability/examples/ for end-to-end examples. It includes tutorial notebooks such as:
- Linear Mixed Effects Models. A hierarchical linear model for sharing statistical strength across examples.
- Eight Schools. A hierarchical normal model for exchangeable treatment effects.
- Hierarchical Linear Models. Hierarchical linear models compared among TensorFlow Probability, R, and Stan.
- Bayesian Gaussian Mixture Models. Clustering with a probabilistic generative model.
- Probabilistic Principal Components Analysis. Dimensionality reduction with latent variables.
- Gaussian Copulas. Probability distributions for capturing dependence across random variables.
- TensorFlow Distributions: A Gentle Introduction. Introduction to TensorFlow Distributions.
- Understanding TensorFlow Distributions Shapes. How to distinguish between samples, batches, and events for arbitrarily shaped probabilistic computations.
- TensorFlow Probability Case Study: Covariance Estimation. A user's case study in applying TensorFlow Probability to estimate covariances.
It also includes example scripts such as:
- Variational Autoencoders. Representation learning with a latent code and variational inference.
- Vector-Quantized Autoencoder. Discrete representation learning with vector quantization.
- Disentangled Sequential Variational Autoencoder. Disentangled representation learning over sequences with variational inference.
- Grammar Variational Autoencoder. Representation learning over productions in a context-free grammar.
- Latent Dirichlet Allocation (Distributions version, Edward2 version). Mixed membership modeling for capturing topics in a document.
- Deep Exponential Family. A deep, sparse generative model for discovering a hierarchy of topics.
- Bayesian Neural Networks. Neural networks with uncertainty over their weights.
- Bayesian Logistic Regression. Bayesian inference for binary classification.
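The sample/batch/event distinction covered in the shapes tutorial above can be previewed with plain NumPy broadcasting. This sketch mimics, but does not use, TFP's shape semantics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Three independent scalar normals, analogous to batch_shape = (3,).
locs = np.array([-1.0, 0.0, 1.0])

# Drawing 2 samples from each prepends a sample shape of (2,):
samples = rng.normal(loc=locs, scale=1.0, size=(2, 3))
print(samples.shape)  # (2, 3): sample_shape (2,) followed by batch_shape (3,)
```

In TFP, a `Normal(loc=[-1., 0., 1.], scale=1.)` distribution would report `batch_shape=[3]`, and `sample(2)` would likewise return a `[2, 3]` tensor.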
Installation
Stable Builds
To install the latest version, run the following:
```shell
# Notes:
# - We recommend that users move towards using TensorFlow 2.x as soon as
#   possible. Until the TF2 stable package is released (due in Sep. 2019),
#   the best way to use TFP with TF2 is to use nightly TFP and TF2 packages:
#   - Nightly TFP: tfp-nightly (http://pypi.python.org/pypi/tfp-nightly)
#   - Nightly TF2: tf-nightly-2.0-preview (http://pypi.python.org/pypi/tf-nightly-2.0-preview)
#   Once the TF2 stable release comes out, TFP will issue its 0.8.0 release,
#   which will be tested and stable against TF 2.0.0.
# - You need the latest version of `pip` in order to get the latest version of
#   `tf-nightly-2.0-preview`.
# - For GPU TF, use `tf-nightly-gpu-2.0-preview`.
# - The `--upgrade` flag ensures you'll get the latest version.
# - The `--user` flag ensures the packages are installed to your user directory
#   rather than the system directory.
python -m pip install pip --upgrade --user
python -m pip install tf-nightly-2.0-preview tfp-nightly --upgrade --user

TF_VERSION=$(python -c 'import tensorflow; print(tensorflow.__version__)')
# If you have an older pip, you might get this older version of
# tf-nightly-2.0-preview, so check to be sure.
[[ $TF_VERSION == '2.0.0-dev20190731' ]] && echo >&2 "Failed to install the most recent TF. Found: ${TF_VERSION}."
```
TensorFlow Probability depends on a recent stable release of TensorFlow (pip package `tensorflow`). See the TFP release notes for details about dependencies between TensorFlow and TensorFlow Probability.
Note: Since TensorFlow is not included as a dependency of the TensorFlow Probability package (in `setup.py`), you must explicitly install the TensorFlow package (`tensorflow` or `tensorflow-gpu`). This allows us to maintain one package instead of separate packages for CPU- and GPU-enabled TensorFlow.
To force a Python 3-specific install, replace `pip` with `pip3` in the above commands. For additional installation help, guidance installing prerequisites, and (optionally) setting up virtual environments, see the TensorFlow installation guide.
Nightly Builds
There are also nightly builds of TensorFlow Probability under the pip package `tfp-nightly`, which depends on one of `tf-nightly`, `tf-nightly-gpu`, `tf-nightly-2.0-preview` or `tf-nightly-gpu-2.0-preview`. Nightly builds include newer features, but may be less stable than the versioned releases. Docs are periodically refreshed here.
Installing from Source
You can also install from source. This requires the Bazel build system.
```shell
# sudo apt-get install bazel git python-pip  # Ubuntu; others, see above links.
git clone https://github.com/tensorflow/probability.git
cd probability
bazel build --copt=-O3 --copt=-march=native :pip_pkg
PKGDIR=$(mktemp -d)
./bazel-bin/pip_pkg $PKGDIR
pip install --user --upgrade $PKGDIR/*.whl
```
Community
As part of TensorFlow, we're committed to fostering an open and welcoming environment.
- Stack Overflow: Ask or answer technical questions.
- GitHub: Report bugs or make feature requests.
- TensorFlow Blog: Stay up to date on content from the TensorFlow team and best articles from the community.
- YouTube Channel: Follow TensorFlow shows.
- tfprobability@tensorflow.org: Open mailing list for discussion and questions.
See the TensorFlow Community page for more details. Check out our latest publicity here:
- Coffee with a Googler: Probabilistic Machine Learning in TensorFlow
- Introducing TensorFlow Probability
Contributing
We're eager to collaborate with you! See CONTRIBUTING.md
for a guide on how to contribute. This project adheres to TensorFlow's
code of conduct. By participating, you are expected to
uphold this code.
References
If you use TensorFlow Probability in a paper, please cite:
- TensorFlow Distributions. Joshua V. Dillon, Ian Langmore, Dustin Tran, Eugene Brevdo, Srinivas Vasudevan, Dave Moore, Brian Patton, Alex Alemi, Matt Hoffman, Rif A. Saurous. arXiv preprint arXiv:1711.10604, 2017.
(We're aware there's a lot more to TensorFlow Probability than Distributions, but the Distributions paper lays out our vision and is a fine thing to cite for now.)