
A language for mental models


memo is a new probabilistic programming language for expressing computational cognitive models involving sophisticated recursive reasoning, and for performing fast enumerative inference on such models. memo inherits from the tradition of WebPPL-based Bayesian modeling (see probmods, agentmodels, and problang), but aims to make models easier to write and run by taking advantage of modern programming language techniques and hardware capabilities.

memo stands for: mental modeling, memoized matrix operations, model-expressed-model-optimized, and metacognitive memos.

[!NOTE] The version of memo in this repository is an early-stage research prototype. We are making every effort to make memo safe and easy to use for our early adopters; however, there may be some sharp edges and the language may occasionally change in backward-incompatible ways. Our goal is to offer a first stable release of memo in February 2025.

Installing memo

  1. memo is based on Python. Before installing memo, make sure you have Python 3.12 or higher installed. You can check this by running python --version. (As of writing, Python 3.12 is the latest version of Python, and memo depends on several of its powerful new features.)
  2. Next, install JAX, a Python module that memo uses to produce fast, differentiable, GPU-enabled code. If you don't have a GPU, then running pip install jax should be enough. Otherwise, please consult the JAX website for installation instructions. You can check if JAX is installed by running import jax in Python.
  3. Finally, install memo by running pip install memo-lang. You can check if memo is installed by running from memo import memo in Python.
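The version requirement in step 1 can also be checked programmatically. This is a minimal sketch (the helper `python_supported` is our own illustration, not part of memo's API):

```python
import sys

def python_supported(version_info=sys.version_info):
    """memo requires Python 3.12 or newer."""
    return version_info >= (3, 12)

if python_supported():
    print("Python version OK for memo")
else:
    print("Please upgrade to Python 3.12+ before installing memo")
```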

[!WARNING] Make sure to install memo-lang, not memo. The latter is a different package, unrelated to this project!

Getting started

Once you have installed memo, take a look at the Memonomicon for a tour of the language, and an example of how to build a model and fit it to data by parallel grid search and/or gradient descent.

This repository also includes several classical examples of recursive reasoning models implemented in memo.

FAQ

When should I use memo rather than Gen or WebPPL?

memo's core competence is fast tabular/enumerative inference on models with recursive reasoning. That covers a wide range of common models: from RSA, to POMDP planning (value iteration is just a sequence of tabular operations), to inverse planning. In general, if you are making nested queries, we recommend using memo.
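To give a sense of why planning reduces to tabular operations, here is a sketch of value iteration in plain NumPy. This is not memo syntax, and the two-state MDP below is invented for illustration:

```python
import numpy as np

# A tiny invented MDP: 2 states, 2 actions.
# T[a, s, s'] = transition probability; R[a, s] = immediate reward.
T = np.array([[[0.9, 0.1], [0.2, 0.8]],
              [[0.5, 0.5], [0.0, 1.0]]])
R = np.array([[1.0, 0.0],
              [0.0, 2.0]])
gamma = 0.9  # discount factor

V = np.zeros(2)  # value of each state
for _ in range(200):
    # Q[a, s] = R[a, s] + gamma * sum_s' T[a, s, s'] * V[s']
    Q = R + gamma * (T @ V)   # one tabular update over ALL (a, s) pairs
    V = Q.max(axis=0)         # greedy backup

print(np.round(V, 2))
```

Every iteration is a fixed-shape array operation over the whole state-action table, which is exactly the kind of computation memo compiles to.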

There are however two particular cases where you may prefer another PPL:

  1. If you are interested specifically in modeling a sophisticated inference scheme, such as MCMC, particle filters, or variational inference, then we recommend trying Gen. (But make sure you really need those tools — the fast enumerative inference provided by memo is often sufficient for many common kinds of models!)
  2. If you are performing inference over an unbounded domain of hypotheses with varied structure, such as programs generated by a grammar, then we recommend trying Gen or WebPPL because memo's tabular enumerative inference can only handle probability distributions with finite support. (But if you are okay with inference over a "truncated" domain, e.g. the top 1,000,000 shortest programs, then memo can do that! Similarly, memo can handle continuous domains by discretizing finely.)
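The discretization trick in point 2 can be sketched in plain NumPy (again, not memo syntax; the coin-flip data are invented). A continuous parameter is placed on a finite grid, after which exact enumerative inference applies:

```python
import numpy as np

# Discretize a continuous parameter (a coin's bias) onto a finite grid,
# then perform exact enumerative inference over that grid.
grid = np.linspace(0.01, 0.99, 99)       # finite support
prior = np.ones_like(grid) / grid.size   # uniform prior over the grid

heads, tails = 7, 3                      # invented observations
likelihood = grid**heads * (1 - grid)**tails

posterior = prior * likelihood
posterior /= posterior.sum()             # normalize over the grid

print(grid[posterior.argmax()])          # MAP estimate of the bias
```

Refining the grid trades memory and compute for accuracy; because the whole computation is a fixed-size array program, a finer grid parallelizes just as well.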

Why does memo have those two limitations?

By specializing memo to a particular commonly-used class of models and inference strategies, we are able to produce extremely fast code that is difficult for general-purpose PPLs to produce.

Okay, so how does memo produce such fast code?

memo compiles enumerative inference to JAX array programs, which run extremely fast. Array programs are inherently easy to execute in parallel (operations apply to each element of the array independently), and modern hardware excels at exactly this kind of parallel processing.
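As a concrete illustration of "inference as an array program," here is Bayes' rule computed over an entire table of hypotheses and observations at once, using NumPy broadcasting (the numbers are invented; memo generates analogous JAX code for you):

```python
import numpy as np

# Enumerative inference as array operations: build a joint table over
# hypotheses (rows) and observations (columns), then normalize along axes.
likelihood = np.array([[0.8, 0.2],      # P(obs | h0)
                       [0.3, 0.7]])     # P(obs | h1)
prior = np.array([0.5, 0.5])            # P(h)

joint = prior[:, None] * likelihood     # P(h, obs) via broadcasting
posterior = joint / joint.sum(axis=0)   # P(h | obs), one column per obs

print(posterior)
```

Every hypothesis-observation pair is computed simultaneously; there is no per-hypothesis loop for the hardware to serialize.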

What exactly is JAX?

JAX is a library developed by Google that takes Python array programs (similar to NumPy) and compiles them to very fast code that can run on CPUs and GPUs, taking advantage of modern hardware functionality. JAX powers much of Google's deep learning work, because neural networks consist largely of array operations. memo compiles your probabilistic models into JAX array programs, and JAX in turn compiles those array programs into machine code.
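For a feel of the NumPy-to-machine-code pipeline, here is a minimal standalone JAX example (unrelated to memo itself): `jax.jit` traces the function once and compiles it via XLA.

```python
import jax
import jax.numpy as jnp

# jax.jit traces this NumPy-like function and compiles it to fast
# machine code for the available backend (CPU, GPU, or TPU).
@jax.jit
def normalize(logits):
    p = jnp.exp(logits - logits.max())  # subtract max for stability
    return p / p.sum()

probs = normalize(jnp.array([0.0, 1.0, 2.0]))
print(probs)
```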

I'm having trouble using JAX.

It's true that JAX has some unintuitive behaviors. We recommend reading this guide to get a sense of its "sharp edges."

I installed memo but importing memo gives an error.

Did you accidentally pip-install the (unrelated) package memo instead of memo-lang?

Download files

Source distribution: memo_lang-0.2.0.tar.gz (20.8 kB)

Built distribution: memo_lang-0.2.0-py3-none-any.whl (19.6 kB)
