
Getting started

This website serves as a living companion to the tutorial manuscript (coming soon!) to be presented at ICML 2025. It dreams of being a one-stop shop for learning all things about Associative Memory. It’s definitely not there yet.

See the tutorials page for a brief introduction and the list of example notebooks.

Installation

We have tried to streamline the installation of the repo as much as possible.

Pre-requisites

  • Install uv using curl -LsSf https://astral.sh/uv/install.sh | sh
  • Install quarto
  • (Optional) Install conda (or, better yet, mamba) to manage the ffmpeg dependency; this only matters if ffmpeg is not already installed on your system.
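Before running the setup below, you can sanity-check which prerequisites are already on your PATH. A small sketch (`check_prereqs` is an illustrative helper, not a script in this repo):

```shell
# Report whether each named tool is on PATH
check_prereqs() {
  for tool in "$@"; do
    if command -v "$tool" >/dev/null 2>&1; then
      echo "$tool: found"
    else
      echo "$tool: missing"
    fi
  done
}

# ffmpeg is only needed for rendering videos in notebooks
check_prereqs uv quarto ffmpeg
```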

Setting up the environment

From the root of the repo:

uv sync
source .venv/bin/activate
uv run ipython kernel install --user --env VIRTUAL_ENV $(pwd)/.venv --name=amtutorial # Expose venv to ipython

# OPTIONAL: For rendering videos in notebooks
conda install conda-forge::ffmpeg conda-forge::openh264 

# OPTIONAL: For developing the interactive frontend
conda install conda-forge::nodejs
npm install --prefix javascript && npm run build --prefix javascript 

You can view a local version of the website with

uv run nbdev_preview

Development pipelines

To push a complete update to the website:

git checkout main

# Update the website
make deploy && git add . && git commit -m "Update site" && git push

# Push package to pypi
# uv run python scripts/prep_pypi.py && nbdev_pypi # ONLY if `amtutorial/src` was updated

The site will be live on GitHub after a few minutes.

Reference scripts

uv run nbdev_preview                         # Preview website locally
bash scripts/prep_website_deploy.sh          # Sync dependencies, export qmd notebooks to ipynb for colab, and build website
bash scripts/export_qmd_as_ipynb.sh          # Export qmd notebooks to ipynb for colab
uv run python scripts/sync_dependencies.py   # Sync nbdev and pyproject.toml dependencies
uv run python scripts/prep_pypi.py           # Bump patch version and sync dependencies
uv run nbdev_pypi                            # Push to pypi

Website structure

.ipynb versions of the tutorial notebooks are located in tutorial_ipynbs. Set up the uv environment as described above to play with them locally, or run them in Google Colab.

[!NOTE]

The first run of each notebook will be slow. Some of the long-running code is cached after the first run, but the cache will not persist across Colab sessions.
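The caching described above amounts to disk memoization: results of expensive calls are written to disk and reused on later runs within the same session. A minimal stdlib sketch of the pattern (illustrative only; the notebooks' actual caching mechanism may differ, and `disk_cache` and the `.cache/` directory are hypothetical names):

```python
import hashlib
import pickle
from pathlib import Path

CACHE_DIR = Path(".cache")  # hypothetical location; a Colab reset wipes it


def disk_cache(fn):
    """Memoize fn's results on disk, keyed by a hash of the arguments."""
    def wrapper(*args, **kwargs):
        CACHE_DIR.mkdir(exist_ok=True)
        key = hashlib.sha256(
            pickle.dumps((fn.__name__, args, sorted(kwargs.items())))
        ).hexdigest()
        path = CACHE_DIR / f"{key}.pkl"
        if path.exists():  # cache hit: skip the expensive computation
            return pickle.loads(path.read_bytes())
        result = fn(*args, **kwargs)
        path.write_bytes(pickle.dumps(result))
        return result
    return wrapper


@disk_cache
def slow_square(x):
    return x * x


print(slow_square(3))  # computed once, then served from .cache/  → 9
```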

The website (tutorial.amemory.net) is built using an in-house fork of nbdev that lets us develop everything (i.e., the tutorials, the corresponding pip package, and the documentation) from plain-text representations of Jupyter notebooks in .qmd files. The website preserves the folder-based routing of the nbs/ folder.
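For readers unfamiliar with the format, a .qmd notebook is just plain text: YAML front matter, markdown prose, and fenced executable cells. A hypothetical minimal example (not an actual file from nbs/):

````markdown
---
title: "An example notebook"   # hypothetical front matter
format: html
---

A prose cell, written as ordinary markdown.

```{python}
# an executable code cell
print("hello from a .qmd cell")
```
````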

With the right extensions and hotkeys, .qmd files are pleasant to develop inside VSCode and interop seamlessly with both git and AI tooling.

Deploying

Deploy to tutorial.amemory.net by pushing commits to the main branch after building the site locally.

uv run nbdev_export && uv run nbdev_docs && git add . && git commit -m "Update site" && git push
