Associative Memory Tutorial
A complete introduction to Associative Memories and Hopfield Networks
This website serves as a living companion to the tutorial manuscript and to the tutorial presentation at ICML 2025. It dreams of being a one-stop shop for learning all things about Associative Memory. It’s still working towards that.
The tutorial has happened! Check out the presentation recordings at the ICML website.
You can also download the slides from each of the speakers below:
- Dima’s slides (needs Keynote)
- Ben’s slides (unzip and open the index.html file in a local browser)
- Pari’s slides (needs PowerPoint)
Getting Started
Playing with the codebase
Website structure
Notebook demos
The website is a (growing) collection of notebook demos on Associative Memory. Each notebook is primarily a blog post on this site, but it is also fully runnable on Colab and as a raw .ipynb file using the uv environment setup below.
- Dense binary storage, also distributed as a Colab notebook and a raw .ipynb.
- Energy Transformer, also distributed as a Colab notebook and a raw .ipynb.
- Diffusion as Memory, also distributed as a Colab notebook and a raw .ipynb.
- Distributed Associative Memory, also distributed as a Colab notebook and a raw .ipynb.
See the overview in tutorials for a bit more detail.
To add new examples, edit the source tutorial notebooks (as either .ipynb or plain-text .qmd files) saved in nbs/tutorial/.
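As a minimal sketch (the notebook name below is hypothetical), adding a demo amounts to creating a new source file in that folder and previewing the site:
# Hypothetical file name; follow the naming pattern of the existing notebooks
touch nbs/tutorial/my_new_demo.qmd
# Render the site locally to check that the new post shows up
uv run nbdev_preview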
The notebooks are slow the first time you run them. We cache some of the long-running code after the first run, but the cache will not persist across Colab sessions.
Utility library
pip install amtutorial
We aim for simplicity and clarity in the notebooks. We therefore migrate some helper functions (particularly around loading and processing data, see nbs/lib/01_data_utils.qmd) to a PyPI package called amtutorial to avoid cluttering the notebooks. An added benefit is that all dependencies needed to run the notebooks can be installed with pip install amtutorial.
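A quick sanity check after installing (assuming the import name matches the PyPI name, which is how the package is referenced throughout this page):
pip show amtutorial               # prints the installed version and its dependencies
python -c "import amtutorial"     # the import should succeed without errors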
The website is built using an in-house fork of nbdev that builds everything in this tutorial from source .ipynb or .qmd files saved in nbs/. The website, PyPI package, and package documentation all come for free with nbdev. The in-house fork enables working with plain-text .qmd files instead of .ipynb files. With the right extensions and hotkeys, .qmd files are pleasant to develop inside VSCode and interoperate seamlessly with both git and AI tooling.
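In practice, the development loop from .qmd sources looks roughly like this (both commands also appear under Useful scripts below):
# Edit a .qmd source in nbs/, then preview the rendered site locally
uv run nbdev_preview
# Export the .qmd notebooks to .ipynb so the Colab copies stay in sync
bash scripts/export_qmd_as_ipynb.sh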
Installation
I just want to run the notebooks locally
pip install amtutorial
## Install torch to run the `diffusion as memory` notebook. CPU or CUDA versions work
# pip install torch --index-url https://download.pytorch.org/whl/cpu
## OPTIONAL: For rendering videos in notebooks, ffmpeg is needed. One way to install it is with conda:
# conda install conda-forge::ffmpeg conda-forge::openh264
Then open up the .ipynb notebooks in tutorial_ipynbs/ in your favorite notebook editor, using the same env where you installed amtutorial.
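For example, with Jupyter (one option among many notebook editors; shown here only as an illustrative sketch):
pip install jupyterlab                # only if you don't already have a notebook editor
cd tutorial_ipynbs/ && jupyter lab    # browse and run the exported notebooks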
I want to develop the website
Prerequisites
- Install uv using curl -LsSf https://astral.sh/uv/install.sh | sh
- Install quarto (see the sketch after this list)
- We use conda (or better yet, mamba) for managing the ffmpeg dependency, which only matters if ffmpeg is not already installed on your system.
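A consolidated sketch of the prerequisite installs (adapt to your platform; the quarto line simply points at the official installer page):
# uv (same command as in the list above)
curl -LsSf https://astral.sh/uv/install.sh | sh
# quarto: download an installer from https://quarto.org/docs/get-started/
# ffmpeg via conda, only needed if it is not already on your system
conda install conda-forge::ffmpeg conda-forge::openh264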
Setting up the environment
From the root of the repo:
uv sync
source .venv/bin/activate
# Expose venv to ipython
uv run ipython kernel install --user --env VIRTUAL_ENV $(pwd)/.venv --name=amtutorial
## Install torch to run the `diffusion as memory` notebook. CPU or CUDA versions work
# pip install torch --index-url https://download.pytorch.org/whl/cpu
# OPTIONAL: For rendering videos in notebooks
conda install conda-forge::ffmpeg conda-forge::openh264
Development pipelines
View a local version of the website with:
uv run nbdev_preview
Pushes to main deploy the website. The site will be live on GitHub after a few minutes.
git checkout main
# Update the website. Takes a moment even with cached training runs
make deploy && git add . && git commit -m "Update site" && git push
Make a patch-version update to the PyPI package (preferably, only if amtutorials/src was updated):
make pypi && uv run nbdev_pypi
Useful scripts (for reference only)
uv run nbdev_preview # Preview website locally
bash scripts/prep_website_deploy.sh # Sync dependencies, export qmd notebooks to ipynb for colab, and build website
bash scripts/export_qmd_as_ipynb.sh # Export qmd notebooks to ipynb for colab
uv run python scripts/sync_dependencies.py # Sync nbdev and pyproject.toml dependencies
uv run python scripts/prep_pypi.py # Bump patch version and sync dependencies
uv run nbdev_pypi # Push to pypi