

tensorflow-engram


Engram Neural Networks (ENNs): Hebbian Memory-Augmented Recurrent Networks

Biologically-inspired memory for TensorFlow/Keras.

Add Hebbian/engram learning to your neural networks with just a few lines of code.


Overview

tensorflow-engram provides Keras layers, models, and utilities for building neural networks with biologically-inspired memory mechanisms, including Hebbian plasticity, engram-like trace formation, attention, and sparse memory recall. This enables powerful sequence modeling, few-shot learning, continual learning, and analysis of memory traces within modern deep learning pipelines.

  • Seamless TensorFlow/Keras integration
  • Engram layers: RNN cells and wrappers with memory banks, plastic synapses, and sparsity
  • Hebbian learning: Fast local updates + gradient learning
  • Attention and sparsity: recall focuses on the most relevant memories
  • Trace monitoring: Visualize engram and memory trace evolution
  • Ready-to-use models for classification and regression

tensorflow-engram is currently in development and may not yet be ready for production use. We are actively seeking contributors to help us improve the package and expand its capabilities. If you are interested in contributing, please see our contributing guide.

TODO:

  • Add unit tests
  • Generate doc pages with Sphinx
  • Possibly rename repo?

Installation

pip install tensorflow-engram

Or install using conda:

conda install -c danielathome19 tensorflow-engram

Requirements:

  • Python 3.12+
  • TensorFlow 2.19+
  • Keras 3.10+
  • numpy, seaborn, matplotlib, pandas (for utilities and plotting)

Quickstart

Example: MNIST Classification with Engram Memory

from tensorflow.keras.datasets import mnist
from tensorflow.keras.utils import to_categorical
from sklearn.model_selection import train_test_split
from tensorflow_engram.models import EngramClassifier
from tensorflow_engram.utils import HebbianTraceMonitor, plot_hebbian_trace

# Prepare data: each 28x28 image is treated as a sequence of
# 28 timesteps with 28 features per step
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train = x_train.astype('float32') / 255.0
x_test  = x_test.astype('float32') / 255.0
y_train = to_categorical(y_train, 10)
y_test  = to_categorical(y_test, 10)
x_train = x_train.reshape(-1, 28, 28)
x_test  = x_test.reshape(-1, 28, 28)
x_train, x_val, y_train, y_val = train_test_split(x_train, y_train, test_size=0.1)

# Build model
model = EngramClassifier(
    input_shape=(28, 28),
    num_classes=10,
    hidden_dim=128,
    memory_size=64,
    return_states=True,
    hebbian_lr=0.05,
)

# Monitor Hebbian trace during training
trace_callback = HebbianTraceMonitor(x_train[:32], log_dir=None)

model.compile(
    optimizer='adam',
    loss='categorical_crossentropy',
    metrics=['accuracy']
)
model.fit(
    x_train, y_train,
    batch_size=128,
    epochs=10,
    validation_data=(x_val, y_val),
    callbacks=[trace_callback]
)

# Visualize trace evolution
plot_hebbian_trace(trace_callback)
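
After training, the model behaves like any other Keras model, so it can be evaluated on the held-out test set (assuming return_states=True leaves the compiled outputs unchanged, as in the training call above):

loss, acc = model.evaluate(x_test, y_test, batch_size=128)
print(f"Test accuracy: {acc:.3f}")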

Features

  • EngramCell: Biologically-inspired RNN cell with memory banks and Hebbian plasticity.
  • EngramNetwork: High-level Keras Model for sequence modeling.
  • Attention Layer: Optional attention mechanism for sequence summarization.
  • Trace Monitoring: Inspect and visualize memory trace evolution with built-in callbacks and plotting utilities.

API Highlights

Layers

  • EngramCell: Biologically-inspired RNN cell with memory banks, Hebbian trace, and sparsity regularization.

  • Engram: Wrapper for Keras models/networks using EngramCell.

  • EngramAttentionLayer: Optional attention over sequence outputs.

Models

  • EngramNetwork: General-purpose sequence model with configurable memory and plasticity.

  • EngramClassifier: Factory function for classification tasks.

  • EngramRegressor: Factory function for regression tasks (see the sketch below).
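
As a sketch, EngramRegressor can be used much like EngramClassifier in the Quickstart above; the parameters below mirror that example and are assumptions rather than a documented signature:

from tensorflow_engram.models import EngramRegressor

# Parameters assumed to mirror EngramClassifier (see Quickstart);
# check the source for the actual signature
model = EngramRegressor(
    input_shape=(28, 28),   # (timesteps, features)
    hidden_dim=128,
    memory_size=64,
)
model.compile(optimizer='adam', loss='mse', metrics=['mae'])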

Utilities

  • HebbianTraceMonitor: Keras callback for logging and visualizing Hebbian traces.

  • plot_hebbian_trace: Quick plotting of trace evolution and statistics.


How It Works

  • Memory Bank: Persistent, learnable memory vectors (engrams), updated via gradient descent.

  • Hebbian Trace: Rapidly updated, plastic component reflecting short-term memory, updated via local Hebbian learning.

  • Attention/Recall & Sparsity: Memories are retrieved by attention (cosine similarity + softmax), with sparsity constraints so only a few are activated per input, mimicking efficient biological memory recall; a toy sketch of these steps follows this list.

  • Trace Visualization: Built-in tools to monitor and understand the dynamics of memory during training.
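
A minimal numpy sketch of the Hebbian update and sparse recall steps, assuming a simple decaying outer-product rule and top-k selection (the package's actual update rules may differ):

import numpy as np

def hebbian_update(trace, pre, post, lr=0.05, decay=0.9):
    # Local Hebbian rule: decay the existing trace, then strengthen
    # connections between co-active pre- and post-synaptic units
    return decay * trace + lr * np.outer(pre, post)

def sparse_recall(query, memory_bank, k=3):
    # Cosine similarity between the query and each memory vector
    q = query / (np.linalg.norm(query) + 1e-8)
    m = memory_bank / (np.linalg.norm(memory_bank, axis=1, keepdims=True) + 1e-8)
    sims = m @ q
    # Sparsity: keep only the top-k matches, then softmax over them
    top = np.argsort(sims)[-k:]
    weights = np.exp(sims[top] - sims[top].max())
    weights /= weights.sum()
    # Read-out: sparse weighted sum over the selected engrams
    return weights @ memory_bank[top]

rng = np.random.default_rng(0)
bank = rng.normal(size=(64, 32))              # 64 engrams, 32 dims each
readout = sparse_recall(rng.normal(size=32), bank)
trace = hebbian_update(np.zeros((16, 32)), rng.normal(size=16), readout)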


Advanced Usage

You can customize the cell and models for your own tasks:

from tensorflow_engram.layers import EngramCell
from tensorflow.keras.layers import RNN, Input
from tensorflow.keras.models import Model

# Wrap the custom cell in a standard Keras RNN layer
cell = EngramCell(hidden_dim=64, memory_size=32)
inputs = Input(shape=(None, 16))  # variable-length sequences of 16 features
rnn_layer = RNN(cell, return_sequences=True)
outputs = rnn_layer(inputs)
model = Model(inputs, outputs)
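
To sanity-check the wiring, push a random batch through the assembled model. The expected output width assumes the cell's output size equals hidden_dim, which may not hold for every configuration:

import numpy as np

x = np.random.rand(8, 20, 16).astype('float32')  # 8 sequences, 20 steps, 16 features
y = model(x)
print(y.shape)  # expected (8, 20, 64) if the cell outputs hidden_dim units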

License

tensorflow-engram is licensed under the BSD 3-Clause License. See the LICENSE file for more information.


Citation

If you use this code for your research, please cite this project as:

@software{Szelogowski_tensorflow_engram_2025,
 author = {Szelogowski, Daniel},
 doi = {10.48550/arXiv.2507.21474},
 license = {BSD-3-Clause},
 month = {jul},
 title = {{tensorflow-engram: A Python package for Engram Neural Networks, adding biologically-inspired Hebbian memory and engram layers to TensorFlow/Keras models, supporting memory traces, plasticity, attention, and sparsity for neural sequence learning.}},
 url = {https://github.com/danielathome19/Engram-Neural-Network},
 version = {0.1.0},
 year = {2025}
}

or as the corresponding research paper:

@misc{Szelogowski_Hebbian_Memory_Augmented_Recurrent_Networks_2025,
 author = {Szelogowski, Daniel},
 doi = {10.48550/arXiv.2507.21474},
 month = {jul},
 title = {{Hebbian Memory-Augmented Recurrent Networks: Engram Neurons in Deep Learning}},
 url = {https://github.com/danielathome19/Engram-Neural-Network},
 year = {2025}
}
