BioLearn: Biologically Inspired Neural Network Modifications

Please visit the Documentation for further information, or refer to the Publication.

Table of Contents

  1. Project Description
  2. Key Features
  3. Installation Instructions
  4. Usage
  5. Contributing Guidelines
  6. License Information
  7. Publication

Project Description

BioLearn is a Python library that implements biologically inspired modifications to artificial neural networks, based on research on dendritic spine dynamics. It aims to explore and enhance the learning capabilities of neural networks by mimicking the plasticity and stability characteristics observed in biological synapses.

This project is primarily targeted at researchers and developers in the fields of machine learning and computational neuroscience who are interested in exploring bio-inspired approaches to augment neural network performance.

Key Features

BioLearn implements several biologically inspired methods, each mimicking specific aspects of neuronal behavior:

  1. rejuvenate_weights: Simulates spine turnover, replacing weak synapses with new ones.
  2. crystallize: Mimics synaptic stabilization, adjusting learning rates based on synaptic strength and activity.
  3. fuzzy_learning_rates: Implements synaptic scaling for network stability.
  4. weight_splitting: Replicates multi-synaptic connectivity between neuron pairs.
  5. volume_dependent_lr: Applies learning rates based on synaptic "volume", inspired by spine size-plasticity relationships.

These methods work in concert to create a learning process that more closely resembles the dynamics observed in biological neural networks, potentially leading to improved learning and generalization in artificial neural networks.
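
For orientation, the sketch below shows where these functions sit once a model has been converted (the Usage section gives the complete training pattern). It assumes BioConverter accepts an arbitrary nn.Module and that all five functions are exposed as methods on the converted model; only volume_dependent_lr and crystallize appear in the training example further down, so the remaining calls are left as comments.

import torch.nn as nn
from bio_transformations import BioConverter

# Convert a small model (assumes BioConverter handles any nn.Module).
model = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 1))
bio_model = BioConverter(base_lr=0.1, stability_factor=2.0, lr_variability=0.2)(model)

# During training, after loss.backward() and before optimizer.step():
#   bio_model.volume_dependent_lr()   # volume-dependent learning rates
#   bio_model.crystallize()           # synaptic stabilization
# Assumed to follow the same calling convention (not shown in the examples below):
#   bio_model.rejuvenate_weights()    # spine turnover
#   bio_model.fuzzy_learning_rates()  # synaptic scaling
#   bio_model.weight_splitting()      # multi-synaptic connectivity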

Installation Instructions

You can install BioLearn using pip or Conda, or build it from source.

Option 1: Using pip (Simplest Method)

pip install pytorch_bio_transformations

Option 2: Using Conda

conda create -n biolearn python=3.8
conda activate biolearn
conda install pytorch torchvision torchaudio cudatoolkit=11.3 -c pytorch
pip install pytorch_bio_transformations

Option 3: From Source (For Development or Latest Changes)

git clone https://github.com/CeadeS/pytorch_bio_transformations
cd pytorch_bio_transformations
pip install -r requirements.txt
pip install -e .

Verifying Installation

python -c "import bio_transformations; print(bio_transformations.__version__)"

Usage

Basic Usage Example

import torch
import torch.nn as nn
from bio_transformations import BioConverter

class SimpleModel(nn.Module):
    def __init__(self):
        super(SimpleModel, self).__init__()
        self.fc1 = nn.Linear(10, 20)
        self.fc2 = nn.Linear(20, 1)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        return self.fc2(x)

model = SimpleModel()
converter = BioConverter(base_lr=0.1, stability_factor=2.0, lr_variability=0.2)
bio_model = converter(model)  # the converted model now exposes the bio-inspired functions

Training Example

import torch.optim as optim

criterion = nn.MSELoss()
optimizer = optim.Adam(bio_model.parameters(), lr=0.001)

# num_epochs and data_loader are assumed to be defined for your dataset
for epoch in range(num_epochs):
    for batch in data_loader:
        inputs, targets = batch
        outputs = bio_model(inputs)
        loss = criterion(outputs, targets)
        
        optimizer.zero_grad()
        loss.backward()
        
        # Bio-inspired adjustments, applied after backward() and before the optimizer step
        bio_model.volume_dependent_lr()
        bio_model.crystallize()
        
        optimizer.step()
        
    print(f'Epoch [{epoch+1}/{num_epochs}], Loss: {loss.item():.4f}')

Adding Your Own Function

To add a new function to BioModule:

  1. Add the function to the BioModule class in bio_module.py.
  2. Add the function name to the exposed_functions list in BioModule.
  3. Update the BioConverter class in bio_converter.py if needed.
  4. Create a test case in test_biomodule.py.
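
As a purely hypothetical sketch of steps 1, 2 and 4, a new function might look like the following; the attribute name self.module.weight is an assumption made for illustration, not documented BioModule internals.

import torch

# Step 1: a new method added inside the BioModule class in bio_module.py.
def dampen_small_weights(self, threshold: float = 1e-3) -> None:
    """Toy example: halve weights whose magnitude falls below a threshold."""
    with torch.no_grad():
        weight = self.module.weight              # assumed attribute name
        weight[weight.abs() < threshold] *= 0.5

# Step 2: add "dampen_small_weights" to BioModule's exposed_functions so that
# converted models expose it, then cover it with a test in test_biomodule.py (step 4).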

Contributing Guidelines

We welcome contributions to BioLearn! Please follow these steps:

  1. Fork the repository and create your branch from main.
  2. Make changes and ensure all tests pass.
  3. Add tests for new functionality.
  4. Update documentation to reflect changes.
  5. Submit a pull request with a clear description of your changes.

Please adhere to the existing code style and include appropriate comments.

License Information

This project is licensed under the MIT License. See the LICENSE file for details.

Publication

For more detailed information about the project and its underlying research, please refer to our paper: [DOI]
