BioLearn: Biologically Inspired Neural Network Modifications

Please visit the Documentation for further information, or refer to the Publication below.

Table of Contents

  1. Project Description
  2. Key Features
  3. Installation Instructions
  4. Usage
  5. Contributing Guidelines
  6. License Information
  7. Publication

Project Description

BioLearn is a Python library that implements biologically inspired modifications to artificial neural networks, based on research on dendritic spine dynamics. It aims to explore and enhance the learning capabilities of neural networks by mimicking the plasticity and stability characteristics observed in biological synapses.

This project is primarily targeted at researchers and developers in the fields of machine learning and computational neuroscience who are interested in exploring bio-inspired approaches to augment neural network performance.

Key Features

BioLearn implements several biologically inspired methods, each mimicking specific aspects of neuronal behavior:

  1. rejuvenate_weights: Simulates spine turnover, replacing weak synapses with new ones.
  2. crystallize: Mimics synaptic stabilization, adjusting learning rates based on synaptic strength and activity.
  3. fuzzy_learning_rates: Implements synaptic scaling for network stability.
  4. weight_splitting: Replicates multi-synaptic connectivity between neuron pairs.
  5. volume_dependent_lr: Applies learning rates based on synaptic "volume", inspired by spine size-plasticity relationships.

These methods work in concert to create a learning process that more closely resembles the dynamics observed in biological neural networks, potentially leading to improved learning and generalization in artificial neural networks.
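As a quick illustration of the API, these methods are called directly on a converted model. The sketch below assumes a model already wrapped by BioConverter (see Usage for the full setup); only crystallize and volume_dependent_lr appear in the training example later in this README, so treat the other calls as illustrative.

# Minimal sketch: `bio_model` is assumed to be a model already converted by
# BioConverter (see the Usage section below).
bio_model.volume_dependent_lr()   # scale updates by synaptic "volume"
bio_model.crystallize()           # stabilize strong, active synapses
bio_model.rejuvenate_weights()    # replace weak synapses with new ones (illustrative)
bio_model.fuzzy_learning_rates()  # synaptic scaling for stability (illustrative)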

Installation Instructions

You can install BioLearn with pip, with Conda, or from source.

Option 1: Using pip (Simplest Method)

pip install pytorch_bio_transformations

Option 2: Using Conda

conda create -n biolearn python=3.8
conda activate biolearn
conda install pytorch torchvision torchaudio cudatoolkit=11.3 -c pytorch
pip install pytorch_bio_transformations

Option 3: From Source (For Development or Latest Changes)

git clone https://github.com/CeadeS/pytorch_bio_transformations
cd pytorch_bio_transformations
pip install -r requirements.txt
pip install -e .

Verifying Installation

python -c "import bio_transformations; print(bio_transformations.__version__)"

Usage

Basic Usage Example

import torch
import torch.nn as nn
from bio_transformations import BioConverter

class SimpleModel(nn.Module):
    def __init__(self):
        super(SimpleModel, self).__init__()
        self.fc1 = nn.Linear(10, 20)
        self.fc2 = nn.Linear(20, 1)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        return self.fc2(x)

model = SimpleModel()

# Convert the model: bio_model exposes the biologically inspired methods
# listed under Key Features.
converter = BioConverter(base_lr=0.1, stability_factor=2.0, lr_variability=0.2)
bio_model = converter(model)
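The converted model is still called like a regular PyTorch module (its parameters() are also passed to an optimizer in the training example below), so a quick smoke test might look like this, with the input size matching fc1 above:

# Quick check: the converted model still accepts ordinary tensors.
x = torch.randn(4, 10)   # batch of 4 samples with 10 features, matching fc1
y = bio_model(x)
print(y.shape)           # expected: torch.Size([4, 1])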

Training Example

import torch.optim as optim

criterion = nn.MSELoss()
optimizer = optim.Adam(bio_model.parameters(), lr=0.001)

# num_epochs and data_loader are assumed to be defined elsewhere.
for epoch in range(num_epochs):
    for batch in data_loader:
        inputs, targets = batch
        outputs = bio_model(inputs)
        loss = criterion(outputs, targets)

        optimizer.zero_grad()
        loss.backward()

        # Apply the bio-inspired adjustments between backward() and step(),
        # so they take effect before the optimizer updates the weights.
        bio_model.volume_dependent_lr()
        bio_model.crystallize()

        optimizer.step()

    print(f'Epoch [{epoch+1}/{num_epochs}], Loss: {loss.item():.4f}')

Adding Your Own Function

To add a new function to BioModule:

  1. Add the function to the BioModule class in bio_module.py.
  2. Add the function name to the exposed_functions list in BioModule.
  3. Update the BioConverter class in bio_converter.py if needed.
  4. Create a test case in test_biomodule.py.
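The snippet below is a self-contained, hypothetical illustration of the kind of function step 1 describes; it is not taken from bio_module.py, and a plain nn.Linear stands in for the layer that BioModule wraps. Follow the existing exposed functions in bio_module.py for the actual class structure and naming.

import torch
import torch.nn as nn

def my_new_function(layer: nn.Linear, noise_scale: float = 0.01) -> None:
    """Hypothetical bio-inspired rule: lightly perturb the layer's weights."""
    with torch.no_grad():
        layer.weight.add_(noise_scale * torch.randn_like(layer.weight))

# Steps 2-4: add "my_new_function" to the exposed_functions list in
# bio_module.py, adjust BioConverter in bio_converter.py if needed, and add a
# matching test in test_biomodule.py.
fc = nn.Linear(10, 20)
my_new_function(fc)  # the real method would act on the wrapped layer instead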

Contributing Guidelines

We welcome contributions to BioLearn! Please follow these steps:

  1. Fork the repository and create your branch from main.
  2. Make changes and ensure all tests pass.
  3. Add tests for new functionality.
  4. Update documentation to reflect changes.
  5. Submit a pull request with a clear description of your changes.

Please adhere to the existing code style and include appropriate comments.

License Information

This project is licensed under the MIT License. See the LICENSE file for details.

Publication

For more detailed information about the project and its underlying research, please refer to our paper: [DOI]
