BioLearn: Biologically Inspired Neural Network Modifications

Table of Contents

  1. Project Description
  2. Key Features
  3. Installation Instructions
  4. Usage
  5. Contributing Guidelines
  6. License Information
  7. Documentation

Project Description

BioLearn is a Python library that implements biologically inspired modifications to artificial neural networks, based on research on dendritic spine dynamics. It aims to explore and enhance the learning capabilities of neural networks by mimicking the plasticity and stability characteristics observed in biological synapses.

This project is primarily targeted at researchers and developers in the fields of machine learning and computational neuroscience who are interested in exploring bio-inspired approaches to augment neural network performance.

Key Features

BioLearn implements several biologically inspired methods, each mimicking specific aspects of neuronal behavior:

  1. rejuvenate_weights: Simulates spine turnover, replacing weak synapses with new ones.
  2. crystallize: Mimics synaptic stabilization, adjusting learning rates based on synaptic strength and activity.
  3. fuzzy_learning_rates: Implements synaptic scaling for network stability.
  4. weight_splitting: Replicates multi-synaptic connectivity between neuron pairs.
  5. volume_dependent_lr: Applies learning rates based on synaptic "volume", inspired by spine size-plasticity relationships.

These methods work in concert to create a learning process that more closely resembles the dynamics observed in biological neural networks, potentially leading to improved learning and generalization in artificial neural networks.
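
For intuition only (this is not BioLearn's implementation), spine turnover in the spirit of rejuvenate_weights can be sketched as reinitializing connections whose strength has decayed below a threshold; the threshold and reinitialization scheme below are arbitrary illustrative choices:

import torch
import torch.nn as nn

def spine_turnover_sketch(layer: nn.Linear, threshold: float = 1e-3) -> None:
    """Illustrative only: reinitialize weights weaker than `threshold`,
    loosely mimicking the replacement of pruned dendritic spines."""
    with torch.no_grad():
        weak = layer.weight.abs() < threshold              # "pruned" synapses
        fresh = torch.randn_like(layer.weight) * layer.weight.std()
        layer.weight[weak] = fresh[weak]                   # grow "new" synapses

In BioLearn itself these behaviors are exposed as methods on the converted model (see the training example below), so a manual helper like this is not needed.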

Installation Instructions

You can install BioLearn using pip, Conda, or from source.

Option 1: Using pip (Simplest Method)

pip install pytorch_bio_transformations

Option 2: Using Conda

conda create -n biolearn python=3.8
conda activate biolearn
conda install pytorch torchvision torchaudio cudatoolkit=11.3 -c pytorch
pip install pytorch_bio_transformations
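
Note: the cudatoolkit=11.3 pin is only an example; choose the CUDA version that matches your driver, or install the CPU-only PyTorch build if you do not have a CUDA-capable GPU.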

Option 3: From Source (For Development or Latest Changes)

git clone https://github.com/CeadeS/BioLearn.git
cd BioLearn
pip install -r requirements.txt
pip install -e .

Verifying Installation

python -c "import bio_transformations; print(bio_transformations.__version__)"

Usage

Basic Usage Example

import torch
import torch.nn as nn
from bio_transformations import BioConverter

class SimpleModel(nn.Module):
    def __init__(self):
        super(SimpleModel, self).__init__()
        self.fc1 = nn.Linear(10, 20)
        self.fc2 = nn.Linear(20, 1)

    def forward(self, x):
        x = torch.relu(self.fc1(x))
        return self.fc2(x)

model = SimpleModel()
converter = BioConverter(base_lr=0.1, stability_factor=2.0, lr_variability=0.2)
bio_model = converter(model)
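
The converted model is called like any other nn.Module, as the training loop below also shows. A quick forward-pass check with an input matching SimpleModel:

x = torch.randn(4, 10)   # batch of 4 samples, 10 features each
output = bio_model(x)    # forward pass through the converted model
print(output.shape)      # torch.Size([4, 1])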

Training Example

import torch.optim as optim
from torch.utils.data import DataLoader, TensorDataset

criterion = nn.MSELoss()
optimizer = optim.Adam(bio_model.parameters(), lr=0.001)

num_epochs = 10  # number of epochs for this example

# Toy data for illustration; substitute your own DataLoader
dataset = TensorDataset(torch.randn(100, 10), torch.randn(100, 1))
data_loader = DataLoader(dataset, batch_size=16)

for epoch in range(num_epochs):
    for batch in data_loader:
        inputs, targets = batch
        outputs = bio_model(inputs)
        loss = criterion(outputs, targets)
        
        optimizer.zero_grad()
        loss.backward()
        
        bio_model.volume_dependent_lr()
        bio_model.crystallize()
        
        optimizer.step()
        
    print(f'Epoch [{epoch+1}/{num_epochs}], Loss: {loss.item():.4f}')
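
Note that in this loop the bio-inspired updates (volume_dependent_lr() and crystallize()) are invoked after loss.backward(), once gradients are available, and before optimizer.step(), i.e. before the parameters are updated.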

Adding Your Own Function

To add a new function to BioModule (a minimal sketch follows these steps):

  1. Add the function to the BioModule class in bio_module.py.
  2. Add the function name to the exposed_functions list in BioModule.
  3. Update the BioConverter class in bio_converter.py if needed.
  4. Create a test case in test_biomodule.py.
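
A rough sketch of steps 1 and 2 (illustrative only; the exposed_functions layout and method signature shown here are assumptions, so follow the existing entries in bio_module.py for the exact pattern):

# bio_module.py (partial, illustrative sketch)
class BioModule:
    exposed_functions = [
        # ... existing entries such as "rejuvenate_weights", "crystallize" ...
        "my_new_function",                     # step 2: expose the new function
    ]

    def my_new_function(self) -> None:
        """Step 1: implement your biologically inspired modification here."""
        pass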

Contributing Guidelines

We welcome contributions to BioLearn! Please follow these steps:

  1. Fork the repository and create your branch from main.
  2. Make changes and ensure all tests pass.
  3. Add tests for new functionality.
  4. Update documentation to reflect changes.
  5. Submit a pull request with a clear description of your changes.

Please adhere to the existing code style and include appropriate comments.

License Information

This project is licensed under the MIT License. See the LICENSE file for details.

Documentation

For more detailed information about the project and its underlying research, please refer to our paper: [DOI]

