pytorch implementation of Stigmergic Neural Networks

Project description

torchsnn

pytorch implementation of the Stigmergic Neural Networks as presented in the paper Using stigmergy to incorporate the time into artificial neural networks.

This package was written with the aim of making it as easy as possible to integrate Stigmergic Neural Networks into existing models.

You can safely mix native pytorch Modules with ours.
The only catch is that you should use StigmergicModule (which extends pytorch's Module) as the base class for your models, so that you are able to tick() and reset() them.

Implementing our proposed architecture to solve MNIST becomes as easy as:

import torch
import torchsnn

net = torchsnn.Sequential(
    torchsnn.SimpleLayer(28, 10),
    torchsnn.FullLayer(10, 10),
    torchsnn.TemporalAdapter(10, 28),
    torch.nn.Linear(10*28, 10),
    torch.nn.Sigmoid()
)

You can train a StigmergicModule just as you would a pytorch Module, but don't forget to reset() and tick() it!

optimizer = torch.optim.Adam(net.parameters(), lr=0.001)

for i in range(0, N):  # N training epochs
    for X, Y in zip(dataset_X, dataset_Y):
        net.reset()            # clear the stigmergic state before each sequence
        optimizer.zero_grad()  # clear the gradients from the previous step
        out = None
        for xi in X:           # feed the sequence one time step at a time
            out = net(torch.tensor(xi, dtype=torch.float32))
            net.tick()         # advance the stigmergic dynamics

        loss = ((Y - out)**2).sum()  # squared error on the last output

        loss.backward()
        optimizer.step()

Does it support batch inputs?

Yes! If you forward a batch of inputs into a StigmergicModule, it will return a batch of outputs.

for t in range(0, num_ticks):
    batch_out[0], batch_out[1], ... = net(torch.tensor([batch_in[0][t], batch_in[1][t], ...]))
    net.tick()
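
For example, here is a minimal sketch of batched inference, reusing the MNIST-style network from above and assuming the batch dimension comes first (the batch size and random data are placeholders):

import torch
import torchsnn

net = torchsnn.Sequential(
    torchsnn.SimpleLayer(28, 10),
    torchsnn.FullLayer(10, 10),
    torchsnn.TemporalAdapter(10, 28),
    torch.nn.Linear(10*28, 10),
    torch.nn.Sigmoid()
)

B, num_ticks = 32, 28
batch_in = torch.rand(B, num_ticks, 28)   # placeholder data: B sequences of 28 rows

net.reset()
for t in range(num_ticks):
    batch_out = net(batch_in[:, t, :])    # shape (B, 28) in, shape (B, 10) out
    net.tick()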

Can it run on CUDA?

Yes, and it works just as you would expect from a pytorch Module!
You just need to call the to(device) method on a model to move it into GPU memory.

device = torch.device("cuda")

net = net.to(device)

net(torch.tensor(..., device=device))
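
A complete sketch might look like the following, reusing the MNIST-style network from above and assuming the stigmergic state follows the parameters when you call to(device); the input data is a placeholder:

import torch
import torchsnn

device = torch.device("cuda")

net = torchsnn.Sequential(
    torchsnn.SimpleLayer(28, 10),
    torchsnn.FullLayer(10, 10),
    torchsnn.TemporalAdapter(10, 28),
    torch.nn.Linear(10*28, 10),
    torch.nn.Sigmoid()
).to(device)

x = torch.rand(28, device=device)  # placeholder input created on the GPU

net.reset()
out = net(x)
net.tick()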

Documentation

torchsnn.StigmergicModule

Base class for a stigmergic network or layer.
If you are writing your own StigmergicModule you have to implement the functions

  • tick()
  • reset()

If it contains other StigmergicModules, it will propagate these calls through its subtree.
For example, if you want to build a network with a Linear and a SimpleLayer, you can do something like:

import torch
import torchsnn

class TestNetwork(torchsnn.StigmergicModule):
    def __init__(self):
        torchsnn.StigmergicModule.__init__(self)
        self.linear = torch.nn.Linear(2,5)
        self.stigmergic = torchsnn.SimpleLayer(5,2)

    def forward(self, input):
        l1 = torch.sigmoid(self.linear(input))
        l2 = self.stigmergic(l1)
        return l2

net = TestNetwork()
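
If your module holds its own temporal state, implementing tick() and reset() yourself might look like the sketch below; the state buffer and its update rule are illustrative assumptions, not part of torchsnn:

import torch
import torchsnn

class CustomStigmergicLayer(torchsnn.StigmergicModule):
    def __init__(self, in_features, out_features):
        torchsnn.StigmergicModule.__init__(self)
        self.linear = torch.nn.Linear(in_features, out_features)
        self.state = None          # illustrative temporal state
        self.last_output = None

    def forward(self, input):
        out = torch.sigmoid(self.linear(input))
        if self.state is not None:
            out = out + self.state # the accumulated state biases the output
        self.last_output = out
        return out

    def tick(self):
        # accumulate the last output as a (hypothetical) stimulus
        if self.last_output is not None:
            stimulus = self.last_output.detach()
            self.state = stimulus if self.state is None else self.state + stimulus

    def reset(self):
        # clear the temporal state between sequences
        self.state = None
        self.last_output = None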

torchsnn.Sequential

Function with the same interface as torch.nn.Sequential for building sequential networks.
The same network as in the previous example can be built with:

import torch
import torchsnn

net = torchsnn.Sequential(
    torch.nn.Linear(2,5),
    torch.nn.Sigmoid(),
    torchsnn.SimpleLayer(5,2)
)

torchsnn.SimpleLayer

In this layer only the thresholds are stigmergic variables, and their stimuli are the output values.
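
For example, a minimal usage sketch (the layer sizes and random data below are placeholders) that feeds a short sequence through a SimpleLayer:

import torch
import torchsnn

layer = torchsnn.SimpleLayer(4, 3)   # 4 inputs, 3 outputs (arbitrary sizes)
sequence = torch.rand(5, 4)          # placeholder sequence of 5 time steps

layer.reset()                        # clear the stigmergic thresholds
for x in sequence:
    y = layer(x)                     # output depends on the thresholds' current state
    layer.tick()                     # the outputs act as stimuli on the thresholds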


torchsnn.FullLayer

In this layer both thresholds and weights are stigmergic variables, and their stimuli are, respectively, the output values and the input values.
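
As a quick, illustrative way to observe the stigmergic behaviour, you could forward the same input before and after a tick() and compare the outputs; the sizes and data below are placeholder assumptions:

import torch
import torchsnn

layer = torchsnn.FullLayer(3, 3)     # arbitrary placeholder sizes
x = torch.rand(3)

layer.reset()
y_before = layer(x)
layer.tick()                         # inputs and outputs act as stimuli on weights and thresholds
y_after = layer(x)                   # same input, but the stigmergic variables have changed
print(y_before, y_after)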


Citing

We can't wait to see what you will build with Stigmergic Neural Networks!
When you publish your work, you can use this BibTeX entry to cite us :)

@article{galatolo_snn
,	author	= {Galatolo, Federico A and Cimino, Mario GCA and Vaglini, Gigliola}
,	title	= {Using stigmergy to incorporate the time into artificial neural networks}
,	journal	= {MIKE 2018}
,	year	= {2018}
}

Contributing

This code is released under GNU/GPLv3, so feel free to fork it and submit your changes; every PR helps.
If you need help using it or have any questions, please reach me at galatolo.federico@gmail.com or on Telegram @galatolo

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

torchsnn-0.1.0.tar.gz (5.6 kB)

Uploaded Source

File details

Details for the file torchsnn-0.1.0.tar.gz.

File metadata

  • Download URL: torchsnn-0.1.0.tar.gz
  • Upload date:
  • Size: 5.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.12.1 pkginfo/1.4.2 requests/2.19.1 setuptools/40.4.3 requests-toolbelt/0.8.0 tqdm/4.28.1 CPython/3.7.0

File hashes

Hashes for torchsnn-0.1.0.tar.gz

  • SHA256: 40e78cece7bb1d11a0cce5e833e971653fa5f2752447ae79ad6e45bf018106d3
  • MD5: c102862853536338df022aaf16a32e51
  • BLAKE2b-256: f238d6852634b5d8c0805c6bb699f4423fa621443a277cf127356ad915640e44
