
Project description

coinstac-dinunet

Distributed Neural Network implementation on COINSTAC.


pip install coinstac-dinunet

Specify the packages your computation depends on, such as pytorch and torchvision, in a requirements.txt file (a sketch follows below).
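For example, a minimal requirements.txt could look like the following (illustrative only; the exact packages and versions depend on your computation):

# illustrative only; pin versions as your computation requires
coinstac-dinunet
torch
torchvision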

Highlights:

1. Handles multi-network/complex training schemes.
2. Automatic data splitting/k-fold cross validation.
3. Automatic model checkpointing.
4. GPU-enabled local sites.
5. Customizable metrics (with automatic serialization between nodes) that work with any scheme.
6. Any custom reduction and learning mechanism can be integrated by extending coinstac_dinunet.distrib.reducer/learner (see the sketch after this list).
7. Realtime profiling of each site by specifying it in the compspec file (see the dinune_fsv example below for details).
...
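A rough sketch of point 6 follows. The export names (COINNLearner, COINNReducer) and the step()/reduce() hooks below are assumptions inferred from the coinstac_dinunet.distrib module layout, not a documented contract; check the package source for the actual interface.

from coinstac_dinunet.distrib import COINNLearner, COINNReducer  # assumed export names


class MyLearner(COINNLearner):
    def step(self):
        # Assumed hook: run the default local update, then customize its output.
        out = super().step()
        # ... custom local learning logic here ...
        return out


class MyReducer(COINNReducer):
    def reduce(self):
        # Assumed hook: combine the payloads received from the local sites.
        out = super().reduce()
        # ... custom aggregation logic here ...
        return out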

DINUNET

Working examples:

  1. FreeSurfer volumes classification.
  2. VBM 3D images classification.

Running an analysis in the COINSTAC app.

Add a new NN computation to COINSTAC (Development guide):

imports

import torch
import torch.nn.functional as F

from coinstac_dinunet import COINNDataset, COINNTrainer, COINNLocal
from coinstac_dinunet.metrics import COINNAverages, Prf1a

1. Define Data Loader

class MyDataset(COINNDataset):
    def __init__(self, **kw):
        super().__init__(**kw)
        self.labels = None

    def load_index(self, id, file):
        data_dir = self.path(id, 'data_dir') # data_dir comes from inputspecs.json
        ...
        self.indices.append([id, file])

    def __getitem__(self, ix):
        id, file = self.indices[ix]
        data_dir = self.path(id, 'data_dir') # data_dir comes from inputspecs.json
        label_dir = self.path(id, 'label_dir') # label_dir comes from inputspecs.json
        ...
        # Logic to load, transform single data item.
        ...
        return {'inputs': ..., 'labels': ...}
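As a purely hypothetical filling of the skeleton above (the CSV layout, file names, and pandas usage are assumptions, not part of coinstac_dinunet), a dataset that reads one feature file per subject could look like:

import os

import pandas as pd
import torch


class CSVDataset(COINNDataset):
    def __init__(self, **kw):
        super().__init__(**kw)
        self.labels = None

    def load_index(self, id, file):
        data_dir = self.path(id, 'data_dir')  # data_dir comes from inputspecs.json
        if self.labels is None:
            # Hypothetical labels.csv mapping each feature file to a class label.
            self.labels = pd.read_csv(os.path.join(data_dir, 'labels.csv'), index_col=0)
        self.indices.append([id, file])

    def __getitem__(self, ix):
        id, file = self.indices[ix]
        data_dir = self.path(id, 'data_dir')
        features = pd.read_csv(os.path.join(data_dir, file)).values.astype('float32').squeeze()
        label = int(self.labels.loc[file, 'label'])
        return {'inputs': torch.from_numpy(features),
                'labels': torch.tensor(label).long()}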

2. Define Trainer

class MyTrainer(COINNTrainer):
    def __init__(self, **kw):
        super().__init__(**kw)

    def _init_nn_model(self):
        self.nn['model'] = MYModel(in_size=self.cache['input_size'], out_size=self.cache['num_class'])

    def iteration(self, batch):
        inputs, labels = batch['inputs'].to(self.device['gpu']).float(), batch['labels'].to(self.device['gpu']).long()

        out = F.log_softmax(self.nn['model'](inputs), 1)
        loss = F.nll_loss(out, labels)
        _, predicted = torch.max(out, 1)
        score = self.new_metrics()
        score.add(predicted, labels)
        val = self.new_averages()
        val.add(loss.item(), len(inputs))
        return {'out': out, 'loss': loss, 'averages': val,
                'metrics': score, 'prediction': predicted}
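_init_nn_model above references MYModel, which is the user's own network and not part of coinstac_dinunet. A minimal hypothetical stand-in (a small fully connected classifier that returns raw logits, which iteration() then passes through log_softmax) could be:

import torch.nn as nn


class MYModel(nn.Module):
    def __init__(self, in_size, out_size):
        super().__init__()
        # Hypothetical architecture; replace with the network your computation needs.
        self.net = nn.Sequential(
            nn.Linear(in_size, 64),
            nn.ReLU(),
            nn.Linear(64, out_size),
        )

    def forward(self, x):
        return self.net(x)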

3. Add entries to:

  • Local node entry point CPU, GPU
  • Aggregator node entry point CPU, GPU
  • compspec.json file CPU, GPU

Advanced use cases:

Project details


Release history

Download files

Download the file for your platform.

Source Distribution

coinstac-dinunet-2.4.7.tar.gz (33.8 kB)


File details

Details for the file coinstac-dinunet-2.4.7.tar.gz.

File metadata

  • Download URL: coinstac-dinunet-2.4.7.tar.gz
  • Upload date:
  • Size: 33.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.8 tqdm/4.63.0 importlib-metadata/4.11.3 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.10.2

File hashes

Hashes for coinstac-dinunet-2.4.7.tar.gz:

  • SHA256: b0be83a3ca66302b3f8670b05e0158b7728d9e851728f88c84e3aa00d792deef
  • MD5: 2dbddba01bfee313951882c025c823d4
  • BLAKE2b-256: dab3d00a53aa3cc38e6f3de71b7759cb98adf75467d372b63c3923e81b321856

