
Project description

Nahual: Deploy and access image and data processing models across environments/processes.

Note that this is early work in progress.

This tool aims to be a one-stop shop for multiple models that process imaging data or their derivatives. You can think of it as a much simpler Ollama, but for biological analyses, deep learning-based or otherwise.

Implemented models and tools

By default, the models and tools are deployable using Nix.

  • BABY: Segmentation, tracking and lineage assignment for budding yeast.
  • Cellpose: Generalist segmentation model.
  • DINOv2: Generalist self-supervised model to obtain visual features.
  • Trackastra: Transformer-based cell tracking models trained on a multitude of datasets.

WIP

  • DINOv3: Generalist self-supervised model, latest iteration.

Usage

Step 1: Deploy server

Change into the directory of the model you want to deploy. Here we test the image-embedding model DINOv2.

git clone https://github.com/afermg/dinov2.git
cd dinov2
nix develop --command bash -c "python server.py ipc:///tmp/dinov2.ipc"

Step 2: Run client

Once the server is running, you can call it from a different Python script.

import numpy

from nahual.client.dinov2 import load_model, process_data

address = "ipc:///tmp/dinov2.ipc"  # must match the address the server was started with

# Load models server-side
parameters = {"repo_or_dir": "facebookresearch/dinov2", "model": "dinov2_vits14_lc"}
load_model(parameters, address=address)

# Define custom data
data = numpy.random.random_sample((1, 3, 420, 420))
result = process_data(data, address=address)

You can press Ctrl-C in the terminal where the server is running to kill it. A way to kill the server from within the client is planned.

Adding support for new models

Any model requires a thin layer that communicates over nng. Trackastra's server and client are an example.

Roadmap

  • Support multiple instances of a model loaded in memory server-side.
  • Formalize supported packet formats (e.g., numpy arrays, dictionaries).
  • Increase number of supported models/methods.
  • Document server-side API.
  • Integrate into the aliby pipelining framework.
  • Support containers that wrap the Nix derivations.

Why nahual?

In Mesoamerican folklore, a Nahual is a shaman able to transform into different animals.

Download files

Download the file for your platform.

Source Distribution

nahual-0.0.5.tar.gz (9.8 kB)


Built Distribution


nahual-0.0.5-py3-none-any.whl (11.8 kB)


File details

Details for the file nahual-0.0.5.tar.gz.

File metadata

  • Download URL: nahual-0.0.5.tar.gz
  • Upload date:
  • Size: 9.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.8.2

File hashes

Hashes for nahual-0.0.5.tar.gz
  • SHA256: f028f8be4b22adcab80515a73c8f1b35721c45d199cccc440c325627c90e46d2
  • MD5: 4f2ead6be5f0b81cc1488d3dc906c0e1
  • BLAKE2b-256: 9173fcadc95b551dff624a1b03ba3fee55f9a3788327a87e1bd9504127d18ce7


File details

Details for the file nahual-0.0.5-py3-none-any.whl.

File metadata

  • Download URL: nahual-0.0.5-py3-none-any.whl
  • Upload date:
  • Size: 11.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.8.2

File hashes

Hashes for nahual-0.0.5-py3-none-any.whl
  • SHA256: e328cae14f7b6cdfbff90d2bac5981ad62c1afe80a4c2dc110aefa2fd80356e2
  • MD5: 85f67f6778f042c31aa11d968d831855
  • BLAKE2b-256: fd39598c34ea8b17e3b5ea1db5258979668d7f4782d7bff1dcf801d98acc0150

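You can check a downloaded file against the published SHA256 digest locally. A minimal sketch with the standard library's hashlib (the helper name is our own; the digest is the one listed above for the source distribution):

```python
# Compute a file's SHA256 in streaming fashion, so large artifacts
# are never loaded fully into memory.
import hashlib


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Return the hex SHA256 digest of the file at `path`."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Example check against the digest published above:
# assert sha256_of("nahual-0.0.5.tar.gz") == (
#     "f028f8be4b22adcab80515a73c8f1b35721c45d199cccc440c325627c90e46d2"
# )
```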
