
Project description

Nahual: Deploy and access image and data processing models across environments/processes.

Note that this is an early work in progress.

This tool aims to be a one-stop shop for multiple models that process imaging data or their derivatives. You can think of it as a much simpler ollama, but for biological analyses, deep learning-based or otherwise.

Implemented tools

By default, the models and tools are deployable using Nix.

  • trackastra: Transformer-based cell-tracking models trained on a multitude of datasets.
  • DINOv2: General-purpose self-supervised model for extracting visual features.

WIP tools

  • Baby: Segmentation, tracking and lineage assignment for budding yeast.
  • Other models and methods (to be defined)

Usage

Step 1: Deploy server

cd into the directory of the model you want to deploy. Here we will test the image-embedding model DINOv2.

git clone https://github.com/afermg/dinov2.git
cd dinov2
nix develop --command bash -c "python server.py ipc:///tmp/example_name.ipc"

Step 2: Run client

Once the server is running, you can call it from a different Python process.

import numpy

from nahual.client.dinov2 import load_model, process_data

# Address of the running server (must match the one passed to server.py)
address = "ipc:///tmp/example_name.ipc"

# Load the model server-side
parameters = {"repo_or_dir": "facebookresearch/dinov2", "model": "dinov2_vits14_lc"}
load_model(parameters, address=address)

# Send custom data (batch, channels, height, width) and retrieve the result
data = numpy.random.random_sample((1, 3, 420, 420))
result = process_data(data, address=address)

Press Ctrl-C in the terminal where the server is running to kill it. We will also add a way to stop the server from within the client.

Adding support for new models

Any model requires a thin layer that communicates over nng. See trackastra's server and client for an example.
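
The thin layer boils down to a request/response loop: the client sends a serialized request naming a task and carrying a payload, and the server dispatches to the model and returns a serialized result. Below is a minimal, transport-agnostic sketch of that dispatch. The function name, task names, envelope layout, and pickle serialization are illustrative assumptions, not Nahual's actual wire format; in the real servers this logic sits inside a loop over a listening nng socket (receive bytes, handle, send bytes back).

```python
import pickle

import numpy

# Hypothetical server-side state; a real server would keep the loaded model here.
_STATE = {"model": None}

def handle_request(raw: bytes) -> bytes:
    """Unpack one request, dispatch by task name, and pack the reply.

    Assumed request envelope: {"task": str, "payload": object}.
    """
    request = pickle.loads(raw)
    task, payload = request["task"], request["payload"]
    if task == "load_model":
        _STATE["model"] = payload  # e.g., a parameters dict for the model loader
        result = "loaded"
    elif task == "process_data":
        # Stand-in for model inference: mean over the channel axis.
        result = numpy.asarray(payload).mean(axis=1)
    else:
        result = f"unknown task: {task}"
    return pickle.dumps(result)

# Client side uses the same envelope, sent over an nng socket in the real tool.
reply = pickle.loads(
    handle_request(pickle.dumps({"task": "load_model", "payload": {"model": "demo"}}))
)
```

Keeping the dispatch separate from the transport makes the same handler reusable over any nng address (ipc://, tcp://, and so on).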

Roadmap

  • Support multiple instances of a model loaded in memory server-side.
  • Formalize supported packet formats (e.g., NumPy arrays, dictionaries).
  • Increase number of supported models/methods.
  • Document server-side API.
  • Integrate into the aliby pipelining framework.
  • Support containers that wrap the Nix derivations.
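
On the packet-format item above: one simple way to make a NumPy-array packet portable is the .npy format, which preserves dtype and shape. This is a sketch of that option, not Nahual's chosen format; the helper names are illustrative.

```python
import io

import numpy

def pack_array(arr: numpy.ndarray) -> bytes:
    """Serialize an array, dtype and shape included, to the .npy byte format."""
    buf = io.BytesIO()
    numpy.save(buf, arr, allow_pickle=False)
    return buf.getvalue()

def unpack_array(raw: bytes) -> numpy.ndarray:
    """Reconstruct the array from .npy bytes."""
    return numpy.load(io.BytesIO(raw), allow_pickle=False)

# Round-trip the same kind of payload the client example sends.
data = numpy.random.random_sample((1, 3, 420, 420))
roundtrip = unpack_array(pack_array(data))
```

Disabling pickle keeps the packet limited to plain array data, which is safer when the bytes arrive over a socket.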

Why nahual?


In Mesoamerican folklore, a Nahual is a shaman able to transform into different animals.
