Wet lab automation as Python code

Project description

biocompute

Wet lab automation as Python code. Maintained by London Biocompute.

biocompute is a framework that lets you write wet lab experiments as plain Python. Define your protocol with calls like well.fill(), well.mix(), and well.image(), then execute it on real lab hardware that handles the liquid dispensing, mixing, and imaging automatically. No drag-and-drop GUIs, no vendor lock-in, no manual pipetting.

If you know Python, you can run wet lab experiments.


Quick start

Create a virtual environment and install the biocompute package.

python -m venv .venv
source .venv/bin/activate
pip install biocompute

Create a file called super_simple_experiment.py containing the following code.

from biocompute import wells, red_dye, green_dye, blue_dye

def experiment():
    for well in wells(count=3):
        well.fill(vol=80.0, reagent=red_dye)
        well.fill(vol=40.0, reagent=green_dye)
        well.fill(vol=20.0, reagent=blue_dye)
        well.mix()
        well.image()

Submit the experiment to the job server.

biocompute submit super_simple_experiment.py --follow

And that's it. Results stream back to your terminal as experiments finish executing on the physical hardware.


How it works

Your experiment function describes intent. The compiler takes this high-level declarative code and turns it into a fully scheduled, hardware-specific protocol. It handles:

  • Automatic parallelism — independent operations are identified and scheduled concurrently so protocols finish faster without any manual orchestration.
  • Plate layout — wells are assigned to physical plates based on thermal constraints. Multi-temperature experiments get split across plates automatically.
  • Operation collapsing — redundant per-well instructions (like 96 identical incubations) are collapsed into single plate-level commands.
  • Device mapping — every operation is matched to the right piece of hardware (pipette, camera, incubator, gripper) based on a capability model, so swapping equipment never means rewriting your protocol.
  • Multi-plate scaling — protocols that exceed a single plate are transparently distributed across as many plates as needed.

You describe what should happen. The compiler figures out how to make it fast.
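As a rough illustration of operation collapsing (a toy model for intuition, not the biocompute compiler itself): identical per-well instructions can be grouped by their parameters, and any group that covers many wells becomes a single plate-level command.

```python
from collections import Counter

def collapse(ops):
    """Group identical per-well operations into plate-level commands.

    Each op is a (name, params) tuple. If many wells issue the same
    op with the same params, it is emitted once at plate level.
    """
    counts = Counter(ops)
    plan = []
    for (name, params), n in counts.items():
        if n > 1:
            plan.append(f"plate.{name}({params})  # replaces {n} per-well calls")
        else:
            plan.append(f"well.{name}({params})")
    return plan

# 96 identical incubations collapse into one plate-level command
ops = [("incubate", "37C, 30min")] * 96
print(collapse(ops))
```

The same idea scales to the other passes: the compiler inspects the stream of declared operations and rewrites it, rather than executing each call literally.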

Operations

Method                   What it does
well.fill(vol, reagent)  Dispense vol µL of reagent into the well
well.mix()               Mix the well contents
well.image()             Capture an image of the well

wells(count=n) yields n wells. Multiple calls produce non-overlapping wells.
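The non-overlap guarantee can be pictured with a toy allocator (an illustrative sketch, not biocompute's implementation): successive calls draw from one shared counter, so no index is ever handed out twice.

```python
import itertools

class Plate:
    """Toy stand-in for the wells() allocator: every call to wells()
    draws from the same counter, so returned wells never overlap."""
    def __init__(self):
        self._counter = itertools.count()

    def wells(self, count):
        return [next(self._counter) for _ in range(count)]

plate = Plate()
a = plate.wells(count=3)   # [0, 1, 2]
b = plate.wells(count=3)   # [3, 4, 5] -- disjoint from the first call
assert not set(a) & set(b)
```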

Reagents

Import the built-in reagents you need.

from biocompute import red_dye, green_dye, blue_dye, water

Because it's just Python

Use numpy. Use scipy. Use whatever. The system only sees wells and operations.

Colour sweep

Sweep red dye volume across ten wells using numpy to generate the range.

import numpy as np
from biocompute import wells, red_dye, green_dye, blue_dye

def experiment():
    for well, r in zip(wells(count=10), np.linspace(10, 100, 10)):
        well.fill(vol=r, reagent=red_dye)
        well.fill(vol=50.0, reagent=green_dye)
        well.fill(vol=50.0, reagent=blue_dye)
        well.mix()
        well.image()

Closed-loop optimisation

Submit an experiment, read results, use them to parameterise the next one.

import numpy as np
from scipy.interpolate import interp1d
from scipy.optimize import minimize_scalar
from biocompute import Client, wells, red_dye, green_dye

with Client() as client:
    volumes = np.linspace(10, 100, 8)

    def experiment_sweep():
        for well, v in zip(wells(count=8), volumes):
            well.fill(vol=v, reagent=red_dye)
            well.fill(vol=50.0, reagent=green_dye)
            well.mix()
            well.image()

    result = client.submit(experiment_sweep)

    model = interp1d(volumes, result.result_data["scores"], kind="cubic")
    optimum = minimize_scalar(model, bounds=(10, 100), method="bounded").x

    def experiment_refine():
        for well, v in zip(wells(count=5), np.linspace(optimum - 10, optimum + 10, 5)):
            well.fill(vol=v, reagent=red_dye)
            well.fill(vol=50.0, reagent=green_dye)
            well.mix()
            well.image()

    final = client.submit(experiment_refine)

Project details


Download files

Download the file for your platform.

Source Distribution

biocompute-0.1.3.tar.gz (46.2 kB)

Uploaded Source

Built Distribution

biocompute-0.1.3-py3-none-any.whl (18.9 kB)

Uploaded Python 3

File details

Details for the file biocompute-0.1.3.tar.gz.

File metadata

  • Download URL: biocompute-0.1.3.tar.gz
  • Size: 46.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.9

File hashes

Hashes for biocompute-0.1.3.tar.gz
Algorithm Hash digest
SHA256 88d3e6ac5a0185f8282f49a019fc5475bbc201ed114eac8984067ac87df883f4
MD5 435ab758103f1038c169590c522250ee
BLAKE2b-256 361eadcc6ccc7ddf5ca32dfa3661db9e49210ffd3d46ce8bee785568d01ff973


File details

Details for the file biocompute-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: biocompute-0.1.3-py3-none-any.whl
  • Size: 18.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.9

File hashes

Hashes for biocompute-0.1.3-py3-none-any.whl
Algorithm Hash digest
SHA256 6399b219ef6f1fd3a7ba4f3c2ea8715f9f9148f04e79b7c0fddda3f853678f7e
MD5 5a07854da3488c1faba41336ba926900
BLAKE2b-256 7539ce545326192923c4693af840f4f8217c6577b252f594595f6dcf824013dd
