
Cog: Containers for machine learning

Cog is an open-source tool that lets you package machine learning models in a standard, production-ready container.

You can deploy your packaged model to your own infrastructure, or to Replicate.

Highlights

  • 📦 Docker containers without the pain. Writing your own Dockerfile can be a bewildering process. With Cog, you define your environment with a simple configuration file and it generates a Docker image with all the best practices: NVIDIA base images, efficient caching of dependencies, installing specific Python versions, sensible environment variable defaults, and so on.

  • 🤬️ No more CUDA hell. Cog knows which CUDA/cuDNN/PyTorch/TensorFlow/Python combos are compatible and will set it all up correctly for you.

  • ✅ Define the inputs and outputs for your model with standard Python. Then, Cog generates an OpenAPI schema and validates the inputs and outputs with Pydantic.

  • 🎁 Automatic HTTP prediction server: Your model's types are used to dynamically generate a RESTful HTTP API using FastAPI.

  • 🥞 Automatic queue worker. Long-running deep learning models or batch processing is best architected with a queue. Cog models do this out of the box. Redis is currently supported, with more in the pipeline.

  • ☁️ Cloud storage. Files can be read and written directly to Amazon S3 and Google Cloud Storage. (Coming soon.)

  • 🚀 Ready for production. Deploy your model anywhere that Docker images run. Your own infrastructure, or Replicate.

How it works

Define the Docker environment your model runs in with cog.yaml:

build:
  gpu: true
  system_packages:
    - "libgl1-mesa-glx"
    - "libglib2.0-0"
  python_version: "3.12"
  python_packages:
    - "torch==2.3"
predict: "predict.py:Predictor"

Define how predictions are run on your model with predict.py:

from cog import BasePredictor, Input, Path
import torch

class Predictor(BasePredictor):
    def setup(self):
        """Load the model into memory to make running multiple predictions efficient"""
        self.model = torch.load("./weights.pth")

    # The arguments and types the model takes as input
    def predict(self,
          image: Path = Input(description="Grayscale input image")
    ) -> Path:
        """Run a single prediction on the model"""
        processed_image = preprocess(image)  # pre-/post-processing helpers defined elsewhere
        output = self.model(processed_image)
        return postprocess(output)

Now, you can run predictions on this model:

$ cog predict -i image=@input.jpg
--> Building Docker image...
--> Running Prediction...
--> Output written to output.jpg

Or, build a Docker image for deployment:

$ cog build -t my-colorization-model
--> Building Docker image...
--> Built my-colorization-model:latest

$ docker run -d -p 5000:5000 --gpus all my-colorization-model

$ curl http://localhost:5000/predictions -X POST \
    -H 'Content-Type: application/json' \
    -d '{"input": {"image": "https://.../input.jpg"}}'
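The server replies with JSON describing the prediction. A minimal stdlib-only client sketch, assuming the response is a JSON object with `status` and `output` fields as in Cog's HTTP API (the helper names here are made up for illustration):

```python
import json
import urllib.request


def extract_output(response_json):
    """Return the prediction output, or raise if the prediction did not succeed."""
    if response_json.get("status") != "succeeded":
        raise RuntimeError(f"prediction failed: {response_json.get('error')}")
    return response_json["output"]


def predict(url, inputs):
    """POST an input payload to a running Cog server and return the output."""
    body = json.dumps({"input": inputs}).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return extract_output(json.load(resp))


if __name__ == "__main__":
    # Requires the container from the previous step to be running on port 5000
    print(predict("http://localhost:5000/predictions",
                  {"image": "https://.../input.jpg"}))
```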

Why are we building this?

It's really hard for researchers to ship machine learning models to production.

Part of the solution is Docker, but it's complex to get working: Dockerfiles, pre-/post-processing, Flask servers, CUDA versions. More often than not the researcher has to sit down with an engineer to get the damn thing deployed.

Andreas and Ben created Cog. Andreas used to work at Spotify, where he built tools for building and deploying ML models with Docker. Ben worked at Docker, where he created Docker Compose.

We realized that, in addition to Spotify, other companies were also using Docker to build and deploy machine learning models. Uber and others have built similar systems. So, we're making an open source version so other people can do this too.

Hit us up if you're interested in using it or want to collaborate with us. Find us on Discord, or email us at team@replicate.com.

Prerequisites

  • macOS, Linux, or Windows 11. Cog works on macOS, Linux, and Windows 11 (via WSL 2).
  • Docker. Cog uses Docker to create a container for your model. You'll need to install Docker before you can run Cog. If you install Docker Engine instead of Docker Desktop, you will need to install Buildx as well.

Install

If you're using macOS, you can install Cog using Homebrew:

brew install cog

You can also download and install the latest release using our install script:

# fish shell
sh (curl -fsSL https://cog.run/install.sh | psub)

# bash, zsh, and other shells
sh <(curl -fsSL https://cog.run/install.sh)

# download with wget and run in a separate command
wget -qO- https://cog.run/install.sh
sh ./install.sh

You can manually install the latest release of Cog directly from GitHub by running the following commands in a terminal:

sudo curl -o /usr/local/bin/cog -L "https://github.com/replicate/cog/releases/latest/download/cog_$(uname -s)_$(uname -m)"
sudo chmod +x /usr/local/bin/cog

Alternatively, you can build Cog from source and install it with these commands:

make
sudo make install

Or, if you are installing Cog inside a Dockerfile:

RUN sh -c "INSTALL_DIR=\"/usr/local/bin\" SUDO=\"\" $(curl -fsSL https://cog.run/install.sh)"

Upgrade

If you're using macOS and you previously installed Cog with Homebrew, run the following:

brew upgrade cog

Otherwise, you can upgrade to the latest version by running the same commands you used to install it.

Need help?

Join us in #cog on Discord.

Contributors ✨

Thanks go to these wonderful people (emoji key):

  • Ben Firshman 💻 📖
  • Andreas Jansson 💻 📖 🚧
  • Zeke Sikelianos 💻 📖 🔧
  • Rory Byrne 💻 📖 ⚠️
  • Michael Floering 💻 📖 🤔
  • Ben Evans 📖
  • shashank agarwal 💻 📖
  • VictorXLR 💻 📖 ⚠️
  • hung anna 🐛
  • Brian Whitman 🐛
  • JimothyJohn 🐛
  • ericguizzo 🐛
  • Dominic Baggott 💻 ⚠️
  • Dashiell Stander 🐛 💻 ⚠️
  • Shuwei Liang 🐛 💬
  • Eric Allam 🤔
  • Iván Perdomo 🐛
  • Charles Frye 📖
  • Luan Pham 🐛 📖
  • TommyDew 💻
  • Jesse Andrews 💻 📖 ⚠️
  • Nick Stenning 💻 📖 🎨 🚇 ⚠️
  • Justin Merrell 📖
  • Rurik Ylä-Onnenvuori 🐛
  • Youka 🐛
  • Clay Mullis 📖
  • Mattt 💻 📖 🚇
  • Eng Zer Jun ⚠️
  • BB 💻
  • williamluer 📖
  • Simon Eskildsen 💻
  • F 🐛 💻
  • Philip Potter 🐛 💻
  • Joanne Chen 📖
  • technillogue 💻
  • Aron Carroll 📖 💻 🤔
  • Bohdan Mykhailenko 📖 🐛
  • Daniel Radu 📖 🐛
  • Itay Etelis 💻
  • Gennaro Schiano 📖
  • André Knörig 📖

This project follows the all-contributors specification. Contributions of any kind welcome!


Download files

Download the file for your platform. If you're not sure which to choose, see the Python packaging documentation on installing packages.

Source Distribution

cog-0.12.1.tar.gz (934.7 kB)

Uploaded Source

Built Distribution

cog-0.12.1-py3-none-any.whl (66.2 kB)

Uploaded Python 3

File details

Details for the file cog-0.12.1.tar.gz.

File metadata

  • Download URL: cog-0.12.1.tar.gz
  • Upload date:
  • Size: 934.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for cog-0.12.1.tar.gz
Algorithm Hash digest
SHA256 cec6499e94fa3afda7fd9618fcd6a6f7882df03e2074393b08a9a719db2b4ff3
MD5 ae9d800cb93fd6b5f612e678fded74b3
BLAKE2b-256 33256d154e6ae076c9bf873d44697d52d2d82dc854f5433a17f97668215ed770

See the PyPI documentation for more details on using file hashes.

Provenance

The following attestation bundles were made for cog-0.12.1.tar.gz:

Publisher: pypi-package.yaml on replicate/cog

File details

Details for the file cog-0.12.1-py3-none-any.whl.

File metadata

  • Download URL: cog-0.12.1-py3-none-any.whl
  • Upload date:
  • Size: 66.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/5.1.1 CPython/3.12.7

File hashes

Hashes for cog-0.12.1-py3-none-any.whl
Algorithm Hash digest
SHA256 e7a7c4eba220016b09fa8a9dff0c590716ea381665b5754091fd063b200256bf
MD5 3947bba073c25660e282fdcb374cf2d7
BLAKE2b-256 14b6c3f8ef484b5c9c4de55b2920f6e94b5332d488852b4b9a25cfe7ba0f0592

See the PyPI documentation for more details on using file hashes.

Provenance

The following attestation bundles were made for cog-0.12.1-py3-none-any.whl:

Publisher: pypi-package.yaml on replicate/cog

