
ComfyUI inference engine as a standalone Python library (no server, no UI).


comfy-diffusion

Python 3.12+ · License: GPL-3.0

comfy-diffusion is a standalone Python package that exposes ComfyUI's inference engine as importable modules. It is not a server, node graph runner, web UI, MCP server, daemon, or binary app.

The package vendors ComfyUI at vendor/ComfyUI and makes its internal comfy.* modules available when runtime APIs need them. Application authors can install this package, import comfy_diffusion, and compose inference flows directly in Python.
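One common way a package exposes a vendored tree is by putting it on sys.path before the internals are first imported. The sketch below illustrates that pattern under the vendor/ComfyUI layout described above; it is not necessarily how comfy-diffusion bootstraps its runtime.

```python
import sys
from pathlib import Path


def ensure_vendored(pkg_root: str) -> Path:
    """Prepend vendor/ComfyUI to sys.path so comfy.* modules resolve.

    Illustrative only: the actual bootstrap inside comfy-diffusion may
    differ; the vendor/ComfyUI location comes from the description above.
    """
    vendored = Path(pkg_root) / "vendor" / "ComfyUI"
    if str(vendored) not in sys.path:
        sys.path.insert(0, str(vendored))
    return vendored
```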

Install

Use uv for development and dependency resolution:

uv sync --extra cpu --extra comfyui

For CUDA environments:

uv sync --extra cuda --extra comfyui

Useful extras:

Extra     Includes                                                   Use
cpu       torch, torchvision                                         CPU-only development and CI
cuda      torch, torchvision via the configured PyTorch CUDA index   NVIDIA GPU inference
comfyui   ComfyUI runtime dependencies                               Importing and running ComfyUI internals
audio     torchaudio                                                 Audio helpers and pipelines
video     av, imageio, opencv-python                                 Video I/O helpers
all       CUDA, audio, video, and ComfyUI runtime dependencies       Full local runtime
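A quick way to see which optional groups are usable in the current environment is to probe for their modules without importing the heavy packages. The module names below are taken from the extras table; the grouping is illustrative.

```python
import importlib.util

# Module names taken from the extras table above; find_spec() checks
# presence without actually importing these (potentially heavy) packages.
PROBES = {
    "torch (cpu/cuda)": "torch",
    "audio": "torchaudio",
    "video": "av",
}


def available_extras() -> dict:
    """Map each probed extra to whether its key module is installed."""
    return {
        name: importlib.util.find_spec(mod) is not None
        for name, mod in PROBES.items()
    }
```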

Python API

The public package root intentionally stays small:

from comfy_diffusion import check_runtime, vae_decode, vae_encode, apply_lora

Most APIs are imported from explicit submodules:

from comfy_diffusion.models import ModelManager
from comfy_diffusion.conditioning import encode_prompt
from comfy_diffusion.sampling import sample

Quick Start

Call check_runtime() before loading models or sampling. On first use, comfy-diffusion can automatically download the pinned ComfyUI release if the vendored runtime is missing. Expected failures, including runtime bootstrap problems, are returned as an error dict rather than raised.

Example:

import torch

from comfy_diffusion import apply_lora, check_runtime, vae_decode
from comfy_diffusion.conditioning import encode_prompt
from comfy_diffusion.models import ModelManager
from comfy_diffusion.sampling import sample

runtime = check_runtime()
if "error" in runtime:
    raise RuntimeError(runtime["error"])

manager = ModelManager(models_dir="/path/to/models")
checkpoint = manager.load_checkpoint("model.safetensors")

model, clip = apply_lora(
    checkpoint.model,
    checkpoint.clip,
    "style.safetensors",
    0.8,
    0.8,
)

positive = encode_prompt(clip, "a portrait, studio lighting")
negative = encode_prompt(clip, "blurry, low quality")

latent = {"samples": torch.zeros(1, 4, 64, 64)}
denoised = sample(
    model,
    positive,
    negative,
    latent,
    steps=20,
    cfg=7.0,
    sampler_name="euler",
    scheduler="normal",
    seed=42,
)
image = vae_decode(checkpoint.vae, denoised)
image.save("output.png")
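The (1, 4, 64, 64) latent in the example corresponds to a 512×512 output image, assuming the checkpoint uses a Stable-Diffusion-style VAE with an 8× spatial downscale and 4 latent channels. A small helper makes the relationship explicit:

```python
def latent_shape(height, width, batch=1, channels=4, downscale=8):
    """Return the latent tensor shape for a target image size.

    Assumes an SD-style VAE: 8x spatial compression into 4 latent
    channels, so a 512x512 image maps to a (1, 4, 64, 64) latent.
    """
    if height % downscale or width % downscale:
        raise ValueError(f"dimensions must be multiples of {downscale}")
    return (batch, channels, height // downscale, width // downscale)


print(latent_shape(512, 512))  # (1, 4, 64, 64)
```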

comfy_diffusion.pipelines remains available as an optional namespace of ready-made flows, but the primary interface is the modular Python API above.

CLI

The first-party CLI is named comfy-diffusion and provides operational package tools only.

uv run comfy-diffusion runtime check --json
uv run comfy-diffusion runtime paths
uv run comfy-diffusion models list --models-dir /path/to/models
uv run comfy-diffusion models download --manifest models.json --models-dir /path/to/models

Model manifest shape:

{
  "models": [
    {
      "type": "hf",
      "repo_id": "org/model",
      "filename": "model.safetensors",
      "dest": "checkpoints",
      "sha256": null
    },
    {
      "type": "url",
      "url": "https://example.com/model.safetensors",
      "dest": "unet/model.safetensors"
    },
    {
      "type": "civitai",
      "model_id": 12345,
      "version_id": 67890,
      "dest": "loras"
    }
  ]
}
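A manifest can be sanity-checked before handing it to models download. The required fields below are inferred from the example shape above, not from a published schema, so treat this validator as an illustrative sketch:

```python
import json

# Required keys per entry type, inferred from the example manifest
# above (not a formal schema published by the project).
REQUIRED_KEYS = {
    "hf": {"repo_id", "filename", "dest"},
    "url": {"url", "dest"},
    "civitai": {"model_id", "dest"},
}


def validate_manifest(text: str) -> list:
    """Return a list of human-readable problems; empty means OK."""
    data = json.loads(text)
    errors = []
    for i, entry in enumerate(data.get("models", [])):
        required = REQUIRED_KEYS.get(entry.get("type"))
        if required is None:
            errors.append(f"models[{i}]: unknown type {entry.get('type')!r}")
        elif missing := required - entry.keys():
            errors.append(f"models[{i}]: missing {sorted(missing)}")
    return errors
```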

The CLI does not start servers, manage services, expose MCP tools, run a web UI, queue background jobs, or provide Parallax commands.

Development

uv sync --extra cpu --extra comfyui
uv run pytest
uv run ruff check .

ComfyUI is pinned as a git submodule at vendor/ComfyUI. Do not edit vendored ComfyUI code directly.



Download files


Source Distribution

comfy_diffusion-2.0.0.tar.gz (7.5 MB)


Built Distribution


comfy_diffusion-2.0.0-py3-none-any.whl (240.3 kB)


File details

Details for the file comfy_diffusion-2.0.0.tar.gz.

File metadata

  • Download URL: comfy_diffusion-2.0.0.tar.gz
  • Upload date:
  • Size: 7.5 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for comfy_diffusion-2.0.0.tar.gz
Algorithm Hash digest
SHA256 1a16cb669ce8d7730f518e1337c763bbd2f125d97400384cf8b0276672c6c4af
MD5 3a80a810f4f473f02c2960f6f494fcfd
BLAKE2b-256 5d0dc8c6a473c2106bef8bd31c2e86497f5bf82fdbd0fc89525705b1eeecd04c


Provenance

The following attestation bundles were made for comfy_diffusion-2.0.0.tar.gz:

Publisher: publish.yml on quinteroac/comfy-diffusion

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file comfy_diffusion-2.0.0-py3-none-any.whl.

File metadata

  • Download URL: comfy_diffusion-2.0.0-py3-none-any.whl
  • Upload date:
  • Size: 240.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for comfy_diffusion-2.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 818770f80a04fe1679df89b98cc386d0c743e971887393e8ba1c9d9b60539416
MD5 e28f029cb19f40263002df299ff7b854
BLAKE2b-256 47c8f4e65220c3b8eca4b4e38d5c4fe21aa75ee5d73f2966fa96b244082227f5


Provenance

The following attestation bundles were made for comfy_diffusion-2.0.0-py3-none-any.whl:

Publisher: publish.yml on quinteroac/comfy-diffusion

