
Vectalab

Professional High-Fidelity Image Vectorization

Python 3.10–3.12 · License: MIT

Convert raster images (PNG, JPG) to optimized SVG with 97%+ quality and 70–80% file size reduction.

Installation

pip install vectalab

# Optional: install SVGO (Node.js) for best compression
# recommended: Node 16+ or current LTS
npm install -g svgo

Quick Start

# Vectorize an image (recommended)
vectalab premium logo.png

# Optimize existing SVG
vectalab optimize icon.svg

# Check SVGO status
vectalab svgo-info

Results

Metric                Value
Quality (SSIM)        97–99%
File reduction        70–80%
Color accuracy (ΔE)   < 1 (imperceptible)
Processing time       0.2–2 s
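The file-reduction figure is simply 1 − (SVG size / original size). A tiny stdlib helper to check it against your own files (illustrative only, not part of the vectalab API):

```python
def reduction_percent(original_bytes: int, svg_bytes: int) -> float:
    """Percentage size reduction of the SVG relative to the original raster file."""
    if original_bytes <= 0:
        raise ValueError("original size must be positive")
    return 100.0 * (1.0 - svg_bytes / original_bytes)

# Example: a 250 KB PNG traced to a 60 KB SVG
print(f"{reduction_percent(250_000, 60_000):.0f}% smaller")  # → 76% smaller
```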

Commands

Command     Description
premium     ⭐ SOTA vectorization (recommended)
optimize    Compress existing SVG with SVGO
convert     Basic vectorization
logo        Logo-optimized conversion
info        Analyze image
svgo-info   Check SVGO status
benchmark   📊 Run performance benchmarks

Usage

CLI

# Best quality + smallest file
vectalab premium image.png

# Maximum compression
vectalab premium logo.png --precision 1 --mode logo

# Photo vectorization
vectalab premium photo.jpg --mode photo --colors 32

# Compress existing SVG
vectalab optimize icon.svg

Benchmarking

Run comprehensive benchmarks on your own images to evaluate quality and performance.

# Run the Python benchmark runner (reproducible & auditable)
python scripts/benchmark_runner.py --input-dir ./my_images --mode premium

# Run targeted 80/20 optimization checks
python scripts/benchmark_80_20.py examples/test_logo.png

# Run the Golden Dataset using the runner
python scripts/benchmark_runner.py --input-dir golden_data --mode premium

Python

from vectalab import vectorize_premium

svg_path, metrics = vectorize_premium("input.png", "output.svg")

print(f"Quality: {metrics['ssim']*100:.1f}%")
print(f"Size: {metrics['file_size']/1024:.1f} KB")
print(f"Color accuracy: ΔE={metrics['delta_e']:.2f}")
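If you report these values often, a small helper keeps the formatting in one place. This sketch assumes only the three metrics keys shown above (ssim, file_size, delta_e):

```python
def format_metrics(metrics: dict) -> str:
    """Render the vectorization metrics dict as a short human-readable report."""
    return "\n".join([
        f"Quality: {metrics['ssim'] * 100:.1f}%",
        f"Size: {metrics['file_size'] / 1024:.1f} KB",
        f"Color accuracy: ΔE={metrics['delta_e']:.2f}",
    ])

print(format_metrics({"ssim": 0.981, "file_size": 61_440, "delta_e": 0.42}))
```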

Options

Flag               Default   Description
--precision, -p    2         Coordinate decimals (1 = smallest)
--mode, -m         auto      logo, photo, or auto
--colors, -c       auto      Palette size (4–64)
--svgo/--no-svgo   on        SVGO optimization

Cloud Acceleration (Modal)

Vectalab can offload heavy SAM segmentation tasks to the cloud via Modal.com, making the largest model (vit_h) usable from any machine.

  1. Setup: modal setup
  2. Run: vectalab convert input.png --method sam --use-modal

See Modal Setup Guide for details.

Documentation

Scripts cleanup

Some older, ad-hoc testing/analysis scripts were moved into scripts/archived/ to keep the main scripts/ directory concise. See scripts/README.md for details on which tools live in scripts/ vs. scripts/archived/.

Architecture

PNG/JPG → Analysis → Preprocessing → vtracer → SVGO → SVG
                ↓           ↓            ↓        ↓
          Type detect   Color quant   Tracing   Compress
          (logo/photo)  Edge-aware    (Rust)    (30-50%)
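The diagram above is a strictly linear pipeline, so each stage can be modeled as a function whose output feeds the next stage. A minimal orchestration sketch with stub stages — the real stages live in vectalab and vtracer, and the names below are illustrative only:

```python
from functools import reduce
from typing import Callable

Stage = Callable[[bytes], bytes]

def run_pipeline(stages: list[Stage], data: bytes) -> bytes:
    """Feed the output of each stage into the next, in order."""
    return reduce(lambda d, stage: stage(d), stages, data)

# Stub stages standing in for analysis, preprocessing, tracing, and SVGO.
def analyze(data: bytes) -> bytes: return data           # type detection (logo/photo)
def preprocess(data: bytes) -> bytes: return data        # color quantization, edge-aware
def trace(data: bytes) -> bytes: return b"<svg>" + data  # vtracer: raster -> paths
def compress(data: bytes) -> bytes: return data + b"</svg>"  # SVGO compression pass

svg = run_pipeline([analyze, preprocess, trace, compress], b"PNG")
print(svg)  # b'<svg>PNG</svg>'
```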

Requirements

  • Python 3.10–3.12 (see pyproject.toml; the package requires >=3.10)
  • Node.js (for SVGO, optional but recommended; use an LTS release)

Core Dependencies

vtracer      # Rust vectorization engine (primary tracing backend)
opencv-python # Image processing
scikit-image # Quality & image metrics
cairosvg     # SVG rendering (used in tests and helpers)

Optional/advanced features (SAM segmentation, Modal cloud acceleration):

segment-anything  # SAM-based segmentation (optional)
modal             # cloud acceleration (optional — see docs/modal_setup.md)
torch/torchvision # hardware-accelerated segmentation models

License

MIT License - see LICENSE


Publishing / Releases 🔧

We include a tiny helper script to build and upload releases to PyPI or TestPyPI: scripts/publish_to_pypi.py.

Quick usage:

# Install the tools used by the script
python -m pip install --upgrade build twine

# Dry-run to TestPyPI (default is testpypi)
python scripts/publish_to_pypi.py --dry-run

# Upload to TestPyPI (use env TWINE_USERNAME/TWINE_PASSWORD or ~/.pypirc)
python scripts/publish_to_pypi.py --repository testpypi

# Upload to production PyPI
python scripts/publish_to_pypi.py --repository pypi

# Build, upload to PyPI and tag the current version (reads pyproject.toml)
python scripts/publish_to_pypi.py --repository pypi --tag

# If you want to inspect only the build artifacts and skip upload
python scripts/publish_to_pypi.py --no-upload

Notes & recommendations:

  • The script expects build artifacts in dist/ and will run python -m build by default.
  • Use --dry-run to preview commands to be executed before actually uploading.
  • For CI, set TWINE_USERNAME and TWINE_PASSWORD as environment secrets, or configure ~/.pypirc so twine can read credentials from it.
  • The script supports both TestPyPI (--repository testpypi) and production PyPI (--repository pypi).
  • You can also target a custom PyPI-compatible endpoint using --repository-url (e.g. a private index or an internal upload endpoint). This overrides --repository.

CI publishing (recommended)

To safely publish to PyPI on releases, add a GitHub Actions secret named PYPI_API_TOKEN containing a PyPI API token (create one at https://pypi.org/manage/account/token/). A workflow is included that runs when a tag matching v* is pushed and publishes the built distributions automatically.

Typical workflow:

  1. Create a PyPI API token (project or account token) on https://pypi.org/account/.
  2. Add the token to your repository under Settings → Secrets → Actions → PYPI_API_TOKEN.
  3. Push a git tag (example: git tag v0.1.0 && git push origin v0.1.0). The CI workflow will build & publish.

Workflow note: older versions of the pypa/gh-action-pypi-publish action had to be pinned to @release/v1 or a specific @vX.Y.Z tag rather than the bare @release ref; the workflow in this repo uses pypa/gh-action-pypi-publish@release/v1 to avoid the "unable to find version 'release'" error.

Trusted Publishing (OIDC) support

This workflow also supports GitHub's OpenID Connect (OIDC) / Trusted Publishing flow, for cases where you prefer not to store a PyPI API token in repository secrets.

What changed: the publishing job has job-level permissions so it can request an OIDC id token from GitHub:

jobs:
  publish:
    permissions:
      id-token: write
      contents: read
    runs-on: ubuntu-latest
    # ...

How to use Trusted Publishing (summary):

  • Configure a Trusted Publisher on PyPI and link it to your GitHub repo / org. See PyPI's Trusted Publisher docs (https://pypi.org/help/#trusted-publishers) for setup details.
  • Once PyPI trusts your repository/organization, the publishing job will request an OIDC id token and exchange it with PyPI to authenticate — no token stored in GitHub secrets required.
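
Putting the pieces together, a minimal publish job using Trusted Publishing might look like the following. This is a sketch only; the actual workflow in this repo is publish.yml and its step details may differ:

```yaml
name: publish
on:
  push:
    tags: ["v*"]

jobs:
  publish:
    runs-on: ubuntu-latest
    permissions:
      id-token: write   # required for the OIDC token exchange with PyPI
      contents: read
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: python -m pip install --upgrade build && python -m build
      - uses: pypa/gh-action-pypi-publish@release/v1
        # No password/token input: Trusted Publishing authenticates via OIDC.
```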

Notes:

  • Trusted Publishing is more secure but requires extra PyPI-side steps and verification; if you prefer a simpler setup, create a project-scoped PyPI API token and set it as the PYPI_API_TOKEN secret for CI.
  • The workflow can be extended to support both modes, using the PYPI_API_TOKEN secret when it is present and falling back to Trusted Publishing otherwise.

Repository protections

A conservative branch protection policy is applied to main to prevent accidental direct pushes and require code review for changes. The policy includes:

  • Require at least 1 approving PR review.
  • Disallow force-pushes and branch deletions on main.
  • Admins are exempt (enforce_admins is disabled in this conservative setup).
  • No required CI contexts (you can add these later once GitHub Actions workflows exist).

If you prefer to manage branch protection manually, these are the gh commands used (run locally as a repository admin):

# Example: conservative (require 1 review, strict status checks w/ no contexts, disallow force pushes)
cat > /tmp/prot.json <<'JSON'
{
  "required_status_checks": { "strict": true, "contexts": [] },
  "enforce_admins": false,
  "required_pull_request_reviews": {
    "dismiss_stale_reviews": true,
    "require_code_owner_reviews": false,
    "required_approving_review_count": 1
  },
  "restrictions": null,
  "allow_force_pushes": false,
  "allow_deletions": false
}
JSON

gh api --method PUT /repos/<ORG_OR_USER>/<REPO>/branches/main/protection --input /tmp/prot.json | cat

Stricter rules (enforced admin compliance, required CI contexts, or push access restricted to specific teams) can be applied by editing the JSON payload above and re-running the gh api command.

Download files

Source distribution: vectalab-0.1.0.tar.gz (99.0 kB)
Built distribution: vectalab-0.1.0-py3-none-any.whl (98.5 kB)

File details

vectalab-0.1.0.tar.gz

  • Size: 99.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

Hashes for vectalab-0.1.0.tar.gz

Algorithm     Hash digest
SHA256        2df90939fc0454a54a2cb521aa8add7c6cc4075771a85d086b07771009e47739
MD5           ac363bbc00a26f8631113f090e7a9672
BLAKE2b-256   c98d4c9106f51757968cd0ffc6cae993d70a3787b75226fcacc3765bca6ef241

Provenance

Attestation bundles were made for vectalab-0.1.0.tar.gz.
Publisher: publish.yml on raphaelmansuy/vectalab

File details

vectalab-0.1.0-py3-none-any.whl

  • Size: 98.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

Hashes for vectalab-0.1.0-py3-none-any.whl

Algorithm     Hash digest
SHA256        1f63c03a03ea2824638149ce821ec4d262fd523645efe1362fcd2f91dcd6dbb8
MD5           156eb81ad7b5e240eb74c6d736f3f978
BLAKE2b-256   a3949a5f9b3ad9106584ab9e391d1743f561908027a823f80503f2cf1f814eac

Provenance

Attestation bundles were made for vectalab-0.1.0-py3-none-any.whl.
Publisher: publish.yml on raphaelmansuy/vectalab
