
XSlim is an offline quantization tool based on PPQ

Project description

XSlim

Chinese | English


XSlim is a Post-Training Quantization (PTQ) tool developed by SpacemiT. It integrates chip-optimized quantization strategies and provides a unified interface for ONNX model quantization via JSON configuration files.


Features

  • INT8 / FP16 / Dynamic Quantization – multiple precision levels for different deployment scenarios
  • JSON-driven configuration – simple, declarative quantization setup
  • Python API & CLI – use as a library or from the command line
  • Custom preprocessing – plug in your own preprocessing functions
  • ONNX-based workflow – built on the ONNX ecosystem

Installation

pip install xslim

Or install from source:

git clone https://github.com/spacemit-com/xslim.git
cd xslim
pip install -r requirements.txt
pip install .  # install xslim itself (assuming the repo ships a standard setup.py/pyproject.toml)

Quick Start

Python API

import xslim

# Using a JSON config file
xslim.quantize_onnx_model("config.json")

# Using a dict
config = {
    "model_parameters": {
        "onnx_model": "model.onnx",
        "working_dir": "./output"
    },
    "calibration_parameters": {
        "input_parameters": [{
            "mean_value": [123.675, 116.28, 103.53],
            "std_value": [58.395, 57.12, 57.375],
            "color_format": "rgb",
            "preprocess_file": "PT_IMAGENET",
            "data_list_path": "./calib_img_list.txt"
        }]
    }
}
xslim.quantize_onnx_model(config)

# You can also pass the model path and output path directly
xslim.quantize_onnx_model("config.json", "input.onnx", "output.onnx")

Command Line

# INT8 quantization with a JSON config
python -m xslim --config config.json

# Specify input and output model paths
python -m xslim -c config.json -i input.onnx -o output.onnx

# Dynamic quantization (no config file needed)
python -m xslim -i input.onnx -o output.onnx --dynq

# FP16 conversion (no config file needed)
python -m xslim -i input.onnx -o output.onnx --fp16

# ONNX simplification only (no config file needed)
python -m xslim -i input.onnx -o output.onnx
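
Whichever mode you use, the result is a standard ONNX model, so it can be sanity-checked with onnxruntime (a separate package, not part of xslim). A minimal sketch using the output.onnx name from the commands above:

# Smoke-test the quantized/converted model with onnxruntime
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("output.onnx", providers=["CPUExecutionProvider"])
inp = sess.get_inputs()[0]
# Replace any dynamic dimensions with a concrete size (here: 1); assumes a single float32 input
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
dummy = np.random.rand(*shape).astype(np.float32)
outputs = sess.run(None, {inp.name: dummy})
print([o.shape for o in outputs])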

Documentation

Samples

See the samples directory for ready-to-run examples covering ResNet-18, MobileNet V3, BERT, and more.

Changelog

For a full list of changes, see the Releases page.

Version Highlights
  • 2.0.13 – Current development version
  • 2.0.12 – Latest release; complete the README changelog/release metadata, add accuracy-tuning docs and README links, introduce the xslim-accuracy-tuning GitHub skill, add YOLO truncation guidance, and rename input parameters for consistency
  • 2.0.11 – Fix Pad/missing-input handling, add Or/Einsum/Selu support, normalize Conv/ConvTranspose kernel shapes, and raise the minimum Python version to 3.9
  • 2.0.10 – Align release metadata, improve CI/test coverage, normalize the missing default ONNX opset before dynamic quantization, and refine shape-inference handling
  • 2.0.9 – Add documentation, preserve tensor dtype metadata during FP16 conversion, and restore compatibility with onnxslim 0.1.87
  • 2.0.8 – Improve packaging/CI, add torch executor operator coverage, add a PyPI publish workflow, and centralize version metadata
  • 2.0.7 – Fix an FP16 conversion bug on complex models
  • 2.0.6 – Fix metadata props deletion; change the default CLI behavior to model simplification (use --dynq for dynamic quantization)

Contributing

Contributions are welcome! Please open an issue or submit a pull request.

License

This project is licensed under the Apache License 2.0.



Download files

Download the file for your platform.

Source Distribution

xslim-2.0.13.tar.gz (270.6 kB)

Uploaded Source

Built Distribution


xslim-2.0.13-py3-none-any.whl (301.2 kB)

Uploaded Python 3

File details

Details for the file xslim-2.0.13.tar.gz.

File metadata

  • Download URL: xslim-2.0.13.tar.gz
  • Upload date:
  • Size: 270.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for xslim-2.0.13.tar.gz

  • SHA256: 3d52306cac6917f76bedfc6099342ef545dec4dfa142524b2a46c4e81f9e70b7
  • MD5: 651c0f3b34cdedec5bb6cad5f897cc99
  • BLAKE2b-256: 9d7cce33b9f7018ff0346cd5c6b0788617f6798d7979ad7fa74afa41b37e9fab

See more details on using hashes here.
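
If you want to check a downloaded archive yourself, you can recompute the digest locally; a minimal sketch in Python using the SHA256 value listed above:

# Verify the downloaded sdist against the published SHA256 digest
import hashlib

EXPECTED = "3d52306cac6917f76bedfc6099342ef545dec4dfa142524b2a46c4e81f9e70b7"

with open("xslim-2.0.13.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == EXPECTED else "MISMATCH: " + digest)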

Provenance

The following attestation bundles were made for xslim-2.0.13.tar.gz:

Publisher: publish.yml on spacemit-com/xslim

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file xslim-2.0.13-py3-none-any.whl.

File metadata

  • Download URL: xslim-2.0.13-py3-none-any.whl
  • Upload date:
  • Size: 301.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for xslim-2.0.13-py3-none-any.whl

  • SHA256: 44ae6355ea992708a06914871a06b6610e5f2b4edb090a291836edb9a9cc3f34
  • MD5: 8ae2b75e11fb58366041320f3742c3e0
  • BLAKE2b-256: 5ee1fb475beb10c69380170ae3f6bf5f09947900b846d8a3b75969aa441534d7

See more details on using hashes here.

Provenance

The following attestation bundles were made for xslim-2.0.13-py3-none-any.whl:

Publisher: publish.yml on spacemit-com/xslim

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
