
XSlim is an offline quantization tool based on PPQ

Project description

XSlim



XSlim is a Post-Training Quantization (PTQ) tool developed by SpacemiT. It integrates chip-optimized quantization strategies and provides a unified interface for ONNX model quantization via JSON configuration files.


Features

  • INT8 / FP16 / Dynamic Quantization – multiple precision levels for different deployment scenarios
  • JSON-driven configuration – simple, declarative quantization setup
  • Python API & CLI – use as a library or from the command line
  • Custom preprocessing – plug in your own preprocessing functions
  • ONNX-based workflow – built on the ONNX ecosystem
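The INT8 mode is standard post-training affine quantization: float tensor values are mapped onto 8-bit integers through a scale and zero-point derived from calibration statistics. A minimal sketch of the idea (illustrative only, not xslim's internal implementation):

```python
# Illustrative affine INT8 quantization, the scheme PTQ tools typically use.
# Not xslim's internal code; shown only to explain the precision levels above.

QMIN, QMAX = -128, 127  # signed 8-bit range

def choose_qparams(values):
    """Derive scale/zero-point from observed (calibration) min/max."""
    lo = min(min(values), 0.0)  # the representable range must include 0
    hi = max(max(values), 0.0)
    scale = (hi - lo) / (QMAX - QMIN) or 1e-8  # guard against all-zero input
    zero_point = round(QMIN - lo / scale)
    return scale, zero_point

def quantize(values, scale, zero_point):
    return [max(QMIN, min(QMAX, round(v / scale) + zero_point)) for v in values]

def dequantize(q, scale, zero_point):
    return [(v - zero_point) * scale for v in q]

x = [-1.0, 0.0, 0.5, 2.0]
scale, zp = choose_qparams(x)
x_hat = dequantize(quantize(x, scale, zp), scale, zp)
# round-trip error is bounded by roughly half a quantization step (scale / 2)
```

Real PTQ tools refine the min/max estimate per tensor (and often per channel) from many calibration batches, which is exactly what the calibration configuration below feeds.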

Installation

pip install xslim

Or install from source:

git clone https://github.com/spacemit-com/xslim.git
cd xslim
pip install -r requirements.txt

Quick Start

Python API

import xslim

# Using a JSON config file
xslim.quantize_onnx_model("config.json")

# Using a dict
config = {
    "model_parameters": {
        "onnx_model": "model.onnx",
        "working_dir": "./output"
    },
    "calibration_parameters": {
        "input_parameters": [{
            "mean_value": [123.675, 116.28, 103.53],
            "std_value": [58.395, 57.12, 57.375],
            "color_format": "rgb",
            "preprocess_file": "PT_IMAGENET",
            "data_list_path": "./calib_img_list.txt"
        }]
    }
}
xslim.quantize_onnx_model(config)

# You can also pass the model path and output path directly
xslim.quantize_onnx_model("config.json", "input.onnx", "output.onnx")
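The `data_list_path` entry above points at a plain-text file listing one calibration image per line (an assumption based on common PTQ-tool conventions; check the xslim samples for the exact expected format). A small helper to generate such a list:

```python
from pathlib import Path

def write_calib_list(image_dir, out_path, exts=(".jpg", ".jpeg", ".png")):
    """Write one image path per line, for use as data_list_path."""
    paths = sorted(
        str(p) for p in Path(image_dir).rglob("*") if p.suffix.lower() in exts
    )
    Path(out_path).write_text("\n".join(paths) + "\n")
    return len(paths)

# e.g. write_calib_list("./calib_images", "./calib_img_list.txt")
```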

Command Line

# INT8 quantization with a JSON config
python -m xslim --config config.json

# Specify input and output model paths
python -m xslim -c config.json -i input.onnx -o output.onnx

# Dynamic quantization (no config file needed)
python -m xslim -i input.onnx -o output.onnx --dynq

# FP16 conversion (no config file needed)
python -m xslim -i input.onnx -o output.onnx --fp16

# ONNX simplification only (no config file needed)
python -m xslim -i input.onnx -o output.onnx
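Conceptually, `--fp16` re-encodes 32-bit float weights as IEEE 754 half-precision values, roughly halving model size while keeping only about three significant decimal digits (a general description of FP16 conversion, not a claim about xslim internals). Python's `struct` module supports the half-precision `'e'` format and can demonstrate the round-trip:

```python
import math
import struct

def fp16_roundtrip(x: float) -> float:
    """Re-encode a float as IEEE 754 half precision and decode it back."""
    return struct.unpack("<e", struct.pack("<e", x))[0]

pi16 = fp16_roundtrip(math.pi)  # 3.140625: only 10 mantissa bits survive
```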

Documentation

Samples

See the samples directory for ready-to-run examples covering ResNet-18, MobileNet V3, BERT, and more.

Changelog

For a full list of changes, see the Releases page.

Version Highlights

  • 2.0.12: current development version
  • 2.0.11: latest release; fix Pad/missing-input handling, add Or/Einsum/Selu support, normalize Conv/ConvTranspose kernel shapes, raise minimum Python to 3.9
  • 2.0.10: align release metadata, improve CI/test coverage, normalize missing default ONNX opset before dynamic quantization, refine shape inference handling
  • 2.0.9: add documentation, preserve tensor dtype metadata during FP16 conversion, restore compatibility with onnxslim 0.1.87
  • 2.0.8: improve packaging/CI, add torch executor operator coverage, add PyPI publish workflow, centralize version metadata
  • 2.0.7: fix FP16 conversion bug on complex models
  • 2.0.6: fix metadata props deletion; default CLI behavior changed to model simplification (use --dynq for dynamic quantization)

Contributing

Contributions are welcome! Please open an issue or submit a pull request.

License

This project is licensed under the Apache License 2.0.



Download files

Download the file for your platform.

Source Distribution

xslim-2.0.12.tar.gz (269.1 kB)


Built Distribution


xslim-2.0.12-py3-none-any.whl (300.3 kB)


File details

Details for the file xslim-2.0.12.tar.gz.

File metadata

  • Download URL: xslim-2.0.12.tar.gz
  • Size: 269.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for xslim-2.0.12.tar.gz:

  • SHA256: 87540279b72917c69433b3ca00d2ba4306b977a2d233b23bf5c3618e76460974
  • MD5: 38c4d18ff1736d5059e90138498d2071
  • BLAKE2b-256: c619b2e68e18c1148ddfef534dbfadbca813e8af263a96b37b9e835f755673ae


Provenance

The following attestation bundles were made for xslim-2.0.12.tar.gz:

Publisher: publish.yml on spacemit-com/xslim

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file xslim-2.0.12-py3-none-any.whl.

File metadata

  • Download URL: xslim-2.0.12-py3-none-any.whl
  • Size: 300.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for xslim-2.0.12-py3-none-any.whl:

  • SHA256: 0dde95afbe5d7bafc97229ed5923548a0cb69191dd07bba75df6a4ec91571677
  • MD5: 1450d025839d22f14a9261d30ec56974
  • BLAKE2b-256: 32a78c4a3fa19995f2f74b4f35b5db8132eb64c33180c4136c5cd94910edb52b


Provenance

The following attestation bundles were made for xslim-2.0.12-py3-none-any.whl:

Publisher: publish.yml on spacemit-com/xslim

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
