
Quantizers


Hardware-oriented numerical quantizers for deep learning models, implemented in Keras v3 and NumPy. Provides bit-accurate matching with Vivado/Vitis HLS implementations.

Features

Supported Quantizers

Fixed-Point Quantizer

Parameters:

  • k (keep_negative): Enable negative numbers
  • i (integer_bits): Number of bits before the binary point (excludes the sign bit)
  • f (fractional_bits): Number of bits after the binary point
  • For C++: W = k + i + f, I = k + i, S = k

Supported modes:

  • Rounding: TRN, RND, RND_CONV, TRN_ZERO, RND_ZERO, RND_MIN_INF, RND_INF
    • S_RND and S_RND_CONV for stochastic rounding; not available in the NumPy implementation, as stochastic rounding is used only during training
  • Overflow: WRAP, SAT, SAT_SYM, WRAP_SM

Limitations:

  • WRAP_SM only works with RND or RND_CONV rounding
  • WRAP* modes don't provide surrogate gradients for integer bits
  • Saturation bit forced to zero for WRAP and WRAP_SM
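
For a concrete feel of the fixed-point arithmetic, here is a minimal NumPy sketch of one mode combination (RND_CONV rounding with SAT overflow). It only illustrates the math; it is not the library's implementation.

import numpy as np

# Illustrative only: RND_CONV (round half to even) + SAT for a format with
# k = keep_negative, i integer bits, f fractional bits.
def fixed_rnd_conv_sat(x, k, i, f):
    step = 2.0 ** -f                 # quantization step (LSB)
    hi = 2.0 ** i - step             # largest representable value
    lo = -(2.0 ** i) if k else 0.0   # smallest representable value
    q = np.round(x / step) * step    # np.round rounds half to even
    return np.clip(q, lo, hi)        # SAT: saturate instead of wrapping

print(fixed_rnd_conv_sat(np.array([0.7, -1.3, 2.5]), k=1, i=1, f=2))
# [ 0.75 -1.25  1.75]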

Minifloat Quantizer

Parameters:

  • m (mantissa_bits): Mantissa width
  • e (exponent_bits): Exponent width
  • e0 (exponent_zero): Exponent bias (default: 0)
  • Exponent range: [-2^(e-1) + e0, 2^(e-1) - 1 + e0]

Features:

  • Supports subnormal numbers
  • Uses RND_CONV rounding and SAT overflow
  • HLS-synthesizable implementation in test/cpp_source/ap_types/ap_float.h
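
The sketch below mimics this behavior in plain NumPy (subnormals, RND_CONV rounding, SAT overflow) for intuition only; the bit-accurate implementation lives in the library and the ap_float.h header above.

import numpy as np

# Illustrative only: minifloat with m mantissa bits, e exponent bits,
# and exponent bias e0.
def minifloat(x, m, e, e0=0):
    e_min = -(2 ** (e - 1)) + e0          # smallest exponent
    e_max = 2 ** (e - 1) - 1 + e0         # largest exponent
    x = np.asarray(x, dtype=np.float64)
    exp = np.floor(np.log2(np.maximum(np.abs(x), 2.0 ** e_min)))
    exp = np.clip(exp, e_min, e_max)      # values below 2**e_min go subnormal
    scale = 2.0 ** (exp - m)              # LSB weight at this exponent
    q = np.round(x / scale) * scale       # round half to even ~ RND_CONV
    max_val = (2.0 - 2.0 ** -m) * 2.0 ** e_max
    return np.clip(q, -max_val, max_val)  # SAT overflow

print(minifloat([0.1, 3.0, 100.0], m=2, e=3))
# -> 0.09375, 3.0, 14.0 (100.0 saturates to the largest value)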

Simplified Quantizers

  • Binary: Maps to {-1, 1}, with 0 mapped to -1 (preliminary implementation)
  • Ternary: Shorthand for fixed-point fixed<2, 1, RND_CONV, SAT_SYM>
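
A rough NumPy sketch of both for intuition; the {-1, 0, 1} output set for ternary is an assumption here, and the authoritative definition is the fixed-point shorthand above.

import numpy as np

# Illustrative only. Binary: sign with 0 mapped to -1. Ternary: round half
# to even, then symmetric saturation; the {-1, 0, 1} grid is assumed.
def binary(x):
    return np.where(np.asarray(x) > 0, 1.0, -1.0)

def ternary(x):
    return np.clip(np.round(np.asarray(x, dtype=float)), -1.0, 1.0)

print(binary([-0.3, 0.0, 2.0]))   # [-1. -1.  1.]
print(ternary([-0.6, 0.2, 3.0]))  # [-1.  0.  1.]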

Installation

Requires python>=3.10.

pip install quantizers

keras>=3.0 and at least one compatible backend (pytorch, jax, or tensorflow) are required for training.

Usage

Stateless Quantizers

from quantizers import (
  float_quantize,       # append _np (e.g. float_quantize_np) for the NumPy implementation
  get_fixed_quantizer,
  binary_quantize,
  ternary_quantize,
)

# Fixed-point quantizer
fixed_quantizer = get_fixed_quantizer(round_mode, overflow_mode)
fixedp_qtensor = fixed_quantizer(
    x,
    integer_bits,
    fractional_bits,
    keep_negative,
    training, # enables stochastic rounding; WRAP overflow is not applied during training
    seed, # for stochastic rounding only
)

# Minifloat quantizer
floatp_qtensor = float_quantize(x, mantissa_bits, exponent_bits, exponent_zero)

# Simplified quantizers
binary_qtensor = binary_quantize(x)
ternary_qtensor = ternary_quantize(x)
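
For illustration, a call with the NumPy variants might look like the following. The argument order follows the snippet above, but the string mode names and the omission of training/seed in the NumPy variant are assumptions; check the package docs for the exact signatures.

import numpy as np
from quantizers import get_fixed_quantizer_np, float_quantize_np

# Hypothetical invocation: mode names passed as strings is an assumption.
quantizer = get_fixed_quantizer_np('RND_CONV', 'SAT')
x = np.array([0.7, -1.3, 2.5])
fixed_out = quantizer(x, 1, 2, True)       # integer_bits=1, fractional_bits=2, keep_negative=True
float_out = float_quantize_np(x, 2, 3, 0)  # mantissa_bits=2, exponent_bits=3, exponent_zero=0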

Stateful Quantizers

from quantizers import FixedQ, MinifloatQ

# Can be used for training, but not intended for it
fixed_q = FixedQ(
    width,
    integer_bits, # including the sign bit
    keep_negative,
    fixed_round_mode, # no stochastic rounding modes
    fixed_overflow_mode,
)
quantized = fixed_q(x)

mfloat_q = MinifloatQ(mantissa_bits, exponent_bits, exponent_zero)
quantized = mfloat_q(x)
