
AQT: Accurate Quantized Training

Project description

Accurate Quantized Training

This directory contains libraries for running and analyzing neural network quantization experiments in JAX and Flax.

A summary of this work is presented in the paper Pareto-Optimal Quantized ResNet Is Mostly 4-bit. If you find the source code useful for your research, please cite the paper in your publications.

Contributors: Shivani Agrawal, Lisa Wang, Jonathan Malmaud, Lukasz Lew, Pouya Dormiani, Phoenix Meadowlark, Oleg Rybakov.

AQT Quantization Library

The JAX and Flax quantization libraries provide "what you serve is what you train" quantization for convolution and matmul operations. See this README.md.
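
To make the "what you serve is what you train" idea concrete, here is a minimal JAX sketch of the underlying technique (fake quantization with a straight-through estimator). This is not AQT's actual API: the function names, the per-tensor symmetric scaling, and the bit widths are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def fake_quant(x, bits=8):
  """Round x onto a signed fixed-point grid, keeping gradients via STE."""
  max_int = 2.0 ** (bits - 1) - 1.0                       # e.g. 127 for 8 bits
  scale = jnp.maximum(jnp.max(jnp.abs(x)), 1e-9) / max_int
  q = jnp.round(x / scale) * scale                        # values lie on the serving grid
  # Straight-through estimator: forward pass uses q, backward pass sees identity.
  return x + jax.lax.stop_gradient(q - x)

def quantized_matmul(lhs, rhs, bits=8):
  """Matmul whose operands see the same quantization at train and serve time."""
  return jnp.dot(fake_quant(lhs, bits), fake_quant(rhs, bits))

x = jax.random.normal(jax.random.PRNGKey(0), (4, 16))
w = jax.random.normal(jax.random.PRNGKey(1), (16, 8))
y = quantized_matmul(x, w, bits=4)
```

Because the rounded values already lie on the serving grid during training, the model learns around quantization error instead of encountering it only at deployment.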

Reporting Tool

After a training run has completed, the reporting tool in report_utils.py can generate a concise experiment report with aggregated metrics and metadata. See this README.md.
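
As a rough illustration of what such metric aggregation involves (the file format, field names, and summarize_run function below are assumptions for the sketch, not the report_utils.py interface):

```python
import json
import statistics

def summarize_run(metrics_path, last_n=5):
  """Reduce per-step eval metrics to a few summary statistics."""
  # Assumed input: a JSON-lines file of records like
  # {"step": 1000, "eval_accuracy": 0.71} -- not report_utils.py's format.
  with open(metrics_path) as f:
    rows = sorted((json.loads(line) for line in f), key=lambda r: r["step"])
  tail = [r["eval_accuracy"] for r in rows[-last_n:]]
  return {
      "final_step": rows[-1]["step"],
      "eval_accuracy_mean_last_%d" % last_n: statistics.mean(tail),
      "num_steps_recorded": len(rows),
  }
```

Averaging over the last few evaluation steps, rather than reporting a single step, smooths out run-to-run noise in the final metric.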

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
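
The package itself can be installed directly from PyPI, e.g. with `pip install aqtp`.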

Source Distributions

No source distribution files available for this release. See the tutorial on generating distribution archives.

Built Distribution

aqtp-0.0.2-py3-none-any.whl (103.6 kB)

Uploaded for Python 3
