
SpConv: Spatially Sparse Convolution Library


PyPI Install
  • CPU (Linux only): pip install spconv
  • CUDA 10.2: pip install spconv-cu102
  • CUDA 11.3: pip install spconv-cu113
  • CUDA 11.4: pip install spconv-cu114
  • CUDA 11.6: pip install spconv-cu116
  • CUDA 11.7: pip install spconv-cu117
  • CUDA 11.8: pip install spconv-cu118
  • CUDA 12.0: pip install spconv-cu120

spconv is a project that provides a heavily optimized sparse convolution implementation with tensor core support. Check the benchmark to see how fast spconv 2.x runs.

Spconv 1.x code. We won't provide any support for spconv 1.x since it's deprecated. Use spconv 2.x if possible.

Check the spconv 2.x algorithm introduction to understand the sparse convolution algorithm used in spconv 2.x!
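For intuition only (this is not spconv's implementation), the core idea of submanifold sparse convolution can be sketched in pure Python: active sites are stored in a coordinate-to-feature map, and the kernel gathers only from active inputs, producing outputs only at sites that are themselves active. All names here are illustrative.

```python
# Illustrative sketch of submanifold sparse convolution on a 2D grid.
# Active sites live in a dict {(y, x): feature}; output sites equal
# input sites (the submanifold rule), so sparsity is preserved.

def subm_conv2d(features, weights, ksize=3):
    """features: {(y, x): float}, weights: {(dy, dx): float}."""
    r = ksize // 2
    out = {}
    for (y, x) in features:               # only active sites produce output
        acc = 0.0
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                src = (y + dy, x + dx)
                if src in features:       # gather only active inputs
                    acc += features[src] * weights[(dy, dx)]
        out[(y, x)] = acc
    return out

feats = {(0, 0): 1.0, (0, 1): 2.0, (5, 5): 3.0}
w = {(dy, dx): 1.0 for dy in (-1, 0, 1) for dx in (-1, 0, 1)}  # all-ones kernel
print(subm_conv2d(feats, w))
# → {(0, 0): 3.0, (0, 1): 3.0, (5, 5): 3.0}: the two neighbors sum each
#   other's features; the isolated site (5, 5) only sees itself.
```

Real implementations (including spconv) precompute a "rulebook" of (input index, output index, kernel offset) triples once, then run batched gather-GEMM-scatter on the GPU instead of this per-site loop.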

WARNING

Use spconv >= cu114 if possible. CUDA 11.4 can compile significantly faster kernels in some situations.

Updating spconv: you MUST UNINSTALL all spconv/cumm/spconv-cuxxx/cumm-cuxxx packages first. Use pip list | grep spconv and pip list | grep cumm to check all installed packages, then use pip to install the new spconv.
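The same check can be done from Python. A small sketch using only the standard library (importlib.metadata); the helper name is ours, not spconv API:

```python
from importlib.metadata import distributions

def find_spconv_like():
    """Return names of installed distributions matching spconv*/cumm*."""
    hits = []
    for dist in distributions():
        name = (dist.metadata["Name"] or "").lower()
        if name.startswith("spconv") or name.startswith("cumm"):
            hits.append(name)
    return sorted(hits)

# Uninstall every package this reports before installing a new spconv build.
print(find_spconv_like())
```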

NEWS

  • spconv 2.3: int8 quantization support. See docs and examples for more details.

  • spconv 2.2: Ampere feature support (by EvernightAurora), pure C++ code generation, NVRTC, dropped Python 3.6

Spconv 2.2 vs Spconv 2.1

  • faster fp16 conv kernels (~5-30%) on Ampere GPUs (tested on RTX 3090)
  • greatly faster int8 conv kernels (~1.2x-2.7x) on Ampere GPUs (tested on RTX 3090)
  • dropped Python 3.6 support
  • NVRTC support: kernels for old GPUs are compiled at runtime.
  • libspconv: pure C++ build of all spconv ops. See example.
  • tf32 kernels for faster fp32 training, disabled by default. Set import spconv as spconv_core; spconv_core.constants.SPCONV_ALLOW_TF32 = True to enable them.
  • all weights are in KRSC layout, so some old models can't be loaded anymore.
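Because weights are now KRSC (out-channels K, kernel spatial dims R/S, in-channels C), checkpoints saved in another layout need a transpose before loading. A dependency-free sketch of the idea, assuming the old layout was KCRS (PyTorch's dense conv layout); verify your old model's actual layout before converting:

```python
def kcrs_to_krsc(w):
    """Permute a 4-D nested list from (K, C, R, S) to (K, R, S, C)."""
    K, C = len(w), len(w[0])
    R, S = len(w[0][0]), len(w[0][0][0])
    return [[[[w[k][c][r][s] for c in range(C)]   # channels move innermost
              for s in range(S)]
             for r in range(R)]
            for k in range(K)]

# 1 output channel, 2 input channels, a 1x2 kernel
w = [[[[1, 2]], [[3, 4]]]]          # shape (K=1, C=2, R=1, S=2)
print(kcrs_to_krsc(w))              # → [[[[1, 3], [2, 4]]]], shape (1, 1, 2, 2)
```

With real torch tensors the equivalent is a single permute, e.g. `w.permute(0, 2, 3, 1)` for 2D kernels (one extra spatial axis for 3D).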

Spconv 2.1 vs Spconv 1.x

  • spconv can now be installed via pip. See the install section in the readme for more details. Users don't need to build manually anymore!
  • Microsoft Windows support (only Windows 10 has been tested).
  • fp32 (not tf32) training/inference speed is increased (+50~80%)
  • fp16 training/inference speed is greatly increased when your layers support tensor cores (channel size must be a multiple of 8).
  • int8 ops are ready, but we still need some time to figure out how to run int8 in pytorch.
  • doesn't depend on the pytorch binary, but you may need at least pytorch >= 1.5.0 to run spconv 2.x.
  • since spconv 2.x doesn't depend on the pytorch binary (and never will), it's impossible to support torch.jit/libtorch inference.
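The tensor-core condition in the list above is easy to check up front. A tiny sketch; the helper name is ours, not spconv API:

```python
def tensor_core_friendly(in_channels, out_channels, multiple=8):
    """fp16 tensor cores need channel counts that are multiples of 8."""
    return in_channels % multiple == 0 and out_channels % multiple == 0

print(tensor_core_friendly(32, 64))   # True: both are multiples of 8
print(tensor_core_friendly(32, 30))   # False: 30 is not a multiple of 8
```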

Usage

First, use import spconv.pytorch as spconv in spconv 2.x.

Then see this.

Don't forget to check the performance guide.
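For readers new to the API: a spconv sparse tensor is essentially per-voxel features plus integer (batch, z, y, x) indices and a spatial shape. A dependency-free sketch of the densify step, conceptually mirroring what spconv's SparseConvTensor.dense() does (this is illustrative pure Python, not spconv code):

```python
def dense_from_sparse(features, indices, spatial_shape, batch_size=1):
    """features: list of floats; indices: list of (b, z, y, x) int tuples."""
    D, H, W = spatial_shape
    # Zero-filled dense grid of shape (batch, D, H, W).
    out = [[[[0.0] * W for _ in range(H)] for _ in range(D)]
           for _ in range(batch_size)]
    for f, (b, z, y, x) in zip(features, indices):
        out[b][z][y][x] = f            # scatter each active voxel
    return out

dense = dense_from_sparse([1.5, 2.5], [(0, 0, 1, 2), (0, 1, 0, 0)], (2, 2, 3))
print(dense[0][0][1][2], dense[0][1][0][0])  # 1.5 2.5
```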

Common Solution for Some Bugs

see common problems.

Install

You need to install Python >= 3.7 first to use spconv 2.x.

You need to install the CUDA toolkit before using prebuilt binaries or building from source.

You need at least CUDA 11.0 to build and run spconv 2.x. We won't offer any support for CUDA < 11.0.

Prebuilt

We offer Python 3.7-3.11 and CUDA 10.2/11.3/11.4/11.7/12.0 prebuilt binaries for Linux (manylinux).

We offer Python 3.7-3.11 and CUDA 10.2/11.4/11.7/12.0 prebuilt binaries for Windows 10/11.

Linux users need pip >= 20.3 to install the prebuilt wheels.

WARNING: spconv-cu117 may require CUDA Driver >= 515.

pip install spconv for CPU only (Linux only). You should only use this for debugging; performance isn't optimized due to manylinux limits (no OpenMP support).

pip install spconv-cu102 for CUDA 10.2

pip install spconv-cu113 for CUDA 11.3 (Linux Only)

pip install spconv-cu114 for CUDA 11.4

pip install spconv-cu117 for CUDA 11.7

pip install spconv-cu120 for CUDA 12.0

NOTE: It's safe to have different minor CUDA versions between the system and conda (pytorch) for CUDA >= 11.0, because of CUDA Minor Version Compatibility. For example, you can use spconv-cu114 with the anaconda build of pytorch for CUDA 11.1 on an OS with CUDA 11.2 installed.

NOTE: On Linux, you can install spconv-cuxxx without installing CUDA system-wide! Only a suitable NVIDIA driver is required. For CUDA 11, you need driver >= 450.82; you may need a newer driver for newer CUDA. For CUDA 11.8, you need driver >= 520.
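The driver floors quoted in this readme can be checked programmatically. A hedged sketch using only the numbers stated here (the mapping is illustrative and not exhaustive; consult NVIDIA's compatibility tables for other releases):

```python
# Minimum driver versions as quoted in this readme (illustrative subset).
MIN_DRIVER = {"11.0": 450.82, "11.7": 515.0, "11.8": 520.0}

def driver_ok(cuda_version, driver_version):
    """True if the installed driver meets the floor listed for that CUDA release."""
    return driver_version >= MIN_DRIVER[cuda_version]

print(driver_ok("11.8", 525.60))  # True: new enough for spconv-cu118
print(driver_ok("11.8", 470.0))   # False: too old for CUDA 11.8 wheels
```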

Prebuilt GPU Support Matrix

See this page to check supported GPU names by arch.

If you use a GPU architecture that isn't included in the prebuilt wheels, spconv will use NVRTC to compile a slightly slower kernel at runtime.

CUDA version  GPU arch list
11.1~11.7     52, 60, 61, 70, 75, 80, 86
11.8+         60, 70, 75, 80, 86, 89, 90
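The table above turns into a quick lookup. A hedged helper (ours, not part of spconv) to check whether a compute capability ships precompiled in a given prebuilt row:

```python
# Prebuilt arch lists from the table above, keyed by CUDA version row.
PREBUILT_ARCHS = {
    "11.1-11.7": {52, 60, 61, 70, 75, 80, 86},
    "11.8+": {60, 70, 75, 80, 86, 89, 90},
}

def is_prebuilt(arch, cuda_row):
    """True if kernels for `arch` ship precompiled; otherwise NVRTC compiles at runtime."""
    return arch in PREBUILT_ARCHS[cuda_row]

print(is_prebuilt(86, "11.8+"))   # True: sm_86 (RTX 30xx) is precompiled
print(is_prebuilt(52, "11.8+"))   # False: sm_52 falls back to NVRTC
```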

Build from source for development (JIT, recommend)

The C++ code will be rebuilt automatically when you change C++ code in the project.

For NVIDIA embedded platforms, you need to specify the CUDA arch before building: export CUMM_CUDA_ARCH_LIST="7.2" for Xavier, export CUMM_CUDA_ARCH_LIST="6.2" for TX2, and export CUMM_CUDA_ARCH_LIST="8.7" for Orin.

You need to remove cumm from the requires section of pyproject.toml after installing editable cumm and before installing spconv, due to a pyproject limitation (it can't find the editable-installed cumm).

Ensure pip list | grep spconv and pip list | grep cumm show nothing before installing editable spconv/cumm.

Linux

  1. uninstall any spconv and cumm installed by pip
  2. install build-essential and CUDA
  3. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  4. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  5. in Python, import spconv and wait for the build to finish.

Windows

  1. uninstall any spconv and cumm installed by pip
  2. install Visual Studio 2019 or newer (make sure the C++ development component is installed), then install CUDA
  3. set the PowerShell script execution policy
  4. start a new PowerShell and run tools/msvc_setup.ps1
  5. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  6. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  7. in Python, import spconv and wait for the build to finish.

Build wheel from source (not recommended; this is done in CI)

You need to rebuild cumm first if you are building against a CUDA version that isn't provided in the prebuilts.

Linux

  1. install build-essential and CUDA
  2. run export SPCONV_DISABLE_JIT="1"
  3. run pip install pccm cumm wheel
  4. run python setup.py bdist_wheel, then pip install dists/xxx.whl

Windows

  1. install Visual Studio 2019 or newer (make sure the C++ development component is installed), then install CUDA
  2. set the PowerShell script execution policy
  3. start a new PowerShell and run tools/msvc_setup.ps1
  4. run $Env:SPCONV_DISABLE_JIT = "1"
  5. run pip install pccm cumm wheel
  6. run python setup.py bdist_wheel, then pip install dists/xxx.whl

Citation

If you find this project useful in your research, please consider citing:

@misc{spconv2022,
    title={Spconv: Spatially Sparse Convolution Library},
    author={Spconv Contributors},
    howpublished = {\url{https://github.com/traveller59/spconv}},
    year={2022}
}

Contributors

Note

This work was done while the author was an employee at Tusimple.

LICENSE

Apache 2.0
