
SpConv: Spatially Sparse Convolution Library


PyPI Install

  • CPU (Linux only): pip install spconv
  • CUDA 10.2: pip install spconv-cu102
  • CUDA 11.3: pip install spconv-cu113
  • CUDA 11.4: pip install spconv-cu114
  • CUDA 11.6: pip install spconv-cu116
  • CUDA 11.7: pip install spconv-cu117
  • CUDA 11.8: pip install spconv-cu118
  • CUDA 12.0: pip install spconv-cu120

spconv is a project that provides a heavily optimized sparse convolution implementation with Tensor Core support. Check the benchmark to see how fast spconv 2.x runs.

Spconv 1.x code. We won't provide any support for spconv 1.x since it's deprecated; use spconv 2.x if possible.

Check the spconv 2.x algorithm introduction to understand the sparse convolution algorithm used in spconv 2.x!
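To give a feel for the algorithm behind the library, here is a toy NumPy sketch of a rulebook-based ("gather-GEMM-scatter") submanifold sparse convolution. This is an illustration of the general technique, not spconv's actual implementation; `submanifold_conv` and all names in it are hypothetical.

```python
import numpy as np

def submanifold_conv(coords, feats, weight):
    """Toy gather-GEMM-scatter sparse convolution (submanifold style):
    output sites coincide with input sites, and each kernel offset
    contributes (input, output) index pairs collected in a 'rulebook'.

    coords: (N, 2) int voxel coordinates, feats: (N, Cin),
    weight: (K, K, Cin, Cout) dense kernel.
    """
    K = weight.shape[0]
    r = K // 2
    index = {tuple(c): i for i, c in enumerate(coords)}  # coord -> row
    out = np.zeros((feats.shape[0], weight.shape[-1]), feats.dtype)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            # Rulebook for this offset: active inputs whose shifted
            # position is also an active (output) site.
            pairs = [(i, index[(y + dy, x + dx)])
                     for i, (y, x) in enumerate(coords)
                     if (y + dy, x + dx) in index]
            if not pairs:
                continue
            gi, so = map(np.array, zip(*pairs))
            w = weight[dy + r, dx + r]          # (Cin, Cout) slice
            np.add.at(out, so, feats[gi] @ w)   # gather, GEMM, scatter
    return out
```

Real implementations build the rulebook once on the GPU and batch the per-offset GEMMs; the data flow above is the part the algorithm introduction explains.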

WARNING

Use spconv >= cu114 if possible; CUDA 11.4 can compile significantly faster kernels in some situations.

Update Spconv: you MUST UNINSTALL all spconv/cumm/spconv-cuxxx/cumm-cuxxx packages first; use pip list | grep spconv and pip list | grep cumm to check every installed package. Then use pip to install the new spconv.

NEWS

  • spconv 2.3: int8 quantization support. See docs and examples for more details.

  • spconv 2.2: Ampere feature support (by EvernightAurora), pure C++ code generation, NVRTC support, dropped Python 3.6.

Spconv 2.2 vs Spconv 2.1

  • faster fp16 conv kernels (~5-30%) on Ampere GPUs (tested on an RTX 3090)
  • greatly faster int8 conv kernels (~1.2x-2.7x) on Ampere GPUs (tested on an RTX 3090)
  • dropped Python 3.6 support
  • NVRTC support: kernels for old GPUs are compiled at runtime.
  • libspconv: pure C++ build of all spconv ops. See example.
  • tf32 kernels for faster fp32 training, disabled by default. Run import spconv as spconv_core; spconv_core.constants.SPCONV_ALLOW_TF32 = True to enable them.
  • all weights now use KRSC layout; some old models can't be loaded anymore.
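The TF32 toggle mentioned above, as a config fragment (this mirrors the one-liner in the bullet; it must run before any fp32 conv layers are used):

```python
# Enable TF32 kernels for faster fp32 training (disabled by default).
import spconv as spconv_core

spconv_core.constants.SPCONV_ALLOW_TF32 = True
```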

Spconv 2.1 vs Spconv 1.x

  • spconv can now be installed by pip. See the install section in the readme for more details. Users don't need to build manually anymore!
  • Microsoft Windows support (only Windows 10 has been tested).
  • fp32 (not tf32) training/inference speed is increased (+50~80%).
  • fp16 training/inference speed is greatly increased when your layers support tensor cores (channel size must be a multiple of 8).
  • int8 ops are ready, but we still need some time to figure out how to run int8 in PyTorch.
  • spconv 2.x doesn't depend on the PyTorch binary, but you need at least PyTorch >= 1.5.0 to run it.
  • since spconv 2.x doesn't depend on the PyTorch binary (and never will), it's impossible to support torch.jit/libtorch inference.
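The tensor-core constraint above (channel sizes must be multiples of 8 for fp16) is easy to enforce when designing a network. A tiny hypothetical helper, not part of the spconv API:

```python
def pad_to_multiple(channels: int, multiple: int = 8) -> int:
    # Round a channel count up to the next multiple; 8 is the fp16
    # tensor-core requirement stated in the readme.
    return ((channels + multiple - 1) // multiple) * multiple
```

For example, a layer planned with 30 channels would be widened to 32 so the fp16 tensor-core kernels can be used.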

Usage

First, use import spconv.pytorch as spconv in spconv 2.x.

Then see this.

Don't forget to check performance guide.
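A minimal usage sketch along the lines of the official examples (shapes, sizes, and values are illustrative; this requires a CUDA build of spconv and PyTorch):

```python
import torch
import spconv.pytorch as spconv  # spconv 2.x import path

# A tiny submanifold conv network.
net = spconv.SparseSequential(
    spconv.SubMConv3d(16, 32, 3, indice_key="subm0"),
    torch.nn.ReLU(),
).cuda()

features = torch.randn(100, 16).cuda()                  # (N, Cin) active-site features
indices = torch.randint(0, 40, (100, 4)).int().cuda()   # (N, 1 + ndim): batch idx + zyx coords
x = spconv.SparseConvTensor(features, indices,
                            spatial_shape=[40, 40, 40], batch_size=1)
out = net(x)   # out.features has shape (N, 32)
```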

Common Solution for Some Bugs

See common problems.

Install

You need Python >= 3.7 to use spconv 2.x.

You need to install the CUDA toolkit before using prebuilt binaries or building from source.

You need at least CUDA 11.0 to build and run spconv 2.x. We won't offer any support for CUDA < 11.0.

Prebuilt

We offer python 3.7-3.11 and cuda 10.2/11.3/11.4/11.7/12.0 prebuilt binaries for linux (manylinux).

We offer python 3.7-3.11 and cuda 10.2/11.4/11.7/12.0 prebuilt binaries for windows 10/11.

For Linux users, you need pip >= 20.3 to install the prebuilt wheels.

WARNING: spconv-cu117 may require CUDA Driver >= 515.

pip install spconv for CPU only (Linux only). Use this only for debugging; performance isn't optimized due to manylinux limits (no OpenMP support).

pip install spconv-cu102 for CUDA 10.2

pip install spconv-cu113 for CUDA 11.3 (Linux Only)

pip install spconv-cu114 for CUDA 11.4

pip install spconv-cu117 for CUDA 11.7

pip install spconv-cu120 for CUDA 12.0

NOTE It's safe to have different minor CUDA versions between the system and conda (PyTorch) for CUDA >= 11.0, thanks to CUDA Minor Version Compatibility. For example, you can use spconv-cu114 with the anaconda build of PyTorch for CUDA 11.1 on an OS with CUDA 11.2 installed.

NOTE In Linux, you can install spconv-cuxxx without installing CUDA system-wide; only a suitable NVIDIA driver is required. For CUDA 11, you need driver >= 450.82. Newer CUDA versions may need a newer driver; for CUDA 11.8, you need driver >= 520.
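The driver requirements in the note above can be expressed as a quick check. The table below is only the illustrative subset quoted here, and `driver_sufficient` is a hypothetical helper; consult NVIDIA's CUDA compatibility documentation for the authoritative list.

```python
# Minimum Linux driver versions quoted in the note above (subset).
MIN_DRIVER = {"11.0": 450.82, "11.8": 520.0}

def driver_sufficient(cuda: str, driver_version: float) -> bool:
    # True when the installed NVIDIA driver meets the minimum
    # required for the given CUDA version.
    return driver_version >= MIN_DRIVER[cuda]
```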

Prebuilt GPU Support Matrix

See this page to check supported GPU names by arch.

If you use a GPU architecture that isn't compiled into the prebuilt binaries, spconv will use NVRTC to compile a slightly slower kernel at runtime.

CUDA version GPU Arch List
11.1~11.7 52,60,61,70,75,80,86
11.8+ 60,70,75,80,86,89,90
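The table above can be turned into a quick lookup to predict whether spconv will fall back to NVRTC. The arch sets are copied from the table; `needs_nvrtc` is a hypothetical helper, not part of the spconv API.

```python
# Prebuilt SM architecture lists from the support matrix above.
PREBUILT_ARCHS = {
    "11.1~11.7": {52, 60, 61, 70, 75, 80, 86},
    "11.8+": {60, 70, 75, 80, 86, 89, 90},
}

def needs_nvrtc(sm: int, cuda_row: str) -> bool:
    # True when the GPU arch isn't in the prebuilt list, so spconv
    # compiles a slightly slower kernel at runtime via NVRTC.
    return sm not in PREBUILT_ARCHS[cuda_row]
```

For example, an RTX 4090 (SM 89) is only covered by the CUDA 11.8+ prebuilts, so it would trigger the NVRTC fallback on the 11.1~11.7 builds.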

Build from source for development (JIT, recommend)

The C++ code will be rebuilt automatically when you change C++ code in the project.

For NVIDIA embedded platforms, you need to specify the CUDA arch before building: export CUMM_CUDA_ARCH_LIST="7.2" for Xavier, export CUMM_CUDA_ARCH_LIST="6.2" for TX2, and export CUMM_CUDA_ARCH_LIST="8.7" for Orin.

You need to remove cumm from the requires section in pyproject.toml after installing editable cumm and before installing spconv, due to a pyproject limitation (it can't find an editable-installed cumm).

You need to ensure pip list | grep spconv and pip list | grep cumm show nothing before installing editable spconv/cumm.

Linux

  1. uninstall any spconv and cumm installed by pip
  2. install build-essential and CUDA
  3. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  4. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  5. in Python, import spconv and wait for the build to finish.

Windows

  1. uninstall any spconv and cumm installed by pip
  2. install Visual Studio 2019 or newer; make sure the C++ development component is installed. Install CUDA.
  3. set the PowerShell script execution policy
  4. start a new PowerShell and run tools/msvc_setup.ps1
  5. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  6. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  7. in Python, import spconv and wait for the build to finish.

Build wheel from source (not recommended; this is done in CI)

You need to rebuild cumm first if you are building against a CUDA version that isn't provided in the prebuilts.

Linux

  1. install build-essential and CUDA
  2. run export SPCONV_DISABLE_JIT="1"
  3. run pip install pccm cumm wheel
  4. run python setup.py bdist_wheel, then pip install dists/xxx.whl

Windows

  1. install Visual Studio 2019 or newer; make sure the C++ development component is installed. Install CUDA.
  2. set the PowerShell script execution policy
  3. start a new PowerShell and run tools/msvc_setup.ps1
  4. run $Env:SPCONV_DISABLE_JIT = "1"
  5. run pip install pccm cumm wheel
  6. run python setup.py bdist_wheel, then pip install dists/xxx.whl

Citation

If you find this project useful in your research, please consider citing:

@misc{spconv2022,
    title={Spconv: Spatially Sparse Convolution Library},
    author={Spconv Contributors},
    howpublished = {\url{https://github.com/traveller59/spconv}},
    year={2022}
}

Contributors

Note

This work was done while the author was an employee at Tusimple.

LICENSE

Apache 2.0


Source Distributions

No source distribution files are available for this release.

Built Distributions (spconv_cu118 2.3.6, ~74-76 MB each)

  • CPython 3.11: win_amd64, manylinux_2_17_x86_64 (glibc 2.17+)
  • CPython 3.10: win_amd64, manylinux_2_17_x86_64 (glibc 2.17+)
  • CPython 3.9: win_amd64, manylinux_2_17_x86_64 (glibc 2.17+)
  • CPython 3.8: win_amd64, manylinux_2_17_x86_64 (glibc 2.17+)
  • CPython 3.7m: win_amd64, manylinux_2_17_x86_64 (glibc 2.17+)

