
SpConv: Spatially Sparse Convolution Library

PyPI Install Commands

CPU (Linux only): pip install spconv
CUDA 10.2: pip install spconv-cu102
CUDA 11.3: pip install spconv-cu113
CUDA 11.4: pip install spconv-cu114
CUDA 11.6: pip install spconv-cu116
CUDA 11.7: pip install spconv-cu117
CUDA 11.8: pip install spconv-cu118
CUDA 12.0: pip install spconv-cu120

spconv is a project that provides a heavily-optimized sparse convolution implementation with Tensor Core support. Check the benchmark to see how fast spconv 2.x runs.

Spconv 1.x code is still available, but we won't provide any support for spconv 1.x since it's deprecated. Use spconv 2.x if possible.

Check the spconv 2.x algorithm introduction to understand the sparse convolution algorithm used in spconv 2.x!
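
The core idea in spconv 2.x, as in other rulebook-based sparse convolution libraries, is gather-GEMM-scatter: build a rulebook mapping each kernel offset to (input, output) index pairs, then apply each offset's weight only at active sites. Below is a minimal pure-Python sketch of the submanifold variant for a single channel in 2D; all names are illustrative and are not spconv's API.

```python
# Minimal sketch of rulebook-based submanifold sparse convolution
# (single channel, 2D). Illustrative names only, not the spconv API.

def build_rulebook(indices, kernel_offsets):
    """For each kernel offset, collect (input_idx, output_idx) pairs."""
    pos_to_idx = {pos: i for i, pos in enumerate(indices)}
    rulebook = {off: [] for off in kernel_offsets}
    for out_idx, (y, x) in enumerate(indices):  # outputs only at active sites
        for dy, dx in kernel_offsets:
            neighbor = (y + dy, x + dx)
            if neighbor in pos_to_idx:
                rulebook[(dy, dx)].append((pos_to_idx[neighbor], out_idx))
    return rulebook

def subm_conv(features, indices, weights, kernel_offsets):
    """Gather inputs per offset, multiply by that offset's weight, scatter-add."""
    rulebook = build_rulebook(indices, kernel_offsets)
    out = [0.0] * len(indices)
    for off, pairs in rulebook.items():
        for in_idx, out_idx in pairs:
            out[out_idx] += weights[off] * features[in_idx]
    return out

# Three active sites on a diagonal, an all-ones 3x3 kernel.
offsets = [(dy, dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
sites = [(0, 0), (1, 1), (2, 2)]
feats = [1.0, 2.0, 3.0]
kernel = {off: 1.0 for off in offsets}
print(subm_conv(feats, sites, kernel, offsets))  # [3.0, 6.0, 5.0]
```

Real spconv builds the rulebook on the GPU and replaces the inner loop with batched GEMMs over multi-channel features, but the data flow is the same.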

WARNING

Use spconv >= cu114 if possible. CUDA 11.4 can compile significantly faster kernels in some situations.

To update spconv, you MUST first UNINSTALL all spconv/cumm/spconv-cuxxx/cumm-cuxxx packages. Use pip list | grep spconv and pip list | grep cumm to check all installed packages, then use pip to install the new spconv.
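
The same pip list | grep check can be done from Python with importlib.metadata; find_conflicting below is a hypothetical helper for spotting leftover variants, not part of spconv.

```python
# Hypothetical helper mirroring `pip list | grep spconv` / `pip list | grep cumm`:
# list every installed spconv/cumm variant that must be uninstalled first.
from importlib.metadata import distributions

def find_conflicting(names):
    """Return names that are spconv/cumm or a spconv-cuXXX/cumm-cuXXX variant."""
    hits = []
    for name in names:
        base = (name or "").lower()
        if base in ("spconv", "cumm") or base.startswith(
            ("spconv-cu", "spconv_cu", "cumm-cu", "cumm_cu")
        ):
            hits.append(name)
    return sorted(hits)

installed = [dist.metadata["Name"] for dist in distributions()]
print(find_conflicting(installed))  # uninstall all of these before upgrading
```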

NEWS

  • spconv 2.3: int8 quantization support. see docs and examples for more details.

  • spconv 2.2: ampere feature support (by EvernightAurora), pure c++ code generation, nvrtc, drop python 3.6

Spconv 2.2 vs Spconv 2.1

  • faster fp16 conv kernels (~5-30%) on Ampere GPUs (tested on an RTX 3090)
  • much faster int8 conv kernels (~1.2x-2.7x) on Ampere GPUs (tested on an RTX 3090)
  • dropped python 3.6 support
  • nvrtc support: kernels for old GPUs are compiled at runtime.
  • libspconv: pure C++ build of all spconv ops. see example
  • tf32 kernels for faster fp32 training, disabled by default. Set import spconv as spconv_core; spconv_core.constants.SPCONV_ALLOW_TF32 = True to enable them.
  • all weights use the KRSC layout, so some old models can't be loaded anymore.

Spconv 2.1 vs Spconv 1.x

  • spconv can now be installed via pip; see the install section in the readme for details. Users no longer need to build it manually!
  • Microsoft Windows support (only Windows 10 has been tested).
  • fp32 (not tf32) training/inference speed is increased (+50~80%)
  • fp16 training/inference speed is greatly increased when your layer supports tensor cores (channel size must be a multiple of 8).
  • int8 ops are ready, but we still need some time to figure out how to run int8 in pytorch.
  • spconv no longer depends on the pytorch binary, but you need at least pytorch >= 1.5.0 to run spconv 2.x.
  • since spconv 2.x doesn't depend on the pytorch binary (and never will), it's impossible to support torch.jit/libtorch inference.
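
For the tensor-core point above (channel size must be a multiple of 8 for fp16), here is a tiny hypothetical helper for rounding channel counts up when sizing layers; pad_channels is not a spconv function.

```python
def pad_channels(channels, multiple=8):
    """Round a channel count up to the next multiple (8 for fp16 tensor cores)."""
    return ((channels + multiple - 1) // multiple) * multiple

print(pad_channels(13))  # 16
print(pad_channels(64))  # 64
```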

Usage

First, you need to use import spconv.pytorch as spconv in spconv 2.x.

Then see this.

Don't forget to check the performance guide.
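
Conceptually, spconv's sparse tensor is just a feature matrix plus integer (batch, z, y, x) coordinates, and converting it to dense form is a scatter. The sketch below is illustrative pure Python only; it mirrors what a dense() conversion amounts to, not spconv's actual implementation.

```python
# Illustrative only: scatter (feature, coordinate) pairs into a dense
# [batch][channel][z][y][x] nested list.

def to_dense(features, indices, spatial_shape, batch_size, num_channels):
    d, h, w = spatial_shape
    dense = [[[[[0.0] * w for _ in range(h)] for _ in range(d)]
              for _ in range(num_channels)] for _ in range(batch_size)]
    for feat, (b, z, y, x) in zip(features, indices):
        for c in range(num_channels):
            dense[b][c][z][y][x] = feat[c]
    return dense

feats = [[1.0, 2.0], [3.0, 4.0]]       # two active voxels, 2 channels each
coords = [(0, 0, 1, 1), (0, 1, 0, 0)]  # (batch, z, y, x)
grid = to_dense(feats, coords, (2, 2, 2), batch_size=1, num_channels=2)
print(grid[0][0][0][1][1], grid[0][1][1][0][0])  # 1.0 4.0
```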

Common Solution for Some Bugs

see common problems.

Install

You need python >= 3.7 to use spconv 2.x.

You need to install the CUDA toolkit before using prebuilt binaries or building from source.

You need at least CUDA 11.0 to build and run spconv 2.x. We won't offer any support for CUDA < 11.0.

Prebuilt

We offer python 3.7-3.11 and cuda 10.2/11.3/11.4/11.7/12.0 prebuilt binaries for linux (manylinux).

We offer python 3.7-3.11 and cuda 10.2/11.4/11.7/12.0 prebuilt binaries for windows 10/11.

For Linux users, you need pip >= 20.3 to install the prebuilt binaries.

WARNING: spconv-cu117 may require CUDA Driver >= 515.

pip install spconv for CPU only (Linux only). Use this only for debugging; performance isn't optimized due to manylinux limits (no OpenMP support).

pip install spconv-cu102 for CUDA 10.2

pip install spconv-cu113 for CUDA 11.3 (Linux Only)

pip install spconv-cu114 for CUDA 11.4

pip install spconv-cu117 for CUDA 11.7

pip install spconv-cu120 for CUDA 12.0

NOTE: It's safe to have a different minor CUDA version between the system and conda (pytorch) for CUDA >= 11.0 because of CUDA Minor Version Compatibility. For example, you can use spconv-cu114 with the anaconda build of pytorch for CUDA 11.1 on an OS with CUDA 11.2 installed.

NOTE: On Linux, you can install spconv-cuxxx without installing CUDA system-wide! Only a suitable NVIDIA driver is required: for CUDA 11 you need driver >= 450.82, and newer CUDA versions may need a newer driver (for CUDA 11.8, you need driver >= 520).
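
Using only the driver floors quoted in this readme (450.82 for CUDA 11.0, 515 for the CUDA 11.7 build, 520 for CUDA 11.8), a hypothetical lookup might look like this; NVIDIA's CUDA compatibility tables are the authoritative source.

```python
# Hypothetical lookup built only from the driver floors quoted above.
DRIVER_FLOORS = {
    (11, 0): 450.82,
    (11, 7): 515.0,
    (11, 8): 520.0,
}

def min_driver(cuda_version):
    """Highest floor whose CUDA version does not exceed the requested one."""
    major, minor = (int(p) for p in cuda_version.split("."))
    floors = [f for (ma, mi), f in DRIVER_FLOORS.items() if (ma, mi) <= (major, minor)]
    return max(floors) if floors else None

print(min_driver("11.4"))  # 450.82
print(min_driver("11.8"))  # 520.0
```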

Prebuilt GPU Support Matrix

See this page to check supported GPU names by arch.

If you use a GPU architecture that isn't compiled in prebuilt, spconv will use NVRTC to compile a slightly slower kernel.

CUDA version  GPU arch list
11.1~11.7     52, 60, 61, 70, 75, 80, 86
11.8+         60, 70, 75, 80, 86, 89, 90
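
Based on the table above, here is a hypothetical helper that predicts whether a device falls back to NVRTC compilation; it is not part of spconv.

```python
# Prebuilt arch lists copied from the table above.
PREBUILT_ARCHS = {
    "11.1~11.7": {52, 60, 61, 70, 75, 80, 86},
    "11.8+": {60, 70, 75, 80, 86, 89, 90},
}

def uses_nvrtc(compute_capability, cuda_line):
    """compute_capability as a string like "8.9" (i.e. sm_89)."""
    arch = int(compute_capability.replace(".", ""))
    return arch not in PREBUILT_ARCHS[cuda_line]

print(uses_nvrtc("8.9", "11.1~11.7"))  # True: sm_89 (Ada) is not prebuilt here
print(uses_nvrtc("8.9", "11.8+"))      # False
```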

Build from source for development (JIT, recommend)

The C++ code will be rebuilt automatically when you change C++ code in the project.

For NVIDIA embedded platforms, you need to specify the CUDA arch before building: export CUMM_CUDA_ARCH_LIST="7.2" for Xavier, export CUMM_CUDA_ARCH_LIST="6.2" for TX2, and export CUMM_CUDA_ARCH_LIST="8.7" for Orin.

Due to a pyproject limitation (it can't find an editable-installed cumm), you need to remove cumm from the requires section of pyproject.toml after installing editable cumm and before installing spconv.

Ensure pip list | grep spconv and pip list | grep cumm show nothing before installing editable spconv/cumm.

Linux

  1. uninstall spconv and cumm installed by pip
  2. install build-essential, install CUDA
  3. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  4. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  5. in python, import spconv and wait for the build to finish.

Windows

  1. uninstall spconv and cumm installed by pip
  2. install visual studio 2019 or newer. make sure C++ development component is installed. install CUDA
  3. set powershell script execution policy
  4. start a new powershell, run tools/msvc_setup.ps1
  5. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  6. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  7. in python, import spconv and wait for the build to finish.

Build wheel from source (not recommended; this is done in CI.)

You need to rebuild cumm first if you are building against a CUDA version that isn't covered by the prebuilt binaries.

Linux

  1. install build-essential, install CUDA
  2. run export SPCONV_DISABLE_JIT="1"
  3. run pip install pccm cumm wheel
  4. run python setup.py bdist_wheel, then pip install dist/xxx.whl

Windows

  1. install visual studio 2019 or newer. make sure C++ development component is installed. install CUDA
  2. set powershell script execution policy
  3. start a new powershell, run tools/msvc_setup.ps1
  4. run $Env:SPCONV_DISABLE_JIT = "1"
  5. run pip install pccm cumm wheel
  6. run python setup.py bdist_wheel, then pip install dist/xxx.whl

Citation

If you find this project useful in your research, please consider citing:

@misc{spconv2022,
    title={Spconv: Spatially Sparse Convolution Library},
    author={Spconv Contributors},
    howpublished = {\url{https://github.com/traveller59/spconv}},
    year={2022}
}

Contributors

Note

This work was done while the author was an employee at Tusimple.

LICENSE

Apache 2.0

Download files

Source Distributions

No source distribution files are available for this release.

Built Distributions

spconv_cu126-2.3.8-cp313-cp313-win_amd64.whl (68.8 MB): CPython 3.13, Windows x86-64
spconv_cu126-2.3.8-cp313-cp313-manylinux_2_28_x86_64.whl (70.5 MB): CPython 3.13, manylinux glibc 2.28+ x86-64
spconv_cu126-2.3.8-cp312-cp312-win_amd64.whl (68.8 MB): CPython 3.12, Windows x86-64
spconv_cu126-2.3.8-cp312-cp312-manylinux_2_28_x86_64.whl (70.5 MB): CPython 3.12, manylinux glibc 2.28+ x86-64
spconv_cu126-2.3.8-cp311-cp311-win_amd64.whl (68.8 MB): CPython 3.11, Windows x86-64
spconv_cu126-2.3.8-cp311-cp311-manylinux_2_28_x86_64.whl (70.5 MB): CPython 3.11, manylinux glibc 2.28+ x86-64
spconv_cu126-2.3.8-cp310-cp310-win_amd64.whl (68.8 MB): CPython 3.10, Windows x86-64
spconv_cu126-2.3.8-cp310-cp310-manylinux_2_28_x86_64.whl (70.5 MB): CPython 3.10, manylinux glibc 2.28+ x86-64
spconv_cu126-2.3.8-cp39-cp39-win_amd64.whl (68.8 MB): CPython 3.9, Windows x86-64
spconv_cu126-2.3.8-cp39-cp39-manylinux_2_28_x86_64.whl (70.5 MB): CPython 3.9, manylinux glibc 2.28+ x86-64

File details

Hashes for spconv_cu126-2.3.8-cp313-cp313-win_amd64.whl:
  SHA256: 3fa683e0b30eb565e71745ba433337a42d32cc2d9b6269544217b111dc5b6f69
  MD5: cd67b850db480db7daa667c4675111e1
  BLAKE2b-256: b97f42df26bfe29a3c7fd7d6d1b0a94d40dda315ca1668fb826aee31665b6fb5

Hashes for spconv_cu126-2.3.8-cp313-cp313-manylinux_2_28_x86_64.whl:
  SHA256: b7de4c878a5aadfcdb84e627a07752a06318e8961f7a83afa12a70db718cc7dd
  MD5: f4a6b411ed34f230e6df2b092173838a
  BLAKE2b-256: affdc52d71468849d09b333123f9d0cac27b4a3815a8faecff21ebd66d9c7b45

Hashes for spconv_cu126-2.3.8-cp312-cp312-win_amd64.whl:
  SHA256: e5935a3948c0126d2e15042dc2a0c987d70e66b6198add08885590c78b682f7b
  MD5: a2983236c57d7fad1ecc9e9e0166c4f4
  BLAKE2b-256: f114880b7a50ff2e44f948847706a07d0841aa2e95e64c07bff3f40d0b9b22b6

Hashes for spconv_cu126-2.3.8-cp312-cp312-manylinux_2_28_x86_64.whl:
  SHA256: 88002b375d33561ac956e29e00dade16cad11d0e3413630a8ef35e92ddfda6e1
  MD5: e98fe23022e80fa89060cf3675020b99
  BLAKE2b-256: d2c80b8e710b76b997f8c020dcac4ea0856c3702392464d2cb8ae63510a451fc

Hashes for spconv_cu126-2.3.8-cp311-cp311-win_amd64.whl:
  SHA256: 65d6b974b05123aac9f94da3de7a66600dc1ac7d2d2396bb25cc68d973afda4d
  MD5: 46e721ec1a8dc3eb05c578cfefdc9876
  BLAKE2b-256: 4b548bc22c76f637e261e4cceee87d63cb340a4ef610187132e2ad106b13707f

Hashes for spconv_cu126-2.3.8-cp311-cp311-manylinux_2_28_x86_64.whl:
  SHA256: bd17eacd5d24f1b52f5400692b2f5753ca646f166bc2e8b13c1f45181f3205b9
  MD5: 8b93145ba6e108f6d15cef7ebf165d9f
  BLAKE2b-256: 42e0ef737838e9376a2265320e24897f18fe5d73b2ca3433e195a2f969e90085

Hashes for spconv_cu126-2.3.8-cp310-cp310-win_amd64.whl:
  SHA256: 7ba250523def9cf19ee176d1a9b5d78df4939c20ad4dc1826b300bb334d6eaea
  MD5: e143ca0303c8fbe22268507f4aee6718
  BLAKE2b-256: ae6f77479cbfdf90a15f57f417b4780f330ef64f1ad5900a7cc42f7dd0330969

Hashes for spconv_cu126-2.3.8-cp310-cp310-manylinux_2_28_x86_64.whl:
  SHA256: 16f3744cd697e225b1d6a9f37fd2a4aed7bd6b89dd4c431af5a76c39fda0ab38
  MD5: 46f8a1ecb2ff6dad9408633d89485133
  BLAKE2b-256: 307f5c1c0ba114f4bd50aed095f2695ffd28f292f04463f99ba3a8cfc2f183eb

Hashes for spconv_cu126-2.3.8-cp39-cp39-win_amd64.whl (uploaded via twine/6.0.1, CPython 3.9.13; Trusted Publishing: No):
  SHA256: 88c377e32e5d1bb1edc5fcdc8eac9bc97145dcf9ea6051aa9b919c21bd54959d
  MD5: a239453f2aba5021a85bc586abafa7d9
  BLAKE2b-256: 49aa274971a07f566226abc04c5cf74f7768e355d434eb9480aeb2038f32f546

Hashes for spconv_cu126-2.3.8-cp39-cp39-manylinux_2_28_x86_64.whl:
  SHA256: 33273af27b8b7499aade0f357e2fc1f89a32e3c22422cf7449d663c2ebb63118
  MD5: 77fffe1677214069e894def111651604
  BLAKE2b-256: b73e024b7995998699a9d22ede45b7f17826cc55013befcc2b0662942ba83ca0
