
SpConv: Spatially Sparse Convolution Library


| Version | Install command |
| ---------------- | --------------------------- |
| CPU (Linux only) | `pip install spconv` |
| CUDA 10.2 | `pip install spconv-cu102` |
| CUDA 11.3 | `pip install spconv-cu113` |
| CUDA 11.4 | `pip install spconv-cu114` |
| CUDA 11.6 | `pip install spconv-cu116` |
| CUDA 11.7 | `pip install spconv-cu117` |
| CUDA 11.8 | `pip install spconv-cu118` |
| CUDA 12.0 | `pip install spconv-cu120` |

spconv is a project that provides a heavily-optimized sparse convolution implementation with Tensor Core support. Check the benchmark to see how fast spconv 2.x runs.

Spconv 1.x code. We won't provide any support for spconv 1.x since it's deprecated; use spconv 2.x if possible.

Check the spconv 2.x algorithm introduction to understand the sparse convolution algorithm used in spconv 2.x!
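
The core trick can be sketched in a few lines of plain Python: index the active voxel coordinates in a hash table, build a "rulebook" of (kernel offset, input site, output site) pairs, and accumulate features through those pairs instead of sliding a dense window. This is a minimal submanifold-style sketch for scalar features, not spconv's actual implementation (spconv builds the rulebook on GPU and runs gathered GEMMs):

```python
import itertools

def subm_conv2d(coords, feats, weights, ksize=3):
    """coords: list of (y, x) active sites; feats: scalar feature per site;
    weights: dict mapping kernel offset (dy, dx) -> scalar weight.
    Submanifold rule: output sites are exactly the input sites."""
    index = {c: i for i, c in enumerate(coords)}  # coordinate hash table
    out = [0.0] * len(coords)
    r = ksize // 2
    # "Rulebook" loop: for each kernel offset, pair every output site with
    # the input site it reads from, skipping inactive sites entirely.
    for dy, dx in itertools.product(range(-r, r + 1), repeat=2):
        w = weights[(dy, dx)]
        for (y, x), oi in index.items():
            ii = index.get((y + dy, x + dx))
            if ii is not None:
                out[oi] += w * feats[ii]
    return out

# Two active voxels in an otherwise empty grid; identity kernel.
coords = [(0, 0), (0, 1)]
feats = [2.0, 3.0]
weights = {(dy, dx): 0.0 for dy in (-1, 0, 1) for dx in (-1, 0, 1)}
weights[(0, 0)] = 1.0
print(subm_conv2d(coords, feats, weights))  # [2.0, 3.0]
```

Only active sites are touched, which is why sparse convolution scales with the number of occupied voxels rather than the grid volume.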

WARNING

Use spconv >= cu114 if possible. CUDA 11.4 can compile significantly faster kernels in some situations.

To update spconv: you MUST UNINSTALL all spconv/cumm/spconv-cuxxx/cumm-cuxxx packages first. Use pip list | grep spconv and pip list | grep cumm to check all installed packages, then use pip to install the new spconv.
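
The same leftover check can be done from Python; this helper is my own illustration (not a spconv API), equivalent to the pip list | grep commands above:

```python
from importlib.metadata import distributions

def find_spconv_packages():
    """Names of installed spconv/cumm variants (spconv, spconv-cuxxx, cumm, cumm-cuxxx)."""
    hits = []
    for dist in distributions():
        name = (dist.metadata["Name"] or "").lower()
        if name.startswith("spconv") or name.startswith("cumm"):
            hits.append(name)
    return sorted(hits)

# Should print [] before you install a new spconv-cuxxx wheel.
print(find_spconv_packages())
```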

NEWS

  • spconv 2.3: int8 quantization support. See the docs and examples for more details.

  • spconv 2.2: Ampere feature support (by EvernightAurora), pure C++ code generation, NVRTC, dropped Python 3.6 support.

Spconv 2.2 vs Spconv 2.1

  • faster fp16 conv kernels (~5-30%) on Ampere GPUs (tested on an RTX 3090)
  • greatly faster int8 conv kernels (~1.2x-2.7x) on Ampere GPUs (tested on an RTX 3090)
  • dropped Python 3.6 support
  • NVRTC support: kernels for old GPUs are compiled at runtime.
  • libspconv: pure C++ build of all spconv ops. See the example.
  • tf32 kernels for faster fp32 training, disabled by default. Set import spconv as spconv_core; spconv_core.constants.SPCONV_ALLOW_TF32 = True to enable them.
  • all weights now use the KRSC layout; some old models can't be loaded anymore.
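
On the KRSC point: if an old checkpoint stores a dense-conv-style weight of shape (K, C, kD, kH, kW), getting to KRSC order (K, kD, kH, kW, C) is a single permute. Treat the source layout here as an assumption to verify against your own model; this is an illustration, not an official migration recipe:

```python
import numpy as np

# Toy weight: K=2 output channels, C=4 input channels, 3x3x3 kernel,
# stored PyTorch-style as (K, C, kD, kH, kW). The source layout of your
# checkpoint is an assumption -- check it before converting.
w_kcrs = np.arange(2 * 4 * 27, dtype=np.float32).reshape(2, 4, 3, 3, 3)

# KRSC: output channels, then kernel spatial dims, then input channels.
w_krsc = np.transpose(w_kcrs, (0, 2, 3, 4, 1))
print(w_krsc.shape)  # (2, 3, 3, 3, 4)
```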

Spconv 2.1 vs Spconv 1.x

  • spconv can now be installed via pip. See the install section in the README for more details. Users don't need to build manually anymore!
  • Microsoft Windows support (only Windows 10 has been tested).
  • fp32 (not tf32) training/inference speed is increased (+50~80%)
  • fp16 training/inference speed is greatly increased when your layers support Tensor Cores (channel size must be a multiple of 8).
  • int8 ops are ready, but we still need some time to figure out how to run int8 in PyTorch.
  • spconv 2.x no longer depends on the PyTorch binary, but you need at least PyTorch >= 1.5.0 to run it.
  • since spconv 2.x doesn't depend on the PyTorch binary (and never will), it's impossible to support torch.jit/libtorch inference.
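
Because the fp16 tensor-core path needs channel counts that are multiples of 8, it helps to round layer widths up when designing a network. A tiny helper to that end (my own, not part of spconv):

```python
def pad_channels(channels: int, multiple: int = 8) -> int:
    """Round a channel count up to the next multiple so fp16 layers
    can hit tensor cores (channel size must be a multiple of 8)."""
    return ((channels + multiple - 1) // multiple) * multiple

print(pad_channels(13))  # 16
print(pad_channels(64))  # 64
```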

Usage

First, you need to use import spconv.pytorch as spconv in spconv 2.x.

Then see this.

Don't forget to check performance guide.
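
spconv consumes sparse tensors as two arrays: an (N, C) feature matrix and an (N, 1 + ndim) int32 coordinate matrix whose first column is the batch index. A numpy-only sketch of building that pair from a dense voxel grid (the SparseConvTensor call in the comment is indicative only and not executed here):

```python
import numpy as np

# Toy dense grid: (batch, D, H, W, C) with two occupied voxels.
dense = np.zeros((2, 8, 8, 8, 3), dtype=np.float32)
dense[0, 1, 2, 3] = [1.0, 2.0, 3.0]
dense[1, 4, 4, 4] = [4.0, 5.0, 6.0]

mask = dense.any(axis=-1)                     # which voxels are occupied
indices = np.argwhere(mask).astype(np.int32)  # (N, 4): batch, z, y, x
features = dense[mask]                        # (N, 3)

# With spconv + torch installed, this pair becomes a sparse tensor roughly like:
#   x = spconv.SparseConvTensor(torch.from_numpy(features).cuda(),
#                               torch.from_numpy(indices).cuda(),
#                               spatial_shape=[8, 8, 8], batch_size=2)
print(features.shape, indices.shape)  # (2, 3) (2, 4)
```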

Common Solution for Some Bugs

see common problems.

Install

You need Python >= 3.7 to use spconv 2.x.

You need to install the CUDA toolkit before using prebuilt binaries or building from source.

You need at least CUDA 11.0 to build and run spconv 2.x. We won't offer any support for CUDA < 11.0.

Prebuilt

We offer Python 3.7-3.11 and CUDA 10.2/11.3/11.4/11.7/12.0 prebuilt binaries for Linux (manylinux).

We offer Python 3.7-3.11 and CUDA 10.2/11.4/11.7/12.0 prebuilt binaries for Windows 10/11.

For Linux users, you need pip >= 20.3 to install the prebuilt wheels.

WARNING: spconv-cu117 may require CUDA Driver >= 515.

pip install spconv installs the CPU-only build (Linux only). You should only use it for debugging; performance isn't optimized due to manylinux limits (no OpenMP support).

pip install spconv-cu102 for CUDA 10.2

pip install spconv-cu113 for CUDA 11.3 (Linux Only)

pip install spconv-cu114 for CUDA 11.4

pip install spconv-cu117 for CUDA 11.7

pip install spconv-cu120 for CUDA 12.0

NOTE: It's safe to have different minor CUDA versions between the system and conda (PyTorch) for CUDA >= 11.0 because of CUDA Minor Version Compatibility. For example, you can use spconv-cu114 with the anaconda build of PyTorch for CUDA 11.1 on an OS with CUDA 11.2 installed.

NOTE: On Linux, you can install spconv-cuxxx without installing CUDA system-wide! Only a suitable NVIDIA driver is required. For CUDA 11, you need driver >= 450.82; newer CUDA versions may require a newer driver. For CUDA 11.8, you need driver >= 520.

Prebuilt GPU Support Matrix

See this page to check supported GPU names by arch.

If you use a GPU architecture that isn't compiled into the prebuilt binaries, spconv will use NVRTC to compile a slightly slower kernel at runtime.

| CUDA version | GPU arch list |
| ------------ | -------------------------- |
| 11.1~11.7 | 52, 60, 61, 70, 75, 80, 86 |
| 11.8+ | 60, 70, 75, 80, 86, 89, 90 |
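
A small helper (my own, mirroring the table above) to predict whether a given compute capability will hit the prebuilt kernels or fall back to NVRTC:

```python
# Prebuilt SM architectures per CUDA track, as listed in the support matrix.
PREBUILT_ARCHS = {
    "11.1-11.7": {52, 60, 61, 70, 75, 80, 86},
    "11.8+": {60, 70, 75, 80, 86, 89, 90},
}

def uses_nvrtc_fallback(sm: int, cuda_track: str = "11.8+") -> bool:
    """True if this compute capability isn't baked into the prebuilt wheels,
    so spconv would compile a slightly slower kernel via NVRTC at runtime."""
    return sm not in PREBUILT_ARCHS[cuda_track]

print(uses_nvrtc_fallback(89, "11.1-11.7"))  # True: sm_89 (Ada) needs NVRTC on pre-11.8 wheels
print(uses_nvrtc_fallback(89))               # False: sm_89 is prebuilt on 11.8+ wheels
```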

Build from source for development (JIT, recommended)

The C++ code will be rebuilt automatically when you change C++ code in the project.

For NVIDIA embedded platforms, you need to specify the CUDA arch before building: export CUMM_CUDA_ARCH_LIST="7.2" for Xavier, export CUMM_CUDA_ARCH_LIST="6.2" for TX2, and export CUMM_CUDA_ARCH_LIST="8.7" for Orin.

Due to a pyproject limitation (pip can't find an editable-installed cumm), you need to remove cumm from the requires section of pyproject.toml after installing editable cumm and before installing spconv.

You need to ensure pip list | grep spconv and pip list | grep cumm show nothing before installing editable spconv/cumm.

Linux

  1. uninstall spconv and cumm installed by pip
  2. install build-essential, install CUDA
  3. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  4. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  5. in python, import spconv and wait for the build to finish.

Windows

  1. uninstall spconv and cumm installed by pip
  2. install visual studio 2019 or newer. make sure C++ development component is installed. install CUDA
  3. set powershell script execution policy
  4. start a new powershell, run tools/msvc_setup.ps1
  5. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  6. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  7. in python, import spconv and wait for the build to finish.

Build wheel from source (not recommended; this is done in CI)

You need to rebuild cumm first if you are building against a CUDA version that isn't provided in the prebuilts.

Linux

  1. install build-essential, install CUDA
  2. run export SPCONV_DISABLE_JIT="1"
  3. run pip install pccm cumm wheel
  4. run python setup.py bdist_wheel, then pip install dist/xxx.whl

Windows

  1. install visual studio 2019 or newer. make sure C++ development component is installed. install CUDA
  2. set powershell script execution policy
  3. start a new powershell, run tools/msvc_setup.ps1
  4. run $Env:SPCONV_DISABLE_JIT = "1"
  5. run pip install pccm cumm wheel
  6. run python setup.py bdist_wheel, then pip install dist/xxx.whl

Citation

If you find this project useful in your research, please consider citing:

@misc{spconv2022,
    title={Spconv: Spatially Sparse Convolution Library},
    author={Spconv Contributors},
    howpublished = {\url{https://github.com/traveller59/spconv}},
    year={2022}
}

Contributors

Note

This work was done while the author was an employee at TuSimple.

LICENSE

Apache 2.0

Download files

No source distribution files are available for this release.

Built Distributions

| Wheel | Size | Python | Platform |
| ----- | ---- | ------ | -------- |
| spconv_cu118-2.3.8-cp313-cp313-win_amd64.whl | 74.5 MB | CPython 3.13 | Windows x86-64 |
| spconv_cu118-2.3.8-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl | 75.9 MB | CPython 3.13 | manylinux (glibc 2.17+) x86-64 |
| spconv_cu118-2.3.8-cp312-cp312-win_amd64.whl | 74.5 MB | CPython 3.12 | Windows x86-64 |
| spconv_cu118-2.3.8-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl | 75.9 MB | CPython 3.12 | manylinux (glibc 2.17+) x86-64 |
| spconv_cu118-2.3.8-cp311-cp311-win_amd64.whl | 74.5 MB | CPython 3.11 | Windows x86-64 |
| spconv_cu118-2.3.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl | 75.9 MB | CPython 3.11 | manylinux (glibc 2.17+) x86-64 |
| spconv_cu118-2.3.8-cp310-cp310-win_amd64.whl | 74.5 MB | CPython 3.10 | Windows x86-64 |
| spconv_cu118-2.3.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl | 75.9 MB | CPython 3.10 | manylinux (glibc 2.17+) x86-64 |
| spconv_cu118-2.3.8-cp39-cp39-win_amd64.whl | 74.5 MB | CPython 3.9 | Windows x86-64 |
| spconv_cu118-2.3.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl | 75.9 MB | CPython 3.9 | manylinux (glibc 2.17+) x86-64 |

File hashes

spconv_cu118-2.3.8-cp313-cp313-win_amd64.whl
  SHA256: 91452eb131782bcb9b74720dc9c3ea2e5e9f76dd81dc54c790cb17e0cf019b96
  MD5: e876625b4309cb154ef26d2d8a5c934f
  BLAKE2b-256: 3dafea9f1cc2acc9b0cb6e32313b051798797272d7e746e181cb432a46483a20

spconv_cu118-2.3.8-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256: a634ec81a402e60942d9fdc6b53fd510524fe5c832cbc30967bfd5df70e6022c
  MD5: 2744f3ffaea41dede557fffb51998618
  BLAKE2b-256: 663c557c60f9cbfa774839c7931af5d4dc7a5a174ca8f7acf05394d21789ffba

spconv_cu118-2.3.8-cp312-cp312-win_amd64.whl
  SHA256: 51c12effb7e64eb7487716e2e4d7bf7ce06a7605479f0a44b70d6dd20493c397
  MD5: 2e84b3166e1f9e2300a3e895bcf43863
  BLAKE2b-256: 6735d73143bb7ead81ecb2a11d8934b845b3f99ff39cd3784d5820cdab199357

spconv_cu118-2.3.8-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256: 40a1779ef90b6ecd6085073eec656b99a9e02d10c09784e9483d73d01dd3f3c5
  MD5: af76dd14ad95e6a47fa94f4db68c80af
  BLAKE2b-256: 5c1d5f084d806860c843348ccfcc12526b7419baab1b58fbb78504583d698f47

spconv_cu118-2.3.8-cp311-cp311-win_amd64.whl
  SHA256: f0b6df0b9d2aa88fd9efb1a45aec130591bcb0ce04ee529ff075a32f0acd2b28
  MD5: 5cbf51c3f5f1e3e7401038525e6e7047
  BLAKE2b-256: 71974d591572870cf6d892d861ad65626d9c67984628c6c793aa2d1b01c1bee6

spconv_cu118-2.3.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256: c15891ce1a3983266eaf21b16cb727da1d58910fdb4556a9f5de7bf4621faf33
  MD5: a8619a2c3fab25b30946e9296f12a99f
  BLAKE2b-256: 00c8e7e4986bf86e61ed166b1a235d8bdbacfdba4c5ea95759cdc001ab614877

spconv_cu118-2.3.8-cp310-cp310-win_amd64.whl
  SHA256: f1dcc8195b87f99913511b76b91e40d429b69f34b73e036072af3aed4f350bb2
  MD5: 7fb1d9fe4a152864fa26c62ce80ec6fa
  BLAKE2b-256: 3bc6e20b7234de19367e7b3ca0398d19d64bbfd6ebe2e228986531259234ae24

spconv_cu118-2.3.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256: d32f056457773522b54fefcd3380684c92b531fc7f4fb303d623eee8c24d4e54
  MD5: a00581f6291051f3745ec15e3f810f93
  BLAKE2b-256: f0436bc7940e9d6b582b729c1abe8b4987e9652e555016bc38868f8cd69d30e0

spconv_cu118-2.3.8-cp39-cp39-win_amd64.whl (uploaded via twine/6.0.1 on CPython 3.9.13; Trusted Publishing: No)
  SHA256: c21dafc90a0b00a58017c3afaa41be12e1f6013127c3bc6c242f80e21319e1cb
  MD5: 4650686b54af7e8e08f7b1e1ec77af9d
  BLAKE2b-256: 0723f7d4885ec47ef91c6906aced92a5a73824d63cc1dc6f98861a7761a31b4b

spconv_cu118-2.3.8-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256: 5c94a4f527f271fc7f07922422fb11ec9803c51a56042f6e1ad412fe67197d3f
  MD5: cfd0780cd26f079c5e90b6acfd6531a1
  BLAKE2b-256: 40ea1ae5aa5e8c3bbbddb1f1340496a44735a8e54fcc97b496ec7a08fe900657
