
SpConv: Spatially Sparse Convolution Library


PyPI Install

  • CPU (Linux only): pip install spconv
  • CUDA 10.2: pip install spconv-cu102
  • CUDA 11.3: pip install spconv-cu113
  • CUDA 11.4: pip install spconv-cu114
  • CUDA 11.6: pip install spconv-cu116
  • CUDA 11.7: pip install spconv-cu117
  • CUDA 11.8*: pip install spconv-cu118

*: sm_89 and sm_90 are added in CUDA 11.8. If you use an RTX 4090 or H100, you should use this version.

spconv is a project that provides a heavily optimized sparse convolution implementation with Tensor Core support. Check the benchmark to see how fast spconv 2.x runs.

Spconv 1.x code is still available, but we won't provide any support for spconv 1.x since it's deprecated. Use spconv 2.x if possible.

Check the spconv 2.x algorithm introduction to understand the sparse convolution algorithm used in spconv 2.x!

WARNING

Use spconv >= cu114 if possible. CUDA 11.4 can compile significantly faster kernels in some situations.

Updating spconv: you MUST UNINSTALL all spconv/cumm/spconv-cuxxx/cumm-cuxxx packages first. Use pip list | grep spconv and pip list | grep cumm to check all installed packages, then use pip to install the new spconv.
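The pre-upgrade check above can be scripted with only the standard library. A minimal sketch (the helper name is ours, not part of spconv; it mirrors the pip list | grep checks):

```python
from importlib import metadata


def installed_spconv_dists():
    """Return installed distribution names starting with 'spconv' or 'cumm'.

    Every name this returns should be uninstalled before installing a new
    spconv build. Illustrative helper, not part of spconv.
    """
    found = []
    for dist in metadata.distributions():
        name = (dist.metadata["Name"] or "").lower()
        if name.startswith("spconv") or name.startswith("cumm"):
            found.append(name)
    return sorted(found)
```

An empty result means it's safe to pip install the new spconv-cuxxx package.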

NEWS

  • spconv 2.3: int8 quantization support. See the docs and examples for more details.

  • spconv 2.2: Ampere feature support (by EvernightAurora), pure C++ code generation, NVRTC, dropped Python 3.6.

Spconv 2.2 vs Spconv 2.1

  • faster fp16 conv kernels (~5-30%) on Ampere GPUs (tested on an RTX 3090)
  • greatly faster int8 conv kernels (~1.2x-2.7x) on Ampere GPUs (tested on an RTX 3090)
  • dropped Python 3.6 support
  • NVRTC support: kernels for old GPUs are compiled at runtime.
  • libspconv: pure C++ build of all spconv ops. See the example.
  • TF32 kernels for faster fp32 training, disabled by default. Run import spconv as spconv_core; spconv_core.constants.SPCONV_ALLOW_TF32 = True to enable them.
  • all weights use the KRSC layout; some old models can't be loaded anymore.
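The TF32 switch mentioned above is typically set once, right after import and before building your network. A config-style snippet (assumes an installed spconv 2.2+ build):

```python
import spconv as spconv_core

# TF32 kernels are disabled by default; this enables them for fp32 convs.
spconv_core.constants.SPCONV_ALLOW_TF32 = True
```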

Spconv 2.1 vs Spconv 1.x

  • spconv can now be installed by pip. See the install section in the readme for more details. Users don't need to build manually anymore!
  • Microsoft Windows support (only Windows 10 has been tested).
  • fp32 (not tf32) training/inference speed is increased (+50~80%).
  • fp16 training/inference speed is greatly increased when your layers support Tensor Cores (channel size must be a multiple of 8).
  • int8 ops are ready, but we still need some time to figure out how to run int8 in pytorch.
  • spconv 2.x doesn't depend on the pytorch binary, but you need at least pytorch >= 1.5.0 to run it.
  • since spconv 2.x doesn't depend on the pytorch binary (and never will), it's impossible to support torch.jit/libtorch inference.
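The Tensor Core condition above is easy to check programmatically. A minimal sketch (the helper name is ours, not spconv API):

```python
def tensor_core_friendly(in_channels: int, out_channels: int) -> bool:
    """Per the note above, fp16 layers can use Tensor Cores when both
    channel counts are multiples of 8. Illustrative check, not part of spconv.
    """
    return in_channels % 8 == 0 and out_channels % 8 == 0


# e.g. a 32 -> 64 conv qualifies for Tensor Cores; a 30 -> 64 conv does not
```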

Usage

First, you need to use import spconv.pytorch as spconv in spconv 2.x.

Then see this.
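A minimal end-to-end sketch of the 2.x API, based on the example in the spconv docs. Channel sizes and the spatial shape are arbitrary, and an installed spconv-cuxxx build plus a CUDA device are assumed:

```python
import torch
import spconv.pytorch as spconv


class ExampleNet(torch.nn.Module):
    def __init__(self, spatial_shape):
        super().__init__()
        self.shape = spatial_shape  # e.g. [40, 1600, 1408] for (z, y, x)
        self.net = spconv.SparseSequential(
            spconv.SubMConv3d(3, 32, 3, indice_key="subm0"),  # submanifold conv
            torch.nn.ReLU(),
            spconv.SparseConv3d(32, 64, 3, stride=2),  # regular (downsampling) conv
            torch.nn.ReLU(),
        )

    def forward(self, features, coords, batch_size):
        # features: [N, 3] float tensor; coords: [N, 4] int32 (batch_idx, z, y, x)
        x = spconv.SparseConvTensor(features, coords, self.shape, batch_size)
        # returns a SparseConvTensor; call .dense() to get a dense NCDHW tensor
        return self.net(x)
```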

Don't forget to check performance guide.

Common Solution for Some Bugs

see common problems.

Install

You need to install python >= 3.7 first to use spconv 2.x.

You need to install the CUDA toolkit before using prebuilt binaries or building from source.

You need at least CUDA 11.0 to build and run spconv 2.x. We won't offer any support for CUDA < 11.0.

Prebuilt

We offer python 3.7-3.11 and cuda 10.2/11.3/11.4/11.7/12.0 prebuilt binaries for linux (manylinux).

We offer python 3.7-3.11 and cuda 10.2/11.4/11.7/12.0 prebuilt binaries for windows 10/11.

For Linux users, you need pip >= 20.3 to install the prebuilt binaries.

WARNING: spconv-cu117 may require CUDA Driver >= 515.

pip install spconv for CPU only (Linux only). You should only use this for debugging; the performance isn't optimized due to manylinux limits (no OpenMP support).

pip install spconv-cu102 for CUDA 10.2

pip install spconv-cu113 for CUDA 11.3 (Linux Only)

pip install spconv-cu114 for CUDA 11.4

pip install spconv-cu117 for CUDA 11.7

pip install spconv-cu120 for CUDA 12.0

NOTE: It's safe to have different minor CUDA versions between the system and conda (pytorch) for CUDA >= 11.0 because of CUDA Minor Version Compatibility. For example, you can use spconv-cu114 with the anaconda version of pytorch built for CUDA 11.1 in an OS with CUDA 11.2 installed.

NOTE: On Linux, you can install spconv-cuxxx without installing CUDA system-wide! Only a suitable NVIDIA driver is required. For CUDA 11, you need driver >= 450.82; you may need a newer driver for newer CUDA versions. For CUDA 11.8, you need driver >= 520 installed.

Prebuilt GPU Support Matrix

See this page to check supported GPU names by arch.

If you use a GPU architecture that isn't compiled into the prebuilt binaries, spconv will use NVRTC to compile a slightly slower kernel.

CUDA version GPU Arch List
11.1~11.7 52,60,61,70,75,80,86
11.8+ 60,70,75,80,86,89,90
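The table above can be read as: if your GPU's SM arch isn't in the list for your spconv build, the kernel is NVRTC-compiled at runtime instead. A tiny lookup sketch (the names are ours; the arch lists come from the table):

```python
# SM archs baked into the prebuilt wheels, from the table above
PREBUILT_ARCHS = {
    "11.1-11.7": {52, 60, 61, 70, 75, 80, 86},
    "11.8+": {60, 70, 75, 80, 86, 89, 90},
}


def uses_nvrtc_fallback(cuda_build: str, sm_arch: int) -> bool:
    """True if spconv would NVRTC-compile a (slightly slower) kernel at
    runtime because sm_arch isn't precompiled in this build.
    Illustrative helper, not spconv API.
    """
    return sm_arch not in PREBUILT_ARCHS[cuda_build]


# e.g. an RTX 4090 (sm_89) on a cu117 wheel falls back to NVRTC,
# while on a cu118+ wheel it uses a precompiled kernel
```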

Build from source for development (JIT, recommend)

The C++ code will be rebuilt automatically when you change C++ code in the project.

For NVIDIA embedded platforms, you need to specify the CUDA arch before building: export CUMM_CUDA_ARCH_LIST="7.2" for Xavier, export CUMM_CUDA_ARCH_LIST="6.2" for TX2, and export CUMM_CUDA_ARCH_LIST="8.7" for Orin.

You need to remove cumm from the requires section in pyproject.toml after installing editable cumm and before installing spconv, due to a pyproject limitation (editable-installed cumm can't be found).

You need to ensure pip list | grep spconv and pip list | grep cumm show nothing before installing editable spconv/cumm.

Linux

  1. uninstall the spconv and cumm packages installed by pip
  2. install build-essential and CUDA
  3. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  4. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  5. in python, import spconv and wait for the build to finish.

Windows

  1. uninstall the spconv and cumm packages installed by pip
  2. install Visual Studio 2019 or newer (make sure the C++ development component is installed), then install CUDA
  3. set the powershell script execution policy
  4. start a new powershell and run tools/msvc_setup.ps1
  5. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  6. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  7. in python, import spconv and wait for the build to finish.

Build wheel from source (not recommended; this is done in CI)

You need to rebuild cumm first if you are building against a CUDA version that isn't provided in the prebuilt binaries.

Linux

  1. install build-essential and CUDA
  2. run export SPCONV_DISABLE_JIT="1"
  3. run pip install pccm cumm wheel
  4. run python setup.py bdist_wheel, then pip install dist/xxx.whl

Windows

  1. install Visual Studio 2019 or newer (make sure the C++ development component is installed), then install CUDA
  2. set the powershell script execution policy
  3. start a new powershell and run tools/msvc_setup.ps1
  4. run $Env:SPCONV_DISABLE_JIT = "1"
  5. run pip install pccm cumm wheel
  6. run python setup.py bdist_wheel, then pip install dist/xxx.whl

Citation

If you find this project useful in your research, please consider citing:

@misc{spconv2022,
    title={Spconv: Spatially Sparse Convolution Library},
    author={Spconv Contributors},
    howpublished = {\url{https://github.com/traveller59/spconv}},
    year={2022}
}

Contributors

Note

This work was done while the author was an employee at TuSimple.

LICENSE

Apache 2.0

Download files

No source distribution files are available for this release.

Built Distributions

spconv_cu117 2.3.2 wheels are provided for CPython 3.7-3.11, in two flavors per Python version:

  • win_amd64 (Windows x86-64, ~66.0 MB)
  • manylinux_2_17_x86_64.manylinux2014_x86_64 (glibc 2.17+ x86-64, ~67.3 MB)

