
SpConv: Spatially Sparse Convolution Library


PyPI Install

CPU (Linux only)   pip install spconv
CUDA 10.2          pip install spconv-cu102
CUDA 11.3          pip install spconv-cu113
CUDA 11.4          pip install spconv-cu114
CUDA 11.6          pip install spconv-cu116
CUDA 11.7          pip install spconv-cu117
CUDA 11.8          pip install spconv-cu118
CUDA 12.0          pip install spconv-cu120

spconv provides a heavily optimized sparse convolution implementation with tensor core support. Check the benchmark to see how fast spconv 2.x runs.

Spconv 1.x code. We won't provide any support for spconv 1.x since it's deprecated; use spconv 2.x if possible.

Check the spconv 2.x algorithm introduction to understand the sparse convolution algorithm used in spconv 2.x!
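
The core idea behind sparse convolution can be sketched in a few lines of plain Python: for each kernel offset, collect (input, output) index pairs over the active sites, then accumulate weighted features. This is an illustrative toy, not spconv's actual implementation (spconv builds index pairs once and runs batched GEMMs on GPU); all names here are hypothetical.

```python
# Toy submanifold sparse convolution: output sites equal input sites,
# and each kernel offset contributes weight[offset] @ input_feature
# from the input site at coord + offset (if it is active).

def submanifold_conv(voxels, weights, offsets):
    """voxels: dict mapping coordinate tuples to feature lists.
    weights: dict mapping each kernel offset tuple to a Cout x Cin
    matrix (list of lists). offsets: list of kernel offset tuples."""
    c_out = len(next(iter(weights.values())))
    out = {coord: [0.0] * c_out for coord in voxels}
    for off in offsets:
        w = weights[off]
        for coord in out:
            # input site that this offset reads from
            src = tuple(c + o for c, o in zip(coord, off))
            if src in voxels:  # skip empty (inactive) sites
                feat = voxels[src]
                for i in range(c_out):
                    out[coord][i] += sum(w[i][j] * feat[j]
                                         for j in range(len(feat)))
    return out

# 1D example: two adjacent active sites, kernel size 3, Cin = Cout = 1
voxels = {(0,): [1.0], (1,): [2.0]}
weights = {(-1,): [[1.0]], (0,): [[10.0]], (1,): [[100.0]]}
out = submanifold_conv(voxels, weights, [(-1,), (0,), (1,)])
# out[(0,)] -> [210.0], out[(1,)] -> [21.0]
```

Roughly speaking, spconv replaces the inner loops above with precomputed "indice pairs" per kernel offset plus gather-GEMM-scatter kernels, which is where the tensor core support comes in.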

WARNING

Use spconv >= cu114 if possible. CUDA 11.4 can compile significantly faster kernels in some situations.

Update Spconv: you MUST UNINSTALL all spconv/cumm/spconv-cuxxx/cumm-cuxxx packages first. Use pip list | grep spconv and pip list | grep cumm to check all installed packages, then use pip to install the new spconv.
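
The uninstall rule above can be turned into a quick check. This is a hypothetical helper (not part of spconv); the package-name prefixes are the ones listed in the rule.

```python
# Hypothetical helper: given a list of installed distribution names,
# return the spconv/cumm variants that must be uninstalled before
# installing a new spconv.

def conflicting_installs(installed_names):
    prefixes = ("spconv", "cumm")
    return sorted(
        name for name in installed_names
        if name in prefixes                              # spconv, cumm
        or any(name.startswith(p + "-cu") for p in prefixes)  # -cuxxx
    )

# Usage against the live environment (requires Python >= 3.8):
# from importlib import metadata
# names = [d.metadata["Name"] for d in metadata.distributions()]
# print(conflicting_installs(names))
```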

NEWS

  • spconv 2.3: int8 quantization support. See docs and examples for more details.

  • spconv 2.2: Ampere feature support (by EvernightAurora), pure C++ code generation, NVRTC, dropped Python 3.6 support

Spconv 2.2 vs Spconv 2.1

  • faster fp16 conv kernels (~5-30%) on Ampere GPUs (tested on RTX 3090)
  • greatly faster int8 conv kernels (~1.2x-2.7x) on Ampere GPUs (tested on RTX 3090)
  • dropped Python 3.6 support
  • NVRTC support: kernels for old GPUs are compiled at runtime.
  • libspconv: pure C++ build of all spconv ops. see example
  • TF32 kernels for faster fp32 training, disabled by default. Run import spconv as spconv_core; spconv_core.constants.SPCONV_ALLOW_TF32 = True to enable them.
  • all weights are in KRSC layout, so some old models can't be loaded anymore.
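
On the KRSC point: K is the output-channel dimension, R/S the kernel's spatial dimensions, and C the input channels. A checkpoint saved in an older layout has to be permuted before loading. The sketch below assumes the old layout was KCRS (the analogue of PyTorch's OIHW; your actual old layout may differ) and uses nested lists in place of tensors, purely for illustration.

```python
# Illustrative conversion of a 2D conv weight from an assumed KCRS
# layout to the KRSC layout spconv 2.2+ expects.

def kcrs_to_krsc(w):
    """w: nested lists with shape (K, C, R, S); returns (K, R, S, C)."""
    K = len(w); C = len(w[0]); R = len(w[0][0]); S = len(w[0][0][0])
    return [[[[w[k][c][r][s] for c in range(C)]  # channels move last
              for s in range(S)]
             for r in range(R)]
            for k in range(K)]

# tiny example: K=1, C=2, R=1, S=1
w_kcrs = [[[[1.0]], [[2.0]]]]
w_krsc = kcrs_to_krsc(w_kcrs)  # -> [[[[1.0, 2.0]]]]
```

With a real tensor library this is a single permute; for the 2D KCRS case that would be weight.permute(0, 2, 3, 1).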

Spconv 2.1 vs Spconv 1.x

  • spconv can now be installed by pip; see the install section in the readme for more details. Users don't need to build manually anymore!
  • Microsoft Windows support (only Windows 10 has been tested).
  • fp32 (not tf32) training/inference speed is increased (+50~80%)
  • fp16 training/inference speed is greatly increased when your layers support tensor cores (channel size must be a multiple of 8).
  • int8 ops are ready, but we still need some time to figure out how to run int8 in pytorch.
  • spconv 2.x doesn't depend on the pytorch binary, but you need at least pytorch >= 1.5.0 to run it.
  • since spconv 2.x doesn't depend on the pytorch binary (and never will), it's impossible to support torch.jit/libtorch inference.
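
The tensor-core note above requires channel sizes to be a multiple of 8 for the fp16 speedup. A hypothetical helper (not part of spconv) for rounding a channel count up when designing layers:

```python
# Round a channel count up to the next multiple of 8 so fp16 layers
# can use tensor cores (per the note above).

def pad_channels(c, multiple=8):
    """Smallest multiple of `multiple` that is >= c."""
    return ((c + multiple - 1) // multiple) * multiple

# e.g. a 30-channel layer would be widened to 32 channels
```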

Usage

First, you need to use import spconv.pytorch as spconv in spconv 2.x.

Then see this.

Don't forget to check performance guide.

Common Solution for Some Bugs

see common problems.

Install

You need Python >= 3.7 to use spconv 2.x.

You need to install the CUDA toolkit before using prebuilt binaries or building from source.

You need at least CUDA 11.0 to build and run spconv 2.x. We won't offer any support for CUDA < 11.0.

Prebuilt

We offer python 3.7-3.11 and cuda 10.2/11.3/11.4/11.7/12.0 prebuilt binaries for linux (manylinux).

We offer python 3.7-3.11 and cuda 10.2/11.4/11.7/12.0 prebuilt binaries for windows 10/11.

For Linux users, you need pip >= 20.3 to install the prebuilt wheels.

WARNING: spconv-cu117 may require CUDA Driver >= 515.

pip install spconv for CPU only (Linux only). Use this only for debugging; performance isn't optimized due to manylinux limits (no OpenMP support).

pip install spconv-cu102 for CUDA 10.2

pip install spconv-cu113 for CUDA 11.3 (Linux Only)

pip install spconv-cu114 for CUDA 11.4

pip install spconv-cu117 for CUDA 11.7

pip install spconv-cu120 for CUDA 12.0

NOTE It's safe to have different minor CUDA versions between the system and conda (pytorch) for CUDA >= 11.0 because of CUDA Minor Version Compatibility. For example, you can use spconv-cu114 with the anaconda build of pytorch for CUDA 11.1 on an OS with CUDA 11.2 installed.

NOTE On Linux, you can install spconv-cuxxx without installing CUDA on the system; only a suitable NVIDIA driver is required. For CUDA 11, you need driver >= 450.82. You may need a newer driver if you use a newer CUDA; for CUDA 11.8, you need driver >= 520.
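
The driver requirements stated in this README can be collected into a small lookup. Only the pairs actually stated here are included (CUDA 11 baseline 450.82, the cu117 warning's 515, and CUDA 11.8's 520); anything else returns None rather than a guess.

```python
# Minimum Linux driver versions as stated in this README; other CUDA
# versions are deliberately unknown (None), not guessed.

MIN_DRIVER = {
    "11.0": "450.82",  # baseline for CUDA 11
    "11.7": "515",     # spconv-cu117 warning
    "11.8": "520",
}

def min_driver_for(cuda_version):
    """Return the minimum driver stated in the README, or None."""
    return MIN_DRIVER.get(cuda_version)
```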

Prebuilt GPU Support Matrix

See this page to check supported GPU names by arch.

If you use a GPU architecture that isn't compiled in prebuilt, spconv will use NVRTC to compile a slightly slower kernel.

CUDA version   GPU arch list
11.1~11.7      52, 60, 61, 70, 75, 80, 86
11.8+          60, 70, 75, 80, 86, 89, 90
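
The matrix above can be read as a lookup: if your GPU's compute capability is in the set for your CUDA range, the prebuilt kernels apply; otherwise spconv falls back to NVRTC as described above. A small sketch (the string keys are just labels for the table rows):

```python
# Prebuilt arch lists from the support matrix above. in_prebuilt
# returns True when a compute capability (e.g. 86 for sm_86) ships
# precompiled in that wheel, False when NVRTC would be used instead.

PREBUILT_ARCHS = {
    "11.1~11.7": {52, 60, 61, 70, 75, 80, 86},
    "11.8+": {60, 70, 75, 80, 86, 89, 90},
}

def in_prebuilt(cuda_range, arch):
    return arch in PREBUILT_ARCHS.get(cuda_range, set())
```

For example, sm_89 (Ada) is only precompiled in the 11.8+ wheels; on a cu117 wheel it would be NVRTC-compiled at runtime.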

Build from source for development (JIT, recommended)

The C++ code will be rebuilt automatically when you change C++ code in the project.

For NVIDIA embedded platforms, you need to specify the CUDA arch before building: export CUMM_CUDA_ARCH_LIST="7.2" for Xavier, export CUMM_CUDA_ARCH_LIST="6.2" for TX2, export CUMM_CUDA_ARCH_LIST="8.7" for Orin.

Due to a pyproject limitation (it can't find an editable-installed cumm), you need to remove cumm from the requires section in pyproject.toml after installing editable cumm and before installing spconv.

You need to ensure pip list | grep spconv and pip list | grep cumm show nothing before installing editable spconv/cumm.

Linux

  1. uninstall spconv and cumm installed by pip
  2. install build-essential, install CUDA
  3. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  4. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  5. in python, import spconv and wait for the build to finish.

Windows

  1. uninstall spconv and cumm installed by pip
  2. install visual studio 2019 or newer. make sure C++ development component is installed. install CUDA
  3. set powershell script execution policy
  4. start a new powershell, run tools/msvc_setup.ps1
  5. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  6. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  7. in python, import spconv and wait for the build to finish.

Build wheel from source (not recommended; this is done in CI)

You need to rebuild cumm first if you are building against a CUDA version that isn't provided in the prebuilts.

Linux

  1. install build-essential, install CUDA
  2. run export SPCONV_DISABLE_JIT="1"
  3. run pip install pccm cumm wheel
  4. run python setup.py bdist_wheel, then pip install dist/xxx.whl

Windows

  1. install visual studio 2019 or newer. make sure C++ development component is installed. install CUDA
  2. set powershell script execution policy
  3. start a new powershell, run tools/msvc_setup.ps1
  4. run $Env:SPCONV_DISABLE_JIT = "1"
  5. run pip install pccm cumm wheel
  6. run python setup.py bdist_wheel, then pip install dist/xxx.whl

Citation

If you find this project useful in your research, please consider citing:

@misc{spconv2022,
    title={Spconv: Spatially Sparse Convolution Library},
    author={Spconv Contributors},
    howpublished = {\url{https://github.com/traveller59/spconv}},
    year={2022}
}

Contributors

Note

This work was done while the author was an employee at TuSimple.

LICENSE

Apache 2.0



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

spconv_cu117-2.3.5-cp311-cp311-win_amd64.whl (66.3 MB): CPython 3.11, Windows x86-64

spconv_cu117-2.3.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (67.6 MB): CPython 3.11, manylinux (glibc 2.17+), x86-64

spconv_cu117-2.3.5-cp310-cp310-win_amd64.whl (66.3 MB): CPython 3.10, Windows x86-64

spconv_cu117-2.3.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (67.6 MB): CPython 3.10, manylinux (glibc 2.17+), x86-64

spconv_cu117-2.3.5-cp39-cp39-win_amd64.whl (66.3 MB): CPython 3.9, Windows x86-64

spconv_cu117-2.3.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (67.6 MB): CPython 3.9, manylinux (glibc 2.17+), x86-64

spconv_cu117-2.3.5-cp38-cp38-win_amd64.whl (66.3 MB): CPython 3.8, Windows x86-64

spconv_cu117-2.3.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (67.6 MB): CPython 3.8, manylinux (glibc 2.17+), x86-64

spconv_cu117-2.3.5-cp37-cp37m-win_amd64.whl (66.3 MB): CPython 3.7m, Windows x86-64

spconv_cu117-2.3.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (67.6 MB): CPython 3.7m, manylinux (glibc 2.17+), x86-64

File details

Hashes for each file (SHA256, MD5, BLAKE2b-256):

spconv_cu117-2.3.5-cp311-cp311-win_amd64.whl
  SHA256       40dbb432d23e973091f9ab86787e533870bd4d6260cb74555493f60ceb43c553
  MD5          99be615d72c75650e340ba9e77040072
  BLAKE2b-256  0fa8f13c970393a6f957ffa7c541a19f8fbc80b065182e1ece5a4e4caee49b71

spconv_cu117-2.3.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256       442cd935051dc5b7d7d9bd9c3bd01701a967f04d75c70e8cf3df3dbb8034dd68
  MD5          3493ea485557a9c85f24e287573a9081
  BLAKE2b-256  6642148aa99edbb4931a7ed730ff4485c5dedbb1d9b64358d3af25d72c83aad2

spconv_cu117-2.3.5-cp310-cp310-win_amd64.whl
  SHA256       889ee104064955bee18c07005a797b4c8ea2aa819650fe5482d556c08ca69f27
  MD5          0ce292f084a504b0d12f03417e6006a7
  BLAKE2b-256  22e049e50a04fe0b459896c9e979aab16601207368a92d15552e634fe3ef2ca8

spconv_cu117-2.3.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256       a34183ef873c1f2da7b555e14df08ba1b318d63e6ec210df232f9f22c5cc161b
  MD5          41e1ef3d22fc30dad231642554872e02
  BLAKE2b-256  aca0da07296efffa303bbc4827bb56bae1db226c6d473bbd27abec8ae6c396d3

spconv_cu117-2.3.5-cp39-cp39-win_amd64.whl
  SHA256       3f026513f9177d2d32f2f6552edcd7b736edb4d802683af7663ca32cfb5502d6
  MD5          1e14965d21a979a2d8ecc0ed5399ef7a
  BLAKE2b-256  d92ea7d2c6976857d75a46154e73c7bec137d6bd0364a0df89a3d82a99206d71

spconv_cu117-2.3.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256       380d7c5f1b0f9a5fe0792524d94fd8b98d8d50e807efa4bbfd61da051c6ec9cf
  MD5          f45d5736c603ac6a34a99453f5f83ebc
  BLAKE2b-256  2d4b54229449e9bdcaa86e421c2c5fceebd965544e6a3580856a628d28dca258

spconv_cu117-2.3.5-cp38-cp38-win_amd64.whl
  SHA256       e7ab7b03709504974c482973daf9055e56e1aebb0a3c3602be41119b6ea5b5aa
  MD5          d1fd0997a94bc5ad444d8f8635f6efed
  BLAKE2b-256  d4554b1b3167651670199d4a59544ea073897ee0cb1e4d3ad1d1ede1c41e13ca

spconv_cu117-2.3.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256       8096f6bcea551931cecccb9e981234c95f5e6a6bb329bc4901ab1a2ee2a3e45b
  MD5          b40577c435d4682255b5f563e0cb8aba
  BLAKE2b-256  e73e8076b26781a6118356231a40a0c90aac76f24efd1115ddebbfe270ed1e2a

spconv_cu117-2.3.5-cp37-cp37m-win_amd64.whl
  SHA256       da91687e6656b7da67b8205a94262cc106540edb33897f2436f7d584afe5dd92
  MD5          81b7856081f1259701aef1f8edcba290
  BLAKE2b-256  b8e813d9c8438dc2965896f6ed4b8167e1a75bd2a739689e36bc769cac21f5c1

spconv_cu117-2.3.5-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
  SHA256       bf6921a64aefcca3addfcdd8a036d1b879e8088a3b11661998dd799b5af0973e
  MD5          aa45eed3673346557a8ee4fc394e8f53
  BLAKE2b-256  84c38920b36029840b8d81bd86298aa69755474e48601830026d2293eb117aa7

See more details on using hashes here.
