
spatial sparse convolution

Project description

SpConv: Spatially Sparse Convolution Library

PyPI Install

Package                  Install command
CPU (Linux Only)         pip install spconv
CUDA 10.2                pip install spconv-cu102
CUDA 11.3 (Linux Only)   pip install spconv-cu113
CUDA 11.4                pip install spconv-cu114
CUDA 11.7                pip install spconv-cu117

spconv is a project that provides a heavily-optimized sparse convolution implementation with tensor core support. Check the benchmark to see how fast spconv 2.x runs.

Spconv 1.x code is still available, but we won't provide any support for spconv 1.x since it's deprecated. Use spconv 2.x if possible.

Check the spconv 2.x algorithm introduction to understand the sparse convolution algorithm used in spconv 2.x!

WARNING

Use spconv >= cu114 if possible. CUDA 11.4 can compile greatly faster kernels in some situations.

NEWS

  • spconv 2.2: Ampere feature support (by EvernightAurora), pure C++ code generation, NVRTC support, Python 3.6 dropped

Spconv 2.2 vs Spconv 2.1

  • faster fp16 conv kernels (~5-30%) on Ampere GPUs (tested on an RTX 3090)
  • greatly faster int8 conv kernels (1.2x~2.7x) on Ampere GPUs (tested on an RTX 3090)
  • Python 3.6 support dropped
  • NVRTC support: kernels for older GPUs are compiled at runtime.
  • libspconv: pure C++ build of all spconv ops. See the example.
  • tf32 kernels for faster fp32 training, disabled by default. Set spconv.constants.SPCONV_ALLOW_TF32 = True to enable them (see the snippet after this list).
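
For reference, the tf32 switch from the last bullet looks like this as runnable code (the spconv_core alias is just the name used in the note above):

```python
# Enable TF32 kernels for faster fp32 training/inference (disabled by default).
import spconv as spconv_core

spconv_core.constants.SPCONV_ALLOW_TF32 = True
```

TF32 only takes effect on GPUs that support it (Ampere or newer).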

Spconv 2.1 vs Spconv 1.x

  • spconv can now be installed via pip. See the install section in the readme for more details. Users don't need to build manually anymore!
  • Microsoft Windows support (only Windows 10 has been tested).
  • fp32 (not tf32) training/inference speed is increased (+50~80%).
  • fp16 training/inference speed is greatly increased when your layers support tensor cores (channel size must be a multiple of 8).
  • int8 ops are ready, but we still need some time to figure out how to run int8 in pytorch.
  • spconv 2.x doesn't depend on the pytorch binary, but you may need at least pytorch >= 1.5.0 to run it.
  • since spconv 2.x doesn't depend on the pytorch binary (and never will), it's impossible to support torch.jit/libtorch inference.

Usage

First, you need to use import spconv.pytorch as spconv in spconv 2.x.

Then see the usage documentation.

Don't forget to check the performance guide.
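
For a quick orientation, here is a minimal sketch of a small sparse conv module in the spconv 2.x pytorch API; the layer sizes, tensor shapes, and the ExampleNet name are illustrative, not part of the library:

```python
import torch.nn as nn
import spconv.pytorch as spconv

class ExampleNet(nn.Module):
    def __init__(self, shape):
        super().__init__()
        self.shape = shape  # spatial shape [D, H, W] of the sparse tensor
        # Channel sizes that are multiples of 8 let fp16 layers use tensor cores.
        self.net = spconv.SparseSequential(
            spconv.SubMConv3d(32, 64, 3, indice_key="subm0"),  # submanifold conv
            nn.ReLU(),
            spconv.SparseConv3d(64, 64, 3, stride=2),  # regular (downsampling) sparse conv
        )

    def forward(self, features, indices, batch_size):
        # features: [N, 32] float tensor; indices: [N, 4] int32 tensor of (batch, z, y, x).
        x = spconv.SparseConvTensor(features, indices, self.shape, batch_size)
        out = self.net(x)
        return out.dense()  # convert back to a dense NCDHW tensor
```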

Common Solution for Some Bugs

See common problems.

Install

You need to install python >= 3.7 first to use spconv 2.x.

You need to install the CUDA toolkit before using prebuilt binaries or building from source.

You need at least CUDA 11.0 to build and run spconv 2.x. We won't offer any support for CUDA < 11.0.

Prebuilt

We offer python 3.7-3.11 and cuda 10.2/11.3/11.4/11.7/12.0 prebuilt binaries for linux (manylinux).

We offer python 3.7-3.11 and cuda 10.2/11.4/11.7/12.0 prebuilt binaries for windows 10/11.

For Linux users, you need to install pip >= 20.3 first to install the prebuilt wheels.

WARNING: spconv-cu117 may require CUDA Driver >= 515.

pip install spconv for CPU only (Linux Only). You should only use this for debugging; performance isn't optimized due to a manylinux limitation (no OpenMP support).

pip install spconv-cu102 for CUDA 10.2

pip install spconv-cu113 for CUDA 11.3 (Linux Only)

pip install spconv-cu114 for CUDA 11.4

pip install spconv-cu117 for CUDA 11.7

pip install spconv-cu120 for CUDA 12.0

NOTE It's safe to have different minor CUDA versions between the system and conda (pytorch) for CUDA >= 11.0 because of CUDA Minor Version Compatibility. For example, you can use spconv-cu114 with the anaconda build of pytorch for CUDA 11.1 on an OS with CUDA 11.2 installed.

NOTE On Linux, you can install spconv-cuxxx without installing CUDA on the system! Only a suitable NVIDIA driver is required. For CUDA 11, driver >= 450.82 is needed.
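
If you're not sure which CUDA build of pytorch you have (and therefore which spconv-cuxxx package to pick), one quick check is pytorch's torch.version.cuda attribute:

```python
import torch

# Prints the CUDA version pytorch was built with, e.g. "11.7".
# Pick the spconv-cuxxx package matching this (or a compatible minor) version.
print(torch.version.cuda)
```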

Prebuilt GPU Support Matrix

See this page to check supported GPU names by arch.

If you use a GPU architecture that isn't compiled into the prebuilt wheels, spconv will use NVRTC to compile a slightly slower kernel.

CUDA version   GPU Arch List
11.x           52, 60, 61, 70, 75, 80, 86
12.x           70, 75, 80, 86, 90
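
To find the arch number of your own GPU (to compare against the table above), one option is pytorch's device capability query; device index 0 below assumes a single-GPU machine:

```python
import torch

# Returns (major, minor) compute capability, e.g. (8, 6) on an RTX 3090 -> arch 86.
major, minor = torch.cuda.get_device_capability(0)
print(f"GPU arch: {major}{minor}")
```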

Build from source for development (JIT, recommend)

The C++ code will be rebuilt automatically when you change C++ code in the project.

For NVIDIA embedded platforms, you need to specify the CUDA arch before building: export CUMM_CUDA_ARCH_LIST="7.2" for Xavier, export CUMM_CUDA_ARCH_LIST="6.2" for TX2, and export CUMM_CUDA_ARCH_LIST="8.7" for Orin.

You need to remove cumm from the requires section in pyproject.toml after installing editable cumm and before installing spconv, due to a pyproject limitation (it can't find an editable-installed cumm).

You need to ensure that pip list | grep spconv and pip list | grep cumm show nothing before installing editable spconv/cumm.

Linux

  1. uninstall spconv and cumm installed by pip
  2. install build-essential, install CUDA
  3. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  4. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  5. in python, import spconv and wait for the build to finish.

Windows

  1. uninstall spconv and cumm installed by pip
  2. install Visual Studio 2019 or newer. Make sure the C++ development component is installed. Install CUDA.
  3. set powershell script execution policy
  4. start a new powershell, run tools/msvc_setup.ps1
  5. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  6. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  7. in python, import spconv and wait for the build to finish.

Build wheel from source (not recommended; this is done in CI)

You need to rebuild cumm first if you are building against a CUDA version that is not provided in the prebuilts.

Linux

  1. install build-essential, install CUDA
  2. run export SPCONV_DISABLE_JIT="1"
  3. run pip install pccm cumm wheel
  4. run python setup.py bdist_wheel, then pip install dists/xxx.whl

Windows

  1. install Visual Studio 2019 or newer. Make sure the C++ development component is installed. Install CUDA.
  2. set powershell script execution policy
  3. start a new powershell, run tools/msvc_setup.ps1
  4. run $Env:SPCONV_DISABLE_JIT = "1"
  5. run pip install pccm cumm wheel
  6. run python setup.py bdist_wheel, then pip install dists/xxx.whl

Contributors

Note

This work was done while the author was an employee at Tusimple.

LICENSE

Apache 2.0

Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions

File                                                                            Python        Platform                        Size
spconv_cu117-2.2.2-cp311-cp311-win_amd64.whl                                    CPython 3.11  Windows x86-64                  66.2 MB
spconv_cu117-2.2.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl   CPython 3.11  manylinux: glibc 2.17+ x86-64   67.6 MB
spconv_cu117-2.2.2-cp310-cp310-win_amd64.whl                                    CPython 3.10  Windows x86-64                  66.2 MB
spconv_cu117-2.2.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl   CPython 3.10  manylinux: glibc 2.17+ x86-64   67.6 MB
spconv_cu117-2.2.2-cp39-cp39-win_amd64.whl                                      CPython 3.9   Windows x86-64                  66.2 MB
spconv_cu117-2.2.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl     CPython 3.9   manylinux: glibc 2.17+ x86-64   67.6 MB
spconv_cu117-2.2.2-cp38-cp38-win_amd64.whl                                      CPython 3.8   Windows x86-64                  66.2 MB
spconv_cu117-2.2.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl     CPython 3.8   manylinux: glibc 2.17+ x86-64   67.6 MB
spconv_cu117-2.2.2-cp37-cp37m-win_amd64.whl                                     CPython 3.7m  Windows x86-64                  66.2 MB
spconv_cu117-2.2.2-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl    CPython 3.7m  manylinux: glibc 2.17+ x86-64   67.6 MB

