spatial sparse convolution

Project description

SpConv: Spatially Sparse Convolution Library

CPU (Linux only): pip install spconv
CUDA 10.2: pip install spconv-cu102
CUDA 11.1: pip install spconv-cu111
CUDA 11.3 (Linux only): pip install spconv-cu113
CUDA 11.4: pip install spconv-cu114

spconv is a project that provides a heavily optimized sparse convolution implementation with tensor core support. Check the benchmark to see how fast spconv 2.x runs.

Spconv 1.x code is still available, but we won't provide any support for spconv 1.x since it's deprecated; use spconv 2.x if possible.

Check the spconv 2.x algorithm introduction to understand the sparse convolution algorithm used in spconv 2.x.

WARNING Users of spconv < 2.1.18 need to upgrade to 2.1.18: it fixes a bug in conv weight initialization that caused the std of initialized weights to be too large, and a bug in PointToVoxel.

Breaking changes in Spconv 2.x

Spconv 1.x users NEED TO READ THIS before using spconv 2.x.

Spconv 2.1 vs Spconv 1.x

  • spconv can now be installed by pip; see the install section in the README for more details. Users don't need to build manually anymore!
  • Microsoft Windows support (only Windows 10 has been tested).
  • fp32 (not tf32) training/inference speed is increased (+50~80%).
  • fp16 training/inference speed is greatly increased when your layers support tensor cores (channel size must be a multiple of 8); see the sketch after this list.
  • int8 ops are ready, but we still need some time to figure out how to run int8 in pytorch.
  • spconv 2.x doesn't depend on the pytorch binary, but you need at least pytorch >= 1.5.0 to run it.
  • since spconv 2.x doesn't depend on the pytorch binary (and never will), it's impossible to support torch.jit/libtorch inference.
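
One common way to hit the tensor-core path mentioned above is to keep channel counts as multiples of 8 and run the network and inputs in half precision. The sketch below is illustrative only; the layer choices, shapes, and random input are assumptions, not a prescribed recipe:

    import torch
    import spconv.pytorch as spconv

    # channel sizes (32 -> 64) are multiples of 8, so fp16 kernels can use tensor cores
    net = spconv.SparseSequential(
        spconv.SubMConv3d(32, 64, kernel_size=3, indice_key="subm0"),
        spconv.SparseConv3d(64, 64, kernel_size=3, stride=2),
    ).cuda().half()

    features = torch.randn(2000, 32, device="cuda").half()  # [N, C] fp16 voxel features
    indices = torch.randint(0, 32, (2000, 4), device="cuda", dtype=torch.int32)
    indices[:, 0] = 0  # single sample in the batch; real data should have unique coordinates
    x = spconv.SparseConvTensor(features, indices, spatial_shape=[32, 32, 32], batch_size=1)
    out = net(x)  # out.features is fp16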

Spconv 2.x Development and Roadmap

Spconv 2.2 development has started. See this issue for more details.

See the dev plan. A complete guide to spconv development will be released soon.

Usage

First, you need to use import spconv.pytorch as spconv in spconv 2.x.

Then see this.

Don't forget to check the performance guide.
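
A minimal usage sketch is shown below. The layer stack, shapes, and random input are illustrative assumptions; see the linked documentation for real examples:

    import torch
    import spconv.pytorch as spconv

    # a tiny submanifold + strided sparse conv stack
    net = spconv.SparseSequential(
        spconv.SubMConv3d(16, 32, kernel_size=3, indice_key="subm0"),
        torch.nn.ReLU(),
        spconv.SparseConv3d(32, 32, kernel_size=3, stride=2),
    ).cuda()

    # N active voxels: features are [N, C_in], indices are [N, 4] int32 rows of (batch_idx, z, y, x)
    features = torch.randn(1000, 16, device="cuda")
    indices = torch.randint(0, 64, (1000, 4), device="cuda", dtype=torch.int32)
    indices[:, 0] = 0  # single sample in the batch; real data should have unique coordinates

    x = spconv.SparseConvTensor(features, indices, spatial_shape=[64, 64, 64], batch_size=1)
    out = net(x)         # out is another SparseConvTensor
    dense = out.dense()  # convert to a dense NCDHW torch tensor if needed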

Install

You need to install Python >= 3.6 (>= 3.7 for Windows) first to use spconv 2.x.

You need to install the CUDA toolkit before using prebuilt binaries or building from source.

You need at least CUDA 10.2 to build and run spconv 2.x. We won't offer any support for CUDA < 10.2.

Prebuilt

We offer Python 3.6-3.10 and CUDA 10.2/11.1/11.3/11.4 prebuilt binaries for Linux (manylinux).

We offer Python 3.7-3.10 and CUDA 10.2/11.1/11.4 prebuilt binaries for Windows 10/11.

We provide prebuilts for the CUDA versions supported by the latest pytorch release. For example, pytorch 1.10 provides CUDA 10.2 and 11.3 prebuilts, so we provide them too.

For Linux users, you need pip >= 20.3 to install the prebuilt binaries.

CUDA 11.1 will be removed in spconv 2.2 because pytorch 1.10 doesn't provide prebuilts for it.

pip install spconv for CPU only (Linux only). You should only use this for debugging; performance isn't optimized due to manylinux limitations (no OpenMP support).

pip install spconv-cu102 for CUDA 10.2

pip install spconv-cu111 for CUDA 11.1

pip install spconv-cu113 for CUDA 11.3 (Linux Only)

pip install spconv-cu114 for CUDA 11.4

NOTE For CUDA >= 11.0, it's safe to have a different minor CUDA version between the system and conda (pytorch) because of CUDA Minor Version Compatibility. For example, you can use spconv-cu114 with the anaconda version of pytorch built for CUDA 11.1 on an OS with CUDA 11.2 installed.

For CUDA 10, we don't know whether spconv-cu102 works with CUDA 10.0 and 10.1; users can give it a try.

NOTE On Linux, you can install spconv-cuxxx without installing CUDA on the system; only a suitable NVIDIA driver is required. For CUDA 11, you need driver >= 450.82.

Prebuilt GPU Support Matrix

See this page to check the GPU names supported by each arch.

CUDA version  GPU arch list
10.2          50,52,60,61,70,75
11.x          52,60,61,70,75,80,86
12.x          60,61,70,75,80,86,90
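
To decide which prebuilt fits your machine, you can query the CUDA version your pytorch build uses and your GPU's compute capability, then compare against the table above. A small sketch using standard pytorch calls (requires a CUDA build of pytorch):

    import torch

    print(torch.version.cuda)  # CUDA version pytorch was built with, e.g. "11.1" (None for CPU builds)
    major, minor = torch.cuda.get_device_capability(0)
    print(f"GPU arch: {major}{minor}")  # e.g. 86 -> pick a prebuilt whose arch list contains 86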

Build from source for development (JIT, recommended)

The C++ code will be built automatically when you change C++ code in the project.

For NVIDIA embedded platforms, you need to specify the CUDA arch before building: export CUMM_CUDA_ARCH_LIST="7.2" for Xavier, export CUMM_CUDA_ARCH_LIST="6.2" for TX2, and export CUMM_CUDA_ARCH_LIST="8.7" for Orin.

You need to remove cumm from the requires section in pyproject.toml after installing editable cumm and before installing spconv, due to a pyproject limitation (it can't find an editable-installed cumm).

You need to ensure that pip list | grep spconv and pip list | grep cumm show nothing before installing editable spconv/cumm.

Linux

  1. uninstall the spconv and cumm installed by pip
  2. install build-essential, install CUDA
  3. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  4. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  5. in Python, import spconv and wait for the build to finish (a smoke test is sketched below).
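
Step 5 just means importing the package from Python; the first import compiles the C++/CUDA code, which can take a while. A minimal smoke test for either platform (the layer and shapes are illustrative):

    import torch
    import spconv.pytorch as spconv  # the first import triggers the JIT build

    # tiny forward pass to confirm the build works
    conv = spconv.SubMConv3d(4, 8, kernel_size=3, indice_key="subm0").cuda()
    features = torch.randn(10, 4, device="cuda")
    indices = torch.zeros(10, 4, dtype=torch.int32, device="cuda")
    indices[:, 1:] = torch.arange(10, device="cuda", dtype=torch.int32).unsqueeze(1)  # distinct coords
    x = spconv.SparseConvTensor(features, indices, spatial_shape=[16, 16, 16], batch_size=1)
    print(conv(x).features.shape)  # expect torch.Size([10, 8])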

Windows

  1. uninstall the spconv and cumm installed by pip
  2. install Visual Studio 2019 or newer. Make sure the C++ development component is installed. Install CUDA
  3. set the PowerShell script execution policy
  4. start a new PowerShell and run tools/msvc_setup.ps1
  5. git clone https://github.com/FindDefinition/cumm, cd ./cumm, pip install -e .
  6. git clone https://github.com/traveller59/spconv, cd ./spconv, pip install -e .
  7. in Python, import spconv and wait for the build to finish.

Build wheel from source (not recommended; this is done in CI)

You need to rebuild cumm first if you are building against a CUDA version that is not provided in the prebuilts.

Linux

  1. install build-essential, install CUDA
  2. run export SPCONV_DISABLE_JIT="1"
  3. run pip install pccm cumm wheel
  4. run python setup.py bdist_wheel, then pip install dists/xxx.whl

Windows

  1. install Visual Studio 2019 or newer. Make sure the C++ development component is installed. Install CUDA
  2. set the PowerShell script execution policy
  3. start a new PowerShell and run tools/msvc_setup.ps1
  4. run $Env:SPCONV_DISABLE_JIT = "1"
  5. run pip install pccm cumm wheel
  6. run python setup.py bdist_wheel, then pip install dists/xxx.whl

Known issues

  • Spconv 2.x fp16 runs slowly on A100.

Note

This work was done while the author was an employee at Tusimple.

LICENSE

Apache 2.0

Download files

Source Distributions

No source distribution files are available for this release.

Built Distributions

spconv_cu111-2.1.25-cp310-cp310-win_amd64.whl (33.6 MB): CPython 3.10, Windows x86-64
spconv_cu111-2.1.25-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.2 MB): CPython 3.10, manylinux (glibc 2.17+), x86-64
spconv_cu111-2.1.25-cp39-cp39-win_amd64.whl (33.6 MB): CPython 3.9, Windows x86-64
spconv_cu111-2.1.25-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.2 MB): CPython 3.9, manylinux (glibc 2.17+), x86-64
spconv_cu111-2.1.25-cp38-cp38-win_amd64.whl (33.6 MB): CPython 3.8, Windows x86-64
spconv_cu111-2.1.25-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.2 MB): CPython 3.8, manylinux (glibc 2.17+), x86-64
spconv_cu111-2.1.25-cp37-cp37m-win_amd64.whl (33.6 MB): CPython 3.7m, Windows x86-64
spconv_cu111-2.1.25-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.2 MB): CPython 3.7m, manylinux (glibc 2.17+), x86-64
spconv_cu111-2.1.25-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (34.2 MB): CPython 3.6m, manylinux (glibc 2.17+), x86-64

File details

Hashes for spconv_cu111-2.1.25-cp310-cp310-win_amd64.whl
SHA256 d7df51fe783887ea8b95c32b1bc4ab95d642bde12bf9fc7867ec68b3aa1b572c
MD5 f57fcec78c6e730141569c014408d26a
BLAKE2b-256 39e5c935776147fb752bff0690f890d6c285cc6ef117940b18ddbef4f983cfe6

Hashes for spconv_cu111-2.1.25-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
SHA256 7332d6df8dcd131f343baeacd6b1be2b2cb56da268d332451a4e098cc513d25b
MD5 ba8b4ae45170f3f51777f99215d6e1dd
BLAKE2b-256 e0c6882fdc8910fbf3932d221106f3b2b4faf204822b9912d76cf222faef422d

Hashes for spconv_cu111-2.1.25-cp39-cp39-win_amd64.whl
SHA256 0777735fbc90e96d89227b003792dd7d0d2ba47f3a1b343030c3af8facdce4c9
MD5 7aebb6c7f74676c4ae6f05f6b92483f9
BLAKE2b-256 6182e6ecbbd303964346edeed8a4186619b73fffa32a72ad9c331579a62921e1

Hashes for spconv_cu111-2.1.25-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
SHA256 5519bfbeeb97d50c351eda1650bae4bb54088b293ddb7f92805873a5e5825936
MD5 e588876665638531d49ac8f9bc3bc46d
BLAKE2b-256 730d7e301266a92514950b860ab1ca0f2ff227f2815dc32fc9b7079b062ed656

Hashes for spconv_cu111-2.1.25-cp38-cp38-win_amd64.whl
SHA256 e976917cd545738ff861631e35746fb06cf83d18b68a9edc1b1b6a99c0d8dd95
MD5 f9c8813623731da6e88863b4b745be21
BLAKE2b-256 0e1b5f01291098e3e9fd876b023f59c6cc3b3802ab073f07a7eb0186aa385341

Hashes for spconv_cu111-2.1.25-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
SHA256 7895c72333cabedc76691d0f2c64b3588e4746dd817873d41b6be08da8936f68
MD5 79af2d8b617f426bde908f0bef14c590
BLAKE2b-256 bce2d82e85317ed4d5ea1b3590a17a30f8f151c673a7f0bb8928bf2fa3a58e6d

Hashes for spconv_cu111-2.1.25-cp37-cp37m-win_amd64.whl
SHA256 6421a33ccc2e4a9030a16833196bcdea502ca160d8fe89450dd35549965d1b64
MD5 6890136057c83b570a5dde6206e51ec7
BLAKE2b-256 0312b89cd42bd8cf684f05dc4d848bc63ee34bdacfedbb7689b7c55fddc7e33d

Hashes for spconv_cu111-2.1.25-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
SHA256 40b3b2eaeaa4297804bfee61906f00e41ec8db1fc152f19834bb2fecb2f2dde8
MD5 63d37c6e555cd1a2c4f92b4f8fa7453b
BLAKE2b-256 f403833567767a23fbece6f2930617727bbc9f8d6dbefd0000792af17d1c6124

Hashes for spconv_cu111-2.1.25-cp36-cp36m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl
SHA256 35909cd30618c864d96a964de3d1b0cfb8754b82f55aa6b8a301e6875090dd07
MD5 ed9fd565f272217f9853221c6466bb7b
BLAKE2b-256 2580667c76001b1fde074cc6aade6092b3db3217f77561d9edf7cfab991f6f85
