
Octree-based Sparse Convolutional Neural Networks

Project description

O-CNN

Documentation


This repository contains the pure PyTorch-based implementation of O-CNN. The code has been tested with PyTorch>=1.6.0; PyTorch>=1.9.0 is preferred.

O-CNN is an octree-based sparse convolutional neural network framework for 3D deep learning. For efficiency, O-CNN constrains CNN storage and computation to non-empty sparse voxels and uses the octree data structure to organize and index these sparse voxels.
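To make the sparse-voxel idea concrete, the following is a minimal conceptual sketch, not the ocnn API: only occupied voxel coordinates and their features are stored, and a 3x3x3 convolution gathers contributions from occupied neighbors while empty voxels are skipped. All names here are hypothetical and chosen purely for illustration.

import torch

# Hypothetical illustration: features are stored only for occupied voxels.
# coords: (N, 3) integer voxel coordinates; feats: (N, C) per-voxel features.
coords = torch.tensor([[0, 0, 0], [1, 0, 0], [4, 2, 7]])
feats = torch.randn(coords.shape[0], 8)

# Index the occupied voxels by coordinate. O-CNN uses an octree for this step,
# while hash-based frameworks use a hash table.
index = {tuple(c.tolist()): i for i, c in enumerate(coords)}

def sparse_conv_at(i, weight):
    """Gather the 3x3x3 neighborhood of voxel i; empty voxels contribute zero."""
    out = torch.zeros(weight.shape[-1])
    cx, cy, cz = coords[i].tolist()
    k = 0
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                j = index.get((cx + dx, cy + dy, cz + dz))
                if j is not None:  # skip empty voxels entirely
                    out += feats[j] @ weight[k]
                k += 1
    return out

weight = torch.randn(27, 8, 16)  # one 8->16 weight matrix per kernel offset
y = sparse_conv_at(0, weight)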

The concept of sparse convolution in O-CNN is the same as in H-CNN, SparseConvNet, and MinkowskiNet. The key difference is that O-CNN uses an octree to index the sparse voxels, whereas these three frameworks use hash tables.
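To illustrate the indexing difference, the sketch below orders voxels by an interleaved-bit (Morton-style) key, which is one way an octree level can be laid out and searched, and contrasts it with a plain hash-map lookup. This is a conceptual comparison under simplifying assumptions, not code from any of the frameworks above.

import bisect

def morton_key(x, y, z, depth=8):
    """Interleave the bits of (x, y, z) into a single octree shuffle key."""
    key = 0
    for b in range(depth):
        key |= ((x >> b) & 1) << (3 * b)
        key |= ((y >> b) & 1) << (3 * b + 1)
        key |= ((z >> b) & 1) << (3 * b + 2)
    return key

coords = [(0, 0, 0), (1, 0, 0), (4, 2, 7)]

# Octree-style indexing: sort voxels by their shuffle keys, then locate a
# voxel with a binary search over the sorted key array.
keys = sorted(morton_key(*c) for c in coords)
pos = bisect.bisect_left(keys, morton_key(4, 2, 7))
found = pos < len(keys) and keys[pos] == morton_key(4, 2, 7)

# Hash-table-style indexing: map the coordinate directly to its row index.
table = {c: i for i, c in enumerate(coords)}
row = table.get((4, 2, 7))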

Our O-CNN was published at SIGGRAPH 2017, H-CNN in TVCG 2018, SparseConvNet at CVPR 2018, and MinkowskiNet at CVPR 2019. O-CNN was submitted to SIGGRAPH at the end of 2016 and officially accepted in March 2017; the camera-ready version was submitted in April 2017. We simply did not post the paper on arXiv during the SIGGRAPH review process. The idea of constraining CNN computation to sparse, non-empty voxels was therefore first proposed by O-CNN. This type of 3D convolution is now commonly known as sparse convolution in the research community.

Key benefits of ocnn-pytorch

  • Simplicity. ocnn-pytorch is based on pure PyTorch; it is portable and can be installed with a single command: pip install ocnn (a minimal usage sketch follows this list). Other sparse convolution frameworks rely heavily on C++ and CUDA, and configuring their build environments can be complicated.

  • Efficiency. ocnn-pytorch is very efficient compared with other sparse convolution frameworks. It takes only 18 hours to train the network on ScanNet for 600 epochs with 4 V100 GPUs; under the same training settings, MinkowskiNet 0.4.3 takes 60 hours and MinkowskiNet 0.5.4 takes 30 hours.
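The snippet below is a hedged sketch of a first experiment after pip install ocnn, based on the ocnn-pytorch documentation; the class names and signatures used here (ocnn.octree.Points, ocnn.octree.Octree, octree.build_octree, octree.construct_all_neigh, octree.get_input_feature, ocnn.nn.OctreeConv) are assumptions that should be verified against the installed version and the documentation linked above.

import torch
import ocnn

# Build an octree from a random point cloud in [-1, 1]^3.
# Names follow the ocnn-pytorch docs; verify signatures before use.
points = ocnn.octree.Points(
    points=torch.rand(2048, 3) * 2 - 1,
    normals=torch.nn.functional.normalize(torch.randn(2048, 3), dim=1))
octree = ocnn.octree.Octree(depth=5, full_depth=2)
octree.build_octree(points)
octree.construct_all_neigh()

# A single octree-based sparse convolution at the finest depth.
conv = ocnn.nn.OctreeConv(in_channels=3, out_channels=32, kernel_size=[3])
feature = octree.get_input_feature(feature='N')  # per-octant normal features
out = conv(feature, octree, depth=5)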

Citation

@article {Wang-2017-ocnn,
  title    = {{O-CNN}: Octree-based Convolutional Neural Networks for {3D} Shape Analysis},
  author   = {Wang, Peng-Shuai and Liu, Yang and Guo, Yu-Xiao and Sun, Chun-Yu and Tong, Xin},
  journal  = {ACM Transactions on Graphics (SIGGRAPH)},
  volume   = {36},
  number   = {4},
  year     = {2017},
}

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ocnn-2.2.5.tar.gz (36.1 kB)

Uploaded Source

Built Distribution

ocnn-2.2.5-py3-none-any.whl (53.1 kB)

Uploaded Python 3

File details

Details for the file ocnn-2.2.5.tar.gz.

File metadata

  • Download URL: ocnn-2.2.5.tar.gz
  • Upload date:
  • Size: 36.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.19

File hashes

Hashes for ocnn-2.2.5.tar.gz
Algorithm Hash digest
SHA256 1f28ffed6aacd3b5c19dd8b8427818f5cbafc0f75e642e8de9674dea9696fc95
MD5 594e976774781e5ca490f5a3e723f7f2
BLAKE2b-256 43ddff07d19a9dd64d64136e61a5560ee9ca5be1133ef3e83c18cd33211a2ded

See more details on using hashes here.
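To check a downloaded archive against the SHA256 digest above, the standard-library hashlib suffices; this sketch assumes ocnn-2.2.5.tar.gz is in the current directory.

import hashlib

# Expected SHA256 digest of the source distribution, copied from the table above.
expected = "1f28ffed6aacd3b5c19dd8b8427818f5cbafc0f75e642e8de9674dea9696fc95"

with open("ocnn-2.2.5.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch")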

File details

Details for the file ocnn-2.2.5-py3-none-any.whl.

File metadata

  • Download URL: ocnn-2.2.5-py3-none-any.whl
  • Upload date:
  • Size: 53.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.19

File hashes

Hashes for ocnn-2.2.5-py3-none-any.whl
Algorithm Hash digest
SHA256 bdf6ea5a846f78a4622d562f4d8ce91a367ecf7d977c2b24c4a20399c7570289
MD5 9f3ea310f931191247a297f1a886a32e
BLAKE2b-256 bff36637cdac27fee85e5f8aad78224aac5453bf7aad64cabe07fa1dcc12c693

See more details on using hashes here.
