
Point Cloud Deep Learning Extension Library for PyTorch



This is a framework for running common deep learning models for point cloud analysis tasks against classic benchmarks. It relies heavily on PyTorch Geometric and Facebook Hydra.

The framework allows lean yet complex models to be built with minimal effort and great reproducibility.

Torch-Points3d Templates

A secondary repo containing code templates for the PyTorch Lightning and FastAI frameworks.

For PyTorch Lightning, you will find a script that runs a point cloud classifier with several backbones on ModelNet in under 100 lines.

Available Backbones:

  • KPConv
  • PointNet2
  • RSConv

Project structure

├─ benchmark               # Output from various benchmark runs
├─ conf                    # All configurations for training and evaluation live here
├─ notebooks               # A collection of notebooks that allow result exploration and network debugging
├─ docker                  # Docker image that can be used for inference or training
├─ docs                    # All the documentation
├─ eval.py                 # Eval script
├─ find_neighbour_dist.py  # Script to find the optimal #neighbours within neighbour search operations
├─ forward_scripts         # Scripts that run a forward pass on possibly non-annotated data
├─ outputs                 # All outputs from your runs, sorted by date
├─ scripts                 # Some scripts to help manage the project
├─ torch_points3d
│   ├─ core                # Core components
│   ├─ datasets            # All code related to datasets
│   ├─ metrics             # All metrics and trackers
│   ├─ models              # All models
│   ├─ modules             # Basic modules that can be used in a modular way
│   ├─ utils               # Various utils
│   └─ visualization       # Visualization
├─ test
└─ train.py                # Main script to launch a training

As a general philosophy, we have split datasets and models by task. For example, the datasets folder has three subfolders:

  • segmentation
  • classification
  • registration

where each folder contains the datasets related to that task.

Methods currently implemented

Available datasets

Segmentation

  • S3DIS 1x1
  • S3DIS Room
  • S3DIS Fused

Registration

Classification

Getting started

Requirements:

  • CUDA > 10
  • Python 3 + headers (python-dev)
  • Poetry (Optional but highly recommended)

Setup repo

Clone the repo to your local machine

Run the following command from the root of the repo

poetry install --no-root

This will install all required dependencies in a new virtual environment.

Activate it

poetry shell

You can check that the install has been successful by running

python -m unittest -v

Alternatively, you can install the package directly from PyPI:

pip install torch_points3d
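When installing from PyPI there is no test suite to run, so a quick import check is a reasonable sanity test (a minimal sketch; it only verifies that the package and its torch dependency load):

import torch
import torch_points3d

print("torch", torch.__version__)
print("torch_points3d imported successfully")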

Minkowski Engine

The repository supports Minkowski Engine, which requires libopenblas-dev and nvcc if you have a CUDA device on your machine. First, install OpenBLAS:

sudo apt install libopenblas-dev

Then make sure that nvcc is on your PATH:

nvcc -V

If it is not, locate it (locate nvcc) and add its location to your PATH variable. On my machine:

export PATH="/usr/local/cuda-10.2/bin:$PATH"

You are now in a position to install MinkowskiEngine with GPU support:

poetry install -E MinkowskiEngine --no-root
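You can then verify that MinkowskiEngine was built and sees your GPU (a minimal check, assuming a CUDA device is present):

import torch
import MinkowskiEngine as ME

print("MinkowskiEngine", ME.__version__)
print("CUDA available:", torch.cuda.is_available())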

Pycuda

pip install pycuda
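To confirm that PyCUDA can talk to your GPU, a small check like the following can help (illustrative only):

import pycuda.driver as cuda

cuda.init()                                   # initialize the CUDA driver
print("CUDA devices:", cuda.Device.count())   # number of visible devices
print("Device 0:", cuda.Device(0).name())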

Train PointNet++ on the part segmentation task for the ShapeNet dataset:

poetry run python train.py task=segmentation model_type=pointnet2 model_name=pointnet2_charlesssg dataset=shapenet-fixed

And you should see something like this:

[screenshot: training logs]

The config for PointNet++ is a good example of how to define a model, and is as follows:

# PointNet++: Deep Hierarchical Feature Learning on Point Sets in a Metric Space (https://arxiv.org/abs/1706.02413)
# Credit Charles R. Qi: https://github.com/charlesq34/pointnet2/blob/master/models/pointnet2_part_seg_msg_one_hot.py

pointnet2_onehot:
  architecture: pointnet2.PointNet2_D
  conv_type: 'DENSE'
  use_category: True
  down_conv:
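    # npoint: number of centroids sampled at each down-sampling level
    # radii: ball-query radii for the multi-scale grouping at each level
    # nsamples: number of neighbours gathered per ball at each scale
    # FEAT below stands for the input feature dimension and is resolved at runtime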
    module_name: PointNetMSGDown
    npoint: [1024, 256, 64, 16]
    radii: [[0.05, 0.1], [0.1, 0.2], [0.2, 0.4], [0.4, 0.8]]
    nsamples: [[16, 32], [16, 32], [16, 32], [16, 32]]
    down_conv_nn:
      [
        [[FEAT, 16, 16, 32], [FEAT, 32, 32, 64]],
        [[32 + 64, 64, 64, 128], [32 + 64, 64, 96, 128]],
        [[128 + 128, 128, 196, 256], [128 + 128, 128, 196, 256]],
        [[256 + 256, 256, 256, 512], [256 + 256, 256, 384, 512]],
      ]
  up_conv:
    module_name: DenseFPModule
    up_conv_nn:
      [
        [512 + 512 + 256 + 256, 512, 512],
        [512 + 128 + 128, 512, 512],
        [512 + 64 + 32, 256, 256],
        [256 + FEAT, 128, 128],
      ]
    skip: True
  mlp_cls:
    nn: [128, 128]
    dropout: 0.5
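Since the framework is built on Hydra, configs like this are plain OmegaConf objects, and you can load and inspect them directly (a sketch; the filename pointnet2.yaml is illustrative):

from omegaconf import OmegaConf

cfg = OmegaConf.load("pointnet2.yaml")   # illustrative path to the config above
model = cfg.pointnet2_onehot
print(model.architecture)                # pointnet2.PointNet2_D
print(list(model.down_conv.npoint))      # [1024, 256, 64, 16]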

Benchmark

S3DIS 1x1

Model Name          # params    Speed Train / Test          Cross Entropy   OAcc    mIou    mAcc
pointnet2_original  3,026,829   04:29 / 01:07 (RTX 2060)    0.0512          85.26   45.58   73.11

Shapenet part segmentation

The data reported below correspond to the part segmentation problem on ShapeNet for all categories. We report mean instance IoU (ImIou) and mean class IoU (CmIou, the average of the mean instance IoU per class).

Model Name            Use Normals   # params    Speed Train / Test          Cross Entropy   CmIou    ImIou
pointnet2_charlesmsg  Yes           1,733,946   15:07 / 01:20 (K80)         0.089           82.1     85.1
RSCNN_MSG             No            3,488,417   05:40 / 00:24 (RTX 2060)    0.04            82.811   85.3
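The two IoU metrics can be made concrete with a small sketch (the per-shape IoUs and category labels below are toy values, not benchmark data):

import numpy as np

ious = np.array([0.82, 0.91, 0.75, 0.88])                # per-shape instance IoU (toy)
cats = np.array(["Airplane", "Airplane", "Cap", "Cap"])  # category of each shape

imiou = ious.mean()  # ImIou: average over all shapes
# CmIou: average of the per-category mean instance IoUs
cmiou = np.mean([ious[cats == c].mean() for c in np.unique(cats)])
print(f"ImIou={imiou:.3f}  CmIou={cmiou:.3f}")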

Explore your experiments

We provide a notebook based on pyvista and panel that allows you to explore your past experiments visually. When using Jupyter Lab, you will have to install an extension:

jupyter labextension install @pyviz/jupyterlab_pyviz

Run through the notebook and you should see a dashboard starting that looks like the following:

[screenshot: experiment dashboard]
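The dashboard is built on top of pyvista; as a standalone illustration of the kind of point cloud rendering it relies on, here is a minimal synthetic example (not the actual dashboard code):

import numpy as np
import pyvista as pv

points = np.random.rand(1000, 3)       # synthetic point cloud
cloud = pv.PolyData(points)
cloud["height"] = points[:, 2]         # color points by their z coordinate
cloud.plot(render_points_as_spheres=True, point_size=8)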

Inference

Inference script

We provide a script for running a given pre-trained model on custom data that may not be annotated. You will find an example of this for the part segmentation task on ShapeNet. Just like for the rest of the codebase, most of the customization happens through config files, and the provided example can be extended to other datasets; you can also easily create your own from there. Going back to the part segmentation task: say you have a folder full of point clouds that you know are airplanes, and you have the checkpoint of a model trained on airplanes (and potentially other classes). Simply edit config.yaml and shapenet.yaml and run the following command:

python forward_scripts/forward.py

The result of the forward run will be placed in the specified output_folder and you can use the notebook provided to explore the results. Below is an example of the outcome of using a model trained on caps only to find the parts of airplanes and caps.

[screenshot: exploring forward-pass results]

Containerize your model with Docker

Finally, for people interested in deploying their models to production environments, we provide a Dockerfile as well as a build script. Say you have trained a network for semantic segmentation that produced the weights <outputfolder/weights.pt>; the following command will build a Docker image for you:

cd docker
./build.sh outputfolder/weights.pt

You can then use it to run a forward pass on all the point clouds in input_path and generate the results in output_path:

docker run -v /test_data:/in -v /test_data/out:/out pointnet2_charlesssg:latest python3 forward_scripts/forward.py dataset=shapenet data.forward_category=Cap input_path="/in" output_path="/out"

The -v option mounts a local directory into the container's file system. For example, in the command above, /test_data/out will be mounted at /out inside the container. As a consequence, all files written to /out will be available in the folder /test_data/out on your machine.

Profiling

We advise using snakeviz and cProfile.

Use cProfile to profile your code:

poetry run python -m cProfile -o {your_name}.prof train.py ... debugging.profiling=True

And visualize results using snakeviz.

snakeviz {your_name}.prof
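If you prefer to stay in the terminal, the profile can also be inspected with the standard-library pstats module:

import pstats

# Load the cProfile output and print the 10 most expensive calls
stats = pstats.Stats("your_name.prof")   # the file produced by cProfile above
stats.sort_stats("cumulative").print_stats(10)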

It is also possible to use torch.utils.bottleneck

python -m torch.utils.bottleneck /path/to/source/script.py [args]

Troubleshooting

Undefined symbol / Updating Pytorch

When the version of PyTorch used by the project is updated, the compiled packages need to be reinstalled; otherwise you will run into an error that looks like this:

... scatter_cpu.cpython-36m-x86_64-linux-gnu.so: undefined symbol: _ZN3c1012CUDATensorIdEv

This can happen for the following libraries:

  • torch-points
  • torch-scatter
  • torch-cluster
  • torch-sparse

An easy way to fix this is to run the following commands with the virtual env activated:

pip uninstall torch-scatter torch-sparse torch-cluster torch-points-kernels -y
rm -rf ~/.cache/pip
poetry install

Contributing

Contributions are welcome! The only asks are that you stick to the styling and that you add tests as you add more features!

For styling you can use pre-commit hooks to help you:

pre-commit install

A sequence of checks will run for you, and you may have to add the fixed files to the staging area again.

When it comes to docstrings, we use numpy-style docstrings. For those who use Visual Studio Code, there is a great extension that can help with that: install it, set the format to numpy, and you should be good to go!
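For example, a function documented in numpy style looks like this (the function itself is hypothetical):

def voxel_downsample(points, voxel_size):
    """Downsample a point cloud with a regular voxel grid.

    Parameters
    ----------
    points : np.ndarray
        Input points of shape (N, 3).
    voxel_size : float
        Edge length of each voxel, in the same unit as the points.

    Returns
    -------
    np.ndarray
        The downsampled point cloud of shape (M, 3), with M <= N.
    """
    ...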

Finally, if you want to have a direct chat with us, feel free to join our Slack; just shoot us an email and we'll add you.
