
A Python tool for 3D adaptive binary space partitioning and beyond

Project description



Introduction

abspy is a Python tool for 3D adaptive binary space partitioning and beyond: an ambient 3D space is recursively partitioned into non-overlapping convexes with pre-detected planar primitives, while the adjacency graph is updated dynamically. It was implemented initially for surface reconstruction, but can be extended to other applications as well.

[figure: space partitioning]

An exact kernel from SageMath is used for robust Boolean spatial operations. This rational-based representation helps avoid degenerate cases that may otherwise introduce inconsistencies into the geometry.
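The benefit of exact arithmetic can be sketched with Python's built-in fractions module (used here only as a stand-in for illustration; abspy itself relies on SageMath's rational kernel, not fractions):

```python
from fractions import Fraction

# With floating point, rounding can flip a geometric predicate:
a = 0.1 + 0.2
print(a == 0.3)  # False: a is 0.30000000000000004

# With exact rationals, the same comparison is reliable:
b = Fraction(1, 10) + Fraction(2, 10)
print(b == Fraction(3, 10))  # True
```

Predicates like "does this point lie on this plane?" behave the same way: evaluated in floats they can misclassify near-degenerate inputs, while exact rationals always give a consistent answer.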

Installation

Install requirements

All dependencies except for SageMath can be easily installed with PyPI:

pip install -r requirements.txt

Optionally, install trimesh and pyglet for benchmarking and visualisation, respectively:

pip install trimesh pyglet

Install SageMath

For Linux and macOS users, the easiest way is to install from conda-forge:

conda config --add channels conda-forge
conda install sage

Alternatively, you can use mamba for faster parsing and package installation:

conda config --add channels conda-forge
conda install mamba
mamba install sage

For Windows users, you may have to build SageMath from source or install all other dependencies into a pre-built SageMath environment.

Install abspy

abspy can be found and installed via PyPI:

pip install abspy

Quick start

Here is an example of loading a point cloud in VertexGroup (.vg), partitioning the ambient space into candidate convexes, creating the adjacency graph and extracting the outer surface of the object. For the data structure of a .vg file, please refer to VertexGroup.

import random

import numpy as np
from abspy import VertexGroup, AdjacencyGraph, CellComplex

# load a point cloud in VertexGroup 
vertex_group = VertexGroup(filepath='points.vg')

# normalise the point cloud
vertex_group.normalise_to_centroid_and_scale()

# retrieve planes, bounds and points from VertexGroup
planes, bounds, points = np.array(vertex_group.planes), np.array(vertex_group.bounds), np.array(vertex_group.points_grouped, dtype=object)

# additional planes to append (e.g., the bounding planes)
additional_planes = [[0, 0, 1, -bounds[:, 0, 2].min()]]

# initialise CellComplex from planar primitives
cell_complex = CellComplex(planes, bounds, points, build_graph=True, additional_planes=additional_planes)

# refine planar primitives
cell_complex.refine_planes()

# prioritise certain planes
cell_complex.prioritise_planes()

# construct CellComplex 
cell_complex.construct()

# print info on the cell complex
cell_complex.print_info()

# visualise the cell complex (only if trimesh installation is found)
cell_complex.visualise()

# build adjacency graph of the cell complex
graph = AdjacencyGraph(cell_complex.graph)

# apply random weights (could instead be the predicted probability
# for each convex being selected as composing the object in practice)
weights_list = np.array([random.random() for _ in range(cell_complex.num_cells)])
weights_list *= cell_complex.volumes(multiplier=10e5)
weights_dict = graph.to_dict(weights_list)

# assign weights to n-links and st-links to the graph
graph.assign_weights_to_n_links(cell_complex.cells, attribute='area_overlap', factor=0.1, cache_interfaces=True)
graph.assign_weights_to_st_links(weights_dict)

# perform graph-cut
_, _ = graph.cut()

# save surface model to an obj file
graph.save_surface_obj('surface.obj', engine='rendering')

Misc

  • Why adaptive?

Adaptive space partitioning can significantly reduce the computation needed to create the cell complex, compared to an exhaustive partitioning strategy. An excessive number of cells not only hinders computation but also tends to produce defective surfaces on subtle structures, where inaccurate labels are more likely to be assigned.
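To get a feel for why exhaustive partitioning does not scale, the classic combinatorial bound on the number of cells produced by n planes in general position in 3D grows cubically. The sketch below computes that upper bound (a textbook formula for general-position planes, not abspy's actual cell counts, which are far lower thanks to the adaptive strategy):

```python
from math import comb

def max_cells_exhaustive(n):
    # Maximum number of 3D cells cut by n planes in general position:
    # C(n,0) + C(n,1) + C(n,2) + C(n,3)
    return sum(comb(n, k) for k in range(4))

for n in (10, 50, 100):
    print(n, max_cells_exhaustive(n))  # 176, 20876, 166751
```

Adaptive partitioning keeps the cell count close to what the input primitives actually require, because each primitive only splits the cells it intersects rather than the whole ambient space.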

[figure: adaptive vs. exhaustive partitioning]

  • What weights to assign to the adjacency graph?

There are two kinds of weights to assign to an S-T graph: those on n-links and those on st-links. For surface reconstruction with graph-cut, assign the predicted probability of occupancy of each cell to the st-links, and the overlap area between adjacent cells to the n-links. Read this paper for more information on this Markov random field formulation.
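The formulation can be illustrated with a toy S-T graph solved by networkx (this is a hypothetical three-cell example with made-up numbers, not abspy's API; abspy's AdjacencyGraph handles this internally):

```python
import networkx as nx

# Toy S-T graph over three adjacent cells: occupancy probabilities
# weight the st-links, shared-facet areas weight the n-links.
probs = {0: 0.9, 1: 0.8, 2: 0.1}    # P(cell belongs to the object)
areas = {(0, 1): 1.0, (1, 2): 0.2}  # overlap area between adjacent cells

G = nx.Graph()
for cell, p in probs.items():
    G.add_edge('s', cell, capacity=p)      # st-link to the source
    G.add_edge(cell, 't', capacity=1 - p)  # st-link to the sink
for (u, v), a in areas.items():
    G.add_edge(u, v, capacity=a)           # n-link (smoothness term)

# The minimum cut separates "inside" cells (source side) from "outside".
cut_value, (source_side, sink_side) = nx.minimum_cut(G, 's', 't')
inside = sorted(c for c in source_side if c != 's')
print(inside)  # cells labelled as part of the object
```

Cells 0 and 1 end up on the source side because their high occupancy and large shared facet make cutting their st-links expensive, while cell 2 is separated cheaply across the small n-link.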

License

MIT

Citation

If you use abspy in a scientific work, please cite:

@article{chen2021reconstructing,
  title={Reconstructing Compact Building Models from Point Clouds Using Deep Implicit Fields},
  author={Chen, Zhaiyu and Khademi, Seyran and Ledoux, Hugo and Nan, Liangliang},
  journal={arXiv preprint arXiv:2112.13142},
  year={2021}
}

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

abspy-0.0.2.tar.gz (21.3 kB)

Uploaded Source

Built Distribution

abspy-0.0.2-py3-none-any.whl (19.2 kB)

Uploaded Python 3

File details

Details for the file abspy-0.0.2.tar.gz.

File metadata

  • Download URL: abspy-0.0.2.tar.gz
  • Upload date:
  • Size: 21.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.10.0 pkginfo/1.8.2 requests/2.27.1 requests-toolbelt/0.9.1 tqdm/4.58.0 CPython/3.8.12

File hashes

Hashes for abspy-0.0.2.tar.gz
Algorithm Hash digest
SHA256 7abe8b7067efa9d24139818848fa9b2718bf80df42d24371cbef8cfa675e2252
MD5 31a31126810b548a3bd89a681c31bedc
BLAKE2b-256 d349517be50fcad98cc34e977dbd6500e9d775fb1093f2ac3e946c924890e309

See more details on using hashes here.

File details

Details for the file abspy-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: abspy-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 19.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.7.1 importlib_metadata/4.10.0 pkginfo/1.8.2 requests/2.27.1 requests-toolbelt/0.9.1 tqdm/4.58.0 CPython/3.8.12

File hashes

Hashes for abspy-0.0.2-py3-none-any.whl
Algorithm Hash digest
SHA256 23372a7717e3ea7266ff72a4bf0692e3a41a438f80ee14cfc55b5a738f7c7622
MD5 774b598be0a464efff26bd2ad461ff93
BLAKE2b-256 761cb593f29378272ae2cda3aadcf760e8d369ef2f147e9cbbf331b0c70fa717

See more details on using hashes here.
