A toolkit for loading Unseen Object Instance Segmentation (UOIS) datasets.

Project description

uois_toolkit

A toolkit for Unseen Object Instance Segmentation (UOIS)

A PyTorch-based toolkit for loading and processing datasets for Unseen Object Instance Segmentation (UOIS). This repository provides a standardized, easy-to-use interface for several popular UOIS datasets, simplifying the process of training and evaluating segmentation models.


Installation

Prerequisites

  • Python 3.9+
  • An environment manager like conda is recommended.

Steps

  1. Clone the repository:

    git clone https://github.com/OnePunchMonk/uois_toolkit.git
    cd uois_toolkit
    
  2. Install the package: installing in editable mode (-e) lets you modify the source code without reinstalling. The command automatically installs all dependencies listed in pyproject.toml.

    pip install -e .
    

Note about detectron2

This project depends on detectron2 for some dataset utilities and mask handling. detectron2 includes C++ extensions and must be built for your platform; it cannot always be installed as a pure Python wheel. Follow the official installation instructions in the Detectron2 repository and install a version compatible with your PyTorch and CUDA (or CPU-only) environment before running the tests or using the datasets.

On many systems you can install a compatible CPU-only wheel using the prebuilt index, or build from source if needed. If you are running on CI, ensure the runner has the necessary build tools and compatible PyTorch version.


Supported Datasets

This toolkit provides dataloaders for the following datasets:

  • Tabletop Object Discovery (TOD)
  • OCID
  • OSD
  • Robot Pushing
  • iTeach-HumanPlay

Directory Setup

It is recommended to organize the downloaded datasets into a single DATA/ directory for convenience, though you can specify the path to each dataset individually.
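The recommended layout can be sketched in code. The dataset folder names below are assumptions for illustration; use whatever names match the paths you pass to each dataloader:

```python
from pathlib import Path

# Hypothetical DATA/ layout: one subdirectory per supported dataset.
# Folder names are assumptions, not mandated by the toolkit.
DATA_ROOT = Path("DATA")
dataset_paths = {
    name: DATA_ROOT / name
    for name in ("tabletop", "ocid", "osd", "pushing", "iteach_humanplay")
}

for name, path in sorted(dataset_paths.items()):
    print(f"{name}: {path}")
```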


Usage Example

You can easily import the datamodule into your own projects. The example below demonstrates how to load the tabletop dataset using pytorch-lightning.

from uois_toolkit import get_datamodule, cfg
import pytorch_lightning as pl

# 1. Define the dataset name and its location
dataset_name = "tabletop"
data_path = "/path/to/your/data/tabletop"

# 2. Get the datamodule instance
# The default configuration can be customized by modifying the `cfg` object
data_module = get_datamodule(
    dataset_name=dataset_name,
    data_path=data_path,
    batch_size=4,
    num_workers=2,
    config=cfg
)

# 3. The datamodule is ready to be used with a PyTorch Lightning Trainer
# model = YourLightningModel()
# trainer = pl.Trainer(accelerator="auto")
# trainer.fit(model, datamodule=data_module)

# Alternatively, you can inspect a data batch directly
data_module.setup("fit")  # recent PyTorch Lightning versions require the stage argument
train_loader = data_module.train_dataloader()
batch = next(iter(train_loader))

print(f"Successfully loaded a batch from the {dataset_name} dataset!")
print("Image tensor shape:", batch["image_color"].shape)
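Besides image_color, a batch may carry other keys (masks, depth, metadata). A small, library-agnostic helper like the sketch below summarizes whatever a dataloader returns; the sample batch here is a stand-in, not real uois_toolkit output:

```python
class _FakeTensor:
    """Stand-in for a torch.Tensor; only carries a .shape attribute."""
    def __init__(self, shape):
        self.shape = shape

def summarize_batch(batch):
    """Map each key in a batch dict to its value's shape (as a tuple),
    or to the value's type name when there is no .shape attribute."""
    return {
        key: tuple(value.shape) if hasattr(value, "shape") else type(value).__name__
        for key, value in batch.items()
    }

# Stand-in batch mimicking the structure a dataloader might return;
# real batches from uois_toolkit may carry different keys.
fake_batch = {
    "image_color": _FakeTensor((4, 3, 480, 640)),  # (B, C, H, W)
    "file_name": ["img_0001.png"] * 4,
}
print(summarize_batch(fake_batch))
```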

Testing

Local Validation

The repository includes a pytest suite to verify that the dataloaders and processing pipelines are working correctly.

To run the tests, you must provide the root paths to your downloaded datasets using the --dataset_path argument.

python -m pytest test/test_datamodule.py -v \
  --dataset_path tabletop=/path/to/your/data/tabletop \
  --dataset_path ocid=/path/to/your/data/ocid \
  --dataset_path osd=/path/to/your/data/osd
  # Add other dataset paths as needed

Note: You only need to provide paths for the datasets you wish to test.
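Under the hood, a repeatable --dataset_path option is typically registered in conftest.py via pytest's addoption hook. The sketch below shows one plausible implementation; the repository's actual option handling may differ:

```python
# Hypothetical conftest.py sketch (the repository's real version may differ).

def pytest_addoption(parser):
    # Register a repeatable --dataset_path option of the form name=/path.
    parser.addoption(
        "--dataset_path",
        action="append",
        default=[],
        help="Dataset root as name=/path/to/root; may be given multiple times.",
    )

def parse_dataset_paths(raw_values):
    """Turn ['tabletop=/data/tabletop', ...] into a name-to-path dict."""
    paths = {}
    for item in raw_values:
        name, _, path = item.partition("=")  # split at the first '='
        paths[name.strip()] = path.strip()
    return paths

print(parse_dataset_paths(["tabletop=/data/tabletop", "ocid=/data/ocid"]))
```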

Continuous Integration

This repository uses GitHub Actions to perform automated sanity checks on every push and pull request to the main branch. This workflow ensures that:

  1. The package installs correctly.
  2. The code adheres to basic linting standards.
  3. All core modules remain importable.

This automated process helps maintain code quality and prevents the introduction of breaking changes.
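A minimal workflow along these lines would implement the three checks above. The file name, action versions, Python version, and choice of linter are assumptions for illustration, not necessarily what this repository uses:

```yaml
# .github/workflows/sanity.yml (hypothetical name and contents)
name: Sanity Check
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  sanity:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - name: Install package
        run: pip install -e .
      - name: Lint
        run: |
          pip install flake8
          flake8 uois_toolkit --select=E9,F63,F7,F82
      - name: Import check
        run: python -c "import uois_toolkit"
```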


For Maintainers

PyPI publishing instructions:
# 1. Install build tools
python -m pip install build twine

# 2. Clean previous builds
rm -rf build/ dist/ *.egg-info

# 3. Build the distribution files
python -m build

# 4. Upload to PyPI (requires a configured PyPI token)
twine upload dist/*

License

This project is licensed under the MIT License. See the LICENSE file for more details.

Project details


Download files

Source Distribution

uois_toolkit-0.1.2.tar.gz (3.4 MB)

Uploaded Source

Built Distribution

uois_toolkit-0.1.2-py3-none-any.whl (4.1 MB)

Uploaded Python 3

File details

Details for the file uois_toolkit-0.1.2.tar.gz.

File metadata

  • Download URL: uois_toolkit-0.1.2.tar.gz
  • Upload date:
  • Size: 3.4 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.11

File hashes

Hashes for uois_toolkit-0.1.2.tar.gz:

  • SHA256: df708396aa159c07d8c1795893d0a2d2766ca4aa61309aff1ea04123c57a2f43
  • MD5: fb4d47affe8213e4519a039176b83f82
  • BLAKE2b-256: 73b7f843ef48ea8584bf17acf7870d4eb092444bcad3b656842916d5f71e2d19


File details

Details for the file uois_toolkit-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: uois_toolkit-0.1.2-py3-none-any.whl
  • Upload date:
  • Size: 4.1 MB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.10.11

File hashes

Hashes for uois_toolkit-0.1.2-py3-none-any.whl:

  • SHA256: 78d890dfceeb3b01a9888389cd68653ced08c2313b2f72fd58907dd1a264b295
  • MD5: b303dab59e42f1f1da7c2f5cbeaffe3a
  • BLAKE2b-256: 3e0d9461846d68ccc3f127a952fa72ee9df7477d4b0acaf8d05a0a94958e4e21

