
nerfstudio

The all-in-one repo for NeRFs

documentation | viewer

Philosophy

All-in-one repository for state-of-the-art NeRFs.

nerfstudio provides a simple API that allows for a seamless and simplified end-to-end process of creating, training, and visualizing NeRFs. The library supports a more interpretable implementation of NeRFs by modularizing each component. With this modularity, not only does your code become far more user-friendly, but the framework also makes it easier for the community to build upon your implementation.

It’s as simple as plug and play with nerfstudio!
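In practice, the end-to-end flow boils down to a handful of CLI calls; each of the commands below is covered step by step in the Quickstart that follows.

# download a dataset, train a model, and train with the real-time viewer
ns-download-data --dataset=blender
ns-train vanilla-nerf
ns-train nerfacto --vis viewer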

On top of our API, we are committed to providing learning resources to help you understand the basics of (if you're just getting started), and keep up-to-date with (if you're a seasoned veteran), all things NeRF. As researchers, we know just how hard it is to get onboarded with this next-gen technology. So we're here to help with tutorials, documentation, and more!

Finally, have feature requests? Want to add your brand-spankin'-new NeRF model? Have a new dataset? We welcome any and all contributions!

We hope nerfstudio enables you to build faster :hammer:, learn together :books:, and contribute to our NeRF community :sparkling_heart:.

Quickstart

The quickstart will help you get started with the default vanilla NeRF model trained on the classic Blender Lego scene. For more complex changes (e.g. running with your own data or setting up a new NeRF graph), please refer to our references.

1. Installation: Set up the environment

Create environment

We recommend using conda to manage dependencies. Make sure to install Conda before proceeding.

conda create --name nerfstudio -y python=3.8.13;
conda activate nerfstudio
python -m pip install --upgrade pip

Dependencies

Install PyTorch with CUDA (this repo has been tested with CUDA 11.3) and tiny-cuda-nn:

pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 -f https://download.pytorch.org/whl/torch_stable.html
pip install git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
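Before moving on, it's worth sanity-checking that the PyTorch build can actually see CUDA (this should print True on a working GPU setup):

python -c "import torch; print(torch.cuda.is_available())"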

Installing nerfstudio

Easy option:

pip install nerfstudio

If you want the latest and greatest:

git clone git@github.com:plenoptix/nerfstudio.git
cd nerfstudio
pip install -e .

Optional Installs

Tab completion (bash & zsh)

This needs to be rerun when the CLI changes, for example if nerfstudio is updated.

ns-install-cli

Development packages

pip install -e .[dev]
pip install -e .[docs]
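The dev extras are what CI uses for testing. Assuming they pull in pytest (an assumption; check the repo's setup.cfg or pyproject.toml for the actual test runner), you can run the test suite locally with:

pytest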

2. Getting the data

Download the original NeRF Blender dataset. We support the major datasets and allow users to create their own dataset, described in detail here.

ns-download-data --dataset=blender
ns-download-data --dataset=nerfstudio --capture=poster

Use --help to view all currently available datasets. The command should download and unpack the dataset as follows:

├─ nerfstudio/
│  ├─ data/
│  │  ├─ blender/
│  │  │  ├─ fern/
│  │  │  ├─ lego/
│  │  │  ...
│  │  ├─ <dataset_format>/
│  │  │  ├─ <scene>/
│  │  │  ...

3. Training a model

To run with all the defaults, e.g. the vanilla NeRF method on the Blender Lego scene:

# To see what models are available.
ns-train --help

# Run a vanilla nerf model.
ns-train vanilla-nerf

# Run with nerfacto model.
ns-train nerfacto

# Run with nerfstudio data. You may have to change the ports, and be sure to forward the "websocket-port".
ns-train nerfacto --vis viewer --viewer.zmq-port 8001 --viewer.websocket-port 8002 nerfstudio-data --pipeline.datamanager.dataparser.data-directory data/nerfstudio/poster --pipeline.datamanager.dataparser.downscale-factor 4

3.x Training a model with the viewer

Make sure to forward a port for the websocket to localhost. The default port is 7007, which you should expose as localhost:7007.
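If you are training on a remote machine, one common way to expose the websocket port is SSH local port forwarding; the user and host below are placeholders for your own setup:

ssh -L 7007:localhost:7007 <user>@<remote-host>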

# with the default port
ns-train nerfacto --vis viewer

# with a specified websocket port
ns-train nerfacto --vis viewer --viewer.websocket-port=7008

4. Visualizing training runs

We support multiple methods to visualize training; the default configuration uses Tensorboard. More information on logging can be found here.

Real-time Viewer

We have developed our own real-time web viewer; more information can be found here. This viewer runs during training and is designed to work with models that have fast rendering pipelines.

To turn on the viewer, simply add the flag --vis viewer.
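For example, reusing the nerfacto command from the Quickstart above:

ns-train nerfacto --vis viewer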

Tensorboard

If you run everything with the default configuration, we log all training curves, test images, and other stats. Once the job is launched, you can track training by launching tensorboard in your base experiment directory (default: outputs/).

tensorboard --logdir outputs/

Weights & Biases

We support logging to Weights & Biases. To enable wandb logging, add the flag --logging.writer.1.enable.
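For example, appended to a training command; this is a sketch, and the writer index in the flag assumes the default logging configuration:

ns-train nerfacto --logging.writer.1.enable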

5. Rendering a trajectory during inference

ns-eval render-trajectory --load-config=outputs/blender_lego/instant_ngp/2022-07-07_230905/config.yml --traj=spiral --output-path=output.mp4

6. In-depth guide

For a more in-depth tutorial on how to modify/implement your own NeRF Graph, please see our walk-through.

Learn More

Section | Description
Documentation | Full API documentation and tutorials
Interactive Guides | Go-to spot for learning how NeRFs and each of their modules work.
Quick tour | Example script on how to navigate nerfstudio from install and train to test.
Creating pipelines | Learn how to easily build new neural rendering pipelines by using and/or implementing new modules.
Creating datasets | Have a new dataset? Learn how to use it with nerfstudio.
Mobile Capture to NeRF | Step-by-step tutorial on how to create beautiful renders with just your phone.
Contributing | Walk-through for how you can start contributing now.
Slack | Join our community to discuss more. We would love to hear from you!

Supported Features

We provide the following support structures to make it easier to get started with NeRFs. For a full description, please refer to our features page.

If you are looking for a feature that is not currently supported, please do not hesitate to contact the Plenoptix team!

  • :mag_right: Web-based visualizer that allows you to:
    • Visualize training in real-time + interact with the scene
    • Create and render out scenes with custom camera trajectories
    • View different output types
    • And more!
  • :pencil2: Support for multiple logging interfaces (Tensorboard, Wandb), code profiling, and other built-in debugging tools
  • :chart_with_upwards_trend: Easy-to-use benchmarking scripts on the Blender dataset
  • :iphone: Full pipeline support (w/ Colmap or Record3D) for going from a video on your phone to a full 3D render. Follow our step-by-step tutorial. (TODO: walk-through page on end-to-end pipeline from capture -> render)

See what's possible

TODO: insert some gallery stuff here (gifs/pretty pictures w/ visualizer) TODO: For more see gallery
