NerfBaselines

Reproducible evaluation of NeRF and 3DGS methods

NerfBaselines is a framework for evaluating and comparing existing NeRF and 3DGS methods. Most official implementations use different dataset loaders, evaluation protocols, and metrics, which makes benchmarking difficult. This project therefore provides a unified interface for running and evaluating methods on different datasets in a consistent way using the same metrics. Instead of reimplementing the methods, it wraps the official implementations so that they can all be run and evaluated through the same interface.

Please visit the project page to see the results of implemented methods on dataset benchmarks.

๐ŸŒ Web ย |ย  ๐Ÿ“„ Paper ย |ย  ๐Ÿ“š Docs

Getting started

Start by installing the nerfbaselines pip package on your host system.

pip install nerfbaselines

Now you can use the nerfbaselines cli to interact with NerfBaselines.
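
For example, to see the available commands and global options, run the standard help command:

# Show the available commands and global options
nerfbaselines --help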

The next step is to choose the backend that will be used to install and run the different methods. The following backends are currently implemented:

  • docker: Offers good isolation, requires docker (with NVIDIA container toolkit) to be installed and the user to have access to it (being in the docker user group).
  • apptainer: Similar level of isolation as docker, but does not require the user to have privileged access.
  • conda (default): Does not require docker/apptainer to be installed, but does not offer the same level of isolation and some methods require additional dependencies to be installed. Also, some methods are not implemented for this backend because they rely on dependencies not found on conda.
  • python: Will run everything directly in the current environment. Everything needs to be installed in the environment for this backend to work.

The backend can be selected with the --backend <backend> argument or the NERFBASELINES_BACKEND environment variable.
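
For example (a minimal sketch; the train command is described below, and the assumption here is that the --backend flag and the environment variable are interchangeable):

# Select the backend for a single command
nerfbaselines train --method <method> --data <data> --backend docker

# Or set it once for the whole shell session
export NERFBASELINES_BACKEND=docker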

Downloading data

Some datasets, e.g. Mip-NeRF 360, NerfStudio, Blender, or Tanks and Temples, can be downloaded automatically. You can either pass --data external://dataset/scene directly during training or download the dataset beforehand by running nerfbaselines download-dataset external://dataset/scene. Examples:

# Downloads the garden scene to the cache folder.
nerfbaselines download-dataset external://mipnerf360/garden

# Downloads all nerfstudio scenes to the cache folder.
nerfbaselines download-dataset external://nerfstudio

# Downloads the kitchen scene to the folder kitchen
nerfbaselines download-dataset external://mipnerf360/kitchen -o kitchen

Training

To start training, use the nerfbaselines train --method <method> --data <data> command. Use the --help argument to list all implemented methods and supported features.
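
For example, the following trains Gaussian Splatting on the Mip-NeRF 360 garden scene (the method identifier gaussian-splatting is an assumption here; run nerfbaselines train --help to list the exact names available in your installation):

# Train Gaussian Splatting on the garden scene (data is downloaded automatically)
nerfbaselines train --method gaussian-splatting --data external://mipnerf360/garden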

Rendering

The nerfbaselines render --checkpoint <checkpoint> command can be used to render images from a trained checkpoint. Again, use --help to learn about the arguments.
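
For example (a sketch; the checkpoint path is hypothetical, and the --data/--output arguments are assumptions to be verified with --help):

# Render the test images of the garden scene from a trained checkpoint
nerfbaselines render --checkpoint results/garden/checkpoint --data external://mipnerf360/garden --output predictions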

In order to render a camera trajectory (e.g., created using the interactive viewer), use the following command:

nerfbaselines render-trajectory --checkpoint <checkpoint> --trajectory <trajectory> --output <output.mp4>

Interactive viewer

Given a trained checkpoint, the interactive viewer can be launched as follows:

nerfbaselines viewer --checkpoint <checkpoint> --data <dataset>

Even though the argument --data <dataset> is optional, it is recommended, as the camera poses are used to perform gravity alignment and rescaling for a better viewing experience. It also enables visualizing the input camera frustums.
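
For example (the checkpoint path is hypothetical; the dataset reference follows the external:// convention shown above):

# Launch the interactive viewer for a checkpoint trained on the garden scene
nerfbaselines viewer --checkpoint results/garden/checkpoint --data external://mipnerf360/garden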

Results

In this section, we present results of implemented methods on standard benchmark datasets. For detailed results, visit the project page: https://jkulhanek.com/nerfbaselines

Mip-NeRF 360

Mip-NeRF 360 is a collection of four indoor and five outdoor object-centric scenes. The camera trajectory is an orbit around the object with fixed elevation and radius. Every n-th frame of the trajectory is held out as a test view. Detailed results are available on the project page: https://jkulhanek.com/nerfbaselines/mipnerf360

| Method | PSNR | SSIM | LPIPS (VGG) | Time | GPU mem. |
|---|---|---|---|---|---|
| Zip-NeRF | 28.553 | 0.829 | 0.218 | 5h 30m 20s | 26.8 GB |
| Scaffold-GS | 27.714 | 0.813 | 0.262 | 23m 28s | 8.7 GB |
| Mip-NeRF 360 | 27.681 | 0.792 | 0.272 | 30h 14m 36s | 33.6 GB |
| Mip-Splatting | 27.492 | 0.815 | 0.258 | 25m 37s | 11.0 GB |
| Gaussian Splatting | 27.434 | 0.814 | 0.257 | 23m 25s | 11.1 GB |
| Gaussian Opacity Fields | 27.421 | 0.826 | 0.234 | 1h 3m 54s | 28.4 GB |
| gsplat | 27.412 | 0.815 | 0.256 | 29m 19s | 8.3 GB |
| NerfStudio | 26.388 | 0.731 | 0.343 | 19m 30s | 5.9 GB |
| Instant NGP | 25.507 | 0.684 | 0.398 | 3m 54s | 7.8 GB |
| COLMAP | 16.670 | 0.445 | 0.590 | 2h 52m 55s | 0 MB |

Blender

Blender (nerf-synthetic) is a synthetic dataset used to benchmark NeRF methods. It consists of 8 scenes of an object placed on a white background. Cameras are placed on a semi-sphere around the object. Scenes are licensed under various CC licenses. Detailed results are available on the project page: https://jkulhanek.com/nerfbaselines/blender

| Method | PSNR | SSIM | LPIPS (VGG) | Time | GPU mem. |
|---|---|---|---|---|---|
| Zip-NeRF | 33.670 | 0.973 | 0.036 | 5h 21m 57s | 26.2 GB |
| Gaussian Opacity Fields | 33.451 | 0.969 | 0.038 | 18m 26s | 3.1 GB |
| Mip-Splatting | 33.330 | 0.969 | 0.039 | 6m 49s | 2.7 GB |
| Gaussian Splatting | 33.308 | 0.969 | 0.037 | 6m 6s | 3.1 GB |
| TensoRF | 33.172 | 0.963 | 0.051 | 10m 47s | 16.4 GB |
| Scaffold-GS | 33.080 | 0.966 | 0.048 | 7m 4s | 3.7 GB |
| K-Planes | 32.265 | 0.961 | 0.062 | 23m 58s | 4.6 GB |
| Instant NGP | 32.198 | 0.959 | 0.055 | 2m 23s | 2.6 GB |
| Tetra-NeRF | 31.951 | 0.957 | 0.056 | 6h 53m 20s | 29.6 GB |
| gsplat | 31.471 | 0.966 | 0.054 | 14m 45s | 2.8 GB |
| Mip-NeRF 360 | 30.345 | 0.951 | 0.060 | 3h 29m 39s | 114.8 GB |
| NerfStudio | 29.191 | 0.941 | 0.095 | 9m 38s | 3.6 GB |
| NeRF | 28.723 | 0.936 | 0.092 | 23h 26m 30s | 10.2 GB |
| COLMAP | 12.123 | 0.766 | 0.214 | 1h 20m 34s | 0 MB |

Tanks and Temples

Tanks and Temples is a benchmark for image-based 3D reconstruction. The benchmark sequences were acquired outside the lab, in realistic conditions. Ground-truth data was captured using an industrial laser scanner. The benchmark includes both outdoor scenes and indoor environments. The dataset is split into three subsets: training, intermediate, and advanced. Detailed results are available on the project page: https://jkulhanek.com/nerfbaselines/tanksandtemples

| Method | PSNR | SSIM | LPIPS | Time | GPU mem. |
|---|---|---|---|---|---|
| Zip-NeRF | 24.628 | 0.840 | 0.131 | 5h 44m 9s | 26.6 GB |
| Mip-Splatting | 23.930 | 0.833 | 0.166 | 15m 56s | 7.3 GB |
| Gaussian Splatting | 23.827 | 0.831 | 0.165 | 13m 48s | 6.9 GB |
| Gaussian Opacity Fields | 22.395 | 0.825 | 0.172 | 40m 25s | 26.3 GB |
| NerfStudio | 22.043 | 0.743 | 0.270 | 19m 27s | 3.7 GB |
| Instant NGP | 21.623 | 0.712 | 0.340 | 4m 27s | 4.1 GB |
| COLMAP | 11.919 | 0.436 | 0.606 | 5h 16m 11s | 0 MB |

Implementation status

| Method | Blender | LLFF | Mip-NeRF 360 | Nerfstudio | Photo Tourism | SeaThru-NeRF | Tanks and Temples |
|---|---|---|---|---|---|---|---|
| COLMAP | 🥇 gold | ❔ | 🥇 gold | 🥇 gold | ❔ | ❔ | 🥇 gold |
| CamP | ❔ | ❔ | ❔ | ❔ | ❔ | ❔ | ❔ |
| GS-W | ❔ | ❔ | ❔ | ❔ | ❔ | ❔ | ❔ |
| Gaussian Opacity Fields | 🥇 gold | ❔ | 🥇 gold | ❔ | ❔ | ❔ | 🥇 gold |
| Gaussian Splatting | 🥇 gold | ❔ | 🥇 gold | ❔ | ❔ | 🥇 gold | 🥇 gold |
| Instant NGP | 🥇 gold | ❔ | 🥇 gold | 🥇 gold | ❔ | ❔ | 🥇 gold |
| K-Planes | 🥇 gold | ❔ | ❔ | ❔ | 🥈 silver | ❔ | ❔ |
| Mip-NeRF 360 | 🥇 gold | ❔ | 🥇 gold | ❔ | ❔ | ❔ | 🥇 gold |
| Mip-Splatting | 🥇 gold | ❔ | 🥇 gold | ❔ | ❔ | 🥇 gold | 🥇 gold |
| NeRF | 🥇 gold | ❔ | ❔ | ❔ | ❔ | ❔ | ❔ |
| NeRF On-the-go | ❔ | ❔ | ❔ | ❔ | ❔ | ❔ | ❔ |
| NeRF-W (reimplementation) | ❔ | ❔ | ❔ | ❔ | 🥇 gold | ❔ | ❔ |
| NerfStudio | 🥇 gold | ❔ | 🥇 gold | ❔ | ❔ | ❔ | 🥇 gold |
| Scaffold-GS | 🥇 gold | ❔ | 🥇 gold | ❔ | ❔ | 🥇 gold | 🥇 gold |
| SeaThru-NeRF | ❔ | ❔ | ❔ | ❔ | ❔ | 🥇 gold | ❔ |
| TensoRF | 🥇 gold | 🥇 gold | ❌ | ❔ | ❔ | ❔ | ❔ |
| Tetra-NeRF | 🥈 silver | ❔ | 🥈 silver | ❔ | ❔ | ❔ | ❔ |
| WildGaussians | ❔ | ❔ | ❔ | ❔ | 🥇 gold | ❔ | ❔ |
| Zip-NeRF | 🥇 gold | ❌ | 🥇 gold | 🥇 gold | ❔ | ❔ | ❔ |
| gsplat | 🥇 gold | ❔ | 🥇 gold | ❔ | 🥇 gold | ❔ | 🥇 gold |

Contributing

Contributions are very much welcome. Please open a PR with a dataset/method/feature that you want to contribute. The goal of this project is to slowly expand by implementing more and more methods.

Citation

If you use this project in your research, please cite the following paper:

@article{kulhanek2024nerfbaselines,
  title={NerfBaselines: Consistent and Reproducible Evaluation of Novel View Synthesis Methods},
  author={Jonas Kulhanek and Torsten Sattler},
  year={2024},
  journal={arXiv},
}

License

This project is licensed under the MIT license. Each implemented method is licensed under the license provided by the authors of the method; please refer to the respective method repositories for the applicable licenses.

Acknowledgements

A big thanks to the authors of all implemented methods for the great work they have done. We would also like to thank the authors of NerfStudio, especially Brent Yi, for viser - a great framework powering the viewer. This work was supported by the Czech Science Foundation (GAČR) EXPRO (grant no. 23-07973X), the Grant Agency of the Czech Technical University in Prague (grant no. SGS24/095/OHK3/2T/13), and by the Ministry of Education, Youth and Sports of the Czech Republic through the e-INFRA CZ (ID:90254).
