
Reproducible evaluation of NeRF and 3DGS methods


NerfBaselines


NerfBaselines is a framework for evaluating and comparing existing NeRF and 3DGS methods. Currently, most official implementations use different dataset loaders, evaluation protocols, and metrics, which makes benchmarking difficult. This project therefore provides a unified interface for running and evaluating methods on different datasets in a consistent way using the same metrics. Rather than reimplementing the methods, it wraps the official implementations so that they can all be run easily through the same interface.

Please visit the project page to see the results of implemented methods on dataset benchmarks.

Project Page + Results | Paper

Getting started

Start by installing the nerfbaselines pip package on your host system.

pip install nerfbaselines

You can now use the nerfbaselines CLI to interact with NerfBaselines.
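
A quick way to verify the installation is to print the CLI help (assuming the standard --help flag), which lists the available subcommands:

nerfbaselines --help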

The next step is to choose the backend that will be used to install and run the individual methods. The following backends are currently implemented:

  • docker: Offers good isolation. Requires Docker (with the NVIDIA Container Toolkit) to be installed and the user to have access to it (i.e., to be in the docker group).
  • apptainer: Offers a similar level of isolation to docker, but does not require the user to have privileged access.
  • conda (default): Does not require docker/apptainer to be installed, but does not offer the same level of isolation, and some methods require additional dependencies to be installed. Also, some methods are not implemented for this backend because they rely on dependencies not available on conda.
  • python: Runs everything directly in the current environment. All dependencies must be installed in the environment for this backend to work.

The backend can be set using the --backend <backend> argument or the NERFBASELINES_BACKEND environment variable.
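
For example, either of the following selects the docker backend for a training run (the <method> and <data> placeholders are explained in the sections below):

# Select the backend via the command-line argument
nerfbaselines train --method <method> --data <data> --backend docker

# Or select it via the environment variable
export NERFBASELINES_BACKEND=docker
nerfbaselines train --method <method> --data <data>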

Downloading data

Some datasets, e.g., Mip-NeRF 360, NerfStudio, Blender, or Tanks and Temples, can be downloaded automatically. You can either pass the argument --data external://dataset/scene during training, or download the dataset beforehand by running nerfbaselines download-dataset dataset/scene. Examples:

# Downloads the garden scene to the cache folder.
nerfbaselines download-dataset mipnerf360/garden

# Downloads all nerfstudio scenes to the cache folder.
nerfbaselines download-dataset nerfstudio

# Downloads the kitchen scene to the folder kitchen
nerfbaselines download-dataset mipnerf360/kitchen -o kitchen

Training

To start training, use the nerfbaselines train --method <method> --data <data> command. Use the --help argument to learn about all implemented methods and supported features.
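
For example, a training run on an automatically downloaded scene could look as follows (the method identifier gaussian-splatting is illustrative; run nerfbaselines train --help to see the methods that are actually registered):

# Train on the Mip-NeRF 360 garden scene; the external:// prefix downloads the data automatically.
nerfbaselines train --method gaussian-splatting --data external://mipnerf360/garden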

Rendering

The nerfbaselines render --checkpoint <checkpoint> command can be used to render images from a trained checkpoint. Again, use --help to learn about the arguments.
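
A minimal invocation might look like the sketch below; the --data and --output arguments are assumptions here, so check --help for the exact set of options:

# Render the images of the dataset's test split from a trained checkpoint (flags are illustrative).
nerfbaselines render --checkpoint <checkpoint> --data <dataset> --output <output folder>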

To render a camera trajectory (e.g., one created using the interactive viewer), use the following command:

nerfbaselines render-trajectory --checkpoint <checkpoint> --trajectory <trajectory> --output <output.mp4>

Interactive viewer

Given a trained checkpoint, the interactive viewer can be launched as follows:

nerfbaselines viewer --checkpoint <checkpoint> --data <dataset>

Even though the argument --data <dataset> is optional, it is recommended, as the camera poses are used to perform gravity alignment and rescaling for a better viewing experience. It also enables visualizing the input camera frustums.

Results

In this section, we present results of implemented methods on standard benchmark datasets. For detailed results, visit the project page: https://jkulhanek.com/nerfbaselines

Mip-NeRF 360

Mip-NeRF 360 is a collection of four indoor and five outdoor object-centric scenes. The camera trajectory is an orbit around the object with fixed elevation and radius. Every n-th frame of the trajectory is held out as a test view. Detailed results are available on the project page: https://jkulhanek.com/nerfbaselines/mipnerf360

| Method | PSNR | SSIM | LPIPS (VGG) | Time | GPU mem. |
|---|---|---|---|---|---|
| Zip-NeRF | 28.553 | 0.829 | 0.218 | 5h 30m 20s | 26.8 GB |
| Mip-NeRF 360 | 27.681 | 0.792 | 0.272 | 30h 14m 36s | 33.6 GB |
| Mip-Splatting | 27.492 | 0.815 | 0.258 | 25m 37s | 11.0 GB |
| Gaussian Splatting | 27.434 | 0.814 | 0.257 | 23m 25s | 11.1 GB |
| Gaussian Opacity Fields | 27.421 | 0.826 | 0.234 | 1h 3m 54s | 28.4 GB |
| NerfStudio | 26.388 | 0.731 | 0.343 | 19m 30s | 5.9 GB |
| Instant NGP | 25.507 | 0.684 | 0.398 | 3m 54s | 7.8 GB |

Blender

Blender (nerf-synthetic) is a synthetic dataset used to benchmark NeRF methods. It consists of 8 scenes, each showing an object placed on a white background. Cameras are placed on a hemisphere around the object. Detailed results are available on the project page: https://jkulhanek.com/nerfbaselines/blender

| Method | PSNR | SSIM | LPIPS (VGG) | Time | GPU mem. |
|---|---|---|---|---|---|
| Zip-NeRF | 33.670 | 0.973 | 0.036 | 5h 21m 57s | 26.2 GB |
| Gaussian Opacity Fields | 33.451 | 0.969 | 0.038 | 18m 26s | 3.1 GB |
| Mip-Splatting | 33.330 | 0.969 | 0.039 | 6m 49s | 2.7 GB |
| Gaussian Splatting | 33.308 | 0.969 | 0.037 | 6m 6s | 3.1 GB |
| TensoRF | 33.172 | 0.963 | 0.051 | 10m 47s | 16.4 GB |
| K-Planes | 32.265 | 0.961 | 0.062 | 23m 58s | 4.6 GB |
| Instant NGP | 32.198 | 0.959 | 0.055 | 2m 23s | 2.6 GB |
| Tetra-NeRF | 31.951 | 0.957 | 0.056 | 6h 53m 20s | 29.6 GB |
| Mip-NeRF 360 | 30.345 | 0.951 | 0.060 | 3h 29m 39s | 114.8 GB |
| NerfStudio | 29.191 | 0.941 | 0.095 | 9m 38s | 3.6 GB |
| NeRF | 28.723 | 0.936 | 0.092 | 23h 26m 30s | 10.2 GB |

Tanks and Temples

Tanks and Temples is a benchmark for image-based 3D reconstruction. The benchmark sequences were acquired outside the lab, in realistic conditions. Ground-truth data was captured using an industrial laser scanner. The benchmark includes both outdoor scenes and indoor environments. The dataset is split into three subsets: training, intermediate, and advanced. Detailed results are available on the project page: https://jkulhanek.com/nerfbaselines/tanksandtemples

| Method | PSNR | SSIM | LPIPS | Time | GPU mem. |
|---|---|---|---|---|---|
| Zip-NeRF | 24.628 | 0.840 | 0.131 | 5h 44m 9s | 26.6 GB |
| Mip-Splatting | 23.930 | 0.833 | 0.166 | 15m 56s | 7.3 GB |
| Gaussian Splatting | 23.827 | 0.831 | 0.165 | 13m 48s | 6.9 GB |
| Gaussian Opacity Fields | 22.395 | 0.825 | 0.172 | 40m 25s | 26.3 GB |
| NerfStudio | 22.043 | 0.743 | 0.270 | 19m 27s | 3.7 GB |
| Instant NGP | 21.623 | 0.712 | 0.340 | 4m 27s | 4.1 GB |

Reproducing results

| Method | Mip-NeRF 360 | Blender | NerfStudio | Tanks and Temples | LLFF | Photo Tourism | SeaThru-NeRF |
|---|---|---|---|---|---|---|---|
| NerfStudio | 🥇 gold | 🥇 gold | ❔ | 🥇 gold | ❌ | ❔ | ❔ |
| Instant-NGP | 🥇 gold | 🥇 gold | 🥇 gold | 🥇 gold | ❌ | ❔ | ❔ |
| Gaussian Splatting | 🥇 gold | 🥇 gold | ❌ | 🥇 gold | ❌ | ❔ | 🥇 gold |
| Mip-Splatting | 🥇 gold | 🥇 gold | ❌ | 🥇 gold | ❌ | ❔ | 🥇 gold |
| Gaussian Opacity Fields | 🥇 gold | 🥇 gold | ❌ | 🥇 gold | ❌ | ❔ | ❔ |
| Tetra-NeRF | 🥈 silver | 🥈 silver | ❔ | ❔ | ❌ | ❔ | ❔ |
| Mip-NeRF 360 | 🥇 gold | 🥇 gold | ❔ | ❔ | ❌ | ❔ | ❔ |
| Zip-NeRF | 🥇 gold | 🥇 gold | 🥇 gold | 🥇 gold | ❌ | ❔ | ❔ |
| CamP | ❔ | ❔ | ❔ | ❔ | ❌ | ❔ | ❔ |
| TensoRF | ❌ | 🥇 gold | ❔ | ❔ | 🥇 gold | ❔ | ❔ |
| NeRF | ❔ | 🥇 gold | ❔ | ❔ | ❔ | ❔ | ❔ |
| K-Planes | ❔ | 🥇 gold | ❔ | ❔ | ❔ | 🥈 silver | ❔ |
| Nerf-W (reimpl.) | ❔ | ❔ | ❔ | ❔ | ❔ | 🥇 gold | ❔ |
| GS-W | ❔ | ❔ | ❔ | ❔ | ❔ | 🥇 gold | ❔ |
| WildGaussians | ❔ | ❔ | ❔ | ❔ | ❔ | 🥇 gold | ❔ |
| SeaThru-NeRF | ❔ | ❔ | ❔ | ❔ | ❔ | ❔ | 🥇 gold |

Contributing

Contributions are very welcome. Please open a PR with the dataset/method/feature that you want to contribute. The goal of this project is to gradually expand by implementing more and more methods.

Citation

If you use this project in your research, please cite the following paper:

@article{kulhanek2024nerfbaselines,
  title={NerfBaselines: Consistent and Reproducible Evaluation of Novel View Synthesis Methods},
  author={Jonas Kulhanek and Torsten Sattler},
  year={2024},
  journal={arXiv},
}

License

This project is licensed under the MIT license. Each implemented method is licensed under the license provided by the authors of the method.

Acknowledgements

A big thanks to the authors of all implemented methods for the great work they have done. We would also like to thank the authors of NerfStudio, especially Brent Yi, for viser - a great framework powering the viewer. This work was supported by the Czech Science Foundation (GAČR) EXPRO (grant no. 23-07973X), the Grant Agency of the Czech Technical University in Prague (grant no. SGS24/095/OHK3/2T/13), and by the Ministry of Education, Youth and Sports of the Czech Republic through the e-INFRA CZ (ID:90254).

