
NeuralOperator: Learning in Infinite Dimensions

Project description

neuraloperator is a comprehensive library for learning neural operators in PyTorch. It is the official implementation of Fourier Neural Operators and Tensorized Neural Operators.

Unlike regular neural networks, neural operators enable learning mappings between function spaces, and this library provides all the tools to do so on your own data.

Neural operators are also resolution invariant, so your trained operator can be applied to data of any resolution.
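
For example, here is a minimal sketch of the same Fourier Neural Operator evaluated on two different grid resolutions (the sizes below are chosen purely for illustration):

import torch
from neuralop.models import FNO

# Build a small FNO; sizes here are illustrative, not prescriptive.
operator = FNO(n_modes=(16, 16),
               hidden_channels=32,
               in_channels=2,
               out_channels=1)

# The same operator can be evaluated on coarse and fine discretizations
# of the input function (shape: batch, channels, height, width).
coarse = torch.randn(1, 2, 64, 64)
fine = torch.randn(1, 2, 128, 128)

print(operator(coarse).shape)  # torch.Size([1, 1, 64, 64])
print(operator(fine).shape)    # torch.Size([1, 1, 128, 128])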

Check out the documentation for more!

Installation

Just clone the repository and install it locally (in editable mode, so that changes to the code are reflected immediately without having to reinstall):

git clone https://github.com/NeuralOperator/neuraloperator
cd neuraloperator
pip install -e .
pip install -r requirements.txt

You can also install the most recent stable release of the library directly from PyPI:

pip install neuraloperator

Quickstart

After you’ve installed the library, you can create a neural operator model in just a few lines:

from neuralop.models import FNO

operator = FNO(n_modes=(32, 32),
               hidden_channels=64,
               in_channels=2,
               out_channels=1)
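
As a quick sanity check, you can run a toy forward and backward pass on random tensors (the shapes below are assumptions for illustration, not a recommended training setup):

import torch

# Random stand-ins for an input function and a target function,
# shaped (batch, channels, height, width).
x = torch.randn(4, 2, 64, 64)
y = torch.randn(4, 1, 64, 64)

optimizer = torch.optim.Adam(operator.parameters(), lr=1e-3)

optimizer.zero_grad()
loss = torch.nn.functional.mse_loss(operator(x), y)
loss.backward()
optimizer.step()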

Tensorization is also provided out of the box: you can improve on the previous model by simply using a Tucker-tensorized FNO, which takes just a few extra arguments:

from neuralop.models import TFNO

operator = TFNO(n_modes=(32, 32),
                hidden_channels=64,
                in_channels=2,
                out_channels=1,
                factorization='tucker',
                implementation='factorized',
                rank=0.05)

This will use a Tucker factorization of the weights. The forward pass stays efficient by contracting the inputs directly with the factors of the decomposition, and with rank=0.05 the Fourier layers will have roughly 5% of the parameters of an equivalent, dense Fourier Neural Operator!
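
One quick way to see the difference is to compare parameter counts directly (a rough sketch; exact numbers depend on the architecture configuration):

from neuralop.models import FNO, TFNO

dense = FNO(n_modes=(32, 32), hidden_channels=64,
            in_channels=2, out_channels=1)
tucker = TFNO(n_modes=(32, 32), hidden_channels=64,
              in_channels=2, out_channels=1,
              factorization='tucker', implementation='factorized',
              rank=0.05)

def n_params(model):
    # Total number of trainable parameters, including factorized weights.
    return sum(p.numel() for p in model.parameters())

print(f"dense FNO : {n_params(dense):,} parameters")
print(f"Tucker FNO: {n_params(tucker):,} parameters")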

Check out the documentation for more!

Using with Weights and Biases

Our Trainer natively supports logging to Weights & Biases (W&B). To use these features, create a file called wandb_api_key.txt in neuraloperator/config and paste your W&B API key there. You can configure the W&B project you want to use and your username in the main YAML configuration files.
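
For example, a small sketch of creating that key file, assuming you run it from the root of the cloned repository (the placeholder string is not a real key):

from pathlib import Path

# Write your W&B API key where the configuration expects it.
# "YOUR_WANDB_API_KEY" is a placeholder; paste your actual key instead.
key_file = Path("config/wandb_api_key.txt")
key_file.parent.mkdir(parents=True, exist_ok=True)
key_file.write_text("YOUR_WANDB_API_KEY")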

Contributing

NeuralOperator is 100% open-source, and we welcome contributions from the community!

Our mission for NeuralOperator is to provide access to well-documented, robust implementations of neural operator methods from foundations to the cutting edge. The library is primarily intended for methods that directly relate to operator learning: new architectures, meta-algorithms, training methods and benchmark datasets. We are also interested in integrating interactive examples that showcase operator learning in action on small sample problems.

If your work provides one of the above, we would be thrilled to integrate it into the library. Otherwise, if your work simply relies on a version of the NeuralOperator codebase, we recommend publishing your code separately using a procedure outlined in our developer’s guide, under the section “Publishing code built on the library”.

If you spot a bug or a typo in the documentation, or have an idea for a feature you’d like to see, please report it on our issue tracker, or even better, open a Pull Request.

For detailed development setup, testing, and contribution guidelines, please refer to our Contributing Guide.

Code of Conduct

All participants are expected to uphold the Code of Conduct to ensure a friendly and welcoming environment for everyone.

Citing NeuralOperator

If you use NeuralOperator in an academic paper, please cite [1]:

@article{kossaifi2025librarylearningneuraloperators,
   author    = {Jean Kossaifi and
                  Nikola Kovachki and
                  Zongyi Li and
                  David Pitt and
                  Miguel Liu-Schiaffini and
                  Valentin Duruisseaux and
                  Robert Joseph George and
                  Boris Bonev and
                  Kamyar Azizzadenesheli and
                  Julius Berner and
                  Anima Anandkumar},
   title     = {A Library for Learning Neural Operators},
   journal   = {arXiv preprint arXiv:2412.10354},
   year      = {2025},
}

and consider citing [2], [3]:

@article{kovachki2021neural,
   author    = {Nikola B. Kovachki and
                  Zongyi Li and
                  Burigede Liu and
                  Kamyar Azizzadenesheli and
                  Kaushik Bhattacharya and
                  Andrew M. Stuart and
                  Anima Anandkumar},
   title     = {Neural Operator: Learning Maps Between Function Spaces},
   journal   = {CoRR},
   volume    = {abs/2108.08481},
   year      = {2021},
}

@article{berner2025principled,
   author    = {Julius Berner and
                  Miguel Liu-Schiaffini and
                  Jean Kossaifi and
                  Valentin Duruisseaux and
                  Boris Bonev and
                  Kamyar Azizzadenesheli and
                  Anima Anandkumar},
   title     = {Principled Approaches for Extending Neural Architectures to Function Spaces for Operator Learning},
   journal   = {arXiv preprint arXiv:2506.10973},
   year      = {2025},
}

Download files

Download the file for your platform.

Source Distribution

neuraloperator-2.0.0.tar.gz (197.2 kB)

Uploaded Source

Built Distribution

neuraloperator-2.0.0-py3-none-any.whl (248.6 kB)

Uploaded Python 3

File details

Details for the file neuraloperator-2.0.0.tar.gz.

File metadata

  • Download URL: neuraloperator-2.0.0.tar.gz
  • Upload date:
  • Size: 197.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for neuraloperator-2.0.0.tar.gz

  • SHA256: dcf156d88c6a350ca785b22af94880fdd7673ebd83c3f27fd08eef624d20368e
  • MD5: e07457d20e397adb68a40bbb6c2f9199
  • BLAKE2b-256: ba4b3af891b4099b9d3c0b21f34b3d9a4d58443422520d57fe712cb9cca14bee

Provenance

The following attestation bundles were made for neuraloperator-2.0.0.tar.gz:

Publisher: deploy_pypi.yml on neuraloperator/neuraloperator

File details

Details for the file neuraloperator-2.0.0-py3-none-any.whl.

File metadata

  • Download URL: neuraloperator-2.0.0-py3-none-any.whl
  • Upload date:
  • Size: 248.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.7

File hashes

Hashes for neuraloperator-2.0.0-py3-none-any.whl

  • SHA256: 2f4edf579057a100d03a5d8728c145bd114e6da77563fa9f8c8f7c213784d265
  • MD5: a6d0b0061bf9c1b23884c9b087094dbe
  • BLAKE2b-256: 08fe5d74759a33732ce4dbac7bfef339bba371868544a7bcd44e5083495940e1

Provenance

The following attestation bundles were made for neuraloperator-2.0.0-py3-none-any.whl:

Publisher: deploy_pypi.yml on neuraloperator/neuraloperator
