Learning function operators with neural networks.
Project description
continuiti is a Python package for deep learning on function operators with a focus on elegance and generality. It provides a unified interface for neural operators (such as DeepONet or FNO) to be used in a plug-and-play fashion. As operator learning is particularly useful in scientific machine learning, continuiti also includes physics-informed loss functions and a collection of relevant benchmarks.
Installation
Install the package using pip:
pip install continuiti
Or install the latest development version from the repository:
git clone https://github.com/aai-institute/continuiti.git
cd continuiti
pip install -e .[dev]
Usage
Our Documentation contains a detailed introduction to operator learning, a collection of examples using continuiti, and the class documentation.
In general, the operator syntax in continuiti is
v = operator(x, u(x), y)
mapping a function u (evaluated at x) to a function v (evaluated at y).
For more details, see Learning Operators.
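To illustrate this mapping (this is a hand-written sketch, not the library's implementation), the following toy "identity operator" takes sensor positions x, function values u(x), and evaluation points y, and returns v by linear interpolation; a trained neural operator replaces this hand-written rule with a learned one. All names and shapes here are chosen for illustration only:

```python
import numpy as np

# Sensor positions x and function values u(x), for u(x) = sin(x)
x = np.linspace(0.0, np.pi, 32)
u = np.sin(x)

def identity_operator(x, u, y):
    """Toy 'operator': maps u (sampled at x) to v (evaluated at y)
    by linear interpolation. A neural operator learns this mapping
    from data instead of hard-coding it."""
    return np.interp(y, x, u)

# The evaluation points y need not coincide with the sensor positions x
y = np.linspace(0.0, np.pi, 7)
v = identity_operator(x, u, y)

print(v.shape)  # (7,)
```

Note that x, u(x), and y are all inputs to the operator call, matching the `v = operator(x, u(x), y)` syntax above.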
Examples
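The documentation linked above contains full worked examples. As a self-contained sketch of the kind of data a neural operator is trained on (using plain numpy rather than continuiti's own dataset classes, so all names and shapes are illustrative assumptions), the following builds input/output pairs for the antiderivative operator, which maps u to v(y) = ∫₀ʸ u(s) ds:

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_sensors = 8, 64
x = np.linspace(0.0, 1.0, n_sensors)  # sensor positions, shared across samples
dx = x[1] - x[0]

# Random smooth input functions u: sums of a few sine modes
coeffs = rng.normal(size=(n_samples, 3))
modes = np.arange(1, 4)
u = np.einsum("sk,kx->sx", coeffs, np.sin(np.pi * modes[:, None] * x[None, :]))

# Target functions v(y) = integral of u from 0 to y, via cumulative trapezoid rule
v = np.concatenate(
    [np.zeros((n_samples, 1)),
     np.cumsum((u[:, 1:] + u[:, :-1]) * dx / 2.0, axis=1)],
    axis=1,
)

print(u.shape, v.shape)  # (8, 64) (8, 64)
```

Each sample pairs an input function (sampled at the sensors x) with its antiderivative (evaluated at the same points); an operator model would then be trained to reproduce v from (x, u(x), y).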
Contributing
Contributions are welcome in the form of pull requests, bug reports, and feature requests. If you find a bug or have a feature request, please open an issue on GitHub. To contribute code, fork the repository and submit a pull request. See CONTRIBUTING.md for details on local development.
License
This project is licensed under the GNU LGPLv3 license; see the LICENSE file for details.
Download files
Source Distribution
Built Distribution
Hashes for continuiti-0.1.0-py3-none-any.whl
Algorithm | Hash digest
---|---
SHA256 | 0dab07bc15c9f49cf4bb4bd4cc3bdf1c21859772dd6365fc8094861fd4721f66
MD5 | 36f7724c3e88d7337a374836a8602929
BLAKE2b-256 | 3f696fa46f64b16808f06463a2aa259e365bcd7ef8f44adc56d6eb32318b4902