itwinai
AI and ML workflows module for scientific digital twins.
See the latest version of our docs for a quick overview of this platform for advanced AI/ML workflows in digital twin applications.
Installation
Requirements:
- Linux environment. Windows and macOS have not been tested.
Python virtual environment
Depending on your environment, there are different ways to select a specific Python version.
Laptop or GPU node
If you are working on a laptop or on a simple on-prem setup, consider using pyenv; see its installation instructions. If you are using pyenv, make sure to read its documentation first.
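For example, a minimal pyenv setup might look like the following sketch. The Python version shown is hypothetical; check the project's requirements for the exact one. The snippet is guarded so it is a no-op where pyenv is not installed:

```shell
# Hypothetical Python version; replace with the one itwinai requires.
PY_VERSION="3.10.13"

if command -v pyenv >/dev/null 2>&1; then
    # Install the requested version (-s skips it if already present)...
    pyenv install -s "$PY_VERSION"
    # ...and pin it for the current directory (writes .python-version)
    pyenv local "$PY_VERSION"
fi

# Confirm which interpreter is now active
python3 --version
```

Once the pinned interpreter is active, the `make` targets below will build the virtual environments on top of it.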
Install itwinai environment
Regardless of how you loaded your environment, once the correct Python version is available you can create the virtual environments using our pre-made Makefile:
make torch-env # or make torch-env-cpu
make tensorflow-env # or make tensorflow-env-cpu
# Juelich supercomputer
make torch-gpu-jsc
make tf-gpu-jsc
Environment setup
Requirements:
- Linux environment. Windows and macOS have not been tested.
- VS Code, for development.
TensorFlow
Installation:
# Install TensorFlow 2.13
make tensorflow-env
# Activate env
source .venv-tf/bin/activate
A CPU-only version is available via the tensorflow-env-cpu target.
PyTorch (+ Lightning)
Installation:
# Install PyTorch + lightning
make torch-env
# Activate env
source .venv-pytorch/bin/activate
A CPU-only version is available via the torch-env-cpu target.
Development environment
This is for developers only. To set it up, update the installed itwinai package by adding the dev extra:
pip install -e .[dev]
Test with pytest
Do this only if you are a developer wanting to test your code with pytest.
First, you need to create virtual environments both for torch and tensorflow. For instance, you can use:
make torch-env-cpu
make tensorflow-env-cpu
To run the tests in torch and tf environments with custom names (other than the default .venv-pytorch and .venv-tf), set the following environment variables:
export TORCH_ENV="my_torch_env"
export TF_ENV="my_tf_env"
Functional tests (marked with pytest.mark.functional) are executed under the /tmp/pytest location to guarantee they run in a clean environment.
To run functional tests use:
pytest -v tests/ -m "functional"
To run all tests on itwinai package:
make test
Run tests in JSC virtual environments:
make test-jsc
Micromamba installation (deprecated)
To manage Conda environments we use micromamba, a lightweight version of conda.
We suggest referring to the Manual installation guide.
Note that Micromamba can consume a lot of disk space when building environments, because packages are cached on the local filesystem after being downloaded. To clear the cache, run micromamba clean -a.
Micromamba data are kept under $HOME by default. On some systems, however, $HOME has limited storage space, and it is wiser to install Micromamba in another location with more room by changing the $MAMBA_ROOT_PREFIX variable. The complete Linux installation example below overrides the default $MAMBA_ROOT_PREFIX:
cd $HOME
# Download micromamba (This command is for Linux Intel (x86_64) systems. Find the right one for your system!)
curl -Ls https://micro.mamba.pm/api/micromamba/linux-64/latest | tar -xvj bin/micromamba
# Install micromamba in a custom directory
MAMBA_ROOT_PREFIX='my-mamba-root'
./bin/micromamba shell init $MAMBA_ROOT_PREFIX
# To invoke micromamba from the Makefile, you need to add it to $PATH explicitly
echo 'PATH="$(dirname $MAMBA_EXE):$PATH"' >> ~/.bashrc
Reference: Micromamba installation guide.