A super-easy way to record, search and compare AI experiments.

Project description

An easy-to-use & supercharged open-source experiment tracker

Aim logs your training runs, provides a beautiful UI to compare them, and offers an API to query them programmatically.

About • Features • Demos • Examples • Quick Start • Documentation • Roadmap • Slack Community • Twitter

Integrates seamlessly with your favorite tools



About Aim

Track and version ML runs • Visualize runs via beautiful UI • Query runs metadata via SDK

Aim is an open-source, self-hosted ML experiment tracking tool. It is built to track a large number (1000s) of training runs and lets you compare them with a performant and beautiful UI.

You can use not only the great Aim UI but also its SDK to query your runs' metadata programmatically. That is especially useful for automation and additional analysis in a Jupyter Notebook.

Aim's mission is to democratize AI dev tools.

Why use Aim?

Compare 100s of runs in a few clicks - build models faster

  • Compare, group and aggregate 100s of metrics thanks to effective visualizations.
  • Analyze and learn the correlations and patterns between hparams and metrics.
  • Easy pythonic search to query the runs you want to explore (see the example query right after this list).
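For example, assuming the hyperparameters were logged under run["hparams"] as in the Quick Start below, a query along these lines can be typed into the explorer search bar to narrow down the runs (a sketch of the pythonic search syntax, not an exhaustive reference):

metric.name == "loss" and run.hparams.batch_size == 32 and run.hparams.learning_rate < 0.01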

Deep dive into details of each run for easy debugging

  • Hyperparameters, metrics, images, distributions, audio, text - all available at hand in an intuitive UI to understand the performance of your model.
  • Easily track plots built with your favorite visualization tools, like plotly and matplotlib (a minimal tracking sketch follows this list).
  • Analyze system resource usage to effectively utilize computational resources.
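Below is a minimal sketch of tracking figures, assuming the aim.Figure (Plotly) and aim.Image wrappers described in the docs; the exact accepted input types may vary by Aim version, so treat it as illustrative rather than definitive.

import matplotlib.pyplot as plt
import plotly.express as px
from aim import Figure, Image, Run

run = Run()

# Plotly figures are wrapped with aim.Figure
plotly_fig = px.scatter(x=[0, 1, 2], y=[0, 1, 4])
run.track(Figure(plotly_fig), name='loss_curve_plotly', step=0)

# Matplotlib figures can be tracked as images via aim.Image (assumed here;
# check the docs for the recommended conversion in your version)
mpl_fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4])
run.track(Image(mpl_fig), name='loss_curve_mpl', step=0)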

Have all relevant information organised and accessible for easy governance

  • Centralized dashboard to holistically view all your runs, their hparams and results.
  • Use SDK to query/access all your runs and tracked metadata.
  • You own your data - Aim is open source and self hosted.

Demos

  • Machine translation: training logs of a neural translation model (from the WMT'19 competition).
  • lightweight-GAN: training logs of the "lightweight" GAN proposed at ICLR 2021.
  • FastSpeech 2: training logs of Microsoft's "FastSpeech 2: Fast and High-Quality End-to-End Text to Speech".
  • Simple MNIST: simple MNIST training logs.

Quick Start

Follow the steps below to get started with Aim.

1. Install Aim on your training environment

pip3 install aim

2. Integrate Aim with your code

from aim import Run

# Initialize a new run
run = Run()

# Log run parameters
run["hparams"] = {
    "learning_rate": 0.001,
    "batch_size": 32,
}

# Log metrics
for i in range(10):
    run.track(i, name='loss', step=i, context={ "subset":"train" })
    run.track(i, name='acc', step=i, context={ "subset":"train" })

See the full list of supported trackable objects (e.g. images, text, etc.) here.
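As a hedged illustration (the names 'samples' and 'predictions' are arbitrary, and the exact set of accepted input types for each object is documented separately), richer objects are tracked with the same run.track call:

import numpy as np
from aim import Image, Run, Text

run = Run()

# Track an image; a uint8 HxWxC NumPy array is used here, PIL images are also supported
random_image = (np.random.rand(64, 64, 3) * 255).astype(np.uint8)
run.track(Image(random_image), name='samples', step=0, context={"subset": "val"})

# Track a text sample
run.track(Text("an example model prediction"), name='predictions', step=0)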

3. Run the training as usual and start Aim UI

aim up
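By default aim up serves the .aim repository in the current directory. If your repo lives elsewhere or the default port is taken, options along the lines below are available (exact flags may differ across versions, so confirm with aim up --help):

aim up --repo /path/to/aim/repo --port 43800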

4. Or query runs programmatically via SDK

from aim import Repo

my_repo = Repo('/path/to/aim/repo')

query = "metric.name == 'loss'" # Example query

# Get collection of metrics
for run_metrics_collection in my_repo.query_metrics(query).iter_runs():
    for metric in run_metrics_collection:
        # Get run params
        params = metric.run[...]
        # Get metric values
        steps, metric_values = metric.values.sparse_numpy()
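For notebook-style analysis, the queried values can be collected into a pandas DataFrame. This is a minimal sketch on top of the query above; it assumes pandas is installed and uses metric.name and metric.run.hash as identifiers exposed by the SDK:

import pandas as pd

rows = []
for run_metrics_collection in my_repo.query_metrics(query).iter_runs():
    for metric in run_metrics_collection:
        steps, metric_values = metric.values.sparse_numpy()
        for step, value in zip(steps, metric_values):
            rows.append({
                "run": metric.run.hash,
                "metric": metric.name,
                "step": step,
                "value": value,
            })

df = pd.DataFrame(rows)
# e.g. the best (lowest) loss per run
print(df.groupby("run")["value"].min())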

Integrations

Integrate PyTorch Lightning
from aim.pytorch_lightning import AimLogger

# ...
trainer = pl.Trainer(logger=AimLogger(experiment='experiment_name'))
# ...

See documentation here.

Integrate Hugging Face
from aim.hugging_face import AimCallback

# ...
aim_callback = AimCallback(repo='/path/to/logs/dir', experiment='mnli')
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset if training_args.do_train else None,
    eval_dataset=eval_dataset if training_args.do_eval else None,
    callbacks=[aim_callback],
    # ...
)
# ...

See documentation here.

Integrate Keras & tf.keras
import aim

# ...
model.fit(x_train, y_train, epochs=epochs, callbacks=[
    aim.keras.AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')

    # In case of tf.keras, use aim.tensorflow.AimCallback instead:
    # aim.tensorflow.AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')
])
# ...

See documentation here.

Integrate XGBoost
from aim.xgboost import AimCallback

# ...
aim_callback = AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')
bst = xgb.train(param, xg_train, num_round, watchlist, callbacks=[aim_callback])
# ...

See documentation here.

Comparisons to familiar tools

TensorBoard

Training run comparison

Order of magnitude faster training run comparison with Aim

  • The tracked params are first-class citizens in Aim. You can search, group and aggregate via params, and deeply explore all the tracked data (metrics, params, images) in the UI.
  • With TensorBoard, users are forced to encode those parameters in the training run name to be able to search and compare. This makes comparison super tedious and causes usability issues on the UI when there are many experiments and params. TensorBoard also has no features to group and aggregate the metrics.

Scalability

  • Aim is built to handle 1000s of training runs - both on the backend and on the UI.
  • TensorBoard becomes really slow and hard to use when a few hundred training runs are queried / compared.

Beloved TB visualizations to be added to Aim

  • Embedding projector.
  • Neural network visualization.

MLFlow

MLflow is an end-to-end ML lifecycle tool, while Aim is focused on training run tracking. The main differences between Aim and MLflow are around UI scalability and run comparison features.

Run comparison

  • Aim treats tracked parameters as first-class citizens. Users can query runs, metrics and images, and filter using the params.
  • MLflow does have search by tracked config, but there is no grouping, aggregation or subplotting by hyperparams, and other comparison features are not available.

UI Scalability

  • The Aim UI can smoothly handle several thousand metrics at the same time with 1000s of steps each. It may get shaky when you explore 1000s of metrics with 10000s of steps each, but we are constantly optimizing!
  • The MLflow UI becomes slow to use when there are a few hundred runs.

Weights and Biases

Hosted vs self-hosted

  • Weights and Biases is a hosted, closed-source MLOps platform.
  • Aim is a self-hosted, free and open-source experiment tracking tool.

Roadmap

Detailed Sprints

The Aim product roadmap

  • The Backlog contains the issues we choose from and prioritize weekly.
  • Issues are prioritized mainly by the most highly requested features.

High-level roadmap

The high-level features we are going to work on over the next few months:

Done

  • Live updates (Shipped: Oct 18 2021)
  • Images tracking and visualization (Start: Oct 18 2021, Shipped: Nov 19 2021)
  • Distributions tracking and visualization (Start: Nov 10 2021, Shipped: Dec 3 2021)
  • Jupyter integration (Start: Nov 18 2021, Shipped: Dec 3 2021)
  • Audio tracking and visualization (Start: Dec 6 2021, Shipped: Dec 17 2021)
  • Transcripts tracking and visualization (Start: Dec 6 2021, Shipped: Dec 17 2021)
  • Plotly integration (Start: Dec 1 2021, Shipped: Dec 17 2021)
  • Colab integration (Start: Nov 18 2021, Shipped: Dec 17 2021)
  • Centralized tracking server (Start: Oct 18 2021, Shipped: Jan 22 2022)
  • TensorBoard adaptor - visualize TensorBoard logs with Aim (Start: Dec 17 2021, Shipped: Feb 3 2022)
  • Track git info, env vars, CLI arguments, dependencies (Start: Jan 17 2022, Shipped: Feb 3 2022)
  • MLflow adaptor (visualize MLflow logs with Aim) (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • Activeloop Hub integration (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • PyTorch-Ignite integration (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • Run summary and overview info (system params, CLI args, git info, ...) (Start: Feb 14 2022, Shipped: Mar 9 2022)
  • Add DVC-related metadata to Aim runs (Start: Mar 7 2022, Shipped: Mar 26 2022)
  • Ability to attach notes to Run from UI (Start: Mar 7 2022, Shipped: Apr 29 2022)
  • Fairseq integration (Start: Mar 27 2022, Shipped: Mar 29 2022)
  • LightGBM integration (Start: Apr 14 2022, Shipped: May 17 2022)
  • CatBoost integration (Start: Apr 20 2022, Shipped: May 17 2022)
  • Run execution details (display stdout/stderr logs) (Start: Apr 25 2022, Shipped: May 17 2022)

In Progress

  • Cloud storage support – store runs' blob data (e.g. images) on the cloud (Start: Mar 21 2022)
  • Artifact storage – store files, model checkpoints, and beyond (Start: Mar 21 2022)
  • Long sequences (up to 5M steps) support (Start: Apr 25 2022)

To Do

Aim UI

  • Runs management
    • Runs explorer – query and visualize runs data (images, audio, distributions, ...) in a central dashboard
  • Explorers
    • Audio Explorer
    • Text Explorer
    • Figures Explorer
    • Distributions Explorer
  • Dashboards – customizable layouts with embedded explorers

SDK and Storage

  • Scalability
    • Smooth UI and SDK experience with over 10,000 runs
  • Runs management
    • SDK interfaces
      • Reporting – query and compare runs, explore data with familiar tools such as matplotlib and pandas
      • Manipulations – copy, move, delete runs, params and sequences
    • CLI interfaces
      • Reporting - runs summary and run details in a CLI compatible format
      • Manipulations – copy, move, delete runs, params and sequences

Integrations

  • ML Frameworks:
    • Shortlist: MONAI, SpaCy, AllenNLP, Raytune, fast.ai, KerasTuner
  • Datasets versioning tools
    • Shortlist: HuggingFace Datasets
  • Resource management tools
    • Shortlist: Kubeflow, Slurm
  • Workflow orchestration tools
  • Others: Hydra, Google MLMD, Streamlit, ...

On hold

  • scikit-learn integration

Community

If you have questions

  1. Read the docs
  2. Open a feature request or report a bug
  3. Join our Slack

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • aim-3.11.2.tar.gz (1.5 MB) – Source

Built Distributions

  • aim-3.11.2-cp310-cp310-manylinux_2_24_x86_64.whl (5.4 MB) – CPython 3.10, manylinux: glibc 2.24+ x86-64
  • aim-3.11.2-cp310-cp310-macosx_11_0_arm64.whl (2.3 MB) – CPython 3.10, macOS 11.0+ ARM64
  • aim-3.11.2-cp310-cp310-macosx_10_14_x86_64.whl (2.4 MB) – CPython 3.10, macOS 10.14+ x86-64
  • aim-3.11.2-cp39-cp39-manylinux_2_24_x86_64.whl (5.5 MB) – CPython 3.9, manylinux: glibc 2.24+ x86-64
  • aim-3.11.2-cp39-cp39-macosx_11_0_arm64.whl (2.2 MB) – CPython 3.9, macOS 11.0+ ARM64
  • aim-3.11.2-cp39-cp39-macosx_10_14_x86_64.whl (2.3 MB) – CPython 3.9, macOS 10.14+ x86-64
  • aim-3.11.2-cp38-cp38-manylinux_2_24_x86_64.whl (5.7 MB) – CPython 3.8, manylinux: glibc 2.24+ x86-64
  • aim-3.11.2-cp38-cp38-macosx_11_0_arm64.whl (2.3 MB) – CPython 3.8, macOS 11.0+ ARM64
  • aim-3.11.2-cp38-cp38-macosx_10_14_x86_64.whl (2.4 MB) – CPython 3.8, macOS 10.14+ x86-64
  • aim-3.11.2-cp37-cp37m-manylinux_2_24_x86_64.whl (5.4 MB) – CPython 3.7m, manylinux: glibc 2.24+ x86-64
  • aim-3.11.2-cp37-cp37m-macosx_10_14_x86_64.whl (2.4 MB) – CPython 3.7m, macOS 10.14+ x86-64
  • aim-3.11.2-cp36-cp36m-manylinux_2_24_x86_64.whl (5.2 MB) – CPython 3.6m, manylinux: glibc 2.24+ x86-64
  • aim-3.11.2-cp36-cp36m-macosx_10_14_x86_64.whl (2.3 MB) – CPython 3.6m, macOS 10.14+ x86-64

File details

Details for the file aim-3.11.2.tar.gz.

File metadata

  • Download URL: aim-3.11.2.tar.gz
  • Upload date:
  • Size: 1.5 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.10.4

File hashes

Hashes for aim-3.11.2.tar.gz
Algorithm Hash digest
SHA256 09e3318157c46ca75d253ac11928bb7201b74a9dda2ce1fff62dc6a58872844d
MD5 1946673654a206cc49f29a7ecf847f3e
BLAKE2b-256 52d304cea1924ce56da45edb47c04810d16a68ed75cffd14f23a3ca4a2683aed

File details

Details for the file aim-3.11.2-cp310-cp310-manylinux_2_24_x86_64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp310-cp310-manylinux_2_24_x86_64.whl
Algorithm Hash digest
SHA256 f711f88aec335a7e46b555d0b839ca8d2ce63aa201300c2214c43d587456237f
MD5 c5dd90a35e327618a875b3f337c03b57
BLAKE2b-256 ab2dee4f3952ee53396bd08635137780a455e2b19804e3a40f6c236057bc1a4b

File details

Details for the file aim-3.11.2-cp310-cp310-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp310-cp310-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 f3d9814ed93848523b2e536cb1f5d0b82a998ae20d2260c0ba63e04ee67445cd
MD5 1aa8d469ca43f6dbe7982bb6a8486211
BLAKE2b-256 4d0a2ef9e81bafdf041e131dfec56dc431c16d203561804c84deb5a3868b1346

File details

Details for the file aim-3.11.2-cp310-cp310-macosx_10_14_x86_64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp310-cp310-macosx_10_14_x86_64.whl
Algorithm Hash digest
SHA256 f5036b73143ec6c7a82e1c9f437d04f261be75cd797a8ebb25575bd10a2e8445
MD5 089d8f7310fd7755fe51a79af78ce288
BLAKE2b-256 4e14071b697d73358b81c3e1e972494370fec53db6a1f995edeb6b997b93428f

File details

Details for the file aim-3.11.2-cp39-cp39-manylinux_2_24_x86_64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp39-cp39-manylinux_2_24_x86_64.whl
Algorithm Hash digest
SHA256 09a2a986c8357fe722fef955a83b38d88a7322dfadf5ac9c8b331667fc869dbb
MD5 fff9c98fee50a86ef2d73ddae6ba0497
BLAKE2b-256 adfc80bf9f66c7e0182f6788b29fd8bab305a64663b944f163a3b74e48512300

File details

Details for the file aim-3.11.2-cp39-cp39-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp39-cp39-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 446060309af83b78a7471b0a89ab94e15600eae28fb33259901e2a7898aa1d18
MD5 57e00b7c16ed6944d397018299a147a7
BLAKE2b-256 b18f82cdb03ce2d9704a0f1886e12b31882634ce7f7a615cc3d29dda6f63b21c

File details

Details for the file aim-3.11.2-cp39-cp39-macosx_10_14_x86_64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp39-cp39-macosx_10_14_x86_64.whl
Algorithm Hash digest
SHA256 7fe292de36db118d0830411715c40fad3d865a4141caf442237c3801819c8028
MD5 42ed13f803c0938f3dacc9c639e4a8de
BLAKE2b-256 15aee1910fb0ed8ba3f839023a5c6ddb80b91e3b0fc89ddaf271d3c68321bce4

File details

Details for the file aim-3.11.2-cp38-cp38-manylinux_2_24_x86_64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp38-cp38-manylinux_2_24_x86_64.whl
Algorithm Hash digest
SHA256 2f1d60bb53a4f3cd4c7178c33a4c24b4fe179cb04d0f7001a88cd68cb6d5109f
MD5 00dcec74c6345d8b2e94df896dba6e4a
BLAKE2b-256 3620960127b255de000167a49582bf0ef6257c6edd74bb23f0fc816cf5dffa63

File details

Details for the file aim-3.11.2-cp38-cp38-macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp38-cp38-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 f574ef12b117a9774f6fd3ce7a90ecc6da85a8404b5c4821de6af1d4d99c3fbf
MD5 66892af78f2266e5febbfd4ec02fc6a8
BLAKE2b-256 9d6f1c2155b926790e49d344b0a8881e574e5ae201e40fb821c865fdd017cf01

File details

Details for the file aim-3.11.2-cp38-cp38-macosx_10_14_x86_64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp38-cp38-macosx_10_14_x86_64.whl
Algorithm Hash digest
SHA256 85aefe9a62699fe4c026400520e8ce8703a3c537bbd5b33660e76ce98321142e
MD5 1565da234e44d79d08b98934a4064c14
BLAKE2b-256 aaaeeb311ce8b787bdb3c33dfdda558de6a56d69c079c9f2355a293291200035

File details

Details for the file aim-3.11.2-cp37-cp37m-manylinux_2_24_x86_64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp37-cp37m-manylinux_2_24_x86_64.whl
Algorithm Hash digest
SHA256 e17d1dee4bb790020e8a6feeed4a401dfca2477206f8cee0846000e290b6160f
MD5 469e34c47e712eaa2e0259018bdf3332
BLAKE2b-256 d3dee79e0aa14d80bfcd9bca21a784258d9d248e1d8656e56c6166f007ce59f0

File details

Details for the file aim-3.11.2-cp37-cp37m-macosx_10_14_x86_64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp37-cp37m-macosx_10_14_x86_64.whl
Algorithm Hash digest
SHA256 16183e9d5556869a3af904cf9d564df5af1bfb2b6b7890407fc4e0a620c0d3e8
MD5 cbd408d916f6be925b9e4974e3c812a8
BLAKE2b-256 bbd40a0e59474a53829b533d950debca36ca29659e69a65ef327e18b638fdc1e

File details

Details for the file aim-3.11.2-cp36-cp36m-manylinux_2_24_x86_64.whl.

File metadata

File hashes

Hashes for aim-3.11.2-cp36-cp36m-manylinux_2_24_x86_64.whl
Algorithm Hash digest
SHA256 dc0caaaf3c040d1b11c79f21c5452f5bcbd13a42206609fba87e112886d0e40d
MD5 050a6ab09faa84260e1d6331dad3f440
BLAKE2b-256 212b6c3c4dffd24322ce1c2c54b9eeddc40a6e278bb46ad6ffcc8526a5ba2fdc

File details

Details for the file aim-3.11.2-cp36-cp36m-macosx_10_14_x86_64.whl.

File metadata

  • Download URL: aim-3.11.2-cp36-cp36m-macosx_10_14_x86_64.whl
  • Upload date:
  • Size: 2.3 MB
  • Tags: CPython 3.6m, macOS 10.14+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.9 tqdm/4.64.0 importlib-metadata/4.8.3 keyring/23.4.1 rfc3986/1.5.0 colorama/0.4.4 CPython/3.6.13

File hashes

Hashes for aim-3.11.2-cp36-cp36m-macosx_10_14_x86_64.whl
Algorithm Hash digest
SHA256 e10cb9810b93b81572ed2cb7e717fc9a89a98a99307b16cc8249e2e144b36d0c
MD5 6084406ece2e95f0e0a803ef9427357d
BLAKE2b-256 c1d0ba3190c5378288f6a2d618251cf854f6d64b4f05d8cd1360589015d74558
