A super-easy way to record, search and compare AI experiments.

Project description

An easy-to-use & supercharged open-source experiment tracker

Aim logs your training runs, provides a beautiful UI to compare them and an API to query them programmatically.

About | Features | Demos | Examples | Quick Start | Documentation | Roadmap | Slack Community | Twitter

Integrates seamlessly with your favorite tools



About Aim

  • Track and version ML runs
  • Visualize runs via a beautiful UI
  • Query runs metadata via SDK

Aim is an open-source, self-hosted ML experiment tracking tool. It is built to track thousands of training runs and lets you compare them with a performant and beautiful UI.

You can use not only the Aim UI but also its SDK to query your runs' metadata programmatically. That is especially useful for automation and for additional analysis in a Jupyter Notebook.

Aim's mission is to democratize AI dev tools.

Why use Aim?

Compare 100s of runs in a few clicks - build models faster

  • Compare, group and aggregate 100s of metrics with effective visualizations.
  • Analyze correlations and patterns between hparams and metrics.
  • Use easy pythonic search to query exactly the runs you want to explore (see the sketch below).
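
For example, a pythonic search expression might look like the following minimal sketch; the hparam names (learning_rate, batch_size) are made up for illustration, and the same style of expression is what the SDK's query_metrics accepts in the Quick Start below.

# Select the 'loss' metric from runs with certain hyperparameters
# (hypothetical hparam names -- adjust to whatever you actually track)
query = (
    "metric.name == 'loss' "
    "and run.hparams.learning_rate > 0.0001 "
    "and run.hparams.batch_size == 32"
)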

Deep dive into details of each run for easy debugging

  • Hyperparameters, metrics, images, distributions, audio, text - all available in an intuitive UI to help you understand the performance of your model.
  • Easily track plots built with your favourite visualisation tools, like plotly and matplotlib (see the sketch after this list).
  • Analyze system resource usage to utilize computational resources effectively.
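
Below is a minimal sketch of tracking a Plotly figure and enabling system resource tracking. It assumes Aim's Figure object and the Run(system_tracking_interval=...) argument described in the Aim docs; treat the exact names as assumptions and check the documentation.

from aim import Run, Figure
import plotly.express as px

# system_tracking_interval (seconds) controls how often system metrics
# (CPU, memory, GPU, ...) are sampled -- assumed parameter name
run = Run(system_tracking_interval=30)

fig = px.line(x=[0, 1, 2], y=[1.0, 0.6, 0.4], title="loss curve")
# Wrap the Plotly figure in Aim's Figure object before tracking
run.track(Figure(fig), name="loss_curve", step=0)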

Have all relevant information organised and accessible for easy governance

  • Centralized dashboard to holistically view all your runs, their hparams and results.
  • Use SDK to query/access all your runs and tracked metadata.
  • You own your data - Aim is open source and self-hosted.

Demos

  • Machine translation – training logs of a neural translation model (from the WMT'19 competition).
  • lightweight-GAN – training logs of the 'lightweight' GAN proposed at ICLR 2021.
  • FastSpeech 2 – training logs of Microsoft's "FastSpeech 2: Fast and High-Quality End-to-End Text to Speech".
  • Simple MNIST – simple MNIST training logs.

Quick Start

Follow the steps below to get started with Aim.

1. Install Aim on your training environment

pip3 install aim

2. Integrate Aim with your code

from aim import Run

# Initialize a new run
run = Run()

# Log run parameters
run["hparams"] = {
    "learning_rate": 0.001,
    "batch_size": 32,
}

# Log metrics
for i in range(10):
    run.track(i, name='loss', step=i, context={ "subset":"train" })
    run.track(i, name='acc', step=i, context={ "subset":"train" })

See the full list of supported trackable objects (e.g. images, text, etc.) here.
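
For instance, tracking richer object types might look like the sketch below; it assumes the Image and Text objects from Aim's trackable-object list (check the documentation for the authoritative API).

from aim import Run, Image, Text
import numpy as np

run = Run()

# Track a grayscale image built from a NumPy array
noise = (np.random.rand(64, 64) * 255).astype("uint8")
run.track(Image(noise, caption="random noise"), name="samples", step=0)

# Track a piece of text, e.g. a generated sample or a short note
run.track(Text("epoch 0 finished"), name="notes", step=0)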

3. Run the training as usual and start Aim UI

aim up

4. Or query runs programmatically via SDK

from aim import Repo

my_repo = Repo('/path/to/aim/repo')

query = "metric.name == 'loss'" # Example query

# Get collection of metrics
for run_metrics_collection in my_repo.query_metrics(query).iter_runs():
    for metric in run_metrics_collection:
        # Get run params
        params = metric.run[...]
        # Get metric values
        steps, metric_values = metric.values.sparse_numpy()
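
As a small follow-up, here is one way the arrays returned above might be post-processed, e.g. to record the lowest tracked loss per run. This is an illustrative sketch: it assumes metric.run exposes a hash attribute and that sparse_numpy() returns NumPy arrays, as in the snippet above.

import numpy as np
from aim import Repo

repo = Repo('/path/to/aim/repo')

# Lowest 'loss' value reached by each run (illustrative sketch)
best_loss = {}
for run_metrics_collection in repo.query_metrics("metric.name == 'loss'").iter_runs():
    for metric in run_metrics_collection:
        steps, values = metric.values.sparse_numpy()
        best_loss[metric.run.hash] = float(np.min(values))

print(best_loss)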

Integrations

Integrate PyTorch Lightning
from aim.pytorch_lightning import AimLogger
import pytorch_lightning as pl

# ...
trainer = pl.Trainer(logger=AimLogger(experiment='experiment_name'))
# ...

See documentation here.

Integrate Hugging Face
from aim.hugging_face import AimCallback
from transformers import Trainer

# ...
aim_callback = AimCallback(repo='/path/to/logs/dir', experiment='mnli')
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset if training_args.do_train else None,
    eval_dataset=eval_dataset if training_args.do_eval else None,
    callbacks=[aim_callback],
    # ...
)
# ...

See documentation here.

Integrate Keras & tf.keras
import aim

# ...
model.fit(x_train, y_train, epochs=epochs, callbacks=[
    aim.keras.AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')

    # Use aim.tensorflow.AimCallback instead in case of tf.keras:
    # aim.tensorflow.AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')
])
# ...

See documentation here.

Integrate XGBoost
from aim.xgboost import AimCallback
import xgboost as xgb

# ...
aim_callback = AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')
bst = xgb.train(param, xg_train, num_round, watchlist, callbacks=[aim_callback])
# ...

See documentation here.

Comparisons to familiar tools

TensorBoard

Training run comparison

Order of magnitude faster training run comparison with Aim

  • Tracked params are first-class citizens in Aim. You can search, group and aggregate by params and deeply explore all the tracked data (metrics, params, images) in the UI.
  • With TensorBoard, users are forced to encode those parameters in the training run name in order to search and compare. This makes comparison tedious and causes usability issues in the UI when there are many experiments and params. TensorBoard also has no features to group or aggregate metrics.

Scalability

  • Aim is built to handle 1000s of training runs - both on the backend and on the UI.
  • TensorBoard becomes really slow and hard to use when a few hundred training runs are queried / compared.

Beloved TB visualizations to be added to Aim

  • Embedding projector.
  • Neural network visualization.

MLflow

MLflow is an end-to-end ML lifecycle tool, while Aim is focused on tracking training runs. The main differences between Aim and MLflow are UI scalability and run comparison features.

Run comparison

  • Aim treats tracked parameters as first-class citizens. Users can query runs, metrics, images and filter using the params.
  • MLflow does offer search over tracked config, but it lacks grouping, aggregation, subplotting by hyperparams and other comparison features.

UI Scalability

  • The Aim UI can smoothly handle several thousand metrics at a time, each with 1000s of steps. It may get shaky when you explore 1000s of metrics with 10000s of steps each, but we are constantly optimizing!
  • The MLflow UI becomes slow to use with a few hundred runs.

Weights and Biases

Hosted vs self-hosted

  • Weights and Biases is a hosted closed-source MLOps platform.
  • Aim is a self-hosted, free and open-source experiment tracking tool.

Roadmap

Detailed Sprints

The Aim product roadmap

  • The Backlog contains the issues we choose from and prioritize weekly
  • Issues are prioritized mainly by the most highly requested features

High-level roadmap

The high-level features we are going to work on over the next few months:

Done

  • Live updates (Shipped: Oct 18 2021)
  • Images tracking and visualization (Start: Oct 18 2021, Shipped: Nov 19 2021)
  • Distributions tracking and visualization (Start: Nov 10 2021, Shipped: Dec 3 2021)
  • Jupyter integration (Start: Nov 18 2021, Shipped: Dec 3 2021)
  • Audio tracking and visualization (Start: Dec 6 2021, Shipped: Dec 17 2021)
  • Transcripts tracking and visualization (Start: Dec 6 2021, Shipped: Dec 17 2021)
  • Plotly integration (Start: Dec 1 2021, Shipped: Dec 17 2021)
  • Colab integration (Start: Nov 18 2021, Shipped: Dec 17 2021)
  • Centralized tracking server (Start: Oct 18 2021, Shipped: Jan 22 2022)
  • TensorBoard adaptor - visualize TensorBoard logs with Aim (Start: Dec 17 2021, Shipped: Feb 3 2022)
  • Track git info, env vars, CLI arguments, dependencies (Start: Jan 17 2022, Shipped: Feb 3 2022)
  • MLflow adaptor (visualize MLflow logs with Aim) (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • Activeloop Hub integration (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • PyTorch-Ignite integration (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • Run summary and overview info (system params, CLI args, git info, ...) (Start: Feb 14 2022, Shipped: Mar 9 2022)
  • Add DVC related metadata into aim run (Start: Mar 7 2022, Shipped: Mar 26 2022)
  • Ability to attach notes to Run from UI (Start: Mar 7 2022, Shipped: Apr 29 2022)

In Progress

  • Cloud storage support – store runs' blob data (e.g. images) in the cloud (Start: Mar 21 2022)
  • Artifact storage – store files, model checkpoints, and beyond (Start: Mar 21 2022)
  • Run execution details (display stdout/stderr logs) (Start: Apr 25 2022)
  • Long sequences (up to 5M steps) support (Start: Apr 25 2022)

To Do

Aim UI

  • Runs management
    • Runs explorer – query and visualize runs data (images, audio, distributions, ...) in a central dashboard
  • Explorers
    • Audio Explorer
    • Text Explorer
    • Figures Explorer
    • Distributions Explorer
  • Dashboards – customizable layouts with embedded explorers

SDK and Storage

  • Scalability
    • Smooth UI and SDK experience with over 10,000 runs
  • Runs management
    • SDK interfaces
      • Reporting – query and compare runs, explore data with familiar tools such as matplotlib and pandas
      • Manipulations – copy, move, delete runs, params and sequences
    • CLI interfaces
      • Reporting - runs summary and run details in a CLI compatible format
      • Manipulations – copy, move, delete runs, params and sequences

Integrations

  • ML Frameworks:
    • Shortlist: MONAI, SpaCy, AllenNLP, LightGBM, Raytune, Fairseq, fast.ai, KerasTuner
  • Datasets versioning tools
    • Shortlist: HuggingFace Datasets
  • Resource management tools
    • Shortlist: Kubeflow, Slurm
  • Workflow orchestration tools
  • Others: Hydra, Google MLMD, Streamlit, ...

On hold

  • scikit-learn integration

Community

If you have questions

  1. Read the docs
  2. Open a feature request or report a bug
  3. Join our Slack


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

  • aim-3.10.0.tar.gz (1.5 MB) – Source

Built Distributions

  • aim-3.10.0-cp310-cp310-manylinux_2_24_x86_64.whl (5.5 MB) – CPython 3.10, manylinux: glibc 2.24+ x86-64
  • aim-3.10.0-cp310-cp310-macosx_11_0_arm64.whl (2.3 MB) – CPython 3.10, macOS 11.0+ ARM64
  • aim-3.10.0-cp310-cp310-macosx_10_14_x86_64.whl (2.4 MB) – CPython 3.10, macOS 10.14+ x86-64
  • aim-3.10.0-cp39-cp39-manylinux_2_24_x86_64.whl (5.6 MB) – CPython 3.9, manylinux: glibc 2.24+ x86-64
  • aim-3.10.0-cp39-cp39-macosx_11_0_arm64.whl (2.3 MB) – CPython 3.9, macOS 11.0+ ARM64
  • aim-3.10.0-cp39-cp39-macosx_10_14_x86_64.whl (2.3 MB) – CPython 3.9, macOS 10.14+ x86-64
  • aim-3.10.0-cp38-cp38-manylinux_2_24_x86_64.whl (5.8 MB) – CPython 3.8, manylinux: glibc 2.24+ x86-64
  • aim-3.10.0-cp38-cp38-macosx_11_0_arm64.whl (2.3 MB) – CPython 3.8, macOS 11.0+ ARM64
  • aim-3.10.0-cp38-cp38-macosx_10_14_x86_64.whl (2.4 MB) – CPython 3.8, macOS 10.14+ x86-64
  • aim-3.10.0-cp37-cp37m-manylinux_2_24_x86_64.whl (5.5 MB) – CPython 3.7m, manylinux: glibc 2.24+ x86-64
  • aim-3.10.0-cp37-cp37m-macosx_10_14_x86_64.whl (2.4 MB) – CPython 3.7m, macOS 10.14+ x86-64
  • aim-3.10.0-cp36-cp36m-manylinux_2_24_x86_64.whl (5.2 MB) – CPython 3.6m, manylinux: glibc 2.24+ x86-64
  • aim-3.10.0-cp36-cp36m-macosx_10_14_x86_64.whl (2.4 MB) – CPython 3.6m, macOS 10.14+ x86-64

File details

Details for the file aim-3.10.0.tar.gz.

File metadata

  • Download URL: aim-3.10.0.tar.gz
  • Upload date:
  • Size: 1.5 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.0 CPython/3.10.4

File hashes

Hashes for aim-3.10.0.tar.gz
Algorithm Hash digest
SHA256 adac2f57d84b1db60f8bb2dc3bc50fc8a8ab410138e388412d3fbe7c19d4ce0f
MD5 d7829291f47e4898e9bfa907105316da
BLAKE2b-256 309153bfbad5bcc61372501c96ead614694fbe82c68b6b220685ebb08b3a606c

File details

Details for the file aim-3.10.0-cp310-cp310-manylinux_2_24_x86_64.whl.

File hashes

Hashes for aim-3.10.0-cp310-cp310-manylinux_2_24_x86_64.whl
Algorithm Hash digest
SHA256 2c443916c5ca6195f367c2778a11ec4a42b9d0cbd31801df6ad1fde6dbd8e8e6
MD5 6973d764a73bf839c9462f689901f707
BLAKE2b-256 606c450991f5e85bbbceb6ed108c8f98cdea7f5a51452706a9ca8c3af04213be

File details

Details for the file aim-3.10.0-cp310-cp310-macosx_11_0_arm64.whl.

File hashes

Hashes for aim-3.10.0-cp310-cp310-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 e181b46c481be8875f5150b5a5a96216bd6b8cf53d0336b36ce8224ec5cf61f1
MD5 145876d23a7ecfbabe2fcd253090014c
BLAKE2b-256 adfb7e5795b625d405038dae259ef9f8cfc5ef0a73ee649efb0ccf74aa37037b

File details

Details for the file aim-3.10.0-cp310-cp310-macosx_10_14_x86_64.whl.

File hashes

Hashes for aim-3.10.0-cp310-cp310-macosx_10_14_x86_64.whl
Algorithm Hash digest
SHA256 169c96ea144d23aea6f80923fd546dc909a243e38e622cb0483916f116b0b85f
MD5 26f7690fb535c228e2708ebc8610475f
BLAKE2b-256 853278708710eab9b3563cdda48548282498f63cbeccd5b1a53f719a4df71e2d

File details

Details for the file aim-3.10.0-cp39-cp39-manylinux_2_24_x86_64.whl.

File hashes

Hashes for aim-3.10.0-cp39-cp39-manylinux_2_24_x86_64.whl
Algorithm Hash digest
SHA256 fdd70e880f75b4ec03576eb6120c70c9891aea6c6c6ed8d7db6c453b3c2143aa
MD5 54fbf7ef80a5cee78a7c39024fb480b9
BLAKE2b-256 6241119657bdad3fb3e1f5a8bea0f5111b1a80b6b9aad66bff76139fd6e7a108

File details

Details for the file aim-3.10.0-cp39-cp39-macosx_11_0_arm64.whl.

File hashes

Hashes for aim-3.10.0-cp39-cp39-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 24a449e98747be34905b4886d2d259e9e6286d9ee887e0b95dcf0d47b3017ebb
MD5 544572ade138eb4122d480217169e0b8
BLAKE2b-256 216616e25795b5e000c79b3298c871c4643c61b5fc59c1954d25fcbd0f6c213b

File details

Details for the file aim-3.10.0-cp39-cp39-macosx_10_14_x86_64.whl.

File hashes

Hashes for aim-3.10.0-cp39-cp39-macosx_10_14_x86_64.whl
Algorithm Hash digest
SHA256 49f382f659e3ec585a88b7d5c48c022844c9d486a29394335ba03b5d0e217758
MD5 56d194386f59d93a29f0f88bedb40373
BLAKE2b-256 c38755cac4b0a3835563f80749fda1eb84c3ba920b683641a12c50cb069f0126

File details

Details for the file aim-3.10.0-cp38-cp38-manylinux_2_24_x86_64.whl.

File hashes

Hashes for aim-3.10.0-cp38-cp38-manylinux_2_24_x86_64.whl
Algorithm Hash digest
SHA256 8115145f62ef49d5453c28f813b58ff18ad60d83706b0bae1af380f8e1564114
MD5 b8f429e8ebb84baff24f5615a60932ae
BLAKE2b-256 e85ab38cc59724e1e9bb3f64bb179caccf85d0976db19dec350fc18404a9b025

File details

Details for the file aim-3.10.0-cp38-cp38-macosx_11_0_arm64.whl.

File hashes

Hashes for aim-3.10.0-cp38-cp38-macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 3e34562c840e161f59cd95e36d14fd068cf0974bcb3b6dda76bee9caa7903935
MD5 f8c66cb64f813ed1f20b4a7f911d4942
BLAKE2b-256 4fb780b42048e9db0ce83db77ab1ffe2d92ac66d934d73c31b95bd0e5fdb984e

File details

Details for the file aim-3.10.0-cp38-cp38-macosx_10_14_x86_64.whl.

File hashes

Hashes for aim-3.10.0-cp38-cp38-macosx_10_14_x86_64.whl
Algorithm Hash digest
SHA256 bfb32abd6a9f4dd6ca5dca262a707622ac50f82358d128585d841865ec997b6b
MD5 7a343536757aa8b075aefc04038f471c
BLAKE2b-256 e9a33343fa839ef731d7b31d7b0fcff5a81c0833cec9d7e3e4ef2fe56bd11864

File details

Details for the file aim-3.10.0-cp37-cp37m-manylinux_2_24_x86_64.whl.

File hashes

Hashes for aim-3.10.0-cp37-cp37m-manylinux_2_24_x86_64.whl
Algorithm Hash digest
SHA256 2529bb79166e9a38a98cef6ba883196c6b523cd449621985f9b3afacb724a2e1
MD5 a85d7184b29925d4fa48a7dddd21cf48
BLAKE2b-256 37535c0992ef725612758861c02de07aecddb7713ea822146da61d9099708e1d

File details

Details for the file aim-3.10.0-cp37-cp37m-macosx_10_14_x86_64.whl.

File hashes

Hashes for aim-3.10.0-cp37-cp37m-macosx_10_14_x86_64.whl
Algorithm Hash digest
SHA256 fb3baf0c820edac02ea28f9264eb868908db95076c85bd326dbe545f2e9e815b
MD5 967a9111d2d99c1ec02c195d023113bd
BLAKE2b-256 3eeebfb3a4c582a02fd4b8b20fb636b07b5b364faafe54565ccbd28419e08873

File details

Details for the file aim-3.10.0-cp36-cp36m-manylinux_2_24_x86_64.whl.

File hashes

Hashes for aim-3.10.0-cp36-cp36m-manylinux_2_24_x86_64.whl
Algorithm Hash digest
SHA256 8e36cb3764d06c0236184605d1358a306deb8e98531563013cf35f97f58be9af
MD5 96537b5885235e7ebe0c18f97fe7102f
BLAKE2b-256 da0a91ca0bf5e50208f0b90b3fb0ed87546f149efe5418061ccf4c79f82d174d

File details

Details for the file aim-3.10.0-cp36-cp36m-macosx_10_14_x86_64.whl.

File metadata

  • Download URL: aim-3.10.0-cp36-cp36m-macosx_10_14_x86_64.whl
  • Upload date:
  • Size: 2.4 MB
  • Tags: CPython 3.6m, macOS 10.14+ x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.27.1 requests-toolbelt/0.9.1 urllib3/1.26.9 tqdm/4.64.0 importlib-metadata/4.8.3 keyring/23.4.1 rfc3986/1.5.0 colorama/0.4.4 CPython/3.6.13

File hashes

Hashes for aim-3.10.0-cp36-cp36m-macosx_10_14_x86_64.whl
Algorithm Hash digest
SHA256 e20d3210470c991da8c0c2ebed97cd3537a17d70d7087d91b76c15805b14e683
MD5 70664eb395a9c61c49364a14451418d7
BLAKE2b-256 22661cc55b9f7a35806e9d6336a88bd3b555aadbd8b3db72caa5b00afbfb28e6
