
A super-easy way to record, search and compare AI experiments.

Project description

An easy-to-use & supercharged open-source experiment tracker

Aim logs your training runs, enables a beautiful UI to compare them and an API to query them programmatically.

About • Features • Demos • Examples • Quick Start • Documentation • Roadmap • Slack Community • Twitter


Integrates seamlessly with your favorite tools



About Aim

  • Track and version ML runs
  • Visualize runs via beautiful UI
  • Query runs metadata via SDK

Aim is an open-source, self-hosted ML experiment tracking tool. It is built to track thousands of training runs and lets you compare them through a performant and beautiful UI.

Beyond the Aim UI, you can also use the SDK to query your runs' metadata programmatically. That is especially useful for automation and for additional analysis in a Jupyter notebook.

Aim's mission is to democratize AI dev tools.

Why use Aim?

Compare 100s of runs in a few clicks - build models faster

  • Compare, group and aggregate 100s of metrics thanks to effective visualizations.
  • Analyze and uncover correlations and patterns between hparams and metrics.
  • Easy pythonic search to query exactly the runs you want to explore (see the example query below).
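
For illustration, a pythonic search expression of this form (assuming the hparams logged in the Quick Start below) filters runs both in the explorers and via the SDK:

metric.name == "loss" and run.hparams.learning_rate > 0.0001 and run.hparams.batch_size == 32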

Deep dive into details of each run for easy debugging

  • Hyperparameters, metrics, images, distributions, audio, text - everything is at hand in an intuitive UI to understand the performance of your model.
  • Easily track plots built with your favorite visualization tools, like plotly and matplotlib.
  • Analyze system resource usage to effectively utilize computational resources.

Have all relevant information organised and accessible for easy governance

  • Centralized dashboard to holistically view all your runs, their hparams and results.
  • Use SDK to query/access all your runs and tracked metadata.
  • You own your data - Aim is open source and self hosted.

Demos

  • Machine translation – training logs of a neural translation model (from the WMT'19 competition).
  • lightweight-GAN – training logs of the "lightweight" GAN proposed at ICLR 2021.
  • FastSpeech 2 – training logs of Microsoft's "FastSpeech 2: Fast and High-Quality End-to-End Text to Speech".
  • Simple MNIST – simple MNIST training logs.

Quick Start

Follow the steps below to get started with Aim.

1. Install Aim on your training environment

pip3 install aim

2. Integrate Aim with your code

from aim import Run

# Initialize a new run
run = Run()

# Log run parameters
run["hparams"] = {
    "learning_rate": 0.001,
    "batch_size": 32,
}

# Log metrics
for i in range(10):
    run.track(i, name='loss', step=i, context={ "subset":"train" })
    run.track(i, name='acc', step=i, context={ "subset":"train" })

See the full list of supported trackable objects (e.g. images, text) here.
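
As a minimal, illustrative sketch (the random array below just stands in for a real image), tracking an image works the same way as tracking a metric:

import numpy as np
from aim import Run, Image

run = Run()

# A stand-in 64x64 RGB image; Image() also accepts e.g. PIL images
pixels = np.random.randint(0, 255, size=(64, 64, 3), dtype=np.uint8)
run.track(Image(pixels, caption='example'), name='samples', step=0)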

3. Run the training as usual and start Aim UI

aim up
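
By default, aim up serves the Aim repository in the current directory; if needed, the repo path, host and port can be passed explicitly (example values shown):

aim up --repo /path/to/aim/repo --host 0.0.0.0 --port 43800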

4. Or query runs programmatically via SDK

from aim import Repo

my_repo = Repo('/path/to/aim/repo')

query = "metric.name == 'loss'" # Example query

# Get collection of metrics
for run_metrics_collection in my_repo.query_metrics(query).iter_runs():
    for metric in run_metrics_collection:
        # Get run params
        params = metric.run[...]
        # Get metric values
        steps, metric_values = metric.values.sparse_numpy()
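
The queried values plug straight into your usual analysis stack. Here is a purely illustrative sketch that collects them into a pandas DataFrame (the run index is just an enumeration added for grouping, not part of Aim's API):

import pandas as pd
from aim import Repo

my_repo = Repo('/path/to/aim/repo')

rows = []
for run_idx, run_metrics_collection in enumerate(my_repo.query_metrics("metric.name == 'loss'").iter_runs()):
    for metric in run_metrics_collection:
        steps, metric_values = metric.values.sparse_numpy()
        rows.extend({'run': run_idx, 'step': s, 'loss': v} for s, v in zip(steps, metric_values))

df = pd.DataFrame(rows)
print(df.groupby('run')['loss'].min())  # e.g. best loss per run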

Integrations

Integrate PyTorch Lightning
from aim.pytorch_lightning import AimLogger

# ...
trainer = pl.Trainer(logger=AimLogger(experiment='experiment_name'))
# ...

See documentation here.

Integrate Hugging Face
from aim.hugging_face import AimCallback

# ...
aim_callback = AimCallback(repo='/path/to/logs/dir', experiment='mnli')
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset if training_args.do_train else None,
    eval_dataset=eval_dataset if training_args.do_eval else None,
    callbacks=[aim_callback],
    # ...
)
# ...

See documentation here.

Integrate Keras & tf.keras
import aim

# ...
model.fit(x_train, y_train, epochs=epochs, callbacks=[
    aim.keras.AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')

    # Use aim.tensorflow.AimCallback in case of tf.keras:
    # aim.tensorflow.AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')
])
# ...

See documentation here.

Integrate KerasTuner
from aim.keras_tuner import AimCallback

# ...
tuner.search(
    train_ds,
    validation_data=test_ds,
    callbacks=[AimCallback(tuner=tuner, repo='.', experiment='keras_tuner_test')],
)
# ...

See documentation here.

Integrate XGBoost
from aim.xgboost import AimCallback

# ...
aim_callback = AimCallback(repo='/path/to/logs/dir', experiment='experiment_name')
bst = xgb.train(param, xg_train, num_round, watchlist, callbacks=[aim_callback])
# ...

See documentation here.

Integrate CatBoost
from aim.catboost import AimLogger

# ...
model.fit(train_data, train_labels, log_cout=AimLogger(loss_function='Logloss'), logging_level="Info")
# ...

See documentation here.

Integrate fastai
from aim.fastai import AimCallback

# ...
learn = cnn_learner(dls, resnet18, pretrained=True,
                    loss_func=CrossEntropyLossFlat(),
                    metrics=accuracy, model_dir="/tmp/model/",
                    cbs=AimCallback(repo='.', experiment='fastai_test'))
# ...

See documentation here.

Integrate LightGBM
from aim.lightgbm import AimCallback

# ...
aim_callback = AimCallback(experiment='lgb_test')
aim_callback.experiment['hparams'] = params

gbm = lgb.train(params,
                lgb_train,
                num_boost_round=20,
                valid_sets=lgb_eval,
                callbacks=[aim_callback, lgb.early_stopping(stopping_rounds=5)])
# ...

See documentation here.

Integrate PyTorch Ignite
from aim.pytorch_ignite import AimLogger

# ...
aim_logger = AimLogger()

aim_logger.log_params({
    "model": model.__class__.__name__,
    "pytorch_version": str(torch.__version__),
    "ignite_version": str(ignite.__version__),
})

aim_logger.attach_output_handler(
    trainer,
    event_name=Events.ITERATION_COMPLETED,
    tag="train",
    output_transform=lambda loss: {'loss': loss}
)
# ...

See documentation here.

Comparisons to familiar tools

TensorBoard

Training run comparison

Order of magnitude faster training run comparison with Aim

  • Tracked params are first-class citizens in Aim. You can search, group and aggregate via params and deeply explore all the tracked data (metrics, params, images) on the UI.
  • With TensorBoard, users are forced to encode those parameters in the training run name to be able to search and compare. This makes comparison tedious and causes usability issues on the UI when there are many experiments and params. TensorBoard also has no features to group and aggregate the metrics.

Scalability

  • Aim is built to handle 1000s of training runs - both on the backend and on the UI.
  • TensorBoard becomes really slow and hard to use when a few hundred training runs are queried / compared.

Beloved TB visualizations to be added to Aim

  • Embedding projector.
  • Neural network visualization.

MLFlow

MLflow is an end-to-end ML lifecycle tool, while Aim is focused on training-run tracking. The main differences between Aim and MLflow are in UI scalability and run comparison features.

Run comparison

  • Aim treats tracked parameters as first-class citizens. Users can query runs, metrics, images and filter using the params.
  • MLflow does have search by tracked config, but grouping, aggregation, subplotting by hyperparams and other comparison features are not available.

UI Scalability

  • The Aim UI can smoothly handle several thousand metrics at a time, each with 1000s of steps. It may get shaky when you explore 1000s of metrics with 10000s of steps each, but we are constantly optimizing!
  • The MLflow UI becomes slow to use with a few hundred runs.

Weights and Biases

Hosted vs self-hosted

  • Weights and Biases is a hosted closed-source MLOps platform.
  • Aim is a self-hosted, free and open-source experiment tracking tool.

Roadmap

Detailed Sprints

The Aim product roadmap

  • The Backlog contains the issues we choose from and prioritize weekly
  • Issues are prioritized mainly by the most highly requested features

High-level roadmap

The high-level features we are going to work on over the next few months

Done

  • Live updates (Shipped: Oct 18 2021)
  • Images tracking and visualization (Start: Oct 18 2021, Shipped: Nov 19 2021)
  • Distributions tracking and visualization (Start: Nov 10 2021, Shipped: Dec 3 2021)
  • Jupyter integration (Start: Nov 18 2021, Shipped: Dec 3 2021)
  • Audio tracking and visualization (Start: Dec 6 2021, Shipped: Dec 17 2021)
  • Transcripts tracking and visualization (Start: Dec 6 2021, Shipped: Dec 17 2021)
  • Plotly integration (Start: Dec 1 2021, Shipped: Dec 17 2021)
  • Colab integration (Start: Nov 18 2021, Shipped: Dec 17 2021)
  • Centralized tracking server (Start: Oct 18 2021, Shipped: Jan 22 2022)
  • Tensorboard adaptor - visualize TensorBoard logs with Aim (Start: Dec 17 2021, Shipped: Feb 3 2022)
  • Track git info, env vars, CLI arguments, dependencies (Start: Jan 17 2022, Shipped: Feb 3 2022)
  • MLFlow adaptor (visualize MLflow logs with Aim) (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • Activeloop Hub integration (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • PyTorch-Ignite integration (Start: Feb 14 2022, Shipped: Feb 22 2022)
  • Run summary and overview info (system params, CLI args, git info, ...) (Start: Feb 14 2022, Shipped: Mar 9 2022)
  • Add DVC related metadata into aim run (Start: Mar 7 2022, Shipped: Mar 26 2022)
  • Ability to attach notes to Run from UI (Start: Mar 7 2022, Shipped: Apr 29 2022)
  • Fairseq integration (Start: Mar 27 2022, Shipped: Mar 29 2022)
  • LightGBM integration (Start: Apr 14 2022, Shipped: May 17 2022)
  • CatBoost integration (Start: Apr 20 2022, Shipped: May 17 2022)
  • Run execution details (display stdout/stderr logs) (Start: Apr 25 2022, Shipped: May 17 2022)
  • Long sequences (up to 5M steps) support (Start: Apr 25 2022, Shipped: Jun 22 2022)
  • Figures Explorer (Start: Mar 1 2022, Shipped: Aug 21 2022)
  • Notify on stuck runs (Start: Jul 22 2022, Shipped: Aug 21 2022)
  • Integration with KerasTuner (Start: Aug 10 2022, Shipped: Aug 21 2022)
  • Integration with WandB (Start: Aug 15 2022, Shipped: Aug 21 2022)
  • Stable remote tracking server (Start: Jun 15 2022, Shipped: Aug 21 2022)

In Progress

  • Project overview page (Start: Sep 1 2022)
  • Remote tracking server scaling (Start: Sep 1 2022)
  • Integration with fast.ai (Start: Aug 22 2022)
  • Integration with MXNet (Start: Sep 20 2022)
  • Aim SDK low-level interface (Start: Aug 22 2022)

To Do

Aim UI

  • Runs management
    • Runs explorer – query and visualize runs data (images, audio, distributions, ...) in a central dashboard
  • Explorers
    • Audio Explorer
    • Text Explorer
    • Distributions Explorer
  • Dashboards – customizable layouts with embedded explorers

SDK and Storage

  • Scalability
    • Smooth UI and SDK experience with over 10,000 runs
  • Runs management
    • CLI interfaces
      • Reporting - runs summary and run details in a CLI compatible format
      • Manipulations – copy, move, delete runs, params and sequences

Integrations

  • ML Frameworks:
    • Shortlist: MONAI, SpaCy, Raytune, MXNet, PaddlePaddle
  • Datasets versioning tools
    • Shortlist: HuggingFace Datasets
  • Resource management tools
    • Shortlist: Kubeflow, Slurm
  • Workflow orchestration tools
  • Others: Hydra, Google MLMD, Streamlit, ...

On hold

  • scikit-learn integration
  • Cloud storage support – store runs' blob data (e.g. images) on the cloud (Start: Mar 21 2022)
  • Artifact storage – store files, model checkpoints, and beyond (Start: Mar 21 2022)

Community

If you have questions

  1. Read the docs
  2. Open a feature request or report a bug
  3. Join our Slack



