TRAINS - Auto-Magical Experiment Manager & Version Control for AI
Allegro Trains
Auto-Magical Experiment Manager, Version Control and ML-Ops for AI
"Because it’s a jungle out there"
:point_right: Help improve Trains by filling out our 2-minute user survey
Trains is our solution to a problem we share with countless other researchers and developers in the machine learning/deep learning universe: Training production-grade deep learning models is a glorious but messy process. Trains tracks and controls the process by associating code version control, research projects, performance metrics, and model provenance.
We designed Trains specifically to require effortless integration so that teams can preserve their existing methods and practices. Use it on a daily basis to boost collaboration and visibility, or use it to automatically collect your experimentation logs, outputs, and data to one centralized server.
We have a demo server up and running at https://demoapp.trains.allegro.ai.
:steam_locomotive: Getting Started Tutorial :rocket:
You can try out Trains and test your code with no additional setup.
Trains Automatically Logs Everything
With only two lines of code, here is what you get:
- Git repository, branch, commit id, entry point and local git diff
- Python environment (including specific packages & versions)
- stdout and stderr
- Resource Monitoring (CPU/GPU utilization, temperature, IO, network, etc.)
- Hyper-parameters
- ArgParser for command line parameters with currently used values
- Explicit parameters dictionary
- TensorFlow Defines (absl-py)
- Initial model weights file
- Model snapshots (with optional automatic upload to central storage: shared folder, S3, GS, Azure, HTTP)
- Artifacts log & store (shared folder, S3, GS, Azure, HTTP)
- TensorBoard/TensorboardX scalars, metrics, histograms, images, audio, and video
- Matplotlib & Seaborn
- Supported frameworks: PyTorch, TensorFlow, Keras, AutoKeras, XGBoost, and scikit-learn (MXNet is coming soon)
- Seamless integration (including version control) with Jupyter Notebook and PyCharm remote debugging
Additionally, log data explicitly using Trains Explicit Logging.
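For instance, explicit logging can report scalar series and store arbitrary artifacts through the Task's Logger. A minimal sketch, assuming the trains package is installed and a server (or the public demo server) is reachable; the project name, task name, and metric values below are illustrative:

```python
# Sketch: explicit logging with Trains. The loss values are mock data.
mock_losses = [0.9, 0.6, 0.42]  # e.g. one value per training iteration

if __name__ == "__main__":
    from trains import Task  # imported here so the file can be read without trains installed

    task = Task.init(project_name="examples", task_name="explicit logging")
    logger = task.get_logger()
    for iteration, loss in enumerate(mock_losses):
        # report_scalar plots a point on the "train" series of the "loss" graph
        logger.report_scalar(title="loss", series="train", value=loss, iteration=iteration)
    # upload_artifact stores an arbitrary Python object alongside the experiment
    task.upload_artifact(name="final loss", artifact_object={"loss": mock_losses[-1]})
```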
Using Trains
Trains is a two-part solution:

1. The Trains Python package, which auto-magically connects with your code.

   Trains requires only two lines of code for full integration.

   To connect your code with Trains:

   - Install the package:

     ```
     pip install trains
     ```

     Add optional cloud storage support (S3/GoogleStorage/Azure):

     ```
     pip install trains[s3]
     pip install trains[gs]
     pip install trains[azure]
     ```

   - Add the following lines to your code:

     ```python
     from trains import Task
     task = Task.init(project_name="my project", task_name="my task")
     ```

     - If project_name is not provided, the repository name is used instead.
     - If task_name (experiment) is not provided, the current filename is used instead.
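Beyond the automatic argparse capture, a hyper-parameter dictionary can also be connected explicitly to the Task. A minimal sketch, assuming trains is installed and can reach a server; the project/task names and parameter values are placeholders:

```python
# Sketch: connecting an explicit hyper-parameter dictionary to a Task.
params = {"batch_size": 64, "learning_rate": 1e-3, "epochs": 10}

if __name__ == "__main__":
    from trains import Task  # imported here so the sketch is readable without trains installed

    task = Task.init(project_name="my project", task_name="my task")
    # Task.connect registers the dict with the experiment; values edited in
    # the web UI are written back into it when the experiment is re-run
    params = task.connect(params)
    print("training with", params)
```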
   - Run your code. When Trains connects to the server, a link to the results page is printed. For example:

     ```
     Trains Results page: https://demoapp.trains.allegro.ai/projects/76e5e2d45e914f52880621fe64601e85/experiments/241f06ae0f5c4b27b8ce8b64890ce152/output/log
     ```

   - Open the link to view your experiment parameters, model, and TensorBoard metrics.

   See examples here.
2. The Trains Server, for logging, querying, control, and UI (Web-App).

   We already have a demo server up and running for you at https://demoapp.trains.allegro.ai. You can try out Trains without installing your own trains-server: just add the two lines of code, and it will automatically connect to the Trains demo server.

   Note that the demo server resets every 24 hours, and all logged data is deleted.

   When you are ready to use your own Trains server, go ahead and install trains-server.
Configuring Your Own Trains Server

1. Install and run trains-server (see Installing the Trains Server).

2. Run the initial configuration wizard for your Trains installation and follow the instructions to set up the Trains package (http://trains-server-ip:port and user credentials):

   ```
   trains-init
   ```

After installing and configuring, you can find your configuration file at ~/trains.conf.
A sample configuration file is available here.
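For orientation, the server section of a trains.conf typically looks along these lines (HOCON syntax; the addresses, ports, and credentials below are placeholders to be replaced with the values printed by your own installation, not defaults to copy):

```
api {
    # Addresses of your trains-server deployment
    web_server: http://trains-server-ip:8080
    api_server: http://trains-server-ip:8008
    files_server: http://trains-server-ip:8081
    credentials {
        access_key: "YOUR_ACCESS_KEY"
        secret_key: "YOUR_SECRET_KEY"
    }
}
```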
Who We Are
Trains is supported by the same team behind allegro.ai, where we build deep learning pipelines and infrastructure for enterprise companies.
We built Trains to track and control the glorious but messy process of training production-grade deep learning models. We are committed to vigorously supporting and expanding the capabilities of Trains.
Why Are We Releasing Trains?
We believe Trains is ground-breaking. We wish to establish new standards of experiment management in deep learning and ML, and only the greater community can help us do that.
We promise to always remain backward compatible. If you start working with Trains today, even though this project is currently in beta, your logs and data will always upgrade with you.
License
Apache License, Version 2.0 (see the LICENSE for more information)
Documentation, Community & Support
More information is available in the official documentation and on YouTube.
For examples and use cases, check the examples folder and corresponding documentation.
If you have any questions, post on our Slack channel or tag your question on Stack Overflow with the 'trains' tag.
For feature requests or bug reports, please use GitHub issues.
Additionally, you can always find us at trains@allegro.ai
Contributing
See the Trains Guidelines for Contributing.
May the force (and the goddess of learning rates) be with you!
File details
Details for the file trains-0.16.4-py2.py3-none-any.whl.
File metadata
- Download URL: trains-0.16.4-py2.py3-none-any.whl
- Upload date:
- Size: 855.8 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.1.3 requests-toolbelt/0.9.1 tqdm/4.45.0 CPython/3.6.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | ea8bc6b7437613ae62bda63d189dd1cee1bf041f8a8addd1e2dd80f090fda932
MD5 | 2cc0f46ffa676f14770d12f3905938b2
BLAKE2b-256 | 31e49b283e7ed1ac1c9bb36e362ce6cbca6f604be000263f81f67c99aa5c34c5