Manage training results, weights and data flow of your TensorFlow models
Manage your data pipeline and TensorFlow & Keras models with MLPipe. It is NOT another "wrapper" around TensorFlow; rather, it adds utilities to set up an environment that controls data flow and manages trained models (weights & results) with the help of Conda and MongoDB.
If you create a new project, copy the environment.yml into your new folder and you are ready to go. If you want to use an existing TensorFlow (v2) project, copy the dependencies from this environment file into yours.
You can install either Anaconda or Miniconda (https://conda.io/miniconda.html); it is used for package and environment management. The environment.yml file specifies all required packages. For a new project, copy the file from this repo.
During installation, check the box to add conda to your PATH in the .bashrc file, or do it manually afterwards.
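If you skipped the installer option, the manual step amounts to appending conda's bin directory to your PATH in .bashrc. A minimal sketch, assuming the default Miniconda install location (adjust the path if conda lives elsewhere):

```shell
# Hypothetical default install location; adjust if your conda is elsewhere
echo 'export PATH="$HOME/miniconda3/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc
```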
```shell
>> conda env create -f environment.yml
# And in case you need to update the environment later on
>> conda env update -f environment.yml
```
This will create a conda environment and install all the needed packages (as described in environment.yml).
A MongoDB database is used to store trained models, including their weights and results. Additionally, there is a data reader for MongoDB implemented (basically just a generator, as you know and love from using Keras). Currently, that is the only data reader working "out of the box".
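The MongoDB data reader described above is essentially a generator that yields batches the way Keras expects. A minimal sketch of the idea, with illustrative collection and field names (`img`, `label`) that are assumptions, not MLPipe's actual schema:

```python
# Sketch of a Keras-style batch generator backed by a MongoDB collection.
# Field names "img" and "label" are hypothetical placeholders.
import numpy as np

def mongo_batch_generator(collection, batch_size=32):
    """Yield (x, y) batches indefinitely, as Keras' fit expects."""
    while True:
        cursor = collection.find({})  # re-query for each pass over the data
        batch_x, batch_y = [], []
        for doc in cursor:
            batch_x.append(np.asarray(doc["img"], dtype=np.float32))
            batch_y.append(doc["label"])
            if len(batch_x) == batch_size:
                yield np.stack(batch_x), np.asarray(batch_y)
                batch_x, batch_y = [], []
```

Incomplete trailing batches are dropped here for simplicity; a production reader would also handle shuffling and partial batches.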
Follow the instructions on the MongoDB website for installation, e.g. for Linux: https://docs.mongodb.com/manual/administration/install-on-linux/
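The storage side (trained models with weights & results) boils down to writing one document per training run. A hedged sketch using a pymongo-style collection; the function name, document fields, and pickle-based serialization are illustrative assumptions, not MLPipe's actual schema:

```python
# Sketch: persist a trained model's weights and metrics as one MongoDB
# document. Field names and serialization are hypothetical.
import pickle

def save_training_result(collection, name, weights, metrics):
    """Insert one document holding pickled weights plus result metrics."""
    doc = {
        "name": name,
        "weights": pickle.dumps(weights),  # e.g. model.get_weights()
        "metrics": metrics,                # e.g. {"val_acc": 0.91}
    }
    return collection.insert_one(doc)
```

Reading a run back is then just a `find_one` on the name plus `pickle.loads` on the weights field.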
Install PyCharm (optional)
If you develop Python applications, PyCharm is most probably your go-to editor. For installation, use this link and follow the instructions: https://www.jetbrains.com/pycharm/download
After the conda environment is set up, it can be added to PyCharm:
- File -> Settings -> Project -> Project Interpreter -> Add
- Choose "Existing environment"
- Select the added environment
Now PyCharm will use this conda environment and can access all installed dependencies while you develop MLPipe-Trainer.
Export the MLPipe-Trainer root to the Python path (this can also be added to .bashrc):

```shell
# change path accordingly
>> export PYTHONPATH="/home/USER/MLPipe-Trainer"
```
To activate the conda environment, call:

```shell
>> conda activate mlpipe_env
```
For an example, check out the Cifar-10 example in the examples folder.
- Generate and host MkDocs documentation
- Add tests
- Redis cache in case data is fetched from remote source
- Better dependency handling...
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
| Filename, size | File type | Python version |
| --- | --- | --- |
| mlpipe_trainer-0.2.2-py3-none-any.whl (18.6 kB) | Wheel | py3 |
| mlpipe-trainer-0.2.2.tar.gz (14.1 kB) | Source | None |