AIscalate your Jupyter Notebook Prototypes into Airflow Data Products
- Free software: Apache Software License 2.0
- Website: http://www.aiscalate.com
- Documentation: https://aiscalator.readthedocs.io/en/latest/
- Bugs: https://github.com/aiscalate/aiscalator/issues
Aiscalator is a toolbox that helps your team streamline the process from innovation to productization with:
- Jupyter workbench
  - Explore Data, Prototype Solutions
- Docker wrapper tools
  - Share Code, Deploy Reproducible Environments
- Airflow machinery
  - Schedule Tasks, Refine Products
- Data Science and Data Engineering best practices
Test that the prerequisite software is installed:
```
docker --version
docker-compose --version
pip --version
```
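If you prefer a single check, the same prerequisites can be probed in one go with a small script. This is just a sketch using the standard `command -v` lookup; it reports what is missing and is not AIscalator-specific:

```shell
# Report whether each prerequisite is on the PATH.
missing=0
for tool in docker docker-compose pip; do
    if command -v "$tool" >/dev/null 2>&1; then
        echo "$tool: found"
    else
        echo "$tool: MISSING"
        missing=$((missing + 1))
    fi
done
echo "$missing prerequisite(s) missing"
```

If anything is reported as MISSING, install it before continuing.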
Install AIscalator tool:
```
git clone https://github.com/Aiscalate/aiscalator.git
cd aiscalator/
make install
```
Great, we are now ready to use the AIscalator!
The following setup commands are entirely optional: they only prebuild Docker images. If you skip them now, the images will be built later, whenever they are first required.
However, building a Docker image takes time: packages must be downloaded, installed, and sometimes even compiled. Kicking off all these installation steps right away, at once, leaves you free to go enjoy a nice coffee break!
You might also want to customize your environment with the AIscalator; the setup will ask you a few questions.
Build docker images to run Jupyter environments:
```
aiscalator jupyter setup
```
Build docker image to run Airflow:
```
aiscalator airflow setup <path-to-workspace-folder>
# for example,
aiscalator airflow setup $PWD
```
AIscalator commands dealing with jupyter define tasks, in Airflow jargon; in our case, each task is wrapped inside a Docker container. We also refer to them as Steps.
AIscalator commands dealing with airflow, by contrast, author, schedule, and monitor DAGs (Directed Acyclic Graphs). A DAG defines how a workflow is composed of multiple steps, their dependencies, and their execution times or triggers.
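To make the DAG idea concrete: a workflow is just a set of steps plus "runs before" edges, and a scheduler like Airflow executes them in a topological order. The step names below are made up for illustration (they are not from the example project); `tsort` from GNU coreutils prints one valid execution order:

```shell
# Each line is an edge "A B" meaning step A must run before step B.
# tsort prints the steps in a valid (topologically sorted) execution order.
order=$(printf '%s\n' \
    "extract transform" \
    "transform train_model" \
    "transform build_report" | tsort)
echo "$order"
```

Here `extract` always comes first and `transform` precedes both downstream steps; steps with no ordering between them (like `train_model` and `build_report`) are exactly the ones Airflow can run in parallel.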
Create a new Jupyter notebook to work on, define corresponding AIscalator step:
```
aiscalator jupyter new <path-to-store-new-files>
# For example,
aiscalator jupyter new project
# (CTRL + c to kill when done)
```
Or you can edit an existing AIscalator step:
```
aiscalator jupyter edit <aiscalator step>
# For example, if you cloned the git repository:
aiscalator jupyter edit resources/example/example.conf
# (CTRL + c to kill when done)
```
Run the step without GUI:
```
aiscalator jupyter run <aiscalator task>
# For example, if you cloned the git repository:
aiscalator jupyter run resources/example/example.conf
```
Start Airflow services:
```
aiscalator airflow start
```
Create a new AIscalator DAG, define the airflow job:
```
aiscalator airflow new <path-to-store-new-files>
# For example,
aiscalator airflow new project
# (CTRL + c to kill when done)
```
Or you can edit an existing AIscalator DAG:
```
aiscalator airflow edit <aiscalator DAG>
# For example, if you cloned the git repository:
aiscalator airflow edit resources/example/example.conf
# (CTRL + c to kill when done)
```
Schedule AIscalator DAG into local airflow dags folder:
```
aiscalator airflow push <aiscalator DAG>
# For example, if you cloned the git repository:
aiscalator airflow push resources/example/example.conf
```
Stop Airflow services:
```
aiscalator airflow stop
```
- First Alpha release on PyPI.
- Added docker_image.docker_extra_options list feature
- Handle errors in Jupytext conversions
- Exit code of the aiscalator run subcommand is now propagated to the CLI
- Concurrent aiscalator run commands are now possible