Lablup Backend.AI Meta-package
Backend.AI is a streamlined backend service framework hosting heterogeneous programming languages and popular AI frameworks. It manages the underlying computing resources for multi-tenant computation sessions where such sessions are spawned and executed instantly on demand.
For the names of sub-projects, we use the private code-name “Sorna”, taken from the science-fiction novel “Jurassic Park” – meaning that we do all the dirty jobs behind the scenes. In the novel, Isla Nublar is the “front-end” island where tourists see the dinosaurs, and Isla Sorna is the “back-end” island where the secret dinosaur production facility is located.
All sub-projects are licensed under LGPLv3+.
Server-side Components
Manager with API Gateway
It routes external API requests from front-end services to individual agents. It also monitors and scales the cluster of multiple agents (a few tens to hundreds).
Package namespaces: sorna.gateway, sorna.manager
Agent
It manages individual server instances and launches/destroys Docker containers where REPL daemons (kernels) run. Each agent on a new EC2 instance registers itself to the instance registry via periodic heartbeats.
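The self-registration idea can be sketched in plain Python. Note that `InstanceRegistry` and its methods below are hypothetical stand-ins for the real registry (which lives on the manager side); this is only an illustration of the heartbeat mechanism, not the actual implementation.

```python
import time

class InstanceRegistry:
    """A toy in-memory registry that tracks the last heartbeat per agent.

    The real registry is a shared service; this stand-in only illustrates
    how heartbeats double as self-registration and liveness tracking.
    """
    def __init__(self):
        self.last_seen = {}

    def heartbeat(self, agent_id):
        # An unknown agent_id is implicitly registered on its first heartbeat.
        self.last_seen[agent_id] = time.monotonic()

    def alive_agents(self, timeout=5.0):
        # Agents whose last heartbeat is older than `timeout` are considered dead.
        now = time.monotonic()
        return [a for a, t in self.last_seen.items() if now - t < timeout]

def run_heartbeats(registry, agent_id, interval, count):
    """Send `count` heartbeats spaced `interval` seconds apart (0 here for brevity)."""
    for _ in range(count):
        registry.heartbeat(agent_id)
        time.sleep(interval)

registry = InstanceRegistry()
run_heartbeats(registry, "agent-001", interval=0.0, count=3)
print(registry.alive_agents())  # -> ['agent-001']
```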
Package namespace: sorna.agent
REPL
A set of small ZMQ-based REPL daemons in various programming languages and configurations. It also includes a sandbox implemented with ptrace-based system call filtering, written in Go.
Each daemon is a separate program, usually named “run.{lang-specific-extension}”.
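The core request loop of such a daemon can be sketched in plain Python. The real daemons speak ZMQ and run inside the sandboxed containers; the `execute` helper below is an illustrative assumption that only shows the essential execute-and-capture step with a persistent namespace.

```python
import io
import contextlib

def execute(code, user_ns):
    """Run one code snippet in a persistent namespace and capture its stdout.

    A real REPL daemon would receive `code` over a ZMQ socket and send the
    captured output back to the manager; here we just return it.
    """
    buf = io.StringIO()
    with contextlib.redirect_stdout(buf):
        exec(code, user_ns)
    return buf.getvalue()

ns = {}                                # shared namespace: state persists across requests
out1 = execute("x = 6 * 7", ns)        # defines x, produces no output
out2 = execute("print(x)", ns)         # sees x from the previous request
print(repr(out2))  # -> '42\n'
```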
Sorna Common
A collection of utility modules commonly shared throughout Backend.AI projects.
Package namespace: sorna.common
Client-side Components
Client Libraries
Client libraries for accessing the Sorna API servers with ease.
- Python
pip install backend.ai-client
- Javascript (under preparation)
npm install backend.ai-client
- PHP (under preparation)
composer require lablup/backend.ai-client
Sorna Media
The front-end support libraries to handle multimedia outputs (e.g., SVG plots, animated vector graphics).
The Python package (lablup) is installed inside kernel containers.
To interpret and display media generated by the Python package, you need to load the Javascript part in the front-end.
Integrations with IDEs and Editors
Sorna Jupyter Kernel
Jupyter kernel integration of the Sorna Cloud API.
Package namespace: sorna.integration
Visual Studio Code Extension
An extension for Visual Studio Code that runs your code on the Lablup.AI cloud or your own Backend.AI servers.
Search for ‘live code runner’ in the VSCode extension marketplace.
Atom Editor plugin
An Atom editor plugin that runs your code on the Lablup.AI cloud or your own Backend.AI servers.
Search for ‘live code runner’ in the Atom package search.
Installation
The Sorna project uses the latest features of Python 3.6+ and Docker CE 17.05+.
To install the manager with API gateway, run:
pip install backend.ai[manager]
On each computing server, install the agent:
pip install backend.ai[agent]
NOTE: More details about configuration will be released soon.
Development
git flow
The sorna repositories use git flow to streamline branching during development and deployment. We use the default configuration as-is (master for release preparation, develop for main development, feature/* for feature branches, etc.).
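A typical feature cycle with git flow, under that default configuration, looks like this (the branch name is illustrative):

```shell
# One-time setup: accept the default branch names (master, develop, feature/, ...).
git flow init -d

# Start a feature: creates feature/my-feature off develop and checks it out.
git flow feature start my-feature

# ... commit your work on the feature branch ...

# Finish: merges feature/my-feature back into develop and deletes the branch.
git flow feature finish my-feature
```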
Hashes for backend.ai-1.0.1-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 2fd3058f5e41165a14098ebe6639c57ea04bab00df6f958fc7a701f18552d88f
MD5 | 3359974ff5b89e986f097be34ff55f69
BLAKE2b-256 | 3fac2c5d71ec582b8c858c9448ab13805b1df9a95cfbbb7e92c5670edb70a448