kodosumi framework to execute and orchestrate agentic services safely and at scale
Project description
kodosumi
kodosumi is the runtime environment to manage and execute agentic services at scale. The system is based on Ray - a distributed computing framework - and a combination of Litestar and FastAPI to deliver agentic services to users or other agents. Like Ray, kodosumi follows a Python-first agenda.
kodosumi is one component of a larger ecosystem together with masumi and sokosumi.
Introduction
kodosumi consists of three main building blocks. The first is a Ray cluster which executes agentic services at scale. The second is kodosumi itself, which builds on top of Ray and actively manages the lifecycle and events of service executions from start to finish or error. The third building block is your application, which runs on top of kodosumi - no matter whether you call your code an application, flow, service or script.
The following architecture shows the relationship between the three building blocks: 1) your service on top of 2) kodosumi, which operates 3) a distributed compute cluster with Ray, securely and at scale.
You build and deploy your flow by providing an endpoint (HTTP route) and an entrypoint (Python callable) to kodosumi (bottom left blue box in the diagram). kodosumi delivers features for access control and flow control and manages flow execution with the Ray head node and worker nodes. The kodosumi spooler gathers flow execution results and outputs into the event stream.
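To make the distinction concrete, here is a minimal, hypothetical sketch in plain FastAPI (one of the frameworks kodosumi builds on). The names `fast_app`, `create_hymn` and `hymn_endpoint` are placeholders and this is not the kodosumi registration API; see the example apps shipped with the repository for the real helpers.

```python
from fastapi import FastAPI

fast_app = FastAPI()

def create_hymn(topic: str) -> str:
    # entrypoint: the Python callable that carries the business logic
    # and is executed on the Ray cluster
    return f"A short hymn about {topic} ..."

@fast_app.post("/hymn")
async def hymn_endpoint(topic: str) -> dict:
    # endpoint: the HTTP route through which users or other agents
    # reach the entrypoint
    return {"result": create_hymn(topic)}
```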
For a deeper dive, read up on endpoints and how these translate into entrypoints of flows, which operationalize the business logic of agentic services, or of agents in the broader sense.
If you still need further background information, read why kodosumi.
Installation
The following quick guide:
- installs kodosumi and all prerequisites
- starts Ray and kodosumi on your localhost
- deploys an example flow which ships with kodosumi
This installation has been tested with versions ray==2.46.0 and python==3.12.6.
STEP 1 - install kodosumi.
pip install kodosumi
To install the latest development version from GitHub, clone and install from source.
git clone https://github.com/masumi-network/kodosumi.git
cd ./kodosumi
git checkout dev
pip install .
cd ..
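Either way, standard pip tooling confirms the installation:

```bash
pip show kodosumi   # prints the installed version, location and dependencies
```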
STEP 2 - create service home.
Create a directory ./home. This directory will host the agentic services. Each agentic service runs in a custom environment that matches its specific requirements.
mkdir ./home
STEP 3 - start Ray as a daemon.
Change to ./home and start Ray there, so that Ray can import from this directory.
cd ./home
ray start --head
Check `ray status` and visit the Ray dashboard at http://localhost:8265. For more information about Ray visit Ray's documentation.
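For a quick check straight from the terminal:

```bash
ray status   # node and resource overview of the local cluster
```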
STEP 4 - source the example app
We will deploy one of the kodosumi example apps. Clone the kodosumi source repository.
git clone https://github.com/masumi-network/kodosumi.git
git -C ./kodosumi checkout dev
Directory ./kodosumi/apps contains various example services. Copy an example service from ./kodosumi/apps/<name> to ./home/<name>. For this example use the Hymn Creator, which creates a short hymn about a topic of your choice using OpenAI and CrewAI.
cd ./home
cp -r ./kodosumi/apps/hymn ./
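The working directory should now look roughly as follows. config.yaml is used in the next step, app.py is implied by the import_path hymn.app:fast_app in that config, and further files may ship alongside:

```
home/
└── hymn/
    ├── app.py        # exposes hymn.app:fast_app (see STEP 5 config)
    ├── config.yaml   # Ray Serve deployment configuration
    └── ...           # further files of the example
```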
You can remove the source directory ./kodosumi or keep it to run other examples later.
STEP 5 - prepare environment
Based on the deployment configuration in ./home/hymn/config.yaml, Ray will create a dedicated Python environment for the service. In config.yaml you define the Python package requirements and environment variables.
For this example, edit ./home/hymn/config.yaml and add your OpenAI API key (OPENAI_API_KEY) at the bottom of the file.
applications:
- name: hymn
  route_prefix: /hymn
  import_path: hymn.app:fast_app
  runtime_env:
    pip:
      - crewai
      - crewai_tools
    env_vars:
      OTEL_SDK_DISABLED: "true"
      OPENAI_API_KEY: ...  # <-- add your key here
STEP 6 - deploy service
Deploy the example hymn.app in folder ./home. Use Ray's `serve deploy` to launch the service on your localhost Ray cluster. Ensure you run serve in the same directory where you started Ray (./home).
cd ./home
serve deploy ./hymn/config.yaml
This will set up a dedicated environment with the Python dependencies crewai and crewai_tools. Ray builds this environment based on the relevant sections in ./home/hymn/config.yaml.
Please be patient while the Ray Serve application installs and deploys. Follow the deployment process in the Ray dashboard at http://localhost:8265/#/serve. On my laptop the initial deployment can easily take a couple of minutes.
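If you prefer the terminal over the dashboard, the standard Ray Serve CLI reports the same state:

```bash
serve status   # application status, e.g. DEPLOYING or RUNNING
```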
STEP 7 - start kodosumi
Finally, start the kodosumi components and register the deployed Ray endpoints available at http://localhost:8001/-/routes. The port is defined in config.yaml. The path /-/routes reports the available endpoints of active Ray deployments.
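You can inspect this registry with any HTTP client, for example:

```bash
curl http://localhost:8001/-/routes
```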
Ensure you start kodosumi from the same directory where you started Ray (./home).
cd ./home
koco start --register http://localhost:8001/-/routes
This command starts the kodosumi spooler in the background and the kodosumi panel and API in the foreground.
[!NOTE] Command `koco start` is equivalent to running `koco spool` followed by `koco serve`.
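Spelled out:

```bash
koco spool   # start the spooler as a background daemon
koco serve   # start panel and API in the foreground
```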
STEP 8 - look around
Visit the kodosumi admin panel at http://localhost:3370. The default user is defined in config.py and reads name=admin and password=admin. If one or more Ray Serve applications are not yet available when kodosumi starts, refresh the list of registered flows: visit the control screen in the admin panel and click RECONNECT. Launch the Hymn Creator from the service screen and revisit the results on the timeline screen.
If you do not have the time or inclination to follow along, visit the kodosumi panel overview to view some screenshots.
Stop the kodosumi services by hitting CTRL+C in your terminal. The spooler continues to run as a background daemon; stop it with koco spool --stop. Stop Ray Serve with serve shutdown --yes and the Ray daemon with ray stop.
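For reference, the complete teardown sequence of this step (check `koco spool --help` if the flag differs in your kodosumi version):

```bash
koco spool --stop      # stop the spooler daemon
serve shutdown --yes   # remove all Ray Serve applications
ray stop               # stop the local Ray daemon
```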
Where to go from here?
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file kodosumi-0.9.2.tar.gz.
File metadata
- Download URL: kodosumi-0.9.2.tar.gz
- Upload date:
- Size: 5.0 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `a1c69e49015198fd62d29c37f84d7cc17d89ee04fb7ff1708bd2f32275070075` |
| MD5 | `fa7d14a9e877066838340502d0496b0e` |
| BLAKE2b-256 | `3d64faa224ea3c525201319e4a7e4e18d9efb8d30a462bd4b2e6b72252da0022` |
File details
Details for the file kodosumi-0.9.2-py3-none-any.whl.
File metadata
- Download URL: kodosumi-0.9.2-py3-none-any.whl
- Upload date:
- Size: 592.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.6
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | `736bd019248d34f830b0a7a93c1e00b0a7377027f47a1312cf22eb74ca80fa8d` |
| MD5 | `4a41c6ac2e68eced6b868bfbc11f8be1` |
| BLAKE2b-256 | `796244fd924fb7b069bf39e3bef375f4fc932fd4e17e476ffc6331260cdaecd5` |