
kodosumi framework to execute and orchestrate agentic services safely and at scale

Project description

kodosumi

kodosumi is the runtime environment for managing and executing agentic services at scale. The system is based on Ray - a distributed computing framework - and a combination of litestar and fastapi to deliver agentic services to users or other agents. Like Ray, kodosumi follows a Python-first agenda.

kodosumi is one component of a larger ecosystem with masumi and sokosumi.

Ecosystem

Introduction

kodosumi consists of three main building blocks. First, a Ray cluster to execute agentic services at scale. Second, kodosumi itself, which builds on top of Ray and actively manages the lifecycle and events of service executions, from start to finish or error. Third, your application: no matter whether you call your code an application, flow, service, or script, it runs on top of kodosumi.

The following architecture shows the relation between the three building blocks: 1) your service on top of 2) kodosumi, which operates 3) a distributed compute cluster with Ray, securely and at scale.

kodosumi overview

You build and deploy your Flow by providing an endpoint (HTTP route) and an entrypoint (Python callable) to kodosumi (bottom-left blue box in the diagram). kodosumi delivers features for access control and flow control, and manages flow execution with the Ray head node and worker nodes. The kodosumi spooler gathers flow execution results and outputs into the event stream.

The documentation provides a deep dive into endpoints and how they translate into entrypoints of flows, which operationalize the business logic of agentic services, or agents in the broader sense.
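To make the endpoint/entrypoint distinction concrete, here is a framework-agnostic sketch in plain Python. The names (register, hymn_flow, REGISTRY) are illustrative only and are not part of kodosumi's actual API:

```python
# Illustrative only: a registry mapping HTTP endpoints (routes)
# to entrypoints (Python callables), mirroring the concept above.
from typing import Callable, Dict

REGISTRY: Dict[str, Callable[[dict], dict]] = {}

def register(route: str, entrypoint: Callable[[dict], dict]) -> None:
    """Associate an HTTP route with a Python callable."""
    REGISTRY[route] = entrypoint

def hymn_flow(inputs: dict) -> dict:
    """A toy entrypoint carrying the 'business logic' of a flow."""
    return {"hymn": f"Ode to {inputs.get('topic', 'nothing')}"}

# Your service provides the endpoint; the runtime dispatches to the entrypoint.
register("/hymn", hymn_flow)
result = REGISTRY["/hymn"]({"topic": "Ray"})   # {"hymn": "Ode to Ray"}
```

In kodosumi the dispatching, access control, and execution on the Ray cluster are handled for you; only the route and the callable are yours to supply.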

Installation

The following quick guide

  1. installs kodosumi and all prerequisites
  2. starts Ray and kodosumi on your localhost
  3. deploys an example flow which ships with kodosumi

This installation has been tested with versions ray==2.46.0 and python==3.12.6.

STEP 1 - install kodosumi.

pip install kodosumi

To install the latest development version from GitHub, clone and install from source.

git clone https://github.com/masumi-network/kodosumi.git
cd kodosumi
pip install .
cd ..

STEP 2 - create service home.

Create a directory ./home. This directory will host the agentic services. Each agentic service runs in a custom environment that matches its specific requirements.

mkdir ./home

STEP 3 - start ray as a daemon.

Start Ray with the ./home directory as a package root so that Ray can import from the directory created in the previous step.

PYTHONPATH=./home ray start --head

Check ray status and visit the Ray dashboard at http://localhost:8265. For more information about Ray, see Ray's documentation.

STEP 4 - prepare environment

To use OpenAI or another API, you may need to create a local file .env to define API keys. This follows the twelve-factor principle of storing config in the environment.

Since the flow we are going to deploy uses OpenAI, you have to provide your API key in the file .env.

OPENAI_API_KEY=...
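A .env file is just KEY=VALUE lines; libraries such as python-dotenv load them into the process environment at startup. As a rough, minimal sketch of that mechanism (not what kodosumi itself ships, and sk-demo-123 is a placeholder, not a real key):

```python
import os

def load_env(text: str) -> None:
    """Parse KEY=VALUE lines and export them, skipping comments and blanks."""
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        os.environ[key.strip()] = value.strip()

# Placeholder key for illustration only.
load_env("# example .env\nOPENAI_API_KEY=sk-demo-123\n")
api_key = os.environ["OPENAI_API_KEY"]
```

Keeping secrets in .env (and out of version control) lets each deployment supply its own credentials without code changes.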

STEP 5 - deploy example app with ray serve

We will deploy the kodosumi example apps. Clone the kodosumi source repository.

git clone https://github.com/masumi-network/kodosumi.git

The directory ./kodosumi/apps contains various example services. Copy or link the cloned directory ./kodosumi/apps to ./home/apps.

cp -r ./kodosumi/apps ./home/

Deploy the example app apps.hymn.app from the ./home/apps folder. Use serve deploy to launch the service on your localhost Ray cluster.

serve deploy home/apps/hymn/config.yaml

Please be patient; the Ray Serve application may take a moment to set up, install, and deploy. Follow the deployment process in the Ray dashboard at http://localhost:8265/#/serve. On my laptop the initial deployment can easily take a couple of minutes.
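The config.yaml consumed by serve deploy follows Ray Serve's config file schema. The hymn app ships with its own file; the fragment below only illustrates the general shape - the application name, import path, and dependency list are placeholders, not the shipped file's actual contents:

```yaml
# Illustrative Ray Serve config - see the shipped config.yaml for real values.
proxy_location: EveryNode
http_options:
  host: 127.0.0.1
  port: 8001                 # the port later queried via /-/routes
applications:
  - name: hymn               # placeholder name
    route_prefix: /hymn
    import_path: apps.hymn.app:fast_app   # placeholder module:attribute
    runtime_env:
      pip: ["crewai"]        # per-service dependencies, installed in isolation
```

The runtime_env section is what gives each agentic service the custom environment mentioned in STEP 2.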

STEP 6 - start kodosumi

Finally, start the kodosumi components and register the deployed Ray endpoints available at http://localhost:8001/-/routes. The port is defined in the config.yaml file. The path /-/routes reports the available endpoints of active Ray deployments.

koco start --register http://localhost:8001/-/routes

This command starts the kodosumi spooler in the background and the kodosumi panel and API in the foreground.
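As a sanity check, you can inspect the route map yourself. The sketch below assumes the /-/routes endpoint returns a JSON object mapping route prefixes to application names; the sample payload is illustrative, and the live fetch requires the deployment from STEP 5 to be running:

```python
import json
from urllib.request import urlopen

def summarize_routes(payload: dict) -> list:
    """Render a {route: app} mapping as human-readable lines."""
    return [f"{route} -> {app}" for route, app in sorted(payload.items())]

# Offline example with an illustrative payload:
sample = json.loads('{"/hymn": "hymn"}')
lines = summarize_routes(sample)   # ["/hymn -> hymn"]

# Live check (uncomment with a running Serve deployment):
# with urlopen("http://localhost:8001/-/routes") as resp:
#     print(summarize_routes(json.load(resp)))
```

If the map is empty, the Serve applications are likely still deploying; retry once the dashboard shows them as RUNNING.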

[!NOTE] The command koco start launches the kodosumi spooler and the kodosumi panel API, and is equivalent to:

koco spool
koco serve

STEP 7 - look around

Visit the kodosumi admin panel at http://localhost:3370. The default user is defined in config.py and reads name=admin and password=admin. If one or more Ray Serve applications are not yet available when kodosumi starts, you need to refresh the list of registered flows. Visit the Config screen at http://localhost:3370/admin/routes in the admin panel and click Reconnect. Run the Hymn Creator and revisit the results on the timeline screen.

Stop the kodosumi services by hitting CTRL+C in your terminal. The spooler continues to run as a background daemon; you can stop the spooler with koco spool --status. Stop Ray Serve with serve shutdown --yes and the Ray daemon with ray stop.

Where to go from here?
