
Project description

Lightsei

Drop-in observability and guardrails for AI agents.

pip install lightsei
import lightsei
import openai

lightsei.init(api_key="bk_...", agent_name="my-bot")

oai = openai.OpenAI()  # auto-instrumented after init()

@lightsei.track
def reply(prompt: str) -> str:
    return oai.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content

That's it. Every call now appears at app.lightsei.com with timestamps, model, latency, and token counts. No manual instrumentation or wrapping required.
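Conceptually, a tracking decorator like @lightsei.track times each call and records its outcome as an event. A minimal stdlib-only sketch of that pattern (illustrative only, not the SDK's actual implementation):

```python
import functools
import time

def track(fn):
    """Illustrative stand-in for a tracking decorator: record the
    latency and outcome of each call, then collect the event."""
    events = []

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            result = fn(*args, **kwargs)
            status = "ok"
            return result
        except Exception:
            status = "error"
            raise
        finally:
            # Runs on both success and failure paths.
            events.append({
                "name": fn.__name__,
                "latency_s": time.perf_counter() - start,
                "status": status,
            })

    wrapper.events = events  # exposed for inspection in this sketch
    return wrapper

@track
def reply(prompt: str) -> str:
    return prompt.upper()

reply("hello")
print(reply.events[0]["name"], reply.events[0]["status"])  # reply ok
```

The real SDK would ship the event to a backend instead of keeping it in a list, but the wrap-time-report shape is the same.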

What you get

  • Observability — runs, events, costs, errors. Out of the box for OpenAI and Anthropic; one line of code per provider.
  • Guardrails — daily cost caps, output validators (schema + content rules), behavioral checks. Caught before delivery, visible in the dashboard.
  • Polaris — a project orchestrator bot you can deploy via Lightsei's PaaS. Reads your MEMORY.md + TASKS.md and proposes the next moves.
  • Notifications — Slack, Discord, Teams, Mattermost, generic webhook. Polaris's plans land in your team chat, validation failures page you, agent crashes get reported.
  • Graceful degradation, non-negotiable — if Lightsei's backend is unreachable or rejects an event, your bot keeps running. The SDK never crashes your program.
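The graceful-degradation bullet describes a common SDK pattern: every network send is wrapped so delivery failures are logged and swallowed, never propagated into the host program. A stdlib-only sketch of the idea (illustrative, not Lightsei's actual code):

```python
import logging

logger = logging.getLogger("sdk")

def safe_emit(send, event):
    """Attempt to deliver an event; on any failure, log at debug level
    and report False. The caller never sees the exception."""
    try:
        send(event)
        return True
    except Exception as exc:
        logger.debug("event dropped: %s", exc)
        return False

def broken_send(event):
    raise ConnectionError("backend unreachable")

# The host program keeps running even though delivery failed.
delivered = safe_emit(broken_send, {"type": "llm_call"})
print(delivered)  # False
```

The trade-off is silent data loss when the backend is down, which is exactly the right default for an observability layer: telemetry should never be able to take the bot with it.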

Configuration

lightsei.init(
    api_key="bk_...",            # your workspace key from app.lightsei.com
    agent_name="my-bot",         # appears in dashboard + cost rollups
    version="0.1.0",             # optional — tags events
    base_url="https://api.lightsei.com",  # default
)

Sign up for a workspace API key at app.lightsei.com/signup.
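Rather than hard-coding the key, a common pattern is to read it from the environment at startup. The variable name below is an assumption for illustration, not something the SDK is documented to honor:

```python
import os

def resolve_api_key(env="LIGHTSEI_API_KEY"):
    """Read the workspace key from an environment variable (the name
    here is hypothetical) and sanity-check the bk_ prefix."""
    key = os.environ.get(env, "")
    if not key.startswith("bk_"):
        raise RuntimeError(f"set {env} to your bk_... workspace key")
    return key

os.environ["LIGHTSEI_API_KEY"] = "bk_example"  # demo value only
print(resolve_api_key())  # bk_example
```

The resolved key would then be passed to lightsei.init(api_key=...) as shown above.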

Deploying bots on Lightsei

lightsei deploy ./my-bot --agent my-bot

Zips the directory, uploads it to Lightsei's hosted runtime, builds a venv from requirements.txt, and runs bot.py. Logs stream into the dashboard.
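The first step, zipping the bot directory, can be sketched with the stdlib; the upload, venv build, and run steps happen on Lightsei's side. Function and path names here are illustrative:

```python
import pathlib
import shutil
import tempfile

def package_bot(bot_dir: str) -> str:
    """Zip a bot directory the way a deploy command might, returning
    the path to the archive. Illustrative only; the real CLI also
    handles authentication and upload."""
    base = pathlib.Path(tempfile.mkdtemp()) / "bot"
    return shutil.make_archive(str(base), "zip", bot_dir)

# Demo: build a tiny bot directory and package it.
src = pathlib.Path(tempfile.mkdtemp()) / "my-bot"
src.mkdir()
(src / "bot.py").write_text("print('hello')\n")
(src / "requirements.txt").write_text("openai\n")
archive = package_bot(str(src))
print(archive.endswith(".zip"))  # True
```

Packaging the whole directory means requirements.txt travels with the code, which is what lets the runtime rebuild the venv server-side.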

Download files

Download the file for your platform.

Source Distribution

lightsei-0.1.2.tar.gz (36.9 kB)


Built Distribution


lightsei-0.1.2-py3-none-any.whl (34.7 kB)


File details

Details for the file lightsei-0.1.2.tar.gz.

File metadata

  • Download URL: lightsei-0.1.2.tar.gz
  • Size: 36.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.2

File hashes

Hashes for lightsei-0.1.2.tar.gz

  • SHA256: 6b50ee00de5d119b24e57269e963aee41a185becdb1ea68220bb29f555b4c3ed
  • MD5: 6ba0a41c053f8b8ef72c509b5e21725f
  • BLAKE2b-256: 82034b5921f068f30843d7f4538ce0a01d7b257d2385294f740d5fd6dbd1c981


File details

Details for the file lightsei-0.1.2-py3-none-any.whl.

File metadata

  • Download URL: lightsei-0.1.2-py3-none-any.whl
  • Size: 34.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.2

File hashes

Hashes for lightsei-0.1.2-py3-none-any.whl

  • SHA256: d97354dbc7b97439d63b964b4a2e446fb4a2735f54407a4e8b4124e085a5c4de
  • MD5: 203f8b8fdbb37d66644c27487ce2ee1d
  • BLAKE2b-256: fbbe6bc2a8b6f43e4dd871192014cf1bfde1015b0406d832e7bcb775e2a55b09

