Lasagna AI

  • 🥞 Layered agents!

    • Agents for your agents!
    • Tool-use and layering FTW 💪
    • Ever wanted a recursive agent? Now you can have one! 🤯
    • Parallel tool-calling by default.
    • Fully asyncio.
    • 100% Python type hints.
    • Functional-style 😎
    • (optional) Easy & pluggable caching! 🏦
  • 🚣 Streamable!

    • Event streams for everything.
    • Asyncio generators are awesome.
  • 🗃️ Easy database integration!

    • Don't rage when trying to store raw messages and token counts. 😡 🤬
    • Yes, you can have both streaming and easy database storage.
  • ↔️ Provider/model agnostic and interoperable!

    • Native support for OpenAI, Anthropic, NVIDIA NIM/NGC (+ more to come).
    • Message representations are canonized. 😇
    • Supports vision!
    • Easily build committees!
    • Swap providers or models mid-conversation.
    • Delegate tasks among model providers or model sizes.
    • Parallelize all the things.

Table of Contents

  • Installation
  • Used By
  • Quickstart
  • Debug Logging
  • Special Thanks
  • License
  • Joke Acronym

Installation

pip install -U lasagna-ai[openai,anthropic]
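
The bracketed extras install the optional provider SDKs. They are standard pip extras, so you can install only the provider(s) you need. For example, to pull in just the OpenAI integration:

pip install -U lasagna-ai[openai]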

Used By

Lasagna is used in production by:

AutoAuto

Quickstart

Here is the simplest possible agent (it doesn't add anything on top of the underlying model). More complex agents would add tools and/or layers of agents, but not this one! Anyway, run it in your terminal and you can chat interactively with the model. 🤩

from lasagna import (
    bind_model,
    recursive_extract_messages,
    flat_messages,
)

from lasagna.tui import (
    tui_input_loop,
)

import asyncio


@bind_model('openai', 'gpt-3.5-turbo-0125')
async def most_simple_agent(model, event_callback, prev_runs):
    # Pull the full message history out of all previous agent runs.
    messages = recursive_extract_messages(prev_runs)
    tools = []   # this agent brings no tools to the table
    # Ask the bound model to respond; events stream to the callback.
    new_messages = await model.run(event_callback, messages, tools)
    return flat_messages(new_messages)


async def main():
    system_prompt = "You are grumpy."
    await tui_input_loop(most_simple_agent, system_prompt)


if __name__ == '__main__':
    asyncio.run(main())

The code above does not use Python type hints (lame! 👎). As agents get more complex, and you end up with nested data structures and agents that call other agents, we promise that type hints will be your best friend. So, we suggest you use type hints from day 1! Below is the same example, but with type hints. Use mypy or pyright to check your code (because type hints are useless unless you have a tool that checks them).

from lasagna import (
    bind_model,
    recursive_extract_messages,
    flat_messages,
)

from lasagna.tui import (
    tui_input_loop,
)

from lasagna.types import (
    Model,
    EventCallback,
    AgentRun,
)

from typing import List, Callable

import asyncio


@bind_model('openai', 'gpt-3.5-turbo-0125')
async def most_simple_agent(
    model: Model,
    event_callback: EventCallback,
    prev_runs: List[AgentRun],
) -> AgentRun:
    messages = recursive_extract_messages(prev_runs)
    tools: List[Callable] = []
    new_messages = await model.run(event_callback, messages, tools)
    return flat_messages(new_messages)


async def main() -> None:
    system_prompt = "You are grumpy."
    await tui_input_loop(most_simple_agent, system_prompt)


if __name__ == '__main__':
    asyncio.run(main())
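
Want a taste of tool-use? Below is a minimal sketch built only from the calls shown above. It assumes Lasagna can describe a plain Python callable to the model from its signature and docstring; the roll_die tool and that convention are illustrative assumptions here, so check the docs for the real tool API.

from lasagna import bind_model, recursive_extract_messages, flat_messages
from lasagna.tui import tui_input_loop
from lasagna.types import Model, EventCallback, AgentRun

from typing import List, Callable

import asyncio
import random


def roll_die(sides: int) -> int:
    """Roll a fair die with `sides` sides and return the result."""
    return random.randint(1, sides)


@bind_model('openai', 'gpt-3.5-turbo-0125')
async def dice_agent(
    model: Model,
    event_callback: EventCallback,
    prev_runs: List[AgentRun],
) -> AgentRun:
    messages = recursive_extract_messages(prev_runs)
    tools: List[Callable] = [roll_die]   # hypothetical tool; see assumption above
    new_messages = await model.run(event_callback, messages, tools)
    return flat_messages(new_messages)


async def main() -> None:
    await tui_input_loop(dice_agent, "You love dice games.")


if __name__ == '__main__':
    asyncio.run(main())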

Debug Logging

This library logs using Python's built-in logging module. Most of its messages are logged at the INFO level, so here's a snippet you can put in your app to see those traces:

import logging

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
)

# ... now use Lasagna as you normally would, but you'll see extra log traces!
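
If that global config makes other libraries too noisy, you can raise the root level and opt Lasagna back in by name. This sketch assumes the library's loggers are named under the 'lasagna' package (the usual logging.getLogger(__name__) convention), which you may want to verify:

import logging

logging.basicConfig(
    level=logging.WARNING,   # keep third-party libraries quiet
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
)

# Assumption: Lasagna's loggers live under the 'lasagna' namespace.
logging.getLogger('lasagna').setLevel(logging.INFO)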

Special Thanks

Special thanks to those who inspired this library:

License

lasagna-ai is distributed under the terms of the MIT license.

Joke Acronym

Layered Agents with toolS And aGeNts and Ai
