
Lasagna AI

  • 🥞 Layered agents!

    • Agents for your agents!
    • Tool-use, structured output ("extraction"), and layering FTW 💪
    • Ever wanted a recursive agent? Now you can have one! 🤯
    • Parallel tool-calling by default.
    • Fully asyncio.
    • 100% Python type hints.
    • Functional-style 😎
    • (optional) Easy & pluggable caching! 🏦
  • 🚣 Streamable!

    • Event streams for everything.
    • Asyncio generators are awesome.
  • 🗃️ Easy database integration!

    • Don't rage when trying to store raw messages and token counts. 😡 🤬
    • Yes, you can have both streaming and easy database storage.
  • ↔️ Provider/model agnostic and interoperable!

    • Native support for OpenAI, Anthropic, NVIDIA NIM/NGC (+ more to come).
    • Message representations are canonized. 😇
    • Supports vision!
    • Easily build committees!
    • Swap providers or models mid-conversation.
    • Delegate tasks among model providers or model sizes.
    • Parallelize all the things.

Installation

pip install -U lasagna-ai[openai,anthropic]

If you want to run all of the ./examples, install the extra dependencies they use:

pip install -U lasagna-ai[openai,anthropic,example-deps]

Used By

Lasagna is used in production by:

AutoAuto

Quickstart

Here is the simplest possible agent (it doesn't add anything to the underlying model). More complex agents would add tools and/or use layers of agents, but not this one! Run it in your terminal and you can chat interactively with the model. 🤩

(taken from ./examples/quickstart.py)

from lasagna import (
    known_models,
    build_simple_agent,
)

from lasagna.tui import (
    tui_input_loop,
)

from typing import List, Callable

import asyncio

from dotenv import load_dotenv; load_dotenv()


MODEL_BINDER = known_models.BIND_OPENAI_gpt_4o_mini()


async def main() -> None:
    system_prompt = "You are grumpy."
    tools: List[Callable] = []
    my_agent = build_simple_agent(name = 'agent', tools = tools)
    my_bound_agent = MODEL_BINDER(my_agent)
    await tui_input_loop(my_bound_agent, system_prompt)


if __name__ == '__main__':
    asyncio.run(main())
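
The quickstart calls load_dotenv(), so you can keep your API credentials in a .env file next to the script. Assuming the OpenAI backend reads the standard OPENAI_API_KEY environment variable (the usual convention for OpenAI clients; the key below is just a placeholder), a minimal .env might look like:

# .env -- placeholder value, substitute your own key
OPENAI_API_KEY=sk-...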

Want to add your first tool? LLMs can't natively do arithmetic (beyond simple operations on small numbers), so let's give our model a tool for doing arithmetic! 😎

(full example at ./examples/quickstart_with_math_tool.py)

import sympy as sp

...

def evaluate_math_expression(expression: str) -> float:
    """
    This tool evaluates a math expression and returns the result.
    Pass math expression as a string, for example:
     - "3 * 6 + 1"
     - "cos(2 * pi / 3) + log(8)"
     - "(4.5/2) + (6.3/1.2)"
     - ... etc
    :param: expression: str: the math expression to evaluate
    """
    expr = sp.sympify(expression)
    result = float(expr.evalf())
    return result

...

    ...
    tools: List[Callable] = [
        evaluate_math_expression,
    ]
    my_agent = build_simple_agent(name = 'agent', tools = tools)
    ...

...
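
For convenience, here is a minimal self-contained sketch that stitches the quickstart together with the math tool. It only uses the pieces already shown above (build_simple_agent, the model binder, and tui_input_loop); if the full example in ./examples/quickstart_with_math_tool.py differs, prefer that file.

from lasagna import (
    known_models,
    build_simple_agent,
)

from lasagna.tui import (
    tui_input_loop,
)

from typing import List, Callable

import asyncio

import sympy as sp

from dotenv import load_dotenv; load_dotenv()


MODEL_BINDER = known_models.BIND_OPENAI_gpt_4o_mini()


def evaluate_math_expression(expression: str) -> float:
    """
    This tool evaluates a math expression and returns the result.
    Pass math expression as a string, for example:
     - "3 * 6 + 1"
     - "cos(2 * pi / 3) + log(8)"
    :param: expression: str: the math expression to evaluate
    """
    # sympy parses the expression symbolically, then we convert the
    # numeric result to a plain float for the model.
    expr = sp.sympify(expression)
    return float(expr.evalf())


async def main() -> None:
    system_prompt = "You are a helpful assistant."
    tools: List[Callable] = [
        evaluate_math_expression,
    ]
    my_agent = build_simple_agent(name = 'agent', tools = tools)
    my_bound_agent = MODEL_BINDER(my_agent)
    await tui_input_loop(my_bound_agent, system_prompt)


if __name__ == '__main__':
    asyncio.run(main())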

Simple RAG

Everyone's favorite tool: Retrieval Augmented Generation (RAG). Let's GO! 📚💨

See ./examples/demo_rag.py

Debug Logging

This library logs using Python's built-in logging module. Most traces are emitted at the INFO level, so here's a snippet you can put in your app to see them:

import logging

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
)

# ... now use Lasagna as you normally would, but you'll see extra log traces!
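
If a global INFO level is too noisy for your app, you can raise the root level and enable INFO only for Lasagna. This sketch assumes the library's loggers follow the usual convention of being named after the lasagna package (which is how Python's logging module names module-level loggers):

import logging

logging.basicConfig(level=logging.WARNING)           # keep other libraries quiet
logging.getLogger('lasagna').setLevel(logging.INFO)  # but show Lasagna's INFO traces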

Special Thanks

Special thanks to those who inspired this library:

License

lasagna-ai is distributed under the terms of the MIT license.

Joke Acronym

Layered Agents with toolS And aGeNts and Ai

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lasagna_ai-0.9.0.tar.gz (95.7 kB)

Uploaded Source

Built Distribution

lasagna_ai-0.9.0-py3-none-any.whl (39.5 kB)

Uploaded Python 3

File details

Details for the file lasagna_ai-0.9.0.tar.gz.

File metadata

  • Download URL: lasagna_ai-0.9.0.tar.gz
  • Upload date:
  • Size: 95.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.27.2

File hashes

Hashes for lasagna_ai-0.9.0.tar.gz
Algorithm Hash digest
SHA256 b70eb51f505a4ad5aa3840cf1cba43864f5d42a22f4018220b8bff6ceb568020
MD5 a195c0fc320d6ca20c005f32d7c2931c
BLAKE2b-256 ec74b120e3186a5c2442859d3cca9bc6c8aa5432fbb0c9c854da1e11f60e7002

See more details on using hashes here.
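
If you want to check a downloaded file against the published SHA256 digest above, here is a small sketch using Python's standard hashlib module (it assumes the sdist sits in your current directory):

import hashlib

# SHA256 digest published above for lasagna_ai-0.9.0.tar.gz
EXPECTED = "b70eb51f505a4ad5aa3840cf1cba43864f5d42a22f4018220b8bff6ceb568020"

with open("lasagna_ai-0.9.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == EXPECTED, f"hash mismatch: {digest}"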

File details

Details for the file lasagna_ai-0.9.0-py3-none-any.whl.

File metadata

  • Download URL: lasagna_ai-0.9.0-py3-none-any.whl
  • Upload date:
  • Size: 39.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.27.2

File hashes

Hashes for lasagna_ai-0.9.0-py3-none-any.whl
Algorithm Hash digest
SHA256 fc6bb21284e80fabb9f5fd9b45c679d82cea89250254db01430ef81bf535d872
MD5 eeb77cc3d5f2c8c7fe492f88fe693eea
BLAKE2b-256 7d780a264a9bebd29a5c66672c9d802b1cc262af14dd244c741883f043a351c0

See more details on using hashes here.
