Lasagna AI

  • 🥞 Layered agents!

    • Agents for your agents!
    • Tool-use, structured output ("extraction"), and layering FTW 💪
    • Ever wanted a recursive agent? Now you can have one! 🤯
    • Parallel tool-calling by default.
    • Fully asyncio.
    • 100% Python type hints.
    • Functional-style 😎
    • (optional) Easy & pluggable caching! 🏦
  • 🚣 Streamable!

    • Event streams for everything.
    • Asyncio generators are awesome.
  • 🗃️ Easy database integration!

    • Don't rage when trying to store raw messages and token counts. 😡 🤬
    • Yes, you can have both streaming and easy database storage.
  • ↔️ Provider/model agnostic and interoperable!

    • Native support for OpenAI, Anthropic, NVIDIA NIM/NGC (+ more to come).
    • Message representations are canonized. 😇
    • Supports vision!
    • Easily build committees!
    • Swap providers or models mid-conversation (see the sketch after this list).
    • Delegate tasks among model providers or model sizes.
    • Parallelize all the things.
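
The "swap providers" bullet is easiest to see in code: the same agent definition can be bound to different providers or models just by choosing a different binder from known_models. The sketch below uses BIND_OPENAI_gpt_4o_mini, which appears in the Quickstart further down; the Anthropic binder name is only an assumption for illustration, so check the known_models module for the exact binders your installed version exposes.

from lasagna import known_models, build_simple_agent

# One agent definition...
my_agent = build_simple_agent(name = 'agent', tools = [])

# ...bound to a provider/model by applying a binder from `known_models`.
openai_agent = known_models.BIND_OPENAI_gpt_4o_mini()(my_agent)

# Assumed binder name, for illustration only -- look in `known_models` for
# the Anthropic binders your installed version actually provides.
anthropic_agent = known_models.BIND_ANTHROPIC_claude_35_sonnet()(my_agent)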

Installation

pip install -U lasagna-ai[openai,anthropic]

If you want to run all of the ./examples, you can also install the extra dependencies used by those examples:

pip install -U lasagna-ai[openai,anthropic,example-deps]
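
Note: some shells (zsh in particular) treat the square brackets as glob characters, so if you get a "no matches found" error, quote the package spec:

pip install -U "lasagna-ai[openai,anthropic]"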

Used By

Lasagna is used in production by:

AutoAuto

Quickstart

Here is the simplest possible agent (it doesn't add anything on top of the underlying model). More complex agents would add tools and/or use layers of agents, but not this one! Run it in your terminal and you can chat interactively with the model. 🤩

(taken from ./examples/quickstart.py)

from lasagna import (
    known_models,
    build_simple_agent,
)

from lasagna.tui import (
    tui_input_loop,
)

from typing import List, Callable

import asyncio

from dotenv import load_dotenv; load_dotenv()


MODEL_BINDER = known_models.BIND_OPENAI_gpt_4o_mini()


async def main() -> None:
    system_prompt = "You are grumpy."
    tools: List[Callable] = []
    my_agent = build_simple_agent(name = 'agent', tools = tools)
    my_bound_agent = MODEL_BINDER(my_agent)
    await tui_input_loop(my_bound_agent, system_prompt)


if __name__ == '__main__':
    asyncio.run(main())
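
The quickstart binds an OpenAI model and calls load_dotenv(), so it expects an API key in your environment or in a .env file next to the script. Assuming the OpenAI backend reads the SDK's standard OPENAI_API_KEY variable, a minimal .env looks like this:

OPENAI_API_KEY=sk-...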

Want to add your first tool? LLMs can't natively do arithmetic (beyond simple cases with small numbers), so let's give our model a tool for it! 😎

(full example at ./examples/quickstart_with_math_tool.py)

import sympy as sp

...

def evaluate_math_expression(expression: str) -> float:
    """
    This tool evaluates a math expression and returns the result.
    Pass math expression as a string, for example:
     - "3 * 6 + 1"
     - "cos(2 * pi / 3) + log(8)"
     - "(4.5/2) + (6.3/1.2)"
     - ... etc
    :param: expression: str: the math expression to evaluate
    """
    expr = sp.sympify(expression)
    result = float(expr.evalf())
    return result

...

    ...
    tools: List[Callable] = [
        evaluate_math_expression,
    ]
    my_agent = build_simple_agent(name = 'agent', tools = tools)
    ...

...
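
For reference, here is that excerpt stitched back into the quickstart as one runnable script. This is a sketch assembled from the two snippets in this README; ./examples/quickstart_with_math_tool.py is the authoritative version and may differ slightly.

from lasagna import (
    known_models,
    build_simple_agent,
)

from lasagna.tui import (
    tui_input_loop,
)

from typing import List, Callable

import asyncio

import sympy as sp

from dotenv import load_dotenv; load_dotenv()


MODEL_BINDER = known_models.BIND_OPENAI_gpt_4o_mini()


def evaluate_math_expression(expression: str) -> float:
    """
    This tool evaluates a math expression and returns the result.
    Pass math expression as a string, for example:
     - "3 * 6 + 1"
     - "cos(2 * pi / 3) + log(8)"
     - "(4.5/2) + (6.3/1.2)"
     - ... etc
    :param: expression: str: the math expression to evaluate
    """
    expr = sp.sympify(expression)
    result = float(expr.evalf())
    return result


async def main() -> None:
    system_prompt = "You are grumpy."
    tools: List[Callable] = [
        evaluate_math_expression,
    ]
    my_agent = build_simple_agent(name = 'agent', tools = tools)
    my_bound_agent = MODEL_BINDER(my_agent)
    await tui_input_loop(my_bound_agent, system_prompt)


if __name__ == '__main__':
    asyncio.run(main())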

Simple RAG

Everyone's favorite tool: Retrieval-Augmented Generation (RAG). Let's GO! 📚💨

See: ./examples/demo_rag.py
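
That demo isn't reproduced here, but the core idea follows the same tool pattern as the math example above: give the agent a retrieval tool and let the model decide when to call it. Below is a minimal, illustrative sketch; the tiny in-memory corpus and the keyword scoring are made up for this README and are not how ./examples/demo_rag.py is implemented, so see that file for the real demo.

from typing import Callable, List

# A tiny in-memory "document store" -- purely illustrative.
DOCS: List[str] = [
    "Lasagna AI is a library for building layered, streamable agents.",
    "Lasagna has native support for OpenAI, Anthropic, and NVIDIA NIM/NGC.",
    "Agents can swap providers or models mid-conversation.",
]

def retrieve_documents(query: str) -> str:
    """
    This tool retrieves reference documents relevant to the query.
    :param: query: str: the search query
    """
    # Naive keyword-overlap scoring; a real app would use a vector store.
    words = set(query.lower().split())
    ranked = sorted(DOCS, key=lambda d: len(words & set(d.lower().split())), reverse=True)
    return "\n".join(ranked[:2])

# Then pass it to build_simple_agent() exactly like the math tool above:
#     tools: List[Callable] = [retrieve_documents]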

Debug Logging

This library logs using Python's built-in logging module. Most of its messages are logged at the INFO level, so here's a snippet you can put in your app to see those traces:

import logging

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
)

# ... now use Lasagna as you normally would, but you'll see extra log traces!
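
If configuring the root logger at INFO is too noisy for the rest of your app, you can keep the root level higher and opt in for Lasagna only. This assumes the library names its loggers under the lasagna package (the usual logging.getLogger(__name__) convention), so double-check against the logger names you see in the traces:

import logging

logging.basicConfig(
    level=logging.WARNING,   # keep other libraries quiet
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
)

# Assumption: Lasagna's loggers live under the 'lasagna' package name.
logging.getLogger('lasagna').setLevel(logging.INFO)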

Special Thanks

Special thanks to those who inspired this library:

License

lasagna-ai is distributed under the terms of the MIT license.

Joke Acronym

Layered Agents with toolS And aGeNts and Ai

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lasagna_ai-0.9.1.tar.gz (98.3 kB)


Built Distribution

lasagna_ai-0.9.1-py3-none-any.whl (40.9 kB)


File details

Details for the file lasagna_ai-0.9.1.tar.gz.

File metadata

  • Download URL: lasagna_ai-0.9.1.tar.gz
  • Upload date:
  • Size: 98.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.27.2

File hashes

Hashes for lasagna_ai-0.9.1.tar.gz:

  • SHA256: 326d70907d49f9431779d181e52c9f54a334f386cb1f75828f6b43a05cc7a9b9
  • MD5: 21b500d2922a800914476cd61f874a8b
  • BLAKE2b-256: abb67eb75efb062e155d6f7ce77a2dd87c58646d696f4df88d2daa5e2e3a5cee


File details

Details for the file lasagna_ai-0.9.1-py3-none-any.whl.

File metadata

  • Download URL: lasagna_ai-0.9.1-py3-none-any.whl
  • Upload date:
  • Size: 40.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: python-httpx/0.27.2

File hashes

Hashes for lasagna_ai-0.9.1-py3-none-any.whl:

  • SHA256: dbaa51f562237b3216ab6bbc35be3ecc059e4ea6437ecb0e87b00894a065253a
  • MD5: 5ddb29e1f32617137adc2ab7cfbc8dcd
  • BLAKE2b-256: 9818d907fe7a49531d60a8752412b13de55a195e98bc6f5774b2c40104ec1208

