
Lattice LLM

A "lightweight" Python library for building LLM-powered agents as executable Graphs. A core goals is to provide a good developer UX.

Key Features

  • Simple abstractions. Lattice aims to offer a small set of easy-to-use abstractions: Graphs, Nodes, and Edges.

    • Graphs orchestrate the steps in an LLM agent's workflow. A Graph executes in breadth-first fashion and has access to caller-provided Context (e.g. an AWS Bedrock client, the current user's ID, etc.) and State (e.g. persisted chat history).

    • Nodes are simply Python functions of the form (context: Context, state: State) -> State; they are where the agent's work actually happens. Nodes are intended to be "pure" functions that take the current Context and State as input and return a copy of the updated State.

    • Edges connect Nodes and provide control flow. They come in two flavors:

      • A tuple of the form (Node, Node), for edges that should always be traversed, or
      • A Python function of the form (Context, State) -> Node, for dynamic routing.
  • Control. Graphs are executed (by the caller) one layer at a time. This makes it easy to support use cases that require waiting on user input before executing the next layer of the Graph (e.g. a chatbot running on a web server).

  • Easy to test and introspect. Execution can be started from any Node in the Graph. Each time a Graph layer is executed, a GraphExecutionResult is returned, which contains the updated State. This makes it easy to assert on the expected State after any Node is executed in the Graph.

  • Convenience. Lattice provides the following quality-of-life features "out of the box":

    • Persistence. Lattice includes a StateStore Protocol (interface) for persisting graph State, along with a LocalStateStore that provides an in-memory implementation.
    • AWS Bedrock integration. Support is provided via converse and converse_with_structured_output helpers (the latter returns structured output in the form of a user-provided Pydantic model).
    • Tools. Lattice can automatically (a sketch of the conversion follows this list):
      1. Convert Python functions to the JSON schema format LLMs require for defining tools.
      2. Invoke tools (local Python functions) that an LLM requests to use in its responses.
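
To make the tool conversion concrete, here is a minimal sketch of how a Python function's signature and docstring can be mapped to a Bedrock Converse toolSpec. function_to_tool_schema is a hypothetical name used for illustration; this is not Lattice's actual implementation.

import inspect
from typing import Callable, get_type_hints

# Hypothetical illustration -- not Lattice's actual implementation.
JSON_TYPES = {str: "string", int: "integer", float: "number", bool: "boolean"}


def function_to_tool_schema(fn: Callable) -> dict:
    """Derive a Bedrock Converse toolSpec from a function's signature and docstring."""
    hints = get_type_hints(fn)
    hints.pop("return", None)  # Only parameters belong in the input schema.
    properties = {name: {"type": JSON_TYPES.get(hint, "string")} for name, hint in hints.items()}
    return {
        "toolSpec": {
            "name": fn.__name__,
            "description": inspect.getdoc(fn) or "",
            "inputSchema": {
                "json": {"type": "object", "properties": properties, "required": list(properties)}
            },
        }
    }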

Installation

poetry add lattice_llm
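
The package is published on PyPI, so it can also be installed with plain pip:

pip install lattice_llm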

Usage

from dataclasses import dataclass
from typing import Callable, Self

import boto3
from mypy_boto3_bedrock_runtime.type_defs import MessageUnionTypeDef as Message
from pydantic import BaseModel

from lattice_llm.bedrock import BedrockClient, ModelId, converse, converse_with_structured_output
from lattice_llm.bedrock.messages import text
from lattice_llm.graph import END, Graph, Node, run_chatbot_on_cli
from lattice_llm.state import LocalStateStore


@dataclass
class Context:
    """Context that a Graph can utilize as it executes. Context is not intended to be mutated"""

    user_id: str
    bedrock: BedrockClient
    tools: list[Callable]


@dataclass
class State:
    """State that a Graph can update as it executes."""

    messages: list[Message]

    @classmethod
    def merge(cls, a: Self, b: Self) -> Self:
        return cls(messages=a.messages + b.messages)


class ConversationDetails(BaseModel):
    should_continue: bool = True
    """True if the user wishes to keep conversing. False if the user has indicated a desire to end the conversation. If ambiguous, assume the user wants to continue the conversation."""


def welcome(context: Context, state: State) -> State:
    """A graph node that returns a fixed (canned) response."""
    return State.merge(state, State(messages=[text("...", role="user"), text("Hello!", role="assistant")]))


def assistant(context: Context, state: State) -> State:
    """A graph node that returns a message from Claude 3.5 Sonnet via the boto3 Bedrock client"""
    response = converse(
        client=context.bedrock,
        model_id=ModelId.CLAUDE_3_5,
        messages=state.messages,
        tools=context.tools,
        prompt="You are a helpful assistant.",
    )

    message = response["output"]["message"]
    return State.merge(state, State(messages=[message]))


def goodbye(context: Context, state: State) -> State:
    """A graph node that returns another fixed (canned) response to say goodbye to the user."""
    return State.merge(state, State(messages=[text("Goodbye!", role="assistant")]))


def continue_or_end(context: Context, state: State) -> Node[Context, State]:
    """A conditional edge, extracts structured output from Claude, in the form of a ConversationDetails Pydantic model and uses it to determine if we should loop back to the assistant node, or proceed to the goodbye node."""
    response = converse_with_structured_output(
        client=context.bedrock,
        model_id=ModelId.CLAUDE_3_5,
        messages=state.messages,
        prompt="Extract the conversation details from historical messages.",
        output_schema=ConversationDetails,
    )

    if response.should_continue:
        return assistant
    else:
        return goodbye


def get_temperature(city: str) -> int:
    """
    Returns the current temperature for a city.

    :param city: The city to pull temperature information from
    :return: The temperature in degrees fahrenheit for the specified city.
    """

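    # Stub implementation: always returns a fixed value for demonstration.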
    return 50


context = Context(bedrock=boto3.client("bedrock-runtime"), user_id="user-1", tools=[get_temperature])
graph = Graph[Context, State](
    nodes=[welcome, assistant, goodbye],
    edges=[
        (welcome, assistant),
        (assistant, continue_or_end),
        (goodbye, END),
    ],
)

store = LocalStateStore(lambda: State(messages=[]))

run_chatbot_on_cli(graph, context, store)
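
Because Nodes are pure functions and execution can start from any Node, individual Nodes are easy to unit-test in isolation. Below is a minimal sketch, assuming text() builds a standard Bedrock Converse message dict of the form {"role": ..., "content": [{"text": ...}]}:

def test_welcome() -> None:
    # welcome never touches the Bedrock client, so the context above can be reused as-is.
    state = welcome(context, State(messages=[]))
    # The node appends a canned user/assistant exchange; the final message is the greeting.
    assert state.messages[-1]["content"][0]["text"] == "Hello!"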
