
Lattice LLM

A "lightweight" Python library for building LLM-powered agents as executable Graphs. A core goal is to provide a good developer UX.

Key Features

  • Simple abstractions. Lattice aims to offer a small set of easy-to-use abstractions: Graphs, Nodes, and Edges.

    • Graphs are used to orchestrate steps in an LLM agent's workflow. A Graph executes in breadth-first fashion and has access to caller-provided Context (e.g. an AWS Bedrock client, the current user's ID, etc.) and State (e.g. persisted chat history).

    • Nodes are simply Python functions of the form (context: Context, state: State) -> State; they perform the agent's work. Nodes are intended to be "pure" functions that take the current Context and State as input and return an updated copy of the State.

    • Edges connect Nodes together and provide control flow. They come in two flavors:

      • A tuple of the form (Node, Node), for edges that should always be traversed, or
      • A Python function of the form (Context, State) -> Node, for dynamic routing.
  • Control. Graphs are executed (by the caller) one layer at a time. This makes it easy to support use-cases that require waiting on user input before executing the next layer of the Graph (e.g. a chatbot running on a web server).

  • Easy to test and introspect. Execution can be started from any Node in the Graph. Each time a Graph layer is executed, a GraphExecutionResult is returned, which contains the updated State. This makes it easy to assert on the expected State after any Node is executed in the Graph.

  • Convenience. Lattice provides the following quality-of-life features "out of the box":

    • Persistence. Lattice includes a StateStore Protocol (interface) for persisting graph State, and a LocalStateStore that provides an in-memory implementation.
    • AWS Bedrock integration. Support is provided via converse and converse_with_structured_output functions (the latter returns structured output in the form of a user-provided Pydantic model).
    • Tools. Lattice can automatically:
      1. Convert Python functions to the JSON schema format LLMs require for defining tools.
      2. Invoke tools (local Python functions) that an LLM requests to use in its responses.
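The execution model described above can be sketched in plain Python. This is an illustrative sketch of the concept only, not the Lattice API; the names (DemoState, run_layer, edges) are invented for the example:

```python
# Conceptual sketch: nodes are pure functions (context, state) -> state,
# edges map each node to its successor, and the CALLER drives execution
# one layer at a time.
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class DemoState:
    log: tuple[str, ...] = ()


def hello(ctx: dict, state: DemoState) -> DemoState:
    # Pure node: returns an updated copy, never mutates its input.
    return replace(state, log=state.log + ("hello",))


def world(ctx: dict, state: DemoState) -> DemoState:
    return replace(state, log=state.log + ("world",))


# Static edge: always traverse hello -> world. world has no successor.
edges = {hello: world}


def run_layer(node, ctx, state):
    """Execute one node ("layer") and return (next_node, new_state)."""
    new_state = node(ctx, state)
    return edges.get(node), new_state


node, state = hello, DemoState()
while node is not None:  # the caller decides when to run the next layer
    node, state = run_layer(node, {}, state)

print(state.log)  # -> ('hello', 'world')
```

Because each layer is a separate call, the caller can pause between layers, e.g. to wait for user input, before resuming.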

Installation

poetry add lattice_llm

Usage

from dataclasses import dataclass
from typing import Callable, Self

import boto3
from mypy_boto3_bedrock_runtime.type_defs import MessageUnionTypeDef as Message
from pydantic import BaseModel

from lattice_llm.bedrock import BedrockClient, ModelId, converse, converse_with_structured_output
from lattice_llm.bedrock.messages import text
from lattice_llm.graph import END, Graph, Node, run_chatbot_on_cli
from lattice_llm.state import LocalStateStore


@dataclass
class Context:
    """Context that a Graph can utilize as it executes. Context is not intended to be mutated"""

    user_id: str
    bedrock: BedrockClient
    tools: list[Callable]


@dataclass
class State:
    """State that a Graph can update as it executes."""

    messages: list[Message]

    @classmethod
    def merge(cls, a: Self, b: Self) -> Self:
        return cls(messages=a.messages + b.messages)


class ConversationDetails(BaseModel):
    should_continue: bool = True
    """True if the user wishes to keep conversing. False if the user has indicated a desire to end the conversation. If ambiguous, assume the user wants to continue the conversation."""


def welcome(context: Context, state: State) -> State:
    """A graph node that returns a fixed (canned) response."""
    return State.merge(state, State(messages=[text("...", role="user"), text("Hello!", role="assistant")]))


def assistant(context: Context, state: State) -> State:
    """A graph node that returns a message from Claude 3.5 Sonnet via the boto3 Bedrock client"""
    response = converse(
        client=context.bedrock,
        model_id=ModelId.CLAUDE_3_5,
        messages=state.messages,
        tools=context.tools,
        prompt="You are a helpful assistant.",
    )

    message = response["output"]["message"]
    return State.merge(state, State(messages=[message]))


def goodbye(context: Context, state: State) -> State:
    """A graph node that returns another fixed (canned) response to say goodbye to the user."""
    return State.merge(state, State(messages=[text("Goodbye!", role="assistant")]))


def continue_or_end(context: Context, state: State) -> Node[Context, State]:
    """A conditional edge, extracts structured output from Claude, in the form of a ConversationDetails Pydantic model and uses it to determine if we should loop back to the assistant node, or proceed to the goodbye node."""
    response = converse_with_structured_output(
        client=context.bedrock,
        model_id=ModelId.CLAUDE_3_5,
        messages=state.messages,
        prompt="Extract the conversation details from historical messages.",
        output_schema=ConversationDetails,
    )

    if response.should_continue:
        return assistant
    else:
        return goodbye


def get_temperature(city: str) -> int:
    """
    Returns the current temperature for a city.

    :param city: The city to pull temperature information from
    :return: The temperature in degrees fahrenheit for the specified city.
    """

    return 50


context = Context(bedrock=boto3.client("bedrock-runtime"), user_id="user-1", tools=[get_temperature])
graph = Graph[Context, State](
    nodes=[welcome, assistant, goodbye],
    edges=[
        (welcome, assistant),
        (assistant, continue_or_end),
        (goodbye, END),
    ],
)

store = LocalStateStore(lambda: State(messages=[]))

run_chatbot_on_cli(graph, context, store)
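The get_temperature tool above is passed to the model as a JSON schema. Lattice generates that schema automatically; as a rough, hedged illustration of the idea (not Lattice's actual implementation — the helper to_tool_schema, the type mapping, and the exact field names are assumptions modeled on Bedrock's toolSpec shape), the conversion might look like:

```python
# Illustrative sketch: derive a JSON tool schema from a Python function's
# signature and docstring. Field names are assumptions, not Lattice's API.
import inspect

# Minimal mapping from Python annotations to JSON Schema types (assumed).
PY_TO_JSON = {int: "integer", float: "number", str: "string", bool: "boolean"}


def to_tool_schema(fn) -> dict:
    sig = inspect.signature(fn)
    props = {
        name: {"type": PY_TO_JSON.get(param.annotation, "string")}
        for name, param in sig.parameters.items()
    }
    return {
        "name": fn.__name__,
        "description": (fn.__doc__ or "").strip(),
        "inputSchema": {
            "json": {
                "type": "object",
                "properties": props,
                "required": list(props),
            }
        },
    }


def get_temperature(city: str) -> int:
    """Returns the current temperature for a city."""
    return 50


schema = to_tool_schema(get_temperature)
# schema["name"] is "get_temperature"; the "city" parameter maps to
# {"type": "string"} in the generated properties.
```

When the model's response requests a tool call, Lattice can then invoke the matching local Python function with the arguments the model supplied.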


