A library for building LLM agents as Graphs
Project description
Lattice LLM
A "lightweight" Python library for building LLM-powered agents as executable Graph
s. A core goals is to provide a good developer UX.
Key Features
- Simple abstractions. Lattice aims to offer a small set of easy-to-use abstractions: Graphs, Nodes, and Edges.
  - Graphs orchestrate the steps in an LLM agent's workflow. A `Graph` executes in breadth-first fashion and has access to caller-provided `Context` (e.g. an AWS Bedrock client, the current user's id, etc.) and `State` (e.g. persisted chat history).
  - Nodes are simply Python functions of the form `(context: Context, state: State) -> State`. They're used to make the agent do stuff. Nodes are intended to be "pure" functions that take the current `Context` + `State` as input and return a copy of the updated `State`.
  - Edges connect Nodes together and provide control flow. They come in two flavors:
    - A tuple of the form `(Node, Node)`, for edges that should always be traversed.
    - A Python function of the form `(Context, State) -> Node`, for dynamic routing.
- Control. Graphs are executed (by the caller) one layer at a time. This makes it easy to support use-cases that require waiting on user input before executing the next layer of the `Graph` (e.g. a chatbot running on a web server).
- Easy to test and introspect. Execution can be started from any `Node` in the `Graph`. Each time a `Graph` layer is executed, a `GraphExecutionResult` is returned, which contains the updated `State`. This makes it easy to `assert` on the expected `State` after any `Node` in the `Graph` is executed.
- Convenience. Lattice provides the following quality-of-life features "out of the box":
  - Persistence. Lattice includes a `StateStore` `Protocol` (interface) for persisting graph `State` and a `LocalStateStore` that provides an in-memory implementation.
  - AWS Bedrock integration. Support is provided via the `converse` and `converse_with_structured_output` functions (the latter returns structured output in the form of a user-provided Pydantic model).
  - Tools. Lattice can automatically:
    - Convert Python functions to the JSON schema format LLMs require for defining tools (see the sketch after this list).
    - Invoke tools (local Python functions) that an LLM requests to use in its responses.
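To make the tool conversion concrete, the sketch below shows roughly what a simple annotated Python function looks like once described in the JSON schema format the Bedrock Converse API expects for tools. The field values are illustrative only; the exact schema Lattice emits is not documented here and may differ in detail.

```python
# Roughly the Bedrock Converse "toolSpec" that a simple annotated Python
# function maps onto. Illustrative only -- the exact output Lattice
# generates may differ in detail.
def get_temperature(city: str) -> int:
    """Returns the current temperature for a city."""
    return 50

expected_tool_spec = {
    "toolSpec": {
        "name": "get_temperature",
        "description": "Returns the current temperature for a city.",
        "inputSchema": {
            "json": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            }
        },
    }
}
```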
Installation
poetry add lattice_llm
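Lattice is published on PyPI, so it can also be installed with pip if you're not using Poetry:
pip install lattice_llm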
Usage
from dataclasses import dataclass
from typing import Callable, Self

import boto3
from mypy_boto3_bedrock_runtime.type_defs import MessageUnionTypeDef as Message
from pydantic import BaseModel

from lattice_llm.bedrock import BedrockClient, ModelId, converse, converse_with_structured_output
from lattice_llm.bedrock.messages import text
from lattice_llm.graph import END, Graph, Node, run_chatbot_on_cli
from lattice_llm.state import LocalStateStore


@dataclass
class Context:
    """Context that a Graph can utilize as it executes. Context is not intended to be mutated."""

    user_id: str
    bedrock: BedrockClient
    tools: list[Callable]


@dataclass
class State:
    """State that a Graph can update as it executes."""

    messages: list[Message]

    @classmethod
    def merge(cls, a: Self, b: Self) -> Self:
        return cls(messages=a.messages + b.messages)


class ConversationDetails(BaseModel):
    should_continue: bool = True
    """True if the user wishes to keep conversing. False if the user has indicated a desire to end the conversation. If ambiguous, assume the user wants to continue the conversation."""


def welcome(context: Context, state: State) -> State:
    """A graph node that returns a fixed (canned) response."""
    return State.merge(state, State(messages=[text("...", role="user"), text("Hello!", role="assistant")]))


def assistant(context: Context, state: State) -> State:
    """A graph node that returns a message from Claude 3.5 Sonnet via the boto3 Bedrock client."""
    response = converse(
        client=context.bedrock,
        model_id=ModelId.CLAUDE_3_5,
        messages=state.messages,
        tools=context.tools,
        prompt="You are a helpful assistant.",
    )
    message = response["output"]["message"]
    return State.merge(state, State(messages=[message]))


def goodbye(context: Context, state: State) -> State:
    """A graph node that returns another fixed (canned) response to say goodbye to the user."""
    return State.merge(state, State(messages=[text("Goodbye!", role="assistant")]))


def continue_or_end(context: Context, state: State) -> Node[Context, State]:
    """A conditional edge. Extracts structured output from Claude, in the form of a ConversationDetails
    Pydantic model, and uses it to decide whether to loop back to the assistant node or proceed to the
    goodbye node."""
    response = converse_with_structured_output(
        client=context.bedrock,
        model_id=ModelId.CLAUDE_3_5,
        messages=state.messages,
        prompt="Extract the conversation details from historical messages.",
        output_schema=ConversationDetails,
    )
    if response.should_continue:
        return assistant
    else:
        return goodbye


def get_temperature(city: str) -> int:
    """
    Returns the current temperature for a city.

    :param city: The city to pull temperature information from.
    :return: The temperature in degrees Fahrenheit for the specified city.
    """
    return 50


context = Context(bedrock=boto3.client("bedrock-runtime"), user_id="user-1", tools=[get_temperature])

graph = Graph[Context, State](
    nodes=[welcome, assistant, goodbye],
    edges=[
        (welcome, assistant),
        (assistant, continue_or_end),
        (goodbye, END),
    ],
)

store = LocalStateStore(lambda: State(messages=[]))

run_chatbot_on_cli(graph, context, store)
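Because nodes are plain functions of `Context` and `State`, they can be exercised directly in tests without running the whole `Graph` or calling Bedrock. Below is a minimal sketch of a unit test for the `welcome` node defined above; passing `None` for the Bedrock client is just a shortcut for this sketch, since `welcome` never touches the client.

```python
# Minimal sketch: unit-testing the welcome node in isolation.
# Assumes the Context, State, and welcome definitions from the example above.
def test_welcome_adds_canned_greeting() -> None:
    # welcome never calls Bedrock, so no real client is needed here.
    context = Context(user_id="test-user", bedrock=None, tools=[])  # type: ignore[arg-type]
    state = State(messages=[])

    new_state = welcome(context, state)

    # welcome appends a canned user/assistant exchange and returns a new
    # State object, leaving the input State untouched.
    assert len(new_state.messages) == 2
    assert state.messages == []
```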
Project details
Download files
Source Distribution: lattice_llm-0.1.2.tar.gz
Built Distribution: lattice_llm-0.1.2-py3-none-any.whl
File details
Details for the file lattice_llm-0.1.2.tar.gz.
File metadata
- Download URL: lattice_llm-0.1.2.tar.gz
- Upload date:
- Size: 11.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.9.13 Darwin/23.2.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 036d0a909408702c3bed77e17ca1815e2f8c0a317626128973d556eb6ba00ab7
MD5 | 345f8318802912c35297109bb15caa3d
BLAKE2b-256 | 40e4453807223a69697ebf0d6be82d83cfb16d9d5f13d7f43712cd758b15b3dc
File details
Details for the file lattice_llm-0.1.2-py3-none-any.whl.
File metadata
- Download URL: lattice_llm-0.1.2-py3-none-any.whl
- Upload date:
- Size: 13.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.5.1 CPython/3.9.13 Darwin/23.2.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | 5beb0e6403df9665acc5a03dd21318a6c2d3687a06ac94c6cfa8b20b9d244b55
MD5 | 43a3944c9d3aa4e6dcde3ce7767d1fef
BLAKE2b-256 | aa9bf966b4ac1a8e1f0430daa259eb04f7a9d2696fb4bdad26d4d7cd65aa078c