SDK for integrating purchased graphs from the lmsystems marketplace.
LMSystems SDK
The LMSystems SDK provides flexible interfaces for integrating and executing purchased graphs from the LMSystems marketplace in your Python applications. The SDK offers two main approaches:
- PurchasedGraph Class: For seamless integration with LangGraph workflows
- LmsystemsClient: For direct, low-level interaction with LMSystems graphs, offering more flexibility and control
Try it in Colab
Get started quickly with our interactive Colab notebook:
This notebook provides a hands-on introduction to the LMSystems SDK with ready-to-run examples.
Installation
Install the package using pip:
```bash
pip install lmsystems==1.0.8
```
Quick Start
Using the Client SDK
The client SDK provides direct interaction with individual LMSystems graphs (e.g. the Deep Research Agent):
```python
from lmsystems import (
    SyncLmsystemsClient,
    APIError
)
import os

def main():
    # Check for required environment variables
    api_key = os.environ.get("LMSYSTEMS_API_KEY")
    if not api_key:
        raise RuntimeError("LMSYSTEMS_API_KEY is not set")

    # Initialize client
    client = SyncLmsystemsClient(
        graph_name="groq-deep-research-agent-51",
        api_key=api_key
    )

    try:
        # Create a new thread
        thread = client.threads.create()
        print(f"Created thread with status: {client.get_thread_status(thread)}")

        # Stream a run; leaving the configurable values empty uses the
        # API keys stored in your LMSystems account.
        for chunk in client.stream_run(
            thread=thread,
            input={
                "research_topic": "what are the best agent frameworks for building apps with llms?"
            },
            config={
                "configurable": {
                    "llm": "",
                    "tavily_api_key": "",
                    "groq_api_key": ""
                }
            },
            stream_mode=["messages", "updates"]
        ):
            print(f"Received chunk: {chunk}")

        # Check final thread status
        final_status = client.get_thread_status(thread)
        print(f"Final thread status: {final_status}")

    except APIError as e:
        print(f"API Error: {e}")
    except Exception as e:
        print(f"Unexpected error: {e}")

if __name__ == "__main__":
    main()
```
Using PurchasedGraph with LangGraph
For integration with other LangGraph apps, you can plug a purchased graph in as a single node:
```python
from lmsystems.purchased_graph import PurchasedGraph
from langgraph.graph import StateGraph, START
import os
from dataclasses import dataclass

@dataclass
class ResearchState:
    research_topic: str

api_key = os.environ.get("LMSYSTEMS_API_KEY")

def main():
    # Initialize the purchased graph (which wraps RemoteGraph)
    purchased_graph = PurchasedGraph(
        graph_name="groq-deep-research-agent-51",
        api_key=api_key,
        default_state_values={
            "research_topic": ""
        },
        config={
            "configurable": {
                "llm": "llama-3.1-8b-instant",
                "tavily_api_key": "",
                "groq_api_key": ""
            }
        },
    )

    # Create a parent graph and add the purchased graph as a node
    builder = StateGraph(ResearchState)
    builder.add_node("purchased_node", purchased_graph)
    builder.add_edge(START, "purchased_node")
    graph = builder.compile()

    # Use the parent graph - invoke
    result = graph.invoke({
        "research_topic": "what are the best agent frameworks for building apps with llms?"
    })
    print("Parent graph result:", result)

    # Use the parent graph - stream, including outputs from the purchased graph
    for chunk in graph.stream({
        "research_topic": "what are the best agent frameworks for building apps with llms?"
    }, subgraphs=True):
        print("Stream chunk:", chunk)

if __name__ == "__main__":
    main()
```
Configuration
API Keys and Configuration
The SDK now automatically handles configuration through your LMSystems account. To set up:
- Create an account at LMSystems
- Navigate to your account settings
- Configure your API keys (OpenAI, Anthropic, etc.)
- Generate your LMSystems API key
Your configured API keys and settings will be automatically used when running graphs - no need to include them in your code!
Note: While configuration is handled automatically, you can still override settings programmatically if needed:
```python
# Optional: override stored config
config = {
    "configurable": {
        "model": "gpt-4",
        "openai_api_key": "your-custom-key"
    }
}

purchased_graph = PurchasedGraph(
    graph_name="github-agent-6",
    api_key=os.environ.get("LMSYSTEMS_API_KEY"),
    config=config  # Optional override
)
```
Store your LMSystems API key securely using environment variables:
```bash
export LMSYSTEMS_API_KEY="your-api-key"
```
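For example, a small pre-flight check can fail fast when the variable is missing. The `require_api_key` helper below is hypothetical (not part of the SDK), shown only as one way to read the key safely:

```python
import os

# Hypothetical helper (not part of the SDK): read the LMSystems API key
# from the environment and fail fast with a clear message if it's unset.
def require_api_key(var_name: str = "LMSYSTEMS_API_KEY") -> str:
    key = os.environ.get(var_name)
    if not key:
        raise RuntimeError(f"{var_name} is not set; export it before running.")
    return key

os.environ.setdefault("LMSYSTEMS_API_KEY", "demo-key")  # for illustration only
print(require_api_key())
```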
API Reference
LmsystemsClient Class
```python
LmsystemsClient.create(
    graph_name: str,
    api_key: str
)
```
Parameters:
- graph_name: Name of the graph to interact with
- api_key: Your LMSystems API key
Methods:
- create_thread(): Create a new thread for graph execution
- create_run(thread, input): Create a new run within a thread
- stream_run(thread, run): Stream the output of a run
- get_run(thread, run): Get the status and result of a run
- list_runs(thread): List all runs in a thread
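The non-streaming methods compose into a simple create-and-poll loop. The sketch below substitutes a local stub for the real client so it runs offline; the method names mirror the reference above, but the returned dict shapes are assumptions, not the SDK's actual payloads:

```python
import time

# Local stub standing in for the real client, for illustration only.
class StubClient:
    def __init__(self):
        self._polls = 0

    def create_thread(self):
        return {"thread_id": "thread-1"}

    def create_run(self, thread, input):
        return {"run_id": "run-1", "input": input}

    def get_run(self, thread, run):
        # Pretend the run finishes on the second poll.
        self._polls += 1
        return {"run_id": run["run_id"],
                "status": "success" if self._polls >= 2 else "pending"}

client = StubClient()
thread = client.create_thread()
run = client.create_run(thread, input={"research_topic": "agent frameworks for llm apps"})

# Poll until the run reaches a terminal state.
while True:
    state = client.get_run(thread, run)
    if state["status"] in ("success", "error"):
        break
    time.sleep(0.1)  # a real loop would back off between polls

print(f"Run {state['run_id']} finished with status: {state['status']}")
```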
PurchasedGraph Class
```python
PurchasedGraph(
    graph_name: str,
    api_key: str,
    config: Optional[RunnableConfig] = None,
    default_state_values: Optional[dict[str, Any]] = None
)
```
Parameters:
- graph_name: Name of the purchased graph
- api_key: Your LMSystems API key
- config: Optional configuration for the graph
- default_state_values: Default values for required state parameters
Methods:
- invoke(): Execute the graph synchronously
- ainvoke(): Execute the graph asynchronously
- stream(): Stream graph outputs synchronously
- astream(): Stream graph outputs asynchronously
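The async methods follow standard asyncio conventions: `ainvoke()` is awaited and `astream()` is consumed with `async for`. The sketch below uses a local stub mirroring this method surface so it runs offline; the state and chunk shapes are assumptions for illustration, not the SDK's actual payloads:

```python
import asyncio

# Local stub mirroring the PurchasedGraph method surface, for illustration only.
class StubPurchasedGraph:
    def invoke(self, state):
        return {"report": f"findings on {state['research_topic']}"}

    async def ainvoke(self, state):
        return self.invoke(state)

    async def astream(self, state):
        for update in ("searching", "summarizing", "done"):
            yield {"status": update}

async def run_graph():
    graph = StubPurchasedGraph()
    # Await a single async execution.
    result = await graph.ainvoke({"research_topic": "agent frameworks"})
    # Consume the async stream chunk by chunk.
    chunks = [chunk async for chunk in graph.astream({"research_topic": "agent frameworks"})]
    return result, chunks

result, chunks = asyncio.run(run_graph())
print(result["report"], len(chunks))
```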
Error Handling
The SDK provides specific exceptions for different error cases:
- AuthenticationError: API key or authentication issues
- GraphError: Graph execution or configuration issues
- InputError: Invalid input parameters
- APIError: Backend communication issues
Example error handling:
```python
from lmsystems.exceptions import (
    LmsystemsError,
    AuthenticationError,
    GraphError,
    InputError,
    APIError,
    GraphNotFoundError,
    GraphNotPurchasedError
)

try:
    result = graph.invoke(input_data)
except AuthenticationError as e:
    print(f"Authentication failed: {e}")
except GraphNotFoundError as e:
    print(f"Graph not found: {e}")
except GraphNotPurchasedError as e:
    print(f"Graph not purchased: {e}")
except GraphError as e:
    print(f"Graph execution failed: {e}")
except InputError as e:
    print(f"Invalid input: {e}")
except APIError as e:
    print(f"API communication error: {e}")
except LmsystemsError as e:
    print(f"General error: {e}")
```
Stream Modes
The SDK supports different streaming modes through the StreamMode enum:
```python
from lmsystems import StreamMode

# Stream run with specific modes
async for chunk in client.stream_run(
    thread=thread,
    input=input_data,
    stream_mode=[
        StreamMode.MESSAGES,  # Stream message updates
        StreamMode.VALUES,    # Stream value updates from nodes
        StreamMode.UPDATES,   # Stream general state updates
        StreamMode.CUSTOM     # Stream custom-defined updates
    ]
):
    print(chunk)
```
Available stream modes:
- StreamMode.MESSAGES: Stream message updates from the graph
- StreamMode.VALUES: Stream value updates from graph nodes
- StreamMode.UPDATES: Stream general state updates
- StreamMode.CUSTOM: Stream custom-defined updates
Support
For support, feature requests, or bug reports:
- Contact me at sean.sullivan3@yahoo.com
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file lmsystems-1.0.8.tar.gz.
File metadata
- Download URL: lmsystems-1.0.8.tar.gz
- Upload date:
- Size: 25.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.10.9
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | ce39612ae3129b33133c064dea0f5d874b11fb52aa03aefad5ba5583be33ac92 |
| MD5 | 41b81824d85a1e07c7ac8d07f30ed683 |
| BLAKE2b-256 | 35c691053470562d2381462d43c65e71ae649d926df6743f6d04f8a2b834423e |
File details
Details for the file lmsystems-1.0.8-py3-none-any.whl.
File metadata
- Download URL: lmsystems-1.0.8-py3-none-any.whl
- Upload date:
- Size: 26.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.0.1 CPython/3.10.9
File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | dc764c63b855f2a5f3c753d54dd7d0269a8a383202ee730e7cb635296b1919d6 |
| MD5 | f6c910afbed5bd3afa9edb07ac9ab3f0 |
| BLAKE2b-256 | b3e9f39620acadad5454034baf09eaa445e741ad4da2e27d188acb4f86b83564 |