Use Composio to equip LangGraph Agent Workflows with an array of tools
🦜🕸️ Using Composio With LangGraph
Integrate Composio with LangGraph Agentic workflows & enable them to interact seamlessly with external apps, enhancing their functionality and reach.
Goal
- Star a repository on GitHub using natural language commands through a LangGraph Agent.
Installation and Setup
Ensure you have the necessary packages installed and connect your GitHub account to allow your agents to utilize GitHub functionalities.
# Install Composio LangGraph package
pip install composio-langgraph
# Connect your GitHub account
composio-cli add github
# View available applications you can connect with
composio-cli show-apps
Usage Steps
1. Import Base Packages
Prepare your environment by initializing necessary imports from LangGraph & LangChain for setting up your agent.
from typing import Literal
from langchain_openai import ChatOpenAI
from langgraph.graph import MessagesState, StateGraph
from langgraph.prebuilt import ToolNode
2. Fetch GitHub LangGraph Tools via Composio
Access the GitHub tools provided by Composio for LangGraph, and initialize a `ToolNode` with the tools obtained from `ComposioToolSet`.
from composio_langgraph import Action, ComposioToolSet
# Initialize the toolset for GitHub
composio_toolset = ComposioToolSet()
tools = composio_toolset.get_actions(
    actions=[
        Action.GITHUB_ACTIVITY_STAR_REPO_FOR_AUTHENTICATED_USER,
        Action.GITHUB_USERS_GET_AUTHENTICATED,
    ]
)
tool_node = ToolNode(tools)
3. Prepare the model
Initialize the LLM and bind the tools fetched in the previous step to the model.
model = ChatOpenAI(temperature=0, streaming=True)
model_with_tools = model.bind_tools(tools)
4. Define the Graph Nodes
LangGraph expects you to define different nodes of the agentic workflow as separate functions. Here we define a node for calling the LLM model.
def call_model(state: MessagesState):
    messages = state["messages"]
    response = model_with_tools.invoke(messages)
    return {"messages": [response]}
5. Define the Graph Edges and Compile the Workflow
To establish the agent's workflow, we initialize the graph with the agent and tools nodes, specify the edges connecting them, and finally compile the workflow. Edges can be direct or conditional, depending on the workflow's requirements.
def should_continue(state: MessagesState) -> Literal["tools", "__end__"]:
    messages = state["messages"]
    last_message = messages[-1]
    if last_message.tool_calls:
        return "tools"
    return "__end__"
workflow = StateGraph(MessagesState)
# Define the two nodes we will cycle between
workflow.add_node("agent", call_model)
workflow.add_node("tools", tool_node)
workflow.add_edge("__start__", "agent")
workflow.add_conditional_edges(
    "agent",
    should_continue,
)
workflow.add_edge("tools", "agent")
app = workflow.compile()
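The routing logic in `should_continue` can be exercised in isolation, without running the full graph. The sketch below uses `MockMessage`, a hypothetical stand-in for a LangChain message that carries only the `tool_calls` field the router inspects:

```python
from dataclasses import dataclass, field
from typing import Literal

@dataclass
class MockMessage:
    """Hypothetical stand-in for an AIMessage; only the field the router reads."""
    tool_calls: list = field(default_factory=list)

def should_continue(state: dict) -> Literal["tools", "__end__"]:
    # Route to the "tools" node if the last message requested tool calls,
    # otherwise end the graph run.
    last_message = state["messages"][-1]
    if last_message.tool_calls:
        return "tools"
    return "__end__"

# A message that requested a tool call routes to the tool node...
print(should_continue({"messages": [MockMessage(tool_calls=[{"name": "star_repo"}])]}))
# ...while a plain final answer ends the run.
print(should_continue({"messages": [MockMessage()]}))
```

This is why the conditional edge needs no explicit path map: the router's return values, `"tools"` and `"__end__"`, are themselves the names of the destination nodes.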
6. Invoke & Check Response
After compiling the workflow, we invoke it with a task and stream the response.
for chunk in app.stream(
    {
        "messages": [
            (
                "human",
                # "Star the Github Repository composiohq/composio",
                "Get my information.",
            )
        ]
    },
    stream_mode="values",
):
    chunk["messages"][-1].pretty_print()