Use Composio to equip your LlamaIndex agent with an array of ready-made tools.
🦙 Using Composio With LlamaIndex
Integrate Composio with LlamaIndex agents to let them interact seamlessly with external apps and data sources, enhancing their functionality and reach.
Goal
- Star a repository on GitHub using natural language commands through a LlamaIndex agent.
Installation and Setup
Ensure you have the necessary packages installed and connect your GitHub account to allow your agents to utilize GitHub functionalities.
# Install Composio llamaindex package
pip install composio-llamaindex
# Connect your GitHub account
composio-cli add github
# View available applications you can connect with
composio-cli show-apps
Usage Steps
1. Import Base Packages
Prepare your environment by initializing the necessary imports from LlamaIndex and setting up your LLM.
from llama_index.llms.openai import OpenAI
from llama_index.core.llms import ChatMessage
from llama_index.core.agent import FunctionCallingAgentWorker
import dotenv
from llama_index.core.tools import FunctionTool
# Load environment variables from .env
dotenv.load_dotenv()
llm = OpenAI(model="gpt-4-turbo")
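The OpenAI client reads its API key from the environment, so make sure it is set before constructing the LLM. A minimal sanity check, assuming the key is stored as OPENAI_API_KEY in your .env file:
import os
# Assumption: OPENAI_API_KEY is defined in .env and loaded by dotenv above.
assert os.getenv("OPENAI_API_KEY"), "Set OPENAI_API_KEY in your .env file"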
2. Fetch GitHub Tools for LlamaIndex via Composio
Access the GitHub tools provided by Composio for LlamaIndex.
from composio_llamaindex import App, Action, ComposioToolSet
# Fetch the tool for starring a GitHub repository
composio_toolset = ComposioToolSet()
tools = composio_toolset.get_actions(
    actions=[Action.GITHUB_STAR_A_REPOSITORY_FOR_THE_AUTHENTICATED_USER]
)
print(tools)
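If you want the agent to have access to every GitHub action rather than a single one, the toolset can also be filtered by app. A sketch, assuming your composio-llamaindex version exposes get_tools with an apps argument:
# Assumption: get_tools(apps=[...]) is available in your composio-llamaindex version.
tools = composio_toolset.get_tools(apps=[App.GITHUB])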
3. Prepare the Agent
Configure the agent to perform tasks such as starring a repository on GitHub.
prefix_messages = [
    ChatMessage(
        role="system",
        content=(
            "You are now an integration agent; whatever you are asked to do, "
            "you will try to execute it utilizing your tools."
        ),
    )
]
agent = FunctionCallingAgentWorker(
    tools=tools,
    llm=llm,
    prefix_messages=prefix_messages,
    max_function_calls=10,
    allow_parallel_tool_calls=False,
    verbose=True,
).as_agent()
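LlamaIndex also provides a from_tools convenience constructor that takes a plain system_prompt string instead of prefix_messages. A roughly equivalent sketch:
# Sketch: the same agent built via the from_tools convenience constructor.
agent = FunctionCallingAgentWorker.from_tools(
    tools=tools,
    llm=llm,
    system_prompt="You are an integration agent; execute requests using your tools.",
    max_function_calls=10,
    allow_parallel_tool_calls=False,
    verbose=True,
).as_agent()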
4. Check Response
Validate the execution and response from the agent to ensure the task was completed successfully.
response = agent.chat("Hello! I would like to star a repo composiohq/composio on GitHub")
print("Response:", response)