Use Composio to get an array of tools with your LlamaIndex agent.

🦙 Using Composio With LlamaIndex

Integrate Composio with LlamaIndex agents so they can interact seamlessly with external apps and data sources, extending their functionality and reach.

Goal

  • Star a repository on GitHub using natural language commands through a LlamaIndex agent.

Installation and Setup

Ensure you have the necessary packages installed and connect your GitHub account so that your agents can use GitHub functionality.

# Install Composio llamaindex package
pip install composio-llamaindex

# Connect your GitHub account
composio-cli add github

# View available applications you can connect with
composio-cli show-apps
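
The sample script below loads credentials from a .env file via python-dotenv. A minimal example of what that file might contain (the variable names are assumptions based on the usual OpenAI and Composio defaults; adjust them to your setup):

# .env (assumed variable names)
OPENAI_API_KEY=sk-...        # read by the OpenAI LLM client
COMPOSIO_API_KEY=...         # read by the Composio toolset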

Usage Steps

1. Import Base Packages

Prepare your environment by importing the required LlamaIndex modules, loading environment variables, and initializing the LLM that will drive the agent.

import dotenv

from llama_index.core.agent import FunctionCallingAgentWorker
from llama_index.core.llms import ChatMessage
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI

# Load environment variables (API keys, etc.) from .env
dotenv.load_dotenv()

# LLM that will drive the agent
llm = OpenAI(model="gpt-4-turbo")

2. Fetch GitHub Tools for LlamaIndex via Composio

Access the GitHub tools that Composio provides for LlamaIndex.

from composio_llamaindex import App, Action, ComposioToolSet

# Fetch the "star a repository" action as a LlamaIndex-compatible tool
composio_toolset = ComposioToolSet()
tools = composio_toolset.get_actions(
    actions=[Action.GITHUB_STAR_A_REPOSITORY_FOR_THE_AUTHENTICATED_USER]
)
print(tools)
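
The snippet above pulls a single action. The import also exposes App, which can be used to fetch every action of an app at once. A minimal sketch, assuming your installed version of composio_llamaindex provides the get_tools(apps=[...]) helper:

from composio_llamaindex import App, ComposioToolSet

composio_toolset = ComposioToolSet()

# Assumption: get_tools(apps=[...]) returns all of an app's actions as LlamaIndex tools.
# In practice, filtering down to specific actions keeps the agent's tool list small and focused.
github_tools = composio_toolset.get_tools(apps=[App.GITHUB])
print(len(github_tools), "GitHub tools loaded")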

3. Prepare the Agent

Configure the agent to perform tasks such as starring a repository on GitHub.

prefix_messages = [
    ChatMessage(
        role="system",
        content=(
            "You are an integration agent. Whatever you are asked to do, "
            "try to execute it using your tools."
        ),
    )
]

agent = FunctionCallingAgentWorker(
    tools=tools,
    llm=llm,
    prefix_messages=prefix_messages,
    max_function_calls=10,
    allow_parallel_tool_calls=False,
    verbose=True,
).as_agent()

4. Check Response

Validate the execution and response from the agent to ensure the task was completed successfully.

response = agent.chat("Hello! I would like to star the repo composiohq/composio on GitHub")
print("Response:", response)
