
LlamaIndex Agent Integration: CoA

Chain-of-Abstraction Agent Pack

pip install llama-index-agent-coa

The chain-of-abstraction (CoA) agent integration implements a generalized version of the strategy described in the original CoA paper.

By prompting the LLM to write function calls in a chain-of-thought format, we can execute both simple and complex combinations of function calls needed to accomplish a task.

The LLM is prompted to write a response containing function-call placeholders. For example, a CoA plan might look like:

After buying the apples, Sally has [FUNC add(3, 2) = y1] apples.
Then, the wizard casts a spell to multiply the number of apples by 3,
resulting in [FUNC multiply(y1, 3) = y2] apples.

From there, the function calls are parsed into a dependency graph and executed in order. The placeholders in the CoA plan are then replaced with their actual results.
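To make the parse-and-execute step concrete, here is a minimal, hypothetical sketch (not the pack's internal parser) of how the [FUNC name(args) = label] placeholders could be extracted, resolved in dependency order, and substituted back into the plan text. It assumes numeric arguments and that placeholders appear in the order their outputs are needed, which holds for chain-of-thought text like the example above.

```python
import re

# Matches placeholders of the form [FUNC add(3, 2) = y1].
FUNC_RE = re.compile(r"\[FUNC (\w+)\((.*?)\) = (\w+)\]")


def execute_plan(plan: str, functions: dict) -> str:
    values: dict = {}
    for name, raw_args, label in FUNC_RE.findall(plan):
        # Resolve each argument: either a prior output (y1, y2, ...)
        # or a numeric literal.
        args = [
            values[a.strip()] if a.strip() in values else float(a.strip())
            for a in raw_args.split(",")
        ]
        values[label] = functions[name](*args)
    # Replace each placeholder with its computed result.
    return FUNC_RE.sub(lambda m: str(values[m.group(3)]), plan)


plan = (
    "After buying the apples, Sally has [FUNC add(3, 2) = y1] apples. "
    "The wizard multiplies them, giving [FUNC multiply(y1, 3) = y2] apples."
)
funcs = {"add": lambda a, b: a + b, "multiply": lambda a, b: a * b}
print(execute_plan(plan, funcs))
```

A real implementation would build an explicit dependency graph so that independent calls can run concurrently; the sequential loop above is the simplest correct ordering for a linear chain.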

As an extension to the original paper, we also run the LLM a final time to rewrite the response in a more readable and user-friendly way.

NOTE: In the original paper, the authors fine-tuned an LLM specifically for this strategy, on specific functions and datasets. Without fine-tuning, only capable LLMs (OpenAI, Anthropic, etc.) are likely to be reliable here.

Code Usage


First, set up some tools (these could be function tools, query engines, etc.):

from llama_index.core.tools import FunctionTool


def add(a: float, b: float) -> float:
    """Add two numbers together."""
    return a + b


def multiply(a: float, b: float) -> float:
    """Multiply two numbers together."""
    return a * b


add_tool = FunctionTool.from_defaults(fn=add)
multiply_tool = FunctionTool.from_defaults(fn=multiply)

Next, create the pack with the tools, and run it!

from llama_index.agent.coa import CoAAgentWorker
from llama_index.llms.openai import OpenAI

agent = CoAAgentWorker(
    llm=OpenAI(model="gpt-4.1-mini"),
    tools=[add_tool, multiply_tool],
).as_agent()

resp = agent.chat("What is 123.123*101.101 and what is its product with 12345")
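As a rough sanity check, independent of the agent, the chain that a correct CoA plan would produce for this query can be computed directly with the same multiply function: first 123.123 * 101.101, then the product of that result with 12345.

```python
def multiply(a: float, b: float) -> float:
    """Multiply two numbers together."""
    return a * b


# The two calls a correct CoA plan would chain for the query above.
y1 = multiply(123.123, 101.101)
y2 = multiply(y1, 12345)
print(y1, y2)
```

Comparing these values against the agent's final answer is a quick way to confirm the function calls were parsed and executed correctly.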
