llama-index agent coa integration
Project description
LlamaIndex Agent Integration: CoA
Chain-of-Abstraction Agent Pack
pip install llama-index-agent-coa
The chain-of-abstraction (CoA) agent integration implements a generalized version of the strategy described in the original CoA paper.
By prompting the LLM to write function calls in a chain-of-thought format, we can execute both simple and complex combinations of function calls needed to complete a task.
The LLM is prompted to write a response containing function call placeholders. For example, a CoA plan might look like:
After buying the apples, Sally has [FUNC add(3, 2) = y1] apples.
Then, the wizard casts a spell to multiply the number of apples by 3,
resulting in [FUNC multiply(y1, 3) = y2] apples.
From there, the function calls are parsed into a dependency graph and executed, and the placeholders in the CoA plan are then replaced with their actual results.
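To make the parse-and-execute step concrete, here is a minimal, illustrative sketch of how [FUNC name(args) = label] placeholders could be parsed and resolved in order. This is not the pack's internal implementation; the regex and hard-coded tool functions below are assumptions made for the example.

import re

# Minimal sketch (not the pack's internals): parse [FUNC name(args) = label]
# placeholders, execute them in order, and substitute the results back in.
FUNC_PATTERN = re.compile(r"\[FUNC\s+(\w+)\((.*?)\)\s*=\s*(\w+)\]")


def add(a: int, b: int) -> int:
    return a + b


def multiply(a: int, b: int) -> int:
    return a * b


TOOLS = {"add": add, "multiply": multiply}


def execute_plan(plan: str) -> str:
    """Resolve each placeholder, feeding earlier results (y1, y2, ...) into later calls."""
    results: dict[str, int] = {}
    for name, raw_args, label in FUNC_PATTERN.findall(plan):
        # Replace references to earlier labels with their computed values.
        args = []
        for arg in (a.strip() for a in raw_args.split(",")):
            args.append(results[arg] if arg in results else int(arg))
        results[label] = TOOLS[name](*args)
    # Swap each placeholder for its computed value.
    return FUNC_PATTERN.sub(lambda m: str(results[m.group(3)]), plan)


plan = (
    "After buying the apples, Sally has [FUNC add(3, 2) = y1] apples. "
    "Then the wizard multiplies them by 3, "
    "resulting in [FUNC multiply(y1, 3) = y2] apples."
)
print(execute_plan(plan))  # ... has 5 apples ... resulting in 15 apples.

In the pack itself, the same idea is applied as a proper dependency graph over whatever tools you register, rather than the hard-coded functions above.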
As an extension to the original paper, we also run the LLM a final time to rewrite the response in a more readable and user-friendly way.
NOTE: In the original paper, the authors fine-tuned an LLM specifically for this approach, as well as for specific functions and datasets. As such, only capable LLMs (OpenAI, Anthropic, etc.) are likely to be reliable for this without fine-tuning.
A full example notebook is also provided.
Code Usage
pip install llama-index-agent-coa
First, set up some tools (these could be function tools, query engines, etc.):
from llama_index.core.tools import QueryEngineTool, FunctionTool
def add(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b
query_engine = index.as_query_engine(...)
function_tool = FunctionTool.from_defaults(fn=add)
query_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine, name="...", description="..."
)
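The `index` used above is assumed to already exist. As a minimal sketch, assuming your documents live in a local ./data directory, a vector index could be built like this (any LlamaIndex index that exposes .as_query_engine() works):

# Illustrative only: build a simple vector index over local documents.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)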
Next, create the pack with the tools, and run it!
from llama_index.packs.agent.coa import CoAAgentPack
from llama_index.llms.openai import OpenAI
pack = CoAAgentPack(
    tools=[function_tool, query_tool], llm=OpenAI(model="gpt-4")
)
print(pack.run("What is 1245 + 4321?"))
See the example notebook for more thorough details.
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file llama_index_agent_coa-0.2.0.tar.gz.
File metadata
- Download URL: llama_index_agent_coa-0.2.0.tar.gz
- Upload date:
- Size: 8.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.10.13 Darwin/23.6.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | ac68cd7929edaf1629b9aba5103f8c921d6df6fb4833ca3b6ec32c5bf9351c53
MD5 | df83d12e4a69e96d17e6f50cc01106f9
BLAKE2b-256 | fb358d88d02d73e35b29aec8c636f762ba37853468018e37d70ba5a79c507995
File details
Details for the file llama_index_agent_coa-0.2.0-py3-none-any.whl.
File metadata
- Download URL: llama_index_agent_coa-0.2.0-py3-none-any.whl
- Upload date:
- Size: 8.3 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.7.1 CPython/3.10.13 Darwin/23.6.0
File hashes
Algorithm | Hash digest
---|---
SHA256 | b1da045cdd95bbf7747ded1c1dbbfcf4c9dbe97559753764607b7505d06afdea
MD5 | 46d96507373e82f56e1d00b8a0793c10
BLAKE2b-256 | 6086f43c0a1f81191cf976d63b42e3675d06ffd2522fdd83bf373e10ee4efa80