
llama-index packs for chain-of-abstraction


Chain-of-Abstraction Agent Pack

pip install llama-index-packs-agents-coa

The chain-of-abstraction (CoA) LlamaPack implements a generalized version of the strategy described in the original CoA paper.

By prompting the LLM to write function calls in a chain-of-thought format, we can execute both simple and complex combinations of function calls needed to solve a task.

The LLM is prompted to write a response containing function calls; for example, a CoA plan might look like this:

After buying the apples, Sally has [FUNC add(3, 2) = y1] apples.
Then, the wizard casts a spell to multiply the number of apples by 3,
resulting in [FUNC multiply(y1, 3) = y2] apples.

From there, the function calls are parsed into a dependency graph and executed, and the placeholders in the CoA response are replaced with their actual results.
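To make the parse-and-execute step concrete, here is a minimal, hypothetical sketch (not the pack's actual parser): it extracts each [FUNC name(args) = label] placeholder with a regular expression, resolves arguments that refer to earlier labels, calls the matching tool, and substitutes the result back into the text. The `execute_plan` helper and `FUNC_PATTERN` regex are illustrative names, and this simple left-to-right substitution only works because earlier placeholders always appear before the labels that depend on them; the real pack builds a proper dependency graph.

```python
import re

# Matches placeholders like: [FUNC add(3, 2) = y1]
FUNC_PATTERN = re.compile(r"\[FUNC\s+(\w+)\((.*?)\)\s*=\s*(\w+)\]")


def execute_plan(plan: str, tools: dict) -> str:
    """Hypothetical helper: execute [FUNC ...] calls in a CoA plan
    left to right, substituting each result back into the text."""
    results = {}

    def run(match):
        name, raw_args, label = match.groups()
        args = []
        for arg in raw_args.split(","):
            arg = arg.strip()
            # Labels produced by earlier calls (y1, y2, ...) resolve
            # to their stored results; anything else is a literal int.
            args.append(results[arg] if arg in results else int(arg))
        results[label] = tools[name](*args)
        return str(results[label])

    return FUNC_PATTERN.sub(run, plan)


plan = (
    "After buying the apples, Sally has [FUNC add(3, 2) = y1] apples. "
    "The wizard then triples them, giving [FUNC multiply(y1, 3) = y2] apples."
)
tools = {"add": lambda a, b: a + b, "multiply": lambda a, b: a * b}
print(execute_plan(plan, tools))
# -> After buying the apples, Sally has 5 apples. The wizard then
#    triples them, giving 15 apples.
```

A real implementation would also need to topologically sort the calls when placeholders can reference labels defined later in the text.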

As an extension to the original paper, we also run the LLM one final time to rewrite the response in a more readable, user-friendly form.

NOTE: In the original paper, the authors fine-tuned an LLM specifically for this purpose, using specific functions and datasets. Without that fine-tuning, only capable LLMs (OpenAI, Anthropic, etc.) are likely to be reliable here.

A full example notebook is also provided.

Code Usage

pip install llama-index-packs-agents-coa

First, set up some tools (these could be function tools, query engines, etc.):

from llama_index.core.tools import QueryEngineTool, FunctionTool


def add(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b


# Assumes you have already built an index (e.g., a VectorStoreIndex)
query_engine = index.as_query_engine(...)

function_tool = FunctionTool.from_defaults(fn=add)
query_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine, name="...", description="..."
)

Next, create the pack with the tools, and run it!

from llama_index.packs.agents_coa import CoAAgentPack
from llama_index.llms.openai import OpenAI

pack = CoAAgentPack(
    tools=[function_tool, query_tool], llm=OpenAI(model="gpt-4")
)

print(pack.run("What is 1245 + 4321?"))

See the example notebook for more thorough details.
