
Chain-of-Abstraction Agent Pack

pip install llama-index-packs-agents-coa

The chain-of-abstraction (CoA) LlamaPack implements a generalized version of the strategy described in the original CoA paper.

By prompting the LLM to write function calls in a chain-of-thought format, we can execute both simple and complex combinations of function calls needed to complete a task.

The LLM is prompted to write a response containing function calls, for example, a CoA plan might look like:

After buying the apples, Sally has [FUNC add(3, 2) = y1] apples.
Then, the wizard casts a spell to multiply the number of apples by 3,
resulting in [FUNC multiply(y1, 3) = y2] apples.

From there, the function calls can be parsed into a dependency graph, and executed.

Then, the values in the CoA are replaced with their actual results.
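To make the parse-and-execute step concrete, here is a minimal sketch of how such `[FUNC name(args) = label]` placeholders could be extracted, resolved in dependency order, and substituted back into the response. This is an illustrative simplification, not the pack's actual parser: it assumes integer-only arguments and that placeholders appear in dependency order, as in the example plan above.

```python
import re

# Matches placeholders like [FUNC add(3, 2) = y1]
FUNC_PATTERN = re.compile(r"\[FUNC\s+(\w+)\(([^)]*)\)\s*=\s*(\w+)\]")


def execute_coa_plan(plan: str, functions: dict) -> str:
    values = {}  # label -> computed result
    # Later calls reference labels (y1, y2, ...) produced by earlier calls,
    # so processing placeholders left to right resolves the dependencies.
    for name, raw_args, label in FUNC_PATTERN.findall(plan):
        args = []
        for arg in raw_args.split(","):
            arg = arg.strip()
            # Resolve a previously computed label, else parse an int literal.
            args.append(values[arg] if arg in values else int(arg))
        values[label] = functions[name](*args)
    # Replace each placeholder with its computed value.
    return FUNC_PATTERN.sub(lambda m: str(values[m.group(3)]), plan)


plan = (
    "After buying the apples, Sally has [FUNC add(3, 2) = y1] apples. "
    "Then, the wizard casts a spell to multiply the number of apples by 3, "
    "resulting in [FUNC multiply(y1, 3) = y2] apples."
)
funcs = {"add": lambda a, b: a + b, "multiply": lambda a, b: a * b}
print(execute_coa_plan(plan, funcs))
# -> "... Sally has 5 apples. ... resulting in 15 apples."
```

A real implementation would additionally build an explicit dependency graph so that independent calls can be executed in parallel.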

As an extension to the original paper, we also run the LLM one final time to rewrite the response in a more readable and user-friendly way.

NOTE: In the original paper, the authors fine-tuned an LLM specifically for this, as well as for specific functions and datasets. As such, only capable LLMs (OpenAI, Anthropic, etc.) are likely to be reliable at this without fine-tuning.

Code Usage

pip install llama-index-packs-agents-coa

First, set up some tools (these could be function tools, query engines, etc.):

from llama_index.core.tools import QueryEngineTool, FunctionTool


def add(a: int, b: int) -> int:
    """Add two numbers together."""
    return a + b


# Assumes you have already built an index over your data
query_engine = index.as_query_engine(...)

function_tool = FunctionTool.from_defaults(fn=add)
query_tool = QueryEngineTool.from_defaults(
    query_engine=query_engine, name="...", description="..."
)

Next, create the pack with the tools, and run it!

from llama_index.packs.agents_coa import CoAAgentPack
from llama_index.llms.openai import OpenAI

pack = CoAAgentPack(
    tools=[function_tool, query_tool], llm=OpenAI(model="gpt-4")
)

print(pack.run("What is 1245 + 4321?"))
