
microchain

Function-calling-based LLM agents. Just that, no bloat.

Installation

pip install microchain-python

Define LLM and template

from microchain import OpenAITextGenerator, HFChatTemplate, LLM

generator = OpenAITextGenerator(
    model=MODEL_NAME,
    api_key=API_KEY,
    api_base=API_BASE,
    temperature=0.7
)

template = HFChatTemplate(CHAT_TEMPLATE)
llm = LLM(generator=generator, templates=[template])

Use HFChatTemplate(template) to apply tokenizer.apply_chat_template from Hugging Face transformers.

You can also use VicunaTemplate() for a classic Vicuna-style prompt.
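To make the role of a template concrete, here is an illustration of what a Vicuna-style template does: it flattens a list of chat messages into a single prompt string that a plain text-completion model can consume. This is a sketch of the general technique, not microchain's actual implementation.

```python
# Illustration only: roughly what a Vicuna-style chat template does.
# It serializes chat messages into one prompt string and ends with
# "ASSISTANT:" to cue the model to produce the next reply.
def vicuna_format(messages):
    parts = ["A chat between a curious user and an artificial intelligence assistant."]
    for msg in messages:
        if msg["role"] == "user":
            parts.append(f"USER: {msg['content']}")
        elif msg["role"] == "assistant":
            parts.append(f"ASSISTANT: {msg['content']}")
    parts.append("ASSISTANT:")  # prompt the model for its turn
    return "\n".join(parts)

prompt = vicuna_format([{"role": "user", "content": "How much is 2+2?"}])
print(prompt)
```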

To use the ChatGPT APIs, you don't need to apply a template:

from microchain import OpenAIChatGenerator, LLM

generator = OpenAIChatGenerator(
    model="gpt-3.5-turbo",
    api_key=API_KEY,
    api_base="https://api.openai.com/v1",
    temperature=0.7
)

llm = LLM(generator=generator)

Define LLM functions

Define LLM callable functions as plain Python objects. Use type annotations to instruct the LLM to use the correct types.

from microchain import Function

class Sum(Function):
    @property
    def description(self):
        return "Use this function to compute the sum of two numbers"
    
    @property
    def example_args(self):
        return [2, 2]
    
    def __call__(self, a: float, b: float):
        return a + b

class Product(Function):
    @property
    def description(self):
        return "Use this function to compute the product of two numbers"
    
    @property
    def example_args(self):
        return [2, 2]
    
    def __call__(self, a: float, b: float):
        return a * b

print(Sum().help)
'''
Sum(a: float, b: float)
Use this function to compute the sum of two numbers.
Example: Sum(a=2, b=2)
'''

print(Product().help)
'''
Product(a: float, b: float)
Use this function to compute the product of two numbers.
Example: Product(a=2, b=2)
'''
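The help strings above can be generated from the `__call__` type annotations. The following is a minimal sketch of that technique using `inspect`; microchain's exact implementation may differ.

```python
import inspect

# Illustration only: build a signature line like "Sum(a: float, b: float)"
# from a callable's type annotations.
def build_signature(name, fn):
    params = []
    for pname, param in inspect.signature(fn).parameters.items():
        if pname == "self":
            continue
        ann = param.annotation.__name__ if param.annotation is not inspect.Parameter.empty else "Any"
        params.append(f"{pname}: {ann}")
    return f"{name}({', '.join(params)})"

def call(a: float, b: float):
    return a + b

print(build_signature("Sum", call))  # Sum(a: float, b: float)
```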

Define an LLM Agent

Register your functions with an Engine() using its register() method.

Create an Agent() using the llm and the execution engine.

Define a prompt for the LLM and include the functions' documentation using the engine.help property.

It's always a good idea to bootstrap the LLM with examples of function calls. Do this by setting agent.bootstrap = [...] with a list of function calls to run; their results are prepended to the chat history.

from microchain import Agent, Engine
from microchain.functions import Reasoning, Stop

engine = Engine()
engine.register(Reasoning())
engine.register(Stop())
engine.register(Sum())
engine.register(Product())

agent = Agent(llm=llm, engine=engine)
agent.prompt = f"""Act as a calculator. You can use the following functions:

{engine.help}

Only output valid Python function calls.

How much is (2*4 + 3)*5?
"""

agent.bootstrap = [
    'Reasoning("I need to reason step-by-step")',
]
agent.run()

Running it will output something like:

prompt:
Act as a calculator. You can use the following functions:

Reasoning(reasoning: str)
Use this function for your internal reasoning.
Example: Reasoning(reasoning=The next step to take is...)

Stop()
Use this function to stop the program.
Example: Stop()

Sum(a: float, b: float)
Use this function to compute the sum of two numbers.
Example: Sum(a=2, b=2)

Product(a: float, b: float)
Use this function to compute the product of two numbers.
Example: Product(a=2, b=2)


Only output valid Python function calls.

How much is (2*4 + 3)*5?

Running 10 iterations
>> Reasoning("I need to reason step-by-step")
The reasoning has been recorded
>> Reasoning("First, calculate the product of 2 and 4")
The reasoning has been recorded
>> Product(a=2, b=4)
8
>> Reasoning("Then, add 3 to the product of 2 and 4")
The reasoning has been recorded
>> Sum(a=8, b=3)
11
>> Reasoning("Lastly, multiply the sum by 5")
The reasoning has been recorded
>> Product(a=11, b=5)
55
>> Reasoning("So, the result of (2*4 + 3)*5 is 55")
The reasoning has been recorded
>> Stop()
The program has been stopped
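At each step the agent emits a string like `Sum(a=8, b=3)`, which the engine parses and dispatches to the matching registered function. Here is a minimal sketch of that parse-and-dispatch loop body using `ast`; it is an illustration of the technique, not microchain's actual parser.

```python
import ast

# Illustration only: safely parse a function-call string emitted by the
# LLM and dispatch it to a registered callable. ast.literal_eval keeps
# the arguments restricted to literals (no arbitrary code execution).
def execute(call_str, registry):
    tree = ast.parse(call_str, mode="eval").body
    if not isinstance(tree, ast.Call) or not isinstance(tree.func, ast.Name):
        raise ValueError(f"Not a function call: {call_str}")
    fn = registry[tree.func.id]
    args = [ast.literal_eval(a) for a in tree.args]
    kwargs = {kw.arg: ast.literal_eval(kw.value) for kw in tree.keywords}
    return fn(*args, **kwargs)

registry = {"Sum": lambda a, b: a + b, "Product": lambda a, b: a * b}
print(execute("Sum(a=8, b=3)", registry))      # 11
print(execute("Product(a=11, b=5)", registry)) # 55
```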

You can find more examples in the project repository.
