
Project description

chaincrafter

Seamless integration and composability for large language model apps.

Features

  • Composable prompts and chains
    • Use multiple models to run one chain and then use that as input for a different chain and model (see the sketch after this list)
  • Customizable prompt and response formatting
    • Add modifiers to prompts to change the style, length, and format of the response
    • Extract data from the response to use in the next prompt
    • Add custom functions to process the response
    • Add custom functions to process the input variables
  • Integration with OpenAI API (llama.cpp support in progress)
  • Async calls to models
  • Load Prompts and Chains from YAML using Catalogs
    • Makes it easier to share prompts and chains between projects
    • Build up a prompts library
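
As a rough sketch of the composability bullet above (the basic API is introduced under Usage below), the following runs one chain against one model and feeds its final reply into a second chain run against a different model. It only uses the calls shown on this page (Prompt, Chain, Chain.run, OpenAiChat); the model names are illustrative, and treating the last returned message as the model's reply and interpolating it with plain Python string formatting are assumptions, not documented behaviour.

from chaincrafter import Chain, Prompt
from chaincrafter.models import OpenAiChat

fast_model = OpenAiChat(temperature=0.3, model_name="gpt-3.5-turbo")
strong_model = OpenAiChat(temperature=0.7, model_name="gpt-4")

system_prompt = Prompt("You are a concise assistant.")

# First chain: get a short answer from the fast model
landmark_prompt = Prompt("Name one notable landmark in Paris. Answer with the name only.")
first_chain = Chain(system_prompt, (landmark_prompt, "landmark"))
first_messages = first_chain.run(fast_model)
landmark = first_messages[-1]["content"]  # assumption: the last message is the model's reply

# Second chain: use that answer as input for a different model
# (interpolated with plain Python before the Prompt is built)
detail_prompt = Prompt(f"Write two sentences about {landmark} for a travel guide.")
second_chain = Chain(system_prompt, (detail_prompt, "description"))
for message in second_chain.run(strong_model):
    print(f"{message['role']}: {message['content']}")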

Installation

pip install chaincrafter
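
To install the exact release described on this page, pin the version shown in the file listings below:

pip install chaincrafter==0.2.3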

Usage

  1. Define your prompts and the variables that they expect
    • The input variables can be of any type, and can be processed by a function
    • The prompt message is treated as an f-string
  2. Define your chain of prompts
    • The chain is a list of tuples, where each tuple contains a prompt and the output key to store the response in
    • The output key is used to access the response in the next prompt
  3. Set up the models that you want to use
  4. Run the chain using the models

from chaincrafter import Chain, Prompt
from chaincrafter.models import OpenAiChat

chat_model = OpenAiChat(temperature=0.65, model_name="gpt-3.5-turbo")
system_prompt = Prompt("You are a helpful assistant who responds to questions about the world")
hello_prompt = Prompt("Hello, what is the capital of France? Answer only with the city name.")
followup_prompt = Prompt("{city} sounds like a nice place to visit. What is the population of {city}?")
chain = Chain(
    system_prompt,
    (hello_prompt, "city"),
    (followup_prompt, "followup_response"),
)
messages = chain.run(chat_model)
for message in messages:
    print(f"{message['role']}: {message['content']}")
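
If only the final answer is needed rather than the whole transcript, the last assistant message can be pulled out directly. This assumes, as the loop above suggests, that run() returns the messages in order with OpenAI-style role and content keys:

# Keep only the assistant's replies; the last one answers the follow-up question
assistant_replies = [m["content"] for m in messages if m["role"] == "assistant"]
print(assistant_replies[-1])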

Running the examples

source venv/bin/activate
export OPENAI_API_KEY="..."
python -m examples.interesting_facts
python -m examples.interesting_facts_catalog
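
The commands above assume a virtual environment already exists in venv/ and that you are working from a checkout of the repository (the examples package is presumably shipped in the repository rather than in the published wheel). A typical first-time setup, not specific to this project, would be:

python -m venv venv
source venv/bin/activate
pip install -e .  # or: pip install chaincrafter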

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

chaincrafter-0.2.3.tar.gz (10.0 kB, source)

Built Distribution

chaincrafter-0.2.3-py3-none-any.whl (9.9 kB, Python 3)

File details

Details for the file chaincrafter-0.2.3.tar.gz.

File metadata

  • Download URL: chaincrafter-0.2.3.tar.gz
  • Upload date:
  • Size: 10.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.4

File hashes

Hashes for chaincrafter-0.2.3.tar.gz
  • SHA256: 3c3969a2f8df827f9ef86a780e29a1d04bc2c74645f3903abf0d23762a2d7854
  • MD5: 92be282ee52abce9514050ce49da1f44
  • BLAKE2b-256: 2cad88d376e98ca910ccc0d252fa20a094fbb962d6670b01e00fec45556529d7

See the pip documentation for more details on using hashes.
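
To verify a downloaded file against the SHA256 digest above, one standard approach (not specific to chaincrafter) is to compare digests locally or to let pip enforce the hash via a requirements file:

sha256sum chaincrafter-0.2.3.tar.gz
# expected: 3c3969a2f8df827f9ef86a780e29a1d04bc2c74645f3903abf0d23762a2d7854

# or, in requirements.txt, using pip's hash-checking mode:
# chaincrafter==0.2.3 --hash=sha256:3c3969a2f8df827f9ef86a780e29a1d04bc2c74645f3903abf0d23762a2d7854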

File details

Details for the file chaincrafter-0.2.3-py3-none-any.whl.

File hashes

Hashes for chaincrafter-0.2.3-py3-none-any.whl
  • SHA256: 660500d91f484eb2580e85b0f91daea55400b0805883b9cca8871861d3aed3f6
  • MD5: a87187e445bbdbdf477d17e2428331bf
  • BLAKE2b-256: 6510bc57bd1a10a2829163d9fcda390a0108e387b194500cf5d5e0971855690d

See the pip documentation for more details on using hashes.
