Chainy
Overview
Chainy is a Python package for declarative prompt chaining. It allows users to define a chain of prompts to be run by a Large Language Model (LLM). These chains are defined using a YAML configuration file and can include dependencies, which are handled automatically.
Installation
To install chainy, you can use a package manager like pip:
pip install chainy
You can find the package repository here.
Usage
Configuration
Chainy uses YAML configuration files to define chains of prompts. Here's an example of what these configuration files look like:
inputs:
  - input_1
  - input_2

prompts:
  prompt_1:
    model: my_model
    template: tmpl01.md
    substitute:
      var_1: input_1
      var_2: input_2

  prompt_2:
    model: my_model
    template: tmpl02.md
    substitute:
      res_1: prompt_1
In this example, prompt_1 and prompt_2 are the prompts to be run. Each prompt names the model to use, a template file, and a dictionary of variables to be substituted into the template. Dependencies between prompts are defined in the substitute section: prompt_2 depends on prompt_1 because its res_1 variable is filled with the output of prompt_1, so chainy runs it only after prompt_1 completes.
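For reference, the template files are plain text or Markdown files containing the placeholders named under substitute. The $-style placeholders below (as in Python's string.Template) are an assumption for illustration only; use whatever placeholder syntax chainy's templates actually expect.

tmpl01.md:

Compare the following two inputs.
First: $var_1
Second: $var_2

tmpl02.md:

Rewrite this comparison as a short summary: $res_1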
Expected Directory Structure
|- yourproject/
|--- prompts/
|----- tmpl01.md
|----- tmpl02.md
|--- chains/
|----- example-1.yml
|--- entrypoint.py
Bring Your Own Model (BYOM)
Each prompt must specify the alias of the model that will be used to generate the response. Models must be added to a chain with Chain.add_model(name, model) before the chain is started. Models are user-defined classes that adhere to the LanguageModelProtocol or ChatModelProtocol, which can be found in the llm module. Essentially, the model needs to expose the appropriate generate() method, and chainy will take it from there.
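As a rough sketch (not taken from chainy's own API), a user-defined model might look like the class below. The single-argument generate() signature is an assumption; match whatever LanguageModelProtocol or ChatModelProtocol actually require.

class MyModel:
    # Hypothetical stand-in for a real LLM client. The generate() signature
    # shown here is assumed for illustration, not copied from chainy's protocols.
    def generate(self, prompt: str) -> str:
        # Call your LLM provider here; this stub just echoes the prompt.
        return f"[stubbed response to] {prompt}"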
Running a Chain
To run a chain, call the Chain.start() method with the required input values:
from chainy.model import Chain
# STEP 1: Load your chain from configuration
chain = Chain.from_config("chains/example-1.yml")
# STEP 2: Add your model(s)
model = ... # instantiate your model here!
chain.add_model("my_model", model)
# STEP 3: Start the chain
chain.start("hey", "bud")
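In this example, "hey" and "bud" presumably bind to the chain's declared inputs (input_1 and input_2 from the configuration above) in declaration order.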
Testing
Tests are located in the tests/ directory. To run them, use your preferred test runner.
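For example, if you use pytest:

python -m pytest tests/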
Contributing
We welcome contributions! Please open an issue or submit a pull request if you have something to add.
License