bulk-chain 0.24.1
Project description
A lightweight, no-strings-attached Chain-of-Thought framework for your LLM, ensuring reliable results for bulk input requests stored in CSV / JSONL / sqlite. It allows applying a series of prompts organized into a schema (see the related section below).
Features
- ✅ No-strings: you're free from LLM dependencies and have flexible venv customization.
- ✅ Provides an iterator over an unlimited number of input contexts served in CSV / JSONL.
- ✅ Progress caching: withstands exceptions during LLM calls by using the sqlite3 engine to cache LLM answers.
- ✅ Supports schema descriptions for the Chain-of-Thought concept.
Installation
pip install bulk-chain
Chain-of-Thought Schema
To declare a Chain-of-Thought (CoT) schema, this project uses the JSON format. The name field declares the schema name, and schema is a list of CoT instructions for the Large Language Model. Each step is a dictionary with prompt and out keys, which correspond to the input prompt and the output variable name respectively. All variable names are expected to be wrapped in {}.
Below is an example of how to declare your own schema:
{
"name": "schema-name",
"schema": [
{"prompt": "Given the question '{text}', let's think step-by-step.",
"out": "steps"},
{"prompt": "For the question '{text}' the reasoning steps are '{steps}'. What would be an answer?",
"out": "answer"}
]
}
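Because a later step's prompt may reference variables produced by earlier steps, a quick sanity check can catch placeholder typos before any LLM call is made. The snippet below is a minimal sketch of such a check; the validate_schema helper is illustrative and not part of bulk-chain:

```python
import re

def validate_schema(schema: dict, input_columns: set) -> list:
    """Return a list of placeholders that no input column or earlier step defines."""
    known = set(input_columns)
    errors = []
    for step in schema["schema"]:
        for var in re.findall(r"\{(\w+)\}", step["prompt"]):
            if var not in known:
                errors.append(f"step '{step['out']}' references undefined '{var}'")
        known.add(step["out"])  # this step's output becomes available downstream
    return errors

schema = {
    "name": "schema-name",
    "schema": [
        {"prompt": "Given the question '{text}', let's think step-by-step.",
         "out": "steps"},
        {"prompt": "For the question '{text}' the reasoning steps are '{steps}'. What would be an answer?",
         "out": "answer"}
    ]
}

print(validate_schema(schema, {"text"}))  # [] means every placeholder is resolvable
```

Here {text} is assumed to be a column of the bulk input file, while {steps} is produced by the first step.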
More templates are available here.
Usage
Just three simple steps:
- Define your CoT Schema, or fetch it as shown below:
!wget https://raw.githubusercontent.com/nicolay-r/bulk-chain/refs/heads/master/ext/schema/default.json
- Fetch or write your own model, or pick one of the presets here:
!wget https://raw.githubusercontent.com/nicolay-r/bulk-chain/refs/heads/master/ext/flan_t5.py
- Launch inference (in chat mode):
!python -m bulk_chain.infer \
--schema "default.json" \
--adapter "dynamic:flan_t5.py:FlanT5" \
%% \
--device "cpu" \
--temp 0.1
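Conceptually, the inference loop substitutes each record's fields into a step's prompt, stores the model's answer under the step's out name, and makes it available to later prompts. The following is a minimal sketch of that loop with a stub model, not the actual bulk_chain internals:

```python
def run_chain(record: dict, schema: list, ask) -> dict:
    """Apply CoT steps in order; each output becomes a variable for later prompts."""
    context = dict(record)
    for step in schema:
        prompt = step["prompt"].format(**context)  # fill {var} placeholders
        context[step["out"]] = ask(prompt)         # store answer under the 'out' name
    return context

# Stub LLM that just reports the prompt length; a real adapter would call a model.
stub_ask = lambda prompt: f"<answer to {len(prompt)} chars>"

schema = [
    {"prompt": "Given the question '{text}', let's think step-by-step.", "out": "steps"},
    {"prompt": "Steps: '{steps}'. What would be an answer?", "out": "answer"},
]

result = run_chain({"text": "What is 2+2?"}, schema, stub_ask)
print(result["steps"], result["answer"])
```

The returned context carries both the original record fields and every intermediate output, which is what lets the framework cache and resume partially completed chains.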
Embed your LLM
All you have to do is implement the BaseLM class, which includes:
- __init__ -- for initialization;
- ask(prompt) -- infers your model with the given prompt.
See examples with models here.
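As a sketch, a custom adapter only needs the two methods above. The class below implements that interface with a dummy backend; it is illustrative only, and a real adapter would subclass bulk-chain's BaseLM (see the repository examples for the exact import path) and wrap an actual model:

```python
class EchoLM:
    """Minimal adapter implementing the BaseLM interface (illustrative stub)."""

    def __init__(self, name: str = "echo", temp: float = 0.1):
        # Store any model/config parameters that ask() will need.
        self.name = name
        self.temp = temp

    def ask(self, prompt: str) -> str:
        # Infer the model with the given prompt; here we just echo it back.
        return f"[{self.name}] {prompt}"

llm = EchoLM()
print(llm.ask("Hello"))
```

The --adapter "dynamic:flan_t5.py:FlanT5" argument shown earlier points the CLI at such a class by file and class name.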
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file bulk_chain-0.24.1.tar.gz.
File metadata
- Download URL: bulk_chain-0.24.1.tar.gz
- Upload date:
- Size: 12.7 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | 45cd9b8df973d2b5ae8109d7174b665b5eaf5a91732c2e2b376b25f2a852c3a7
MD5 | 677f93dc90112200f3c4eeb59940f8da
BLAKE2b-256 | c5b6c58d7ec7d4b45d5b68387f8deccc28c1881dd603e2522875fc65e07ce159
File details
Details for the file bulk_chain-0.24.1-py3-none-any.whl.
File metadata
- Download URL: bulk_chain-0.24.1-py3-none-any.whl
- Upload date:
- Size: 13.6 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.9.5
File hashes
Algorithm | Hash digest
---|---
SHA256 | b25419a392e2fbb6c5907b992c64f1bf47b3ceb121b95d9eba94ca7f292fd13f
MD5 | aea3840347cedb26fbfb7fbb44a5fc75
BLAKE2b-256 | 499763a01da83cb8cc9bb093d8de6addcc752edabbe961dc110a0e267cae2235