
Project description

bulk-chain 0.24.1


A lightweight, no-strings-attached Chain-of-Thought framework for your LLM, ensuring reliable results for bulk input requests stored in CSV / JSONL / SQLite. It applies a series of prompts organized into a schema (see the Chain-of-Thought Schema section below).

Features

  • No strings attached: no hard LLM dependencies, leaving you free to customize your virtual environment.
  • Provides an iterator over an unbounded number of input contexts served in CSV/JSONL.
  • Progress caching: withstands exceptions during LLM calls by caching LLM answers with the sqlite3 engine.
  • Supports schema descriptions for the Chain-of-Thought concept.

Installation

pip install bulk-chain

Chain-of-Thought Schema

To declare a Chain-of-Thought (CoT) schema, this project uses the JSON format. The name field declares the schema name, and schema is a list of CoT instructions for the Large Language Model.

Each step is a dictionary with prompt and out keys, corresponding to the input prompt and the output variable name respectively. All variable names are expected to be wrapped in {}.

Below is an example of how to declare your own schema:

{
  "name": "schema-name",
  "schema": [
    {"prompt": "Given the question '{text}', let's think step-by-step.",
     "out": "steps"},
    {"prompt": "For the question '{text}' the reasoning steps are '{steps}'. What would be an answer?",
     "out": "answer"}
  ]
}

Other templates are available here.
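To illustrate how such a schema is executed, here is a minimal, self-contained sketch: each step's prompt is filled with the variables collected so far, and the model's reply is stored under the step's out name. The function names and the toy ask stub are illustrative assumptions, not bulk-chain's actual API.

```python
# Sketch of CoT schema execution (illustrative, not bulk-chain's API):
# each step's prompt is formatted with variables gathered so far,
# and the LLM reply is stored under the step's "out" key.

schema = {
    "name": "schema-name",
    "schema": [
        {"prompt": "Given the question '{text}', let's think step-by-step.",
         "out": "steps"},
        {"prompt": "For the question '{text}' the reasoning steps are '{steps}'. "
                   "What would be an answer?",
         "out": "answer"},
    ],
}

def run_chain(record, ask):
    """Fill each prompt from the record, query the LLM, store the answer."""
    for step in schema["schema"]:
        prompt = step["prompt"].format(**record)
        record[step["out"]] = ask(prompt)
    return record

# Toy LLM stub so the sketch runs end-to-end without a real model.
result = run_chain({"text": "2 + 2 = ?"}, ask=lambda p: f"reply to: {p[:30]}")
```

After the first step runs, the record contains a steps variable, which is why the second prompt can reference both {text} and {steps}.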

Usage

Just three simple steps:

  1. Define your CoT schema, or fetch one as shown below:
!wget https://raw.githubusercontent.com/nicolay-r/bulk-chain/refs/heads/master/ext/schema/default.json
  2. Write your own model adapter or pick a preset here:
!wget https://raw.githubusercontent.com/nicolay-r/bulk-chain/refs/heads/master/ext/flan_t5.py
  3. Launch inference in chat mode:
!python -m bulk_chain.infer \
    --schema "default.json" \
    --adapter "dynamic:flan_t5.py:FlanT5" \
    %% \
    --device "cpu" \
    --temp 0.1

Embed your LLM

All you have to do is implement the BaseLM class, which includes:

  • __init__ -- for initialization;
  • ask(prompt) -- infer your model with the given prompt.

See examples with models here.
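As a rough sketch of that interface, the standalone class below mirrors the two required methods. Since the BaseLM import path is not shown here, this class does not subclass it; the EchoLM name and its internals are illustrative, and a real adapter would run actual model inference in ask (see the flan_t5.py preset fetched earlier).

```python
# Illustrative adapter mirroring the BaseLM interface described above:
# __init__ for initialization, ask(prompt) for inference.

class EchoLM:
    def __init__(self, prefix="echo"):
        # A real adapter would load the model/tokenizer here.
        self.prefix = prefix

    def ask(self, prompt):
        # A real adapter would run model inference here.
        return f"{self.prefix}: {prompt}"

llm = EchoLM()
print(llm.ask("Hello"))  # prints "echo: Hello"
```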

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

bulk_chain-0.24.1.tar.gz (12.7 kB)

Uploaded Source

Built Distribution

bulk_chain-0.24.1-py3-none-any.whl (13.6 kB)

Uploaded Python 3

File details

Details for the file bulk_chain-0.24.1.tar.gz.

File metadata

  • Download URL: bulk_chain-0.24.1.tar.gz
  • Upload date:
  • Size: 12.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.5

File hashes

Hashes for bulk_chain-0.24.1.tar.gz

  • SHA256: 45cd9b8df973d2b5ae8109d7174b665b5eaf5a91732c2e2b376b25f2a852c3a7
  • MD5: 677f93dc90112200f3c4eeb59940f8da
  • BLAKE2b-256: c5b6c58d7ec7d4b45d5b68387f8deccc28c1881dd603e2522875fc65e07ce159

See more details on using hashes here.

File details

Details for the file bulk_chain-0.24.1-py3-none-any.whl.

File metadata

  • Download URL: bulk_chain-0.24.1-py3-none-any.whl
  • Upload date:
  • Size: 13.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.5

File hashes

Hashes for bulk_chain-0.24.1-py3-none-any.whl

  • SHA256: b25419a392e2fbb6c5907b992c64f1bf47b3ceb121b95d9eba94ca7f292fd13f
  • MD5: aea3840347cedb26fbfb7fbb44a5fc75
  • BLAKE2b-256: 499763a01da83cb8cc9bb093d8de6addcc752edabbe961dc110a0e267cae2235

See more details on using hashes here.
