Project description

bulk-chain

A lightweight, no-strings-attached Chain-of-Thought framework for your LLM, ensuring reliable results for bulk input requests stored in CSV / JSONL / sqlite. It allows applying a series of prompts organized into a schema (see the related section below).

Features

  • No-strings: free from hard LLM dependencies, so you keep full control over your venv customization.
  • Provides an iterator over an unbounded number of input contexts served in CSV / JSONL.
  • Progress caching: withstands exceptions during LLM calls by caching LLM answers with the sqlite3 engine (see the inspection sketch below).
  • Supports schema descriptions for the Chain-of-Thought concept.
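Because answers are cached in a plain sqlite3 database, an interrupted run can be inspected with the Python standard library alone. A minimal sketch, assuming a cache file named contents.sqlite3 (the file name is hypothetical; the actual name depends on your run):

import sqlite3

# Open the cache produced by a (possibly interrupted) bulk-chain run.
con = sqlite3.connect("contents.sqlite3")  # hypothetical file name

# List the tables that hold the cached LLM answers.
tables = con.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()
print(tables)

con.close()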

Installation

pip install git+https://github.com/nicolay-r/bulk-chain@master

Chain-of-Thought Schema

To declare a Chain-of-Thought (CoT) schema, this project uses the JSON format. The name field declares the schema name, while schema is a list of CoT instructions for the Large Language Model.

Each step represents a dictionary with prompt and out keys that correspond to the input prompt and the output variable name, respectively. All variable names are expected to be wrapped in {}.

Below is an example of how to declare your own schema:

{
  "name": "schema-name",
  "schema": [
    {"prompt": "Given the question '{text}', let's think step-by-step.",
     "out": "steps"},
    {"prompt": "For the question '{text}' the reasoning steps are '{steps}'. What would be an answer?",
     "out": "answer"}
  ]
}

More schema templates are available here.
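
The {text} variable in the schema above is filled from the input data. As a minimal illustration (the text column name is chosen here to match the schema; the file itself is hypothetical), a JSONL input could look like:

{"text": "What is the capital of France?"}
{"text": "Why is the sky blue?"}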

Usage

Just three simple steps:

  1. Define your CoT schema, or fetch one as shown below:

!wget https://raw.githubusercontent.com/nicolay-r/bulk-chain/refs/heads/master/ext/schema/default.json

  2. Fetch or write your own model wrapper, or pick one of the presets here:

!wget https://raw.githubusercontent.com/nicolay-r/bulk-chain/refs/heads/master/ext/flan_t5.py

  3. Launch inference (in chat mode); arguments after the %% separator are passed through to the model adapter:

!python -m bulk_chain.infer \
    --schema "default.json" \
    --adapter "dynamic:flan_t5.py:FlanT5" \
    %% \
    --device "cpu" \
    --temp 0.1

Embed your LLM

All you have to do is implement the BaseLM class, which includes:

  • __init__ -- for initialization;
  • ask(prompt) -- infer your model with the given prompt.

See examples with models here.
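
For orientation, below is a minimal sketch of such a wrapper. The import path and constructor signature are assumptions based on the preset adapters, and the body of ask is a placeholder for a real inference call (HTTP API, transformers pipeline, etc.):

from bulk_chain.core.llm_base import BaseLM  # import path is an assumption


class EchoLM(BaseLM):
    """Toy adapter that echoes prompts back instead of querying a model."""

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # Initialize your tokenizer / client / model weights here.

    def ask(self, prompt):
        # Replace with a real inference call for the given prompt.
        return f"[echo] {prompt}"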

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

bulk_chain-0.24.0.tar.gz (12.5 kB)

Uploaded Source

Built Distribution

bulk_chain-0.24.0-py3-none-any.whl (13.5 kB)

Uploaded Python 3

File details

Details for the file bulk_chain-0.24.0.tar.gz.

File metadata

  • Download URL: bulk_chain-0.24.0.tar.gz
  • Upload date:
  • Size: 12.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.5

File hashes

Hashes for bulk_chain-0.24.0.tar.gz

  • SHA256: 70a7a533987f16f44a66f4a46e76cd74b93545afd758d2be350d0f659b1a8bd1
  • MD5: 4dceeb2ac2cf15241d3bbf1ace365b80
  • BLAKE2b-256: 59cdba6d768f7175df3f1a63b035f9417cbd6b48c75719878133f65f697d9f2d

See more details on using hashes here.
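
For instance, a downloaded archive can be checked against the SHA256 digest above with a few lines of Python:

import hashlib

# Compute the SHA256 digest of the downloaded source distribution.
with open("bulk_chain-0.24.0.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Compare against the digest published above.
print(digest == "70a7a533987f16f44a66f4a46e76cd74b93545afd758d2be350d0f659b1a8bd1")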

File details

Details for the file bulk_chain-0.24.0-py3-none-any.whl.

File metadata

  • Download URL: bulk_chain-0.24.0-py3-none-any.whl
  • Upload date:
  • Size: 13.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.5

File hashes

Hashes for bulk_chain-0.24.0-py3-none-any.whl

  • SHA256: 7eeb3abe2ee92c831d29e4850146ea538e09244494be137a0c4355ece0f7ab65
  • MD5: 67f670e8349807184ef16cf03c0617b3
  • BLAKE2b-256: f372d489f5a2a00b5a805393c5693e021a85f47a81ea0c5ab96b177ef5d8494d

See more details on using hashes here.
