Minimalistic man-made burgers of data and words for interfacing with LLMs and beyond
Project description
Nothingburger
Nothing but semantic burgers made by humans to feed machines
The goal of nothingburger is to provide a minimalistic platform in which humans can cook up quick little snack stacks of language and data that humans can understand and machines can digest. In other words, it is meant as a convenient solution to the otherwise inconvenient and time-consuming process of writing simple recipes for interfacing with Large Language Models and the like.
Installation
pip install nothingburger
cp -R $NOTHINGBURGER_SRC/.model_library ./ # this will change later
Usage
Simple text generation
from nothingburger.model_loader import initializeModel

model_library = "./.model_library"
model_file = "ollama/vicuna.toml"

model = initializeModel(model_library + '/' + model_file)
print(model.generate("How much wood could a woodchuck chuck if a woodchuck could chuck wood?"))
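As an aside, the modelfile path can also be assembled with Python's standard pathlib rather than string concatenation; this is plain Python, not a nothingburger API:

```python
from pathlib import Path

model_library = Path("./.model_library")
model_file = "ollama/vicuna.toml"

# The "/" operator joins path segments portably; Path normalizes
# the leading "./" away when the path is rendered as a string.
model_path = model_library / model_file
print(model_path)  # .model_library/ollama/vicuna.toml
```

If initializeModel expects a plain string, pass str(model_path).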
Example Chain (Interactive Chat)
import nothingburger.templates as templates
import nothingburger.instructions as instructions
from nothingburger.cli import repl
from nothingburger.chains import Chain
from nothingburger.parsers import OutputParser
from nothingburger.model_loader import initializeModel
prompt = instructions.getRenderedInstruction("chat")
template = templates.getTemplate("alpaca_instruct_chat")
model_library = "./.model_library"
model_file = "ollama/vicuna.toml"
model = initializeModel(model_library + '/' + model_file)
chain = Chain(
    instruction      = prompt,
    template         = template,
    output_parser    = OutputParser(),
    model            = model,
    debug            = False,
    assistant_prefix = "Assistant: ",
    user_prefix      = "User: ",
)
repl(chain)
The syntax will likely feel like a much more minimalistic version of LangChain. Part of the mission is to offer an alternate interpretation of LangChain's syntax without all the fluff.
Modelfiles
One key feature of nothingburger is that it enables you to load or connect to LLMs easily through various APIs or libraries by simply writing a TOML file. For example, here's the entire contents of the file used to load Mistral 7B through Ollama:
name = "Mistral 7B"
author = "MistralAI"
license = "Apache 2.0"
website = "mistral.ai"
[service]
provider = "ollama"
base_url = "http://localhost:11434"
model_key = "mistral:7b"
[generation]
temperature = 0.7
top_k = 20
top_p = 0.9
max_tokens = 512
seed = 42
batch = 1
repeat_penalty = 1.15
presence_penalty = 0.0
frequency_penalty = 0.0
threads = 0
[generation.mirostat]
mode = 0
eta = 0.1
tau = 5.0
The generation values serve as defaults that you can override at runtime.
Supported Model Providers/Backends
- Local
- HuggingFace Transformers
- Llama.cpp (llama-cpp-python and ctransformers)
- Ollama
- HuggingFace Text-Generation-Inference
- Hosted
- OpenAI (+ Huggingface Endpoints and any other OpenAI-compatible API)
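For the hosted side, a modelfile for an OpenAI-compatible endpoint might look like the following. This is a hypothetical sketch: the [service] layout mirrors the Ollama example above, but the exact provider string and how credentials are supplied are assumptions, not documented behavior.

```toml
# Hypothetical modelfile for an OpenAI-compatible API. The layout
# mirrors the Ollama example; the "provider" value and the idea that
# credentials come from the environment are assumptions.
name = "GPT-3.5 Turbo"
author = "OpenAI"

[service]
provider = "openai"
base_url = "https://api.openai.com/v1"
model_key = "gpt-3.5-turbo"

[generation]
temperature = 0.7
max_tokens = 512
```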
File details
Details for the file nothingburger-0.0.1.tar.gz.
File metadata
- Download URL: nothingburger-0.0.1.tar.gz
- Upload date:
- Size: 9.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | cf39598d2dcae73d4faec14d2cc6932b4a0a03875e8e797d1786c292d467444c |
| MD5 | a9b70ad158350237ca6c6a9c75911149 |
| BLAKE2b-256 | b2399c9bfed778cef93f81b0634d0d87c32d25879fbf1421a7af5d2ce8ac8f63 |
File details
Details for the file nothingburger-0.0.1-py3-none-any.whl.
File metadata
- Download URL: nothingburger-0.0.1-py3-none-any.whl
- Upload date:
- Size: 10.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.13.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 8dbc6a2dd5762a74d333a63c02797245689e4f66912a63dcbe217f085347350b |
| MD5 | 4aef857184d714469ea23860210d69ea |
| BLAKE2b-256 | 057371f3573b4aeee901e583b09d239a0ec80bf398561983ac8887d4c3848c89 |