Simple interface for creating and managing LLM chains
# LLM Blocks

LLM Blocks is a Python module for interacting with Large Language Model (LLM) chains. It provides a simple, flexible way to create and manage LLM chains that call models such as OpenAI's GPT-3.5 Turbo.
## Installation

First, make sure Python is installed. Then install the module's dependencies (the second file is only needed for development):

```bash
pip install -r requirements.txt
pip install -r requirements.dev.txt
```
## Configuration

To use LLM Blocks, you need an OpenAI API key. Store the key in a `.env` file or export it as an environment variable.

In a `.env` file:

```
OPENAI_API_KEY=your_openai_api_key
```

As an environment variable:

```bash
export OPENAI_API_KEY=your_openai_api_key
```
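If you go the `.env` route, the key still has to end up in the process environment. The sketch below shows one way to do that with the `python-dotenv` package; this is an optional, hypothetical setup step and not necessarily part of llm_blocks itself.

```python
import os

from dotenv import load_dotenv  # requires the python-dotenv package

# Read OPENAI_API_KEY from a local .env file (if present) into os.environ
load_dotenv()

api_key = os.getenv("OPENAI_API_KEY")
if not api_key:
    raise RuntimeError("OPENAI_API_KEY is not set; add it to .env or export it")
```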
## Usage

The `llm_blocks` package contains the `chat_utils` module, which provides the tools for creating and managing your LLM chains.

Here's a simple example using the `GenericChain` class:

```python
from llm_blocks.chat_utils import GenericChain

# Create a chain from a prompt template
template = "The meaning of {word} is:"
my_chain = GenericChain(template)

# Call the chain with any input you like
response = my_chain("friendship")
print(response)
```
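Because a chain is just a callable built from a template, you can create several chains and reuse them. The sketch below assumes only the call pattern shown above (one placeholder, passed positionally); the prompt texts are made up for illustration.

```python
from llm_blocks.chat_utils import GenericChain

# Two chains with different (illustrative) prompt templates
define_chain = GenericChain("The meaning of {word} is:")
synonym_chain = GenericChain("List three synonyms for {word}:")

for word in ["friendship", "curiosity"]:
    print(define_chain(word))
    print(synonym_chain(word))
```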
## Module Structure

- `exclude.toml`: configuration file specifying files or directories to exclude.
- `requirements.dev.txt`: development dependencies for this module.
- `requirements.txt`: main dependencies for this module.
- `llm_blocks/`
  - `chat_utils.py`: defines the `GenericChain` class and utility functions for working with LLM chains.
## Logging Responses

The `GenericChain` class keeps a log of all interactions with the underlying LLM. The log is stored as a list of dictionaries and can be accessed via `my_chain.logs`.

Example:

```python
for log in my_chain.logs:
    print(f'Inputs: {log["inputs"]}')
    print(f'Callback: {log["callback"]}')
    print(f'Response: {log["response"]}')
    print(f'Response Time: {log["response_time"]}')
    print('---')
```
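Because the log is a plain list of dictionaries, you can post-process it however you like. Here is a quick summary of call count and average latency, using only the keys shown above and assuming `response_time` is a number of seconds:

```python
# Summarize the interaction log (keys taken from the example above)
if my_chain.logs:
    total_time = sum(log["response_time"] for log in my_chain.logs)
    avg_time = total_time / len(my_chain.logs)
    print(f"{len(my_chain.logs)} calls, average response time: {avg_time:.2f}s")
```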
## Contributing

Feel free to submit pull requests, report bugs, or suggest new features through the GitHub repository. We appreciate your contributions and feedback!

## License

This project is licensed under the MIT License.
## Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
### Source Distribution

`llm-blocks-0.2.4.tar.gz` (3.4 kB)

### Built Distribution

`llm_blocks-0.2.4-py3-none-any.whl` (3.7 kB)
## File details

Details for the file `llm-blocks-0.2.4.tar.gz`.

### File metadata

- Download URL: llm-blocks-0.2.4.tar.gz
- Upload date:
- Size: 3.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.1

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `5a39a6b2eccbef27c2ae94c88c8b149a23b8f65d5d93a44960fb0cedc7ce376f` |
| MD5 | `b7b6ada5d0824de61cc6d0bff391a375` |
| BLAKE2b-256 | `dbde54c9dc04811ee5f170145bd4bdeab5a46872da4589b71e620d3cae7df49c` |
## File details

Details for the file `llm_blocks-0.2.4-py3-none-any.whl`.

### File metadata

- Download URL: llm_blocks-0.2.4-py3-none-any.whl
- Upload date:
- Size: 3.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.1

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | `9566bbf60eda0a9b84f30757b4b25af648a3841f891e203e5593ea9fe1b550f5` |
| MD5 | `275930d609026fc0867ed9ed3d94d02d` |
| BLAKE2b-256 | `0933ca577fc9f0436096be66f1fcce203c1a93cc613126494aa3c65bc17c237e` |