YAML-based LLM configuration and execution
Project description
YAMLLM
A Python library for YAML-based LLM configuration and execution.
Installation
pip install yamllm-core
uv add yamllm-core
Quick Start
To run a simple query, use a script like the following. Note: you do not need to print the response yourself; the query method handles this, using the rich library to render responses in the console.
from yamllm.core.llm import OpenAIGPT, GoogleGemini, DeepSeek, MistralAI
import os
import dotenv
dotenv.load_dotenv()
config_path = "path/to/config.yaml"
# Initialize LLM with config
llm = GoogleGemini(config_path=config_path, api_key=os.environ.get("GOOGLE_API_KEY"))
# Make a query
response = llm.query("Give me some boiler plate pytorch code please")
To have an ongoing conversation with the model, use a script like the following.
from yamllm.core.llm import OpenAIGPT, GoogleGemini, DeepSeek, MistralAI
from rich.console import Console
import os
import dotenv
dotenv.load_dotenv()
console = Console()
config_path = "path/to/config.yaml"
llm = GoogleGemini(config_path=config_path, api_key=os.environ.get("GOOGLE_API_KEY"))
while True:
    try:
        prompt = input("\nHuman: ")
        if prompt.lower() == "exit":
            break
        response = llm.query(prompt)
        if response is None:
            continue
    except FileNotFoundError as e:
        console.print(f"[red]Configuration file not found:[/red] {e}")
    except ValueError as e:
        console.print(f"[red]Configuration error:[/red] {e}")
    except Exception as e:
        console.print(f"[red]An error occurred:[/red] {str(e)}")
Configuration
YAMLLM uses YAML files for configuration. Create a YAML configuration file (e.g. config.yaml) defining the parameters for your LLM instance, including settings such as the model type, temperature, maximum tokens, and system prompt.
Example configuration:
name: "openai"  # supported: openai, google, deepseek, mistral
model: "gpt-4o-mini"  # model identifier
api_key:  # api key goes here; best practice is to load it from dotenv
base_url:  # optional: for custom endpoints, e.g. "https://generativelanguage.googleapis.com/v1beta/openai/"

# Model Configuration
model_settings:
  temperature: 0.7
  max_tokens: 1000
  top_p: 1.0
  frequency_penalty: 0.0
  presence_penalty: 0.0
  stop_sequences: []

# Request Settings
request:
  timeout: 30  # seconds
  retry:
    max_attempts: 3
    initial_delay: 1
    backoff_factor: 2

# Context Management
context:
  system_prompt: "You are a helpful assistant, helping me achieve my goals"
  max_context_length: 16000

memory:
  enabled: true
  max_messages: 10  # number of messages to keep in conversation history
  conversation_db: "yamllm/memory/conversation_history.db"
  vector_store:
    index_path: "yamllm/memory/vector_store/faiss_index.idx"
    metadata_path: "yamllm/memory/vector_store/metadata.pkl"

# Output Formatting
output:
  format: "text"  # supported: text, json, markdown
  stream: false

logging:
  level: "INFO"
  file: "yamllm.log"
  format: "%(asctime)s - %(name)s - %(levelname)s - %(message)s"

# Tool Management - in development
tools:
  enabled: false
  tool_timeout: 10  # seconds
  tool_list: ['calculator', 'web_search']

# Safety Settings
safety:
  content_filtering: true
  max_requests_per_minute: 60
  sensitive_keywords: []
Place the YAML configuration file in your project directory and pass its path when initializing the LLM instance.
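As a quick sanity check, the YAML configuration above parses into ordinary nested Python dictionaries. The sketch below (using the standard PyYAML package directly, not yamllm's own loader) shows how a trimmed-down version of the configuration looks once loaded:

```python
import yaml  # PyYAML

# A trimmed-down version of the example configuration above.
config_text = """
name: "openai"
model: "gpt-4o-mini"

model_settings:
  temperature: 0.7
  max_tokens: 1000

output:
  format: "text"
  stream: false
"""

# safe_load parses the YAML into plain dicts, lists, and scalars.
config = yaml.safe_load(config_text)

# Nested sections become nested dicts, so settings are easy to inspect.
print(config["model"])                          # gpt-4o-mini
print(config["model_settings"]["temperature"])  # 0.7
print(config["output"]["stream"])               # False
```

This also makes it easy to validate a configuration file (for example, checking that required keys are present) before handing its path to the LLM constructor.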
Features
- YAML-based configuration
- Simple API interface
- Customizable prompt templates
- Error handling and retry logic
- Built-in short-term memory stored in a SQLite database
- Long-term memory via a vector store with semantic search
- Streaming or non-streaming responses
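The retry settings in the example configuration (max_attempts, initial_delay, backoff_factor) describe standard exponential backoff. The helper below is an illustrative sketch of that pattern, not yamllm's internal implementation; the function name with_retries and the flaky demo callable are invented for this example:

```python
import time

def with_retries(fn, max_attempts=3, initial_delay=1.0, backoff_factor=2.0):
    """Call fn, retrying on failure with exponentially growing delays.

    Mirrors the request.retry settings from the example configuration:
    if attempt 1 fails, wait initial_delay; if attempt 2 fails, wait
    initial_delay * backoff_factor; and so on, up to max_attempts.
    """
    delay = initial_delay
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts:
                raise  # out of attempts: surface the last error
            time.sleep(delay)
            delay *= backoff_factor

# Demo: a callable that fails twice, then succeeds on the third attempt.
calls = {"count": 0}

def flaky():
    calls["count"] += 1
    if calls["count"] < 3:
        raise RuntimeError("transient error")
    return "ok"

# A short initial delay here just keeps the demo fast.
result = with_retries(flaky, max_attempts=3, initial_delay=0.01)
print(result)         # ok
print(calls["count"])  # 3
```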
License
MIT License
Project details
Release history
Download files
Download the file for your platform.
Source Distribution
Built Distribution
File details
Details for the file yamllm_core-0.1.8.tar.gz.
File metadata
- Download URL: yamllm_core-0.1.8.tar.gz
- Upload date:
- Size: 124.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | e289c72a1e650db4f57a8b140e1a2fa89fbb6096453d57d28deec7b683f73866 |
| MD5 | 8c815a1da05b6505fa15502b05f41894 |
| BLAKE2b-256 | 6bc4659add36ea6c47e80d0d82785fedab6b53420c6b90505f924c304871f2b1 |
File details
Details for the file yamllm_core-0.1.8-py3-none-any.whl.
File metadata
- Download URL: yamllm_core-0.1.8-py3-none-any.whl
- Upload date:
- Size: 3.8 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.12.2
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | c5e4512aefaf5e4c16cf1b20bf9b57a566683f5a6d8d4e427ee1bc01d52d63f7 |
| MD5 | cedfb7209bde2fae2597616d98c9591d |
| BLAKE2b-256 | a1472248b7e53bf21c1264a84229e5ca02d1eb6200ffdcd84eba6ef98e48dee9 |