Sibila
Structured queries from local or online LLM models.
Extract structured data from LLM models, using a common API to access remote models like GPT-4 or local models via llama.cpp.
- Query structured data into Pydantic BaseModel objects or Python dataclasses.
- Use the same API for local and remote models.
- Thread-based interaction with chat/instruct fine-tuned models.
- Compare output across local/remote models with included utilities, text or CSV output.
- Model directory: store configurations and quickly switch between models.
- Automatic chat templates: identifies and uses the right templates for each model.
With Sibila you can extract structured data from a local quantized model like OpenChat-3.5 with 7B params:
from sibila import LlamaCppModel, OpenAIModel
from pydantic import BaseModel

class Info(BaseModel):
    event_year: int
    first_name: str
    last_name: str
    age_at_the_time: int
    nationality: str

openchat = LlamaCppModel("openchat-3.5-1210.Q5_K_M.gguf")

openchat.extract(Info,
                 "Who was the first man on the moon?",
                 inst="Just be helpful.")  # instructions, aka system message
Outputs an object of class Info, initialized with the model's output:
Info(event_year=1969,
     first_name='Neil',
     last_name='Armstrong',
     age_at_the_time=38,
     nationality='American')
With the same API you can also query OpenAI models:
gpt4 = OpenAIModel("gpt-4-0613")

gpt4.extract(Info,
             "Who was the first man on the moon?",
             inst="Just be helpful.")  # instructions, aka system message
This creates an Info object initialized from the model's response, as above.
If Pydantic BaseModel objects are too much for your project, you can also use a lighter Python dataclass.
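As a sketch of the dataclass alternative, the Info model from the example above could be declared with the standard library's dataclasses module instead of Pydantic (the field names mirror the earlier example; the commented extract() call assumes the same local model file):

```python
from dataclasses import dataclass, fields

# Dataclass equivalent of the Pydantic Info model shown earlier.
@dataclass
class Info:
    event_year: int
    first_name: str
    last_name: str
    age_at_the_time: int
    nationality: str

# With a local model configured as in the earlier example, the same
# extract() call would take the dataclass as its target type:
#   openchat = LlamaCppModel("openchat-3.5-1210.Q5_K_M.gguf")
#   info = openchat.extract(Info, "Who was the first man on the moon?")

print([f.name for f in fields(Info)])
```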
Sibila also includes model management and tools to compare output between models.
Examples
The included examples show what you can do with local or remote models in Sibila: structured data extraction, classification, summarization, etc.
Although you can use any llama.cpp or OpenAI supported model, by default most of the examples use OpenChat, a local 7B quantized model which is very capable for information extraction tasks. To use OpenAI models, just uncomment a line or two.
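Classification, for instance, can be framed as extraction into a constrained type: narrowing a field to a fixed set of labels limits what the model can answer. The sentiment target below is a hypothetical illustration, not taken from the Sibila examples:

```python
from dataclasses import dataclass
from typing import Literal, get_args

# Hypothetical classification target: a Literal-typed field restricts
# the extracted value to one of the allowed labels.
@dataclass
class Sentiment:
    label: Literal["positive", "negative", "neutral"]

# With a configured model, classification would read like extraction:
#   result = openchat.extract(Sentiment, "I loved this movie!")

print(get_args(Sentiment.__annotations__["label"]))
```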
Installation and getting started
Sibila can be installed from PyPI with:
pip install sibila
For running local models with hardware acceleration, accessing OpenAI and general "getting started" help, see How to get started.
Documentation
The API reference and more information are available here.
License
This project is licensed under the MIT License - see the LICENSE file for details.
Sibila?
Sibila is the Portuguese word for Sibyl. The Sibyls were wise oracular women in ancient Greece. Their mysterious words puzzled people throughout the centuries, providing insight or prophetic predictions.
Michelangelo's Delphic Sibyl, in the Sistine Chapel ceiling.
File details
Details for the file sibila-0.3.0.tar.gz

File metadata
- Download URL: sibila-0.3.0.tar.gz
- Size: 54.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes
Algorithm | Hash digest
---|---
SHA256 | a3243f3f5ed04a0fd97304b942499a0714c232345ee4ce781a77a47e9f82c6c0
MD5 | 5115cc61001d859bcb0d0d3f1160785e
BLAKE2b-256 | 528e59263a265cdd38e5400294f7acd0cccc2905eaa485d194281b37549c748a
File details
Details for the file sibila-0.3.0-py3-none-any.whl

File metadata
- Download URL: sibila-0.3.0-py3-none-any.whl
- Size: 56.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.6

File hashes
Algorithm | Hash digest
---|---
SHA256 | 5794ebf87799ffd99e38a7a08c3cb86d486024af4e544f3a0ec26148d3c8608b
MD5 | 4f5ec1e6cbe473ee4f22de0d28359881
BLAKE2b-256 | 0e44874dd7b9595d8bf3fee7ed319380c6ed743b21eff247c63668873d2ab073