Query language to blend SQL logic and LLM reasoning across multi-modal data.
Project description
SQL 🤝 LLMs
Check out our online documentation for a more comprehensive overview.
Results from the paper are available here.
pip install blendsql
BlendSQL is a superset of SQLite for problem decomposition and hybrid question-answering with LLMs.
As a result, we can Blend together...
- 🥤 ...operations over heterogeneous data sources (e.g. tables, text, images)
- 🥤 ...the structured & interpretable reasoning of SQL with the generalizable reasoning of LLMs
It can be viewed as an inversion of the typical text-to-SQL paradigm, in which a user calls an LLM and the LLM calls a SQL program.
Now, the user has control over all calls (LLM + SQL) within a unified query language.
For example, imagine we have the following table titled parks, containing info on national parks in the United States.
We can use BlendSQL to build a travel planning LLM chatbot to help us navigate the options below.
Name | Image | Location | Area | Recreation Visitors (2022) | Description
---|---|---|---|---|---
Death Valley | | California, Nevada | 3,408,395.63 acres (13,793.3 km2) | 1,128,862 | Death Valley is the hottest, lowest, and driest place in the United States, with daytime temperatures that have exceeded 130 °F (54 °C).
Gates of the Arctic | | Alaska | 7,523,897.45 acres (30,448.1 km2) | 9,457 | The country's northernmost park protects an expanse of pure wilderness in Alaska's Brooks Range and has no park facilities.
New River Gorge | | West Virginia | 7,021 acres (28.4 km2) | 1,593,523 | The New River Gorge is the deepest river gorge east of the Mississippi River.
Katmai | | Alaska | 3,674,529.33 acres (14,870.3 km2) | 33,908 | This park on the Alaska Peninsula protects the Valley of Ten Thousand Smokes, an ash flow formed by the 1912 eruption of Novarupta.
BlendSQL allows us to ask the following questions by injecting "ingredients", which are callable functions denoted by double curly brackets ({{, }}).
Which parks don't have park facilities?
SELECT * FROM parks
WHERE NOT {{
    LLMValidate(
        'Does this location have park facilities?',
        context=(SELECT "Name" AS "Park", "Description" FROM parks),
    )
}}
What does the largest park in Alaska look like?
SELECT {{VQA('Describe this image.', 'parks::Image')}} FROM parks
WHERE "Location" = 'Alaska'
ORDER BY {{
    LLMMap(
        'Size in km2?',
        'parks::Area'
    )
}} DESC LIMIT 1
Which park protects an ash flow formed by a volcano eruption?
{{
    LLMQA(
        'Which park protects an ash flow formed by a volcano?',
        context=(SELECT "Name", "Description" FROM parks),
        options="parks::Name"
    )
}}
For in-depth descriptions of the above queries, check out our documentation.
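To actually run one of these queries from Python, you go through the same blend() API shown in the Quickstart below. Here is a rough, self-contained sketch for the first query; the parks.db filename, the abbreviated descriptions, and the assumption that LLMValidate is importable alongside LLMQA are illustrative, not taken from the official docs.

import sqlite3

import pandas as pd

from blendsql import blend, LLMValidate  # LLMValidate assumed importable like LLMQA
from blendsql.db import SQLite
from blendsql.models import OpenaiLLM

# Hypothetical setup: write a small parks table into a local SQLite database.
conn = sqlite3.connect("parks.db")
pd.DataFrame({
    "Name": ["Death Valley", "Gates of the Arctic", "New River Gorge", "Katmai"],
    "Description": [
        "Death Valley is the hottest, lowest, and driest place in the United States.",
        "The country's northernmost park protects pure wilderness and has no park facilities.",
        "The New River Gorge is the deepest river gorge east of the Mississippi River.",
        "This park protects the Valley of Ten Thousand Smokes, an ash flow formed in 1912.",
    ],
}).to_sql("parks", conn, index=False)
conn.close()

smoothie = blend(
    query="""
    SELECT * FROM parks
    WHERE NOT {{
        LLMValidate(
            'Does this location have park facilities?',
            context=(SELECT "Name" AS "Park", "Description" FROM parks)
        )
    }}
    """,
    db=SQLite("parks.db"),
    blender=OpenaiLLM("gpt-3.5-turbo"),
    ingredients={LLMValidate},
    verbose=True,
)
print(smoothie.df)  # Should keep only parks whose descriptions imply no facilities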
Features
- Supports many DBMS 💾
  - Currently, SQLite and PostgreSQL are functional - more to come!
- Easily extendable to multi-modal use cases 🖼️
- Smart parsing optimizes what is passed to external functions 🧠
  - Traverses abstract syntax tree with sqlglot to minimize LLM function calls 🌳
- Constrained decoding with outlines 🚀
- LLM function caching, built on diskcache 🔑 (see the sketch after this list)
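BlendSQL handles this caching internally, but the underlying pattern is easy to picture. Below is a minimal sketch of diskcache-backed memoization for LLM calls; the cache directory, key derivation, and the cached_llm_call / generate names are illustrative assumptions, not BlendSQL's actual internals.

import hashlib
import json

from diskcache import Cache

# Illustrative cache directory; BlendSQL manages its own location internally.
cache = Cache(".llm_cache")

def cached_llm_call(prompt: str, values: list, generate) -> str:
    """Return a cached LLM response when the same (prompt, values) pair was seen before."""
    # Key on everything that determines the output of the call.
    key = hashlib.sha256(json.dumps([prompt, values]).encode()).hexdigest()
    if key in cache:
        return cache[key]
    response = generate(prompt, values)  # e.g. a call into an OpenAI or Transformers model
    cache[key] = response
    return response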
Quickstart
from blendsql import blend, LLMQA
from blendsql.db import SQLite
from blendsql.models import OpenaiLLM, TransformersLLM
from blendsql.utils import fetch_from_hub
blendsql = """
SELECT * FROM w
WHERE city = {{
LLMQA(
'Which city is located 120 miles west of Sydney?',
(SELECT * FROM documents WHERE documents MATCH 'sydney OR 120'),
options='w::city'
)
}}
"""
# Make our smoothie - the executed BlendSQL script
smoothie = blend(
    query=blendsql,
    db=SQLite(fetch_from_hub("1884_New_Zealand_rugby_union_tour_of_New_South_Wales_1.db")),
    blender=OpenaiLLM("gpt-3.5-turbo"),
    # If you don't have OpenAI set up, you can use the small Transformers model below instead
    # blender=TransformersLLM("Qwen/Qwen1.5-0.5B"),
    ingredients={LLMQA},
    verbose=True
)
print(smoothie.df)
print(smoothie.meta.prompts)
Citation
@article{glenn2024blendsql,
    title={BlendSQL: A Scalable Dialect for Unifying Hybrid Question Answering in Relational Algebra},
    author={Parker Glenn and Parag Pravin Dakle and Liang Wang and Preethi Raghavan},
    year={2024},
    eprint={2402.17882},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
Acknowledgements
Special thanks to those below for inspiring this project. We definitely recommend checking out the linked work and citing it when applicable!
- The authors of Binding Language Models in Symbolic Languages
  - This paper was the primary inspiration for BlendSQL.
- The authors of EHRXQA: A Multi-Modal Question Answering Dataset for Electronic Health Records with Chest X-ray Images
  - As far as I can tell, the first publication to propose unifying model calls within SQL
  - Served as the inspiration for the vqa-ingredient.ipynb example
- The authors of Grammar Prompting for Domain-Specific Language Generation with Large Language Models
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution: blendsql-0.0.16.tar.gz (121.6 kB)
Built Distribution: blendsql-0.0.16-py3-none-any.whl (108.5 kB)
File details
Details for the file blendsql-0.0.16.tar.gz.
File metadata
- Download URL: blendsql-0.0.16.tar.gz
- Upload date:
- Size: 121.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.9.19
File hashes
Algorithm | Hash digest
---|---
SHA256 | d746f19adb425830e27a29b95f8a194585c567df97d27a9a97672dca4d0d181a
MD5 | b35be983451cef36c9395af60739a06b
BLAKE2b-256 | 298e6501f47cdcb157b6173a7f88d70071dea7d98f781adbef965aef6a095019
File details
Details for the file blendsql-0.0.16-py3-none-any.whl.
File metadata
- Download URL: blendsql-0.0.16-py3-none-any.whl
- Upload date:
- Size: 108.5 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.9.19
File hashes
Algorithm | Hash digest
---|---
SHA256 | 2d936f13a088be54b092fda461a6b9cf070b7ce16a4d07238ca626e59b33b465
MD5 | 5557bdbfbcd24a11e50431ce639dd143
BLAKE2b-256 | 36482b23ef261e1e1457c923e61c88a00b8aeea586bb3e9b3bf6db7d48d0d22c