
A query language for language models.

Project description


LMQL

A programming language for large language models.
Documentation »

Explore Examples · Playground IDE · Report Bug


LMQL is an open source programming language for large language models (LLMs) based on a superset of Python. LMQL goes beyond traditional templating languages by providing full Python support while maintaining a lightweight programming interface.

LMQL is designed to make working with language models, such as OpenAI models and 🤗 Transformers models, more efficient and powerful through advanced functionality, including multi-variable templates, conditional distributions, constraints, datatypes, and control flow.


Explore LMQL

A simple example program in LMQL looks like this:

argmax
   "Greet LMQL:[GREETINGS]\n"

   if "Hi there" in GREETINGS:
      "Can you reformulate your greeting in the speech of victorian-era English: [VIC_GREETINGS]\n"

   "Analyse what part of this response makes it typically victorian:\n"

   for i in range(4):
      "-[THOUGHT]\n"

   "To summarize:[SUMMARY]"
from 
   "openai/text-davinci-003" 
where 
   stops_at(GREETINGS, ".") and not "\n" in GREETINGS and 
   stops_at(VIC_GREETINGS, ".") and 
   stops_at(THOUGHT, ".")

Program Output:


The main body of an LMQL program reads like standard Python (including control flow), where top-level strings are interpreted as model input containing template variables like [GREETINGS].

The argmax keyword at the beginning specifies the decoding algorithm used to generate tokens, e.g. argmax, sample, or even advanced branching decoders like beam search and best_k.

The from and where clauses specify the model and constraints that are employed during decoding.

Overall, this style of language model programming makes it easy to guide the model's reasoning process and to constrain intermediate outputs using an expressive constraint language.
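For instance, a minimal sketch of a query that swaps in the sample decoder and applies a datatype constraint could look as follows (the prompt and variable name here are illustrative placeholders, not part of the example above):

sample(temperature=0.8)
   "Q: How many continents are there?\n"
   "A: [ANSWER]"
from
   "openai/text-davinci-003"
where
   INT(ANSWER)

Here, INT(ANSWER) restricts decoding so that ANSWER parses as an integer, while the temperature argument controls the randomness of the sample decoder.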

Learn more about LMQL by exploring our Example Showcase or by running your own programs in our browser-based Playground IDE.

Getting Started

To install the latest version of LMQL, run the following command with Python 3.10 installed.

pip install lmql

Local GPU Support: If you want to run models on a local GPU, make sure to install LMQL in an environment with a GPU-enabled installation of PyTorch >= 1.11 (cf. https://pytorch.org/get-started/locally/) and install via pip install lmql[hf].

Running LMQL Programs

After installation, you can launch the LMQL playground IDE with the following command:

lmql playground

Using the LMQL playground requires an installation of Node.js. If you are in a conda-managed environment, you can install Node.js via conda install nodejs=14.20 -c conda-forge. Otherwise, please see the official Node.js website (https://nodejs.org/en/download/) for instructions on how to install it on your system.

This launches a browser-based playground IDE, including a showcase of many exemplary LMQL programs. If the IDE does not launch automatically, go to http://localhost:3000.

Alternatively, lmql run can be used to execute local .lmql files. Note that when using local HuggingFace Transformers models in the Playground IDE or via lmql run, you must first launch an instance of the LMQL Inference API for the corresponding model via the lmql serve-model command, as sketched below.
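For example, a typical two-terminal workflow for a local model could look like this (the model identifier and query file name are placeholders):

# terminal 1: serve a local HuggingFace Transformers model
lmql serve-model gpt2-medium

# terminal 2: execute a local query file against the served model
lmql run my-query.lmql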

Configuring OpenAI API Credentials

If you want to use OpenAI models, you have to configure your API credentials. To do so, create a file api.env in the active working directory with the following contents:

openai-org: <org identifier>
openai-secret: <api secret>

For system-wide configuration, you can also create an api.env file at $HOME/.lmql/api.env or at the project root of your LMQL distribution (e.g. src/ in a development copy).

Installing the Latest Development Version

To install the latest (bleeding-edge) version of LMQL, you can also run the following command:

pip install git+https://github.com/eth-sri/lmql

This will install the lmql package directly from the main branch of this repository. We do not continuously test the main version, so it may be less stable than the latest PyPI release.

Setting Up a Development Environment

To set up a conda environment for local LMQL development with GPU support, run the following commands:

# prepare conda environment
conda env create -f scripts/conda/requirements.yml -n lmql
conda activate lmql

# registers the `lmql` command in the current shell
source scripts/activate-dev.sh

Operating System: The GPU-enabled version of LMQL was tested to work on Ubuntu 22.04 with CUDA 12.0, and on Windows 10 via WSL2 with CUDA 11.7. The no-GPU version (see below) was tested to work on Ubuntu 22.04, macOS 13.2 Ventura, and Windows 10 via WSL2.

Development without GPU

This section outlines how to set up an LMQL development environment without local GPU support. Note that LMQL without local GPU support only supports API-integrated models like openai/text-davinci-003. Please see the OpenAI API documentation (https://platform.openai.com/docs/models/gpt-3-5) to learn more about the set of available models.

To set up a conda environment for LMQL without GPU support, run the following commands:

# prepare conda environment
conda env create -f scripts/conda/requirements-no-gpu.yml -n lmql-no-gpu
conda activate lmql-no-gpu

# registers the `lmql` command in the current shell
source scripts/activate-dev.sh

Project details


Release history

This version

0.1

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

xxtest-0.1.tar.gz (15.6 kB, Source)

Built Distribution

xxtest-0.1-py3-none-any.whl (16.7 kB, Python 3)

File details

Details for the file xxtest-0.1.tar.gz.

File metadata

  • Download URL: xxtest-0.1.tar.gz
  • Upload date:
  • Size: 15.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.11

File hashes

Hashes for xxtest-0.1.tar.gz:
  • SHA256: 8df1724011aae24985cab2e58dd990021d5b73c3475fede80d5383ec058d836e
  • MD5: 1fd30137c8b7f8e15e824125074454d8
  • BLAKE2b-256: f0480e6afee0ef889198e0fbdafce93d03382662e2474a6b9f5d458b4ff45395


File details

Details for the file xxtest-0.1-py3-none-any.whl.

File metadata

  • Download URL: xxtest-0.1-py3-none-any.whl
  • Upload date:
  • Size: 16.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.8.11

File hashes

Hashes for xxtest-0.1-py3-none-any.whl:
  • SHA256: 3e5573a9a3a5fb8aea0448404000830735c470513ca7a798749cea4f32d73596
  • MD5: 30463580bc8ef9c82cdd2ee5777b13bb
  • BLAKE2b-256: 7c8a0709993eb673ea8b56bfe7df9a729d582017b11e039a461364aa1a6c8760

