
LangExtract provider plugin for llama.cpp


LangExtract llama-cpp-python Provider


A provider plugin for LangExtract that supports llama.cpp models.

Installation

pip install langextract-llamacpp

Supported Model IDs

Model IDs use one of the following formats:

  1. Hugging Face repo with a file name: hf:<hf_repo_id>:<filename>
  2. Hugging Face repo without a file name: hf:<hf_repo_id> (in this case the filename is None)
  3. Local file: file:<path_to_model>

where hf_repo_id is an existing Hugging Face model repository.
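For illustration, here are example model_id strings for each format (the repository and file names are placeholders, not recommendations):

```python
# Each model_id is a plain string; the prefix selects how the model is loaded.
hf_with_file = "hf:MaziyarPanahi/Mistral-7B-Instruct-v0.3-GGUF:*Q4_K_M.gguf"  # loaded via Llama.from_pretrained
hf_repo_only = "hf:MaziyarPanahi/Mistral-7B-Instruct-v0.3-GGUF"  # filename is None
local_file = "file:/models/Mistral-7B-Instruct-v0.3.Q4_K_M.gguf"  # loaded via Llama(...)

for model_id in (hf_with_file, hf_repo_only, local_file):
    # The scheme ("hf" or "file") is everything before the first colon.
    scheme, _, rest = model_id.partition(":")
    print(scheme, rest)
```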

Usage

Using a Hugging Face repository; this calls Llama.from_pretrained(...).

import langextract as lx

config = lx.factory.ModelConfig(
    model_id="hf:MaziyarPanahi/Mistral-7B-Instruct-v0.3-GGUF:*Q4_K_M.gguf",
    provider="LlamaCppLanguageModel",  # optional; the hf: prefix resolves to this provider
    provider_kwargs=dict(
        n_gpu_layers=-1,
        n_ctx=4096,
        verbose=False,
        completion_kwargs=dict(
            temperature=1.1,
            seed=42,
        ),
    ),
)

model = lx.factory.create_model(config)

result = lx.extract(
    model=model,
    text_or_documents="Your input text",
    prompt_description="Extract entities",
    examples=[...],
)

Using a local file path; this calls Llama(...).

import langextract as lx

config = lx.factory.ModelConfig(
    model_id="file:Mistral-7B-Instruct-v0.3.Q4_K_M.gguf",
    provider="LlamaCppLanguageModel",  # optional; the file: prefix resolves to this provider
    provider_kwargs=dict(
        ...
    ),
)

...

For provider_kwargs, refer to the documentation for the Llama class.

For completion_kwargs, refer to the documentation for the create_chat_completion method.

OpenAI compatible Web Server

When using the llama-cpp-python server (or llama.cpp's server), you can use OpenAILanguageModel in the provider field, since both implement an OpenAI-compatible web server.

To set this up, choose OpenAILanguageModel as the provider and supply the server’s base URL and an API key (any value) in provider_kwargs. The model_id field is optional.
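As a sketch of the server side (assuming llama-cpp-python is installed with its server extras; the model path is a placeholder), such a server can be started with:

```shell
# Install the OpenAI-compatible server extras and launch on port 8000.
pip install 'llama-cpp-python[server]'
python -m llama_cpp.server --model ./Mistral-7B-Instruct-v0.3.Q4_K_M.gguf --port 8000
```

The base_url in the config below then points at this server's /v1/ endpoint.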

config = lx.factory.ModelConfig(
    model_id="local", # optional
    provider="OpenAILanguageModel", # explicitly set the provider to `OpenAILanguageModel`
    provider_kwargs=dict(
        base_url="http://localhost:8000/v1/",
        api_key="llama-cpp", # any value; mandatory
    ),
)

model = lx.factory.create_model(config)

result = lx.extract(
    model=model,
    ...
)
