
Function LLM for Python


Use local LLMs in your Python apps, with GPU acceleration and zero dependencies. This package patches the OpenAI and Anthropic clients to run inference locally, using predictors hosted on Function.

[!IMPORTANT] This package is still a work-in-progress, so the API could change drastically between releases.

Installing Function LLM

Function LLM is distributed on PyPI. To install it, open a terminal and run the following command:

# Install Function LLM
$ pip install --upgrade fxn-llm

[!IMPORTANT] Make sure to create an access key by signing in to Function. You'll need it to fetch predictors at runtime (see the sketch below for one way to supply it).

[!NOTE] Function LLM requires Python 3.10+
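
The examples in this README call locally(client) without a key, so the access key presumably comes from your environment or configuration. Below is a minimal sketch of one way to wire it up, assuming locally accepts an access_key keyword argument and that you keep the key in an environment variable of your own choosing; both are assumptions, so check the fxn-llm source for the actual parameter name.

import os

from openai import OpenAI
from fxn_llm import locally

# Assumption: `locally` accepts an `access_key` keyword argument.
# FXN_ACCESS_KEY is simply an environment variable you set yourself;
# store the key wherever suits your setup.
openai = locally(
    OpenAI(),
    access_key=os.environ["FXN_ACCESS_KEY"],
)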

Using the OpenAI Client Locally

To run text generation and embedding models locally using the OpenAI client, patch your OpenAI instance with the locally function:

from openai import OpenAI
from fxn_llm import locally

# 💥 Create your OpenAI client
openai = OpenAI()

# 🔥 Make it local
openai = locally(openai)

# 🚀 Generate embeddings
embeddings = openai.embeddings.create(
    model="@nomic/nomic-embed-text-v1.5-quant",
    input="search_query: Hello world!"
)
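
The patched call presumably returns the same CreateEmbeddingResponse shape as the stock OpenAI client, with vectors under .data[*].embedding, and presumably accepts list-valued input as well; both are assumptions rather than documented behavior. Under those assumptions, a short sketch that embeds a query and a document and compares them with plain-Python cosine similarity:

import math

# Embed a query and a document in one call; nomic-embed expects the
# "search_query:" / "search_document:" prefixes used above.
response = openai.embeddings.create(
    model="@nomic/nomic-embed-text-v1.5-quant",
    input=[
        "search_query: How do I run models locally?",
        "search_document: Function LLM patches the OpenAI client to run predictors on-device.",
    ],
)
query_vec, doc_vec = (item.embedding for item in response.data)

# Plain-Python cosine similarity, in keeping with the package's zero-dependency pitch.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

print(f"query/document similarity: {cosine(query_vec, doc_vec):.3f}")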

[!WARNING] Currently, only openai.embeddings.create is supported. Text generation is coming soon!

Using the Anthropic Client Locally

To run text generation models locally using the Anthropic client, patch your Anthropic instance with the locally function, specifying the anthropic provider:

from anthropic import Anthropic
from fxn_llm import locally

# 💥 Create your Anthropic client
anthropic = Anthropic()

# 🔥 Make it local
anthropic = locally(anthropic, provider="anthropic")

# 🚀 Chat
message = anthropic.messages.create(
  model="@meta/llama-3.1-8b-quant",
  messages=[{ "role": "user", "content": "Hello, Llama" }],
  max_tokens=1024,
)

[!CAUTION] Anthropic support is not yet functional. It is still a work in progress.
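
Once Anthropic support lands, reading the reply should presumably follow the stock Anthropic response shape, with text blocks under message.content; this is an assumption, and the patched client may differ.

# Assumption: the patched client returns a standard Anthropic Message object.
for block in message.content:
    if block.type == "text":
        print(block.text)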



Function is a product of NatML Inc.
