
Project description

Function LLM for Python


Use local LLMs in your Python apps, with GPU acceleration and zero dependencies. This package patches the OpenAI and Anthropic clients to run inference locally, using predictors hosted on Function.

[!TIP] We offer a similar package for use in the browser and Node.js. Check out fxn-llm-js.

[!IMPORTANT] This package is still a work-in-progress, so the API could change drastically between releases.

Installing Function LLM

Function LLM is distributed on PyPI. To install, open a terminal and run the following command:

# Install Function LLM
$ pip install --upgrade fxn-llm

[!NOTE] Function LLM requires Python 3.10+

[!IMPORTANT] Make sure to create an access key by signing in to Function. You'll need it to fetch the predictor at runtime.
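
How the key is supplied isn't shown in this excerpt, so the following is only a hedged sketch: it assumes the key is read from a FXN_ACCESS_KEY environment variable, or accepted as an access_key keyword argument by the locally function introduced in the next section. Check the Function docs for the exact mechanism.

import os

from openai import OpenAI
from fxn_llm import locally

# Assumption: the access key is picked up from the environment...
os.environ["FXN_ACCESS_KEY"] = "<YOUR_ACCESS_KEY>"

# ...or passed explicitly when patching the client (the keyword name is assumed).
openai = locally(OpenAI(), access_key="<YOUR_ACCESS_KEY>")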

Using the OpenAI Client Locally

To run text generation and embedding models locally using the OpenAI client, patch your OpenAI instance with the locally function:

from openai import OpenAI
from fxn_llm import locally

# 💥 Create your OpenAI client
openai = OpenAI()

# 🔥 Make it local
openai = locally(openai)

# 🚀 Generate embeddings
embeddings = openai.embeddings.create(
    model="@nomic/nomic-embed-text-v1.5-quant",
    input="search_query: Hello world!"
)

[!WARNING] Currently, only openai.embeddings.create is supported. Text generation is coming soon!
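
Because locally patches the standard OpenAI client, the call above should return the usual OpenAI SDK response object. As a minimal sketch, assuming the patched call preserves that response shape, the vectors can be read like this:

# Each entry in `data` corresponds to one input string; `embedding` is a list of floats.
vector = embeddings.data[0].embedding
print(len(vector), vector[:8])  # dimensionality and the first few values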


Function is a product of NatML Inc.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

fxn_llm-0.0.2.tar.gz (12.7 kB)


Built Distribution

fxn_llm-0.0.2-py3-none-any.whl (12.2 kB)


File details

Details for the file fxn_llm-0.0.2.tar.gz.

File metadata

  • Download URL: fxn_llm-0.0.2.tar.gz
  • Upload date:
  • Size: 12.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.8

File hashes

Hashes for fxn_llm-0.0.2.tar.gz

  • SHA256: 1aff66fe419bc695531948f57928672775bb9c803983e0c77aee77bd2fa1e34c
  • MD5: e184dd395999ebcfbf1b6f8c84bb00c9
  • BLAKE2b-256: fef0e69fa36f1391fdf98804403352a8e48b17cbc52fe8e443d5058cf0e4b1fb

See more details on using hashes here.
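
To check a downloaded archive against the SHA256 digest listed above, a minimal sketch using only the Python standard library is:

import hashlib

# Compute the SHA256 digest of the downloaded source distribution and compare
# it with the published value.
expected = "1aff66fe419bc695531948f57928672775bb9c803983e0c77aee77bd2fa1e34c"

with open("fxn_llm-0.0.2.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == expected, "Hash mismatch: the download may be corrupted or tampered with"
print("SHA256 verified")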

File details

Details for the file fxn_llm-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: fxn_llm-0.0.2-py3-none-any.whl
  • Upload date:
  • Size: 12.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.8

File hashes

Hashes for fxn_llm-0.0.2-py3-none-any.whl

  • SHA256: 0987ab56908e0746d1c563ac6e8f68334600de7dca693fd140b833283249247e
  • MD5: 582b081b2477acdc276e125fe06fbfa9
  • BLAKE2b-256: efb24c803129cdeb6f22f79a73abc5f047fa0b1676e75100bfce6adceeb59194

See more details on using hashes here.
