Python bindings for GPT4All

Project description

Python GPT4All

This package contains a set of Python bindings around the llmodel C-API.

Package on PyPI: https://pypi.org/project/gpt4all/

Documentation

https://docs.gpt4all.io/gpt4all_python.html

Installation

The easiest way to install the Python bindings for GPT4All is to use pip:

pip install gpt4all

This downloads and installs the latest release of the gpt4all package from PyPI.
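To confirm the install worked, you can check the package metadata and try importing it (a quick sanity check, not part of the official instructions):

pip show gpt4all
python -c "from gpt4all import GPT4All"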

Local Build

As an alternative to downloading via pip, you may build the Python bindings from source.

Prerequisites

On Windows and Linux, building GPT4All requires the complete Vulkan SDK. You may download it from here: https://vulkan.lunarg.com/sdk/home

macOS users do not need Vulkan, as GPT4All will use Metal instead.

Building the Python bindings

  1. Clone GPT4All and change directory:
git clone --recurse-submodules https://github.com/nomic-ai/gpt4all.git
cd gpt4all/gpt4all-backend
  2. Build the backend.

If you are using Windows and have Visual Studio installed:

cmake -B build
cmake --build build --parallel --config RelWithDebInfo

For all other platforms:

cmake -B build -DCMAKE_BUILD_TYPE=RelWithDebInfo
cmake --build build --parallel

RelWithDebInfo is a good default, but you can also use Release or Debug depending on the situation.

  3. Install the Python package:
cd ../../gpt4all-bindings/python
pip install -e .

Usage

Test it out! In a Python script or console:

from gpt4all import GPT4All
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")
output = model.generate("The capital of France is ", max_tokens=3)
print(output)
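The documentation linked above covers further generation options. As a brief sketch, assuming the chat_session context manager and the streaming keyword available in recent releases of the bindings:

from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")

# A chat session keeps conversation context between generate() calls.
with model.chat_session():
    print(model.generate("Name three French cities.", max_tokens=64))

# Tokens can also be streamed as they are produced.
for token in model.generate("The capital of France is ", max_tokens=8, streaming=True):
    print(token, end="", flush=True)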

GPU Usage

from gpt4all import GPT4All
model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf", device='gpu') # device='amd', device='intel'
output = model.generate("The capital of France is ", max_tokens=3)
print(output)
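If no usable GPU is found, the constructor may raise an error. A minimal fallback sketch (the exact exception type may vary, so a broad except is used here purely for illustration):

from gpt4all import GPT4All

try:
    # Ask for any available GPU backend (Vulkan on Windows/Linux, Metal on macOS).
    model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf", device='gpu')
except Exception:
    # Fall back to CPU inference if no suitable device is available.
    model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf", device='cpu')

print(model.generate("The capital of France is ", max_tokens=3))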

Troubleshooting a Local Build

  • If you're on Windows and have compiled with a MinGW toolchain, you might run into an error like:

    FileNotFoundError: Could not find module '<...>\gpt4all-bindings\python\gpt4all\llmodel_DO_NOT_MODIFY\build\libllmodel.dll'
    (or one of its dependencies). Try using the full path with constructor syntax.
    

    The key phrase in this case is "or one of its dependencies". The Python interpreter you're using probably doesn't see the MinGW runtime dependencies. At the moment, the following three are required: libgcc_s_seh-1.dll, libstdc++-6.dll and libwinpthread-1.dll. You should copy them from MinGW into a folder where Python will see them, preferably next to libllmodel.dll (an alternative workaround is sketched after this list).

  • Note regarding the Microsoft toolchain: Compiling with MSVC is possible, but not the official way to go about it at the moment. MSVC doesn't produce DLLs with a lib prefix, which the bindings expect. You'd have to amend that yourself.
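As an alternative to copying the DLLs, you can point the interpreter at the MinGW runtime before importing the bindings. A minimal sketch, assuming MinGW is installed at C:\mingw64 (adjust the path to your toolchain); whether this resolves the dependencies depends on how the DLL is loaded, so copying the files remains the safer option:

import os

# Hypothetical MinGW location; use your toolchain's bin directory, which
# contains libgcc_s_seh-1.dll, libstdc++-6.dll and libwinpthread-1.dll.
os.add_dll_directory(r"C:\mingw64\bin")

from gpt4all import GPT4All  # the runtime dependencies should now resolve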

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files available for this release. See the tutorial on generating distribution archives.

Built Distributions

gpt4all-2.3.2-py3-none-win_amd64.whl (6.3 MB)

Uploaded Python 3 Windows x86-64

gpt4all-2.3.2-py3-none-manylinux1_x86_64.whl (3.9 MB)

Uploaded Python 3

gpt4all-2.3.2-py3-none-macosx_10_15_universal2.whl (5.8 MB)

Uploaded Python 3 macOS 10.15+ universal2 (ARM64, x86-64)

File details

Details for the file gpt4all-2.3.2-py3-none-win_amd64.whl.

File metadata

  • Download URL: gpt4all-2.3.2-py3-none-win_amd64.whl
  • Upload date:
  • Size: 6.3 MB
  • Tags: Python 3, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.8.12

File hashes

Hashes for gpt4all-2.3.2-py3-none-win_amd64.whl
Algorithm Hash digest
SHA256 cf71c4ff9af0e31026b9e101f2f65e7e0d6dc00870431be37d31b152081904e1
MD5 098a0f82629f890af65b5d91ae966d82
BLAKE2b-256 dabf9aed15629cc9e4948a391842ae6b1ad088fc7710749118a2e16fb59376a5

See more details on using hashes here.
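To verify a downloaded wheel against the published hashes, a short sketch using the filename and SHA256 value listed above:

import hashlib

# Compute the SHA256 digest of the downloaded wheel and compare it with
# the value published on this page.
with open("gpt4all-2.3.2-py3-none-win_amd64.whl", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print(digest == "cf71c4ff9af0e31026b9e101f2f65e7e0d6dc00870431be37d31b152081904e1")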

File details

Details for the file gpt4all-2.3.2-py3-none-manylinux1_x86_64.whl.

File metadata

File hashes

Hashes for gpt4all-2.3.2-py3-none-manylinux1_x86_64.whl
Algorithm Hash digest
SHA256 ad47d297b5e6bf2d949aef37fd87a0c7b9f7db2045fcff35146235eacbd1eda4
MD5 c6728fde14a105ce04374d68a33e739e
BLAKE2b-256 c924eb3ba6bac9c5522b2f8648da75f7346efc648b12b717622b1bb78f1e8b73

See more details on using hashes here.

File details

Details for the file gpt4all-2.3.2-py3-none-macosx_10_15_universal2.whl.

File metadata

File hashes

Hashes for gpt4all-2.3.2-py3-none-macosx_10_15_universal2.whl
Algorithm Hash digest
SHA256 8ef4289992166f63cca7f3fe431c4c0c90bcddb0736d4484b558f90bc18123b0
MD5 f4f71997e6d51ebe5ddbcb40e0a29a1c
BLAKE2b-256 2ec411360af5b28856c78cfd36488fe80efca6b6788f68f991929fee1de2ef9b

See more details on using hashes here.
