PyGPT4All

Official Python CPU inference for GPT4All language models based on llama.cpp and ggml.

License: MIT

NB: Under active development

Installation

  1. The easy way is to use the prebuilt wheels:
pip install pygpt4all
  2. Build it from source:
git clone --recursive https://github.com/nomic-ai/pygpt4all && cd pygpt4all
pip install .
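
Either way, a quick import check confirms the package is available (a minimal sketch; it only imports the class used in the examples below):

# Sanity check: import the model class used in the usage examples below.
from pygpt4all.models.gpt4all import GPT4All

print("pygpt4all is installed and GPT4All is importable")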

Usage

GPT4All model

Download a GPT4All model from https://the-eye.eu/public/AI/models/nomic-ai/gpt4all/. The easiest approach is to download a file whose name ends in ggml.bin.

from pygpt4all.models.gpt4all import GPT4All

# Called for each new piece of generated text; print it as it arrives.
def new_text_callback(text):
    print(text, end="")

# Path to the GPT4All model file you downloaded.
model = GPT4All('./models/ggml-gpt4all-j.bin')
model.generate("Once upon a time, ", n_predict=55, new_text_callback=new_text_callback)
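
The callback also makes it easy to capture the generated text instead of printing it. Here is a minimal sketch using only the calls shown above; the model path is the same placeholder as in the example:

from pygpt4all.models.gpt4all import GPT4All

# Accumulate generated pieces of text instead of printing them.
chunks = []

def collect_text(text):
    chunks.append(text)

model = GPT4All('./models/ggml-gpt4all-j.bin')
model.generate("Once upon a time, ", n_predict=55, new_text_callback=collect_text)

result = "".join(chunks)
print(result)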

GPT4All-J model

Download the GPT4All-J model from https://gpt4all.io/models/ggml-gpt4all-j-v1.2-jazzy.bin

from pygpt4all.models.gpt4all_j import GPT4All_J

def new_text_callback(text):
    print(text, end="")

model = GPT4All_J('./models/ggml-gpt4all-j-v1.2-jazzy.bin')
model.generate("Once upon a time, ", n_predict=55, new_text_callback=new_text_callback)
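
If you prefer to fetch the model from a script rather than by hand, a plain standard-library download works. This is a minimal sketch; the URL is the one linked above, and the local path is just an example matching the snippet above:

import os
import urllib.request

# URL from the link above; local path matches the example above (adjust as needed).
MODEL_URL = "https://gpt4all.io/models/ggml-gpt4all-j-v1.2-jazzy.bin"
MODEL_PATH = "./models/ggml-gpt4all-j-v1.2-jazzy.bin"

# The file is several GB, so only download it if it is not already present.
if not os.path.exists(MODEL_PATH):
    os.makedirs(os.path.dirname(MODEL_PATH), exist_ok=True)
    urllib.request.urlretrieve(MODEL_URL, MODEL_PATH)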

License

This project is licensed under the MIT License.
