Converse with a GPT4All LLM locally
Project description
local_llm_cli is a Python package that allows you to converse with a GPT4All Large Language Model (LLM) locally. This can be useful for testing, development, and debugging.
Currently, this library supports only the GPT4All model; support for other models and additional functionality is planned for future updates.
Installation
To install local_llm_cli, use pip:
pip install local_llm_cli
You'll also need to ensure that you have the necessary model files available locally.
Usage
The converse subpackage provides a function to load a GPT4All LLM and converse with it. Here's a simple usage example:
from local_llm_cli.converse.chat import load_and_interact
# define the model path
model_path = 'path/to/your/model'
# call the function to start conversing with the LLM
load_and_interact(model_path)
In this example, model_path should be the path to the GPT4All model file on your local system.
The load_and_interact function also accepts optional arguments to specify the model context (model_n_ctx) and batch size (model_n_batch). If these arguments are not provided, they default to 1024 and 8, respectively.
Here's an example with custom context and batch size:
load_and_interact(model_path, model_n_ctx=2048, model_n_batch=16)
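Since the package is aimed at command-line use, you may want to expose these parameters as flags. A sketch using argparse; the flag names are illustrative, not part of the package, and the defaults mirror the ones described above:

```python
import argparse

def parse_args(argv=None):
    """Parse CLI flags mirroring load_and_interact's parameters."""
    parser = argparse.ArgumentParser(
        description="Chat with a local GPT4All model"
    )
    parser.add_argument("model_path", help="Path to the GPT4All model file")
    parser.add_argument("--n-ctx", type=int, default=1024,
                        help="Model context size (default: 1024)")
    parser.add_argument("--n-batch", type=int, default=8,
                        help="Batch size (default: 8)")
    return parser.parse_args(argv)
```

With the arguments parsed, you would forward them as `load_and_interact(args.model_path, model_n_ctx=args.n_ctx, model_n_batch=args.n_batch)`.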
You can stop the conversation at any time by typing exit.
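Conceptually, a chat session like this is a read-generate-print loop that terminates on exit. A minimal sketch of such a loop with a stubbed model (the real function presumably wires in a GPT4All model where the echo stub stands):

```python
def chat_loop(generate, prompt_fn=input, print_fn=print):
    """Minimal REPL: read a prompt, print the model's reply, stop on 'exit'.

    `generate` is any callable mapping a prompt string to a reply string;
    injecting the I/O callables keeps the loop easy to test.
    """
    while True:
        user_input = prompt_fn("> ")
        if user_input.strip().lower() == "exit":
            break
        print_fn(generate(user_input))

# Stub standing in for a real model's text-generation call.
def echo_model(prompt):
    return f"echo: {prompt}"
```

Calling `chat_loop(echo_model)` starts an interactive session; typing exit ends it, mirroring the behaviour described above.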
License
This project is licensed under the MIT License. See the LICENSE file for more details.
This package was crafted with ❤️ by Harsh Avinash in approximately 22 minutes. Enjoy conversing with your local LLM!
Download files
File details
Details for the file local_llm_cli-0.1.3.tar.gz.
File metadata
- Download URL: local_llm_cli-0.1.3.tar.gz
- Upload date:
- Size: 2.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | faf75c1c0a5a78459f36abdb0701f1680771c12c1867f2beb65b88a70931c2f4
MD5 | 5b64ee6b58115fcd277a03abf6d57227
BLAKE2b-256 | 1b2799cd31ee3b5b6e9cd2bff0003fb44050e4a1f4cc754daf47f7393ded44b5
File details
Details for the file local_llm_cli-0.1.3-py3-none-any.whl.
File metadata
- Download URL: local_llm_cli-0.1.3-py3-none-any.whl
- Upload date:
- Size: 3.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | b9568b4d79057cf34681dc5237b06d7f8fe15cfcce9ecef40b6afb6aaadbea16
MD5 | 1245f03c6ec91fcb5605c4950923e7ee
BLAKE2b-256 | 28d3a667476f5272cd6314d77a7ba8c54ca66103f4af9b56d74ca97addfa27b1