A simple way to interact with LLaMA models via GGUF
GGUF core
This package is a GGUF (GPT-Generated Unified Format) file caller.
Install the caller via pip/pip3 (once only):
pip install gguf-core
Update the caller (if not on the latest version) with:
pip install gguf-core --upgrade
user manual
This is a command-line (CLI) package; you can view the user manual by adding the flag -h or --help:
gguf -h
check current version
gguf -v
cli connector
with command-line interface
gguf c
gui connector
with graphical user interface
gguf g
interface selector
selection menu for connector interface(s) above
gguf i
metadata reader
read model metadata for details
gguf r
GGUF file(s) in the current directory will automatically be detected by the caller.
get feature
get a GGUF from a URL; clone/download it to the current directory
gguf get [url]
sample model list
You can either use the get feature above or opt for a sample GGUF straight from the sample list:
gguf s
pdf analyzer (beta)
You can now load your PDF file(s) straight into the model to generate a digested summary; try it out with:
gguf p
wav analyzer (beta)
You can speak to the GGUF model directly; prompt WAV file(s) into the model for feedback; try it out with:
gguf w
launch the page/container (gguf.us)
gguf us
Hashes for gguf_core-0.0.35-py2.py3-none-any.whl

| Algorithm | Hash digest |
|---|---|
| SHA256 | 345f586288f57831bef9b462321bb2932926276b3a304fe97e199514d2b5ff68 |
| MD5 | dfff17f2d4e77a35cf13337ef2ceb95a |
| BLAKE2b-256 | bb864411071b23ddeabcbbbbb89b1a53942e032a4b0ac108f9850376fa0202a0 |