
gguf connector core built on llama.cpp

Project description

llama-core


This is also a standalone llama connector, able to work independently.

install via (pip/pip3):

pip install llama-core

run it by (python/python3):

python -m llama_core

Running it prompts the user interface selection menu; once an option is chosen, GGUF file(s) in the current directory are searched for and detected (if any), as below.
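The search-and-detect step can be sketched with the standard library alone (a hypothetical stand-in for illustration; llama_core's actual implementation may differ):

```python
from pathlib import Path

def find_gguf_files(directory: str = ".") -> list[str]:
    """Return the names of GGUF files in the given directory, sorted by name."""
    return sorted(p.name for p in Path(directory).glob("*.gguf"))
```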

include interface selector to your code by adding:

from llama_core import menu

include gguf reader to your code by adding:

from llama_core import reader

include gguf writer to your code by adding:

from llama_core import writer
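The reader/writer call signatures are not documented above, so as a spec-level illustration only (not llama_core's actual reader): per the GGUF format, every file begins with the 4-byte magic "GGUF" followed by a little-endian uint32 version, which is enough to recognize one:

```python
import struct

GGUF_MAGIC = b"GGUF"  # every valid GGUF file begins with these 4 bytes

def gguf_header(path: str):
    """Return (is_gguf, version); version is None if the file is not GGUF."""
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != GGUF_MAGIC:
            return False, None
        version = struct.unpack("<I", f.read(4))[0]  # little-endian uint32
        return True, version
```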

remark(s)

Other functions are the same as in llama-cpp-python. For CUDA (Nvidia GPU) and Metal (Apple M1/M2) support, specify CMAKE_ARGS following Abetlen's repo below. If you want to install from a source file (under releases), opt for the .tar.gz archive (then build a machine-customized installable package) with the appropriate cmake tag(s), rather than the .whl (wheel; a pre-built binary package).
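For reference, the GPU build flags below follow llama-cpp-python's CMAKE_ARGS convention from Abetlen's repo; this assumes they carry over unchanged to llama-core's source build (depending on the bundled llama.cpp version, the older -DLLAMA_CUBLAS=on / -DLLAMA_METAL=on spellings may apply instead):

```shell
# CUDA (Nvidia GPU)
CMAKE_ARGS="-DGGML_CUDA=on" pip install llama_core-(version).tar.gz

# Metal (Apple Silicon)
CMAKE_ARGS="-DGGML_METAL=on" pip install llama_core-(version).tar.gz
```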

references

repo: llama-cpp-python
repo: llama.cpp
page: gguf.us

build from llama_core-(version).tar.gz (examples below are for CPU)

According to the latest note inside VS Code, msys64 is recommended by Microsoft; alternatively, you can opt for w64devkit or similar as the source/location of your gcc and g++ compilers.

for windows user(s):

$env:CMAKE_GENERATOR = "MinGW Makefiles"
$env:CMAKE_ARGS = "-DCMAKE_C_COMPILER=C:/msys64/mingw64/bin/gcc.exe -DCMAKE_CXX_COMPILER=C:/msys64/mingw64/bin/g++.exe"
pip install llama_core-(version).tar.gz

On Mac, the Xcode command line tools are recommended by Apple for handling all coding-related issue(s); you can bypass them according to your own preference.

for mac user(s):

pip3 install llama_core-(version).tar.gz

Make sure your gcc and g++ are >= 11; you can check with gcc --version and g++ --version. Other requirements include cmake >= 3.21, etc. However, if you opt to install from the pre-built wheel (.whl) file, you don't need to worry about any of that.

Download files

Download the file for your platform.

Source Distribution

llama_core-0.3.5.tar.gz (64.0 MB, Source)

Built Distributions

llama_core-0.3.5-cp312-cp312-macosx_14_0_arm64.whl (3.5 MB, CPython 3.12, macOS 14.0+ ARM64)

llama_core-0.3.5-cp312-cp312-macosx_11_0_x86_64.whl (3.9 MB, CPython 3.12, macOS 11.0+ x86-64)

File details

llama_core-0.3.5.tar.gz

  • Size: 64.0 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.12.1

Hashes:
SHA256 359d3bf852d15d5411966f98cedbc1e7fa702d9343e50c712ff422ff6fcc9b9f
MD5 0546a4d657d3670366eac1cde3fe6823
BLAKE2b-256 c1a4182d8c081ea04913dcade7caf01c59dce03335ca7fa5a7fd26528c493924

llama_core-0.3.5-cp312-cp312-macosx_14_0_arm64.whl

Hashes:
SHA256 03570babdf34fff5a6cdc11b6d5a77177b647d66e8457f0170c1e398f46a43a7
MD5 d7cc314f29baca9bd1b90dd681827b72
BLAKE2b-256 4d35a484efd5b319bc444218292705527d2b8660938303b82c78052060037026

llama_core-0.3.5-cp312-cp312-macosx_11_0_x86_64.whl

Hashes:
SHA256 48a325e48bb821c55ee22e1d84dd154a962bc06b147403ae4d8454b691b8509e
MD5 ea3cbdcb306fdd1f2da1e837c96de183
BLAKE2b-256 75ec757836db8293623e50e84c4ebb9869ba72d5bc35e5b77c052912ffa944a2
