Python bindings for ik_llama.cpp — high-performance llama.cpp fork

ik-llama-cpp-python

Python bindings for ik_llama.cpp — a high-performance fork of llama.cpp with faster CPU inference, novel quantization types (Trellis / IQK quants), and AVX-VNNI / AVX-512 optimizations.

Designed as a drop-in replacement for llama-cpp-python.

Installation

Pre-built wheels (CPU, with AVX2)

pip install ik-llama-cpp-python

Pre-built wheels (CUDA)

pip install ik-llama-cpp-python-cuda
From source (requires CMake ≥ 3.21 and a C++20 compiler)

git clone https://github.com/gongpx20069/ik-llama-cpp-python
cd ik-llama-cpp-python
git submodule update --init --recursive
pip install -e .

From source with CUDA

CMAKE_ARGS="-DGGML_CUDA=ON" pip install -e .
From source with native CPU optimizations

For maximum performance on your specific CPU (AVX-512, AVX-VNNI, etc.):

CMAKE_ARGS="-DGGML_NATIVE=ON" pip install -e .

Quick Start

from ik_llama_cpp import IkLlama

llm = IkLlama("model.gguf", n_ctx=4096)

# Simple chat
text = llm.chat("What is the theory of relativity?")
print(text)

API

create_chat_completion — OpenAI-compatible

Returns a dict matching the llama_cpp.Llama.create_chat_completion schema.

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    temperature=0.3,
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
print(response["usage"])

chat — Convenience wrapper

text = llm.chat("Explain quantum mechanics in one sentence.")

generate — Low-level token generation

tokens = llm.tokenize("Hello, world!")
output_ids = llm.generate(tokens, max_tokens=128, temperature=0.7)
text = llm.detokenize(output_ids)

Drop-in replacement for llama-cpp-python

# Change this:
# from llama_cpp import Llama
# To this:
from ik_llama_cpp import IkLlama as Llama

llm = Llama("model.gguf", n_ctx=4096, flash_attn=True)
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
)

Quantization (IQ4_KT)

ik_llama.cpp provides novel Trellis quantization types (IQ1_KT–IQ4_KT) that are not available in upstream llama.cpp. This package includes llama-quantize and a Python API for creating these quants from standard GGUF files.

Install with quantization support

pip install ik-llama-cpp-python[quantize]

CLI: Download from HuggingFace and quantize

# Download bf16 source + imatrix, quantize to IQ4_KT in one step
ik-llama-quantize from-hf bartowski/google_gemma-4-E2B-it-GGUF

# Specify a different quant type
ik-llama-quantize from-hf bartowski/google_gemma-4-E2B-it-GGUF --type IQ3_KT

# Custom output directory
ik-llama-quantize from-hf bartowski/google_gemma-4-E2B-it-GGUF --output-dir models/

CLI: Quantize a local file

# With imatrix (recommended for IQ quants)
ik-llama-quantize quantize model-bf16.gguf model-IQ4_KT.gguf IQ4_KT \
    --imatrix model-imatrix.gguf

# Without imatrix
ik-llama-quantize quantize model-bf16.gguf model-IQ4_KT.gguf IQ4_KT

# Shorthand (without subcommand)
ik-llama-quantize model-bf16.gguf model-IQ4_KT.gguf IQ4_KT

Python API

from ik_llama_cpp import quantize, quantize_from_hf

# One-step: download from HuggingFace and quantize
path = quantize_from_hf("bartowski/google_gemma-4-E2B-it-GGUF", quant_type="IQ4_KT")

# Or quantize a local file
path = quantize("model-bf16.gguf", "model-IQ4_KT.gguf", "IQ4_KT",
                imatrix_path="model-imatrix.gguf")

Check if llama-quantize is available

ik-llama-quantize check

Constructor Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| `model_path` | `str` | required | Path to GGUF model file |
| `n_ctx` | `int` | `4096` | Context window size |
| `n_threads` | `int` | `0` | CPU threads (0 = auto) |
| `use_mmap` | `bool` | `True` | Memory-map model file |
| `use_mlock` | `bool` | `False` | Lock model in RAM |
| `flash_attn` | `bool` | `True` | Enable flash attention |
| `n_gpu_layers` | `int` | `0` | Number of layers to offload to GPU |
| `verbose` | `bool` | `True` | Logging verbosity |
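As an illustration of the `n_threads = 0` (auto) convention in the table above, resolution typically works like the following sketch. The helper `resolve_threads` is hypothetical, not part of this package's API:

```python
import os


def resolve_threads(n_threads: int) -> int:
    """Illustrative: 0 (or any non-positive value) means 'use all cores'."""
    if n_threads <= 0:
        # os.cpu_count() can return None on exotic platforms; fall back to 1.
        return os.cpu_count() or 1
    return n_threads
```

With this convention, passing `n_threads=0` to the constructor saturates the machine, while an explicit positive value pins the thread count.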

Supported Platforms

| Platform | Wheels | Notes |
|---|---|---|
| Linux x86_64 | CPU (AVX2), CUDA 12.4 | Python 3.10–3.13 |
| Linux aarch64 | CPU | Python 3.10–3.13 |
| macOS arm64 | CPU + Metal | Python 3.10–3.13 |
| Windows x86_64 | CPU (AVX2) | Python 3.10–3.13 |

Environment Variables

| Variable | Description |
|---|---|
| `IK_LLAMA_CPP_LIB_PATH` | Override path to the compiled shared library |
| `CMAKE_ARGS` | Extra CMake flags for source builds |
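For example, a source build that uses a custom prebuilt library and extra CMake flags might look like this (the library path shown is illustrative):

```shell
# Point the loader at a custom build of the shared library
export IK_LLAMA_CPP_LIB_PATH=/opt/ik_llama/libllama.so

# Pass extra CMake flags through to a source build
CMAKE_ARGS="-DGGML_CUDA=ON -DGGML_NATIVE=ON" pip install -e .
```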

Why ik_llama.cpp?

ik_llama.cpp is a llama.cpp fork focused on performance and quantization research. Key advantages:

  • Faster CPU inference — improved prompt processing across all quantization types, better Flash Attention token generation
  • Novel quantization types — Trellis quants (IQ1_KT–IQ4_KT), IQK quants (IQ2_K–IQ6_K), row-interleaved R4 variants, MXFP4
  • Better KV cache — Q8_KV / Q4_0 KV-cache quantization with Hadamard transforms
  • DeepSeek optimizations — FlashMLA (v1–v3), fused MoE operations, Smart Expert Reduction
  • Hardware support — optimized kernels for AVX2, AVX-512, AVX-VNNI, ARM NEON, CUDA (Turing+)
  • Broad model support — LLaMA-3/4, Qwen3, DeepSeek-V3, Gemma3/4, Mistral, and many more

Architecture

ik_llama_cpp/
  __init__.py        # Public API: IkLlama, quantize, quantize_from_hf
  _lib_loader.py     # Finds and loads the shared library (.dll/.so/.dylib)
  _ctypes_api.py     # Low-level ctypes bindings to llama.h C API
  _internals.py      # RAII wrappers: IkModel, IkContext
  llama.py           # High-level IkLlama class
  quantize.py        # Quantization CLI and API (wraps llama-quantize)
  lib/               # Compiled shared libraries (installed by CMake)
  bin/               # llama-quantize binary (installed by CMake)
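To make the role of `_lib_loader.py` concrete, a minimal loader might resolve the shared library as sketched below. This is an assumption about the search order (env override first, then the bundled `lib/` directory), not the package's actual implementation:

```python
import os
import sys
from pathlib import Path


def find_shared_library(package_dir: Path) -> Path:
    """Illustrative resolution: env override, then bundled lib/ directory."""
    # IK_LLAMA_CPP_LIB_PATH, when set, wins outright (see Environment Variables).
    override = os.environ.get("IK_LLAMA_CPP_LIB_PATH")
    if override:
        return Path(override)
    # Otherwise pick the platform-appropriate library from lib/.
    suffix = {"win32": ".dll", "darwin": ".dylib"}.get(sys.platform, ".so")
    candidates = sorted((package_dir / "lib").glob(f"*{suffix}"))
    if candidates:
        return candidates[0]
    raise FileNotFoundError(f"no *{suffix} found under {package_dir / 'lib'}")
```

The resolved path would then be handed to `ctypes.CDLL` by the low-level bindings in `_ctypes_api.py`.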

Contributing

Contributions are welcome! Feel free to open an issue or submit a pull request on GitHub. Whether it's bug reports, feature requests, documentation improvements, or code contributions — all are appreciated.

License

MIT

Download files

Download the file for your platform.

Source Distribution

ik_llama_cpp_python-0.1.0.tar.gz (42.2 MB)

Built Distributions


| Wheel | Size | Python | Platform |
|---|---|---|---|
| ik_llama_cpp_python-0.1.0-cp313-cp313-win_amd64.whl | 10.0 MB | CPython 3.13 | Windows x86-64 |
| ik_llama_cpp_python-0.1.0-cp313-cp313-manylinux_2_28_x86_64.whl | 25.0 MB | CPython 3.13 | manylinux: glibc 2.28+ x86-64 |
| ik_llama_cpp_python-0.1.0-cp313-cp313-manylinux_2_28_aarch64.whl | 23.4 MB | CPython 3.13 | manylinux: glibc 2.28+ ARM64 |
| ik_llama_cpp_python-0.1.0-cp313-cp313-macosx_11_0_arm64.whl | 28.0 MB | CPython 3.13 | macOS 11.0+ ARM64 |
| ik_llama_cpp_python-0.1.0-cp312-cp312-win_amd64.whl | 10.0 MB | CPython 3.12 | Windows x86-64 |
| ik_llama_cpp_python-0.1.0-cp312-cp312-manylinux_2_28_x86_64.whl | 25.0 MB | CPython 3.12 | manylinux: glibc 2.28+ x86-64 |
| ik_llama_cpp_python-0.1.0-cp312-cp312-manylinux_2_28_aarch64.whl | 23.4 MB | CPython 3.12 | manylinux: glibc 2.28+ ARM64 |
| ik_llama_cpp_python-0.1.0-cp312-cp312-macosx_11_0_arm64.whl | 28.0 MB | CPython 3.12 | macOS 11.0+ ARM64 |
| ik_llama_cpp_python-0.1.0-cp311-cp311-win_amd64.whl | 10.0 MB | CPython 3.11 | Windows x86-64 |
| ik_llama_cpp_python-0.1.0-cp311-cp311-manylinux_2_28_x86_64.whl | 25.0 MB | CPython 3.11 | manylinux: glibc 2.28+ x86-64 |
| ik_llama_cpp_python-0.1.0-cp311-cp311-manylinux_2_28_aarch64.whl | 23.4 MB | CPython 3.11 | manylinux: glibc 2.28+ ARM64 |
| ik_llama_cpp_python-0.1.0-cp311-cp311-macosx_11_0_arm64.whl | 28.0 MB | CPython 3.11 | macOS 11.0+ ARM64 |
| ik_llama_cpp_python-0.1.0-cp310-cp310-win_amd64.whl | 10.0 MB | CPython 3.10 | Windows x86-64 |
| ik_llama_cpp_python-0.1.0-cp310-cp310-manylinux_2_28_x86_64.whl | 25.0 MB | CPython 3.10 | manylinux: glibc 2.28+ x86-64 |
| ik_llama_cpp_python-0.1.0-cp310-cp310-manylinux_2_28_aarch64.whl | 23.4 MB | CPython 3.10 | manylinux: glibc 2.28+ ARM64 |
| ik_llama_cpp_python-0.1.0-cp310-cp310-macosx_11_0_arm64.whl | 28.0 MB | CPython 3.10 | macOS 11.0+ ARM64 |

File details

Details for the file ik_llama_cpp_python-0.1.0.tar.gz.

File metadata

  • Size: 42.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for ik_llama_cpp_python-0.1.0.tar.gz
Algorithm Hash digest
SHA256 7220452c74882ec8e5c9b6eaae748d80835782dd8699d97d7fa6e8b97f7610d5
MD5 fba696aa48937f17f38f3202c6c9d26a
BLAKE2b-256 97478dfc32f012e5ea1784003c63761bf1eea6faff168121c6d7ed15bde4a6fb


Provenance

The following attestation bundles were made for ik_llama_cpp_python-0.1.0.tar.gz:

Publisher: publish.yml on gongpx20069/ik-llama-cpp-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

