
Python bindings for ik_llama.cpp — high-performance llama.cpp fork

Project description

ik-llama-cpp-python


Python bindings for ik_llama.cpp — a high-performance fork of llama.cpp with faster CPU inference, novel quantization types (Trellis / IQK quants), and AVX-VNNI / AVX-512 optimizations.

Designed as a drop-in replacement for llama-cpp-python.

Installation

Pre-built wheels (CPU, with AVX2)

pip install ik-llama-cpp-python

Pre-built wheels (CUDA)

pip install ik-llama-cpp-python-cuda
From source (requires CMake ≥ 3.21 and a C++20 compiler)

git clone https://github.com/gongpx20069/ik-llama-cpp-python
cd ik-llama-cpp-python
git submodule update --init --recursive
pip install -e .

From source with CUDA

CMAKE_ARGS="-DGGML_CUDA=ON" pip install -e .
From source with native CPU optimizations

For maximum performance on your specific CPU (AVX-512, AVX-VNNI, etc.):

CMAKE_ARGS="-DGGML_NATIVE=ON" pip install -e .
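Multiple CMake flags can be combined in a single CMAKE_ARGS string, for example a CUDA build that is also tuned for the host CPU (assuming the two options compose in this fork as they do in upstream ggml):

```shell
# CUDA offload plus native CPU tuning in one source build
CMAKE_ARGS="-DGGML_CUDA=ON -DGGML_NATIVE=ON" pip install -e .
```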

Quick Start

from ik_llama_cpp import IkLlama

llm = IkLlama("model.gguf", n_ctx=4096)

# Simple chat
text = llm.chat("What is the theory of relativity?")
print(text)

API

create_chat_completion — OpenAI-compatible

Returns a dict matching the llama_cpp.Llama.create_chat_completion schema.

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    temperature=0.3,
    max_tokens=256,
)
print(response["choices"][0]["message"]["content"])
print(response["usage"])
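Because the response follows the OpenAI chat-completion schema, the interesting fields can be unpacked with a small helper. A minimal sketch (the `extract_reply` name is ours, not part of the package API):

```python
def extract_reply(response: dict) -> tuple[str, int]:
    """Pull the assistant text and total token count out of an
    OpenAI-style chat-completion dict."""
    text = response["choices"][0]["message"]["content"]
    total_tokens = response["usage"]["total_tokens"]
    return text, total_tokens

# Works on any dict with the schema shown above:
fake = {
    "choices": [{"message": {"role": "assistant", "content": "Hi there!"}}],
    "usage": {"prompt_tokens": 9, "completion_tokens": 3, "total_tokens": 12},
}
print(extract_reply(fake))  # ('Hi there!', 12)
```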

chat — Convenience wrapper

text = llm.chat("Explain quantum mechanics in one sentence.")

generate — Low-level token generation

tokens = llm.tokenize("Hello, world!")
output_ids = llm.generate(tokens, max_tokens=128, temperature=0.7)
text = llm.detokenize(output_ids)
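The `temperature` argument controls how the next token is drawn from the model's logits. Conceptually it works like the generic sketch below (this illustrates temperature sampling in general, not this package's exact sampler, which also supports other strategies):

```python
import math
import random

def sample_token(logits, temperature=0.7, rng=random):
    """Temperature sampling: scale logits, softmax, then draw an index."""
    if temperature <= 0:  # temperature 0 degenerates to greedy argmax
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    r = rng.random()
    acc = 0.0
    for i, e in enumerate(exps):
        acc += e / total
        if r <= acc:
            return i
    return len(logits) - 1
```

Lower temperatures sharpen the distribution toward the most likely token; higher temperatures flatten it.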

Drop-in replacement for llama-cpp-python

# Change this:
# from llama_cpp import Llama
# To this:
from ik_llama_cpp import IkLlama as Llama

llm = Llama("model.gguf", n_ctx=4096, flash_attn=True)
response = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
)
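For code that must run whether ik-llama-cpp-python or the original llama-cpp-python is installed, the import can be made defensive. A hypothetical loader (the function name and the fallback order are our choice):

```python
import importlib

def load_llama_class(candidates=(("ik_llama_cpp", "IkLlama"),
                                 ("llama_cpp", "Llama"))):
    """Return the first backend class that can actually be imported."""
    for module_name, attr in candidates:
        try:
            return getattr(importlib.import_module(module_name), attr)
        except (ImportError, AttributeError):
            continue  # try the next backend
    raise ImportError(f"none of the backends are installed: {candidates}")

# Llama = load_llama_class()
# llm = Llama("model.gguf", n_ctx=4096, flash_attn=True)
```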

Quantization (IQ4_KT)

ik_llama.cpp provides novel Trellis quantization types (IQ1_KT–IQ4_KT) that are not available in upstream llama.cpp. This package includes llama-quantize and a Python API to create these quants from standard GGUF files.

Install with quantization support

pip install ik-llama-cpp-python[quantize]

CLI: Download from HuggingFace and quantize

# Download bf16 source + imatrix, quantize to IQ4_KT in one step
ik-llama-quantize from-hf bartowski/google_gemma-4-E2B-it-GGUF

# Specify a different quant type
ik-llama-quantize from-hf bartowski/google_gemma-4-E2B-it-GGUF --type IQ3_KT

# Custom output directory
ik-llama-quantize from-hf bartowski/google_gemma-4-E2B-it-GGUF --output-dir models/

CLI: Quantize a local file

# With imatrix (recommended for IQ quants)
ik-llama-quantize quantize model-bf16.gguf model-IQ4_KT.gguf IQ4_KT \
    --imatrix model-imatrix.gguf

# Without imatrix
ik-llama-quantize quantize model-bf16.gguf model-IQ4_KT.gguf IQ4_KT

# Shorthand (without subcommand)
ik-llama-quantize model-bf16.gguf model-IQ4_KT.gguf IQ4_KT
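Under the hood these commands invoke the bundled llama-quantize binary. The argument order below mirrors the CLI examples above; treat it as a sketch, since flag names can vary between revisions:

```python
def quantize_argv(src, dst, quant_type, imatrix_path=None,
                  binary="llama-quantize"):
    """Build a llama-quantize command line matching the examples above."""
    argv = [binary]
    if imatrix_path:
        argv += ["--imatrix", str(imatrix_path)]
    argv += [str(src), str(dst), quant_type]
    return argv

# Example (requires the bundled binary on PATH):
# import subprocess
# subprocess.run(quantize_argv("model-bf16.gguf", "model-IQ4_KT.gguf",
#                              "IQ4_KT", imatrix_path="model-imatrix.gguf"),
#                check=True)
```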

Python API

from ik_llama_cpp import quantize, quantize_from_hf

# One-step: download from HuggingFace and quantize
path = quantize_from_hf("bartowski/google_gemma-4-E2B-it-GGUF", quant_type="IQ4_KT")

# Or quantize a local file
path = quantize("model-bf16.gguf", "model-IQ4_KT.gguf", "IQ4_KT",
                imatrix_path="model-imatrix.gguf")
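A small helper can derive output filenames following the naming convention used in the examples (the helper and the suffix-stripping rule are our own convention, not part of the package API):

```python
from pathlib import Path

def quant_output_path(src, quant_type, output_dir=None):
    """Derive 'model-IQ4_KT.gguf' style output names from a source GGUF."""
    src = Path(src)
    stem = src.stem
    # Strip a trailing precision suffix like "-bf16" (an assumed convention)
    for suffix in ("-bf16", "-f16", "-fp16"):
        if stem.endswith(suffix):
            stem = stem[: -len(suffix)]
            break
    out_dir = Path(output_dir) if output_dir else src.parent
    return out_dir / f"{stem}-{quant_type}.gguf"
```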

Check if llama-quantize is available

ik-llama-quantize check

Constructor Parameters

| Parameter | Type | Default | Description |
|-----------|------|---------|-------------|
| model_path | str | required | Path to GGUF model file |
| n_ctx | int | 4096 | Context window size |
| n_threads | int | 0 | CPU threads (0 = auto) |
| use_mmap | bool | True | Memory-map model file |
| use_mlock | bool | False | Lock model in RAM |
| flash_attn | bool | True | Enable flash attention |
| n_gpu_layers | int | 0 | Number of layers to offload to GPU |
| verbose | bool | True | Logging verbosity |
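The defaults above can be captured in a plain dict so call sites only spell out their overrides. A convenience sketch (names mirror the table; `llama_kwargs` itself is hypothetical):

```python
# Defaults copied from the parameter table (model_path has no default).
IK_LLAMA_DEFAULTS = {
    "n_ctx": 4096,
    "n_threads": 0,      # 0 = auto-detect
    "use_mmap": True,
    "use_mlock": False,
    "flash_attn": True,
    "n_gpu_layers": 0,
    "verbose": True,
}

def llama_kwargs(**overrides):
    """Merge overrides into the documented defaults, rejecting typos."""
    unknown = set(overrides) - set(IK_LLAMA_DEFAULTS)
    if unknown:
        raise TypeError(f"unknown parameters: {sorted(unknown)}")
    return {**IK_LLAMA_DEFAULTS, **overrides}

# llm = IkLlama("model.gguf", **llama_kwargs(n_ctx=8192, n_gpu_layers=32))
```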

Supported Platforms

| Platform | Wheels | Notes |
|----------|--------|-------|
| Linux x86_64 | CPU (AVX2), CUDA 12.4 | Python 3.10–3.13 |
| Linux aarch64 | CPU | Python 3.10–3.13 |
| macOS arm64 | CPU + Metal | Python 3.10–3.13 |
| Windows x86_64 | CPU (AVX2) | Python 3.10–3.13 |

Environment Variables

| Variable | Description |
|----------|-------------|
| IK_LLAMA_CPP_LIB_PATH | Override path to the compiled shared library |
| CMAKE_ARGS | Extra CMake flags for source builds |

Why ik_llama.cpp?

ik_llama.cpp is a llama.cpp fork focused on performance and quantization research. Key advantages:

  • Faster CPU inference — improved prompt processing across all quantization types, better Flash Attention token generation
  • Novel quantization types — Trellis quants (IQ1_KT–IQ4_KT), IQK quants (IQ2_K–IQ6_K), row-interleaved R4 variants, MXFP4
  • Better KV cache — Q8_KV / Q4_0 KV-cache quantization with Hadamard transforms
  • DeepSeek optimizations — FlashMLA (v1–v3), fused MoE operations, Smart Expert Reduction
  • Hardware support — optimized kernels for AVX2, AVX-512, AVX-VNNI, ARM NEON, CUDA (Turing+)
  • Broad model support — LLaMA-3/4, Qwen3, DeepSeek-V3, Gemma3/4, Mistral, and many more

Architecture

ik_llama_cpp/
  __init__.py        # Public API: IkLlama, quantize, quantize_from_hf
  _lib_loader.py     # Finds and loads the shared library (.dll/.so/.dylib)
  _ctypes_api.py     # Low-level ctypes bindings to llama.h C API
  _internals.py      # RAII wrappers: IkModel, IkContext
  llama.py           # High-level IkLlama class
  quantize.py        # Quantization CLI and API (wraps llama-quantize)
  lib/               # Compiled shared libraries (installed by CMake)
  bin/               # llama-quantize binary (installed by CMake)
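_lib_loader.py resolves the shared library at import time, honoring the IK_LLAMA_CPP_LIB_PATH override (see Environment Variables). A rough approximation of that resolution logic (an assumed sketch, not the package's actual code):

```python
import os
import sys
from pathlib import Path

def find_shared_library(package_dir):
    """Resolve the compiled shared library for the current platform."""
    override = os.environ.get("IK_LLAMA_CPP_LIB_PATH")
    if override:
        return Path(override)  # explicit user override wins
    ext = {"win32": ".dll", "darwin": ".dylib"}.get(sys.platform, ".so")
    candidates = sorted(Path(package_dir, "lib").glob(f"*{ext}"))
    if not candidates:
        raise FileNotFoundError(f"no *{ext} library under {package_dir}/lib")
    return candidates[0]

# ctypes.CDLL(str(find_shared_library(...))) would then load it for
# the _ctypes_api bindings.
```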

Contributing

Contributions are welcome! Feel free to open an issue or submit a pull request on GitHub. Whether it's bug reports, feature requests, documentation improvements, or code contributions — all are appreciated.

License

MIT

Project details


Download files

Download the file for your platform.

Source Distribution

ik_llama_cpp_python-0.1.2.tar.gz (42.2 MB)

Built Distributions

| File | Size | Python | Platform |
|------|------|--------|----------|
| ik_llama_cpp_python-0.1.2-cp313-cp313-win_amd64.whl | 10.0 MB | 3.13 | Windows x86-64 |
| ik_llama_cpp_python-0.1.2-cp313-cp313-manylinux_2_28_x86_64.whl | 25.0 MB | 3.13 | manylinux (glibc 2.28+) x86-64 |
| ik_llama_cpp_python-0.1.2-cp313-cp313-manylinux_2_28_aarch64.whl | 23.4 MB | 3.13 | manylinux (glibc 2.28+) ARM64 |
| ik_llama_cpp_python-0.1.2-cp313-cp313-macosx_11_0_arm64.whl | 28.0 MB | 3.13 | macOS 11.0+ ARM64 |
| ik_llama_cpp_python-0.1.2-cp312-cp312-win_amd64.whl | 10.0 MB | 3.12 | Windows x86-64 |
| ik_llama_cpp_python-0.1.2-cp312-cp312-manylinux_2_28_x86_64.whl | 25.0 MB | 3.12 | manylinux (glibc 2.28+) x86-64 |
| ik_llama_cpp_python-0.1.2-cp312-cp312-manylinux_2_28_aarch64.whl | 23.4 MB | 3.12 | manylinux (glibc 2.28+) ARM64 |
| ik_llama_cpp_python-0.1.2-cp312-cp312-macosx_11_0_arm64.whl | 28.0 MB | 3.12 | macOS 11.0+ ARM64 |
| ik_llama_cpp_python-0.1.2-cp311-cp311-win_amd64.whl | 10.0 MB | 3.11 | Windows x86-64 |
| ik_llama_cpp_python-0.1.2-cp311-cp311-manylinux_2_28_x86_64.whl | 25.0 MB | 3.11 | manylinux (glibc 2.28+) x86-64 |
| ik_llama_cpp_python-0.1.2-cp311-cp311-manylinux_2_28_aarch64.whl | 23.4 MB | 3.11 | manylinux (glibc 2.28+) ARM64 |
| ik_llama_cpp_python-0.1.2-cp311-cp311-macosx_11_0_arm64.whl | 28.0 MB | 3.11 | macOS 11.0+ ARM64 |
| ik_llama_cpp_python-0.1.2-cp310-cp310-win_amd64.whl | 10.0 MB | 3.10 | Windows x86-64 |
| ik_llama_cpp_python-0.1.2-cp310-cp310-manylinux_2_28_x86_64.whl | 25.0 MB | 3.10 | manylinux (glibc 2.28+) x86-64 |
| ik_llama_cpp_python-0.1.2-cp310-cp310-manylinux_2_28_aarch64.whl | 23.4 MB | 3.10 | manylinux (glibc 2.28+) ARM64 |
| ik_llama_cpp_python-0.1.2-cp310-cp310-macosx_11_0_arm64.whl | 28.0 MB | 3.10 | macOS 11.0+ ARM64 |

File details

Details for the file ik_llama_cpp_python-0.1.2.tar.gz.

File metadata

  • Download URL: ik_llama_cpp_python-0.1.2.tar.gz
  • Size: 42.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

| Algorithm | Hash digest |
|-----------|-------------|
| SHA256 | 8c4b21d3aad42103f33de1f4739663e76628160df9534b627da3b50601734a72 |
| MD5 | d61d8d104798b05f7d9b6484d0961099 |
| BLAKE2b-256 | 13921cbd80c4cf0c0a140393b856eaaf9e75797c029a9d56fe3bc55fc65a971b |

Provenance

The following attestation bundles were made for ik_llama_cpp_python-0.1.2.tar.gz:

Publisher: publish.yml on gongpx20069/ik-llama-cpp-python

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

