Enable IPFS model loading for llama-cpp-python
Llama_IPFS
Load models directly from IPFS for llama-cpp-python.
Features
- 🌐 Direct integration with local IPFS nodes (preferred method)
- 🔄 Automatic fallback to IPFS gateways when local node isn't available
- 🔍 Simple URI format: ipfs://CID for easy model sharing
- ⚡ Zero configuration required - works automatically once installed
- 🧩 Compatible with any version of llama-cpp-python
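The ipfs://CID URI format is simple enough to parse with the standard library. A minimal sketch (the parse_ipfs_uri helper is hypothetical, not part of this package):

```python
from urllib.parse import urlparse

def parse_ipfs_uri(uri: str) -> str:
    """Extract the CID from an ipfs://CID URI (hypothetical helper)."""
    parsed = urlparse(uri)
    if parsed.scheme != "ipfs":
        raise ValueError(f"not an ipfs:// URI: {uri}")
    return parsed.netloc

# The CID from the usage example below
cid = parse_ipfs_uri("ipfs://bafybeie7quk74kmqg34nl2ewdwmsrlvvt6heayien364gtu2x6g2qpznhq")
print(cid)
```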
Installation
# Note: PyPI package names use hyphens
pip install llama-ipfs
llama-ipfs activate
Once installed and activated, the llama_ipfs integration will be loaded automatically whenever you use Python.
Usage
After installation, use llama-cpp-python with IPFS model URIs:
from llama_cpp import Llama
# Load a model directly from IPFS
model = Llama.from_pretrained(
    repo_id="ipfs://bafybeie7quk74kmqg34nl2ewdwmsrlvvt6heayien364gtu2x6g2qpznhq",
    filename="ggml-model-Q4_K_M.gguf"
)
# Use the model normally
response = model.create_completion(
    "Once upon a time",
    max_tokens=128
)
Google Colab Usage
In Google Colab, you need to manually apply the patch after importing:
# Import and manually apply patch
import llama_ipfs
llama_ipfs.activate()
# Verify patch is active
print(f"IPFS patch active: {llama_ipfs.status()}")
IPFS Node Connectivity
The llama_ipfs package prioritizes connectivity in the following order:
1. Local IPFS Node (Recommended): If you have an IPFS daemon running locally (ipfs daemon), the package automatically detects and uses it. This method:
   - Is much faster for repeated downloads
   - Loads complex model directories more reliably
   - Contributes to the IPFS network by providing content to others
2. IPFS Gateway (Fallback): If a local node isn't available, the package falls back to public gateways. This method:
   - Works without installing IPFS
   - May be less reliable for complex model directories
   - Is more prone to interrupted downloads
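The priority order above can be sketched as follows. This is a hypothetical illustration, not the package's actual code; it assumes Kubo's default RPC port 5001, the daemon's default local gateway port 8080, and ipfs.io as the public gateway:

```python
import urllib.request
import urllib.error

LOCAL_RPC = "http://127.0.0.1:5001/api/v0/version"  # default Kubo RPC endpoint
LOCAL_GATEWAY = "http://127.0.0.1:8080/ipfs/{cid}"  # daemon's local gateway
PUBLIC_GATEWAY = "https://ipfs.io/ipfs/{cid}"       # fallback public gateway

def local_node_available(timeout: float = 1.0) -> bool:
    """Return True if a local IPFS daemon answers on its RPC port."""
    try:
        # Kubo's RPC API only accepts POST requests
        req = urllib.request.Request(LOCAL_RPC, method="POST")
        urllib.request.urlopen(req, timeout=timeout)
        return True
    except (urllib.error.URLError, OSError):
        return False

def download_url(cid: str) -> str:
    """Pick a download URL for a CID: local node first, gateway as fallback."""
    template = LOCAL_GATEWAY if local_node_available() else PUBLIC_GATEWAY
    return template.format(cid=cid)

print(download_url("bafybeie7quk74kmqg34nl2ewdwmsrlvvt6heayien364gtu2x6g2qpznhq"))
```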
Command Line Interface
# Note: CLI commands use hyphens
# Activate the auto-loading
llama-ipfs activate
# Check if the integration is active
llama-ipfs status
# Test the integration
llama-ipfs test
# Deactivate the integration
llama-ipfs deactivate
Dependencies
- Python 3.8+
- llama-cpp-python
License
MIT License