SoTA Transformers with C-backend for fast inference on your CPU.
Project description
We identify three pillars that enable fast inference of SoTA AI models on your CPU:
- Fast C/C++ LLM inference kernels for CPU.
- Machine learning research and exploration: compression through quantization and sparsification, training on more data, and collecting data to train instruction and chat models.
- An easy-to-use API for fast AI inference from dynamically typed languages like Python.
This project addresses the third pillar, building on LLaMa.cpp and GGML.
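Below is a minimal sketch of what calling such an API from Python could look like. The import path, the `AutoInference` class, the model identifier, and the `generate()` keyword argument are illustrative assumptions rather than a documented interface; consult the project README for the actual entry points.

```python
# Usage sketch only -- the names below (AutoInference, generate,
# num_tokens_to_generate) are assumptions for illustration, not a
# guaranteed cformers interface.
from cformers import AutoInference as AI  # assumed import path

# Load a quantized GGML model that runs on the CPU via the C backend.
ai = AI("EleutherAI/gpt-j-6B")  # example Hugging Face model id

# Generate a completion for a prompt; the output layout is assumed.
output = ai.generate("def parse_html(html_doc):", num_tokens_to_generate=100)
print(output["token_str"])
```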
Project details
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
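If you simply want the latest release, you can install the package directly from PyPI with `pip install cformers` (or `pip install cformers==0.0.4` to pin this version) instead of downloading a distribution file by hand.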
Source Distribution
- cformers-0.0.4.tar.gz (1.0 MB, details below)
Built Distribution
- cformers-0.0.4-py3-none-any.whl (1.0 MB, details below)
File details
Details for the file cformers-0.0.4.tar.gz.
File metadata
- Download URL: cformers-0.0.4.tar.gz
- Upload date:
- Size: 1.0 MB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | ac93556304e4966ed08f0b64fa6a0a01a878eefe9284dd11e47bf16694dc4ba6
MD5 | b4bea6f5cd76f9d8f12a17338b6ce39b
BLAKE2b-256 | 9c277505402180b2bd111ce2a3835cd9f2cc9466352c9cfaf5e0d6bd2279a9b6
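To check that a download matches the published digest, you can hash the file locally and compare it with the SHA256 value listed above. The sketch below assumes the archive was saved in the current directory.

```python
# Verify a downloaded cformers-0.0.4.tar.gz against the published SHA256
# digest. The local file path is an assumption.
import hashlib

EXPECTED_SHA256 = "ac93556304e4966ed08f0b64fa6a0a01a878eefe9284dd11e47bf16694dc4ba6"

def sha256_of(path: str) -> str:
    """Hash the file in chunks so large archives need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

actual = sha256_of("cformers-0.0.4.tar.gz")
assert actual == EXPECTED_SHA256, f"hash mismatch: {actual}"
print("SHA256 verified")
```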
File details
Details for the file cformers-0.0.4-py3-none-any.whl.
File metadata
- Download URL: cformers-0.0.4-py3-none-any.whl
- Upload date:
- Size: 1.0 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.10.9
File hashes
Algorithm | Hash digest
---|---
SHA256 | f7c769d6a6b39fadcdc1833f5ca89ba005fd999165ab8ad3a8f61a17726902b9
MD5 | ffc7030acdd3afa45a2eac9f68dc3d5f
BLAKE2b-256 | eee6f9ede7542994a9b94d58a3b0cf106c7ccef788acfe1292b5607ca8e08cfb