
Pybind11 bindings for whisper.cpp

Quickstart

Install with pip:

pip install whispercpp

To use the latest version, install from source:

pip install git+https://github.com/aarnphm/whispercpp.git

For local setup, initialize all submodules:

git submodule update --init --recursive

Build the wheel:

# Option 1: using pypa/build
python3 -m build -w

# Option 2: using bazel
./tools/bazel build //:whispercpp_wheel

Install the wheel:

# Option 1: via pypa/build
pip install dist/*.whl

# Option 2: using bazel
pip install $(./tools/bazel info bazel-bin)/*.whl

The bindings provide a Whisper class:

from whispercpp import Whisper

w = Whisper.from_pretrained("tiny.en")

Currently, the inference API is provided via transcribe:

import numpy as np

w.transcribe(np.ones((1, 16000)))

You can use ffmpeg or librosa to load an audio file into a Numpy array and then pass it to transcribe:

import ffmpeg
import numpy as np

sample_rate = 16000  # whisper.cpp models expect 16 kHz mono audio

try:
    y, _ = (
        ffmpeg.input("/path/to/audio.wav", threads=0)
        .output("-", format="s16le", acodec="pcm_s16le", ac=1, ar=sample_rate)
        .run(
            cmd=["ffmpeg", "-nostdin"], capture_stdout=True, capture_stderr=True
        )
    )
except ffmpeg.Error as e:
    raise RuntimeError(f"Failed to load audio: {e.stderr.decode()}") from e

# Convert the raw s16le bytes into a float32 array in [-1.0, 1.0].
arr = np.frombuffer(y, np.int16).flatten().astype(np.float32) / 32768.0

w.transcribe(arr)
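Alternatively, librosa can do the resampling and float conversion in one call. The snippet below is a minimal sketch equivalent to the ffmpeg example, assuming librosa is installed; it targets the same 16 kHz mono float32 format that transcribe expects.

import librosa
import numpy as np

# librosa.load resamples to the requested rate and returns mono float32
# samples in [-1.0, 1.0], so no manual int16 scaling is needed.
arr, _ = librosa.load("/path/to/audio.wav", sr=16000, mono=True)

w.transcribe(arr)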

The Pybind11 bindings support all of the features of whisper.cpp.

The bindings can also be used via the api module:

from whispercpp import api

ctx = api.Context.from_file("/path/to/saved_weight.bin")
params = api.Params()

ctx.full(arr, params)

Development

See DEVELOPMENT.md

APIs

Whisper

  1. Whisper.from_pretrained(model_name: str) -> Whisper

    Load a pre-trained model from the local cache, or download and cache it if needed.

    w = Whisper.from_pretrained("tiny.en")

The model will be saved to $XDG_DATA_HOME/whispercpp or ~/.local/share/whispercpp if the environment variable is not set.
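For reference, the snippet below shows how that directory resolves; it is only an illustration of the XDG convention described above, and the file layout inside the directory is managed by whispercpp itself.

import os
from pathlib import Path

# Use $XDG_DATA_HOME when set, otherwise fall back to ~/.local/share,
# as described above.
data_home = os.environ.get("XDG_DATA_HOME") or os.path.expanduser("~/.local/share")
cache_dir = Path(data_home) / "whispercpp"
print(cache_dir)  # e.g. /home/user/.local/share/whispercpp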

  2. Whisper.transcribe(arr: NDArray[np.float32], num_proc: int = 1)

    Runs transcription on a given Numpy array. This calls full from whisper.cpp. If num_proc is greater than 1, it will use full_parallel instead (see the sketch after the example below).

    w.transcribe(np.ones((1, 16000)))
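    For instance, the call below opts into the parallel path; the value 2 for num_proc is only an illustrative choice.

    w.transcribe(np.ones((1, 16000)), num_proc=2)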

api

api is a direct binding to whisper.cpp, with an API similar to whisper-rs.

  1. api.Context

    This class is a wrapper around whisper_context.

    from whispercpp import api
    
    ctx = api.Context.from_file("/path/to/saved_weight.bin")
  2. api.Params

    This class is a wrapper around whisper_params.

    from whispercpp import api
    
    params = api.Params()
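Putting the two together gives a rough end-to-end sketch like the one below. The parameter fields set here (print_realtime, n_threads) are assumptions based on whisper.cpp's whisper_full_params; check the api module for the attributes the wrapper actually exposes.

from whispercpp import api

ctx = api.Context.from_file("/path/to/saved_weight.bin")
params = api.Params()

# Assumed attribute names, mirroring whisper.cpp's whisper_full_params.
params.print_realtime = False
params.n_threads = 4

# arr is a 16 kHz mono float32 array, e.g. from the ffmpeg or librosa snippets above.
ctx.full(arr, params)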

Why not?

  • whispercpp.py. There are a few key differences here:

    • It provides Cython bindings. From a UX standpoint, this achieves the same goal as whispercpp; the difference is that whispercpp uses Pybind11 instead. Feel free to use it if you prefer Cython over Pybind11. Note that whispercpp.py and whispercpp are mutually exclusive, as both use the whispercpp namespace.

    • whispercpp doesn’t pollute your $HOME directory; instead, it follows the XDG Base Directory Specification for saved weights.

  • Why not just use cdll and ctypes and be done with it?

    • This is also valid, but it requires a lot of hacking and is pretty slow compared to Cython and Pybind11.

Download files

Release 0.0.17 ships a source distribution (whispercpp-0.0.17.tar.gz) and prebuilt wheels for CPython 3.8, 3.9, 3.10, and 3.11 on manylinux2014 x86-64 and macOS 10.9+ x86-64.
