
Server for Mozilla DeepSpeech

Project description


Key Features

This is an HTTP server that can be used to test the Mozilla DeepSpeech project. You need an environment with DeepSpeech installed, plus a trained model, to run this server.

This code uses the DeepSpeech 0.7 APIs.

Installation

You first need to install DeepSpeech. Depending on your system, you can use the CPU package:

pip3 install deepspeech

Or the GPU package:

pip3 install deepspeech-gpu

Then you can install the deepspeech server:

python3 setup.py install

The server is also available on pypi, so you can install it with pip:

pip3 install deepspeech-server

Note that Python 3.5 is the minimum version required to run the server.

Starting the server

deepspeech-server --config config.json

You can use deepspeech without training a model yourself. Pre-trained models are provided by Mozilla in the release page of the project (See the assets section of the release note):

https://github.com/mozilla/DeepSpeech/releases

Once you have downloaded a pre-trained model, you can untar it and directly use the sample configuration file:

cp config.sample.json config.json
deepspeech-server --config config.json

Server configuration

The configuration is done with a JSON file, provided with the “--config” argument. Its structure is the following:

{
  "deepspeech": {
    "model" :"deepspeech-0.7.1-models.pbmm",
    "scorer" :"deepspeech-0.7.1-models.scorer",
    "beam_width": 500,
    "lm_alpha": 0.931289039105002,
    "lm_beta": 1.1834137581510284
  },
  "server": {
    "http": {
      "host": "0.0.0.0",
      "port": 8080,
      "request_max_size": 1048576
    }
  },
  "log": {
    "level": [
      { "logger": "deepspeech_server", "level": "DEBUG"}
    ]
  }
}

The configuration file contains several sections and sub-sections.

deepspeech section configuration

Section “deepspeech” contains configuration of the deepspeech engine:

model: The model generated by DeepSpeech. Can be a protobuf file or a memory-mapped protobuf.

scorer: [Optional] The scorer file. A scorer is necessary to set lm_alpha or lm_beta manually.

beam_width: [Optional] The size of the beam search.

lm_alpha and lm_beta: [Optional] The hyperparameters of the scorer.

Section “server” contains configuration of the access part, with one subsection per protocol:

http section configuration

request_max_size (default value: 1048576, i.e. 1 MiB) is the maximum payload size allowed by the server. A request whose payload exceeds this threshold is rejected with a “413: Request Entity Too Large” error.

host (default value: “0.0.0.0”) is the listening address of the HTTP server.

port (default value: 8080) is the listening port of the HTTP server.

log section configuration

The log section can be used to set the log levels of the server. This section contains a list of log entries. Each log entry contains the name of a logger and its level. Both follow the conventions of the Python logging module.
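Since the entries follow the logging module's conventions, each one can be read as a call to the standard library. A minimal sketch of that mapping (the entry dict mirrors the sample configuration above; this is an illustration, not the server's actual implementation):

```python
# Sketch of how a log entry from the configuration maps onto Python's
# standard logging module; the entry below mirrors the sample config.
import logging

entry = {"logger": "deepspeech_server", "level": "DEBUG"}

# Look up the named logger and apply the configured level.
logger = logging.getLogger(entry["logger"])
logger.setLevel(getattr(logging, entry["level"]))
```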

Using the server

Inference on the model is done via HTTP POST requests, for example with the following curl command:

curl -X POST --data-binary @testfile.wav http://localhost:8080/stt
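The same request can be made from Python. Here is a minimal client sketch using only the standard library; the host, port, and /stt route come from the sample configuration above, and the `transcribe` helper name is an illustration, not part of the project:

```python
# Minimal client sketch: POST raw WAV bytes to the server's /stt endpoint
# and return the decoded transcript. Uses only the standard library.
import urllib.request

def transcribe(wav_path, url="http://localhost:8080/stt"):
    """Send the audio file as the request body and return the response text."""
    with open(wav_path, "rb") as f:
        audio = f.read()
    request = urllib.request.Request(url, data=audio, method="POST")
    with urllib.request.urlopen(request) as response:
        return response.read().decode("utf-8")
```

For example, `transcribe("testfile.wav")` mirrors the curl command above.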

Project details


Download files

Download the file for your platform.

Source Distribution

deepspeech-server-2.1.0.tar.gz (5.8 kB, source)

File details

Details for the file deepspeech-server-2.1.0.tar.gz.

File metadata

  • Download URL: deepspeech-server-2.1.0.tar.gz
  • Upload date:
  • Size: 5.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.15.0 pkginfo/1.5.0.1 requests/2.23.0 setuptools/46.4.0 requests-toolbelt/0.9.1 tqdm/4.46.0 CPython/3.5.6

File hashes

Hashes for deepspeech-server-2.1.0.tar.gz:

  • SHA256: b128a56277c08091417bfa26b18683c464e2045ab68ea76475d8260091f7f2bf
  • MD5: 600514764fba2408fa1ddf26b8f2a035
  • BLAKE2b-256: 72eb66cceb9636d90396c98940627763233c2bb9d5deec32bd3c626e509d5af4

