HuggingFace runtime for MLServer

This package provides an MLServer runtime compatible with HuggingFace Transformers.

Usage

You can install the runtime, alongside `mlserver`, with:

```bash
pip install mlserver mlserver-huggingface
```

For further information on how to use MLServer with HuggingFace, you can check out this worked-out example.

Content Types

The HuggingFace runtime will always decode the input request using its own built-in codec. Therefore, content type annotations at the request level will be ignored. Note that this doesn't include input-level content type annotations, which will be respected as usual.
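
For instance, an input-level annotation like the one below would still be honoured (the input name and content type here are illustrative; use whatever your pipeline's task expects):

```json
{
  "inputs": [
    {
      "name": "text_inputs",
      "shape": [1],
      "datatype": "BYTES",
      "data": ["this is a test"],
      "parameters": {"content_type": "str"}
    }
  ]
}
```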

Settings

The HuggingFace runtime exposes a couple of extra settings which can be used to customise how the runtime behaves. These settings can be added under the `parameters.extra` section of your `model-settings.json` file, e.g.

```json
{
  "name": "qa",
  "implementation": "mlserver_huggingface.HuggingFaceRuntime",
  "parameters": {
    "extra": {
      "task": "question-answering",
      "optimum_model": true
    }
  }
}
```

These settings can also be injected through environment variables prefixed with `MLSERVER_MODEL_HUGGINGFACE_`, e.g.

```bash
MLSERVER_MODEL_HUGGINGFACE_TASK="question-answering"
MLSERVER_MODEL_HUGGINGFACE_OPTIMUM_MODEL=true
```
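
Once the model above is serving, it can be queried over MLServer's V2 inference endpoint (`/v2/models/qa/infer`). The sketch below builds a request payload for the question-answering task; the input names `question` and `context` follow the worked example and are an assumption, not part of this runtime's reference:

```python
import json

def build_qa_request(question: str, context: str) -> dict:
    """Build a V2 inference payload for a question-answering pipeline.

    Input names ("question", "context") are assumed from the
    question-answering task; adjust them for other tasks.
    """
    def bytes_input(name: str, text: str) -> dict:
        # Each text input is sent as a single BYTES tensor.
        return {
            "name": name,
            "shape": [1],
            "datatype": "BYTES",
            "data": [text],
        }

    return {
        "inputs": [
            bytes_input("question", question),
            bytes_input("context", context),
        ]
    }

payload = build_qa_request(
    "What is MLServer?",
    "MLServer is an inference server for machine learning models.",
)
# POST json.dumps(payload) to http://localhost:8080/v2/models/qa/infer
print(json.dumps(payload, indent=2))
```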

Loading models

Local models

It is possible to load a local model into a HuggingFace pipeline by specifying the model artefact folder path in `parameters.uri` in `model-settings.json`.
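
A minimal sketch, assuming the model artefacts live in a local `./my-model-artefacts` folder (the path and model name here are hypothetical):

```json
{
  "name": "qa",
  "implementation": "mlserver_huggingface.HuggingFaceRuntime",
  "parameters": {
    "uri": "./my-model-artefacts",
    "extra": {
      "task": "question-answering"
    }
  }
}
```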

HuggingFace models

Models in the HuggingFace hub can be loaded by specifying their name in `parameters.extra.pretrained_model` in `model-settings.json`.

If `parameters.extra.pretrained_model` is specified, it takes precedence over `parameters.uri`.
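
For example, a hub model could be configured as below (the model name and task are illustrative; substitute any hub model matching your task):

```json
{
  "name": "sentiment",
  "implementation": "mlserver_huggingface.HuggingFaceRuntime",
  "parameters": {
    "extra": {
      "task": "text-classification",
      "pretrained_model": "distilbert-base-uncased-finetuned-sst-2-english"
    }
  }
}
```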

Reference

You can find the full reference of the accepted extra settings for the HuggingFace runtime below:

.. autopydantic_settings:: mlserver_huggingface.settings.HuggingFaceSettings
