
Optimum TPU is the interface between the Hugging Face Transformers library and Google Cloud TPU devices.

Project description

Optimum-TPU

Get the most out of Google Cloud TPUs with the ease of 🤗 transformers


Tensor Processing Units (TPUs) are AI accelerators made by Google to optimize performance and cost across AI training and inference.

This repository exposes an interface similar to the one the Hugging Face transformers library provides, to interact with a multitude of models developed by research labs, institutions and the community.

We aim to provide our users the best possible performance on Google Cloud TPUs for both training and inference, working closely with Google and Google Cloud to make this a reality.

Supported Model and Tasks

We currently support a few LLM models targeting text generation scenarios:

  • 💎 Gemma (2b, 7b)
  • 🦙 Llama2 (7b) and Llama3 (8b)
  • 💨 Mistral (7b)
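As a quick reference, the mapping below pairs each supported family with example Hugging Face checkpoint IDs. The IDs are illustrative, and the helper is only a sketch, not part of the optimum-tpu API:

```python
# Illustrative checkpoint IDs matching the supported model families above.
# These are the standard Hugging Face Hub names for each size listed.
SUPPORTED_CHECKPOINTS = {
    "gemma": ["google/gemma-2b", "google/gemma-7b"],
    "llama2": ["meta-llama/Llama-2-7b-hf"],
    "llama3": ["meta-llama/Meta-Llama-3-8B"],
    "mistral": ["mistralai/Mistral-7B-v0.1"],
}

def example_checkpoint(family: str) -> str:
    """Return the first example checkpoint for a supported model family."""
    try:
        return SUPPORTED_CHECKPOINTS[family][0]
    except KeyError:
        raise ValueError(f"{family!r} is not in the supported list") from None
```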

Installation

optimum-tpu comes with a handy PyPI package compatible with your usual Python dependency management tools.

pip install optimum-tpu -f https://storage.googleapis.com/libtpu-releases/index.html

export PJRT_DEVICE=TPU
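A quick way to sanity-check the install is to confirm the runtime flag is set and that torch_xla (pulled in by the install above) can see a device. The guarded import is a sketch that keeps the snippet runnable on machines without a TPU:

```python
# Hedged sketch: after the pip install above on a TPU VM, torch_xla should
# report a TPU device once PJRT_DEVICE is set.
import os

os.environ.setdefault("PJRT_DEVICE", "TPU")

try:
    import torch_xla.core.xla_model as xm  # installed alongside optimum-tpu
    print("XLA device:", xm.xla_device())
except ImportError:
    print("torch_xla not installed; run the pip command above on a TPU VM.")
```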

Inference

optimum-tpu provides a set of dedicated tools and integrations to leverage Cloud TPUs for inference, especially on the latest TPU version, v5e.

Other TPU versions will be supported along the way.

Text-Generation-Inference

As part of the integration, we support a text-generation-inference (TGI) backend that serves incoming HTTP requests and executes them on Cloud TPUs.

Please see the TGI specific documentation on how to get started.
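Once a TGI server is running, requests go to its `/generate` endpoint as JSON. The helper below sketches that request shape; the localhost URL is an assumption about where your container listens:

```python
import json
import urllib.request

def build_generate_request(prompt: str, max_new_tokens: int = 64) -> bytes:
    """Build the JSON body for TGI's /generate endpoint."""
    return json.dumps(
        {"inputs": prompt, "parameters": {"max_new_tokens": max_new_tokens}}
    ).encode("utf-8")

def query_tgi(prompt: str, url: str = "http://localhost:8080/generate") -> str:
    """Send a generation request; the URL is an assumed local endpoint."""
    req = urllib.request.Request(
        url,
        data=build_generate_request(prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["generated_text"]
```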

JetStream Pytorch Engine

optimum-tpu provides optional support for the JetStream Pytorch engine inside TGI. This support can be installed using the dedicated CLI command:

optimum-tpu install-jetstream-pytorch

To enable the support, export the environment variable JETSTREAM_PT=1.

Training

Fine-tuning is supported and tested on the TPU v5e. We have tested so far:

  • 🦙 Llama-2 7B and Llama-3 8B
  • 💎 Gemma 2B and 7B
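As a hedged sketch of what a v5e fine-tuning run might configure (hyperparameters here are illustrative, not recommendations), the helper below gathers settings that would typically feed `transformers.TrainingArguments`:

```python
# Hedged sketch: illustrative fine-tuning settings for a TPU v5e run. Nothing
# here is an official recommendation; keys mirror transformers.TrainingArguments.
def training_config(model_id: str, batch_size: int = 8) -> dict:
    """Collect illustrative hyperparameters for a TPU fine-tuning run."""
    return {
        "model_id": model_id,                      # e.g. one of the models above
        "per_device_train_batch_size": batch_size,
        "num_train_epochs": 1,
        "optim": "adafactor",                      # memory-friendly on TPUs
        "dataloader_drop_last": True,              # static batch shapes help XLA
    }

cfg = training_config("google/gemma-2b")
```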

You can check the examples in the repository.



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

optimum_tpu-0.2.0.tar.gz (135.2 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

optimum_tpu-0.2.0-py3-none-any.whl (80.1 kB)

Uploaded Python 3

File details

Details for the file optimum_tpu-0.2.0.tar.gz.

File metadata

  • Download URL: optimum_tpu-0.2.0.tar.gz
  • Upload date:
  • Size: 135.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.12

File hashes

Hashes for optimum_tpu-0.2.0.tar.gz:

  • SHA256: 42e9bbe80f26eb1fe9368ff07ee40e969fe0156c59f8d5d009973d2947fd1aa4
  • MD5: 88293d2bae1a4056628205463d29b9ac
  • BLAKE2b-256: 1afec832f0965ed978d5268d4306c6ad392dbbd22790ed30ad0c2d3da05ecc43
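The published digests let you verify a downloaded file before installing. A minimal sketch of the check, using only the standard library:

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA256 hex digest of a downloaded distribution file."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Compare the result against the published value, e.g. for the sdist:
EXPECTED_SDIST_SHA256 = (
    "42e9bbe80f26eb1fe9368ff07ee40e969fe0156c59f8d5d009973d2947fd1aa4"
)
```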


File details

Details for the file optimum_tpu-0.2.0-py3-none-any.whl.

File metadata

  • Download URL: optimum_tpu-0.2.0-py3-none-any.whl
  • Upload date:
  • Size: 80.1 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.12

File hashes

Hashes for optimum_tpu-0.2.0-py3-none-any.whl:

  • SHA256: 7ffbaa2b647b01b82cd764acc8a0db6315700a9383458256faed7c93616dc8ca
  • MD5: a7780646600d15421f9096d5c3955f40
  • BLAKE2b-256: 9d6af73015d0d605e170d0bc345865a5d71c1dcadc24e09f591423dd08e09e48

