
Optimum TPU is the interface between the Hugging Face Transformers library and Google Cloud TPU devices.

Project description

Optimum-TPU

Get the most out of Google Cloud TPUs with the ease of 🤗 transformers


Tensor Processing Units (TPUs) are AI accelerators made by Google to optimize performance and cost from AI training to inference.

This repository exposes an interface similar to the one the Hugging Face transformers library provides to interact with a multitude of models developed by research labs, institutions and the community.

We aim to provide our users the best possible performance on Google Cloud TPUs for both training and inference, working closely with Google and Google Cloud to make this a reality.

Supported Models and Tasks

We currently support a few LLMs targeting text generation scenarios:

  • 💎 Gemma (2b, 7b)
  • 🦙 Llama2 (7b) and Llama3 (8b)
  • 💨 Mistral (7b)

Installation

optimum-tpu comes as a handy package released on PyPI, compatible with your usual Python dependency management tools.

pip install optimum-tpu -f https://storage.googleapis.com/libtpu-releases/index.html
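The `-f`/`--find-links` index supplies the TPU runtime (`libtpu`) wheels that are not published on PyPI. If you manage dependencies through a requirements file, the same index can be recorded there (a sketch; the version pin is illustrative):

```
# requirements.txt
--find-links https://storage.googleapis.com/libtpu-releases/index.html
optimum-tpu==0.1.5
```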

Inference

optimum-tpu provides a set of dedicated tools and integrations to leverage Cloud TPUs for inference, especially on the latest TPU version, v5e.

Other TPU versions will be supported along the way.

Text-Generation-Inference

As part of the integration, we support a Text Generation Inference (TGI) backend, allowing you to deploy a server that receives incoming HTTP requests and executes them on Cloud TPUs.

Please see the TGI-specific documentation on how to get started.
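Once a server is running, the backend speaks the standard TGI HTTP API; a request body sketch for the `/generate` route (the prompt and parameter values are illustrative):

```json
{
  "inputs": "What are TPUs good at?",
  "parameters": {
    "max_new_tokens": 64,
    "temperature": 0.7
  }
}
```

POST this as JSON to the server, e.g. `curl -X POST http://<host>:<port>/generate -H 'Content-Type: application/json' -d @request.json`, where host and port depend on your deployment.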

Training

Fine-tuning is supported and tested on the TPU v5e. We have tested so far:

  • 🦙 Llama-2 7B and Llama-3 8B
  • 💎 Gemma 2B and 7B

You can check the fine-tuning examples in the repository.
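The examples follow the standard transformers fine-tuning flow; in pseudocode (model ID and hyperparameters are illustrative, not taken from this page):

```
# Pseudocode sketch of a fine-tuning run on a TPU v5e host.
model     = load_pretrained("google/gemma-2b")   # any supported model above
tokenizer = load_tokenizer("google/gemma-2b")
dataset   = tokenize(my_dataset, tokenizer)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", ...),
    train_dataset=dataset,
)
trainer.train()
```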

Download files

Download the file for your platform.

Source Distribution

optimum_tpu-0.1.5.tar.gz (113.9 kB)


Built Distribution


optimum_tpu-0.1.5-py3-none-any.whl (77.4 kB)


File details

Details for the file optimum_tpu-0.1.5.tar.gz.

File metadata

  • Download URL: optimum_tpu-0.1.5.tar.gz
  • Upload date:
  • Size: 113.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.12

File hashes

Hashes for optimum_tpu-0.1.5.tar.gz

  • SHA256: e59183e6e4bcf74de3a3bdd5af48d7308fbd9e3477b1671ae85580dcdcf0beea
  • MD5: 2e7e4bf5ca9f5c2495d09ac778147279
  • BLAKE2b-256: 6d3265c4584f5652d0329373e881c990e5abced795915a9850838d65cd57e9e5


File details

Details for the file optimum_tpu-0.1.5-py3-none-any.whl.

File metadata

  • Download URL: optimum_tpu-0.1.5-py3-none-any.whl
  • Upload date:
  • Size: 77.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.12

File hashes

Hashes for optimum_tpu-0.1.5-py3-none-any.whl

  • SHA256: e12e154a974ba41ca90df63daa963632b2dfa6642139f281a9a1e8a1ca721b3f
  • MD5: 2833072aa7a53e82dc0df5ac81755dc2
  • BLAKE2b-256: 4ce60960fe4c878f7ae4a50ec880b33b415e246493e6063cc04133dfd19c5eb7

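If you download an artifact manually, you can check it against the published SHA256 digest before installing. A minimal sketch using Python's standard `hashlib` (the file path is whatever you saved the archive as):

```python
import hashlib

def sha256_of(path, chunk_size=8192):
    """Stream the file in chunks so large archives are not read into memory at once."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Digest copied from the published hashes above for the sdist:
EXPECTED_SDIST_SHA256 = "e59183e6e4bcf74de3a3bdd5af48d7308fbd9e3477b1671ae85580dcdcf0beea"
# assert sha256_of("optimum_tpu-0.1.5.tar.gz") == EXPECTED_SDIST_SHA256
```

Equivalently, `pip install --require-hashes -r requirements.txt` performs this check automatically when the requirements file pins hashes.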
