SHARK layers and inference models for GenAI


SHARK Tank

WARNING: This is an early preview under active development. It is not ready for general use.

Lightweight, inference-optimized layers and models for popular GenAI applications.

This sub-project is a work in progress. It is intended to be a repository of layers, model recipes, and conversion tools from popular LLM quantization tooling.
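
The package is published to PyPI as a development pre-release. A minimal install sketch (assuming a recent pip; the --pre flag is needed because only dev wheels have been uploaded so far):

# Install the current development preview wheel from PyPI.
pip install --pre sharktank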

Examples

The repository will ultimately grow a curated set of models and tools for constructing them, but for the moment it largely contains CLI examples. These are all under active development and should not yet be expected to work.

Perform batched inference in PyTorch on a paged, llama-derived LLM:

python -m sharktank.examples.paged_llm_v1 \
  --hf-dataset=open_llama_3b_v2_f16_gguf \
  "Prompt 1" \
  "Prompt 2" ...

Export an IREE-compilable batched LLM for serving:

python -m sharktank.examples.export_paged_llm_v1 --hf-dataset=open_llama_3b_v2_f16_gguf
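
The export emits MLIR that IREE can compile into a deployable module. A hedged follow-up sketch (the file names and CPU target below are illustrative assumptions, not outputs documented by the example above):

# Compile the exported MLIR into an IREE module for CPU execution.
iree-compile exported_llm.mlir \
  --iree-hal-target-backends=llvm-cpu \
  -o exported_llm.vmfb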

Dump parsed information about a model from a GGUF file:

python -m sharktank.tools.dump_gguf --hf-dataset=open_llama_3b_v2_f16_gguf
