A CLI to estimate inference memory requirements for Hugging Face models, written in Python.

Project description


> [!WARNING]
> hf-mem is still experimental and therefore subject to major changes across releases; breaking changes may occur until v1.0.0.

hf-mem is lightweight and depends only on httpx, since it pulls the Safetensors metadata via HTTP Range requests instead of downloading the weights. Running it with uv is recommended for a better experience.
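To illustrate why a Range request is enough, here is a minimal sketch of the Safetensors on-disk layout: an 8-byte little-endian header length followed by a JSON header that lists each tensor's dtype and shape, which is all a size estimate needs. This is a simplified illustration using only the standard library, not hf-mem's actual implementation, and the tensor name and dtype table are made up for the example.

```python
import json
import struct

# Bytes per element for a few common Safetensors dtype codes (illustrative subset).
DTYPE_BYTES = {"F32": 4, "F16": 2, "BF16": 2, "F8_E4M3": 1, "I8": 1}

def estimate_bytes(header: dict) -> int:
    """Sum parameter memory from a parsed Safetensors JSON header."""
    total = 0
    for name, meta in header.items():
        if name == "__metadata__":  # optional free-form metadata entry
            continue
        n = 1
        for dim in meta["shape"]:
            n *= dim
        total += n * DTYPE_BYTES[meta["dtype"]]
    return total

# Build a tiny in-memory Safetensors-style prefix to show the layout:
# 8-byte little-endian header length, then the JSON header itself.
header = {"model.embed": {"dtype": "BF16", "shape": [1024, 4096], "data_offsets": [0, 8388608]}}
payload = json.dumps(header).encode()
prefix = struct.pack("<Q", len(payload)) + payload

# Parse it back the way a Range-based reader would (only the prefix is needed,
# never the tensor data that follows it in a real file).
(header_len,) = struct.unpack("<Q", prefix[:8])
parsed = json.loads(prefix[8 : 8 + header_len])
print(estimate_bytes(parsed))  # 1024 * 4096 * 2 bytes = 8388608
```

In practice a tool would issue one small `Range: bytes=0-7` request to learn the header length and a second request for the header itself, so the estimate costs a few kilobytes of transfer regardless of model size.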

hf-mem lets you estimate the inference memory requirements to run any model from the Hugging Face Hub, including Transformers, Diffusers, and Sentence Transformers models, as well as any other model that ships Safetensors-compatible weights.

Read more information about hf-mem in this short-form post.

Usage

Transformers

```shell
uvx hf-mem --model-id MiniMaxAI/MiniMax-M2
```

Diffusers

```shell
uvx hf-mem --model-id Qwen/Qwen-Image
```

Sentence Transformers

```shell
uvx hf-mem --model-id google/embeddinggemma-300m
```

Experimental

The --experimental flag enables KV-cache memory estimation for LLMs (...ForCausalLM) and VLMs (...ForConditionalGeneration). It also accepts:

- --max-model-len: the context length to size the cache for (defaults to the value in config.json)
- --batch-size: the number of concurrent sequences (defaults to 1)
- --kv-cache-dtype: the cache data type (defaults to auto, which uses the default data type set in config.json under torch_dtype or dtype, or from quantization_config when applicable)

```shell
uvx hf-mem --model-id MiniMaxAI/MiniMax-M2 --experimental
```
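For intuition, KV-cache size for a decoder-only transformer is commonly approximated as 2 (keys and values) × layers × KV heads × head dim × context length × batch size × bytes per element. The sketch below uses that standard formula with made-up model dimensions; it is an illustration of the idea, not hf-mem's exact computation, and the field names mirror typical config.json conventions (num_hidden_layers, num_key_value_heads, head_dim).

```python
def kv_cache_bytes(
    num_layers: int,
    num_kv_heads: int,
    head_dim: int,
    max_model_len: int,
    batch_size: int = 1,
    dtype_bytes: int = 2,  # 2 bytes/element for fp16 or bf16 caches
) -> int:
    """Approximate KV-cache size: one key and one value tensor per layer."""
    return 2 * num_layers * num_kv_heads * head_dim * max_model_len * batch_size * dtype_bytes

# Example: a hypothetical 32-layer model with 8 KV heads of dim 128,
# a 4096-token context, batch size 1, and a bf16 cache.
print(kv_cache_bytes(32, 8, 128, 4096) / 2**30)  # 0.5 GiB
```

This is why --max-model-len and --batch-size matter: the cache grows linearly in both, so halving the context length halves the KV-cache footprint.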

(Optional) Agent Skills

Optionally, you can add hf-mem as an agent skill, which lets the underlying coding agent discover and use it when it is provided as a SKILL.md.

More information can be found in Anthropic's documentation on Agent Skills and how to use them.

Download files


Source Distribution

hf_mem-0.4.3.tar.gz (11.3 kB)


Built Distribution


hf_mem-0.4.3-py3-none-any.whl (13.6 kB)


File details

Details for the file hf_mem-0.4.3.tar.gz.

File metadata

  • Download URL: hf_mem-0.4.3.tar.gz
  • Upload date:
  • Size: 11.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.5

File hashes

Hashes for hf_mem-0.4.3.tar.gz

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 452739b78dc77eb6710562ff246499e050c8432b8e4cca83654cfc044108405f |
| MD5 | 0af042a1fe5bca177dbee8c74e1ba8c1 |
| BLAKE2b-256 | 9f1e8faac3ea08c04b52308b7ae2f02820241a0644541712d30e35888146c542 |


File details

Details for the file hf_mem-0.4.3-py3-none-any.whl.

File metadata

  • Download URL: hf_mem-0.4.3-py3-none-any.whl
  • Upload date:
  • Size: 13.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: uv/0.9.5

File hashes

Hashes for hf_mem-0.4.3-py3-none-any.whl

| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 8a311dad1eb90dd1050d52da9e070788e1da95c3bdd2b3040e89d65491675e87 |
| MD5 | 2f07f229becc53d48e7a5232bc90f70a |
| BLAKE2b-256 | ad783c663252fa4bc0c163595c3c853fc482000c6109e1e8d50d3095de9663ca |

