
Project description

HF Model Inspector

A package and GitHub Action to generate quality reports on Hugging Face model stats and usage, addressing ambiguity in open model releases.


Overview

hf_model_inspector is a Python package and GitHub Action designed to provide clear, structured reports for models hosted on Hugging Face. Open model releases often come with incomplete or inconsistent metadata, making it hard to quickly assess model size, architecture, quantization, and usage statistics.

This tool helps you:

  • Inspect model metadata including architecture, parameters, and downloads.
  • Handle quantization info cleanly, even when formats differ across releases.
  • Generate JSON and Markdown reports for documentation or review purposes.
  • Recommend suitable models for your GPU based on memory constraints.
  • Automate reporting with a GitHub Action for CI/CD pipelines.

Installation

pip install hf_model_inspector

Optional: For private models, you can use a Hugging Face token.
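One common way to supply the token is through the `HF_TOKEN` environment variable, the convention read by `huggingface_hub` and most tooling built on it (the value below is a placeholder for your own token):

```shell
# Expose a Hugging Face access token for private-model access.
# HF_TOKEN is the standard environment variable read by huggingface_hub;
# replace the placeholder with your own token.
export HF_TOKEN="<your-hf-token>"
```

Alternatively, `huggingface-cli login` stores the token in the local Hugging Face cache for interactive use.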


Quick Start

Example 1: Inspect a model and print summary

from hf_model_inspector import get_model_report_json

repo_id = "openai/gpt-oss-20b"
report = get_model_report_json(repo_id)

# Format the parameter count, falling back when it is not reported
total_params = report["parameters"]["total"]
param_str = f"{total_params:,}" if total_params else "Unknown"

# Quantization metadata varies across releases; normalize it for display
quant_info = report.get("quantization", {})
if quant_info.get("quantized"):
    methods = ", ".join(quant_info.get("quant_methods", [])) or "Unknown"
    precision = quant_info.get("precision", "Unknown")
    quant_status = f"{methods} ({precision})"
else:
    quant_status = quant_info.get("dtype", "Not Quantized") or "Not Quantized"

print(f"Model: {report['repo_id']}")
print(f"Architecture: {report['architecture']}")
print(f"Parameters: {param_str}")
print(f"Quantization: {quant_status}")
print(f"Downloads: {report['metadata']['downloads']}")
print(f"Tags: {', '.join(report['metadata']['tags']) if report['metadata']['tags'] else 'None'}")
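Because `get_model_report_json` returns a plain dict, the report from Example 1 can also be persisted with just the standard library. The sketch below uses a stand-in dict (hypothetical values) so it runs on its own:

```python
import json

# Stand-in for the dict returned by get_model_report_json (see Example 1);
# the parameter count here is a placeholder, not a measured value.
report = {"repo_id": "openai/gpt-oss-20b", "parameters": {"total": 20_000_000_000}}

# Write the report to disk for documentation or review
with open("model_report.json", "w", encoding="utf-8") as f:
    json.dump(report, f, indent=2)

# Round-trip check: the saved file loads back to the same structure
with open("model_report.json", encoding="utf-8") as f:
    loaded = json.load(f)
print(loaded["repo_id"])  # openai/gpt-oss-20b
```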

Example 2: Full inspection and Markdown report

from hf_model_inspector import get_model_report_md, save_model_report
from hf_model_inspector.loader import authenticate_hf

# Resolve a Hugging Face token (only needed for private or gated models)
token = authenticate_hf()
repo_id = "openai/gpt-oss-20b"

# Render the Markdown report in memory (e.g. for embedding in docs)...
report_md = get_model_report_md(repo_id, token)

# ...or write it straight to disk
save_model_report(repo_id, md_path="model_report.md", token=token)

print("Markdown report saved as 'model_report.md'")

GitHub Action Integration

You can automate model reporting on every push to main using our GitHub Action:

name: HF Model Inspector

on:
  push:
    branches: [main]

jobs:
  inspect:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run HF Model Inspector
        uses: ParagEkbote/hf-model-inspector@v1.0.0
        with:
          repo_id: "openai/gpt-oss-20b"
          token: ${{ secrets.HF_TOKEN }}

This will automatically generate and store JSON/Markdown reports for your chosen model.
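To also run the check on pull requests, the trigger block can be extended with standard GitHub Actions syntax (the action inputs are unchanged):

```yaml
on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
```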


Features

  • ✅ Inspect public and private models.
  • ✅ Clean handling of quantization and parameter counts.
  • ✅ Save JSON or Markdown reports.
  • ✅ Recommend models suitable for your GPU.
  • ✅ Automate with GitHub Actions for reproducible reporting.

Project details


Download files

Download the file for your platform.

Source Distribution

hf_model_inspector-1.0.2.tar.gz (21.2 MB)


Built Distribution


hf_model_inspector-1.0.2-py3-none-any.whl (16.9 kB)


File details

Details for the file hf_model_inspector-1.0.2.tar.gz.

File metadata

  • Download URL: hf_model_inspector-1.0.2.tar.gz
  • Upload date:
  • Size: 21.2 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.1

File hashes

Hashes for hf_model_inspector-1.0.2.tar.gz
  • SHA256: cf6f03fe75005f0293f442e78d62fadb3e8ea4643e14c8f8f100ceca292609cf
  • MD5: 1b797094326546c93f661df2b990d53a
  • BLAKE2b-256: eb2dad6d084cc65614b45f234378901159258c222b12df6028dbe04edb6e279d


File details

Details for the file hf_model_inspector-1.0.2-py3-none-any.whl.

File metadata

File hashes

Hashes for hf_model_inspector-1.0.2-py3-none-any.whl
  • SHA256: 13d659a65d5aa8878e0b39a4543ecb209fbb1f0999dcb091fefd8c756d017cfe
  • MD5: eaf13492f4a205d08c6250fd620f3b8a
  • BLAKE2b-256: 47d05b4aaa3531444facd84aa62f17a95fa090d0df818eaa32f4992902f876c8

