# llm-model-deprecation

Track and check the deprecation status of LLM provider models (OpenAI, Anthropic, Gemini, etc.). Use it to warn when your app uses deprecated or retired models, and to get replacement suggestions.
## Install

```
pip install llm-model-deprecation
```

Optional: URL loading via `requests` (otherwise the stdlib `urllib` is used):

```
pip install "llm-model-deprecation[fetch]"
```
## Library usage

Data is loaded from the default registry (online, with a built-in fallback). No configuration is needed.

```python
from llm_deprecation import DeprecationChecker, DeprecationStatus

checker = DeprecationChecker()

# Check by model id (searches all providers)
checker.is_deprecated("gpt-3.5-turbo-0301")  # True
checker.is_retired("gpt-3.5-turbo-0301")     # True
checker.status("gpt-4")                      # DeprecationStatus.ACTIVE

# With provider for an exact match
checker.get("claude-2.0", provider="anthropic")
# -> ModelInfo(provider='anthropic', model_id='claude-2.0', status=..., replacement='...', ...)

# List deprecated models
for m in checker.list_deprecated(provider="openai"):
    print(m.model_id, m.status.value, m.replacement)
```
## Status values

- `active` — Currently supported; no deprecation.
- `legacy` — Still supported; prefer newer models.
- `deprecated` — Scheduled for retirement; migrate before the sunset date.
- `retired` — No longer available.
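The four statuses form a natural escalation policy. A minimal sketch of how an application might gate model calls on them, using a stand-in enum rather than the library's own types (`check_model` is a hypothetical helper, not part of this package):

```python
import warnings
from enum import Enum

class DeprecationStatus(Enum):
    ACTIVE = "active"
    LEGACY = "legacy"
    DEPRECATED = "deprecated"
    RETIRED = "retired"

def check_model(model_id: str, status: DeprecationStatus) -> bool:
    """Return True if the model is usable; warn on legacy/deprecated, refuse retired."""
    if status is DeprecationStatus.RETIRED:
        return False
    if status in (DeprecationStatus.LEGACY, DeprecationStatus.DEPRECATED):
        warnings.warn(f"{model_id} is {status.value}; consider migrating")
    return True
```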
## Add or override models in code

```python
from datetime import date

from llm_deprecation import DeprecationChecker
from llm_deprecation.models import ModelInfo, DeprecationStatus

checker = DeprecationChecker()
checker.register(ModelInfo(
    provider="openai",
    model_id="gpt-4-old",
    status=DeprecationStatus.DEPRECATED,
    sunset_date=date(2026, 1, 1),
    replacement="gpt-4o",
))
```
## CLI

Scan a project for deprecated or retired model references (in CI, from cron, or locally):

```
llm-deprecation scan
llm-deprecation scan /path/to/project
llm-deprecation scan --fail-on-deprecated  # exit 1 if any are found (for CI)
```
Example output:

```
Scanning project...
⚠ openai:gpt-3.5-turbo → deprecated soon
⚠ anthropic:claude-instant → retired
```

The scanner looks in common code and config files (`.py`, `.json`, `.yaml`, `.env`, `.ts`, etc.).
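Conceptually, such a scan reduces to matching known model ids against the contents of matching files. A self-contained sketch of the idea — the model list, suffix set, and `scan` function here are illustrative, not the library's actual implementation:

```python
import re
from pathlib import Path

# Illustrative data: the real registry is far larger and loaded dynamically.
DEPRECATED = {"gpt-3.5-turbo-0301": "retired", "claude-instant": "retired"}
PATTERN = re.compile("|".join(re.escape(m) for m in DEPRECATED))
SCAN_SUFFIXES = {".py", ".json", ".yaml", ".env", ".ts"}

def scan(root: str) -> list[tuple[str, str, str]]:
    """Walk `root` and return (path, model_id, status) for each hit."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.suffix not in SCAN_SUFFIXES or not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue
        for match in PATTERN.finditer(text):
            hits.append((str(path), match.group(0), DEPRECATED[match.group(0)]))
    return hits
```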
## GitHub Action

Run the same check in GitHub Actions:

```yaml
- name: Check LLM deprecations
  uses: techdevsynergy/llm-model-deprecation@v1
  with:
    fail-on-deprecated: true
```

Options: `path` (project root to scan), `fail-on-deprecated`, `version` (pin the package version).
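A complete workflow wiring in this step might look like the following; the workflow name, triggers, and checkout step are assumptions for illustration, not prescribed by the action:

```yaml
name: llm-deprecation-check
on: [push, pull_request]

jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Check LLM deprecations
        uses: techdevsynergy/llm-model-deprecation@v1
        with:
          path: .
          fail-on-deprecated: true
```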
## Data source

The registry is loaded from the default URL (llm-deprecation-data); if it is unreachable (e.g. when offline), the registry bundled with the library is used instead.
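The online-with-fallback pattern can be sketched as follows; the URL, registry schema, and `FALLBACK_REGISTRY` contents here are placeholders, not the library's real data or loader:

```python
import json
import urllib.request

# Placeholder for the registry bundled with the library.
FALLBACK_REGISTRY = {"openai": {"gpt-3.5-turbo-0301": "retired"}}

def load_registry(url: str, timeout: float = 5.0) -> dict:
    """Try the online registry first; fall back to the bundled data on any failure."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return json.load(resp)
    except (OSError, ValueError):  # network errors, bad JSON, etc.
        return FALLBACK_REGISTRY
```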
## License

MIT
## File details

Details for the file `llm_model_deprecation-1.1.0.tar.gz`.

### File metadata

- Download URL: llm_model_deprecation-1.1.0.tar.gz
- Upload date:
- Size: 13.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.12

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 3d042233076e57be263a37a7b104abb16bff35b16d6f7f0651e488aaed8ae9fd |
| MD5 | 377eae550104a1584e93d197814aa716 |
| BLAKE2b-256 | 149f5fa320972e425a567bbdf92628fce6ce7fc469c0a96350db769d0fb2b43c |
## File details

Details for the file `llm_model_deprecation-1.1.0-py3-none-any.whl`.

### File metadata

- Download URL: llm_model_deprecation-1.1.0-py3-none-any.whl
- Upload date:
- Size: 12.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.12

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | ed656151c7f452e33db2c85a27724ef1b88c41f137babf0c4f276911530a5ad3 |
| MD5 | de35b17b2d13442679cf60f3be4cdb52 |
| BLAKE2b-256 | 844702a389a8cf05838557a1aee21c524869a5c24c88c857bf56d098bd6b2401 |