llm-greenpt
A plugin to add support for the GreenPT inference service to LLM. GreenPT provides sustainable AI inference with direct energy consumption and carbon emissions measurements in API responses.
This is not an official plugin from GreenPT.
Installation
Install this plugin in the same environment as LLM:
llm install llm-greenpt
Usage
Once you have llm-greenpt installed, you should see new models available when you run llm models:
GreenPT: greenpt/green-l (aliases: greenpt-green-l, greenpt-large)
GreenPT: greenpt/green-l-raw (aliases: greenpt-green-l-raw)
GreenPT: greenpt/green-r (aliases: greenpt-green-r, greenpt-reasoning)
GreenPT: greenpt/green-r-raw (aliases: greenpt-green-r-raw)
GreenPT: greenpt/llama-3.3-70b-instruct (aliases: greenpt-llama-70b)
GreenPT: greenpt/deepseek-r1-distill-llama-70b (aliases: greenpt-deepseek-r1)
GreenPT: greenpt/mistral-nemo-instruct-2407 (aliases: greenpt-mistral-nemo)
GreenPT: greenpt/qwen3-235b-a22b-instruct-2507 (aliases: greenpt-qwen3)
You will need to set a key with:
llm keys set greenpt
You can sign up for GreenPT and get an API key from https://greenpt.ai/.
Basic Usage
# Use the large generative model
llm "Explain quantum computing" -m greenpt-large
# Use the reasoning model
llm "What is 15% of 847?" -m greenpt-reasoning
# Use a third-party model via GreenPT
llm "Write a haiku" -m greenpt-llama-70b
Energy and Carbon Impact Logging
This plugin automatically captures and logs energy consumption and carbon emissions data from GreenPT API responses. Impact data is stored in the response_json field of the LLM logs database.
GreenPT returns impact data in this format:
{
"impact": {
"version": "20250922",
"inferenceTime": {"total": 156, "unit": "ms"},
"energy": {"total": 40433, "unit": "Wms"},
"emissions": {"total": 1, "unit": "ugCO2e"}
}
}
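Pulling these measurements out of a response is straightforward. A minimal sketch in Python, assuming the payload shape shown above (the function name is illustrative):

```python
def extract_impact(response_json):
    """Extract GreenPT impact measurements from a response dict.

    Returns None when the response carries no impact block.
    """
    impact = response_json.get("impact")
    if impact is None:
        return None
    return {
        "inference_ms": impact["inferenceTime"]["total"],
        "energy_wms": impact["energy"]["total"],
        "emissions_ug": impact["emissions"]["total"],
        "methodology_version": impact["version"],
    }
```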
To view energy consumption for your requests:
# View recent logs with impact data
llm logs --model greenpt-large --json | jq '.[-5:][].response_json.impact'
# Query specific energy metrics
llm logs --model greenpt-large --json | jq -r '.[] | select(.response_json.impact != null) | "\(.datetime_utc): \(.response_json.impact.energy.total) Wms, \(.response_json.impact.emissions.total) ugCO2e"'
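You can also aggregate impact data directly, since LLM stores logs in SQLite (run `llm logs path` to find the database). A minimal sketch, assuming a `responses` table with a `response_json` column holding the raw API response; the function name is illustrative:

```python
import json
import sqlite3

def total_impact(db_path):
    """Sum energy and emissions across logged GreenPT responses.

    Assumes the LLM logs schema: a `responses` table with a
    `response_json` column containing the raw API response JSON.
    """
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT response_json FROM responses WHERE response_json IS NOT NULL"
    ).fetchall()
    conn.close()
    energy_wms = emissions_ug = 0
    for (raw,) in rows:
        impact = json.loads(raw).get("impact") or {}
        energy_wms += impact.get("energy", {}).get("total", 0)
        emissions_ug += impact.get("emissions", {}).get("total", 0)
    # Convert to more familiar units on the way out
    return {"energy_wh": energy_wms / 3_600_000,
            "emissions_g": emissions_ug / 1_000_000}
```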
Each impact measurement includes:
- inferenceTime: Inference duration in milliseconds
- energy: Energy consumption in watt-milliseconds (Wms)
- emissions: Carbon emissions in micrograms CO2 equivalent (ugCO2e)
- version: Version of the impact calculation methodology
Unit conversions:
- Wms to watt-hours: Wms / 3,600,000 = Wh
- Wms to kilowatt-hours: Wms / 3,600,000,000 = kWh
- ugCO2e to grams: ugCO2e / 1,000,000 = gCO2e
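In code, these conversions are one-liners (helper names here are illustrative):

```python
def wms_to_wh(wms):
    """Watt-milliseconds to watt-hours."""
    return wms / 3_600_000

def wms_to_kwh(wms):
    """Watt-milliseconds to kilowatt-hours."""
    return wms / 3_600_000_000

def ugco2e_to_g(ug):
    """Micrograms CO2 equivalent to grams CO2 equivalent."""
    return ug / 1_000_000
```

For the sample response above, 40433 Wms works out to roughly 0.011 Wh.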
Streaming Support
Energy and emissions data is captured in both streaming and non-streaming modes:
# Streaming (default)
llm "Explain machine learning" -m greenpt-large
# Non-streaming
llm "Explain machine learning" -m greenpt-large --no-stream
In streaming mode, GreenPT sends the impact data in the final SSE chunk after the content is complete.
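A streaming consumer therefore needs to keep reading past the last content delta to catch the impact block. A sketch, assuming an OpenAI-style chunk layout where each SSE chunk has already been parsed into a dict (names are illustrative):

```python
def collect_stream(chunks):
    """Accumulate streamed content and capture the trailing impact data.

    Assumes each chunk is a parsed dict with OpenAI-style `choices` deltas;
    GreenPT attaches the `impact` block to the final chunk.
    """
    text_parts = []
    impact = None
    for chunk in chunks:
        for choice in chunk.get("choices", []):
            content = choice.get("delta", {}).get("content")
            if content:
                text_parts.append(content)
        if "impact" in chunk:  # final chunk carries the measurements
            impact = chunk["impact"]
    return "".join(text_parts), impact
```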
Available Models
| Model | Description |
|---|---|
| green-l | Large generative model optimized for sustainability |
| green-l-raw | Large model with custom system prompt support |
| green-r | Reasoning model for logic and problem-solving |
| green-r-raw | Reasoning model with custom system prompt support |
| llama-3.3-70b-instruct | Meta's Llama 3.3 70B |
| deepseek-r1-distill-llama-70b | DeepSeek R1 distilled |
| mistral-nemo-instruct-2407 | Mistral Nemo |
| qwen3-235b-a22b-instruct-2507 | Qwen3 235B |
For the full list of available models, see the GreenPT documentation.
Development
To set up this plugin locally, first checkout the code. Then create a new virtual environment:
cd llm-greenpt
python -m venv venv
source venv/bin/activate
Now install the dependencies and test dependencies:
python -m pip install -e '.[test]'
To run the tests:
python -m pytest
License
Apache 2.0