ESP Metrics CLI
A command-line tool for standardizing metrics collection, validation, and upload across all Espressif projects. This tool enables consistent metrics processing in CI/CD pipelines by providing a unified interface for generating, validating, and submitting project metrics to the central Espressif metrics service.
Why ESP Metrics CLI?
The Problem
Espressif has multiple projects (esp-idf, esp-bluetooth, esp-matter, etc.) that need to track various metrics like:
- Build metrics: Binary sizes, compilation times, memory usage
- Test metrics: Pass/fail rates, execution times, coverage
- Custom metrics: Project-specific measurements
Without standardization, each project would:
- Implement metrics collection differently
- Use incompatible data formats
- Duplicate validation and upload logic
The Solution
ESP Metrics CLI provides a standardized client-side architecture that:
- Unifies metrics collection across all Espressif projects
- Enforces consistent data schemas via validation
- Centralizes upload logic to object storage
- Enables cross-project analytics and dashboards
- Integrates seamlessly with existing CI/CD pipelines
Architecture Overview
The ESP Metrics CLI follows a three-stage workflow designed for CI/CD integration: generate metrics on the project side, validate them against a schema, and upload them to object storage. A command-level sketch follows the key principles below.
Key Principles:
- Project-side generation: Projects create custom scripts to transform raw artifacts into meaningful metrics
- Standardized validation: All metrics conform to a contracted JSON schema
- Centralized upload: Single tool handles validation and upload to object storage
- Backend integration: Uploaded metrics trigger webhook processing for dashboard/API consumption
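In practice the three stages map onto the commands a CI job runs. A minimal sketch using the CLI commands documented below (paths are placeholders):

# Stage 1: project-side generation (your own script, any language)
python tools/ci/metrics/build/generate_metrics.py

# Stages 2 and 3: schema validation and upload handled by the CLI
# (upload validates automatically; validate alone is useful locally)
esp-metrics-cli validate -d metrics_definitions.yaml -i metrics.json
esp-metrics-cli upload -d metrics_definitions.yaml -i metrics.json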
Getting Started
Installation
Choose your preferred installation method:
Option 1: Install from PyPI (Recommended)
# Using pip
pip install esp-metrics-cli
# Using uv (faster)
uv tool install esp-metrics-cli
# Verify installation
esp-metrics-cli --help
Option 2: Standalone Binary
Download pre-built binaries from the GitLab Package Registry on the project page.
Option 3: From Source (Development)
# Clone repository
cd esp-metrics-cli
# Install with uv
uv venv && source .venv/bin/activate
uv sync --all-extras
# Verify installation
python -m esp_metrics_cli.cli --help
Developer Integration Guide
Step 1: Add esp-metrics-cli to Your Project
Install the tool in your project environment:
# Add to requirements.txt
echo "esp-metrics-cli>=0.1.0" >> requirements.txt
# Or add to pyproject.toml
esp-metrics-cli = ">=0.1.0"
# Or install directly
pip install esp-metrics-cli
Step 2: Generate Metrics
Create a script that transforms your raw artifacts into JSON that satisfies the format you defined in metrics_definitions.yaml.
The script can be implemented in any language, or you can use tools like jq to produce metrics.json in the required format.
The metrics.json file should contain a JSON object where the top-level keys are metric categories (like "builds", "tests", "coverage", etc.) and the values follow the hierarchical structure you defined in your metrics_definitions.yaml. Read more in Step 3: Create Metric Definitions.
Here is an example script:
#!/usr/bin/env python3
# tools/generate_metrics.py
import json


def generate_metrics():
    """Generate metrics from your project artifacts."""
    return {
        'builds': {
            'apps/my_app/build': {
                'size': {'flash_size': 1048576, 'ram_size': 65536},
                'build': {'time': 45.2, 'warnings': 3},
            }
        },
        'tests': {'unit_tests': {'passed': 42, 'failed': 1, 'duration': 12.5}},
        'coverage': {'line_coverage': 85.3, 'branch_coverage': 78.9},
    }


if __name__ == '__main__':
    metrics = generate_metrics()
    with open('metrics.json', 'w') as f:
        json.dump(metrics, f, indent=2)
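As an alternative to a Python script, the same structure can be assembled directly in shell with jq. This is only a minimal sketch; the hard-coded values are placeholders that a real job would extract from its build artifacts:

# Hypothetical jq-based generator: values are hard-coded placeholders here;
# in a real job they would come from your build artifacts (e.g. a size report)
jq -n \
  --argjson flash_size 1048576 \
  --argjson ram_size 65536 \
  '{builds: {"apps/my_app/build": {size: {flash_size: $flash_size, ram_size: $ram_size}}}}' \
  > metrics.json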
Notes about the format of JSON file:
- Top-level keys are the metric categories and can be anything meaningful to your project.
- You can mix flat metrics and nested structures at the same level
- Metric names must use snake_case with units/dimensions at the end (e.g., time_seconds, size_bytes)
- Lists of dictionaries are supported in the metrics structure
- The JSON structure must match what you've defined in your metrics_definitions.yaml (see Step 3: Create Metric Definitions)
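For illustration, here is a small hypothetical metrics.json fragment that mixes a flat metric, a nested structure, and a list of dictionaries; all names here are invented and would need matching entries in your metrics_definitions.yaml:

{
  "build_time_seconds": 42.7,
  "builds": {
    "apps/my_app/build": {
      "size": {"flash_size_bytes": 1048576}
    }
  },
  "test_cases": [
    {"name": "wifi_connect", "duration_seconds": 1.2},
    {"name": "ble_scan", "duration_seconds": 3.4}
  ]
}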
Step 3: Create Metric Definitions
Definition files (metrics_definitions.yaml) serve multiple purposes:
- Schema validation: Ensures your metrics.json file has the correct structure and data types
- Documentation: Describes what each metric represents and its purpose
- Type safety: Prevents errors by enforcing consistent data types across uploads
- API contract: Defines the expected format for the service consuming your metrics
Schema Structure
The schema follows the JSON Schema specification and should contain:
- type: object - Root level must be an object
- properties - Define your top-level metric categories (e.g. builds, tests, coverage)
- description - Human-readable explanation of what each section contains
- Data types - number, integer, string, boolean, array, object
- patternProperties - For dynamic keys (like build paths or test suite names)
Create a metrics_definitions.yaml file that defines your project's metrics schema. Look at the example below:
type: object
properties:
  builds:
    type: object
    description: "Build-related metrics organized by path"
    patternProperties:
      "^.+$":  # Matches any build path like 'apps/my_app/build'
        type: object
        properties:
          size:
            type: object
            properties:
              flash_size:
                type: integer
                unit: "bytes"
                friendly_name: "Flash Memory Size"
                description: "Flash memory usage in bytes"
              ram_size:
                type: integer
                unit: "bytes"
                friendly_name: "RAM Memory Size"
                description: "RAM memory usage in bytes"
          build:
            type: object
            properties:
              time:
                type: number
                unit: "seconds"
                friendly_name: "Build Time"
                description: "Build compilation time in seconds"
              warnings:
                type: integer
                unit: "count"
                friendly_name: "Build Warnings"
                description: "Number of compiler warnings"
  tests:
    type: object
    description: "Test-related metrics organized by test suite"
    patternProperties:
      "^.+$":  # Matches any test suite name like 'unit_tests'
        type: object
        properties:
          passed:
            type: integer
            unit: "count"
            friendly_name: "Passed Tests"
            description: "Number of passed tests"
          failed:
            type: integer
            unit: "count"
            friendly_name: "Failed Tests"
            description: "Number of failed tests"
          duration:
            type: number
            unit: "seconds"
            friendly_name: "Test Duration"
            description: "Total test execution time in seconds"
  coverage:
    type: object
    description: "Code coverage metrics"
    properties:
      line_coverage:
        type: number
        unit: "percent"
        friendly_name: "Line Coverage"
        description: "Line coverage percentage"
      branch_coverage:
        type: number
        unit: "percent"
        friendly_name: "Branch Coverage"
        description: "Branch coverage percentage"
Step 4: Integrate with CI Pipeline
File Organization
To keep your metrics infrastructure organized and maintainable, create a centralized structure for all metrics-related files.
As an example:
project-root/
├── tools/
│ └── ci/
│ └── metrics/
│ ├── build/
│ │ ├── metrics_definitions.yaml
│ │ └── generate_metrics.py
│ ├── tests/
│ │ ├── metrics_definitions.yaml
│ │ └── generate_metrics.py
│ └── coverage/
│ ├── metrics_definitions.yaml
│ └── generate_metrics.py
└── .gitlab-ci.yml
Key principles:
- Centralized location: All metrics files under one location, e.g. tools/ci/metrics/
- Category-based organization: Separate directories for build/, tests/, coverage/, etc.
- Consistent naming: Each category has its own metrics_definitions.yaml and generate_metrics.py
- Version controlled: All files committed to your repository for reproducibility
Update your existing jobs to generate and upload metrics. Make sure the following variables are defined in your GitLab CI settings:
- ESP_METRICS_S3_SERVER
- ESP_METRICS_S3_ACCESS_KEY
- ESP_METRICS_S3_SECRET_KEY
- ESP_METRICS_S3_LONG_TERM_BUCKET
- ESP_METRICS_S3_SHORT_TERM_BUCKET
The CLI automatically uses the following GitLab CI built-in variables; make sure they are available in the variables section:
- CI_PROJECT_URL - for project identification
- CI_PROJECT_ID - for project identification
- CI_COMMIT_SHA - for commit identification
- CI_COMMIT_REF_NAME - for branch identification (optional, auto-detected if not set via ESP_METRICS_BRANCH_NAME)
For example, update the build job that produces build-related metrics in your GitLab CI configuration:
# .gitlab-ci.yml
variables:
  ESP_METRICS_PROJECT_URL: "$CI_PROJECT_URL"
  ESP_METRICS_PROJECT_ID: "$CI_PROJECT_ID"
  ESP_METRICS_COMMIT_SHA: "$CI_COMMIT_SHA"
  ESP_METRICS_BRANCH_NAME: "$CI_COMMIT_REF_NAME"

build_job:
  stage: build
  script:
    - make build
    - pip install esp-metrics-cli
    - python tools/ci/metrics/build/generate_metrics.py > metrics.json
    - esp-metrics-cli upload -d tools/ci/metrics/build/metrics_definitions.yaml -i metrics.json --verbose
Metrics Validation
Local Validation
Validate your metrics before CI upload:
# Basic validation
esp-metrics-cli validate -d definitions.yaml -i metrics.json
# Verbose validation with detailed output
esp-metrics-cli validate -d definitions.yaml -i metrics.json --verbose
Validation checks:
- Required fields present
- Data types match schema
- Custom validation rules
Command Reference
esp-metrics-cli init
Create a configuration file with default settings. The config file (git-ignored by default) is intended for local development; for production environments, environment variables are recommended.
esp-metrics-cli init [OPTIONS]
Options:
- --path TEXT: Configuration file path (default: .esp_metrics.toml)
- --force: Overwrite existing configuration file
Examples:
# Create default config
esp-metrics-cli init
# Create config with custom path
esp-metrics-cli init --path config/metrics.toml
# Overwrite existing config
esp-metrics-cli init --force
esp-metrics-cli validate
Validate metrics against schema definitions.
esp-metrics-cli validate [OPTIONS]
Required Options:
- -d, --definitions PATH: Path to definitions.yaml file
- -i, --input PATH: Path to metrics.json file
Optional Options:
- -v, --verbose: Enable detailed output
- -o, --output PATH: Path to save validated metrics data as JSON file
Examples:
# Basic validation
esp-metrics-cli validate -d definitions.yaml -i metrics.json
# Verbose validation
esp-metrics-cli validate -d definitions.yaml -i metrics.json --verbose
# Validation with output file to save validated metrics
esp-metrics-cli validate -d definitions.yaml -i metrics.json --output validated_metrics.json
esp-metrics-cli upload
Upload metrics to the service. Note: This command automatically validates metrics before upload, so a separate validation step is not required.
esp-metrics-cli upload [OPTIONS]
Required Options:
- -d, --definitions PATH: Path to definitions.yaml file
- -i, --input PATH: Path to metrics.json file
Optional Options:
- -v, --verbose: Enable detailed output
- --bucket-type CHOICE: Override bucket selection (auto|long_term|short_term|custom)
- --custom-bucket TEXT: Custom bucket name (required with --bucket-type=custom)
Examples:
# Upload with auto bucket detection
esp-metrics-cli upload -d definitions.yaml -i metrics.json
# Force upload to long-term bucket
esp-metrics-cli upload -d definitions.yaml -i metrics.json --bucket-type long_term
# Upload to custom bucket
esp-metrics-cli upload -d definitions.yaml -i metrics.json --bucket-type custom --custom-bucket my-bucket
# Verbose upload with detailed validation output
esp-metrics-cli upload -d definitions.yaml -i metrics.json --verbose
Global Options
Available for all commands:
- -c, --config-file PATH: Path to configuration file (for local development)
- -h, --help: Show help message
Examples:
# Use custom config file
esp-metrics-cli -c /path/to/config.toml validate -d definitions.yaml -i metrics.json
esp-metrics-cli upload -d definitions.yaml -i metrics.json
Configuration
ESP Metrics CLI supports two configuration methods depending on your use case:
Configuration File (Recommended for Local Development)
For local development and testing, use the .esp_metrics.toml configuration file. This method provides:
- Easy editing
- Clear documentation with inline comments
- Persistent settings across development sessions
- Template generation via esp-metrics-cli init
Create and customize your configuration file:
# Generate default configuration file
esp-metrics-cli init
The .esp_metrics.toml file contains all configuration settings with helpful comments. Look at the example.
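The authoritative template comes from esp-metrics-cli init itself; the snippet below is only an illustrative sketch that assumes keys mirroring the ESP_METRICS_* environment variables documented in this guide, and the actual key names may differ:

# .esp_metrics.toml - illustrative sketch only; run `esp-metrics-cli init`
# to generate the real template. Key names are assumed, values are placeholders.
s3_server = "https://minio.example.com"
s3_access_key = "local-dev-access-key"
s3_secret_key = "local-dev-secret-key"
s3_long_term_bucket = "esp-metrics-long-term"
s3_short_term_bucket = "esp-metrics-short-term"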
Environment Variables (Required for CI/CD)
For CI/CD pipelines, use environment variables instead of configuration files. This approach provides:
- Secure credential handling via CI/CD secrets
- No file management
- Dynamic configuration per pipeline run
All configuration can be set via environment variables with the ESP_METRICS_ prefix.
You can take a look at the example.
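Expressed as plain shell exports (the server URL and bucket names are placeholders; in GitLab these would normally be masked CI/CD variables), the variables documented in this guide look like this:

# S3 connection and bucket settings (values are placeholders)
export ESP_METRICS_S3_SERVER="https://minio.example.com"
export ESP_METRICS_S3_ACCESS_KEY="<access-key>"
export ESP_METRICS_S3_SECRET_KEY="<secret-key>"
export ESP_METRICS_S3_LONG_TERM_BUCKET="esp-metrics-long-term"
export ESP_METRICS_S3_SHORT_TERM_BUCKET="esp-metrics-short-term"

# Project and commit identification (normally taken from GitLab CI built-ins)
export ESP_METRICS_PROJECT_URL="$CI_PROJECT_URL"
export ESP_METRICS_PROJECT_ID="$CI_PROJECT_ID"
export ESP_METRICS_COMMIT_SHA="$CI_COMMIT_SHA"
export ESP_METRICS_BRANCH_NAME="$CI_COMMIT_REF_NAME"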
Bucket Types
The tool supports different bucket types for upload destinations:
- auto (default): Automatically selects bucket based on branch name
  - Release branches → long-term bucket
  - Other branches → short-term bucket
- long_term: Always use long-term bucket (for releases)
- short_term: Always use short-term bucket (for development)
- custom: Use custom bucket name (requires --custom-bucket)
Integration Points
CI/CD Pipeline Integration
ESP Metrics CLI provides a reusable template for uploading metrics generated by your existing jobs. The example is shown in Step 4: Integrate with CI Pipeline.
MinIO Object Storage
The tool uploads metrics to MinIO-compatible S3 storage:
- Authentication: S3 access/secret keys
- Bucket selection: Automatic or manual bucket routing
- File naming: Structured naming convention for easy retrieval
- Retry logic: Automatic retry on upload failures
Upload path structure:
bucket/
├── project_host/
│ ├── projectid_project_path/
│ │ ├── commit_sha/
│ │ │ └── {timestamp}.json
│ │ ├── commit_sha/
│ │ │ └── {timestamp}.json
│ │ └── ...
│ ├── projectid_project_path/
│ │ └── ...
│ └── ...
├── project_host/
│ └── ...
└── ...
Example:
esp-metrics-bucket/
├── https_gitlab_example_com_8080/
│ ├── 1111_myorg_esp-project/
│ │ ├── abc123def456/
│ │ │ ├── 2024-01-15_14-30-00.507.json
│ │ │ └── 2024-01-15_15-45-30.507.json
│ │ └── def789ghi012/
│ │ └── 2024-01-16_09-15-22.507.json
│ └── 2222_myorg_esp-components/
│ └── xyz987uvw654/
│ └── 2024-01-17_11-20-45.507.json
├── https_github_example_com/
│ └── 3333_myteam_my-project/
│ └── 1a2b3c4d5e6f/
│ └── 2024-01-18_16-55-10.507.json
└── https_git_example_com/
└── 4444_devteam_repo-name/
└── 9z8y7x6w5v4u/
└── 2024-01-19_12-00-00.507.json
Backend Metrics Service
Uploaded metrics trigger backend processing:
- Webhook trigger: MinIO notifies backend service of new uploads
- Data validation: Backend re-validates metrics schema
- Database storage: Metrics stored in MongoDB for analytics
- API exposure: Metrics available via REST API
- Dashboard updates: Real-time dashboard updates
Integration flow:
CLI Upload → MinIO → Webhook → Backend Service → MongoDB → Dashboard/API
Local Development Workflow
Developer workflow for testing metrics locally using configuration files:
# 1. Set up development environment with configuration file
esp-metrics-cli init
# Edit .esp_metrics.toml with your S3 settings and preferences
# 2. Generate test metrics
python tools/generate_metrics.py
# 3. Upload and validate in one step (if S3 credentials are configured)
esp-metrics-cli upload -d definitions.yaml -i metrics.json --verbose
# Optional: Validate only (for testing schema without uploading)
esp-metrics-cli validate -d definitions.yaml -i metrics.json --verbose
Development
This project uses modern Python development tools for maintainability and code quality.
Development setup:
# Clone repository
cd esp-metrics-cli
# Install with development dependencies
uv venv && source .venv/bin/activate
uv sync --all-extras
# Install pre-commit hooks
pre-commit install && pre-commit autoupdate
# Run tests
pytest
# Run quality checks
pre-commit run --all-files
Tools used:
- UV: Fast package manager
- Hatchling: Modern build system
- Ruff: Fast linter and formatter
- Pytest: Testing framework
- Pre-commit: Code quality hooks
- Mypy: Type checking
Download files
Download the file for your platform.
File details
Details for the file esp_metrics_cli-0.3.0.tar.gz.
File metadata
- Download URL: esp_metrics_cli-0.3.0.tar.gz
- Upload date:
- Size: 221.9 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.18
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 608ce1e3b2b6c942ff482db98335a7445acee3adb07a3053e495ff62c78d777c |
| MD5 | a7a7a1d7bee4c482354b7d8c6db60be5 |
| BLAKE2b-256 | 96f6e33c7875b58f3130ac78b0bba5db6bb8fd0ad6e275b9a33a073de5725651 |
File details
Details for the file esp_metrics_cli-0.3.0-py3-none-any.whl.
File metadata
- Download URL: esp_metrics_cli-0.3.0-py3-none-any.whl
- Upload date:
- Size: 32.0 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.18
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 1181ff5f44b71828fca5a6c4c426322d4312dce6affd31fa952e1be529a45cb9 |
| MD5 | 62bf182de7d0c93a34a76161f5844470 |
| BLAKE2b-256 | 2b76c265fbaee3d047678d3487ca5082d0f4f6cbe8486bab5efb32015a4623b3 |