
HTML pipeline report renderer for Cogniflow

Project description

cf-pipeline-report

Minimal HTML report renderer for Cogniflow pipeline manifests.

Install the published distribution from PyPI:

pip install cf-pipeline-report

Status: early scaffolding. The renderer outputs a static HTML shell that uses Observable Runtime (observablehq) to fetch a JSON payload and render a live pipeline view. The payload can be served by a small API so the report stays decoupled from the running pipeline.

Quick start (static JSON payload):

cf pipeline report-render sandcastle/cf_pipeline/cf_pipeline_engine/examples/opcua_fifo_avg_to_duckdb_parquet_triggered.jsonld
python -m http.server --directory sandcastle/cf_pipeline/cf_pipeline_engine/examples 8000
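
For reference, the static server can also be started from Python; this is just the standard-library equivalent of the python -m http.server command above.

# Stdlib equivalent of `python -m http.server --directory <examples> 8000`:
# serve the rendered report shell and its JSON payload over HTTP.
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

handler = partial(
    SimpleHTTPRequestHandler,
    directory="sandcastle/cf_pipeline/cf_pipeline_engine/examples",
)
ThreadingHTTPServer(("127.0.0.1", 8000), handler).serve_forever()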

Live API (DuckDB-backed):

cf pipeline report sandcastle/cf_pipeline/cf_pipeline_engine/examples/opcua_fifo_avg_to_duckdb_parquet_triggered.jsonld --duckdb path/to/pipeline.duckdb

Then open http://127.0.0.1:8765/ in the browser.
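
A quick smoke test from Python, assuming the server kept its default 127.0.0.1:8765 bind:

import urllib.request

# Expect HTTP 200 once the report server is up and serving the report shell.
with urllib.request.urlopen("http://127.0.0.1:8765/") as resp:
    print(resp.status)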

Multiple pipelines (directory of manifests):

cf pipeline report sandcastle/cf_pipeline/cf_pipeline_engine/examples --duckdb-dir path/to/duckdb

The server will expose one report per pipeline and look for files named <pipeline_id>.duckdb in the provided directory (or nested under <pipeline_id>/<pipeline_id>.duckdb).
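
The lookup rule can be pictured with this minimal sketch (an illustration of the behaviour described above, not the actual implementation):

from pathlib import Path

def resolve_duckdb(duckdb_dir: Path, pipeline_id: str) -> Path | None:
    # Flat layout: <duckdb_dir>/<pipeline_id>.duckdb
    flat = duckdb_dir / f"{pipeline_id}.duckdb"
    # Nested layout: <duckdb_dir>/<pipeline_id>/<pipeline_id>.duckdb
    nested = duckdb_dir / pipeline_id / f"{pipeline_id}.duckdb"
    for candidate in (flat, nested):
        if candidate.is_file():
            return candidate
    return None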

Live tables/plots read from the DuckDB view pipeline_data created by the Parquet archive sink. Ensure the pipeline writes to DuckDB before opening the report, or the live sections will show empty data.
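
To confirm the sink has written data before opening the report, you can query the view directly (requires the duckdb Python package; the file path is whatever you passed via --duckdb):

import duckdb

# A zero row count means the live sections of the report will render empty.
con = duckdb.connect("path/to/pipeline.duckdb", read_only=True)
rows = con.execute("SELECT COUNT(*) FROM pipeline_data").fetchone()[0]
print(f"pipeline_data rows: {rows}")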

If no --duckdb/--duckdb-dir is provided, the server will try to resolve the DuckDB file from the manifest by reading pipelineId and baseDir parameters on the ParquetArchiveSinkStep. In this repo the recommended default is sandcastle/data_hive/<pipelineId>/<pipelineId>.duckdb.
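
A rough sketch of that fallback, assuming the manifest is plain JSON-LD and that pipelineId and baseDir appear as literal keys on the step node (the exact manifest layout may differ):

import json
from pathlib import Path

def duckdb_from_manifest(manifest_path: str) -> Path | None:
    manifest = json.loads(Path(manifest_path).read_text())
    # The "steps" key and step shape below are assumptions for illustration.
    for step in manifest.get("steps", []):
        if "ParquetArchiveSinkStep" in str(step.get("@type", "")):
            pid, base = step.get("pipelineId"), step.get("baseDir")
            if pid and base:
                # Repo default resolves to sandcastle/data_hive/<pipelineId>/<pipelineId>.duckdb
                return Path(base) / pid / f"{pid}.duckdb"
    return None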

Publishing

cf_pipeline_report is published with the dedicated Windows workflow:

  • Workflow: .github/workflows/cf_pipeline_report_windows_publish.yml
  • Package directory: sandcastle/cf_pipeline/cf_pipeline_report
  • PyPI tag: cf-pipeline-report-v<version>
  • TestPyPI tag: cf-pipeline-report-v<version>-test

Local preflight:

powershell -ExecutionPolicy Bypass -File scripts/mimic_windows_python_publish_workflow.ps1 `
  -WorkflowFile .github/workflows/cf_pipeline_report_windows_publish.yml `
  -PackageDir sandcastle/cf_pipeline/cf_pipeline_report `
  -PythonExe py `
  -PythonVersion 3.13

Queue a dry-run dispatch:

powershell -ExecutionPolicy Bypass -File scripts/queue_windows_python_publish_workflow.ps1 `
  -WorkflowFile .github/workflows/cf_pipeline_report_windows_publish.yml `
  -PackageDir sandcastle/cf_pipeline/cf_pipeline_report `
  -PublishTarget testpypi `
  -Ref main `
  -RequireLocalPass `
  -DryRun

Download files

Download the file for your platform.

Source Distribution

cf_pipeline_report-0.0.1.tar.gz (23.8 kB, Source)

Built Distribution

cf_pipeline_report-0.0.1-py3-none-any.whl (24.1 kB, Python 3)

File details

Details for the file cf_pipeline_report-0.0.1.tar.gz.

File metadata

  • Download URL: cf_pipeline_report-0.0.1.tar.gz
  • Size: 23.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.2.0 CPython/3.13.12

File hashes

Hashes for cf_pipeline_report-0.0.1.tar.gz:

  • SHA256: 61570b804e148ccc6322bffd0707e55369eb98420bb284fbdb70cf2a9f97c8e7
  • MD5: 197955d93a6642df1483ea728df8b6f3
  • BLAKE2b-256: 0e9500121cb6791320dae75b2c9f07e3303aa46967b146e9528bd136d4ab2131

File details

Details for the file cf_pipeline_report-0.0.1-py3-none-any.whl.

File hashes

Hashes for cf_pipeline_report-0.0.1-py3-none-any.whl:

  • SHA256: c20bf36d141a3c2e656dd61ae53bea8c96d97056d793e7d1e583f74da2507ff5
  • MD5: 8e37d9d344d967cb112805a66eb8da66
  • BLAKE2b-256: 3a4e7eff5f446420f0508014bf937258dfb15a3e0f43c56b273969f2440e3d6b
