HTML pipeline report renderer for Cogniflow

Project description

cf-pipeline-report

Minimal HTML report renderer for Cogniflow pipeline manifests.

Install the published distribution from PyPI:

pip install cf-pipeline-report

Status: early scaffolding. The renderer outputs a static HTML shell that uses Observable Runtime (observablehq) to fetch a JSON payload and render a live pipeline view. The payload can be served by a small API so the report stays decoupled from the running pipeline.
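
For a feel of that decoupling, here is a minimal sketch of a payload endpoint using only the Python standard library. The /payload.json route and the payload shape are illustrative assumptions, not the package's actual API:

  import json
  from http.server import BaseHTTPRequestHandler, HTTPServer

  # Placeholder payload; the real payload shape is defined by the renderer.
  PAYLOAD = {"pipelineId": "example", "steps": []}

  class PayloadHandler(BaseHTTPRequestHandler):
      def do_GET(self):
          if self.path == "/payload.json":  # assumed route, for illustration
              body = json.dumps(PAYLOAD).encode("utf-8")
              self.send_response(200)
              self.send_header("Content-Type", "application/json")
              self.send_header("Content-Length", str(len(body)))
              self.end_headers()
              self.wfile.write(body)
          else:
              self.send_error(404)

  # Arbitrary port for the sketch; the HTML shell would fetch from here.
  HTTPServer(("127.0.0.1", 8001), PayloadHandler).serve_forever()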

Quick start (static JSON payload):

cf pipeline report-render sandcastle/cf_pipeline/cf_pipeline_engine/examples/opcua_fifo_avg_to_duckdb_parquet_triggered.jsonld
python -m http.server --directory sandcastle/cf_pipeline/cf_pipeline_engine/examples 8000

Live API (DuckDB-backed):

cf pipeline report sandcastle/cf_pipeline/cf_pipeline_engine/examples/opcua_fifo_avg_to_duckdb_parquet_triggered.jsonld --duckdb path/to/pipeline.duckdb

Then open http://127.0.0.1:8765/ in your browser.
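
To script a quick liveness check against that URL (stdlib only; the port matches the command above):

  import urllib.request

  # Raises URLError if the report server is not reachable yet.
  with urllib.request.urlopen("http://127.0.0.1:8765/", timeout=5) as resp:
      print(resp.status, resp.headers.get("Content-Type"))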

Multiple pipelines (directory of manifests):

cf pipeline report sandcastle/cf_pipeline/cf_pipeline_engine/examples --duckdb-dir path/to/duckdb

The server will expose one report per pipeline and look for files named <pipeline_id>.duckdb in the provided directory (or nested under <pipeline_id>/<pipeline_id>.duckdb).
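
That lookup rule, expressed as a small Python sketch (the actual resolution logic lives inside the server; this just mirrors the description above):

  from pathlib import Path

  def find_duckdb(duckdb_dir: str, pipeline_id: str) -> Path | None:
      # Try the flat layout first, then the nested layout.
      base = Path(duckdb_dir)
      for candidate in (base / f"{pipeline_id}.duckdb",
                        base / pipeline_id / f"{pipeline_id}.duckdb"):
          if candidate.is_file():
              return candidate
      return None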

Live tables and plots read from the DuckDB view pipeline_data, which is created by the Parquet archive sink. Make sure the pipeline has written to DuckDB before opening the report; otherwise the live sections will render empty.
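
A quick way to confirm the view is populated before opening the report (this uses the duckdb Python package directly; it is a sanity check, not part of cf-pipeline-report):

  import duckdb

  # Read-only so the check cannot interfere with a pipeline writing to the file.
  con = duckdb.connect("path/to/pipeline.duckdb", read_only=True)
  (rows,) = con.execute("SELECT count(*) FROM pipeline_data").fetchone()
  print(f"pipeline_data rows: {rows}")
  con.close()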

If neither --duckdb nor --duckdb-dir is provided, the server tries to resolve the DuckDB file from the manifest by reading the pipelineId and baseDir parameters on the ParquetArchiveSinkStep. In this repo the recommended default is sandcastle/data_hive/<pipelineId>/<pipelineId>.duckdb.
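
The fallback could look roughly like the sketch below. Only the pipelineId and baseDir parameter names and the default layout come from the behavior described above; the manifest keys used here ("steps", "@type", "parameters") are illustrative assumptions:

  import json
  from pathlib import Path

  def resolve_duckdb_from_manifest(manifest_path: str) -> Path | None:
      doc = json.loads(Path(manifest_path).read_text(encoding="utf-8"))
      for step in doc.get("steps", []):  # assumed manifest layout
          if "ParquetArchiveSinkStep" in str(step.get("@type", "")):
              params = step.get("parameters", {})  # assumed key
              pid, base = params.get("pipelineId"), params.get("baseDir")
              if pid and base:
                  # e.g. sandcastle/data_hive/<pipelineId>/<pipelineId>.duckdb
                  return Path(base) / pid / f"{pid}.duckdb"
      return None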

Publishing

cf_pipeline_report is published with the dedicated Windows workflow:

  • Workflow: .github/workflows/cf_pipeline_report_windows_publish.yml
  • Package directory: sandcastle/cf_pipeline/cf_pipeline_report
  • PyPI tag: cf-pipeline-report-v<version>
  • TestPyPI tag: cf-pipeline-report-v<version>-test
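
For example, pushing a tag like cf-pipeline-report-v0.0.2 would correspond to a PyPI release and cf-pipeline-report-v0.0.2-test to a TestPyPI one, assuming the workflow keys releases off these tag patterns as the naming suggests.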

Local preflight:

powershell -ExecutionPolicy Bypass -File scripts/mimic_windows_python_publish_workflow.ps1 `
  -WorkflowFile .github/workflows/cf_pipeline_report_windows_publish.yml `
  -PackageDir sandcastle/cf_pipeline/cf_pipeline_report `
  -PythonExe py `
  -PythonVersion 3.13

Queue a dry-run dispatch:

powershell -ExecutionPolicy Bypass -File scripts/queue_windows_python_publish_workflow.ps1 `
  -WorkflowFile .github/workflows/cf_pipeline_report_windows_publish.yml `
  -PackageDir sandcastle/cf_pipeline/cf_pipeline_report `
  -PublishTarget testpypi `
  -Ref main `
  -RequireLocalPass `
  -DryRun

Download files

Download the file for your platform.

Source Distribution

cf_pipeline_report-0.0.2.tar.gz (23.8 kB)

Built Distribution

cf_pipeline_report-0.0.2-py3-none-any.whl (24.1 kB)

File details

Details for the file cf_pipeline_report-0.0.2.tar.gz.

File metadata

  • Download URL: cf_pipeline_report-0.0.2.tar.gz
  • Upload date:
  • Size: 23.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.2.0 CPython/3.13.12

File hashes

Hashes for cf_pipeline_report-0.0.2.tar.gz:

  • SHA256: a84a5e0fb83b6cb6b8656362b58f495918142615256a082c04a69a27a9401cd5
  • MD5: e5dbbea76dab87b0850d3377119fd7ed
  • BLAKE2b-256: 22f8ac0723c9efacb4b54f738e6326ed7e91405776f2867babc52e2929eec0fa
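
To verify a downloaded archive against the SHA256 digest above (plain hashlib; the local path assumes the file sits in the current directory):

  import hashlib
  from pathlib import Path

  EXPECTED = "a84a5e0fb83b6cb6b8656362b58f495918142615256a082c04a69a27a9401cd5"

  # Hash the downloaded sdist and compare against the published digest.
  digest = hashlib.sha256(
      Path("cf_pipeline_report-0.0.2.tar.gz").read_bytes()
  ).hexdigest()
  print("OK" if digest == EXPECTED else f"MISMATCH: {digest}")

The same check applies to the wheel using its digest below.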

File details

Details for the file cf_pipeline_report-0.0.2-py3-none-any.whl.

File metadata

  • Size: 24.1 kB
  • Tags: Python 3

File hashes

Hashes for cf_pipeline_report-0.0.2-py3-none-any.whl:

  • SHA256: 27581c395554bf597ed25459d39fb641957f294ef17b3eb873768bb11a2bb2b2
  • MD5: b89ef01f8ebd5f43606da4f9b262bfd6
  • BLAKE2b-256: 74831842b70986130b251ea5bb189df4913919fadd7ef4dc85ca5d545e292c58
