
Cogniflow pipeline engine (C++ core) with a thin Python package wrapper.


cf-pipeline-engine

The Cogniflow pipeline engine implemented in C++ (compiler, scheduler, runtime).

This folder is structured as a Python package for consistency with the other cf_* components, even though the core implementation is native C++ and built via CMake.

Build (CMake)

cmake -S . -B build
cmake --build build

Python package wrapper

The Python package is intentionally thin and provides access to the packaged native engine assets:

  • cf_pipeline_engine.cf_pipeline_v2_path()
  • cf_pipeline_engine.cf_siggen_path()
  • cf_pipeline_engine.cf_engine_include_path()
  • cf_pipeline_engine.cf_type_registry_path()
  • cf_pipeline_engine.resolve_cf_pipeline_v2_executable()
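The path helpers above return filesystem locations for the packaged assets. As an illustration of the kind of lookup resolve_cf_pipeline_v2_executable() performs, here is a self-contained sketch; the helper names in the list are from this README, but the resolution order shown (packaged binary first, then PATH) is an assumption, not documented behavior:

```python
import os
import shutil
from pathlib import Path
from typing import Optional


def resolve_engine_executable(package_bin: Path) -> Optional[Path]:
    """Sketch of executable resolution: packaged binary first, then PATH.

    `package_bin` stands in for the directory cf_pipeline_v2_path() would
    report; the fallback to PATH mirrors a plausible, not confirmed, design.
    """
    name = "cf_pipeline_v2.exe" if os.name == "nt" else "cf_pipeline_v2"
    candidate = package_bin / name
    if candidate.is_file():
        return candidate
    found = shutil.which("cf_pipeline_v2")
    return Path(found) if found else None
```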

Published distribution name:

pip install cf-pipeline-engine

The published wheel installs:

  • bin/cf_pipeline_v2(.exe)
  • bin/cf_siggen(.exe)
  • bin/type_registry.v0.json
  • include/*.h
  • examples/opcua_fifo_avg_to_duckdb_parquet_triggered.nq
  • examples/invocations/opcua_fifo_avg.endpoint_4841.invocation.nq
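Because these files ship inside the wheel, they can also be reached generically with importlib.resources; a sketch assuming the assets live under the installed package directory (the dedicated helpers above are the supported route):

```python
from importlib.resources import files


def packaged_asset(package: str, relpath: str):
    """Return a Traversable for a file shipped inside an installed package.

    e.g. packaged_asset("cf_pipeline_engine", "bin/type_registry.v0.json"),
    assuming the asset layout mirrors the wheel contents listed above.
    """
    return files(package).joinpath(relpath)
```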

The native sink path no longer embeds cf_datahive_cpp sources into the engine build. Instead, cf-pipeline-engine links against the packaged native consumer surface exported by cf-datahive.

The Python wrapper also exposes the packaged mini-demo resources:

  • cf_pipeline_engine.cf_demo_pipeline_path()
  • cf_pipeline_engine.cf_demo_invocation_path()

cfio:OpcuaReaderStep is now runner-owned as well. The engine executes that step through the installed cf-opcua-server owner surface instead of loading protocol code from cf-basic-io. If the console script is not on PATH, set CF_OPCUA_SERVER_CMD to the absolute script or executable path.
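A minimal sketch of that owner-command lookup, assuming the runner prefers CF_OPCUA_SERVER_CMD and otherwise searches PATH for the console script (the environment variable name is from this README; the lookup order is an assumption):

```python
import os
import shutil
from typing import Optional


def find_opcua_server_cmd(env: Optional[dict] = None) -> Optional[str]:
    """Locate the cf-opcua-server owner command.

    CF_OPCUA_SERVER_CMD, when set, wins outright; otherwise fall back to
    the console script on PATH. Returns None when neither is available.
    """
    env = os.environ if env is None else env
    override = env.get("CF_OPCUA_SERVER_CMD")
    if override:
        return override
    return shutil.which("cf-opcua-server")
```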

Python CLI

The legacy standalone Python wrapper package has been retired. Its CLI surface now lives in this cf_pipeline_engine package.

Direct invocation remains policy-gated by CF_ALLOW_DIRECT_ENGINE:

CF_ALLOW_DIRECT_ENGINE=1 python -m cf_pipeline_engine.cli \
  --pipeline sandcastle/cf_pipeline/cf_pipeline_engine/examples/opcua_fifo_avg_to_duckdb_parquet_triggered.nq \
  --interval 1 \
  --duration 10

Equivalent module entrypoint:

CF_ALLOW_DIRECT_ENGINE=1 python -m cf_pipeline_engine --help
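The gate itself is easy to reason about; a sketch of the check (the flag name comes from this README, while treating only the literal "1" as enabled is an assumption):

```python
import os


def direct_engine_allowed(env=None) -> bool:
    # Direct engine invocation is opt-in via CF_ALLOW_DIRECT_ENGINE=1.
    env = os.environ if env is None else env
    return env.get("CF_ALLOW_DIRECT_ENGINE") == "1"
```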

Publishing

cf_pipeline_engine is published with the dedicated Windows workflow:

  • Workflow: .github/workflows/cf_pipeline_engine_windows_publish.yml
  • Package directory: sandcastle/cf_pipeline/cf_pipeline_engine
  • PyPI tag: cf-pipeline-engine-v<version>
  • TestPyPI tag: cf-pipeline-engine-v<version>-test
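The tag convention can be captured in a one-liner, which is handy when scripting releases (the version value used below is illustrative):

```python
def release_tag(version: str, test: bool = False) -> str:
    # cf-pipeline-engine-v<version>, with a -test suffix for TestPyPI.
    suffix = "-test" if test else ""
    return f"cf-pipeline-engine-v{version}{suffix}"
```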

Local preflight:

powershell -ExecutionPolicy Bypass -File scripts/mimic_windows_python_publish_workflow.ps1 `
  -WorkflowFile .github/workflows/cf_pipeline_engine_windows_publish.yml `
  -PackageDir sandcastle/cf_pipeline/cf_pipeline_engine `
  -PythonExe py `
  -PythonVersion 3.14

Queue a dry-run dispatch:

powershell -ExecutionPolicy Bypass -File scripts/queue_windows_python_publish_workflow.ps1 `
  -WorkflowFile .github/workflows/cf_pipeline_engine_windows_publish.yml `
  -PackageDir sandcastle/cf_pipeline/cf_pipeline_engine `
  -PublishTarget testpypi `
  -Ref main `
  -RequireLocalPass `
  -DryRun

OPC UA Demo Pipeline Sink

The existing demo pipeline examples/opcua_fifo_avg_to_duckdb_parquet_triggered.nq now uses cfsink:DataHiveParquetSinkStep from cf_basic_sinks.

Its cfio:OpcuaReaderStep ingress remains in cf-basic-io as a declarative step definition, while the runner performs the actual OPC UA snapshot through cf-opcua-server.

The sink step likewise stays declarative in cf_basic_sinks, while the runner executes it through the packaged C++ gatekeeper library exported by cf-datahive, producing one committed data hive run with 20 rows (cycle_id 0..19) and no archive.jsonl.

Run the one-click demo:

.\scripts\fresh_install_v2.ps1 -Clean -RunDemo -ShowDetails

Expected output layout after one demo session:

  • workspace/<data_hive>/opcua_fifo_avg/latest.txt
  • workspace/<data_hive>/opcua_fifo_avg/runs/<run_id>/manifest.json
  • workspace/<data_hive>/opcua_fifo_avg/runs/<run_id>/tables/measurements/part-*.parquet
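Given that layout, the committed run can be located by following latest.txt; a hedged sketch, assuming latest.txt holds a single run id and manifest.json is plain JSON (both assumptions about file contents, not documented formats):

```python
import json
from pathlib import Path


def load_latest_manifest(dataset_dir: Path) -> dict:
    """Follow latest.txt to the newest run's manifest.json.

    `dataset_dir` is the workspace/<data_hive>/opcua_fifo_avg directory
    from the layout above; latest.txt is assumed to name a directory
    under runs/.
    """
    run_id = (dataset_dir / "latest.txt").read_text().strip()
    manifest_path = dataset_dir / "runs" / run_id / "manifest.json"
    return json.loads(manifest_path.read_text())
```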

Download files


Source Distribution

cf_pipeline_engine-0.2.6.tar.gz (78.7 kB)

Uploaded: Source

Built Distribution


cf_pipeline_engine-0.2.6-cp314-cp314-win_amd64.whl (12.7 MB)

Uploaded: CPython 3.14, Windows x86-64

File details

Details for the file cf_pipeline_engine-0.2.6.tar.gz.

File metadata

  • Download URL: cf_pipeline_engine-0.2.6.tar.gz
  • Upload date:
  • Size: 78.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes

Hashes for cf_pipeline_engine-0.2.6.tar.gz:

  • SHA256: c3f0a455eac32f30f962ff53cab655eecebf3baaf04ec2dfdf6c5514c5aed352
  • MD5: b22d0e31b770564189d05ffa1623d83f
  • BLAKE2b-256: 4a76d036c8138a5bf5c2cfb7a29a3cc16dd1ac3f564067cad9ce0454561a6558


File details

Details for the file cf_pipeline_engine-0.2.6-cp314-cp314-win_amd64.whl.

File hashes

Hashes for cf_pipeline_engine-0.2.6-cp314-cp314-win_amd64.whl:

  • SHA256: 4c124535976b234cc755b639ce7b0ab051a8c7e34639154fbe445cdf7f441272
  • MD5: 7229300ba8a894abe54b0fb4ff52a667
  • BLAKE2b-256: e38d0bb84be906c7de8e99b2a23a8b5da01d9f020136fd8af2619d47edd64aab

