# cf-pipeline-engine

Cogniflow pipeline engine (C++ core) with a thin Python package wrapper.
The Cogniflow pipeline engine implemented in C++ (compiler, scheduler, runtime).
This folder is structured as a Python package for consistency with the other
cf_* components, even though the core implementation is native C++ and built
via CMake.
## Build (CMake)

```shell
cmake -S . -B build
cmake --build build
```
## Python package wrapper

The Python package is intentionally thin and provides access to the packaged native engine assets:

- `cf_pipeline_engine.cf_pipeline_v2_path()`
- `cf_pipeline_engine.cf_siggen_path()`
- `cf_pipeline_engine.cf_engine_include_path()`
- `cf_pipeline_engine.cf_type_registry_path()`
- `cf_pipeline_engine.resolve_cf_pipeline_v2_executable()`
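For illustration, a resolver with the general shape of `resolve_cf_pipeline_v2_executable()` could prefer a binary already on PATH and fall back to the copy packaged under `bin/`. This is a hypothetical sketch, not the package's actual implementation:

```python
import shutil
import sys
from pathlib import Path

def resolve_engine_executable(package_root: Path) -> Path:
    """Hypothetical resolver: prefer a binary on PATH, otherwise fall
    back to the executable shipped inside the installed package."""
    # Windows wheels ship cf_pipeline_v2.exe; other platforms drop the suffix.
    exe = "cf_pipeline_v2.exe" if sys.platform == "win32" else "cf_pipeline_v2"

    # Anything already on PATH wins.
    found = shutil.which(exe)
    if found:
        return Path(found)

    # Fall back to the copy packaged under bin/ in the wheel.
    return package_root / "bin" / exe
```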
Published distribution name:

```shell
pip install cf-pipeline-engine
```
The published wheel installs:

- `bin/cf_pipeline_v2(.exe)`
- `bin/cf_siggen(.exe)`
- `bin/type_registry.v0.json`
- `include/*.h`
- `examples/opcua_fifo_avg_to_duckdb_parquet_triggered.nq`
- `examples/invocations/opcua_fifo_avg.endpoint_4841.invocation.nq`
The native sink path no longer embeds cf_datahive_cpp sources into the engine
build. Instead, cf-pipeline-engine links against the packaged native consumer
surface exported by cf-datahive.
The Python wrapper also exposes the packaged mini-demo resources:

- `cf_pipeline_engine.cf_demo_pipeline_path()`
- `cf_pipeline_engine.cf_demo_invocation_path()`
`cfio:OpcuaReaderStep` is now runner-owned as well. The engine executes that
step through the installed cf-opcua-server owner surface instead of loading
protocol code from cf-basic-io. If the console script is not on PATH, set
`CF_OPCUA_SERVER_CMD` to the absolute script or executable path.
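A minimal sketch of that fallback order (console script on PATH first, then the `CF_OPCUA_SERVER_CMD` override); the runner's actual resolution logic may differ:

```python
import os
import shutil

def resolve_opcua_server_cmd() -> str:
    """Sketch of the documented fallback: use the cf-opcua-server console
    script if it is on PATH, otherwise honor CF_OPCUA_SERVER_CMD."""
    found = shutil.which("cf-opcua-server")
    if found:
        return found
    override = os.environ.get("CF_OPCUA_SERVER_CMD")
    if override:
        return override
    raise RuntimeError(
        "cf-opcua-server is not on PATH and CF_OPCUA_SERVER_CMD is unset"
    )
```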
## Python CLI
The legacy standalone Python wrapper package has been retired. Its CLI surface
now lives in this cf_pipeline_engine package.
Direct invocation remains policy-gated by `CF_ALLOW_DIRECT_ENGINE`:

```shell
CF_ALLOW_DIRECT_ENGINE=1 python -m cf_pipeline_engine.cli \
  --pipeline sandcastle/cf_pipeline/cf_pipeline_engine/examples/opcua_fifo_avg_to_duckdb_parquet_triggered.nq \
  --interval 1 \
  --duration 10
```
Equivalent module entrypoint:

```shell
CF_ALLOW_DIRECT_ENGINE=1 python -m cf_pipeline_engine --help
```
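The gate itself can be sketched as a simple environment check; the function name here is illustrative, not the package's API:

```python
import os
import sys

def require_direct_engine_allowed() -> None:
    """Illustrative CF_ALLOW_DIRECT_ENGINE-style policy gate: refuse to
    run the engine directly unless the caller has opted in explicitly."""
    if os.environ.get("CF_ALLOW_DIRECT_ENGINE") != "1":
        sys.exit(
            "direct engine invocation is disabled; "
            "set CF_ALLOW_DIRECT_ENGINE=1 to opt in"
        )
```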
## Publishing
cf_pipeline_engine is published with the dedicated Windows workflow:
- Workflow: `.github/workflows/cf_pipeline_engine_windows_publish.yml`
- Package directory: `sandcastle/cf_pipeline/cf_pipeline_engine`
- PyPI tag: `cf-pipeline-engine-v<version>`
- TestPyPI tag: `cf-pipeline-engine-v<version>-test`
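The tag naming convention above can be captured in a tiny helper (illustrative only, not part of the package):

```python
def release_tags(version: str) -> dict:
    """Build the git tag names used by the publish workflow:
    cf-pipeline-engine-v<version> for PyPI, with a -test suffix
    for TestPyPI."""
    base = f"cf-pipeline-engine-v{version}"
    return {"pypi": base, "testpypi": f"{base}-test"}
```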
Local preflight:

```powershell
powershell -ExecutionPolicy Bypass -File scripts/mimic_windows_python_publish_workflow.ps1 `
  -WorkflowFile .github/workflows/cf_pipeline_engine_windows_publish.yml `
  -PackageDir sandcastle/cf_pipeline/cf_pipeline_engine `
  -PythonExe py `
  -PythonVersion 3.14
```
Queue a dry-run dispatch:

```powershell
powershell -ExecutionPolicy Bypass -File scripts/queue_windows_python_publish_workflow.ps1 `
  -WorkflowFile .github/workflows/cf_pipeline_engine_windows_publish.yml `
  -PackageDir sandcastle/cf_pipeline/cf_pipeline_engine `
  -PublishTarget testpypi `
  -Ref main `
  -RequireLocalPass `
  -DryRun
```
## OPC UA Demo Pipeline Sink
The demo pipeline
`examples/opcua_fifo_avg_to_duckdb_parquet_triggered.nq` now uses
`cfsink:DataHiveParquetSinkStep` from cf_basic_sinks.

Its `cfio:OpcuaReaderStep` ingress remains in cf-basic-io as a declarative
step definition, while the runner performs the actual OPC UA snapshot through
cf-opcua-server.

The sink step likewise stays declarative in cf_basic_sinks; the runner executes
it through the packaged C++ gatekeeper library exported by cf-datahive,
producing one committed data hive run with 20 rows (cycle_id 0..19) and no
archive.jsonl.
Run the one-click demo:

```powershell
.\scripts\fresh_install_v2.ps1 -Clean -RunDemo -ShowDetails
```
Expected output layout after one demo session:

- `workspace/<data_hive>/opcua_fifo_avg/latest.txt`
- `workspace/<data_hive>/opcua_fifo_avg/runs/<run_id>/manifest.json`
- `workspace/<data_hive>/opcua_fifo_avg/runs/<run_id>/tables/measurements/part-*.parquet`
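A sketch of how a script might resolve and sanity-check that layout, assuming `latest.txt` holds the current `<run_id>` (an assumption; the file's exact contents are not specified here):

```python
from pathlib import Path

def find_latest_run(hive_root: Path, pipeline: str = "opcua_fifo_avg") -> Path:
    """Resolve the run directory named in latest.txt and verify the
    layout shown above: a manifest plus at least one parquet part."""
    base = hive_root / pipeline
    run_id = (base / "latest.txt").read_text().strip()
    run_dir = base / "runs" / run_id
    if not (run_dir / "manifest.json").is_file():
        raise FileNotFoundError(run_dir / "manifest.json")
    parts = list((run_dir / "tables" / "measurements").glob("part-*.parquet"))
    if not parts:
        raise FileNotFoundError("no parquet parts under tables/measurements/")
    return run_dir
```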
## File details

Details for the file `cf_pipeline_engine-0.2.7.tar.gz`.

File metadata:

- Download URL: cf_pipeline_engine-0.2.7.tar.gz
- Size: 79.1 kB
- Tags: Source
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `b4b88d2b7ece782854c2517f4fb116f935ddb502a4e52e307f529dadc79c1e0e` |
| MD5 | `b9b7095021d47fa728dcfd05122e1569` |
| BLAKE2b-256 | `0d400b2ef6af88bf2f07b68808116ad1bc7acfef8c0c470fd382c221d7861de8` |
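To verify a downloaded artifact against these digests with only the standard library (PyPI's BLAKE2b-256 is `blake2b` with a 32-byte digest):

```python
import hashlib
from pathlib import Path

def file_digests(path: Path) -> dict:
    """Compute the three digests PyPI publishes for an artifact."""
    data = path.read_bytes()
    return {
        "sha256": hashlib.sha256(data).hexdigest(),
        "md5": hashlib.md5(data).hexdigest(),
        # PyPI's BLAKE2b-256 is blake2b truncated to a 32-byte digest.
        "blake2b_256": hashlib.blake2b(data, digest_size=32).hexdigest(),
    }
```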
## File details

Details for the file `cf_pipeline_engine-0.2.7-cp314-cp314-win_amd64.whl`.

File metadata:

- Download URL: cf_pipeline_engine-0.2.7-cp314-cp314-win_amd64.whl
- Size: 12.7 MB
- Tags: CPython 3.14, Windows x86-64
- Uploaded using Trusted Publishing? Yes
- Uploaded via: twine/6.2.0 CPython/3.14.3

File hashes:

| Algorithm | Hash digest |
|---|---|
| SHA256 | `db1ed9e6a2c25624106519f06bd0fa7657fdb6b6ccbf6f4134c8342eda322309` |
| MD5 | `033878cc471942268bf18bb8bf9937f7` |
| BLAKE2b-256 | `59ccfcae4a7b8c9ce059ff573cecef7eab0e0486ab1208dec36dc27cefe4f4da` |