Utilities for ML models targeting hardware triggers

Project description

Machine Learning for Hardware Triggers

triggerflow provides a set of utilities for Machine Learning models targeting FPGA deployment. The TriggerModel class consolidates several Machine Learning frontends and compiler backends to construct a "trigger model". Bundled MLflow utilities handle logging, versioning, and loading of trigger models.
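The consolidation can be pictured as a thin dispatch layer over (frontend, compiler) pairs. A hypothetical sketch of that idea, not the actual implementation (the supported pairs listed here are assumptions for illustration):

```python
# hypothetical sketch of the frontend/compiler dispatch idea; not the real API
SUPPORTED = {
    ("Keras", "hls4ml"),     # pairing shown in the usage example below
    ("PyTorch", "hls4ml"),   # assumed pairing, for illustration only
}

def validate_pair(ml_backend: str, compiler: str) -> bool:
    """Return True if the (frontend, compiler) pair is supported."""
    return (ml_backend, compiler) in SUPPORTED

ok = validate_pair("Keras", "hls4ml")
```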

Installation

pip install triggerflow

Usage

from triggerflow.core import TriggerModel

triggerflow = TriggerModel(
    name="my-trigger-model",
    ml_backend="Keras",
    compiler="hls4ml",
    model=model,
    compiler_config=compiler_config,  # or None
)
triggerflow()  # build the trigger model

# then:
output_software = triggerflow.software_predict(input_data)
output_firmware = triggerflow.firmware_predict(input_data)
output_qonnx = triggerflow.qonnx_predict(input_data)
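The software, firmware, and QONNX outputs should agree within quantization error. A minimal numpy sketch of the kind of agreement check you might run on the returned arrays (the helper and the sample values are hypothetical, not part of the triggerflow API):

```python
import numpy as np

def agreement(a, b, rtol=1e-2, atol=1e-3):
    """Fraction of elements where two prediction arrays agree within tolerance."""
    return float(np.mean(np.isclose(a, b, rtol=rtol, atol=atol)))

# stand-ins for software_predict / firmware_predict outputs
sw = np.array([0.100, 0.520, 0.380])
fw = np.array([0.101, 0.518, 0.381])  # fixed-point firmware emulation drifts slightly
frac = agreement(sw, fw)
```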

# save and load trigger models:
triggerflow.save("triggerflow.tar.xz")

# in a separate session:
from triggerflow.core import TriggerModel
triggerflow = TriggerModel.load("triggerflow.tar.xz")
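The saved artifact is an xz-compressed tarball, so it can be inspected with Python's standard library. A self-contained sketch that builds a stand-in archive and lists it (the member name is hypothetical; the real contents of a saved trigger model may differ):

```python
import io
import os
import tarfile
import tempfile

with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "triggerflow.tar.xz")

    # create a stand-in .tar.xz archive (contents hypothetical)
    with tarfile.open(path, "w:xz") as tar:
        data = b'{"name": "my-trigger-model"}'
        info = tarfile.TarInfo("model_meta.json")
        info.size = len(data)
        tar.addfile(info, io.BytesIO(data))

    # list the archive members, as you might to inspect a saved model
    with tarfile.open(path, "r:xz") as tar:
        members = tar.getnames()
```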

Logging with MLflow

# logging with MLflow:
import mlflow
from triggerflow.mlflow_wrapper import log_model

mlflow.set_tracking_uri("https://ngt.cern.ch/models")
experiment_id = mlflow.create_experiment("example-experiment")

with mlflow.start_run(run_name="trial-v1", experiment_id=experiment_id):
    log_model(triggerflow, registered_model_name="TriggerModel")

Note: This package deliberately installs no dependencies, so it will not disrupt project-specific training environments or custom compilers. For a reference environment, see environment.yml.

Creating a kedro pipeline

This repository also comes with a default pipeline for trigger models based on kedro. One can create a new pipeline via the steps below.

NOTE: pipeline names must not contain "-" or uppercase letters!

# Create a conda environment & activate it
conda create -n triggerflow python=3.11
conda activate triggerflow

# install triggerflow
pip install triggerflow

# Create a pipeline
triggerflow new demo_pipeline

# NOTE: since triggerflow does not install dependencies, create a
# conda env based on the environment.yml file of the pipeline;
# this file can be adapted to the needs of the individual project
cd demo_pipeline
conda env update -n triggerflow --file environment.yml

# Run Kedro
kedro run
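The naming rule above (no "-", no uppercase) exists because the generated pipeline must be an importable Python package. A quick pre-check for a candidate name (the helper and its exact rule are illustrative, not part of triggerflow):

```python
import re

def is_valid_pipeline_name(name: str) -> bool:
    """Lowercase letters, digits, and underscores, starting with a letter,
    so the generated pipeline is a valid Python package name."""
    return re.fullmatch(r"[a-z][a-z0-9_]*", name) is not None

ok = is_valid_pipeline_name("demo_pipeline")
bad = is_valid_pipeline_name("Demo-Pipeline")  # hyphen and uppercase rejected
```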
