
An app to map event logs into ontology-based process knowledge and analyze the data through a library of knowledge patterns

Project description

Process Meaning Patterns - Python Repository

Process Meaning Patterns is a framework for ontology-aware process mining, which formalizes the process knowledge applied throughout the process mining lifecycle to make analyses transparent and replicable. It does so through process meaning patterns: first-order logic ontology patterns that correspond to common verification and inference steps in the process mining lifecycle. This repository contains a Python implementation of the process meaning pattern framework, with support for exporting process knowledge to several formats, including OWL/RDF, first-order logic (CLIF or Prover9 syntax), and Prolog/Datalog.
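To make the input side concrete, an event log is just a flat table of timestamped events grouped into cases. The sketch below builds a small, invented log whose column names match the col_dict used in the usage example on this page; the cases, activities, and resources are illustrative assumptions, not real data:

```python
import pandas as pd

# A small, invented event log; column names match the col_dict mapped
# in the LogProcessor example (caseID, activityID, timestamp, ...).
events = pd.DataFrame(
    [
        ("c1", "receive order", "2024-01-02 09:00", "alice", "e1"),
        ("c1", "check stock",   "2024-01-02 09:30", "bob",   "e2"),
        ("c1", "ship order",    "2024-01-03 11:00", "bob",   "e3"),
        ("c2", "receive order", "2024-01-02 10:00", "alice", "e4"),
        ("c2", "cancel order",  "2024-01-02 12:00", "alice", "e5"),
    ],
    columns=["caseID", "activityID", "timestamp", "resourceID", "eventID"],
)
events["timestamp"] = pd.to_datetime(events["timestamp"])

# Each case yields one trace: its activities ordered by time.
traces = (
    events.sort_values("timestamp")
    .groupby("caseID")["activityID"]
    .apply(list)
    .to_dict()
)
print(traces)
# → {'c1': ['receive order', 'check stock', 'ship order'],
#    'c2': ['receive order', 'cancel order']}
```

A table like this, saved as CSV, is the kind of input the LogProcessor below ingests before translating it into ontology-grounded facts.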

The process meaning patterns framework is actively maintained and developed by Riley Moher of the Semantic Technologies Laboratory (STL) at the University of Toronto.

Example Usage

The central object of the process meaning pattern library is the LogProcessor, which can ingest a business process event log in a variety of formats and output corresponding facts to be validated against or queried with process meaning patterns. Here is a simple example:

from ProMean4Py import LogProcessor

# define column names
col_dict = {'case_id': 'caseID', 'activity': 'activityID', 'timestamp': 'timestamp', 'resource': 'resourceID', 'event_id': 'eventID'}
# create output directory
output_dir = '../output/testing/'
# define namespaces for the knowledge graph
namespaces = {'ex': "http://example.com/", 'on': "https://stl.mie.utoronto.ca/ontologies/spm/"}
# initialize the log processor on some data
log_processor = LogProcessor('sample_log.csv', process_name='P1', column_dict=col_dict, prefixes=namespaces)
# save the event log facts as a knowledge graph
log_processor.save_knowledge_graph(output_dir, format='xml')
# save the event log facts as first-order-logic facts
log_processor.save_FOL(output_dir)
# save the event log facts as datalog facts
log_processor.save_datalog(output_dir)

More thorough examples, including an example utilizing real-world enterprise data, are available in the notebooks directory of this repository.

Installation

ProMean4Py is published on the Python Package Index (PyPI) and can be installed on any Python version 3.9 or later by simply invoking pip:

pip install ProMean4Py

Release Notes

Release notes are tracked in the CHANGELOG.md file of this repository.

Download files

Download the file for your platform.

Source Distribution

processmeaningpatternspython-0.10.tar.gz (7.3 kB)

Uploaded Source

Built Distribution


ProcessMeaningPatternsPython-0.10-py3-none-any.whl (7.6 kB)

Uploaded Python 3

File details

Details for the file processmeaningpatternspython-0.10.tar.gz.

File metadata

File hashes

Hashes for processmeaningpatternspython-0.10.tar.gz:

SHA256: e2688e8e5a5b009807d54545175e17c986abc533287f855b2535ade9e8bb75db
MD5: 9564c8989000db3f3387a184b757aae6
BLAKE2b-256: 39fd696872b8c14590ebb52f96e6a4f007836f53b050605ef50bd3e2ff3d0752


File details

Details for the file ProcessMeaningPatternsPython-0.10-py3-none-any.whl.

File metadata

File hashes

Hashes for ProcessMeaningPatternsPython-0.10-py3-none-any.whl:

SHA256: 557f005d6cc985c72f2bba1497b5c70ded51de4de6681b066df45767677d0817
MD5: c4c7a9ce245ef1f855f75ebdacc1142f
BLAKE2b-256: 79403af40ae0dd0136e3f097ce23088e6a773ecd489449025e294af5d1172624

