Extraneous activity delays BPS model enhancer

Python implementation of an approach to enhance Business Process Simulation (BPS) models by modeling extraneous activity delays with timer events. A preliminary version of this technique was presented in the paper "Modeling Extraneous Activity Delays in Business Process Simulation", by David Chapela-Campa and Marlon Dumas. The complete version is presented in the paper "Enhancing Business Process Simulation Models with Extraneous Activity Delays", by David Chapela-Campa and Marlon Dumas.

The technique supports two simulation engines: QBP and Prosimos. The input consists of i) an event log (pd.DataFrame) recording the execution of the activities of a process (including resource information), and ii) a BPS model. In QBP, the BPS model is represented by a BPMN file (example here). In Prosimos, it is represented by a BPMN file with the process model structure and a JSON file with the simulation parameters (example here).

The approach uses the input event log to discover the waiting time caused by extraneous factors, and enhances the input BPS model with timer events to model such delays. The proposal supports different configuration options, such as:

  • Discover the extraneous delays using the Naive proposal (DiscoveryMethod.NAIVE), or the Eclipse-aware proposal (DiscoveryMethod.COMPLEX).
  • Consider the extraneous delays to occur after an activity instance, i.e., ex-post configuration (TimerPlacement.AFTER), or to occur before an activity instance, i.e., ex-ante configuration (TimerPlacement.BEFORE).
  • Enhance the BPS model with the discovered extraneous delays (DirectEnhancer()), or try to tune the discovered delays with a TPE hyper-optimization stage (HyperOptEnhancer()).

For a more detailed explanation of the different variants of the approach, we refer to the paper "Enhancing Business Process Simulation Models with Extraneous Activity Delays".
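
As an illustration, the examples in the Basic Usage section below use the eclipse-aware, ex-ante variant; here is a minimal sketch (parameter values are illustrative) of how the naive, ex-post variant without TPE optimization would be configured instead, reusing only the classes and arguments shown in those examples:

from estimate_start_times.config import DEFAULT_CSV_IDS
from extraneous_activity_delays.config import Configuration, SimulationEngine
from extraneous_activity_delays.config import DiscoveryMethod, TimerPlacement, OptimizationMetric
from extraneous_activity_delays.enhance_with_delays import DirectEnhancer

# Naive discovery of the delays, placed after each activity instance (ex-post)
config = Configuration(
    log_ids=DEFAULT_CSV_IDS, process_name="naive-ex-post-example",
    simulation_engine=SimulationEngine.PROSIMOS,
    discovery_method=DiscoveryMethod.NAIVE,  # instead of DiscoveryMethod.COMPLEX
    timer_placement=TimerPlacement.AFTER,  # ex-post, instead of TimerPlacement.BEFORE
    optimization_metric=OptimizationMetric.RELATIVE_EMD
)
# DirectEnhancer(event_log, simulation_model, config) adds the discovered timers as-is;
# HyperOptEnhancer(event_log, simulation_model, config) would additionally tune them with TPE.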

Requirements

To install from sources, clone the repository and run the following from its root folder:

$ git submodule update --init --recursive
$ cd ./external_tools/
$ cd ./pix-utils/
$ pip install -e .
$ cd ../log-distance-measures/
$ pip install -e .
$ cd ../start-time-estimator/
$ pip install -e .
$ cd ../Prosimos/
$ pip install -e .
$ cd ../..
$ pip install -e .
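
Alternatively, since the package is published on PyPI (this page), a released version can presumably be installed directly with pip, pulling in the dependencies declared by the package instead of the editable submodule installs above:

$ pip install extraneous_activity_delays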

Basic Usage

Check this test file for a simple example of how to run the technique with Prosimos, and the config file for an explanation of the configuration parameters.

More sophisticated configurations of the approach are used in the test files for synthetic and real-life evaluations.

Here, we provide two examples of the proposal:

Using Prosimos as simulation engine with no TPE optimization stage

import json

import pandas as pd
from lxml import etree

from estimate_start_times.config import DEFAULT_CSV_IDS
from extraneous_activity_delays.config import Configuration, SimulationModel, SimulationEngine
from extraneous_activity_delays.config import DiscoveryMethod, TimerPlacement, OptimizationMetric
from extraneous_activity_delays.enhance_with_delays import DirectEnhancer

# Set up default configuration
log_ids = DEFAULT_CSV_IDS
config = Configuration(
    log_ids=log_ids, process_name="prosimos-example",
    simulation_engine=SimulationEngine.PROSIMOS,
    discovery_method=DiscoveryMethod.COMPLEX,  # Eclipse-aware method
    timer_placement=TimerPlacement.BEFORE,  # ex-ante configuration
    optimization_metric=OptimizationMetric.RELATIVE_EMD
    # working_schedules=working_schedules  # Use this to consider resource unavailability
)
# Read event log
event_log = pd.read_csv("path_to_input_log.csv")
event_log[log_ids.start_time] = pd.to_datetime(event_log[log_ids.start_time], utc=True)
event_log[log_ids.end_time] = pd.to_datetime(event_log[log_ids.end_time], utc=True)
event_log[log_ids.resource] = event_log[log_ids.resource].fillna("NOT_SET")
event_log[log_ids.resource] = event_log[log_ids.resource].astype("string")
# Read BPMN model
parser = etree.XMLParser(remove_blank_text=True)
bpmn_model = etree.parse("path_to_bps_model.bpmn", parser)
# Read simulation parameters
with open("path_to_bps_parameters.json") as json_file:
    simulation_parameters = json.load(json_file)
simulation_model = SimulationModel(bpmn_model, simulation_parameters)
# Enhance the BPS model directly with the discovered extraneous delays (no TPE optimization)
enhancer = DirectEnhancer(event_log, simulation_model, config)
enhanced_simulation_model = enhancer.enhance_simulation_model_with_delays()
# Write enhanced BPS model (BPMN and parameters)
enhanced_simulation_model.bpmn_document.write("path_of_enhanced_bps_model.bpmn", pretty_print=True)
with open("path_to_enhanced_bps_parameters.json", "w") as json_file:
    json.dump(enhanced_simulation_model.simulation_parameters, json_file)

Using QBP as simulation engine with TPE optimization stage

import pandas as pd
from lxml import etree

from estimate_start_times.config import DEFAULT_CSV_IDS
from extraneous_activity_delays.config import Configuration, SimulationModel, SimulationEngine
from extraneous_activity_delays.config import DiscoveryMethod, TimerPlacement, OptimizationMetric
from extraneous_activity_delays.enhance_with_delays import HyperOptEnhancer

# Set up default configuration
log_ids = DEFAULT_CSV_IDS
config = Configuration(
    log_ids=log_ids, process_name="qbp-example",
    max_alpha=10.0, training_partition_ratio=0.5,
    num_iterations=100, simulation_engine=SimulationEngine.QBP,
    discovery_method=DiscoveryMethod.COMPLEX,  # Eclipse-aware method
    timer_placement=TimerPlacement.BEFORE,  # ex-ante configuration
    optimization_metric=OptimizationMetric.RELATIVE_EMD
    # working_schedules=working_schedules  # Use this to consider resource unavailability
)
# Read event log
event_log = pd.read_csv("path_to_input_log.csv")
event_log[log_ids.start_time] = pd.to_datetime(event_log[log_ids.start_time], utc=True)
event_log[log_ids.end_time] = pd.to_datetime(event_log[log_ids.end_time], utc=True)
event_log[log_ids.resource] = event_log[log_ids.resource].fillna("NOT_SET")
event_log[log_ids.resource] = event_log[log_ids.resource].astype("string")
# Read BPMN model
parser = etree.XMLParser(remove_blank_text=True)
bpmn_model = etree.parse("path_to_bps_model.bpmn", parser)
simulation_model = SimulationModel(bpmn_model)
# Enhance with hyper-parametrized activity delays with hold-out
enhancer = HyperOptEnhancer(event_log, simulation_model, config)
enhanced_simulation_model = enhancer.enhance_simulation_model_with_delays()
# Write enhanced BPS model
enhanced_simulation_model.bpmn_document.write("path_of_enhanced_bps_model.bpmn", pretty_print=True)

Resource unavailability: working schedules format

from pix_framework.calendar.resource_calendar import RCalendar

weekly_calendars = [
    {
        "resource_name": "Jonathan",
        "time_periods": [
            {
                "from": "MONDAY",
                "to": "FRIDAY",
                "beginTime": "09:00:00.000",
                "endTime": "14:00:00.000"
            },
            {
                "from": "MONDAY",
                "to": "THURSDAY",
                "beginTime": "16:00:00.000",
                "endTime": "19:00:00.000"
            }
        ]
    }, {
        "resource_name": "DIO",
        "time_periods": [
            {
                "from": "MONDAY",
                "to": "SUNDAY",
                "beginTime": "08:00:00.000",
                "endTime": "20:00:00.000"
            }
        ]
    }
]


def from_weekly_calendar() -> dict:
    # Read calendars
    resource_calendars = {}
    for calendar in weekly_calendars:
        resource_name = calendar['resource_name']
        r_calendar = RCalendar("calendar_{}".format(resource_name))
        for slot in calendar["time_periods"]:
            r_calendar.add_calendar_item(
                slot["from"], slot["to"], slot["beginTime"], slot["endTime"]
            )
        resource_calendars[resource_name] = r_calendar
    # Return resource calendars
    return resource_calendars
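
This returns one RCalendar per resource name, which appears to be the format expected by the working_schedules argument commented out in the examples above; a minimal sketch of wiring it into the configuration, reusing the names from the Prosimos example:

from estimate_start_times.config import DEFAULT_CSV_IDS
from extraneous_activity_delays.config import Configuration, SimulationEngine
from extraneous_activity_delays.config import DiscoveryMethod, TimerPlacement, OptimizationMetric

working_schedules = from_weekly_calendar()
config = Configuration(
    log_ids=DEFAULT_CSV_IDS, process_name="prosimos-example",
    simulation_engine=SimulationEngine.PROSIMOS,
    discovery_method=DiscoveryMethod.COMPLEX,  # Eclipse-aware method
    timer_placement=TimerPlacement.BEFORE,  # ex-ante configuration
    optimization_metric=OptimizationMetric.RELATIVE_EMD,
    working_schedules=working_schedules  # consider resource unavailability when discovering delays
)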
