Ventilate prose (#1305)

* ventilate prose

* ventilate all the things

---------

Co-authored-by: burnettk <burnettk@users.noreply.github.com>
Kevin Burnett 2024-04-01 14:17:38 +00:00 committed by GitHub
parent 4c8ca01740
commit 9232c70ff3
32 changed files with 538 additions and 188 deletions


@ -155,10 +155,13 @@ Instructions
## Essential Example
Now that we have explored the minimal example, let's delve into a more comprehensive BPMN model known as the Essential Example.
This example serves as a stepping stone towards a deeper understanding of BPMN, as it incorporates a variety of fundamental concepts that work harmoniously together.
### Access the Process Directory
Clicking on the process name will open the directory dedicated to the Essential Example process.
Here are the four files in the Process:
**BPMN editor**: The BPMN editor is a primary file that runs the engine. In the minimal example, we learned that it allows you to visually design and represent business processes using the Business Process Model and Notation (BPMN) standard.
@ -179,25 +182,31 @@ Here's what a DMN table looks like:
![Image](images/UI-Schema.png)
### Process Workflow
In this BPMN diagram example, the process is outlined step by step: The process initiates with a start event, serving as the entry point for the workflow.
Following the start event, a manual task named "Introduction" is incorporated, responsible for displaying a welcoming message to the user.
![](images/Manual_EM.png)
Next, a **User task** named "Display Questions" is added, facilitating the collection of information from real individuals through web forms.
In the properties section, a JSON form is created to specify the questions for the users to respond to.
![](images/User_EM.png)
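As a rough illustration (the actual questions and field names in the Essential Example model may differ), a JSON form schema for such a user task could look like:

```json
{
  "title": "Display Questions",
  "type": "object",
  "properties": {
    "question_1": {
      "type": "string",
      "title": "What does BPMN stand for?"
    },
    "question_2": {
      "type": "boolean",
      "title": "Does a start event begin a process?"
    }
  }
}
```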
Once the user completes the form, the gathered data is passed on to a **script task** named "Modify Information", responsible for calculating the data score.
The script for this calculation is embedded in the properties section.
![](images/Script_Em.png)
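The exact script in the example model is not reproduced here; as a hedged sketch, scoring three form answers in a script task might look like the following (answer values and expected answers are hypothetical):

```python
# Hypothetical sketch of a "Modify Information" script task.
# In a real script task, these variables would arrive from the form's task data.
answer_1 = "BPMN"
answer_2 = True
answer_3 = "gateway"

# Compare each answer against an expected value and accumulate a score.
score = 0
if answer_1 == "BPMN":
    score += 1
if answer_2 is True:
    score += 1
if answer_3 == "gateway":
    score += 1
```

Any variable assigned in a script task (here, `score`) becomes part of the task data available to subsequent tasks.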
As an alternative approach, the data score can also be determined using a **DMN table** named "Determine Score Message".
Decision tables offer an effective means of defining business rules in an easily comprehensible format.
The DMN table calculates the score based on pre-defined rules.
![](images/DMN_EM.png)
After the score calculation, an **exclusive gateway** is employed to make decisions based on the determined score.
Three manual tasks are defined, each displaying a different message based on the obtained score:
![](images/Exclusive_Em.png)
@ -207,6 +216,7 @@ b. **At Least One Correct Response**: If the score indicates that at least one r
c. **Perfect Score**: If the score indicates a perfect score, a manual task displays a message recognizing the excellent performance.
Once the score messages are displayed, a **signal event** is included, providing users with the option to continue and conclude the process or choose to repeat the process from the beginning.
Signal events enable external forces or internal errors to interact with the process, and in this scenario, a button press allows for the interruption of the diagram's normal course.
![](images/Signal_EM.png)


@ -154,6 +154,7 @@ Example for UI schema:
Date validation when compared to another date allows you to ensure that a date field meets certain criteria concerning another date field.
#### Minimum date validation
For instance, you can require that a date must be equal to or greater than another date within the form.
- To implement date validation compared to another date, use your JSON schema and specify the date field to compare with using the "minimumDate" property with a format like "field:field_name:start_or_end."
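For instance (field names here are hypothetical, not from the referenced process models), a `delivery_date` field could be constrained to start no earlier than the end of a `travel_dates` range:

```json
{
  "delivery_date": {
    "type": "string",
    "format": "date",
    "title": "Delivery Date",
    "minimumDate": "field:travel_dates:end"
  }
}
```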
@ -174,9 +175,11 @@ These enhancements provide you with more flexibility and control when building f
#### Maximum date validation
Maximum date validation in relation to another date allows you to set constraints on a date field to ensure that it falls on or before another specified date within the form.
This type of validation is particularly useful for setting deadlines, end dates, or the latest possible dates that are contingent on other dates in the workflow.
To apply maximum date validation in your JSON schema, use the `maximumDate` property and specify the field to compare with, using the format `field:field_name`.
This ensures that the date chosen does not exceed the referenced field's date.
Here's an example where `delivery_date` must be on or before `end_date`:
@ -191,7 +194,8 @@ Heres an example where `delivery_date` must be on or before `end_date`:
If the referenced field is a date range, and you want to validate against the end of that range, the same `field:end_date` reference can be used, as the `maximumDate` will intuitively apply to the end of the range.
These schema configurations provide a robust framework for ensuring date fields in forms maintain logical consistency and adhere to process requirements.
Utilizing maximum date validation, you can prevent dates from exceeding a certain threshold, which is essential for managing project timelines, delivery schedules, or any scenario where the latest permissible date is a factor.
By incorporating these validations into SpiffWorkflow forms, you can create interactive forms that automatically enforce business rules, improving data quality and user experience.
@ -202,6 +206,7 @@ By incorporating these validations into SpiffWorkflow forms, you can create inte
Workflow processes often require the enforcement of minimum and maximum date constraints to align with operational timelines or project deadlines. This scenario demonstrates the configuration of both `minimumDate` and `maximumDate` validations within a form, ensuring that selected dates fall within a specific period defined by other date fields in the workflow.
#### JSON Schema Configuration:
The "test-maximum-date-schema.json" process model outlines a form structure that includes fields for `end_date`, `delivery_date`, and `delivery_date_range`, each with constraints on the earliest and latest dates that can be selected.
```json
@ -232,17 +237,21 @@ The "test-maximum-date-schema.json" process model outlines a form structure that
```
#### Field Descriptions:
- **End Date**: The final date by which all activities should be completed.
- **Preferred Delivery Date**: A single date indicating when the delivery of a service or product is preferred, bounded by today's date and the `end_date`.
- **Preferred Delivery Date Range**: A span of dates indicating an acceptable window for delivery, constrained by today's date and the `end_date`.
### Implementation in SpiffWorkflow Forms:
The schema enforces the following rules:
- The `Preferred Delivery Date` cannot be earlier than today (the `minimumDate`) and not later than the `end_date` (the `maximumDate`).
- The `Preferred Delivery Date Range` must start no earlier than today and end no later than the `end_date`.
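Conceptually, the check the schema performs on the delivery date is equivalent to the following plain-Python sketch (not SpiffWorkflow's actual validator; `today` is passed explicitly so the rule is easy to exercise):

```python
from datetime import date

def delivery_date_is_valid(delivery_date: date, today: date, end_date: date) -> bool:
    # Mirrors the schema's rules: the chosen date may be no earlier than
    # `today` (the minimumDate) and no later than `end_date` (the maximumDate).
    return today <= delivery_date <= end_date
```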
### Display Fields Side-By-Side on Same Row
When designing forms, it's often more user-friendly to display related fields, such as First Name and Last Name, side by side on the same row, rather than stacked vertically.
The `ui:layout` attribute in your form's JSON schema enables this by allowing you to specify how fields are displayed relative to each other, controlling the grid columns each field occupies for a responsive design.
#### Form Schema Example:
@ -263,7 +272,8 @@ Define your form fields in the JSON schema as follows:
#### `ui:layout` Configuration:
The `ui:layout` attribute accepts an array of objects, each representing a conceptual "row" of fields.
Here's how to use it:
```json
{
@ -368,7 +378,8 @@ To incorporate the markdown widget into your rjsf form, follow these steps:
#### Overview
The `NumericRangeField` component is a new feature in `spiffworkflow-frontend` that allows users to input numeric ranges.
This component is designed to work with JSON schemas and provides two text inputs for users to enter minimum and maximum values for a given numeric range.
#### JSON Schema Example


@ -1,17 +1,21 @@
# Script Tasks
Writing scripts refers to the process of creating custom code or scripts to enhance the functionality and automation of a software application or system.
In SpiffArena, the scripting language used for writing scripts is Python, a widely used programming language.
Python offers a rich array of libraries, frameworks, and tools that facilitate script development, making it a popular choice for implementing custom logic and automation.
Let's explore an example of a Script Task in our basics section:
1. **Start Event and User Task - "Form"**
The process starts with a Start Event, followed by a User Task named "Form".
Users fill out the form, and the three values from the form are passed to the subsequent task, which is a Script Task.
2. **Script Task to collect data**
In the Script Task, we have created a script that collects three variables from the form and calculates a score based on certain conditions.
The score is then stored in the "score" variable.
Let's delve into how we configured the script tasks:
![Script_Task](images/Script_task_example.png)
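In outline, the pattern is: form values appear directly as Python variables in the script task's namespace, the script applies its conditions, and whatever it assigns flows onward as task data. A minimal sketch with hypothetical variable names and conditions (the real model's script may differ):

```python
# Hypothetical form variables made available in the script task's namespace.
years_of_service = 6
completed_training = True
department = "engineering"

# Calculate a score based on simple conditions; the result is stored in
# `score`, which later tasks in the process can read from task data.
score = 0
if years_of_service >= 5:
    score += 40
if completed_training:
    score += 30
if department == "engineering":
    score += 30
```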
@ -95,69 +99,92 @@ del(k)
Please see the [implementing files themselves](https://github.com/sartography/spiff-arena/tree/main/spiffworkflow-backend/src/spiffworkflow_backend/scripts) for the gory details.
### `delete_process_instances_with_criteria`
This function deletes process instances that match the provided criteria.
### `get_all_permissions`
This function gets all permissions currently in the system.
### `get_current_task_info`
This function returns the information about the current task.
### `get_current_user`
This function returns the current user.
### `get_data_sizes`
This function returns a dictionary of information about the size of task data.
### `get_encoded_file_data`
This function returns a string which is the encoded file data.
This is a very expensive call.
### `get_env`
This function returns the current environment - i.e., testing, staging, production.
### `get_frontend_url`
This function returns the URL to the frontend.
### `get_group_members`
This function returns the list of usernames of the users in the given group.
### `get_last_user_completing_task`
This function returns the last user who completed the given task.
### `get_localtime`
This function converts a Datetime object into a Datetime object for a specific timezone.
### `get_process_initiator_user`
This function returns the user that initiated the process instance.
### `get_secret`
This function returns the value for a previously configured secret.
### `get_task_data_value`
This function checks to see if a given value is in task data and returns its value.
If it does not exist or is None, it returns the default value.
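In pure-Python terms, the described behavior is roughly equivalent to this stand-in (not the real implementation, and the real function is called without passing task data explicitly):

```python
def get_task_data_value_standin(task_data: dict, key: str, default=None):
    # Return the value for `key` from task data; fall back to `default`
    # when the key is absent or its value is None, as described above.
    value = task_data.get(key)
    return default if value is None else value
```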
### `get_toplevel_process_info`
This function returns a dictionary of information about the currently running process.
### `get_url_for_task_with_bpmn_identifier`
This function returns the URL to the task show page for a task with the given BPMN identifier.
The script task calling this MUST be in the same process as the desired task and should be next to each other in the diagram.
### `get_user_properties`
This function gets the user properties for the current user.
### `markdown_file_download_link`
This function returns a string which is a markdown format string.
### `refresh_permissions`
This function adds permissions using a dictionary.
### `set_user_properties`
This function sets given user properties on the current user.
### `times_executed_by_user`
This function returns a number indicating how many times the user has started an instance of the current process model.
### `user_has_started_instance`
This function returns a boolean to indicate if the user has started an instance of the current process model.


@ -1,6 +1,7 @@
# How to Create a BPMN Diagram
Starting to model a business process can indeed be a challenging task, especially when multiple departments and users are involved.
Here are some helpful tips to guide you through the process and create effective process models:
**Understand BPMN Symbols:**
Begin by thoroughly understanding the meaning and usage of each BPMN symbol. This will ensure that you use the symbols correctly to represent the various elements of your business process. Refer to the [Learn Basics](../appendices/bpmn_terminology.md) section to learn more about each symbol.


@ -1,6 +1,7 @@
# Data Objects
In BPMN (Business Process Model and Notation), a data object represents the information or data used and produced by activities within a business process.
It signifies the data elements or artifacts that are relevant to the process and provides a means to model the flow of data through the process.
They aid in elucidating the data flow and dependencies within the process, thus making it more straightforward to comprehend how information is utilized and transformed throughout the process execution.
**Reasons to use data objects:**
@ -21,19 +22,23 @@ They aid in elucidating the data flow and dependencies within the process, thus
![data_input](images/data_input.png)
This represents the data or information that is needed as an input to initiate or carry out a specific task or process.
BPMN input defines the data elements that must be provided or available for the task to be performed.
### Data Output
![data_output](images/data_output.png)
This signifies the data or information that is created or generated as a result of executing a task or process.
BPMN output describes the data elements that are produced or altered during the execution of the task.
### Data Object Reference
![data_object_reference](images/data_object_reference.png)
A Data Object in BPMN typically signifies a particular piece of information or a data entity that is exchanged or manipulated during the course of a business process.
It can represent both physical and digital data.
Examples of Data Objects include documents, forms, reports, databases, or any other data entity relevant to the process.
## Data Input Configuration


@ -1,8 +1,12 @@
# Decision Tables
DMN tables are powerful tools for modeling and implementing business rules and decision logic.
They allow you to define rules and their associated conditions and actions in a structured manner.
By evaluating the conditions in each rule, the DMN engine can determine which rules are triggered based on the provided inputs and execute the corresponding actions.
This provides a flexible and configurable approach to decision-making in various scenarios.
A DMN (Decision Model and Notation) table consists of several components that help define the decision logic and structure the decision-making process.
The main components of a DMN table are:
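To make the idea concrete outside of DMN notation, a first-match decision table can be sketched in plain Python. The rules below are hypothetical (loosely echoing the Essential Example's score messages), and this is a conceptual illustration, not the DMN engine's implementation:

```python
# Each rule pairs a condition on the input with an output (the "action").
# First-match evaluation returns the output of the first rule whose
# condition holds for the provided input, mimicking a DMN hit policy.
rules = [
    (lambda score: score == 0, "Please try again."),
    (lambda score: 0 < score < 3, "At least one correct response!"),
    (lambda score: score >= 3, "Perfect score!"),
]

def decide(score: int) -> str:
    for condition, message in rules:
        if condition(score):
            return message
    return "No rule matched"
```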
## DMN Components


@ -1,6 +1,7 @@
# Error Events
Error Events in Business Process Model and Notation (BPMN) are pivotal in managing exceptions and errors that occur within business process workflows.
These events enable processes to handle errors gracefully, ensuring that workflows are robust, resilient, and capable of addressing unforeseen issues efficiently.
Below, we delve into the types of Error Events, offering definitions and enriched context for their practical applications.
## Types of Error Events
@ -9,7 +10,8 @@ Below, we delve into the types of Error Events, offering definitions and enriche
![Error Start Event](images/error-events1.png)
The Error Start Event triggers the start of a subprocess in reaction to an error identified in a different process or subprocess.
It is a specialized event used to initiate error handling workflows dynamically.
**Reason to Use**:
- **Modular Error Handling**: Separates error handling logic into dedicated subprocesses, improving process organization and maintainability.
@ -27,6 +29,7 @@ In an automated supply chain system, an Error Start Event initiates a "Supplier
```
### 2. Error Intermediate Event/Error Boundary Event
![Error intermediate Event](images/error_intermediate_event.png)
An Error Boundary Event is attached to an activity, such as a service task, and is designed to catch errors that occur during the execution of that activity, allowing for an immediate transition to an error handling flow.
@ -43,7 +46,10 @@ Positioned within the normal flow of a process, this event signifies where an er
![Error Boundary Event Error Event](images/error_boundary_event.png)
In a customer order workflow, when payment is initiated, a "Process Payment" service task interacts with an external gateway.
An attached Error Boundary Event catches errors like "Payment Gateway Timeout" or "Payment Declined."
For timeouts, the process redirects to "Retry Payment," allowing another attempt or urging the customer to use a different method.
This setup ensures efficient error handling, guiding the process toward resolution based on the error type.
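The routing done by an Error Boundary Event is closely analogous to exception handling in ordinary code. The sketch below is only an illustration of that analogy, not SpiffWorkflow API; the task and error names are hypothetical:

```python
class PaymentGatewayTimeout(Exception):
    """Raised when the external payment gateway does not respond in time."""

class PaymentDeclined(Exception):
    """Raised when the gateway rejects the payment."""

def run_payment_step(process_payment, retry_payment, ask_for_other_method, order):
    # The try-block plays the role of the "Process Payment" service task;
    # each except-branch corresponds to one error handling flow caught by
    # the attached Error Boundary Event.
    try:
        return process_payment(order)
    except PaymentGatewayTimeout:
        return retry_payment(order)          # redirect to "Retry Payment"
    except PaymentDeclined:
        return ask_for_other_method(order)   # urge a different payment method
```

As in BPMN, an error that is not listed (not caught) would propagate upward instead of being handled here.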
### 3. Error End Event
![Error End Event](images/ErrorEndEventExample.png)
In a retail inventory management workflow, an End Error Event within a stock replenishment subprocess indicates the detection of an "Out of Stock" condition for a critical product that cannot be immediately resolved.
This error propagates to the main inventory management process, prompting a temporary pause in sales operations for the affected product.
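Error propagation from a subprocess to its parent can be pictured as a nested call re-raising an exception that the caller handles. A minimal sketch of this analogy, with invented names:

```python
class OutOfStock(Exception):
    """Stands in for the 'Out of Stock' error thrown by the Error End Event."""

def replenish_stock(product, supplier_has_stock):
    # The subprocess: ends in an Error End Event when replenishment fails.
    if not supplier_has_stock:
        raise OutOfStock(product)
    return "replenished"

def manage_inventory(product, supplier_has_stock):
    # The parent process: the error propagates up out of the subprocess,
    # and the parent reacts by pausing sales for the affected product.
    try:
        replenish_stock(product, supplier_has_stock)
        return "selling"
    except OutOfStock:
        return "sales paused"
```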
## Example 1: Error Boundary Events in SpiffArena
In this example, we're modeling a process in BPMN that involves fetching employee data from an external HR system (BambooHR) and handling potential errors using an Error Boundary Event.
This process begins with a simple task and moves through a service task designed to interact with the BambooHR API, with specific error handling in place.
### Process Overview:
![Error Event](images/error_event_example5.png)
Attached to the service task, this event catches `error_0` ("Invalid Schema"), setting an alternative path for error handling.
The error details are stored in a variable named `err0`.
5. **Manual Tasks for Error Handling and Success Path**:
### Explanation:
The process starts when a department manager requests the details of an employee for performance evaluation.
The service task activates, attempting to fetch the requested data from BambooHR.
If the data retrieved does not match the expected schema—perhaps due to an API update or misconfiguration—the Error Boundary Event triggers, diverting the process to a corrective task.
Here, an IT specialist might investigate the schema issue, adjust the service task's parameters, or manually retrieve the required information.
Concurrently, the successful execution path without errors would lead directly to the HR department for immediate use of the employee data, streamlining departmental operations and decision-making.
This BPMN example highlights the utility of Error Boundary Events in ensuring process resilience, especially when integrating external services.
## Example 2: Error Boundary Events in Subprocess
In this example, we're outlining a BPMN process that demonstrates how to handle errors within an expanded subprocess and subsequently manage the error through an Error Boundary Event.
### Process Description:
![Error Event](images/error_boundary_event_with_expanded-subprocess2.png)
This element encapsulates a more detailed process flow within itself, starting with its own Start Event and comprising several tasks.
The **Start Event** marks the beginning of the subprocess.
Next, the **Manual Task 1** represents an initial activity within the subprocess that could be anything from data entry to review by a human operator.
Then the Error End Event is used to throw an error within the process.
The setup of the error end event is:
- **Error ID Setup**
![Error Event](images/error_boundary_event_with_expanded-subprocess4.png)
Attached to the Expanded Subprocess, an Error Boundary Event is designed to catch Error1 emanating from the subprocess, particularly from the Error End Event.
The error caught is identified by the code "Err1", and its details are captured in a variable named `message`.
4. **Manual Task**
### Conclusion
Error Events in BPMN offer a nuanced approach to managing errors within business processes.
By defining Error Start, End, and Boundary Events, BPMN provides process designers with the tools necessary to anticipate, signal, and handle errors efficiently.
It's crucial to remember that whether a process is created or terminated in these contexts depends on whether non-interrupting or interrupting events are utilized.
## Configuring Escalation Events Properties
Setting up an escalation event within a workflow in SpiffWorkflow involves defining both the escalation trigger (throw event) and the point where the escalation is handled (catch event).
Here's how to set up these components:
**Define the Escalation Catch Event**:
This can be a boundary event attached to a task where the escalation should be caught and handled, or an intermediate event in the workflow where the escalation process converges.
For a boundary catch event, attach it to the task designated to handle the escalation.
For an intermediate catch event, place it at the appropriate point in the process flow.
![Escalation Order](images/Escalation_Order_2.png)
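Conceptually, the throw/catch pairing works like a small publish-subscribe hookup: the throw event announces an escalation code, and any registered catch event for that code reacts while (for non-interrupting events) the main flow carries on. This toy Python sketch is only an analogy, not SpiffWorkflow's API, and the escalation code is invented:

```python
from collections import defaultdict

class EscalationBus:
    """Toy registry pairing escalation throw events with catch handlers."""

    def __init__(self):
        self.handlers = defaultdict(list)

    def catch(self, code, handler):
        # Analogous to defining a catch event listening for `code`.
        self.handlers[code].append(handler)

    def throw(self, code, payload):
        # Analogous to a throw event: notify every matching catcher.
        # Non-interrupting behavior: the caller simply continues afterwards.
        for handler in self.handlers[code]:
            handler(payload)

bus = EscalationBus()
notified = []
bus.catch("order_delay", notified.append)   # hypothetical escalation code
bus.throw("order_delay", {"order_id": 42})
```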
# Events
Events are specific occurrences that dictate the flow or outcome of processes.
They are visually represented by circles.
Based on their position and function, events are categorized as: Start Events, Intermediate Events, or End Events.
![start_message_event](images/events_categories.png)
- Link Events
- Terminate Events
We will delve into the various event types, exploring their categorizations and applications.
It's vital to note that not every type of event is suitable for all variations; there are specific rules and guidelines governing their use.
As highlighted in the table, there are distinct limitations.
For instance, while you cannot initiate a primary process with an escalation event, it's entirely permissible to kickstart a subprocess with such an event.
To adhere to BPMN standards, it's crucial to consult and follow the provided guide.
Always ensure that your processes and diagrams conform to these accepted norms.
![events_table](images/events_table.png)
![start_event_t](images/start_event_t.png)
Start events signify the beginning of a specific process and can consist of catch, timer, or conditional events.
They do not possess any incoming sequence flows.
**Reasons to Use a Start Event:**
![intermediate_event](images/intermediate_event.png)
An Intermediate Event takes place between the beginning and the conclusion of a process.
They can either wait for a specific occurrence (such as receiving a message), initiate an occurrence (like dispatching a message), or pause until a condition is fulfilled or a designated time elapses.
**Reasons to Use an Intermediate Event:**
![intermediate_throw_message_event](images/end_event.png)
End events signify the end of a particular process.
Once this event is reached, the process stops, and no further activities within this process will be executed.
**Reasons to Use an End Event:**
![interrupting_group](images/interrupting_group.png)
When an interrupting event is triggered, it interrupts the flow of the process or activity it's attached to.
Once triggered, the current activity or process is halted, and the process flow directed by the interrupting event is taken up.
**Reasons to Use an Interrupting Event:**
![non-interrupting_group](images/non-interrupting_group.png)
When a Non-Interrupting event is triggered, it does not halt or disrupt the main activity it's attached to.
Instead, the process or activity continues its execution in parallel with the event's associated flow.
**Reasons to Use a Non-Interrupting Event:**
![throw_events](images/throw_events.png)
Throw events are used to "send" or "throw" a particular type of event.
In BPMN, when we talk about a throw event, we're generally discussing an activity or a situation where a specific signal, message, or error is being generated or sent out.
It is the trigger, initiating an action.
![catch_events](images/catch_events.png)
Catch events are used to "receive" or "catch" a particular type of event.
In BPMN, when we refer to a catch event, we're talking about a point in the process where it's waiting for or listening to a specific event from another process or activity.
It is the listener, awaiting a trigger.
```{admonition} Note
⚠ End Events are always throw events and cannot act as catch events. They serve as triggers to initiate subsequent processes or actions.
```
![boundary_event](images/boundary_event.png)
Boundary events are attached to specific activities in a BPMN diagram, representing something that could happen while the activity is being executed.
If the boundary event gets triggered, it can interrupt or not interrupt the attached activity, depending on its type.
**Reasons to Use a Boundary Event:**
![non-boundary_event](images/non-boundary_event.png)
Non-boundary Events stand alone in the BPMN process flow.
They aren't attached to any activity, and they represent something that happens between activities.
This is unlike boundary events, which are attached to tasks or sub-processes.
**Reasons to Use a Non-Boundary Event:**
**Example:**
Consider a manufacturing scenario.
If we want to initiate a separate process before starting the manufacturing task, we deploy an intermediate signal event.
This event's role is specifically to trigger a distinct process.
Unlike a start event, if a boundary catch event isn't active (meaning there's no active instance waiting at that signal event), the thrown signal won't be caught, and the separate process remains unused.
If our goal is to schedule the manufacturing to kick off at a specific time, say 6 pm, or delay it for a short while, we can represent this with a timer.
The workflow will pause at this event until the timer's conditions are satisfied.
Conditions function as gatekeepers.
For instance, the process will halt until the 'production_sheet_signed' variable evaluates as true, indicating the production sheet has been signed and manufacturing can commence.
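A conditional catch event behaves like a poll-until-true loop over process data. A minimal sketch of that semantics, assuming a simple polling model (not SpiffWorkflow's actual scheduler):

```python
import time

def wait_until(condition, timeout=5.0, poll=0.01):
    """Block until condition() is true, mimicking a conditional catch event.

    Returns True once the condition holds, or False if the timeout elapses
    first (a real engine would simply keep the instance waiting).
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(poll)
    return False
```

For example, `wait_until(lambda: process_data["production_sheet_signed"])` would hold the flow until another task sets that variable to true.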
The concept mirrors that of messages and signals.
A message, much like a signal, will only be caught in an intermediate event if there's an instance at the ready in the associated catch event.
```{admonition} Note
⚠ Remember the key distinction between signals and message events is that while messages adhere to a one-to-one correspondence, signals can potentially relate to multiple recipients in a one-to-many fashion.
```
# Gateways
Gateways in BPMN are essential for controlling the flow of a business process.
They act as decision points where the process flow can diverge into multiple paths or converge back into a single flow.
Gateways are used to evaluate conditions or rules and determine the appropriate path for the process to follow.
**Reasons to use a Gateway:**
![exclusive_gateway](images/exclusive_gateway.png)
Exclusive Gateway (XOR): An Exclusive Gateway represents a decision point where only one outgoing sequence flow can be taken.
It is used when the process flow needs to make a mutually exclusive choice between different paths.
Each outgoing sequence flow has a condition associated with it, and the flow with a true condition is selected.
**Default Flow:**
Whenever the conditions on the other paths aren't met, the instance will proceed via the Default Flow.
In other words, if none of the conditions for the outgoing sequence flows are met, the Default Flow provides an alternative route for the process to follow.
This ensures that the process can still progress even if none of the explicitly defined conditions are satisfied, providing a fallback option for handling unexpected scenarios.
![exclusive_gateway_default](images/exclusive_gateway_default.png)
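An Exclusive Gateway with a Default Flow evaluates like a first-match if/elif/else chain: the first outgoing flow whose condition is true wins, and the default is the else. This illustrative Python sketch uses invented flow names and a dictionary of process variables:

```python
def exclusive_gateway(context, conditional_flows, default_flow):
    """Pick exactly one outgoing flow: the first whose condition is true,
    or the Default Flow when no condition matches."""
    for target, condition in conditional_flows:
        if condition(context):
            return target
    return default_flow

# Hypothetical payment-routing flows; conditions read process variables.
flows = [
    ("process_card", lambda c: c["payment_method"] == "card"),
    ("process_voucher", lambda c: c["voucher"]),
]
next_task = exclusive_gateway(
    {"payment_method": "cash", "voucher": False}, flows, default_flow="process_other"
)
```

Because evaluation stops at the first true condition, overlapping conditions never activate two paths, and the default guarantees the instance can always proceed.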
![exclusive_gateway_examples](images/exclusive_gateway_examples.png)
For example, consider a scenario where multiple variables are involved in a process, and it's possible for more than one variable to be true in the process context (see image 2 above).
To guarantee that only one condition will be true, you can use additional expressions, such as "voucher == false" to specify distinct paths for each condition.
This ensures that only one branch of the expression will be true, preventing conflicts and providing a clear direction for the process flow (see image 3 above).
In cases where there might be more options to evaluate later, and all options will follow the same route, consider using a Default Flow.
This can be particularly useful when dealing with scenarios where additional payment gateways might be added in the future, but they will all follow the same processing path.
You won't have to modify the expression whenever new payment gateways are added, only if the underlying logic changes (see image 4 above).
**Join:**
To join or merge an Exclusive Gateway (see Image 1) is not mandatory; it depends on the specific scenario. When the process encounters the Exclusive Merge, only one of the incoming sequence flows will be activated, indicating which path was completed first or satisfied its specific condition.
While the Exclusive Merge is commonly used alongside the Exclusive Gateway, it is also compatible with other gateway types in BPMN.
It serves as a valuable mechanism for synchronizing and consolidating multiple parallel paths, ensuring that only one path is followed based on the given conditions.
![exclusive_merge](images/exclusive_merge.png)
![inclusive_gateway](images/inclusive_gateway.png)
Inclusive Gateway (OR): Represents a decision point, but it allows multiple outgoing sequence flows to be taken.
It is used when the process flow needs to make an inclusive choice, where multiple paths can be followed simultaneously.
Each outgoing sequence flow can have a condition associated with it, but even if multiple conditions evaluate to true, all the flows are taken.
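In contrast to the exclusive case, an Inclusive Gateway activates every flow whose condition is true. A minimal sketch, again with hypothetical flow names:

```python
def inclusive_gateway(context, conditional_flows):
    """Take every outgoing flow whose condition is true (one or more)."""
    taken = [target for target, condition in conditional_flows if condition(context)]
    if not taken:
        # At least one path must be true for the process to continue.
        raise ValueError("no outgoing condition was true")
    return taken

# A candidate strong in both skills activates both department paths.
candidate = {"problem_solving": True, "coding": True}
departments = inclusive_gateway(candidate, [
    ("analysis_department", lambda c: c["problem_solving"]),
    ("engineering_department", lambda c: c["coding"]),
])
```

Compare with the exclusive case: the list comprehension collects all matches instead of returning at the first one.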
```{admonition} Note
⚠ Note that Default Flow is not possible with Inclusive Gateways.
```
![inclusive_gateway_conditions](images/inclusive_gateway_conditions.png)
For example, in a career matching system, individuals can input their skillsets, educational qualifications, and work experience.
An Inclusive Gateway can be employed to assess the compatibility of the individual's skillsets with various job roles.
The process may diverge into multiple paths, each representing different job categories.
For example, some candidates may possess strong problem-solving skills but lack coding proficiency, making them suitable for specific departments that require problem-solving expertise.
On the other hand, other candidates might have a combination of problem-solving and coding skills, making them eligible for multiple departments where these skills are essential; this means the result is not exclusive to one path.
**Join:**
The purpose of an Inclusive Gateway merge is to consolidate multiple parallel paths that were previously split.
Unlike an Exclusive Gateway merge, which selects only one path based on conditions, the Inclusive Gateway merge evaluates all incoming sequence flows and allows all paths with true conditions to proceed.
This means that if multiple paths were activated during the parallel execution, all these paths will converge.
![inclusive_gateway_merge](images/inclusive_gateway_merge.png)
It's important to note that the use of an Inclusive Gateway and its corresponding merge is not mandatory in a process.
They can be used independently, depending on the specific scenario and process requirements.
In some cases, only the Inclusive Gateway might be used to split the flow into multiple paths based on different conditions without necessarily requiring a merge later in the process.
Similarly, the Inclusive Gateway merge can be used without an Inclusive Gateway to consolidate parallel paths from other types of gateways, or even from different parts of the process.
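To make the split-and-merge behavior concrete, here is a minimal Python sketch of Inclusive Gateway semantics. This is a conceptual illustration only, not SpiffWorkflow engine code; the flow names and candidate data are hypothetical.

```python
# Conceptual sketch of Inclusive Gateway semantics: every outgoing flow whose
# condition is true is activated, and the merge waits for all activated paths
# before the process continues.

def inclusive_split(task_data, outgoing_flows):
    """Return every flow whose condition evaluates to True."""
    active = [name for name, condition in outgoing_flows.items() if condition(task_data)]
    if not active:
        raise RuntimeError("an Inclusive Gateway needs at least one true condition or a default flow")
    return active

# Hypothetical conditions from the candidate-screening example above.
flows = {
    "to_problem_solving_dept": lambda d: d["problem_solving"],
    "to_coding_dept": lambda d: d["coding"],
}

candidate = {"problem_solving": True, "coding": True}
active_paths = inclusive_split(candidate, flows)

# The merge converges exactly the paths that were activated by the split.
arrived = set(active_paths)  # each branch reports in as it finishes
merge_complete = arrived == set(active_paths)
```

With both conditions true, both paths are activated and the merge waits for both; a candidate with only one skill would activate a single path.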
## Parallel Gateway
![parallel_gateway](images/parallel_gateway.png)
Parallel Gateway (AND): This gateway is used to split the process flow into multiple parallel paths, allowing concurrent execution of activities.
All outgoing sequence flows from a Parallel Gateway are taken simultaneously, and the process flow continues along all the paths simultaneously.
Unlike other gateways, a parallel gateway does not dictate the flow based on conditions.
Instead, it ensures that all outgoing paths are followed concurrently, regardless of any conditions that may exist.
This means that tasks or activities connected to the outgoing sequence flows will be executed simultaneously and independently from one another.
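This unconditional, all-branches behavior can be sketched in plain Python using concurrent execution. The branch activities here are hypothetical placeholders, not part of any SpiffWorkflow API.

```python
# Conceptual sketch of Parallel Gateway semantics: every outgoing flow is
# taken, with no conditions, and the branches run independently of each other.
from concurrent.futures import ThreadPoolExecutor

def pack_order():       # hypothetical branch activity
    return "order packed"

def send_invoice():     # hypothetical branch activity
    return "invoice sent"

branches = [pack_order, send_invoice]

# All branches are started at the same time; none can be skipped.
with ThreadPoolExecutor() as pool:
    futures = [pool.submit(branch) for branch in branches]
    results = [future.result() for future in futures]
```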
```{admonition} Note
⚠ Note that Default Flow is not possible with Parallel Gateways.
```
![event_based_gateway](images/event_based_gateway.png)
Event-Based Gateway: An Event-Based Gateway is used to represent a branching point based on events occurring in the process.
It is often associated with intermediate events in the process flow.
When an event occurs, the gateway determines the subsequent flow based on event definitions and conditions.
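The "first event wins" behavior can be sketched as follows. This is a conceptual illustration with hypothetical event and task names, not engine code.

```python
# Conceptual sketch of Event-Based Gateway semantics: the process waits on
# several possible events and follows the path of whichever occurs first.
waiting_for = {
    "reply_received": "process_reply",   # hypothetical event -> task mapping
    "timer_expired": "send_reminder",
}

def on_first_event(event_name):
    """Continue along the path of the first event; stop listening for the rest."""
    next_task = waiting_for[event_name]
    waiting_for.clear()  # the gateway stops listening once one event fires
    return next_task

next_task = on_first_event("timer_expired")
```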
## Gateway Configuration
Unlike most tasks in BPMN, the configuration for Gateways is primarily set on the outgoing sequence flows, not in the Side Panel.
Every Gateway, with the exception of the Parallel Gateway, requires conditions to be established on these outgoing sequence flows.
These conditions dictate the direction of the process flow.
It's also crucial to understand that conditions aren't required for incoming sequence flows to Gateways.
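As a rough mental model, each outgoing sequence flow carries a small condition expression that is evaluated against the task data. The sketch below illustrates the Exclusive Gateway case with a default flow; the flow names and expressions are hypothetical, and `eval()` merely stands in for the engine's expression evaluation.

```python
# Conceptual sketch of gateway configuration: the conditions live on the
# outgoing sequence flows, and the first flow whose expression is true is
# taken; the default flow applies when no condition matches.
flow_conditions = [
    ("approve_flow", "score >= 80"),   # hypothetical flow names and expressions
    ("review_flow", "score >= 50"),
]
default_flow = "reject_flow"

def choose_outgoing_flow(task_data):
    for flow_name, expression in flow_conditions:
        # eval() stands in here for the engine's condition evaluation.
        if eval(expression, {}, dict(task_data)):
            return flow_name
    return default_flow

chosen = choose_outgoing_flow({"score": 65})
```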
**Gateway:**

## BPMN and SpiffWorkflow
Business Process Model and Notation (BPMN) is a diagramming language for specifying business processes.
BPMN bridges the gap between business and IT, creating a shared process language for both parties.
BPMN efficiently depicts the details of process behaviors in a diagram.
The precision of its meaning allows it to describe the technical details that control process execution in an automation engine.
SpiffWorkflow enables you to create code to execute a BPMN diagram directly.
By using SpiffWorkflow, a client can create the BPMN diagram and have their product work without the need for you to modify the Python code, thus improving response and turnaround time.

Flow objects are divided into three groups: Events, Gateways, and Tasks.

### Events
Events, represented by circles, describe occurrences during a process.
There are three main types of events in business process modeling: start events, intermediate events, and end events.
| **Event** | **Symbol**| **Description** |
|-----------|-----------|-----------------|

### Gateways
Gateways represent decision points in a process.
Based on certain conditions or rules, they determine which path the process will follow.
There are various types of gateways:
| **Gateway** | **Symbol**| **Description** |
|---------------|-----------|-----------------|

### Tasks
Tasks represent activities or work that needs to be done as part of a process.
They can either be manual tasks that require human intervention or automated tasks that are performed by systems or applications.
| **Task** | **Symbol** | **Description** |
|---------------|------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|

## Connecting Objects
Connecting objects are lines that connect BPMN flow objects.
Three different types exist: sequence flows, message flows, and associations.
| **Connecting Objects** | **Symbol** | **Description** |
|---------------|------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|

## Artifacts
Artifacts are used to provide additional information or documentation within a process.
They include data objects (which represent information or data needed for the process), annotations (which provide explanatory or descriptive text), and groups (which are used to visually group related elements).
| **Artifact** | **Symbol** | **Description** |
|---------------|------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|

## Swimlanes
Swimlanes are used in a BPMN diagram to organize aspects of a process.
They visually group objects into lanes, with each aspect of the process added to a separate lane.
These elements can be arranged either horizontally or vertically.
Not only do swimlanes organize activities into separate categories, but they also reveal delays, inefficiencies, and the individuals responsible for each step in a process.
![Untitled](images/BPMN_swimlane-500x197.png)

# Message Event
A Message Event acts as a channel for the exchange of information between different process participants or external systems.
While it might be tempting to associate "message events" with emails, their scope extends beyond digital correspondence.
They signify the transmission of information between various process components, whether within the same process or across different processes.
![message_relationship](images/relationship_message.png)

![intermediate_throw_message_event](images/intermediate_throw_msg_event.png)
An Intermediate Throw Event sends a message from within the process to a receiver.
When the process execution reaches this event, it triggers a message event that can be captured by a corresponding Message Catch Event in another process or by an external system.
![message_event_example_4](images/msg_event_example_2.png)
In the example provided, once a quote is finalized, the sales team forwards a formal quotation.
Similarly, upon receiving a purchase order from the customer, the sales department generates an order confirmation receipt and sends it to the customer.
## Intermediate Message Catch Event
![intermediate_catch_message_event](images/intermediate_catch_msg_event.png)
An Intermediate Catch Event is used to wait for and capture a specific message from another source.
Once activated upon receiving the designated message, it allows the process flow to continue from that point onward.
It's crucial to understand that the process instance remains in a waiting state until triggered by another source or process.
This fundamental distinction sets Intermediate Catch Events apart from Intermediate Throw Events, as Catch Events exclusively await external triggers, while Throw Events initiate those triggers.
![message_event_example_5](images/msg_event_example_5.png)
Alternatively, Message Events can be utilized beyond the confines of pools and lanes.
The process outlined in the previous section can be split into two distinct BPMN files without affecting its functionality, as demonstrated in the example above.
```{admonition} Note
⚠ It should be noted that, in this situation, connectors cannot visually represent the link between the throw and catch events. Further in this document, the topic of correlation is discussed to clarify how these events are interconnected.
```
In the given example, there are two Intermediate Catch Events.
One waits for confirmation from the customer, and the other depends on the shipping department's verification of dispatch before producing the invoice.
Without feedback from both the customer and the shipping department at this stage, the process instance won't move to the subsequent step.
```{admonition} Note
⚠ While it is generally true that most Throw Events have corresponding Catch Events, it is not an absolute rule. The need for a Catch Event alongside a Throw Event varies depending on the specific scenario and the requirements of the process. There may be cases where a Throw Event initiates an action without requiring a subsequent Catch Event to capture its effects. The decision to use a Catch Event in conjunction with a Throw Event is determined by the requirements of the particular process.
```
![end_message_event](images/end_msg_event.png)
This type of event signifies the completion of a process and indicates that a message is sent to an external recipient to notify them of the process's conclusion.
It serves as the endpoint of the process and sends a message when the process reaches this event.
![message_event_example_4](images/msg_event_example_4.png)
Please note that the End Event, when using pools, signifies the conclusion of the process within that specific pool, but it does not necessarily indicate the end of the entire process.
In the provided example, the final step involves sending the customer an invoice.
Prior to this, the last step for the shipping department was to send a confirmation.
```{admonition} Note
⚠ Start Events mark the initiation point of a process. Intermediate Events occur during the course of the process, throwing, capturing and reacting to specific occurrences or messages from external sources or other processes. On the other hand, End Events denote the conclusion of the process, signifying its termination or reaching a final state.
```

## Correlation
A singular Throw Message Event corresponds exclusively to a single active Catch Message Event.
This correlation is one-to-one, unlike Signal Events that could be sent to multiple active Signal Catch Events.
It is important to configure the correlation of the Catch and Throw Events.
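The one-to-one pairing can be pictured as a lookup keyed by the message name plus a correlation value. This is a conceptual sketch only; the message names, keys, and instance identifiers are hypothetical, and real correlation configuration happens in the event's properties.

```python
# Conceptual sketch of message correlation: a thrown message is delivered to
# the single waiting catch event whose correlation key matches, one-to-one.
waiting_catch_events = {
    ("order_confirmation", "order-1001"): "instance-A",   # hypothetical keys
    ("order_confirmation", "order-1002"): "instance-B",
}

def deliver_message(name, correlation_key):
    """Route the message to exactly one matching catch event and consume it."""
    return waiting_catch_events.pop((name, correlation_key))

target = deliver_message("order_confirmation", "order-1002")
```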
## Message Event Configuration

## **Sequential Execution**
Tasks are executed one after another, ensuring that each task instance begins only after the previous one has completed.
In the case of a sequential multi-instance activity, the instances are executed one at a time.
When one instance is completed, a new instance is created for the next element in the inputCollection.
![Multi_instance_Sequential](images/multiinstance_sequential_example.png)

## **Parallel Execution**
All instances of the task are launched simultaneously, allowing for concurrent processing of the collection elements.
In the case of a parallel multi-instance activity, all instances are created when the multi-instance body is activated.
The instances are executed concurrently and independently from each other.
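The difference between the two modes can be sketched as iteration over an inputCollection. The collection contents and the `review` activity below are hypothetical stand-ins, not part of any SpiffWorkflow API.

```python
# Conceptual sketch of multi-instance execution over an inputCollection:
# sequential runs one instance at a time; parallel creates them all at once.
from concurrent.futures import ThreadPoolExecutor

input_collection = ["alice", "bob", "carol"]   # hypothetical collection

def review(item):
    return f"reviewed {item}"

# Sequential: each instance starts only after the previous one completes.
sequential_results = []
for element in input_collection:
    sequential_results.append(review(element))

# Parallel: all instances are created up front and run concurrently;
# map() still returns the results in collection order.
with ThreadPoolExecutor() as pool:
    parallel_results = list(pool.map(review, input_collection))
```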
![Multi_instance_parallel](images/multiinstance_parallel_example.png)

## Components of Multi-Instance Tasks
Multi-instance tasks comprise several key properties that define their behavior:
```{image} ./images/multiinstance_properties.png
```
### Process Overview: ### Process Overview:
1. **Start Event**: Marks the initiation of the process.
2. **Script Task - Create Dictionary**: This task initializes a list (array) of dictionaries, each representing a composer with their name and associated genre. The script effectively sets up the data structure that will be manipulated in subsequent steps of the process.
### Summary:
This multi-instance example in a BPMN process highlights the capability to dynamically handle collections of data through scripting and manual tasks.
By iterating over a list of composers, allowing for the editing of each item, and finally displaying the edited list, the process demonstrates how data can be manipulated and presented in a structured workflow, showcasing the flexibility and power of BPMN for data-driven processes.
### Loops
Standard loops in Business Process Model and Notation (BPMN) are a fundamental mechanism to model repetitive tasks within a workflow.
These loops allow for the execution of a specific task or sequence of tasks repeatedly until a predefined condition is met, mirroring traditional loop constructs found in programming languages.
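A standard loop behaves much like a `while` loop with an optional loop maximum. The sketch below is a conceptual illustration; the task and condition are hypothetical placeholders.

```python
# Conceptual sketch of a BPMN standard loop: the task repeats until a
# condition is met, with a loop maximum guarding against endless repetition.
def run_standard_loop(task, loop_condition, loop_maximum=100):
    for iteration in range(1, loop_maximum + 1):
        result = task(iteration)
        if loop_condition(result):
            return iteration, result
    raise RuntimeError("loop maximum reached before the condition was met")

# Hypothetical task: keep retrying until a value reaches a threshold.
iterations, final_value = run_standard_loop(
    task=lambda i: i * 10,
    loop_condition=lambda value: value >= 30,
)
```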
# Pools and Lanes
A Pool represents a participant and can be seen as a self-contained process.
This participant can be an internal entity (e.g., a department within a company) or an external entity (e.g., a customer or another company).
Lanes are helpful in highlighting which specific role or department is responsible for certain activities or tasks in a process.
A process can have one or more Pools, each with one or more Lanes.
**Reasons to Use Pools and Lanes:**

## Pools
A Pool can be configured as an "Empty Pool" (collapsed) or an "Expanded Pool".
You can choose the desired configuration 🔧 from the element's options after dragging it onto your diagram.
![pools_and_lanes](images/pools_and_lanes_1.png)
Empty Pools are used to represent role players in cases where a specific process is neither known nor required, but the interaction points remain valuable.
They serve to illustrate the engagement of certain entities without detailing their internal processes. For example, we don't know a customer's specific process, but it matters when we interact with them to complete our process.
Conversely, Expanded Pools are employed when the processes are known and hold relevance within the diagram's context.

![lanes](images/lanes_1.png)
Lanes are incorporated into Pools when the roles they represent belong to the same entity.
However, if a process doesn't logically fit within the same Pool, like those for different organizations or businesses, it's more appropriate to represent it as a separate Pool rather than another Lane.
![lanes](images/separate_pools_1.png)

# Signal Event
A Signal Event is a type of event that provides a mechanism for communication across different processes.
Unlike messages that are sent from a specific sender to a specific receiver, signals are broadcast to multiple recipients.
When a signal is thrown, all active processes that are listening for that signal can catch and react to it.
Signals do not have any expectation of a response.
Once a signal is sent out, it does not wait for a reply.
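The broadcast semantics can be sketched as a fan-out to every active listener. This is a conceptual illustration; the signal and task names are hypothetical.

```python
# Conceptual sketch of signal broadcasting: one thrown signal reaches every
# active listener, and no reply is expected from any of them.
active_listeners = {
    "payment_processed": ["send_notification", "pack_order", "schedule_delivery"],
}

def throw_signal(signal_name):
    """Broadcast to all current listeners (possibly none); never wait for a reply."""
    return list(active_listeners.get(signal_name, []))

reacting_processes = throw_signal("payment_processed")
```

Contrast this with the message sketch in the Correlation section, where a message is consumed by exactly one matching catch event.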
![signal_relationship](images/signal_relationships.png)

![intermediate_throw_message_event](images/intermediate_throw_signal_event.png)
An Intermediate Signal Throw Event is an event that happens in the middle of a process (not at the start or the end) and sends out a signal.
This can be caught by another process or a different part of the same process using a Signal Catch Event.
![intermediate_throw_signal_example_2](images/intermediate_throw_signal_example_2.png)

![intermediate_catch_message_event](images/intermediate_catch_signal_event.png)
An Intermediate Signal Catch Event waits for a specific signal to start or continue a process.
To "catch" means that this event is actively waiting or listening for that signal to be thrown from another part of the process or even from a different process.
![intermediate_catch_signal_example](images/intermediate_catch_signal_example.png)

**Example:**
In an online shopping system, when a customer's payment is successfully processed, an End Signal Event can be triggered.
This signal initiates three distinct processes: (1) the "Send Notification" process alerts the customer of their successful purchase, (2) the "Pack Order" process prompts the warehouse team to prepare the item for dispatch, and (3) the "Schedule Delivery" process alerts logistics to arrange for the item's delivery.
In this manner, one event efficiently orchestrates a sequence of actions across multiple departments.
![end_signal_event_example](images/signal_sync_example.png) ![end_signal_event_example](images/signal_sync_example.png)
**Example:**
Signals are instrumental in coordinating workflows among varied processes, making certain that tasks adhere to a specified order.
Leveraging intermediate catch and throw events allows one process to temporarily halt until tasks in a different process are finished.
This is especially beneficial when certain stages can only commence after the completion of others — imagine the utility of such a system across multiple departments.
This example demonstrates how Signal Events, along with a Timer Boundary Event, can be orchestrated within a BPMN process to create conditional pathways based on user actions and timed events.
The process begins with a standard initiation and primarily revolves around the user interaction with a manual task that offers multiple outcomes.
## Example: Using Signal Boundary Events as Buttons

This BPMN example showcases the flexibility of using Signal Events to create dynamic, user-driven process flows.
By incorporating manual tasks with multiple outcomes, signal-based routing, and automated timing controls, the example illustrates how complex decision logic and external system integration can be efficiently managed within a BPMN process.
![signal_event_example](images/Signal_events_spiff_example.png)

### 1. **Start Event**:
Initiates the workflow, leading to the first and main manual task.

### 2. **Manual Task with Boundary Events**:
This task is unique in that it presents the user with distinct options in the form of buttons.
The default flow is a standard submission button that, when clicked, directs the workflow towards a conventional end.

![signal_event_example](images/Signal_events_spiff_example1.png)
Attached to **My Manual Task**, three Signal Boundary Events are set to listen for specific signals.
These signals determine the flow of the process after the second button is pressed.

- **Signal Boundary Event 1**:
Catches the signal for "eat_spam" and redirects the workflow to a Manual Task named "Spam Message".
#### **Timer Boundary Event**:
Attached to **My Manual Task**, this event is configured to trigger after a specific duration, automating the process flow if the user does not interact with the manual task within the given timeframe.
Notably, this event leads the process towards an alternative path or end without requiring user input.

![signal_event_example](images/Signal_events_spiff_example6.png)
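In the BPMN XML, a timer boundary event of this kind carries a timer event definition whose duration is an ISO 8601 string (in SpiffWorkflow the duration expression is evaluated as Python, hence the quotes). The IDs and the ten-minute duration below are illustrative, not taken from this model:

```xml
<!-- Interrupting timer attached to the manual task; fires after ten minutes. -->
<bpmn:boundaryEvent id="timeout_boundary" attachedToRef="my_manual_task">
  <bpmn:timerEventDefinition>
    <bpmn:timeDuration>"PT10M"</bpmn:timeDuration>
  </bpmn:timerEventDefinition>
</bpmn:boundaryEvent>
```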
### 3. **End Events**:
The process includes multiple End Events.
One is directly connected to **My Manual Task**, concluding the workflow if the first button is used.
The others are linked to the outcomes of the Signal Boundary Events and the Timer Boundary Event, ensuring that each possible path through the process reaches a defined conclusion.

### Output:
After starting the task, the signal buttons "Eat Cheetos" and "Eat Spam" will appear.
Clicking on any button will lead to the respective manual task.

![signal_event_example](images/Signal_events_spiff_example3.png)
# Sub-Processes and Call Activities
Sub-processes and call activities are both useful for simplifying and organizing complex workflows within larger processes.
They serve distinct purposes and are used in different scenarios.

**Reasons to use Sub-Processes or Call Activities:**

![active_call_process](images/active_call_process.png)
A Call Process is similar to a Sub-Process in that it encapsulates part of a workflow, but it is designed to be reused across multiple different processes.
It's essentially a stand-alone process that can be "called" into action as required by other processes.
Using a Call Process can help to eliminate redundancy and ensure consistent execution of the process steps.

**When to use a Call Process:**
# Handling Sensitive Data Using Data Store

## Introduction
Handling sensitive data, such as credit card numbers and passwords, requires careful management to ensure security and privacy.
This documentation outlines the process of creating and managing sensitive data objects within SpiffWorkflow, along with setting appropriate permissions.
#### 1. Identifying Sensitive Data
- Determine what constitutes sensitive data within your workflow. This could include personal information, financial details, or confidential business information.
#### 2. Data Object Creation and Script Task Integration
- **Script Task Setup**: Develop a script task that interacts with the data object. The script should be designed to handle the sensitive data securely, ensuring it's not exposed or logged inadvertently.
- **Data Object Creation**: Create a data object in the workflow to store the sensitive data. This object acts as a container for the data, separating it from the main workflow logic.
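In the simplest case, the script task body is just an assignment to the variable that backs the data object. The variable name `a` and the value `1` follow the walkthrough example in this document; treat them as illustrative:

```python
# Script task body: SpiffWorkflow script tasks execute Python in the task's
# context, so assigning to `a` populates the data object whose ID is `a`.
a = 1
```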
#### 3. Assigning Data Categories
- **Categorization**: Assign a specific category to the data object that reflects its sensitive nature. For example, categories like `confidential`, `private`, or the name of the field can be used.
#### 4. Implementing Access Controls
- **Permission Rules**: Establish permission rules, using a Decision Model and Notation (DMN) table or another mechanism as described under [Admin and Permissions](/DevOps_installation_integration/admin_and_permissions.md). This step involves specifying who can access the sensitive data.
- **Access Restrictions**: Define the access level (e.g., read, write, deny) for different user groups or roles. For instance, you might restrict read access to certain groups while denying it to others.
- **URL-Based Permissions**: Use URL patterns to enforce permissions. For example, a URL pattern like `/process-data/confidential/*` can be used to control access to all data objects categorized as confidential.
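To illustrate the idea of URL-pattern matching (SpiffWorkflow's actual permission matcher may differ in detail), a glob-style check can be sketched in Python:

```python
from fnmatch import fnmatch

def url_allowed(pattern: str, url: str) -> bool:
    # A trailing "*" matches any suffix, so "/process-data/confidential/*"
    # covers every data object in the `confidential` category.
    return fnmatch(url, pattern)

print(url_allowed("/process-data/confidential/*", "/process-data/confidential/card-number"))  # True
```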
- **Execution**: Run the tasks to observe the value of `a`.

![image](images/private_data_object.png)

#### 2. Converting to a Data Object
- **Data Object Creation**: Create a data object and name it (e.g., `a`). Link this data object to the script task and set the data object ID to `a`.
- **Assign a Category**: Assume the data object represents a credit card number. Assign a category to this data object, such as `creditcards`.
- **Visibility**: The credit card data is visible until permissions are set to restrict access.
- **Process Execution**: Upon running the process, the value of the data object will be `1`.

![image](images/sensitive_value.png)

#### 3. Setting Permissions with DMN Table
- **Access Control**: To control who can see the credit card data, you could set permissions in a DMN Table.
- **Permission Configuration**: Set the following permissions:
- `permission_groups` to `"everybody"`
- `permission_urls` to `"/process-data/creditcards/"`

![image](images/setting_permissions.png)

#### 4. Implementing Restricted Access
With these permissions, access to the credit card data is denied to everyone, ensuring that no unauthorized individuals can view this sensitive information.
By following these steps, SpiffWorkflow users can securely handle sensitive data within their processes.
The combination of data objects, categorization, and precise permission settings ensures that sensitive information like credit card numbers is protected and accessible only to those with the necessary authorization.
## Introduction
This document aims to guide users and administrators on how to configure secrets in SpiffArena, especially when dealing with BPMN diagrams stored in a public GitHub repository.
The primary use case focuses on ensuring that sensitive information like API keys or OAuth tokens are not exposed while still making the process diagrams publicly available.

## Use Case

You might have service tasks in diagrams that require sensitive information like API keys or OAuth tokens, which you don't want to commit to GitHub.
SpiffArena allows you to create secrets that are stored in an encrypted format in the database.
These secrets can be referenced in the XML of the BPMN diagrams, ensuring that while the process is visible, the sensitive information is not.
Secrets are only used in service tasks.

## Roles and Permissions
![Secrets Configuration](images/Secrets_configure_2.png)

---
Configuring secrets in SpiffArena provides a secure way to handle sensitive information in your BPMN diagrams.
It allows you to make your processes public without exposing critical data, thereby enhancing both transparency and security.
### Groups
The "groups" section defines a group called "admin."
This group is intended for users who have administrative privileges within the system.
In this example, the "admin" group consists of a single user with the associated email address.
Multiple groups can be added.
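As a hedged sketch of what such a configuration might contain (the group names and email addresses are illustrative; consult your deployment's actual permission file for the exact schema):

```yaml
groups:
  admin:
    users:
      - admin@example.com
  reviewers:
    users:
      - alice@example.com
      - bob@example.com
```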
### Permissions
## Site Administration

Once the basic configuration setup is completed, specifying admin rights, you generally won't require additional permissions for designing processes and using the site.
However, there might be certain situations that call for access control beyond the site or group level.
In such cases, you have the flexibility to define and tailor admin requirements in a more detailed manner to fulfill specific needs.
### Step 1: Create Process Group
### Step 4: Understand the Process Models

[Read more about DMN tables and how they work here.](../Building_Diagrams/dmn.md)
#### Users to Groups

Assess the roles and responsibilities of users within your organization or system.
Look for common patterns or similarities in their job functions and tasks related to specific processes or process groups.
Add a user email under the 'users' column and the group name under 'groups', and don't forget to add double quotes.
- The hit policy is set to "Collect" which means that all conditions that are true will be applied. [Read more about DMN tables and hit policies here.](../Building_Diagrams/dmn.md)
- The permission URL can be configured to define the user's access privileges. Our objective is to streamline the process by minimizing the necessity of being familiar with the complete set of permission URLs. In most instances, utilizing BASIC and ELEVATED permissions, as well as PM/PG, should be sufficient. However, it is also feasible to directly incorporate any API URL into the permissions.
In truth, what you are doing is writing an expression.
In this case, it would read: if the variable 'permissions_group' (of type string) is equal to the 'permissions' variable (of type string), then set 'permission_url' equal to the associated value.
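Such a DMN rule behaves like a guarded assignment. A rough Python analogue (the variable names mirror the table; the rule rows are hypothetical, and this is not a real SpiffWorkflow API):

```python
# With the "Collect" hit policy, every rule whose condition is true
# contributes its output value, so the result is a list of URLs.
def matching_permission_urls(permissions_group: str, rules: list[tuple[str, str]]) -> list[str]:
    return [permission_url for group, permission_url in rules if permissions_group == group]

# Hypothetical rule rows: (group the rule matches, permission URL it grants).
rules = [("admin", "/*"), ("reviewers", "/process-data/confidential/*")]
print(matching_permission_urls("reviewers", rules))  # ['/process-data/confidential/*']
```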
```{admonition} Note
If you find coding more familiar and preferable to constructing DMN tables, you may notice similarities between this DMN table and the shared permission configuration file.
This similarity can help clarify or make it easier for you to understand the DMN table structure and its relation to the permission configuration.
```
### Step 5: Start Process
To ensure that User Groups and Permissions take effect, it is necessary to run the process at least once.
Whenever changes are made to any of these diagrams, like adding a user group or permission, the process should be started and completed successfully in order for the changes to be applied.
## Setting the Environment Variable
Once a `Connector Proxy` has been deployed, to integrate it with SpiffArena, we simply need to update an environment variable and restart the backend.
If you're using the [Getting Started Guide](https://www.spiffworkflow.org/posts/articles/get_started/), open the docker-compose.yml file; otherwise, edit the environment variable in the way that is appropriate for your deployment.
The variable we need to change is called `SPIFFWORKFLOW_BACKEND_CONNECTOR_PROXY_URL`.
Here's an example diff using the function URL from the AWS tutorial:
## Testing
Create a new process model as described in the [Getting Started Guide](https://www.spiffworkflow.org/posts/articles/get_started/).
Add a `Service Task` and in its properties panel, you will see a dropdown from which you can select the connector in your `Connector Proxy` to call.
In this demo, we deployed HTTP GET and POST connectors:

![Screenshot from 2023-04-06 16-38-02](https://user-images.githubusercontent.com/100367399/230489492-63cf88bf-7533-4160-95cb-d6194506dd5d.png)
Choose the `http/GetRequest` operator ID and enter the [dog fact API](https://dog-api.kinduff.com/api/facts) URL.
Remember to quote it since parameters are evaluated as Python expressions.
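To see why the quoting matters: because each parameter value is evaluated as a Python expression, a quoted URL evaluates to a plain string, while an unquoted one would be parsed as code. A toy illustration, not SpiffArena's actual evaluator:

```python
# Value as typed into the properties panel -- note the inner double quotes.
param = '"https://dog-api.kinduff.com/api/facts"'

url = eval(param)  # evaluating the expression yields the bare string
print(url)  # https://dog-api.kinduff.com/api/facts
```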
![Screenshot from 2023-04-06 16-50-42](https://user-images.githubusercontent.com/100367399/230491661-abdfdd3a-48f5-4f50-b6e5-9e3a5f562961.png)
![Screenshot from 2023-04-06 16-49-53](https://user-images.githubusercontent.com/100367399/230491713-9d3f9bd0-f284-4004-b00c-cb6dc94b53df.png)
You have successfully configured a `Connector Proxy` for use with `SpiffArena`.
You made a call from a workflow to get a dog fact.
Now, imagine if that call was to communicate with an external system relevant to your business processes.
# Deploying a Connector Proxy as an AWS Lambda Function
This guide shows you how to deploy the demo `Connector Proxy` as an `AWS Lambda Function` and integrate it with [SpiffArena](https://www.spiffworkflow.org/pages/spiffarena/).
We will use the [Getting Started Guide](https://www.spiffworkflow.org/posts/articles/get_started/) as the basis for integration, but the steps should easily map to any custom installation.
It is assumed that you have access to log in to the AWS Console and can create/deploy Lambda functions.
## Building the Zip
One method of deploying a Lambda function is by uploading a zip file containing the source code or executable.
Run the following command in the root of [this repository](https://github.com/sartography/connector-proxy-lambda-demo):

```
make zip
```
This will create a zip file containing the [lambda entry point function](https://github.com/sartography/connector-proxy-lambda-demo/blob/main/connector_proxy_lambda_demo/lambda_function.py#L5) and all the dependencies needed to execute the connectors.
For this example, the libraries [spiffworkflow-proxy](https://github.com/sartography/spiffworkflow-proxy) for discovering connectors and [connector-http](https://github.com/sartography/connector-http), an example connector that provides HTTP get and post requests, are used.
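For orientation, an AWS Lambda entry point has the general shape sketched below. This is a hypothetical handler, shown only to make the "entry point function" concrete; the real `lambda_function.py` in the demo repository differs in its details:

```python
import json

def lambda_handler(event, context):
    # Parse the HTTP body delivered by a function URL invocation.
    body = json.loads(event.get("body") or "{}")
    # A real proxy would dispatch here to the connector named in the request.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"ok": True, "echo": body}),
    }

response = lambda_handler({"body": '{"operator_id": "http/GetRequest"}'}, None)
print(response["statusCode"])  # 200
```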
Once `make zip` completes, `connector_proxy_lambda_demo.zip` will be available in the repository root.
![Screenshot from 2023-04-06 15-23-19](https://user-images.githubusercontent.com/100367399/230482609-8bece818-a41f-4f37-99c4-d9d10bef4d54.png)
Under `Advanced Settings`, check `Enable function URL`.
For this demo, we will use the `NONE` auth type to keep things simple.
![Screenshot from 2023-04-06 15-24-12](https://user-images.githubusercontent.com/100367399/230482613-8fa6c8ef-5035-4a77-9670-f7211bf92cc0.png)
![Screenshot from 2023-04-06 16-02-11](https://user-images.githubusercontent.com/100367399/230482618-cf4cf088-3629-4832-9a3d-d81f29842aff.png)
In the bottom right of the first section is a link to your Lambda's function URL.
Click it for a hello world response.
![Screenshot from 2023-04-06 16-09-08](https://user-images.githubusercontent.com/100367399/230484874-7529b786-da15-4a2c-8731-3780712bc0ef.png)
## Deploying the Lambda Function

If you scroll down, you will see a section with the example code created with your Lambda function.
We are going to replace this with the contents of our zip file.
Choose `Upload from` and select `.zip file`.

![Screenshot from 2023-04-06 16-09-34](https://user-images.githubusercontent.com/100367399/230484774-c0b93e1a-e34d-47b3-813f-03598d5bd631.png)

Click your function URL again to see a greeting from our deployed Connector Proxy.
## Integrating With SpiffArena

Congratulations, your Connector Proxy has been deployed as a Lambda function.
For information on configuring SpiffArena to use the new Connector Proxy URL, please see [Configure a Connector Proxy](configure_connector_proxy).
Choose the process you want to initiate and click “Start”.

You have successfully started a new process instance in SpiffWorkflow.

If a process model doesn't have an associated BPMN file, the system will not display a start button.
This is to prevent confusion and errors that might arise from attempting to start an incomplete process model.

---
Essentially, a milestone is an event that hasn't been set to something specific.

### Events

Events provide a detailed log of everything that happens in a process.
They record every task and its execution time.

![Events](images/Events.png)
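Conceptually, each event ties a task to what happened and when. A minimal plain-Python illustration (field names are hypothetical, not the actual SpiffWorkflow schema):

```python
from datetime import datetime, timezone

# Hypothetical event log built up as a process runs.
event_log = []

def record_event(task_name, event_type):
    # Each record captures the task, what happened, and when it happened.
    event_log.append({
        "task": task_name,
        "event": event_type,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })

record_event("collect_input", "task_completed")
record_event("send_email", "task_started")
```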
It can be noisy due to the granularity of the information, but it's essential for understanding exactly what happened during execution.

---

## How to check messages

Messages in BPMN allow processes to communicate with each other.
This communication can take various forms:

- Two processes running concurrently, exchanging messages.
- One process initiating another through a message.
For a more visual understanding and a step-by-step walkthrough, you can watch Dan Funk's video.

---

## How to share a process instance with Short Links

The short link feature provides a convenient way to share process instances with others without the need to copy and paste lengthy URLs.
This feature is especially useful for quick sharing via email, messaging apps, or within documentation.

To copy the short link:
This approach ensures you can monitor and review the progress of user forms with ease.

---

## How to view task instance history

Monitoring the history of task instances is helpful for tracking the progress and execution details of a workflow.
This guide provides a step-by-step approach to access and understand the task instance history, including the interpretation of task statuses.

### Steps to Access Task Instance History
For example:

- **COMPLETED Status**: Tasks marked as 'COMPLETED' have finished their execution successfully and have moved the workflow forward.
- **MAYBE Status**: Indicates that the task still exists within SpiffWorkflow. While these tasks could be omitted for clarity, retaining them provides a complete picture of the workflow's execution.

Viewing task instance history in SpiffWorkflow is now more streamlined and informative, thanks to recent updates.
Users can effectively track each task's execution, status, and timing, gaining insights into the workflow's overall performance.
![Flask](images/Flask.png)

### **2. Adding Python Libraries to SpiffWorkflow**

**Q:** Is there documentation available for adding Python libraries to SpiffWorkflow? For example, if I want to run a process to send emails, I would need `smtplib`.

**A:** The default answer for something like sending emails would be to use a service task. We have an SMTP connector designed for this purpose. If you're using SpiffArena, a connector proxy can provide a nice integration into the UI. Here are some helpful links:

- [SMTP Connector](https://github.com/sartography/connector-smtp)
- [BPMN, DMN samples for SpiffWorkflow](https://github.com/sartography/sample-process-models/tree/jon/misc/jonjon/smtp)

### **3. Tutorials on Using SpiffWorkflow**

**Q:** Are there any tutorials available on how to use SpiffWorkflow?

**A:** Yes, here are some references:

- [SpiffExample CLI](https://github.com/sartography/spiff-example-cli)
- [Getting Started with SpiffWorkflow](https://www.spiffworkflow.org/posts/articles/get_started/)

### **4. Understanding Task Data in Custom Connectors**

**Q:** What kind of data can I expect from `task_data`?

**A:** The `task_data` param contains data comprised of variables/values from prior tasks. For instance, if you have a script task before your service task that sets `x=1`, then in the `task_data` param, you should see `{"x": 1}`.
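To illustrate the flow, here is a plain-Python sketch of the concept (not the actual connector API):

```python
def script_task(task_data):
    # A script task before the service task sets x=1.
    task_data["x"] = 1
    return task_data

def service_task_command(task_data):
    # A connector command receives the data accumulated by prior tasks.
    return {"seen_by_connector": dict(task_data)}

result = service_task_command(script_task({}))
# result["seen_by_connector"] now contains {"x": 1}
```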
### **5. Understanding and Using Custom Connectors**

**Q:** What are custom connectors and how do I use them?

**A:** Custom connectors in SpiffWorkflow allow for integration with external systems or services. They enable the workflow to interact with other platforms, fetch data, or trigger actions. To use them, you'll typically define the connector's behavior, specify its inputs and outputs, and then use it within your BPMN process as a service task.

### **6. Using Data Object Reference and Data Store Reference**

**Q:** What are some good references for "Data Object Reference" and "Data Store Reference" in SpiffWorkFlow?

**A:** Here are some references to help you understand and implement "Data Object Reference" and "Data Store Reference" in SpiffWorkflow:

- [Understanding BPMN's Data Objects with SpiffWorkflow](https://medium.com/@danfunk/understanding-bpmns-data-objects-with-spiffworkflow-26e195e23398)
- [Data Encapsulation with SpiffWorkflow Video](https://youtu.be/0_PgaaI3WIg)

### **7. Resetting a Workflow**

**Q:** Is there a way of "resetting" a workflow without reloading the BPMN and DMN files?

**A:** Yes, you can reset a workflow using the following code:
```python
# `start` is a reference to the task you want to rewind the workflow to.
workflow.reset_from_task_id(start.id)
```
### **8. Integrating SpiffWorkflow with other Python code**

**Q:** How do you integrate your workflow with other Python code?

**A:** Integrating SpiffWorkflow with other Python code is straightforward. You have two primary methods:

1. **Script Tasks**: These allow you to execute Python code directly within your workflow. This method is suitable for simpler integrations where the code logic is not too complex.
2. **Service Tasks**: For more complex integrations, you can write services that can be called via service tasks within your workflow. This method provides more flexibility and is ideal for scenarios where you need to interface with external systems or perform more intricate operations.
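The two styles can be sketched in plain Python (standing in for the engine; `send_email` is a hypothetical external service):

```python
# Hypothetical external service a Service Task would call out to.
def send_email(recipient):
    return f"email sent to {recipient}"

def run_script_task(data):
    # Script Task: inline Python executed directly in the workflow.
    data["recipient"] = "user@example.com"

def run_service_task(data):
    # Service Task: delegates to an external service via a connector.
    data["result"] = send_email(data["recipient"])

data = {}
run_script_task(data)
run_service_task(data)
```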
### **9. Using Call Activity for preconfigured modular subprocesses**

**Q:** I need my users to generate many BPMN workflows by dropping preconfigured subprocesses into their workflows. Is this possible?

**A:** Yes, you can use a "Call Activity" in SpiffArena to reference other processes in your diagram. SpiffArena provides a way to search for other processes in the system that can be used as Call Activities. This means you can create modular workflows by designing subprocesses (like send to accounts) and then incorporating them into multiple main workflows as needed. This modular approach not only streamlines the design process but also ensures consistency across different workflows.

### **10. Integrating SpiffWorkflow with External Role Management**

**Q:** How do I make external application user roles affect permissions on a task?

**A:** You can manage the roles externally in an OpenID system and access the user and group information in the Lanes of your BPMN diagram.

### **11. Understanding Workflow Data vs Task Data**

**Q:** What is the difference between the `workflow.data` and `task.data`?

**A:** Task data is stored on each task, and each task has its own copy. Workflow data is stored on the workflow. If you use BPMN DataObjects, that data is stored in workflow data.
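A toy model of that distinction (illustrative only, not the SpiffWorkflow classes):

```python
import copy

class Workflow:
    def __init__(self):
        self.data = {}  # shared workflow data, e.g. BPMN DataObjects

class Task:
    def __init__(self, workflow, inherited_data):
        self.workflow = workflow
        self.data = copy.deepcopy(inherited_data)  # each task owns its own copy

wf = Workflow()
task_a = Task(wf, {"x": 1})
task_b = Task(wf, task_a.data)
task_b.data["x"] = 99              # does not affect task_a's copy
wf.data["shared"] = "visible to all tasks"
```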
### **12. Understanding Secrets and Authentications in SpiffArena**

**Q:** What are 'Secrets' and 'Authentications' used for in SpiffArena?

**A:** Secrets are used for communicating with external services when you use service tasks and connectors. Authentications are used when you need to OAuth into an external service. Check out more information [here](https://spiff-arena.readthedocs.io/en/latest/DevOps_installation_integration/Secrets.html).

### **13. Determining the Lane of a Task in a Workflow**

**Q:** In the pre/post script of a task in a workflow, how do I determine what lane the current task is in?

**A:** You can access the task and use `task.task_spec.lane` to get the lane as a string. This allows you to programmatically determine which lane a task belongs to during its execution.
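For example, a pre/post script could branch on the lane name. The sketch below uses minimal stand-in classes ("Approvals" is a hypothetical lane name):

```python
# Minimal stand-ins for the objects available in a pre/post script.
class TaskSpec:
    def __init__(self, lane):
        self.lane = lane

class Task:
    def __init__(self, task_spec):
        self.task_spec = task_spec

task = Task(TaskSpec(lane="Approvals"))

# Branch on the lane the current task belongs to.
needs_manager_review = task.task_spec.lane == "Approvals"
```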
### **14. Understanding Script Attributes Context**

**Q:** I'm trying to understand the details of `script_attributes_context`. Where can I find more information?

**A:** The `ScriptAttributesContext` class is defined [here](https://github.com/sartography/spiff-arena/blob/deploy-mod-prod/spiffworkflow-backend/src/spiffworkflow_backend/models/script_attributes_context.py#L9).

### **15. Using Message Start Event to Kick Off a Process**

**Q:** How do I use a message start event to kick off a process?

**A:** This [script](https://github.com/sartography/spiff-arena/blob/main/spiffworkflow-backend/bin/run_message_start_event_with_api#L39) is an example of using a message start event to kick off a process.

### **16. Making REST API Calls in SpiffArena**

**Q:** How do I make REST API calls in SpiffArena?

**A:** You can use Service Tasks driven by a Connector Proxy. Check out the [Connector Proxy Demo](https://github.com/sartography/connector-proxy-demo) for more details.
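Conceptually, an HTTP connector turns the service task's parameters into a web request. A rough sketch of the idea (`build_request` is a hypothetical helper, not the real connector code):

```python
from urllib.request import Request

def build_request(params):
    # Hypothetical: a connector assembles a request from task parameters.
    return Request(
        params["url"],
        headers={"Accept": "application/json"},
        method=params.get("method", "GET"),
    )

req = build_request({"url": "https://example.com/api/items", "method": "GET"})
```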
### **17. Assigning User Tasks in SpiffWorkflow**

**Q:** How does one use camunda:assignee="test" in a userTask with Spiff?

**A:** In SpiffWorkflow, user task assignments can be managed using Lanes in your BPMN diagram. Each Lane can designate which individual or group can execute the tasks within that Lane. If you're looking to interface permissions based on external application user roles, you can manage roles externally and pass the user and group information to assign them to the Lanes.

### **18. Mimicking an Inclusive Gateway in SpiffWorkflow**

**Q:** How can we mimic an inclusive gateway since SpiffWorkflow doesn't support it?

**A:** You can work around the absence of an inclusive gateway in SpiffWorkflow by using a Parallel Gateway. Within each path following the Parallel Gateway, you can place an Exclusive Gateway to check for the conditions that are or are not required. This approach is effective if the flows can eventually be merged back together.

![Mimicking Inclusive Gateway](images/Mimicking_inclusive_gateway.png)
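The workaround can be sketched in plain Python: every branch after the parallel gateway runs, but each guards itself with its own condition before the paths merge:

```python
def branch_a(data):
    if data.get("needs_a"):      # exclusive gateway guarding branch A
        data["a_done"] = True

def branch_b(data):
    if data.get("needs_b"):      # exclusive gateway guarding branch B
        data["b_done"] = True

data = {"needs_a": True, "needs_b": False}
for branch in (branch_a, branch_b):  # parallel gateway: all branches execute
    branch(data)
# After the merge, only branches whose conditions held had any effect.
```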
### **19. Designing an Approval Process in SpiffWorkflow**

**Q:** I am designing an approval process using SpiffWorkflow. Can SpiffWorkflow handle scenarios where a task should complete if more than 2 users approve out of 3 assignees?
### **20. Process Instances in SpiffArena After Docker Compose Restart**

**Q:** I restarted docker-compose, and my process instances in SpiffArena aren't persistent. How can I ensure they remain after a restart?

**A:** Make sure you're using the updated "getting started" `docker-compose.yml` file that uses sqlite to persist the database between docker compose restarts.
This will ensure that your process instances remain after a restart.

If you're still facing issues, refer to the provided documentation on admin and permissions for further guidance.
### **21: Downloading and Re-uploading Process Models in SpiffArena**

**Q:** Is it possible to download a process model in SpiffArena and then re-upload it?

**A:** Yes, in SpiffArena, you can download a process model and then re-upload it. However, it's essential to note that all process IDs must be unique across the system. If you're re-uploading a process model, its ID might need to be modified to ensure uniqueness.

### **22: Understanding "Notification Addresses" and "Metadata Extractions" in SpiffArena**

**Q:** What are the "notification addresses" and "metadata extractions" fields when creating a new process model in SpiffArena?

**A:** When creating a new process model in SpiffArena, the "notification addresses" field is used to specify recipients for notifications related to that process.

Detailed documentation for both fields is available.
It's worth noting that the functionality of "Notification Addresses" might undergo changes in the future to centralize the logic and avoid splitting configurations.
### **23: Issues with SpiffArena Frontend Loading**

**Q:** Why doesn't the SpiffArena frontend always load completely?

**A:** The issue might arise when the frontend cannot communicate with the backend.
Recent updates have been made to address this specific problem.
Previously, the backend could deadlock when it received a high number of concurrent requests, exhausting the available worker processes.
Since it uses built-in openid, each request would need to communicate with the backend itself.
This issue has been resolved in the newer versions.
To potentially fix this, you can update your setup by running the following commands in the directory where you downloaded the `docker-compose.yml` file:
```
docker compose pull
docker compose down
docker compose up -d
```
By doing this, you'll pull the latest images, shut down the current containers, and start fresh containers from the updated images.
This should help in ensuring that the frontend loads completely and communicates effectively with the backend.
### **24: Resolving Docker Compose Issues on M1/M2 Mac in SpiffArena**

**Q:** I'm using an M1/M2 Mac and facing issues with docker-compose in SpiffArena. How can I resolve this?

**A:** Ensure that you're using the latest versions of Docker and docker-compose.
Update your images and restart the containers as needed.

Instructions in the getting started guide reference `curl`, but if that is not working for you, `wget` may be an option that is already installed on your system.
### **25: Importing External Modules in Script Tasks in SpiffArena**

**Q:** Why can't I import an external module in a script task in SpiffArena?

**A:** In SpiffArena, script tasks are designed for lightweight scripting and do not support importing external modules.
Detailed documentation is available in the spiff-arena docs on readthedocs.

If you want to bypass security features of the restricted script engine and import modules from your script tasks, you can set the environment variable: `SPIFFWORKFLOW_BACKEND_USE_RESTRICTED_SCRIPT_ENGINE=false`
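For example, in a docker compose setup you might set it in the backend service's environment (a sketch; the service name in your `docker-compose.yml` may differ):

```yaml
services:
  spiffworkflow-backend:
    environment:
      SPIFFWORKFLOW_BACKEND_USE_RESTRICTED_SCRIPT_ENGINE: "false"
```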
### **26: Storage of Properties Data in SpiffArena**

**Q:** Where is the properties data stored in the properties panel?

**A:** The properties data is stored directly within the XML of the BPMN diagram. Some of this data is stored in extension elements.

For instance, the configuration for a service task can be found [here](https://github.com/sartography/sample-process-models/blob/sample-models-1/misc/jonjon/ham/ham.bpmn#L13) and instructions can be found [here](https://github.com/sartography/sample-process-models/blob/sample-models-1/misc/documentation/user-guide-basics/user-guide-basics.bpmn#L24). If you're considering bypassing the properties panel, it's essential to ensure that the XML output remains consistent with the expected format.

### **27: Starting a Task in SpiffArena**

**Q:** How do I start a task? What do I need besides BPMN?

**A:** To start a task, you'll need to have a proper BPMN diagram and a configured environment. The docker compose file, as mentioned on the [spiffworkflow.org](https://www.spiffworkflow.org/posts/articles/get_started/) website, provides a containerized environment for both the API and asynchronous processing. For a more robust production deployment, it's recommended to use separate containers for different functionalities.
### **28: Setting Up Own OpenID Provider** ### **28: Setting Up Own OpenID Provider**
**Q:** Any documentation on how to set up our own openid provider? **Q:** Any documentation on how to set up our own openid provider?
**A:** If you're using the spiff-arena/spiffworkflow-backend, there's a script named `./keycloak/bin/start_keycloak` that can initiate a container serving as an example OpenID provider. This can be a good starting point if you're looking to set up your own OpenID provider. **A:** If you're using the spiff-arena/spiffworkflow-backend, there's a script named `./keycloak/bin/start_keycloak` that can initiate a container serving as an example OpenID provider. This can be a good starting point if you're looking to set up your own OpenID provider.
### **29: Configuring SMTP Server for Email Notifications in SpiffWorkflow** ### **29: Configuring SMTP Server for Email Notifications in SpiffWorkflow**
**Q:** Where can I configure an SMTP server for Spiffworkflow to send email notifications? **Q:** Where can I configure an SMTP server for Spiffworkflow to send email notifications?
**A:** To configure an SMTP server for email notifications, you can utilize connectors and service tasks within SpiffWorkflow. For instance, connectors can be set up to send notifications to platforms like Slack. **A:** To configure an SMTP server for email notifications, you can utilize connectors and service tasks within SpiffWorkflow. For instance, connectors can be set up to send notifications to platforms like Slack.
### **30: Accessing Timer Event Value/Expression in Code** ### **30: Accessing Timer Event Value/Expression in Code**
**Q:** Is there any way to access the timer event value/expression in my code? **Q:** Is there any way to access the timer event value/expression in my code?
**A:** Yes, in SpiffWorkflow, you can access timer event values directly from the backend. There are specific sections in the codebase where timer event values are checked and utilized for various functionalities.
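Outside of the backend's own helpers, timer values are also plainly visible in the BPMN XML itself. The sketch below reads them with only the standard library; the element names (`timerEventDefinition`, `timeDuration`, `timeDate`, `timeCycle`) come from the BPMN 2.0 specification, while the sample diagram and function name are illustrative, not part of SpiffWorkflow's API.

```python
# Minimal sketch: reading timer definitions straight from BPMN XML.
# Element names follow the BPMN 2.0 spec; the diagram below is a made-up sample.
import xml.etree.ElementTree as ET

BPMN_NS = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}

SAMPLE = """<?xml version="1.0"?>
<bpmn:definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <bpmn:process id="timer_example">
    <bpmn:startEvent id="start">
      <bpmn:timerEventDefinition>
        <bpmn:timeDuration>"PT48H"</bpmn:timeDuration>
      </bpmn:timerEventDefinition>
    </bpmn:startEvent>
  </bpmn:process>
</bpmn:definitions>
"""

def timer_expressions(bpmn_xml: str) -> dict:
    """Map each event id to the timer expression it declares, if any."""
    root = ET.fromstring(bpmn_xml)
    result = {}
    for event in root.iter():
        timer = event.find("bpmn:timerEventDefinition", BPMN_NS)
        if timer is None:
            continue
        # A timer can be a duration, a fixed date, or a repeating cycle.
        for kind in ("timeDuration", "timeDate", "timeCycle"):
            value = timer.find(f"bpmn:{kind}", BPMN_NS)
            if value is not None:
                result[event.get("id")] = value.text
    return result

print(timer_expressions(SAMPLE))  # {'start': '"PT48H"'}
```

This is useful for tooling or audits; inside a running workflow, prefer whatever accessors the engine exposes.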
### **31: Creating New Users in SpiffWorkflow**
**Q:** How can I create new users for my co-workers in SpiffWorkflow?
**A:** There are multiple methods to manage this, such as using OpenID or the process model. However, for beginners eager to add a user quickly, you can adjust the 'example.yml' configuration file referenced by the app setting `SPIFFWORKFLOW_BACKEND_PERMISSIONS_FILE_NAME: "example.yml"`.
After making changes, restart the container to update user details. For more information, refer to the [Spiff-Arena documentation](https://spiff-arena.readthedocs.io/en/latest/installation_integration/admin_and_permissions.html). The mentioned file can be found [here](https://github.com/sartography/spiff-arena/tree/main/spiffworkflow-backend/src/spiffworkflow_backend/config/permissions).
### **32: Understanding the Collaboration Flag in Spiff-Example-CLI**
**Q:** What is the functionality and purpose of the collaboration flag in spiff-example-cli?
**A:** The collaboration flag enables the simultaneous loading of multiple top-level processes within a single workflow. This results in the creation of a subworkflow for each process, allowing them to initiate concurrently.
A practical application of this might be when two processes need to interact but remain independent of each other.
### **33: Custom Tasks and Services in Modeler**
**Q:** How can I draw custom tasks or services in the modeler that can run in Python SpiffWorkflow when loaded using the BPMN YAML file?
**A:** To create custom tasks or services in SpiffWorkflow, you have several options:
By following these guidelines, you can create custom tasks or services tailored to your specific workflow requirements in SpiffWorkflow.
### **34: Configure SpiffWorkflow to work with hostname instead of "localhost"**
**Q:** How can I configure SpiffWorkflow to work with my computer's hostname instead of "localhost"?
**A:** To configure SpiffWorkflow to work with your computer's hostname, follow these steps:
By following these steps, you can successfully configure SpiffWorkflow to work with your computer's hostname, ensuring smooth operation outside the "host" computer environment.
### **35: Approval process in SpiffWorkflow**
**Q:** How do I model an approval process in SpiffWorkflow where each task may require a different approver?
**A:** To model an approval process in SpiffWorkflow with multiple tasks, each requiring a different approver, follow these steps:
By following these steps and utilizing the features of SpiffWorkflow, you can effectively model an approval process with multiple tasks and approvers, ensuring a smooth and efficient workflow.
### **36: Timer Start Event and internal scheduler**
**Q:** How does the Timer Start Event work in SpiffWorkflow, and is there an internal scheduler to execute workflows at the specified timer value?
**A:** In SpiffWorkflow, Timer Start Events are managed by an internal scheduler within the backend.
Additionally, there are environment variables like `SPIFFWORKFLOW_BACKEND_BACKGR…` that control this scheduler.
For more details, you can refer to the [SpiffWorkflow Backend Initialization Code](https://github.com/sartography/spiff-arena/blob/main/spiffworkflow-backend/src/spiffworkflow_backend/__init__.py).
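The idea behind such a scheduler can be sketched as a polling loop: on each tick it looks for timer start events whose fire time has arrived and launches a process instance for each. This is an illustrative mock, not the backend's actual code; all class and process names here are invented.

```python
# Illustrative sketch (not SpiffWorkflow's real implementation): one polling
# tick of a scheduler that starts processes whose timers have elapsed.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

@dataclass
class TimerStartEvent:
    process_id: str
    fire_at: datetime
    fired: bool = False

@dataclass
class PollingScheduler:
    events: list = field(default_factory=list)
    started: list = field(default_factory=list)

    def poll(self, now: datetime) -> None:
        """One polling tick: start every process whose timer has elapsed."""
        for event in self.events:
            if not event.fired and event.fire_at <= now:
                event.fired = True  # never fire the same start event twice
                self.started.append(event.process_id)

now = datetime(2024, 1, 1, 12, 0)
scheduler = PollingScheduler(events=[
    TimerStartEvent("daily_report", now - timedelta(minutes=5)),  # due
    TimerStartEvent("weekly_cleanup", now + timedelta(days=1)),   # not yet
])
scheduler.poll(now)
print(scheduler.started)  # ['daily_report']
```

A real backend would run this tick on an interval in a background thread and persist the fired state, but the control flow is the same.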
### **37: Authentication tokens in SpiffWorkflow**
**Q:** How does authentication work in SpiffWorkflow, particularly regarding the use of authentication tokens?
**A:** A common approach for handling authentication involves using bearer tokens in the "Authorization" header of API requests.
In your browser's developer tools, you can copy any request to the backend as a curl command or inspect the headers to see how the token is passed.
If the standard openid flow is not ideal for your use case, Service Account / API Token management can be implemented using a process model.
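Sending a bearer token is just a matter of setting one header. The sketch below uses only the standard library and builds the request without sending it; the base URL, endpoint path, and token are placeholder assumptions, not guaranteed backend routes.

```python
# Hedged example: attaching a bearer token to an API request with the
# standard library only. URL, path, and token below are placeholders.
import urllib.request

def build_authenticated_request(base_url: str, path: str, token: str) -> urllib.request.Request:
    """Return a Request carrying the token in the Authorization header."""
    return urllib.request.Request(
        url=f"{base_url}{path}",
        headers={"Authorization": f"Bearer {token}"},
    )

req = build_authenticated_request(
    "http://localhost:7000", "/v1.0/process-instances", "my-access-token"
)
print(req.get_header("Authorization"))  # Bearer my-access-token
```

Passing the request to `urllib.request.urlopen(req)` would then send it with the header attached; the same header shape works with any HTTP client.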
### **38: Configure SpiffArena to run behind a proxy server**
**Q:** How can I configure SpiffArena to run behind a proxy server, such as Traefik, and resolve issues with redirects and OpenID provider authentication?
**A:** Running SpiffArena behind a proxy server like Traefik involves several configuration steps to ensure proper communication between the frontend, backend, and the OpenID provider. Here are key points to consider:
For more detailed guidance and examples of SpiffArena deployment configurations, you can refer to resources like the [Terraform Kubernetes Modules](https://github.com/mingfang/terraform-k8s-modules/blob/master/examples/spiffworkflow/README.md).
Remember, each deployment scenario can be unique, so it's important to tailor these guidelines to your specific setup and requirements.
### **39: Change in states of script tasks**
**Q:** Why does my script task in SpiffWorkflow change to the "STARTED" state instead of "COMPLETED" after execution, and how can I resolve this?
By adjusting the return value of your script task's `execute` method and understanding the underlying mechanics of task state management in SpiffWorkflow, you can effectively control the flow of your workflow processes.
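The convention described above can be sketched with a small mock (these are illustrative classes, not SpiffWorkflow's actual ones): a truthy return from the script moves the task to COMPLETED, while a `None`/falsy return leaves it in STARTED, as for work that finishes asynchronously.

```python
# Illustrative mock of the state convention: only a truthy return completes
# the task. Not SpiffWorkflow's real classes.
from enum import Enum

class TaskState(Enum):
    READY = "READY"
    STARTED = "STARTED"
    COMPLETED = "COMPLETED"

class ScriptTask:
    def __init__(self, script):
        self.script = script
        self.state = TaskState.READY

    def run(self) -> None:
        result = self.script()
        # A None (or otherwise falsy) result leaves the task STARTED.
        self.state = TaskState.COMPLETED if result else TaskState.STARTED

done_task = ScriptTask(lambda: True)
done_task.run()
print(done_task.state.name)  # COMPLETED

pending_task = ScriptTask(lambda: None)  # script "forgot" to return True
pending_task.run()
print(pending_task.state.name)  # STARTED
```

In other words, if your custom task lingers in STARTED, check whether its execute method actually returns a truthy value.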
### **40: Event Design in SpiffWorkflow**
**Q:** How are internal and external events managed within SpiffWorkflow?
**A:** **Event Handling in SpiffWorkflow:**
# Troubleshooting: Running Server Locally
When setting up the SpiffWorkflow backend project locally, you might encounter issues related to the `sample-process-models` directory.
This documentation aims to address those concerns.
## Problem
While following the instructions provided in the repository to set up the SpiffWorkflow backend project locally, you may find that the script `./bin/run_server_locally` expects the `sample-process-models` directory to be present.
However, this directory might not be immediately available in the repository.
## Solutions
### 1. Clone the `sample-process-models` Repository
The `sample-process-models` directory refers to a separate repository.
To resolve the issue:
- Navigate to [https://github.com/sartography/sample-process-models](https://github.com/sartography/sample-process-models).
If you prefer not to install anything locally:
- Navigate to [https://www.spiffworkflow.org/posts/articles/get_started](https://www.spiffworkflow.org/posts/articles/get_started).
- Access a version of Spiff hosted on the internet.
Setting up the SpiffWorkflow backend project locally can be straightforward once you're aware of the dependencies and options available.
Whether you choose to clone the `sample-process-models` repository, use a different git repository, or opt for Docker Compose, the solutions provided should help you get started without any hitches.
If you encounter further issues, always refer back to the repository's README or seek assistance from our Discord community.
## Introduction
The welcome message is the greeting users see when they log into the platform.
It sets the tone for their experience and can be customized to fit the needs of your organization or specific user groups.
By customizing the welcome message, **administrators** can tailor the user experience, making it more personalized and relevant to the audience.
This guide will walk you through the steps to modify this message in SpiffWorkflow.
Navigate to the onboarding process model within SpiffWorkflow.
After making your desired modifications, save the changes to update the welcome message.
Once you've updated the welcome message, it will be displayed prominently on the home page after users log in.
The message will be positioned in a way that it's one of the first things users see, ensuring they receive the intended greeting every time they access the platform.
## Suspend a Process
By suspending a process instance, you temporarily halt its execution, allowing you to access and modify the necessary data or configurations associated with that specific instance.
This feature is not only useful for making updates, but also enables the possibility to redo a previous step with different metadata if needed.
> **Step 1: Find the Active Process Instance**
```{admonition} Note
⚠ Note that the suspension of a process instance is only applicable to active instances.
If an instance is not active, it indicates that the process has already been completed, and therefore, it cannot be suspended.
```
> **Step 3: Select Suspend Button**
Click on the 'Suspend' icon.
This action will pause the process instance, granting you the ability to make edits and modifications.
When ready, the process instance can be resumed.
The process instance remains highlighted in yellow.
![suspend](images/active_process_instance.png)
Resuming a process is essential for ensuring that the process can continue its execution.
## Terminate a Process Instance
Terminating refers to ending the execution of a specific occurrence of a process before it reaches its natural completion or final outcome.
There are various reasons for terminating a process instance, such as when the instance is no longer required or is in an error state.
> **Step 1: Locate Terminate Icon**
**[#1 - Build your own Low-Code Business Applications with SpiffWorkflow](https://medium.com/@danfunk/build-your-own-low-code-business-applications-with-spiffworkflow-1d0730acc1f3)**
This article introduces SpiffWorkflow as a low-code application development tool that utilizes flow-chart diagrams.
It emphasizes the involvement of clients in the architectural decision-making process and highlights the use of BPMN diagrams, user tasks, script tasks, and exclusive gateways.
The article concludes by showcasing a Python code example for loading and executing BPMN diagrams with SpiffWorkflow.
----
**[#2 - SpiffWorkflow, A New Hope](https://medium.com/@danfunk/spiffworkflow-a-new-hope-3f0c1dc72adb)**
SpiffWorkflow empowers non-coders to control app logic using flow-chart diagrams.
It emphasizes collaboration, integration, and future goals such as custom diagram tools and process management.
The article introduces components of the system and invites reader involvement.
----
**[#3 - Understanding BPMN's Data Objects with SpiffWorkflow](https://medium.com/@danfunk/understanding-bpmns-data-objects-with-spiffworkflow-26e195e23398)**
This article explains BPMN Data Objects and their implementation in the SpiffWorkflow project.
It covers the default data flow behavior, the use of Data Objects for controlling data access, and the benefits of using them.
The article also mentions data transformations and the two types of data in SpiffWorkflow.
It concludes with information on upcoming releases and encourages reader involvement.
----
**[#4 - Understanding BPMN Messages](https://medium.com/@danfunk/understanding-bpmn-messages-7b0fee2d6a81)**
This article explores BPMN 2.0 Messages and their implementation in the SpiffWorkflow project.
It covers concepts like messages, collaborations, and correlations in BPMN diagrams.
The article discusses changes made to SpiffWorkflow applications to support message handling and communication between processes.
It emphasizes the importance of maintaining correlation consistency and highlights the benefits of enabling communication between complex systems.
----
**[#5 - Getting Started with SpiffArena](https://medium.com/@danfunk/getting-started-8ec59afe3a48)**
The article offers a tutorial on using SpiffArena to create and run executable SpiffWorkflow diagrams.
It covers installation, building a simple workflow process using BPMN diagrams and tasks, and concludes with shutting down the application.
The tutorial highlights the potential of SpiffWorkflow in enhancing transparency and collaboration in complex business processes.
----
**[#6 - The Low Code Wall](https://medium.com/@danfunk/the-low-code-wall-fa2e3476cc10)**
The article discusses low-code and no-code tools for software development, emphasizing the challenges and pitfalls that may arise.
It suggests considering open standards, collaboration with IT departments, and introduces SpiffArena as an open-source alternative that encourages collaboration and utilizes visual tools and familiar interfaces.
----
**[#7 - SpiffArena, the low-code visual workflow builder, awaits you like a clean canvas…](https://medium.com/@danfunk/spiffarena-the-low-code-visual-workflow-builder-awaits-you-like-a-clean-canvas-e7b9bd20ae71)**
SpiffArena is a browser-based tool that allows you to create flow-chart-like diagrams for automating business workflows.
It offers features such as form builders, application connections, timers, custom APIs, and robust permissions.
SpiffArena combines BPMN and Python to provide a powerful and configurable solution.
----
**[#8 - A Visual Workflow Library for Python](https://medium.com/@danfunk/a-visual-workflow-library-for-python-d19e1387653)**
SpiffWorkflow is a Python library that simplifies complex business logic by using BPMN diagrams, allowing non-developers to make changes to application flows.
It improves communication within teams, increases contributions, and adapts to changing requirements.
Visual software development environments like SpiffWorkflow represent the future of solving complex problems.
## Activity
This refers to the work carried out by an individual or an organization within a process.
Activities can be classified into three categories: Task, Subprocess, and Call Activity.
These activities can be either atomic or non-atomic.
Atomic activities are indivisible and represent single tasks, while non-atomic activities involve multiple steps or subprocesses that work together to achieve a larger objective.
## Boundary Event
This refers to an event that can be triggered while an activity is in progress.
Boundary events are utilized for error and exception handling purposes.
## BPMN Model
This is a visual depiction of a business process designed to be both human-readable and machine-readable.
## Business Process
This is a sequence of interconnected activities conducted by individuals and systems, following a defined order, with the aim of delivering a service or product, or accomplishing a specific business objective.
These processes involve the receipt, processing, and transfer of information and resources to generate desired outputs.
## Diagram
This is the visual platform where business processes are represented and mapped out.
## Call Activity
This refers to the act of a parent or higher-level process invoking a pre-defined or reusable child process, which is represented in another process diagram.
This invocation allows for the utilization of the child process multiple times, enhancing reusability within the overall model.
## Collapsed Subprocess
This is a Subprocess that conceals the underlying process it includes.
## Connecting Element
These are lines that establish connections between Flow Elements within a process, creating a Flow.
There are four distinct types of connecting elements: Sequence Flows, Message Flows, Associations, and Data Associations.
## Elements
These are the fundamental components used to construct processes.
These elements encompass Flow Elements, Connecting Elements, Data Elements, Artifacts, and Swimlanes.
## End Event
This marks the conclusion of a process.
An End Event can result in a Message, Error, or Signal outcome.
## Error
This denotes a significant issue encountered during the execution of an Activity.
## Event
This is an occurrence within a process that influences the Flow and typically involves a trigger and/or a result.
Events can be categorized into four types: Start, Intermediate, End, and Boundary.
## Event-Based Gateway
This marks a specific point within the process where alternative paths are initiated based on the events that occur.
## Exception
This is an Event within the process that deviates from the normal flow of execution.
Exceptions can be triggered by Time, Error, or Message Events.
## Exclusive Gateway

This denotes a juncture within the process where multiple alternative paths are available, but only one path can be chosen.
The decision regarding the chosen path is determined by a condition.

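In engine terms, an Exclusive Gateway behaves much like an if/elif chain: each outgoing Sequence Flow carries a condition, exactly one path is taken, and a default flow catches the case where no condition matches. A rough Python analogy (the function and flow names are illustrative, not any engine's real API):

```python
# Illustrative analogy only: an Exclusive Gateway selects exactly one
# outgoing path based on conditions attached to each Sequence Flow.
def exclusive_gateway(task_data, flows):
    """flows: list of (condition, target) pairs, where condition is a
    predicate over the task data or None for the default flow.
    The first true condition wins; the default is used when nothing matches."""
    default_target = None
    for condition, target in flows:
        if condition is None:
            default_target = target  # remember the default flow
        elif condition(task_data):
            return target  # first matching condition takes the token
    if default_target is None:
        raise RuntimeError("no path matched and no default flow was given")
    return default_target
```

For example, routing on an amount: a condition flow to a hypothetical `manager_approval` task when the amount exceeds 1000, with `auto_approve` as the default.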
## Expanded Subprocess

This is a Subprocess that shows the process it contains.

## Gateway

This is a component that governs the available paths within a process.
Gateways can merge or diverge paths, or introduce additional paths based on conditions or Events.
There are four types of Gateways: Exclusive, Parallel, Inclusive, and Event-Based.

## Intermediate Event

This is an event that occurs within the middle of a process, neither at the start nor the end.
It can be connected to other tasks through connectors or placed on the border of a task.
It evaluates conditions and circumstances, triggering events and enabling the initiation of alternative paths within the process.

## Join

## Lanes

These are subdivisions within a Pool that are utilized to assign activities to specific roles or Participants.

## Merge

This is the process in which two or more parallel Sequence Flow paths converge into a single path, achieved either through multiple incoming Sequence Flows or by utilizing an Exclusive Gateway.
This merging of paths is also commonly referred to as an "OR-Join".

## Message

This signifies the content of a communication exchanged between two Participants.
The message is transmitted through a Message Flow.

## Non-atomic Activity

This refers to an Activity that can be further decomposed into more detailed steps or subtasks.
A Subprocess is an example of a non-atomic Activity.
It is also commonly referred to as a "compound" Activity.

## Parallel Gateway

## Subprocess

This is a self-contained and compound Activity incorporated within a process.

## Swimlane

This is a visual representation that separates processes based on the Participants responsible for performing them.
Swimlanes are comprised of Pools and Lanes.

## Task

## Activity

Refers to the work carried out by an individual or an organization within a process.
Activities can be classified into three categories: Task, Subprocess, and Call Activity.
These activities can be either atomic or non-atomic.
Atomic activities are indivisible and represent single tasks, while non-atomic activities involve multiple steps or subprocesses that work together to achieve a larger objective.

## Workflow

trap 'error_handler ${LINENO} $?' ERR
set -o errtrace -o errexit -o nounset -o pipefail

while IFS= read -r -d '' file; do
  markdown_to_ventilated_prose.py "$file" "$file"
  # ./bin/edit "$file"
done < <(find . -type f -name "*.md" -print0)
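The loop above hands each Markdown file to `markdown_to_ventilated_prose.py`, which rewrites the file in place. Its core job can be sketched roughly like this (a simplified illustration, not the actual script, which also has to respect code fences, tables, and other Markdown structures):

```python
import re

def ventilate(markdown_text: str) -> str:
    """Put each sentence of a prose paragraph on its own line (ventilated prose).

    Simplified sketch: headings, list items, and blank lines pass through
    untouched; other lines are split after sentence-ending punctuation.
    """
    out = []
    for line in markdown_text.splitlines():
        stripped = line.strip()
        # Treat lines starting with Markdown syntax (or digits, for numbered
        # lists) as structure, not prose.
        is_prose = bool(stripped) and not stripped.startswith(
            ("#", "-", "*", ">", "`", "|", "!")
        )
        if is_prose and not stripped[0].isdigit():
            # Break after ., !, or ? when followed by whitespace.
            out.extend(re.split(r"(?<=[.!?])\s+", stripped))
        else:
            out.append(line)
    return "\n".join(out)
```

A real implementation would also need to avoid splitting on abbreviations like "e.g." — this sketch only shows the basic idea of one sentence per line.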
This is called [Ventilated Prose](https://vanemden.wordpress.com/2009/01/01/ventilated-prose/).
You won't be writing the documentation in a word processor, but in simple plain text, and some special syntax that will consistently and professionally format that text.
The basic Markdown syntax is very simple.
Here are some [quick examples](https://commonmark.org/help/).
And here is a great [10 minute tutorial](https://commonmark.org/help/tutorial/).
This will cover a lot of the basics, like bolding text, italics, paragraphs, lists and other common formatting techniques.

![Markdown screenshot](./images/markdown.png "Markdown example")
This is a large documentation effort.
Many different Markdown pages will together make up the full website.
You will mostly use Sphinx in the background - you won't even be aware of it.
But if you decide that you want to alter the theme (the colors, styles, etc.) of the final website, Sphinx controls this and offers [themes](https://sphinx-themes.org/) and the ability to change styles/colors and formatting throughout the site.
You just need to learn a little CSS to control it.
The following is a list of enhancements we wish to complete in the near (or even distant) future.
Automated tests that ensure our performance remains consistent as we add features and functionality.

### Support Multiple Connector Proxies

Service Tasks have been a huge win; there are multiple reasons why supporting more than one Connector Proxy would be beneficial:

1. Connect to several separately hosted services
4. Could support non-http based connectors (git interactions could be a workflow)
### Interstitial Performance

Push all processing to the background so the interstitial is just querying, not running (new item).
### Authentication Keys

Provide the ability to access API endpoints using an access key - or authentication process that is specifically designed for API calls.
(we currently rely on grabbing the JSON token to do this, which is not a real solution)
### Core BPMN features

There are a number of useful BPMN components that we do not currently support.
We should evaluate these and determine which ones we should support and how we should support them.
We should consider creating a list of unsupported items.
* Compensation Events (valuable, but difficult)
* Conditional events.
* Event Sub-Processes are not currently supported (low-hanging fruit, easy to add)
### Decentralized / Distributed Deployments

This is a broad topic and will be covered in a separate document.
But consider a SpiffWorkflow implementation that is deployed across a cluster of systems - and manages transactions on a shared Blockchain implementation.
Such a structure could assure compliance with a set of blessed BPMN diagrams.
Such a system could support highly transparent and auditable processes that could drive a DAO-based organization.
### Improve Parallel Processing

We should support the parallel execution of tasks within a single process whenever possible.
This is not as far-fetched or difficult as it may initially seem.
While Python is notoriously bad at parallel execution (the lovely GIL) - we have already taken the most critical steps to ensure it is possible:

1. A team has demonstrated parallel execution using the core SpiffWorkflow library.
2. We can keep a configurable number of "background" SpiffArena processes running that can pick up waiting tasks.

Given these things are already in place, we just need to lock processes at the task or branch level - so that ready tasks on parallel branches can be picked up by different background processes at the same time.
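A hypothetical sketch of the task-level locking this implies (the names are invented for illustration; in a real deployment the claim registry would live in the shared database, not in process memory): two background workers can each atomically claim a ready task on a different parallel branch, and a worker that loses the race simply moves on to another task.

```python
import threading

class TaskClaims:
    """Tracks which ready tasks are claimed; one mutex guards the registry."""

    def __init__(self):
        self._mutex = threading.Lock()
        self._claimed = set()

    def try_claim(self, task_id: str) -> bool:
        # Atomically claim a task: a second worker asking for the same
        # task id gets False and picks up a task on another branch instead.
        with self._mutex:
            if task_id in self._claimed:
                return False
            self._claimed.add(task_id)
            return True

    def release(self, task_id: str) -> None:
        # Called when the worker finishes (or fails) the task.
        with self._mutex:
            self._claimed.discard(task_id)
```

The key property is that claiming is atomic, so two workers can never execute the same ready task, while ready tasks on sibling branches proceed concurrently.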
### BPMN Definitions at save time vs run time

Improve performance by pre-processing the BPMN Specification and generating the internal JSON representation, so we no longer incur the expense of doing this on a per-process basis.
This will also allow us to do some early and deep validation.
We could really use a good UI / UX review of the site and take a stab at cleaning up the whole site to follow some consistent design patterns and resolve potential issues.
### Customizable Home Page (non Status specific)

Allow a way to define custom landing pages that create different experiences for different organizations / needs.
### Markdown rendering could be better

1. When creating a bulleted or numbered list, no bullets or numbers are displayed. This is a bug in our style sheets - or something that is clearing out all styles.
2. Limit the width of paragraphs to something reasonable. Having a line of text stretch across the entire screen is not a good experience.
3. Add support for MyST - this provides a set of standard extensions to Markdown and is the extension we are using for our own documentation.
4. Add support for parsing and displaying task data / Jinja2 syntax - so you can immediately see how you are formatting the task data. Provide an additional area for setting the task data, and have it render that information in place.
## Administrator / Support Contact Information

Allow defining contact information at the process group and process model level, perhaps at some very top level as well - which can be inherited unless overridden.
This information could then be displayed when a process is in a non-functional state - such as an error, suspended, or terminated state.
It might also be available in the footer or under a help icon when displaying a process instance.
### Process Heatmap

Allow administrators to see an overlay of a BPMN diagram that shows all the process instances in the system and where they are (20 people are waiting on approval, 15 are in the re-review ...).
## Modeler Experience
Can we build a better DMN editor?
Trisotech seems to do it very well.
Would love to have a day or two just to research this area and see if there is just another open source project we can leverage, or if we could build our own tool.
### Modeler Checker

At run time, or when you save, it would be great if we could execute:

* Validation Report - what is wrong with the model? Is it Valid BPMN? Are there intrinsic errors?
* Linting Report! Does the model follow common naming conventions and styles, are there dead-locks, etc.? Many of these tools already exist, we just need to integrate them!
### Plugins and Extensions

* Track down our previous research and add here. Color picker, etc....
### Automated Testing ✔️

Incorporate an end-to-end testing system that will allow you to quickly assure that a BPMN model is working as expected.
Imagine Cypress tests that you could define and execute in the modeler.
### Json Schemas Everywhere!

Our forms are Json Schemas (a description of the data structure) - we could do similar things for Service Tasks, Script Tasks ... such that the modeler is at all times aware of what data is available - making it possible to build and execute a task as it is created.
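For instance (the schema and field names below are made up for illustration, and a real implementation would use a full validator such as the jsonschema package): if a Script Task declared the shape of the data it produces, the modeler could check downstream tasks against it as they are built.

```python
# Hypothetical Json Schema for the output of an invoice-handling Script Task.
# The field names are invented; the schema shape follows the Json Schema standard.
script_task_output_schema = {
    "type": "object",
    "properties": {
        "invoice_id": {"type": "string"},
        "amount": {"type": "number"},
    },
    "required": ["invoice_id", "amount"],
}

def conforms(data: dict, schema: dict) -> bool:
    """Minimal required-fields and type check; a real validator does far more."""
    type_map = {"string": str, "number": (int, float), "object": dict}
    for field in schema.get("required", []):
        if field not in data:
            return False
    for field, spec in schema.get("properties", {}).items():
        if field in data and not isinstance(data[field], type_map[spec["type"]]):
            return False
    return True
```

With schemas like this attached to each task, the modeler could warn immediately when a downstream task references a field that no upstream task provides.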
### Markdown Support for Process Groups and Models

Allow us to define a markdown file for a process group or process model, which would be displayed in the process group or process model in the tile view, or at the top of the details page when a group or model is selected.
### Adding a unit test from within the script editor would be nice

2. RJSF says it supports markdown in the headers, but it doesn't work for us.
### Text Area / Markdown / Select list for Service Task Parameters

The text box for entering parameters for a service task is uncomfortable, verging on maddening when trying to enter long parameters.
It would also be wonderful to offer a selection list if we have a known set of options for a given parameter.
### Moving Models and Groups

Right now we allow editing the Display name of a model or group, but it does not change the name of the underlying directory, making it harder and harder over time to look at GitHub or the file system and find what you are seeing in the display.
### Fast feedback on errors with Python expressions