merged in main, resolved conflicts, and updated keycloak realm file for new path vars w/ burnettk
commit 5cb899ed97
@@ -84,6 +84,7 @@ jobs:
       PRE_COMMIT_COLOR: "always"
       SPIFFWORKFLOW_BACKEND_DATABASE_PASSWORD: password
       SPIFFWORKFLOW_BACKEND_DATABASE_TYPE: ${{ matrix.database }}
+      SPIFFWORKFLOW_BACKEND_RUNNING_IN_CI: 'true'

     steps:
       - name: Check out the repository
@@ -5,4 +5,5 @@
 .dccache
 version_info.json
 .coverage*
 UNKNOWN.egg-info/
+process_models/
@@ -0,0 +1,28 @@
#!/usr/bin/env bash

function error_handler() {
  >&2 echo "Exited with BAD EXIT CODE '${2}' in ${0} script at line: ${1}."
  exit "$2"
}
trap 'error_handler ${LINENO} $?' ERR
set -o errtrace -o errexit -o nounset -o pipefail

if [[ -z "${1:-}" ]]; then
  >&2 echo "usage: $(basename "$0") [SPIFF_EDITOR_BPMN_SPEC_DIR]"
  exit 1
fi

if [[ ! -d "$1" ]]; then
  >&2 echo "ERROR: the first argument must be a directory."
  exit 1
fi

SPIFF_EDITOR_BPMN_SPEC_DIR=$1 \
  docker compose -f editor.docker-compose.yml up -d

echo ""
echo "Spiff Editor is ready."
echo ""
echo "Please open ${SPIFFWORKFLOW_BACKEND_URL_FOR_FRONTEND:-http://localhost:${SPIFFWORKFLOW_FRONTEND_PORT:-8001}}"
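Usage is a single argument naming the process models directory; the script's file name is not shown in this diff, so the path below is hypothetical:

```sh
# hypothetical script name; the argument must be an existing directory
./run_editor.sh ./process_models
```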
@@ -0,0 +1,3 @@
#!/usr/bin/env bash

docker compose -f editor.docker-compose.yml down
@@ -0,0 +1,3 @@
#!/usr/bin/env bash

docker compose -f editor.docker-compose.yml pull
@@ -0,0 +1,48 @@
# BPMN Unit Tests

Software Engineers test their code.
With this feature, BPMN authors can test their creations, too.
These tests can provide you with faster feedback than you would get by simply running your process model, and they allow you to mock out form input and service task connections as well as provide specific input to exercise different branches of your process model.
BPMN unit tests are designed to give you greater confidence that your process models will work as designed when they are run in the wild, both the first time they are used by real users and after you make changes to them.

## Creating BPMN Unit Tests

First, create a process model that you want to test.
Navigate to the process model and add a JSON file based on the name of one of the BPMN files.
For example, if you have a process model that includes a file called `awesome_script_task.bpmn`, your test JSON file would be called `test_awesome_script_task.json`.
If you have multiple BPMN files you want to test, you can have multiple test JSON files.
The BPMN files you test do not have to be marked as the primary file for the process model in question.
The structure of your JSON should be as follows:

    {
      "test_case_1": {
        "tasks": {
          "ServiceTaskProcess:service_task_one": {
            "data": [{ "the_result": "result_from_service" }]
          }
        },
        "expected_output_json": { "the_result": "result_from_service" }
      }
    }

The top-level keys should be names of unit tests.
In this example, the unit test is named "test_case_1."
Under that, you can specify "tasks" and "expected_output_json."

Under "tasks," each key is the BPMN id of a specific task.
If you are testing a file that uses Call Activities and therefore calls other processes, there can be conflicting BPMN ids.
In this case, you can specify the unique activity by prepending the Process id (in the above example, that is "ServiceTaskProcess").
For simple processes, "service_task_one" (for example) would be sufficient as the BPMN id.
For User Tasks, the "data" (under a specific task) represents the data that will be entered by the user in the form.
For Service Tasks, the data represents the data that will be returned by the service.
Note that all User Tasks and Service Tasks must have their BPMN ids mentioned in the JSON file (with mock task data as desired), since otherwise we won't know what to do when the flow arrives at these types of tasks.

The "expected_output_json" represents the state of the task data that you expect when the process completes.
When the test is run, if the actual task data differs from this expectation, the test will fail.
The test will also fail if the process never completes or if an error occurs.
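Because "data" is a list, a task that executes more than once (inside a loop, for example) consumes one entry per execution, in order. A hypothetical test for a User Task that runs twice might look like this (task and variable names are illustrative):

    {
      "test_case_2": {
        "tasks": {
          "user_task_one": {
            "data": [{ "answer": "keep going" }, { "answer": "stop" }]
          }
        },
        "expected_output_json": { "answer": "stop" }
      }
    }
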
## Running BPMN Unit Tests

Go to a process model and either click “Run Unit Tests” to run all tests for the process model or click the “play icon” next to a "test_something.json" file.
You will then see a green check mark or a red X.
You can click on these colored icons to get more details about the passing or failing test.
@@ -1,17 +1,16 @@
-Welcome to SpiffWorkflow's documentation!
-=======================================
+# Welcome to SpiffWorkflow's documentation

 ```{toctree}
 :maxdepth: 2
 :caption: Contents
 quick_start/quick_start.md
 documentation/documentation.md
 how_to/bpmn_unit_tests.md
 ```

 This is great!

-Indices and tables
-==================
+## Indices and tables

 * [](genindex)
 * [](modindex)
@@ -9,9 +9,9 @@

 ## 🚀 Getting Started with SpiffWorkflow

-SpiffWorkflow is a platform that facilitates the execution of business processes performed within the Status platform.
+SpiffWorkflow is a platform that facilitates the execution of business processes. To begin using SpiffWorkflow, it is necessary to acquire the appropriate credentials and URL.

-To access SpiffWorkflow, simply sign in using your Keycloak account. Once you have successfully signed in to the Spiff platform, it is crucial to familiarize yourself with the various sections within the SpiffWorkflow. This will enable you to gain a comprehensive understanding of the interface.
+Upon receiving the credentials, here is how you can get started with SpiffWorkflow!

 ```{image} images/Untitled.png
 :alt: Login Page
@@ -20,13 +20,7 @@ To access SpiffWorkflow, simply sign in using your Keycloak account. Once you ha

 ```{image} images/Untitled_1.png
 :alt: Home Page
 :width: 45%
 ```

-```{admonition} Signing In
-:class: warning
-
-⚠️ In the event that you encounter any difficulties signing in to Spiff, please reach out to Jakub (**@jakubgs**) on Discord for assistance and further guidance.
-:width: 53%
-```

 Here, we will provide a generic overview of each section step by step, allowing you to navigate and engage with the platform more effectively.
@@ -41,7 +35,7 @@ Once you are signed in, you can start exploring the home page. The home page has

 - The "Completed" section allows you to view all completed process instances, including those initiated by you, those initiated by other SpiffWorkflow users with tasks completed by you, and, if applicable, those with tasks completed by a group of which you are a member.
 - The “Start New” section displays the processes you are permitted to start according to your role.

-```{admonition} Signing In
+```{admonition} Key terms
 :class: info
 💡 **Process:** A process is a sequence of tasks that must be completed to achieve a specific goal.
@@ -58,26 +52,20 @@ If you are a member of a team, you may also have one or more Instances with task

 The process section provides a comprehensive view of the process ecosystem by showcasing process groups and process models.

 ```{admonition} Process Groups
 :class: info
-💡 A **process group** is a way of grouping a bunch of **process models.** A **process model** contains all the files necessary to execute a specific process.
+💡 A **process group** is a way of grouping a bunch of **process models**, and a **process model** contains all the files necessary to execute a specific process.
 ```

--

 ![Untitled](images/Untitled_4.png)

 ### Step 3: Explore the Process Instances section

-The Process Instance section provides a detailed view of individual process instances, allowing you to track their progress and manage them effectively. This section includes essential information such as the instance ID, process name, the individual who started the process, the end date, and the current status.
+The Process Instance section provides a detailed view of individual process instances, allowing you to track their progress and manage them effectively.
+
+This section includes essential information such as the instance ID, process name, the individual who started the process, the end date, and the current status.

 ![Untitled](images/Untitled_5.png)

 ```{admonition} Desktop Notifications
 :class: info
 💡 To receive SpiffWorkflow notifications in StatusApp Desktop, the public key from your Status account should be added to your **Bamboo profile**. This will ensure that workflow-related notifications are sent to you.
 ```

-When getting started with SpiffWorkflow, it's essential to take the time to explore and familiarize yourself with the platform's interface and features. Feel free to ask questions about the platform's features or how to get started. The PPG team is always on hand to provide assistance and support when needed.
+When getting started with SpiffWorkflow, it's essential to take the time to explore and familiarize yourself with the platform's interface and features. Feel free to ask questions about the platform's features or how to get started.

 ---
@@ -306,5 +294,3 @@ Ensure that all required details have been included such as Process name, Proces

 ![Untitled](images/Untitled_32.png)

 By following these steps, you can request the special permissions needed to carry out your tasks effectively.
-
-Changes added by Usama
@@ -0,0 +1,66 @@
services:
  spiffworkflow-frontend:
    container_name: spiffworkflow-frontend
    image: ghcr.io/sartography/spiffworkflow-frontend:main-latest
    depends_on:
      spiffworkflow-backend:
        condition: service_healthy
    environment:
      APPLICATION_ROOT: "/"
      PORT0: "${SPIFFWORKFLOW_FRONTEND_PORT:-8001}"
    ports:
      - "${SPIFFWORKFLOW_FRONTEND_PORT:-8001}:${SPIFFWORKFLOW_FRONTEND_PORT:-8001}/tcp"

  spiffworkflow-backend:
    container_name: spiffworkflow-backend
    image: ghcr.io/sartography/spiffworkflow-backend:main-latest
    environment:
      SPIFFWORKFLOW_BACKEND_APPLICATION_ROOT: "/"
      SPIFFWORKFLOW_BACKEND_ENV: "local_development"
      FLASK_DEBUG: "0"
      FLASK_SESSION_SECRET_KEY: "${FLASK_SESSION_SECRET_KEY:-super_secret_key}"
      # WARNING: Frontend is a static site which assumes frontend port - 1 on localhost.
      SPIFFWORKFLOW_BACKEND_URL: "http://localhost:${SPIFF_BACKEND_PORT:-8000}"

      SPIFFWORKFLOW_BACKEND_BPMN_SPEC_ABSOLUTE_DIR: "/app/process_models"
      SPIFFWORKFLOW_BACKEND_CONNECTOR_PROXY_URL: "http://spiffworkflow-connector:8004"
      SPIFFWORKFLOW_BACKEND_DATABASE_TYPE: "sqlite"
      SPIFFWORKFLOW_BACKEND_LOAD_FIXTURE_DATA: "false"
      SPIFFWORKFLOW_BACKEND_OPEN_ID_CLIENT_ID: "spiffworkflow-backend"
      SPIFFWORKFLOW_BACKEND_OPEN_ID_CLIENT_SECRET_KEY: "my_open_id_secret_key"
      SPIFFWORKFLOW_BACKEND_OPEN_ID_SERVER_URL: "http://localhost:${SPIFF_BACKEND_PORT:-8000}/openid"
      SPIFFWORKFLOW_BACKEND_PERMISSIONS_FILE_NAME: "example.yml"
      SPIFFWORKFLOW_BACKEND_PORT: "${SPIFF_BACKEND_PORT:-8000}"
      SPIFFWORKFLOW_BACKEND_RUN_BACKGROUND_SCHEDULER: "false"
      SPIFFWORKFLOW_BACKEND_UPGRADE_DB: "true"
      SPIFFWORKFLOW_BACKEND_URL_FOR_FRONTEND: "http://localhost:${SPIFFWORKFLOW_FRONTEND_PORT:-8001}"
    ports:
      - "${SPIFF_BACKEND_PORT:-8000}:${SPIFF_BACKEND_PORT:-8000}/tcp"
    volumes:
      - "${SPIFF_EDITOR_BPMN_SPEC_DIR:-./process_models}:/app/process_models"
      - ./log:/app/log
    healthcheck:
      test: "curl localhost:${SPIFF_BACKEND_PORT:-8000}/v1.0/status --fail"
      interval: 10s
      timeout: 5s
      retries: 20

  spiffworkflow-connector:
    container_name: spiffworkflow-connector
    image: ghcr.io/sartography/connector-proxy-demo:latest
    environment:
      FLASK_ENV: "${FLASK_ENV:-development}"
      FLASK_DEBUG: "0"
      FLASK_SESSION_SECRET_KEY: "${FLASK_SESSION_SECRET_KEY:-super_secret_key}"
      CONNECTOR_PROXY_PORT: "${SPIFF_CONNECTOR_PORT:-8004}"
    ports:
      - "${SPIFF_CONNECTOR_PORT:-8004}:${SPIFF_CONNECTOR_PORT:-8004}/tcp"
    healthcheck:
      test: "curl localhost:${SPIFF_CONNECTOR_PORT:-8004}/liveness --fail"
      interval: 10s
      timeout: 5s
      retries: 20

volumes:
  spiffworkflow_backend:
    driver: local
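Every host port in the compose file above is parameterized, so the stack can be moved off the defaults at startup; per the WARNING comment, the backend port should stay at (frontend port - 1). A sketch:

```sh
SPIFF_EDITOR_BPMN_SPEC_DIR=./process_models \
SPIFFWORKFLOW_FRONTEND_PORT=9001 \
SPIFF_BACKEND_PORT=9000 \
  docker compose -f editor.docker-compose.yml up -d
```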
@@ -14,13 +14,13 @@ fi

 # shellcheck disable=2016
 mysql -uroot "$database" -e '
-  select u.username user, g.identifier group
+  select u.username username, g.identifier group_name
   FROM `user` u
-  JOIN `user_group_assignment` uga on uga.user_id = u.id
-  JOIN `group` g on g.id = uga.group_id;
+  JOIN `user_group_assignment` uga ON uga.user_id = u.id
+  JOIN `group` g ON g.id = uga.group_id;

   select pa.id, g.identifier group_identifier, pt.uri, permission from permission_assignment pa
-  join principal p on p.id = pa.principal_id
-  join `group` g on g.id = p.group_id
-  join permission_target pt on pt.id = pa.permission_target_id;
+  JOIN principal p ON p.id = pa.principal_id
+  JOIN `group` g ON g.id = p.group_id
+  JOIN permission_target pt ON pt.id = pa.permission_target_id;
'
@@ -1,4 +1,4 @@
-"""Conftest."""
+# noqa
 import os
 import shutil
@@ -25,8 +25,7 @@ from spiffworkflow_backend import create_app  # noqa: E402


 @pytest.fixture(scope="session")
-def app() -> Flask:
-    """App."""
+def app() -> Flask:  # noqa
     os.environ["SPIFFWORKFLOW_BACKEND_ENV"] = "unit_testing"
     os.environ["FLASK_SESSION_SECRET_KEY"] = "e7711a3ba96c46c68e084a86952de16f"
     app = create_app()
@@ -53,8 +52,12 @@ def with_db_and_bpmn_file_cleanup() -> None:


 @pytest.fixture()
-def with_super_admin_user() -> UserModel:
-    """With_super_admin_user."""
-    user = BaseTest.find_or_create_user(username="testadmin1")
-    AuthorizationService.import_permissions_from_yaml_file(user)
+def with_super_admin_user() -> UserModel:  # noqa
+    # this loads all permissions from yaml every time this function is called, which is slow,
+    # so default to just setting up a simple super admin and only run with the "real" permissions in CI
+    if os.environ.get("SPIFFWORKFLOW_BACKEND_RUNNING_IN_CI") == "true":
+        user = BaseTest.find_or_create_user(username="testadmin1")
+        AuthorizationService.import_permissions_from_yaml_file(user)
+    else:
+        user = BaseTest.create_user_with_permission("super_admin")
     return user
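A sketch of a test consuming the fixtures above; pytest injects them by parameter name, and the test body here is illustrative rather than part of this commit:

```python
from flask import Flask

from spiffworkflow_backend.models.user import UserModel


def test_admin_user_exists(app: Flask, with_super_admin_user: UserModel) -> None:
    # outside CI this user comes from create_user_with_permission("super_admin")
    assert with_super_admin_user.username is not None
```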
@@ -52,8 +52,8 @@ docker run \

 script_dir="$( cd -- "$(dirname "$0")" >/dev/null 2>&1 ; pwd -P )"
 cp "${script_dir}/../realm_exports/${realm_name}-realm.json" /tmp/${realm_name}-realm.json
-spiff_subdomain="unused-for-local-dev"
-perl -pi -e "s/{{SPIFF_SUBDOMAIN}}/${spiff_subdomain}/g" /tmp/${realm_name}-realm.json
+spiff_subdomain="for-local-dev.spiffworkflow.org"
+perl -pi -e "s/replace-me-with-spiff-domain-and-api-path-prefix/${spiff_subdomain}/g" /tmp/${realm_name}-realm.json
 docker cp /tmp/${realm_name}-realm.json keycloak:/tmp

 sleep 20
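The substitution can be sanity-checked against a sample string; the placeholder and replacement values are the ones used in the script above:

```sh
echo 'https://replace-me-with-spiff-domain-and-api-path-prefix/*' \
  | perl -pe 's/replace-me-with-spiff-domain-and-api-path-prefix/for-local-dev.spiffworkflow.org/g'
# prints: https://for-local-dev.spiffworkflow.org/*
```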
@@ -3250,7 +3250,7 @@
       "alwaysDisplayInConsole" : false,
       "clientAuthenticatorType" : "client-secret",
       "secret" : "JXeQExm0JhQPLumgHtIIqf52bDalHz0q",
-      "redirectUris" : [ "http://localhost:7000/*", "https://api.unused-for-local-dev.spiffworkflow.org/*", "https://api.replace-me-with-spiff-domain/*", "http://67.205.133.116:7000/*", "http://167.172.242.138:7000/*" ],
+      "redirectUris" : [ "http://localhost:7000/*", "https://replace-me-with-spiff-domain-and-api-path-prefix/*", "http://67.205.133.116:7000/*", "http://167.172.242.138:7000/*" ],
       "webOrigins" : [ ],
       "notBefore" : 0,
       "bearerOnly" : false,
File diff suppressed because it is too large
@@ -79,6 +79,10 @@ SPIFFWORKFLOW_BACKEND_OPEN_ID_TENANT_SPECIFIC_FIELDS = environ.get(
     "SPIFFWORKFLOW_BACKEND_OPEN_ID_TENANT_SPECIFIC_FIELDS"
 )

+SPIFFWORKFLOW_BACKEND_AUTHENTICATION_DISABLED = (
+    environ.get("SPIFFWORKFLOW_BACKEND_AUTHENTICATION_DISABLED", default="false") == "true"
+)
+
 # loggers to use is a comma-separated list of logger prefixes that will be converted to a list of strings
 SPIFFWORKFLOW_BACKEND_LOGGERS_TO_USE = environ.get("SPIFFWORKFLOW_BACKEND_LOGGERS_TO_USE")
@@ -29,16 +29,16 @@ SPIFFWORKFLOW_BACKEND_OPEN_ID_SERVER_URL = environ.get(
 )

 SPIFFWORKFLOW_BACKEND_URL_FOR_FRONTEND = environ.get(
-    'SPIFFWORKFLOW_BACKEND_URL_FOR_FRONTEND',
-    default=f"https://{environment_identifier_for_this_config_file_only}.spiffworkflow.org"
+    "SPIFFWORKFLOW_BACKEND_URL_FOR_FRONTEND",
+    default=f"https://{environment_identifier_for_this_config_file_only}.spiffworkflow.org",
 )
 SPIFFWORKFLOW_BACKEND_URL = environ.get(
-    'SPIFFWORKFLOW_BACKEND_URL',
-    default=f"https://api.{environment_identifier_for_this_config_file_only}.spiffworkflow.org"
+    "SPIFFWORKFLOW_BACKEND_URL",
+    default=f"https://api.{environment_identifier_for_this_config_file_only}.spiffworkflow.org",
 )
 SPIFFWORKFLOW_BACKEND_CONNECTOR_PROXY_URL = environ.get(
-    'SPIFFWORKFLOW_BACKEND_CONNECTOR_PROXY_URL',
-    default=f"https://connector-proxy.{environment_identifier_for_this_config_file_only}.spiffworkflow.org"
+    "SPIFFWORKFLOW_BACKEND_CONNECTOR_PROXY_URL",
+    default=f"https://connector-proxy.{environment_identifier_for_this_config_file_only}.spiffworkflow.org",
 )
 SPIFFWORKFLOW_BACKEND_GIT_PUBLISH_CLONE_URL = environ.get(
     "SPIFFWORKFLOW_BACKEND_GIT_PUBLISH_CLONE_URL",
@@ -15,13 +15,14 @@ if TYPE_CHECKING:
     from spiffworkflow_backend.models.user import UserModel  # noqa: F401


+SPIFF_NO_AUTH_ANONYMOUS_GROUP = "spiff_anonymous_group"
+
+
 class GroupNotFoundError(Exception):
-    """GroupNotFoundError."""
+    pass


 class GroupModel(SpiffworkflowBaseDBModel):
-    """GroupModel."""
-
     __tablename__ = "group"
     __table_args__ = {"extend_existing": True}
@@ -63,7 +63,9 @@ class MessageInstanceModel(SpiffworkflowBaseDBModel):
     failure_cause: str = db.Column(db.Text())
     updated_at_in_seconds: int = db.Column(db.Integer)
     created_at_in_seconds: int = db.Column(db.Integer)
-    correlation_rules = relationship("MessageInstanceCorrelationRuleModel", back_populates="message_instance")
+    correlation_rules = relationship(
+        "MessageInstanceCorrelationRuleModel", back_populates="message_instance", cascade="delete"
+    )

     @validates("message_type")
     def validate_message_type(self, key: str, value: Any) -> Any:
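With cascade="delete" on the relationship, deleting a message instance through the ORM now removes its correlation rules as well; a minimal sketch, assuming an open db session and a persisted instance:

```python
db.session.delete(message_instance)
db.session.commit()  # associated MessageInstanceCorrelationRuleModel rows are deleted too
```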
@@ -32,3 +32,5 @@ class PrincipalModel(SpiffworkflowBaseDBModel):

     user = relationship("UserModel", viewonly=True)
     group = relationship("GroupModel", viewonly=True)
+
+    permission_assignments = relationship("PermissionAssignmentModel", cascade="delete")  # type: ignore
@@ -58,19 +58,16 @@ class ProcessInstanceModel(SpiffworkflowBaseDBModel):
     process_model_identifier: str = db.Column(db.String(255), nullable=False, index=True)
     process_model_display_name: str = db.Column(db.String(255), nullable=False, index=True)
     process_initiator_id: int = db.Column(ForeignKey(UserModel.id), nullable=False, index=True)  # type: ignore
-    process_initiator = relationship("UserModel")

     bpmn_process_definition_id: int | None = db.Column(
         ForeignKey(BpmnProcessDefinitionModel.id), nullable=True, index=True  # type: ignore
     )
-    bpmn_process_definition = relationship(BpmnProcessDefinitionModel)
     bpmn_process_id: int | None = db.Column(ForeignKey(BpmnProcessModel.id), nullable=True, index=True)  # type: ignore
-    bpmn_process = relationship(BpmnProcessModel, cascade="delete")
-    tasks = relationship("TaskModel", cascade="delete")  # type: ignore
-    process_instance_events = relationship("ProcessInstanceEventModel", cascade="delete")  # type: ignore

     spiff_serializer_version = db.Column(db.String(50), nullable=True)

+    process_initiator = relationship("UserModel")
+    bpmn_process_definition = relationship(BpmnProcessDefinitionModel)
+
     active_human_tasks = relationship(
         "HumanTaskModel",
         primaryjoin=(
@@ -78,6 +75,10 @@ class ProcessInstanceModel(SpiffworkflowBaseDBModel):
         ),
     )  # type: ignore

+    bpmn_process = relationship(BpmnProcessModel, cascade="delete")
+    tasks = relationship("TaskModel", cascade="delete")  # type: ignore
+    process_instance_events = relationship("ProcessInstanceEventModel", cascade="delete")  # type: ignore
+    process_instance_file_data = relationship("ProcessInstanceFileDataModel", cascade="delete")  # type: ignore
     human_tasks = relationship(
         "HumanTaskModel",
         cascade="delete",
@@ -12,8 +12,6 @@ from spiffworkflow_backend.models.process_instance import ProcessInstanceModel

 @dataclass
 class ProcessInstanceFileDataModel(SpiffworkflowBaseDBModel):
-    """ProcessInstanceFileDataModel."""
-
     __tablename__ = "process_instance_file_data"

     id: int = db.Column(db.Integer, primary_key=True)
@@ -1,4 +1,3 @@
-"""Script_attributes_context."""
 from dataclasses import dataclass
 from typing import Optional
@@ -7,8 +6,6 @@ from SpiffWorkflow.task import Task as SpiffTask  # type: ignore

 @dataclass
 class ScriptAttributesContext:
-    """ScriptAttributesContext."""
-
     task: Optional[SpiffTask]
     environment_identifier: str
     process_instance_id: Optional[int]
@@ -15,14 +15,15 @@ from spiffworkflow_backend.models.db import SpiffworkflowBaseDBModel
 from spiffworkflow_backend.models.group import GroupModel


+SPIFF_NO_AUTH_ANONYMOUS_USER = "spiff_anonymous_user"
+
+
 class UserNotFoundError(Exception):
-    """UserNotFoundError."""
+    pass


 @dataclass
 class UserModel(SpiffworkflowBaseDBModel):
-    """UserModel."""
-
     __tablename__ = "user"
     __table_args__ = (db.UniqueConstraint("service", "service_id", name="service_key"),)
@@ -47,9 +48,9 @@ class UserModel(SpiffworkflowBaseDBModel):
         secondary="user_group_assignment",
         overlaps="user_group_assignments,users",
     )
-    principal = relationship("PrincipalModel", uselist=False)  # type: ignore
+    principal = relationship("PrincipalModel", uselist=False, cascade="delete")  # type: ignore

-    def encode_auth_token(self) -> str:
+    def encode_auth_token(self, extra_payload: dict | None = None) -> str:
         """Generate the Auth Token.

         :return: string
@@ -59,12 +60,16 @@ class UserModel(SpiffworkflowBaseDBModel):
             raise KeyError("we need current_app.config to have a SECRET_KEY")

         # hours = float(app.config['TOKEN_AUTH_TTL_HOURS'])
-        payload = {
-            # 'exp': datetime.datetime.utcnow() + datetime.timedelta(hours=hours, minutes=0, seconds=0),
-            # 'iat': datetime.datetime.utcnow(),
+        base_payload = {
             "email": self.email,
             "preferred_username": self.username,
             "sub": f"service:{self.service}::service_id:{self.service_id}",
             "token_type": "internal",
         }
+
+        payload = base_payload
+        if extra_payload is not None:
+            payload = {**base_payload, **extra_payload}
         return jwt.encode(
             payload,
             secret_key,
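A round-trip sketch of the new extra_payload parameter, assuming PyJWT with its HS256 default and the user/secret_key names from the method above:

```python
import jwt

token = user.encode_auth_token({"authentication_disabled": True})
decoded = jwt.decode(token, secret_key, algorithms=["HS256"])
assert decoded["token_type"] == "internal"
assert decoded["authentication_disabled"] is True
```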
@@ -6,7 +6,6 @@ import re
 from typing import Any
-from typing import Dict
 from typing import Optional
 from typing import Union

 import flask
 import jwt
@@ -20,6 +19,10 @@ from werkzeug.wrappers import Response

 from spiffworkflow_backend.exceptions.api_error import ApiError
 from spiffworkflow_backend.helpers.api_version import V1_API_PATH_PREFIX
+from spiffworkflow_backend.models.db import db
+from spiffworkflow_backend.models.group import GroupModel
+from spiffworkflow_backend.models.group import SPIFF_NO_AUTH_ANONYMOUS_GROUP
+from spiffworkflow_backend.models.user import SPIFF_NO_AUTH_ANONYMOUS_USER
 from spiffworkflow_backend.models.user import UserModel
 from spiffworkflow_backend.services.authentication_service import AuthenticationService
 from spiffworkflow_backend.services.authentication_service import (
@@ -27,6 +30,7 @@ from spiffworkflow_backend.services.authentication_service import (
 )
 from spiffworkflow_backend.services.authentication_service import TokenExpiredError
 from spiffworkflow_backend.services.authorization_service import AuthorizationService
+from spiffworkflow_backend.services.group_service import GroupService
 from spiffworkflow_backend.services.user_service import UserService

 """
@@ -36,9 +40,7 @@ from spiffworkflow_backend.services.user_service import UserService


 # authorization_exclusion_list = ['status']
-def verify_token(
-    token: Optional[str] = None, force_run: Optional[bool] = False
-) -> Optional[Dict[str, Optional[Union[str, int]]]]:
+def verify_token(token: Optional[str] = None, force_run: Optional[bool] = False) -> None:
     """Verify the token for the user (if provided).

     If in production environment and token is not provided, gets user from the SSO headers and returns their token.
@@ -82,6 +84,22 @@ def verify_token(
                 current_app.logger.error(
                     f"Exception in verify_token getting user from decoded internal token. {e}"
                 )

+            # if the user is the anonymous user and we have auth enabled then make sure we clean up the anonymous user
+            if (
+                user_model
+                and not current_app.config.get("SPIFFWORKFLOW_BACKEND_AUTHENTICATION_DISABLED")
+                and user_model.username == SPIFF_NO_AUTH_ANONYMOUS_USER
+                and user_model.service_id == "spiff_anonymous_service_id"
+            ):
+                group_model = GroupModel.query.filter_by(identifier=SPIFF_NO_AUTH_ANONYMOUS_GROUP).first()
+                db.session.delete(group_model)
+                db.session.delete(user_model)
+                db.session.commit()
+                tld = current_app.config["THREAD_LOCAL_DATA"]
+                tld.user_has_logged_out = True
+                return None
+
         elif "iss" in decoded_token.keys():
             user_info = None
             try:
@@ -196,29 +214,22 @@ def set_new_access_token_in_cookie(
     return response


-def encode_auth_token(sub: str, token_type: Optional[str] = None) -> str:
-    """Generates the Auth Token.
-
-    :return: string
-    """
-    payload = {"sub": sub}
-    if token_type is None:
-        token_type = "internal"  # noqa: S105
-    payload["token_type"] = token_type
-    if "SECRET_KEY" in current_app.config:
-        secret_key = current_app.config.get("SECRET_KEY")
-    else:
-        current_app.logger.error("Missing SECRET_KEY in encode_auth_token")
-        raise ApiError(error_code="encode_error", message="Missing SECRET_KEY in encode_auth_token")
-    return jwt.encode(
-        payload,
-        str(secret_key),
-        algorithm="HS256",
-    )
-
-
 def login(redirect_url: str = "/") -> Response:
-    """Login."""
+    if current_app.config.get("SPIFFWORKFLOW_BACKEND_AUTHENTICATION_DISABLED"):
+        user = UserModel.query.filter_by(username=SPIFF_NO_AUTH_ANONYMOUS_USER).first()
+        if user is None:
+            user = UserService.create_user(
+                SPIFF_NO_AUTH_ANONYMOUS_USER, "spiff_anonymous_service", "spiff_anonymous_service_id"
+            )
+            GroupService.add_user_to_group_or_add_to_waiting(user.username, SPIFF_NO_AUTH_ANONYMOUS_GROUP)
+            AuthorizationService.add_permission_from_uri_or_macro(SPIFF_NO_AUTH_ANONYMOUS_GROUP, "all", "/*")
+        g.user = user
+        g.token = user.encode_auth_token({"authentication_disabled": True})
+        tld = current_app.config["THREAD_LOCAL_DATA"]
+        tld.new_access_token = g.token
+        tld.new_id_token = g.token
+        return redirect(redirect_url)
+
     state = AuthenticationService.generate_state(redirect_url)
     login_redirect_url = AuthenticationService().get_login_redirect_url(state.decode("UTF-8"))
     return redirect(login_redirect_url)
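To exercise this anonymous-login path locally, the flag added to the backend config earlier in this commit has to be truthy before the server starts; a sketch:

```sh
export SPIFFWORKFLOW_BACKEND_AUTHENTICATION_DISABLED=true
# start the backend as usual; hitting /login now creates and signs in the anonymous user
```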
@@ -1,4 +1,3 @@
-"""Authentication_service."""
 import base64
 import enum
 import json
@@ -16,7 +15,7 @@ from spiffworkflow_backend.models.refresh_token import RefreshTokenModel


 class MissingAccessTokenError(Exception):
-    """MissingAccessTokenError."""
+    pass


 class NotAuthorizedError(Exception):
@@ -35,20 +34,22 @@ class UserNotLoggedInError(Exception):


 class TokenExpiredError(Exception):
-    """TokenExpiredError."""
+    pass


 class TokenInvalidError(Exception):
-    """TokenInvalidError."""
+    pass


 class TokenNotProvidedError(Exception):
     pass


-class AuthenticationProviderTypes(enum.Enum):
-    """AuthenticationServiceProviders."""
+class OpenIdConnectionError(Exception):
+    pass


+class AuthenticationProviderTypes(enum.Enum):
     open_id = "open_id"
     internal = "internal"
@@ -78,8 +79,11 @@ class AuthenticationService:
         """All openid systems provide a mapping of static names to the full path of that endpoint."""
         openid_config_url = f"{cls.server_url()}/.well-known/openid-configuration"
         if name not in AuthenticationService.ENDPOINT_CACHE:
-            response = requests.get(openid_config_url)
-            AuthenticationService.ENDPOINT_CACHE = response.json()
+            try:
+                response = requests.get(openid_config_url)
+                AuthenticationService.ENDPOINT_CACHE = response.json()
+            except requests.exceptions.ConnectionError as ce:
+                raise OpenIdConnectionError(f"Cannot connect to given open id url: {openid_config_url}") from ce
         if name not in AuthenticationService.ENDPOINT_CACHE:
             raise Exception(f"Unknown OpenID Endpoint: {name}. Tried to get from {openid_config_url}")
         return AuthenticationService.ENDPOINT_CACHE.get(name, "")
@@ -2,9 +2,13 @@
 from SpiffWorkflow.dmn.parser.BpmnDmnParser import BpmnDmnParser  # type: ignore
 from SpiffWorkflow.spiff.parser.process import SpiffBpmnParser  # type: ignore

+from spiffworkflow_backend.specs.start_event import StartEvent
+

 class MyCustomParser(BpmnDmnParser):  # type: ignore
     """A BPMN and DMN parser that can also parse spiffworkflow-specific extensions."""

     OVERRIDE_PARSER_CLASSES = BpmnDmnParser.OVERRIDE_PARSER_CLASSES
     OVERRIDE_PARSER_CLASSES.update(SpiffBpmnParser.OVERRIDE_PARSER_CLASSES)
+
+    StartEvent.register_parser_class(OVERRIDE_PARSER_CLASSES)
@@ -134,7 +134,6 @@ class MessageService:
     def get_process_instance_for_message_instance(
         message_instance_receive: MessageInstanceModel,
     ) -> ProcessInstanceModel:
-        """Process_message_receive."""
         process_instance_receive: ProcessInstanceModel = ProcessInstanceModel.query.filter_by(
             id=message_instance_receive.process_instance_id
         ).first()
@@ -157,7 +156,6 @@ class MessageService:
         message_model_name: str,
         message_payload: dict,
     ) -> None:
-        """process_message_receive."""
         processor_receive = ProcessInstanceProcessor(process_instance_receive)
         processor_receive.bpmn_process_instance.catch_bpmn_message(message_model_name, message_payload)
         processor_receive.do_engine_steps(save=True)
@@ -101,7 +101,11 @@ from spiffworkflow_backend.services.workflow_execution_service import (
 from spiffworkflow_backend.services.workflow_execution_service import (
     WorkflowExecutionService,
 )
+from spiffworkflow_backend.specs.start_event import (
+    StartEvent,
+)
+
+StartEvent.register_converter(SPIFF_SPEC_CONFIG)

 # Sorry about all this crap. I wanted to move this thing to another file, but
 # importing a bunch of types causes circular imports.
@@ -46,6 +46,7 @@ from spiffworkflow_backend.services.process_instance_queue_service import (
     ProcessInstanceQueueService,
 )
 from spiffworkflow_backend.services.process_model_service import ProcessModelService
+from spiffworkflow_backend.services.workflow_service import WorkflowService


 class ProcessInstanceService:
@@ -54,6 +55,17 @@ class ProcessInstanceService:
     FILE_DATA_DIGEST_PREFIX = "spifffiledatadigest+"
     TASK_STATE_LOCKED = "locked"

+    @staticmethod
+    def calculate_start_delay_in_seconds(process_instance_model: ProcessInstanceModel) -> int:
+        try:
+            processor = ProcessInstanceProcessor(process_instance_model)
+            delay_in_seconds = WorkflowService.calculate_run_at_delay_in_seconds(
+                processor.bpmn_process_instance, datetime.now(timezone.utc)
+            )
+        except Exception:
+            delay_in_seconds = 0
+        return delay_in_seconds
+
     @classmethod
     def create_process_instance(
         cls,
@@ -77,7 +89,8 @@ class ProcessInstanceService:
         )
         db.session.add(process_instance_model)
         db.session.commit()
-        run_at_in_seconds = round(time.time())
+        delay_in_seconds = cls.calculate_start_delay_in_seconds(process_instance_model)
+        run_at_in_seconds = round(time.time()) + delay_in_seconds
         ProcessInstanceQueueService.enqueue_new_process_instance(process_instance_model, run_at_in_seconds)
         return process_instance_model
@@ -3,10 +3,10 @@ import json
 import os
 import re
 import traceback
+from abc import abstractmethod
 from dataclasses import dataclass
 from typing import Any
 from typing import Callable
 from typing import Optional
+from typing import Type
 from typing import Union

 from lxml import etree  # type: ignore
@@ -34,6 +34,14 @@ class MissingInputTaskData(Exception):
     pass


+class UnsupporterRunnerDelegateGiven(Exception):
+    pass
+
+
+class BpmnFileMissingExecutableProcessError(Exception):
+    pass
+
+
 @dataclass
 class TestCaseErrorDetails:
     error_messages: list[str]
@@ -53,6 +61,124 @@ class TestCaseResult:
    test_case_error_details: Optional[TestCaseErrorDetails] = None


class ProcessModelTestRunnerDelegate:
    """Abstract class for the process model test runner delegate.

    All delegates MUST inherit from this class.
    """

    def __init__(
        self,
        process_model_directory_path: str,
    ) -> None:
        self.process_model_directory_path = process_model_directory_path

    @abstractmethod
    def instantiate_executer(self, bpmn_file: str) -> BpmnWorkflow:
        raise NotImplementedError("method instantiate_executer must be implemented")

    @abstractmethod
    def execute_task(self, spiff_task: SpiffTask, task_data_for_submit: Optional[dict] = None) -> None:
        raise NotImplementedError("method execute_task must be implemented")

    @abstractmethod
    def get_next_task(self, bpmn_process_instance: BpmnWorkflow) -> Optional[SpiffTask]:
        raise NotImplementedError("method get_next_task must be implemented")


class ProcessModelTestRunnerMostlyPureSpiffDelegate(ProcessModelTestRunnerDelegate):
    def __init__(
        self,
        process_model_directory_path: str,
    ) -> None:
        super().__init__(process_model_directory_path)
        self.bpmn_processes_to_file_mappings: dict[str, str] = {}
        self.bpmn_files_to_called_element_mappings: dict[str, list[str]] = {}
        self._discover_process_model_processes()

    def instantiate_executer(self, bpmn_file: str) -> BpmnWorkflow:
        parser = MyCustomParser()
        bpmn_file_etree = self._get_etree_from_bpmn_file(bpmn_file)
        parser.add_bpmn_xml(bpmn_file_etree, filename=os.path.basename(bpmn_file))
        all_related = self._find_related_bpmn_files(bpmn_file)
        for related_file in all_related:
            related_file_etree = self._get_etree_from_bpmn_file(related_file)
            parser.add_bpmn_xml(related_file_etree, filename=os.path.basename(related_file))
        sub_parsers = list(parser.process_parsers.values())
        executable_process = None
        for sub_parser in sub_parsers:
            if sub_parser.process_executable:
                executable_process = sub_parser.bpmn_id
        if executable_process is None:
            raise BpmnFileMissingExecutableProcessError(
                f"Executable process cannot be found in {bpmn_file}. Test cannot run."
            )
        bpmn_process_spec = parser.get_spec(executable_process)
        bpmn_process_instance = BpmnWorkflow(bpmn_process_spec)
        return bpmn_process_instance

    def execute_task(self, spiff_task: SpiffTask, task_data_for_submit: Optional[dict] = None) -> None:
        if task_data_for_submit is not None or spiff_task.task_spec.manual:
            if task_data_for_submit is not None:
                spiff_task.update_data(task_data_for_submit)
            spiff_task.complete()
        else:
            spiff_task.run()

    def get_next_task(self, bpmn_process_instance: BpmnWorkflow) -> Optional[SpiffTask]:
        ready_tasks = list([t for t in bpmn_process_instance.get_tasks(TaskState.READY)])
        if len(ready_tasks) > 0:
            return ready_tasks[0]
        return None

    def _get_etree_from_bpmn_file(self, bpmn_file: str) -> etree._Element:
        data = None
        with open(bpmn_file, "rb") as f_handle:
            data = f_handle.read()
        etree_xml_parser = etree.XMLParser(resolve_entities=False)
        return etree.fromstring(data, parser=etree_xml_parser)

    def _find_related_bpmn_files(self, bpmn_file: str) -> list[str]:
        related_bpmn_files = []
        if bpmn_file in self.bpmn_files_to_called_element_mappings:
            for bpmn_process_identifier in self.bpmn_files_to_called_element_mappings[bpmn_file]:
                if bpmn_process_identifier in self.bpmn_processes_to_file_mappings:
                    new_file = self.bpmn_processes_to_file_mappings[bpmn_process_identifier]
                    related_bpmn_files.append(new_file)
                    related_bpmn_files.extend(self._find_related_bpmn_files(new_file))
        return related_bpmn_files

    def _discover_process_model_processes(
        self,
    ) -> None:
        process_model_bpmn_file_glob = os.path.join(self.process_model_directory_path, "**", "*.bpmn")

        for file in glob.glob(process_model_bpmn_file_glob, recursive=True):
            file_norm = os.path.normpath(file)
            if file_norm not in self.bpmn_files_to_called_element_mappings:
                self.bpmn_files_to_called_element_mappings[file_norm] = []
            with open(file_norm, "rb") as f:
                file_contents = f.read()
            etree_xml_parser = etree.XMLParser(resolve_entities=False)

            # if we cannot load process model then ignore it since it can cause errors unrelated
            # to the test and if it is related, it will most likely be caught further along the test
            try:
                root = etree.fromstring(file_contents, parser=etree_xml_parser)
            except etree.XMLSyntaxError:
                continue

            call_activities = root.findall(".//bpmn:callActivity", namespaces=DEFAULT_NSMAP)
            for call_activity in call_activities:
                if "calledElement" in call_activity.attrib:
                    called_element = call_activity.attrib["calledElement"]
                    self.bpmn_files_to_called_element_mappings[file_norm].append(called_element)
            bpmn_process_element = root.find('.//bpmn:process[@isExecutable="true"]', namespaces=DEFAULT_NSMAP)
            if bpmn_process_element is not None:
                bpmn_process_identifier = bpmn_process_element.attrib["id"]
                self.bpmn_processes_to_file_mappings[bpmn_process_identifier] = file_norm


DEFAULT_NSMAP = {
    "bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL",
    "bpmndi": "http://www.omg.org/spec/BPMN/20100524/DI",
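A sketch of extending the delegate abstraction: a hypothetical delegate that logs each task before deferring to the pure-spiff behavior, handed to the runner defined below. The runner's kickoff method is not shown in this hunk, so run() is an assumption; only all_test_cases_passed() and failing_tests() appear in this diff.

```python
from typing import Optional

from SpiffWorkflow.task import Task as SpiffTask  # type: ignore


class LoggingSpiffDelegate(ProcessModelTestRunnerMostlyPureSpiffDelegate):
    """Hypothetical delegate: pure-spiff execution plus per-task logging."""

    def execute_task(self, spiff_task: SpiffTask, task_data_for_submit: Optional[dict] = None) -> None:
        print(f"executing task: {spiff_task.task_spec.bpmn_id}")
        super().execute_task(spiff_task, task_data_for_submit)


test_runner = ProcessModelTestRunner(
    "process_models/my-model",  # hypothetical path
    process_model_test_runner_delegate_class=LoggingSpiffDelegate,
)
test_runner.run()  # assumption: the runner exposes run() to execute the discovered tests
assert test_runner.all_test_cases_passed()
```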
@@ -93,18 +219,16 @@ JSON file format:


 class ProcessModelTestRunner:
-    """Generic test runner code. May move into own library at some point.
+    """Runs the test case json files for a given process model directory.

-    KEEP THIS GENERIC. do not add backend specific code here.
+    It searches for test case files recursively and will run all it finds by default.
     """

     def __init__(
         self,
         process_model_directory_path: str,
+        process_model_test_runner_delegate_class: Type = ProcessModelTestRunnerMostlyPureSpiffDelegate,
         process_model_directory_for_test_discovery: Optional[str] = None,
-        instantiate_executer_callback: Optional[Callable[[str], Any]] = None,
-        execute_task_callback: Optional[Callable[[Any, Optional[str], Optional[dict]], Any]] = None,
-        get_next_task_callback: Optional[Callable[[Any], Any]] = None,
         test_case_file: Optional[str] = None,
         test_case_identifier: Optional[str] = None,
     ) -> None:
@@ -112,21 +236,24 @@ class ProcessModelTestRunner:
         self.process_model_directory_for_test_discovery = (
             process_model_directory_for_test_discovery or process_model_directory_path
         )
-        self.instantiate_executer_callback = instantiate_executer_callback
-        self.execute_task_callback = execute_task_callback
-        self.get_next_task_callback = get_next_task_callback
         self.test_case_file = test_case_file
         self.test_case_identifier = test_case_identifier

-        # keep track of the current task data index
-        self.task_data_index: dict[str, int] = {}
-        self.test_case_results: list[TestCaseResult] = []
-        self.bpmn_processes_to_file_mappings: dict[str, str] = {}
-        self.bpmn_files_to_called_element_mappings: dict[str, list[str]] = {}
+        if not issubclass(process_model_test_runner_delegate_class, ProcessModelTestRunnerDelegate):
+            raise UnsupporterRunnerDelegateGiven(
+                "Process model test runner delegate must inherit from ProcessModelTestRunnerDelegate. Given"
+                f" class '{process_model_test_runner_delegate_class}' does not"
+            )
+        self.process_model_test_runner_delegate = process_model_test_runner_delegate_class(
+            process_model_directory_path
+        )

         self.test_mappings = self._discover_process_model_test_cases()
-        self._discover_process_model_processes()
+        self.test_case_results: list[TestCaseResult] = []
+
+        # keep track of the current task data index
+        self.task_data_index: dict[str, int] = {}

     def all_test_cases_passed(self) -> bool:
         failed_tests = self.failing_tests()
@@ -178,7 +305,9 @@ class ProcessModelTestRunner:
         test_case_task_properties = test_case_contents["tasks"][test_case_task_key]

         task_type = next_task.task_spec.__class__.__name__
-        if task_type in ["ServiceTask", "UserTask", "CallActivity"] and test_case_task_properties is None:
+        if task_type in ["ServiceTask", "UserTask", "CallActivity"] and (
+            test_case_task_properties is None or "data" not in test_case_task_properties
+        ):
             raise UnrunnableTestCaseError(
                 f"Cannot run test case '{test_case_identifier}'. It requires task data for"
                 f" {next_task.task_spec.bpmn_id} because it is of type '{task_type}'"
@@ -207,138 +336,29 @@ class ProcessModelTestRunner:
         ]
         self._add_test_result(error_message is None, bpmn_file, test_case_identifier, error_message)

-    def _discover_process_model_test_cases(
-        self,
-    ) -> dict[str, str]:
-        test_mappings = {}
-
-        json_test_file_glob = os.path.join(self.process_model_directory_for_test_discovery, "**", "test_*.json")
-
-        for file in glob.glob(json_test_file_glob, recursive=True):
-            file_norm = os.path.normpath(file)
-            file_dir = os.path.dirname(file_norm)
-            json_file_name = os.path.basename(file_norm)
-            if self.test_case_file is None or json_file_name == self.test_case_file:
-                bpmn_file_name = re.sub(r"^test_(.*)\.json", r"\1.bpmn", json_file_name)
-                bpmn_file_path = os.path.join(file_dir, bpmn_file_name)
-                if os.path.isfile(bpmn_file_path):
-                    test_mappings[file_norm] = bpmn_file_path
-                else:
-                    raise MissingBpmnFileForTestCaseError(
-                        f"Cannot find a matching bpmn file for test case json file: '{file_norm}'"
-                    )
-        return test_mappings
-
-    def _discover_process_model_processes(
-        self,
-    ) -> None:
-        process_model_bpmn_file_glob = os.path.join(self.process_model_directory_path, "**", "*.bpmn")
-
-        for file in glob.glob(process_model_bpmn_file_glob, recursive=True):
-            file_norm = os.path.normpath(file)
-            if file_norm not in self.bpmn_files_to_called_element_mappings:
-                self.bpmn_files_to_called_element_mappings[file_norm] = []
-            with open(file_norm, "rb") as f:
-                file_contents = f.read()
-            etree_xml_parser = etree.XMLParser(resolve_entities=False)
-
-            # if we cannot load process model then ignore it since it can cause errors unrelated
-            # to the test and if it is related, it will most likely be caught further along the test
-            try:
-                root = etree.fromstring(file_contents, parser=etree_xml_parser)
-            except etree.XMLSyntaxError:
-                continue
-
-            call_activities = root.findall(".//bpmn:callActivity", namespaces=DEFAULT_NSMAP)
-            for call_activity in call_activities:
-                if "calledElement" in call_activity.attrib:
-                    called_element = call_activity.attrib["calledElement"]
-                    self.bpmn_files_to_called_element_mappings[file_norm].append(called_element)
-            bpmn_process_element = root.find('.//bpmn:process[@isExecutable="true"]', namespaces=DEFAULT_NSMAP)
-            if bpmn_process_element is not None:
-                bpmn_process_identifier = bpmn_process_element.attrib["id"]
-                self.bpmn_processes_to_file_mappings[bpmn_process_identifier] = file_norm
-
     def _execute_task(
         self, spiff_task: SpiffTask, test_case_task_key: Optional[str], test_case_task_properties: Optional[dict]
     ) -> None:
-        if self.execute_task_callback:
-            self.execute_task_callback(spiff_task, test_case_task_key, test_case_task_properties)
-        self._default_execute_task(spiff_task, test_case_task_key, test_case_task_properties)
+        task_data_for_submit = None
+        if test_case_task_key and test_case_task_properties and "data" in test_case_task_properties:
+            if test_case_task_key not in self.task_data_index:
+                self.task_data_index[test_case_task_key] = 0
+            task_data_length = len(test_case_task_properties["data"])
+            test_case_index = self.task_data_index[test_case_task_key]
+            if task_data_length <= test_case_index:
+                raise MissingInputTaskData(
+                    f"Missing input task data for task: {test_case_task_key}. "
+                    f"Only {task_data_length} given in the json but task was called {test_case_index + 1} times"
+                )
+            task_data_for_submit = test_case_task_properties["data"][test_case_index]
+            self.task_data_index[test_case_task_key] += 1
+        self.process_model_test_runner_delegate.execute_task(spiff_task, task_data_for_submit)

     def _get_next_task(self, bpmn_process_instance: BpmnWorkflow) -> Optional[SpiffTask]:
-        if self.get_next_task_callback:
-            return self.get_next_task_callback(bpmn_process_instance)
-        return self._default_get_next_task(bpmn_process_instance)
+        return self.process_model_test_runner_delegate.get_next_task(bpmn_process_instance)

     def _instantiate_executer(self, bpmn_file: str) -> BpmnWorkflow:
-        if self.instantiate_executer_callback:
-            return self.instantiate_executer_callback(bpmn_file)
-        return self._default_instantiate_executer(bpmn_file)
-
-    def _default_get_next_task(self, bpmn_process_instance: BpmnWorkflow) -> Optional[SpiffTask]:
-        ready_tasks = list([t for t in bpmn_process_instance.get_tasks(TaskState.READY)])
-        if len(ready_tasks) > 0:
-            return ready_tasks[0]
-        return None
-
-    def _default_execute_task(
-        self, spiff_task: SpiffTask, test_case_task_key: Optional[str], test_case_task_properties: Optional[dict]
-    ) -> None:
-        if spiff_task.task_spec.manual or spiff_task.task_spec.__class__.__name__ == "ServiceTask":
-            if test_case_task_key and test_case_task_properties and "data" in test_case_task_properties:
-                if test_case_task_key not in self.task_data_index:
-                    self.task_data_index[test_case_task_key] = 0
-                task_data_length = len(test_case_task_properties["data"])
-                test_case_index = self.task_data_index[test_case_task_key]
-                if task_data_length <= test_case_index:
-                    raise MissingInputTaskData(
-                        f"Missing input task data for task: {test_case_task_key}. "
-                        f"Only {task_data_length} given in the json but task was called {test_case_index + 1} times"
-                    )
-                spiff_task.update_data(test_case_task_properties["data"][test_case_index])
-                self.task_data_index[test_case_task_key] += 1
-            spiff_task.complete()
-        else:
-            spiff_task.run()
-
-    def _find_related_bpmn_files(self, bpmn_file: str) -> list[str]:
-        related_bpmn_files = []
-        if bpmn_file in self.bpmn_files_to_called_element_mappings:
-            for bpmn_process_identifier in self.bpmn_files_to_called_element_mappings[bpmn_file]:
-                if bpmn_process_identifier in self.bpmn_processes_to_file_mappings:
-                    new_file = self.bpmn_processes_to_file_mappings[bpmn_process_identifier]
-                    related_bpmn_files.append(new_file)
-                    related_bpmn_files.extend(self._find_related_bpmn_files(new_file))
-        return related_bpmn_files
-
-    def _get_etree_from_bpmn_file(self, bpmn_file: str) -> etree._Element:
-        data = None
-        with open(bpmn_file, "rb") as f_handle:
-            data = f_handle.read()
-        etree_xml_parser = etree.XMLParser(resolve_entities=False)
-        return etree.fromstring(data, parser=etree_xml_parser)
-
-    def _default_instantiate_executer(self, bpmn_file: str) -> BpmnWorkflow:
-        parser = MyCustomParser()
-        bpmn_file_etree = self._get_etree_from_bpmn_file(bpmn_file)
-        parser.add_bpmn_xml(bpmn_file_etree, filename=os.path.basename(bpmn_file))
-        all_related = self._find_related_bpmn_files(bpmn_file)
-        for related_file in all_related:
-            related_file_etree = self._get_etree_from_bpmn_file(related_file)
-            parser.add_bpmn_xml(related_file_etree, filename=os.path.basename(related_file))
-        sub_parsers = list(parser.process_parsers.values())
-        executable_process = None
-        for sub_parser in sub_parsers:
-            if sub_parser.process_executable:
-                executable_process = sub_parser.bpmn_id
-        if executable_process is None:
-            raise BpmnFileMissingExecutableProcessError(
-                f"Executable process cannot be found in {bpmn_file}. Test cannot run."
-            )
-        bpmn_process_spec = parser.get_spec(executable_process)
-        bpmn_process_instance = BpmnWorkflow(bpmn_process_spec)
-        return bpmn_process_instance
+        return self.process_model_test_runner_delegate.instantiate_executer(bpmn_file)

     def _get_relative_path_of_bpmn_file(self, bpmn_file: str) -> str:
         return os.path.relpath(bpmn_file, start=self.process_model_directory_path)
@@ -382,8 +402,29 @@ class ProcessModelTestRunner:
         )
         self.test_case_results.append(test_result)

+    def _discover_process_model_test_cases(
+        self,
+    ) -> dict[str, str]:
+        test_mappings = {}
+        json_test_file_glob = os.path.join(self.process_model_directory_for_test_discovery, "**", "test_*.json")
+
+        for file in glob.glob(json_test_file_glob, recursive=True):
+            file_norm = os.path.normpath(file)
+            file_dir = os.path.dirname(file_norm)
+            json_file_name = os.path.basename(file_norm)
+            if self.test_case_file is None or json_file_name == self.test_case_file:
+                bpmn_file_name = re.sub(r"^test_(.*)\.json", r"\1.bpmn", json_file_name)
+                bpmn_file_path = os.path.join(file_dir, bpmn_file_name)
+                if os.path.isfile(bpmn_file_path):
+                    test_mappings[file_norm] = bpmn_file_path
+                else:
+                    raise MissingBpmnFileForTestCaseError(
+                        f"Cannot find a matching bpmn file for test case json file: '{file_norm}'"
+                    )
+        return test_mappings

-class BpmnFileMissingExecutableProcessError(Exception):
-    pass

+class ProcessModeltTestRunnerBackendDelegate(ProcessModelTestRunnerMostlyPureSpiffDelegate):
+    pass
@@ -398,9 +439,7 @@ class ProcessModelTestRunnerService:
             process_model_directory_path,
             test_case_file=test_case_file,
             test_case_identifier=test_case_identifier,
-            # instantiate_executer_callback=self._instantiate_executer_callback,
-            # execute_task_callback=self._execute_task_callback,
-            # get_next_task_callback=self._get_next_task_callback,
+            process_model_test_runner_delegate_class=ProcessModeltTestRunnerBackendDelegate,
         )

     def run(self) -> None:
@@ -33,7 +33,6 @@ class UserService:
         tenant_specific_field_2: Optional[str] = None,
         tenant_specific_field_3: Optional[str] = None,
     ) -> UserModel:
-        """Create_user."""
         user_model: Optional[UserModel] = (
             UserModel.query.filter(UserModel.service == service).filter(UserModel.service_id == service_id).first()
         )
@@ -484,11 +484,11 @@ class WorkflowExecutionService:
             )
             for correlation_property in event["value"]:
                 message_correlation = MessageInstanceCorrelationRuleModel(
-                    message_instance_id=message_instance.id,
+                    message_instance=message_instance,
                     name=correlation_property.name,
                     retrieval_expression=correlation_property.retrieval_expression,
                 )
-                db.session.add(message_correlation)
+                message_instance.correlation_rules.append(message_correlation)
             db.session.add(message_instance)

         bpmn_process = self.process_instance_model.bpmn_process
@ -0,0 +1,39 @@
"""workflow_service."""
from datetime import datetime

from SpiffWorkflow.bpmn.workflow import BpmnWorkflow  # type: ignore
from SpiffWorkflow.task import Task as SpiffTask  # type: ignore
from SpiffWorkflow.task import TaskState

from spiffworkflow_backend.specs.start_event import StartEvent


class WorkflowService:
    """WorkflowService."""

    @classmethod
    def future_start_events(cls, workflow: BpmnWorkflow) -> list[SpiffTask]:
        return [t for t in workflow.get_tasks(TaskState.FUTURE) if isinstance(t.task_spec, StartEvent)]

    @classmethod
    def next_start_event_delay_in_seconds(cls, workflow: BpmnWorkflow, now_in_utc: datetime) -> int:
        start_events = cls.future_start_events(workflow)
        start_delays: list[int] = []
        for start_event in start_events:
            start_delay = start_event.task_spec.start_delay_in_seconds(start_event, now_in_utc)
            start_delays.append(start_delay)
        start_delays.sort()
        return start_delays[0] if len(start_delays) > 0 else 0

    @classmethod
    def calculate_run_at_delay_in_seconds(cls, workflow: BpmnWorkflow, now_in_utc: datetime) -> int:
        # TODO: for now we are using the first start time because I am not sure how multiple
        # start events should work. I think the right answer is to take the earliest start
        # time and have later start events stay FUTURE/WAITING?, then we need to be able
        # to respect the other start events when enqueue'ing.
        #
        # TODO: this method should also expand to include other FUTURE/WAITING timers when
        # enqueue'ing so that we don't have to check timers every 10 or whatever seconds
        # right now we assume that this is being called to create a process

        return cls.next_start_event_delay_in_seconds(workflow, now_in_utc)
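The delay selection in `next_start_event_delay_in_seconds` reduces to: collect one delay per future start event, sort, and take the smallest, defaulting to 0. A tiny sketch of that rule with hypothetical values:

    # Hypothetical per-start-event delays, in seconds.
    start_delays = [300, 30]

    start_delays.sort()
    delay = start_delays[0] if len(start_delays) > 0 else 0
    assert delay == 30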
@ -0,0 +1 @@
"""docstring."""
@ -0,0 +1,62 @@
from datetime import datetime
from typing import Any
from typing import Dict

from SpiffWorkflow.bpmn.parser.util import full_tag  # type: ignore
from SpiffWorkflow.bpmn.serializer.task_spec import EventConverter  # type: ignore
from SpiffWorkflow.bpmn.serializer.task_spec import StartEventConverter as DefaultStartEventConverter
from SpiffWorkflow.bpmn.specs.defaults import StartEvent as DefaultStartEvent  # type: ignore
from SpiffWorkflow.bpmn.specs.event_definitions import CycleTimerEventDefinition  # type: ignore
from SpiffWorkflow.bpmn.specs.event_definitions import DurationTimerEventDefinition
from SpiffWorkflow.bpmn.specs.event_definitions import NoneEventDefinition
from SpiffWorkflow.bpmn.specs.event_definitions import TimeDateEventDefinition
from SpiffWorkflow.bpmn.specs.event_definitions import TimerEventDefinition
from SpiffWorkflow.spiff.parser.event_parsers import SpiffStartEventParser  # type: ignore
from SpiffWorkflow.task import Task as SpiffTask  # type: ignore


# TODO: cycle timers and repeat counts?
class StartEvent(DefaultStartEvent):  # type: ignore
    def __init__(self, wf_spec, bpmn_id, event_definition, **kwargs):  # type: ignore
        if isinstance(event_definition, TimerEventDefinition):
            super().__init__(wf_spec, bpmn_id, NoneEventDefinition(), **kwargs)
            self.timer_definition = event_definition
        else:
            super().__init__(wf_spec, bpmn_id, event_definition, **kwargs)
            self.timer_definition = None

    @staticmethod
    def register_converter(spec_config: Dict[str, Any]) -> None:
        spec_config["task_specs"].remove(DefaultStartEventConverter)
        spec_config["task_specs"].append(StartEventConverter)

    @staticmethod
    def register_parser_class(parser_config: Dict[str, Any]) -> None:
        parser_config[full_tag("startEvent")] = (SpiffStartEventParser, StartEvent)

    def start_delay_in_seconds(self, my_task: SpiffTask, now_in_utc: datetime) -> int:
        script_engine = my_task.workflow.script_engine
        evaluated_expression = None
        parsed_duration = None

        if isinstance(self.timer_definition, TimerEventDefinition) and script_engine is not None:
            evaluated_expression = script_engine.evaluate(my_task, self.timer_definition.expression)

        if evaluated_expression is not None:
            if isinstance(self.timer_definition, TimeDateEventDefinition):
                parsed_duration = TimerEventDefinition.parse_time_or_duration(evaluated_expression)
                time_delta = parsed_duration - now_in_utc
                return time_delta.seconds  # type: ignore
            elif isinstance(self.timer_definition, DurationTimerEventDefinition):
                parsed_duration = TimerEventDefinition.parse_iso_duration(evaluated_expression)
                time_delta = TimerEventDefinition.get_timedelta_from_start(parsed_duration, now_in_utc)
                return time_delta.seconds  # type: ignore
            elif isinstance(self.timer_definition, CycleTimerEventDefinition):
                return 0

        return 0


class StartEventConverter(EventConverter):  # type: ignore
    def __init__(self, registry):  # type: ignore
        super().__init__(StartEvent, registry)
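Both registration hooks mutate a config mapping in place. A minimal sketch of the parser-side wiring, mirroring the `CustomBpmnDmnParser` used by the tests later in this commit; the subclass name here is hypothetical:

    from SpiffWorkflow.dmn.parser.BpmnDmnParser import BpmnDmnParser  # type: ignore

    from spiffworkflow_backend.specs.start_event import StartEvent


    class TimerAwareParser(BpmnDmnParser):  # type: ignore
        # copy the defaults, then point the startEvent tag at the custom StartEvent
        OVERRIDE_PARSER_CLASSES = dict(BpmnDmnParser.OVERRIDE_PARSER_CLASSES)
        StartEvent.register_parser_class(OVERRIDE_PARSER_CLASSES)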
@ -34,8 +34,8 @@ class TestMessageService(BaseTest):
            "amount": "100.00",
        }

        # Load up the definition for the receiving process (it has a message start event that should cause it to
        # fire when a unique message comes through.
        # Load up the definition for the receiving process
        # It has a message start event that should cause it to fire when a unique message comes through
        # Fire up the first process
        load_test_spec(
            "test_group/message_receive",
@ -8,6 +8,7 @@ from tests.spiffworkflow_backend.helpers.base_test import BaseTest

from spiffworkflow_backend.services.process_model_test_runner_service import NoTestCasesFoundError
from spiffworkflow_backend.services.process_model_test_runner_service import ProcessModelTestRunner
from spiffworkflow_backend.services.process_model_test_runner_service import UnsupporterRunnerDelegateGiven


class TestProcessModelTestRunner(BaseTest):
@ -29,6 +30,16 @@ class TestProcessModelTestRunner(BaseTest):
        process_model_test_runner.run()
        assert process_model_test_runner.all_test_cases_passed(), process_model_test_runner.test_case_results

    def test_will_raise_if_bad_delegate_is_given(
        self,
        app: Flask,
        with_db_and_bpmn_file_cleanup: None,
    ) -> None:
        with pytest.raises(UnsupporterRunnerDelegateGiven):
            ProcessModelTestRunner(
                os.path.join(self.root_path(), "DNE"), process_model_test_runner_delegate_class=NoTestCasesFoundError
            )

    def test_can_test_multiple_process_models_with_all_passing_tests(
        self,
        app: Flask,
@ -0,0 +1,135 @@
"""Test_workflow_service."""
from datetime import datetime
from datetime import timedelta
from datetime import timezone
from typing import Generator

import pytest
from SpiffWorkflow.bpmn.workflow import BpmnWorkflow  # type: ignore
from SpiffWorkflow.dmn.parser.BpmnDmnParser import BpmnDmnParser  # type: ignore
from SpiffWorkflow.spiff.parser.process import SpiffBpmnParser  # type: ignore
from tests.spiffworkflow_backend.helpers.base_test import BaseTest

from spiffworkflow_backend.services.workflow_service import (
    WorkflowService,
)
from spiffworkflow_backend.specs.start_event import StartEvent

BPMN_WRAPPER = """
<bpmn:definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL"
    xmlns:bpmndi="http://www.omg.org/spec/BPMN/20100524/DI"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xmlns:dc="http://www.omg.org/spec/DD/20100524/DC"
    xmlns:di="http://www.omg.org/spec/DD/20100524/DI"
    id="Definitions_96f6665"
    targetNamespace="http://bpmn.io/schema/bpmn"
    exporter="Camunda Modeler"
    exporterVersion="3.0.0-dev"
>
{}
</bpmn:definitions>
"""


@pytest.fixture()
def now_in_utc() -> Generator[datetime, None, None]:
    yield datetime.now(timezone.utc)


@pytest.fixture()
def example_start_datetime_in_utc_str() -> Generator[str, None, None]:
    yield "2019-10-01T12:00:00+00:00"


@pytest.fixture()
def example_start_datetime_minus_5_mins_in_utc(
    example_start_datetime_in_utc_str: str,
) -> Generator[datetime, None, None]:
    example_datetime = datetime.fromisoformat(example_start_datetime_in_utc_str)
    yield example_datetime - timedelta(minutes=5)


class CustomBpmnDmnParser(BpmnDmnParser):  # type: ignore
    OVERRIDE_PARSER_CLASSES = {}
    OVERRIDE_PARSER_CLASSES.update(BpmnDmnParser.OVERRIDE_PARSER_CLASSES)
    OVERRIDE_PARSER_CLASSES.update(SpiffBpmnParser.OVERRIDE_PARSER_CLASSES)

    StartEvent.register_parser_class(OVERRIDE_PARSER_CLASSES)


def workflow_from_str(bpmn_str: str, process_id: str) -> BpmnWorkflow:
    parser = CustomBpmnDmnParser()
    parser.add_bpmn_str(bpmn_str)
    top_level = parser.get_spec(process_id)
    subprocesses = parser.get_subprocess_specs(process_id)
    return BpmnWorkflow(top_level, subprocesses)


def workflow_from_fragment(bpmn_fragment: str, process_id: str) -> BpmnWorkflow:
    return workflow_from_str(BPMN_WRAPPER.format(bpmn_fragment), process_id)


class TestWorkflowService(BaseTest):
    """TestWorkflowService."""

    def test_run_at_delay_is_0_for_regular_start_events(self, now_in_utc: datetime) -> None:
        workflow = workflow_from_fragment(
            """
            <bpmn:process id="no_tasks" name="No Tasks" isExecutable="true">
                <bpmn:startEvent id="StartEvent_1">
                    <bpmn:outgoing>Flow_184umot</bpmn:outgoing>
                </bpmn:startEvent>
                <bpmn:endEvent id="Event_0qq9il3">
                    <bpmn:incoming>Flow_184umot</bpmn:incoming>
                </bpmn:endEvent>
                <bpmn:sequenceFlow id="Flow_184umot" sourceRef="StartEvent_1" targetRef="Event_0qq9il3" />
            </bpmn:process>
            """,
            "no_tasks",
        )
        delay = WorkflowService.calculate_run_at_delay_in_seconds(workflow, now_in_utc)
        assert delay == 0

    def test_run_at_delay_is_30_for_30_second_duration_start_timer_event(self, now_in_utc: datetime) -> None:
        workflow = workflow_from_fragment(
            """
            <bpmn:process id="Process_aldvgey" isExecutable="true">
                <bpmn:startEvent id="StartEvent_1">
                    <bpmn:outgoing>Flow_1x1o335</bpmn:outgoing>
                    <bpmn:timerEventDefinition id="TimerEventDefinition_1vi6a54">
                        <bpmn:timeDuration xsi:type="bpmn:tFormalExpression">"PT30S"</bpmn:timeDuration>
                    </bpmn:timerEventDefinition>
                </bpmn:startEvent>
                <bpmn:sequenceFlow id="Flow_1x1o335" sourceRef="StartEvent_1" targetRef="Event_0upbokh" />
                <bpmn:endEvent id="Event_0upbokh">
                    <bpmn:incoming>Flow_1x1o335</bpmn:incoming>
                </bpmn:endEvent>
            </bpmn:process>
            """,
            "Process_aldvgey",
        )
        delay = WorkflowService.calculate_run_at_delay_in_seconds(workflow, now_in_utc)
        assert delay == 30

    def test_run_at_delay_is_300_if_5_mins_before_date_start_timer_event(
        self, example_start_datetime_in_utc_str: str, example_start_datetime_minus_5_mins_in_utc: datetime
    ) -> None:
        workflow = workflow_from_fragment(
            f"""
            <bpmn:process id="Process_aldvgey" isExecutable="true">
                <bpmn:startEvent id="StartEvent_1">
                    <bpmn:outgoing>Flow_1x1o335</bpmn:outgoing>
                    <bpmn:timerEventDefinition id="TimerEventDefinition_1vi6a54">
                        <bpmn:timeDate xsi:type="bpmn:tFormalExpression">"{example_start_datetime_in_utc_str}"</bpmn:timeDate>
                    </bpmn:timerEventDefinition>
                </bpmn:startEvent>
                <bpmn:sequenceFlow id="Flow_1x1o335" sourceRef="StartEvent_1" targetRef="Event_0upbokh" />
                <bpmn:endEvent id="Event_0upbokh">
                    <bpmn:incoming>Flow_1x1o335</bpmn:incoming>
                </bpmn:endEvent>
            </bpmn:process>
            """,
            "Process_aldvgey",
        )
        delay = WorkflowService.calculate_run_at_delay_in_seconds(workflow, example_start_datetime_minus_5_mins_in_utc)
        assert delay == 300
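A quick check of the arithmetic behind the 300-second assertion above: the fixture clock sits five minutes before the timer date, and the `TimeDateEventDefinition` branch returns `(timer date - now).seconds`:

    from datetime import datetime, timedelta

    # Values taken from the fixtures above.
    start = datetime.fromisoformat("2019-10-01T12:00:00+00:00")
    now_in_utc = start - timedelta(minutes=5)

    assert (start - now_in_utc).seconds == 300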
@ -10,7 +10,6 @@ import HomePageRoutes from './routes/HomePageRoutes';
import About from './routes/About';
import ErrorBoundary from './components/ErrorBoundary';
import AdminRoutes from './routes/AdminRoutes';
import ProcessRoutes from './routes/ProcessRoutes';

import { AbilityContext } from './contexts/Can';
import UserService from './services/UserService';
@ -41,7 +40,6 @@ export default function App() {
          <Route path="/*" element={<HomePageRoutes />} />
          <Route path="/about" element={<About />} />
          <Route path="/tasks/*" element={<HomePageRoutes />} />
          <Route path="/process/*" element={<ProcessRoutes />} />
          <Route path="/admin/*" element={<AdminRoutes />} />
        </Routes>
      </ErrorBoundary>
@ -120,15 +120,19 @@ export default function NavigationBar() {
            <a target="_blank" href={documentationUrl} rel="noreferrer">
              Documentation
            </a>
            <hr />
            <Button
              data-qa="logout-button"
              className="button-link"
              onClick={handleLogout}
            >
              <Logout />
              Sign out
            </Button>
            {!UserService.authenticationDisabled() ? (
              <>
                <hr />
                <Button
                  data-qa="logout-button"
                  className="button-link"
                  onClick={handleLogout}
                >
                  <Logout />
                  Sign out
                </Button>
              </>
            ) : null}
          </ToggletipContent>
        </Toggletip>
      </div>
@ -782,7 +782,7 @@ export default function ProcessInstanceListTable({
      undefined,
      paginationQueryParamPrefix
    );
    page = 1; // Reset page back to 0
    page = 1;

    const newReportMetadata = getNewReportMetadataBasedOnPageWidgets();
    setReportMetadata(newReportMetadata);
@ -1590,9 +1590,7 @@ export default function ProcessInstanceListTable({
    });
    if (showActionsColumn) {
      let buttonElement = null;
      const interstitialUrl = `/process/${modifyProcessIdentifierForPathParam(
        processInstance.process_model_identifier
      )}/${processInstance.id}/interstitial`;
      const taskShowUrl = `/tasks/${processInstance.id}/${processInstance.task_id}`;
      const regex = new RegExp(`\\b(${preferredUsername}|${userEmail})\\b`);
      let hasAccessToCompleteTask = false;
      if (
@ -1601,21 +1599,19 @@ export default function ProcessInstanceListTable({
      ) {
        hasAccessToCompleteTask = true;
      }
      let buttonText = 'View';
      buttonElement = null;
      if (hasAccessToCompleteTask && processInstance.task_id) {
        buttonText = 'Go';
        buttonElement = (
          <Button
            kind="secondary"
            href={taskShowUrl}
            style={{ width: '60px' }}
          >
            Go
          </Button>
        );
      }

      buttonElement = (
        <Button
          kind="secondary"
          href={interstitialUrl}
          style={{ width: '60px' }}
        >
          {buttonText}
        </Button>
      );

      if (
        processInstance.status === 'not_started' ||
        processInstance.status === 'user_input_required' ||
@ -96,7 +96,7 @@ export default function ProcessInstanceRun({
  const onProcessInstanceRun = (processInstance: any) => {
    const processInstanceId = (processInstance as any).id;
    navigate(
      `/process/${modifyProcessIdentifierForPathParam(
      `/admin/process-instances/${modifyProcessIdentifierForPathParam(
        processModel.id
      )}/${processInstanceId}/interstitial`
    );
@ -1,80 +1,103 @@
import React, { useCallback, useEffect, useMemo, useState } from 'react';
import { useNavigate, useParams } from 'react-router-dom';
import { useNavigate } from 'react-router-dom';
import { fetchEventSource } from '@microsoft/fetch-event-source';
// @ts-ignore
import { Loading, Button } from '@carbon/react';
import { Loading } from '@carbon/react';
import { BACKEND_BASE_URL } from '../config';
import { getBasicHeaders } from '../services/HttpService';

// @ts-ignore
import InstructionsForEndUser from '../components/InstructionsForEndUser';
import ProcessBreadcrumb from '../components/ProcessBreadcrumb';
import InstructionsForEndUser from './InstructionsForEndUser';
import { ProcessInstance, ProcessInstanceTask } from '../interfaces';
import useAPIError from '../hooks/UseApiError';

export default function ProcessInterstitial() {
type OwnProps = {
  processInstanceId: number;
  processInstanceShowPageUrl: string;
  allowRedirect: boolean;
};

export default function ProcessInterstitial({
  processInstanceId,
  allowRedirect,
  processInstanceShowPageUrl,
}: OwnProps) {
  const [data, setData] = useState<any[]>([]);
  const [lastTask, setLastTask] = useState<any>(null);
  const [state, setState] = useState<string>('RUNNING');
  const [processInstance, setProcessInstance] =
    useState<ProcessInstance | null>(null);
  const [state, setState] = useState<string>('RUNNING');
  const params = useParams();

  const navigate = useNavigate();
  const userTasks = useMemo(() => {
    return ['User Task', 'Manual Task'];
  }, []);
  const { addError } = useAPIError();

  const processInstanceShowPageBaseUrl = `/admin/process-instances/for-me/${params.modified_process_model_identifier}`;

  useEffect(() => {
    fetchEventSource(
      `${BACKEND_BASE_URL}/tasks/${params.process_instance_id}`,
      {
        headers: getBasicHeaders(),
        onmessage(ev) {
          const retValue = JSON.parse(ev.data);
          if (retValue.type === 'error') {
            addError(retValue.error);
          } else if (retValue.type === 'task') {
            setData((prevData) => [retValue.task, ...prevData]);
            setLastTask(retValue.task);
          } else if (retValue.type === 'unrunnable_instance') {
            setProcessInstance(retValue.unrunnable_instance);
          }
        },
        onclose() {
          setState('CLOSED');
        },
      }
    );
    fetchEventSource(`${BACKEND_BASE_URL}/tasks/${processInstanceId}`, {
      headers: getBasicHeaders(),
      onmessage(ev) {
        const retValue = JSON.parse(ev.data);
        if (retValue.type === 'error') {
          addError(retValue.error);
        } else if (retValue.type === 'task') {
          setData((prevData) => [retValue.task, ...prevData]);
          setLastTask(retValue.task);
        } else if (retValue.type === 'unrunnable_instance') {
          setProcessInstance(retValue.unrunnable_instance);
        }
      },
      onclose() {
        console.log('The state is closed.');
        setState('CLOSED');
      },
    });
    // eslint-disable-next-line react-hooks/exhaustive-deps
  }, []); // it is critical to only run this once.

  const shouldRedirect = useCallback(
  const shouldRedirectToTask = useCallback(
    (myTask: ProcessInstanceTask): boolean => {
      return (
        allowRedirect &&
        !processInstance &&
        myTask &&
        myTask.can_complete &&
        userTasks.includes(myTask.type)
      );
    },
    [userTasks, processInstance]
    [allowRedirect, processInstance, userTasks]
  );

  const shouldRedirectToProcessInstance = useCallback((): boolean => {
    return allowRedirect && state === 'CLOSED';
  }, [allowRedirect, state]);

  useEffect(() => {
    // Added this separate use effect so that the timer interval will be cleared if
    // we end up redirecting back to the TaskShow page.
    if (shouldRedirect(lastTask)) {
    if (shouldRedirectToTask(lastTask)) {
      lastTask.properties.instructionsForEndUser = '';
      const timerId = setInterval(() => {
        navigate(`/tasks/${lastTask.process_instance_id}/${lastTask.id}`);
      }, 2000);
      return () => clearInterval(timerId);
    }
    if (shouldRedirectToProcessInstance()) {
      // Navigate without pause as we will be showing the same information.
      navigate(processInstanceShowPageUrl);
    }
    return undefined;
  }, [lastTask, navigate, userTasks, shouldRedirect]);
  }, [
    lastTask,
    navigate,
    userTasks,
    shouldRedirectToTask,
    processInstanceId,
    processInstanceShowPageUrl,
    state,
    shouldRedirectToProcessInstance,
  ]);

  const getStatus = (): string => {
    if (processInstance) {
@ -95,35 +118,13 @@ export default function ProcessInterstitial() {
        <Loading
          description="Active loading indicator"
          withOverlay={false}
          style={{ margin: 'auto' }}
          style={{ margin: '50px 0 50px 50px' }}
        />
      );
    }
    return null;
  };

  const getReturnHomeButton = (index: number) => {
    if (
      index === 0 &&
      !shouldRedirect(lastTask) &&
      ['WAITING', 'ERROR', 'LOCKED', 'COMPLETED', 'READY'].includes(getStatus())
    ) {
      return (
        <div style={{ padding: '10px 0 0 0' }}>
          <Button
            kind="secondary"
            data-qa="return-to-home-button"
            onClick={() => navigate(`/tasks`)}
            style={{ marginBottom: 30 }}
          >
            Return to Home
          </Button>
        </div>
      );
    }
    return '';
  };

  const userMessage = (myTask: ProcessInstanceTask) => {
    if (!processInstance || processInstance.status === 'completed') {
      if (!myTask.can_complete && userTasks.includes(myTask.type)) {
@ -134,9 +135,12 @@ export default function ProcessInterstitial() {
        </p>
      );
    }
    if (shouldRedirect(myTask)) {
    if (shouldRedirectToTask(myTask)) {
      return <div>Redirecting you to the next task now ...</div>;
    }
    if (myTask && myTask.can_complete && userTasks.includes(myTask.type)) {
      return `The task ${myTask.title} is ready for you to complete.`;
    }
    if (myTask.error_message) {
      return <div>{myTask.error_message}</div>;
    }
@ -161,40 +165,24 @@ export default function ProcessInterstitial() {
      navigate(`/tasks`);
    }

  let displayableData = data;
  if (state === 'CLOSED') {
    displayableData = [data[0]];
  }

  if (lastTask) {
    return (
      <>
        <ProcessBreadcrumb
          hotCrumbs={[
            ['Process Groups', '/admin'],
            {
              entityToExplode: lastTask.process_model_identifier,
              entityType: 'process-model-id',
              linkLastItem: true,
            },
            [
              `Process Instance: ${params.process_instance_id}`,
              `${processInstanceShowPageBaseUrl}/${params.process_instance_id}`,
            ],
          ]}
        />
        {getLoadingIcon()}
        <div style={{ maxWidth: 800, margin: 'auto', padding: 50 }}>
          {data.map((d, index) => (
            <>
              <div
                className={
                  index < 4
                    ? `user_instructions_${index}`
                    : `user_instructions_4`
                }
              >
                {userMessage(d)}
              </div>
              {getReturnHomeButton(index)}
            </>
          ))}
        </div>
        {displayableData.map((d, index) => (
          <div
            className={
              index < 4 ? `user_instructions_${index}` : `user_instructions_4`
            }
          >
            {userMessage(d)}
          </div>
        ))}
      </>
    );
  }
@ -23,6 +23,7 @@ export interface RecentProcessModel {

export interface TaskPropertiesJson {
  parent: string;
  last_state_change: number;
}

export interface TaskDefinitionPropertiesJson {
@ -22,6 +22,7 @@ import Configuration from './Configuration';
import JsonSchemaFormBuilder from './JsonSchemaFormBuilder';
import ProcessModelNewExperimental from './ProcessModelNewExperimental';
import ProcessInstanceFindById from './ProcessInstanceFindById';
import ProcessInterstitialPage from './ProcessInterstitialPage';

export default function AdminRoutes() {
  const location = useLocation();
@ -75,6 +76,14 @@ export default function AdminRoutes() {
        path="process-instances/for-me/:process_model_id/:process_instance_id/:to_task_guid"
        element={<ProcessInstanceShow variant="for-me" />}
      />
      <Route
        path="process-instances/for-me/:process_model_id/:process_instance_id/interstitial"
        element={<ProcessInterstitialPage variant="for-me" />}
      />
      <Route
        path="process-instances/:process_model_id/:process_instance_id/interstitial"
        element={<ProcessInterstitialPage variant="all" />}
      />
      <Route
        path="process-instances/:process_model_id/:process_instance_id"
        element={<ProcessInstanceShow variant="all" />}
@ -53,6 +53,7 @@ import { usePermissionFetcher } from '../hooks/PermissionService';
import ProcessInstanceClass from '../classes/ProcessInstanceClass';
import TaskListTable from '../components/TaskListTable';
import useAPIError from '../hooks/UseApiError';
import ProcessInterstitial from '../components/ProcessInterstitial';

type OwnProps = {
  variant: string;
@ -1109,6 +1110,11 @@ export default function ProcessInstanceShow({ variant }: OwnProps) {
          </h1>
          {buttonIcons()}
        </Stack>
        <ProcessInterstitial
          processInstanceId={processInstance.id}
          processInstanceShowPageUrl={processInstanceShowPageBaseUrl}
          allowRedirect={false}
        />
        <br />
        <br />
        <Grid condensed fullWidth>
@ -0,0 +1,41 @@
import React from 'react';
import { useParams } from 'react-router-dom';
// @ts-ignore
import ProcessInterstitial from '../components/ProcessInterstitial';
import ProcessBreadcrumb from '../components/ProcessBreadcrumb';

type OwnProps = {
  variant: string;
};

export default function ProcessInterstitialPage({ variant }: OwnProps) {
  const params = useParams();
  let processInstanceShowPageUrl = `/admin/process-instances/for-me/${params.process_model_id}/${params.process_instance_id}`;
  if (variant === 'all') {
    processInstanceShowPageUrl = `/admin/process-instances/${params.process_model_id}/${params.process_instance_id}`;
  }

  return (
    <>
      <ProcessBreadcrumb
        hotCrumbs={[
          ['Process Groups', '/admin'],
          {
            entityToExplode: String(params.process_model_id),
            entityType: 'process-model-id',
            linkLastItem: true,
          },
          [
            `Process Instance: ${params.process_instance_id}`,
            `${processInstanceShowPageUrl}`,
          ],
        ]}
      />
      <ProcessInterstitial
        processInstanceId={Number(params.process_instance_id)}
        processInstanceShowPageUrl={processInstanceShowPageUrl}
        allowRedirect
      />
    </>
  );
}
@ -1,14 +0,0 @@
import { Route, Routes } from 'react-router-dom';
// @ts-ignore
import ProcessInterstitial from './ProcessInterstitial';

export default function ProcessRoutes() {
  return (
    <Routes>
      <Route
        path=":modified_process_model_identifier/:process_instance_id/interstitial"
        element={<ProcessInterstitial />}
      />
    </Routes>
  );
}
@ -102,7 +102,7 @@ export default function TaskShow() {

  const navigateToInterstitial = (myTask: Task) => {
    navigate(
      `/process/${modifyProcessIdentifierForPathParam(
      `/admin/process-instances/${modifyProcessIdentifierForPathParam(
        myTask.process_model_identifier
      )}/${myTask.process_instance_id}/interstitial`
    );
@ -265,37 +265,104 @@ export default function TaskShow() {
    return null;
  };

  const formatDateString = (dateString?: string) => {
    let dateObject = new Date();
    if (dateString) {
      dateObject = new Date(dateString);
    }
    return dateObject.toISOString().split('T')[0];
  };

  const checkFieldComparisons = (
    formData: any,
    propertyKey: string,
    propertyMetadata: any,
    formattedDateString: string,
    errors: any
  ) => {
    const fieldIdentifierToCompareWith = propertyMetadata.minimumDate.replace(
      /^field:/,
      ''
    );
    if (fieldIdentifierToCompareWith in formData) {
      const dateToCompareWith = formData[fieldIdentifierToCompareWith];
      if (dateToCompareWith) {
        const dateStringToCompareWith = formatDateString(dateToCompareWith);
        if (dateStringToCompareWith > formattedDateString) {
          errors[propertyKey].addError(
            `must be equal to or greater than '${fieldIdentifierToCompareWith}'`
          );
        }
      } else {
        errors[propertyKey].addError(
          `was supposed to be compared against '${fieldIdentifierToCompareWith}' but that field did not have a value`
        );
      }
    } else {
      errors[propertyKey].addError(
        `was supposed to be compared against '${fieldIdentifierToCompareWith}' but it either doesn't have a value or does not exist`
      );
    }
  };

  const checkMinimumDate = (
    formData: any,
    propertyKey: string,
    propertyMetadata: any,
    errors: any
  ) => {
    const dateString = formData[propertyKey];
    if (dateString) {
      const formattedDateString = formatDateString(dateString);
      if (propertyMetadata.minimumDate === 'today') {
        const dateTodayString = formatDateString();
        if (dateTodayString > formattedDateString) {
          errors[propertyKey].addError('must be today or after');
        }
      } else if (propertyMetadata.minimumDate.startsWith('field:')) {
        checkFieldComparisons(
          formData,
          propertyKey,
          propertyMetadata,
          formattedDateString,
          errors
        );
      }
    }
  };

  const getFieldsWithDateValidations = (
    jsonSchema: any,
    formData: any,
    errors: any
    // eslint-disable-next-line sonarjs/cognitive-complexity
  ) => {
    if ('properties' in jsonSchema) {
      Object.keys(jsonSchema.properties).forEach((propertyKey: string) => {
        const propertyMetadata = jsonSchema.properties[propertyKey];
        if (
          typeof propertyMetadata === 'object' &&
          'minimumDate' in propertyMetadata &&
          propertyMetadata.minimumDate === 'today'
        ) {
          const dateToday = new Date();
          const dateValue = formData[propertyKey];
          if (dateValue) {
            const dateValueObject = new Date(dateValue);
            const dateValueString = dateValueObject.toISOString().split('T')[0];
            const dateTodayString = dateToday.toISOString().split('T')[0];
            if (dateTodayString > dateValueString) {
              errors[propertyKey].addError('must be today or after');
            }
          }
    // if the jsonSchema has an items attribute then assume the element itself
    // doesn't have a custom validation but its children could so use that
    const jsonSchemaToUse =
      'items' in jsonSchema ? jsonSchema.items : jsonSchema;

    if ('properties' in jsonSchemaToUse) {
      Object.keys(jsonSchemaToUse.properties).forEach((propertyKey: string) => {
        const propertyMetadata = jsonSchemaToUse.properties[propertyKey];
        if ('minimumDate' in propertyMetadata) {
          checkMinimumDate(formData, propertyKey, propertyMetadata, errors);
        }

        // recurse through all nested properties as well
        getFieldsWithDateValidations(
          propertyMetadata,
          formData[propertyKey],
          errors[propertyKey]
        );
        let formDataToSend = formData[propertyKey];
        if (formDataToSend) {
          if (formDataToSend.constructor.name !== 'Array') {
            formDataToSend = [formDataToSend];
          }
          formDataToSend.forEach((item: any, index: number) => {
            let errorsToSend = errors[propertyKey];
            if (index in errorsToSend) {
              errorsToSend = errorsToSend[index];
            }
            getFieldsWithDateValidations(propertyMetadata, item, errorsToSend);
          });
        }
      });
    }
    return errors;
@ -62,6 +62,15 @@ const getUserEmail = () => {
  return null;
};

const authenticationDisabled = () => {
  const idToken = getIdToken();
  if (idToken) {
    const idObject = jwt(idToken);
    return (idObject as any).authentication_disabled;
  }
  return false;
};

const getPreferredUsername = () => {
  const idToken = getIdToken();
  if (idToken) {
@ -82,14 +91,15 @@ const hasRole = (_roles: string[]): boolean => {
};

const UserService = {
  authenticationDisabled,
  doLogin,
  doLogout,
  isLoggedIn,
  getAccessToken,
  loginIfNeeded,
  getPreferredUsername,
  getUserEmail,
  hasRole,
  isLoggedIn,
  loginIfNeeded,
};

export default UserService;