From c4142702d403668629c418208bf505d86f8ac24e Mon Sep 17 00:00:00 2001
From: burnettk
Date: Mon, 27 Feb 2023 14:59:37 -0500
Subject: [PATCH] Squashed 'spiffworkflow-backend/' changes from 6cae736ac..0bfe4400f

0bfe4400f Merge branch 'main' of github.com:sartography/spiff-arena into main
fbe97dd90 Point to the latest spiffworkflow
805400312 fixed conflict with db migrations w/ burnettk
6e48c8205 Merge remote-tracking branch 'origin/main' into feature/script_get_last_user_completing_task
3a6cc9c2b added script to get process initiator w/ burnettk
37770dbca run_pyl
8a068cdb2 Merging main
bf25c1cef run_pyl
4ca6e3f27 Needed an additional check for empty correlation keys, which on a RECEIVE message should always match anything.
0fb88c50d remove unwanted test files w/ burnettk
89d92687d script to get last user completing a task is working w/ burnettk
98f056e5c Merge remote-tracking branch 'origin/main' into feature/script_get_last_user_completing_task
46af55a60 poetry remove orjson
28a0db097 wip for get_last_user_completing_task script task
2596cfeb1 postgres really will just order however it wants if you do not specify an order_by clause
5e339b2fb add ppg.ba4.sme and ba5
873cfdfc1 fix postgres db name and comment out debug job
f54d6ae4d removed some unused code from task and fixed the logs table a bit w/ burnettk
7c74e216a run_pyl
f53c85924 skip failing test if postgres and added comment about cause w/ burnettk
fbe2237f1 SpiffWorkflow: 1) Type-safe checking on correlation properties (no more str()). 2) A running workflow's correlations are once again at the key level.
2677736c2 look users up by service and username instead of service_id since usernames have to be unique anyway w/ burnettk
22d1f8bbb avoid using task-data endpoint for task data and only use it to get tasks based on spiff step instead
698fb5d5c put back the task data code when getting tasks
b52f97453 Merge remote-tracking branch 'origin/main' into feature/task_data_api_refactor
9c4a17c93 removed commented out code w/ burnettk
3357fbef4 removed task-data endpoints since we no longer need them w/ burnettk
3b64afb65 turn off profiling again
fcbcfd0ae add two users and update one
2ee7cba09 BPMN Parser was returning all retrieval expressions, rather than the ones specific to a correlation property, as was intended. Adding a correlation cache so we have a reference of all the messages and properties (though still lacking a description of keys). Adding yet another migration; maybe we should squash them.
6658e26bb added api to get task data and do not return from task data list anymore w/ burnettk
1fd3261d7 run_pyl (part 2)
f6c63eb1c Merge branch 'main' of github.com:sartography/spiff-arena
5c5262a31 added comment about refactoring getting task data w/ burnettk jbirddog
6d4aa9043 lint
e4c0ed7e1 add test users
adfb0644f Adding Migration.
ec36290f2 remove task size check since it can take a long time to run and we do not do anything with it w/ burnettk jbirddog
9b0b95f95 Merge remote-tracking branch 'origin/main' into feature/message_fixes
f3d124a70 run_pyl
be6ac8743 BPMN.io: just show the message names, not the ids, to assure we are only exposing the names.
    SpiffWorkflow:
    - start_messages function should return message names, not ids.
    - don't catch externally thrown messages within the same workflow process
    - add an expected value to the Correlation Property Model so we can use this well-defined class as an external communication tool (rather than building an arbitrary dictionary)
    - Added a "get_awaiting_correlations" to an event, so we can get a list of the correlation properties related to the workflow's currently defined correlation values.
    - workflows.waiting_events() function now returns the above awaiting correlations as the value on returned message events
    Backend:
    - Dropping MessageModel and MessageCorrelationProperties, at least for now. We don't need them to send/receive messages, though we may eventually want to track the messages and correlations defined across the system. These things (which are ever changing) should not be directly connected to the Messages, which may be in flux, and the cross relationships between the tables could cause unexpected and unnecessary errors. Commented out the caching logic so we can turn this back on later.
    - Slight improvement to API Errors
    - MessageInstances are no longer in a many-to-many relationship with Correlations: each message instance has a unique set of message correlations specific to the instance.
    - Message Instances have users, and can be linked through a "counterpart_id" so you can see which send is connected to which receive.
    - Message Correlations are connected to receiving message instances. They are tied neither to a process instance nor to a message model. They now include the expected value and retrieval expression required to validate an incoming message.
    - A process instance is not connected to message correlations.
    - Message Instances are not always tied to a process instance (for example, a Send Message from an API).
    - API calls to create a message use the same logic as all other message-catching code.
    - Make use of the new waiting_events() method to check for any new receive messages in the workflow (much easier than churning through all of the tasks; see the sketches after the --- separator below).
    - One giant mother of a migration.
ce449971a do not call serialize if we can use the cached bpmn_json instead w/ burnettk
15d720f94 Merge branch 'main' of github.com:sartography/spiff-arena
8d8347068 turn on sentry detailed tracing for task-data w/ burnettk
cdf5f4053 update spiff
5c1ea3c93 run_pyl
384c272af * SpiffWorkflow event_definitions wanted to return a message event's correlation properties nested within correlation keys. But messages are directly related to properties, not to keys, and it forced a number of conversions that made for tricky code. So Messages now contain a dictionary of correlation properties only.
    * SpiffWorkflow did not serialize correlations, so they were lost between save and retrieve.
f45103406 Allow people to run commands like "flask db upgrade" without setting specific environment variables like FLASK_SESSION_SECRET_KEY every time - they just need to add in their own /instance/config.py with their local configuration.
b169c3a87 * Re-work message tests so I could wrap my simple head around what was happening - just needed an example that made sense to me.
    * Clear out complex get_message_instance_receive now that many-to-many works.
    * Create decent error messages when correlations fail
    * Move correlation checks into the MessageInstance class
    * The APIError could bomb out ugly if it hit a workflow exception with no Task Spec.
a39590912 failing test.
6db600caa Merge branch 'main' into feature/message_fixes
4942a728b work in progress -
    * Link between message instance and correlations is now a link table and many-to-many relationships as recommended by SQLAlchemy
    * Use the correlation keys, not the process id, when accepting api messages.

git-subtree-dir: spiffworkflow-backend
git-subtree-split: 0bfe4400f4191214b8972977438ceb35a9f5b3c3
---
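A minimal sketch of the receive-matching behavior the message_fixes commits describe: each receiving message instance now owns its correlation rules (a name, a retrieval expression, and an expected value), an incoming send matches only if every rule agrees, and a receive with no correlation rules matches anything (commit 4ca6e3f27). The rule shape and the payload['...']-style expression convention below are illustrative assumptions, not the backend's actual classes:

from dataclasses import dataclass
from typing import Any


@dataclass
class CorrelationRule:
    """One correlation property a receiving message instance waits on."""

    name: str                  # correlation property name, e.g. "invoice_id"
    retrieval_expression: str  # pulls the value out of an incoming payload
    expected_value: Any        # the value this receive requires


def receive_matches_send(rules: list[CorrelationRule], payload: dict) -> bool:
    """Return True if an incoming send payload satisfies every rule."""
    if not rules:
        # An empty correlation key set on a RECEIVE matches anything.
        return True
    for rule in rules:
        try:
            actual = eval(rule.retrieval_expression, {}, {"payload": payload})
        except Exception:
            return False  # expression did not apply to this payload: no match
        if actual != rule.expected_value:
            return False
    return True


# e.g. receive_matches_send(
#     [CorrelationRule("invoice_id", "payload['invoice_id']", 42)],
#     {"invoice_id": 42},
# ) -> True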
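Migration 9f0b1662a8af below reshapes the message_instance table: it gains name, correlation_keys, user_id, and counterpart_id columns, loses message_model_id, and process_instance_id becomes nullable. Here is a standalone sketch of the resulting model, with column names and types taken from that migration; the class itself is illustrative, not the project's models/message_instance.py, which carries additional columns:

from sqlalchemy import JSON, Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class MessageInstanceModel(Base):
    """Sketch of message_instance after migration 9f0b1662a8af."""

    __tablename__ = "message_instance"

    id = Column(Integer, primary_key=True)
    name = Column(String(255))        # the message *name*; message_model_id is gone
    correlation_keys = Column(JSON)
    user_id = Column(Integer, ForeignKey("user.id"), nullable=False)
    counterpart_id = Column(Integer)  # pairs a send with the receive it connected to
    # now nullable: a send initiated from the API has no owning process instance
    process_instance_id = Column(Integer, nullable=True)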
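The commit message also notes that the backend now asks SpiffWorkflow's waiting_events() for pending receive messages instead of churning through all of the tasks, and that each returned message event carries its awaiting correlations as its value. A hypothetical sketch of that check; the exact event shape (a dict with event_type, name, and value keys) is an assumption for illustration:

def awaiting_message_correlations(bpmn_process_instance) -> dict[str, dict]:
    """Map each waiting message event's name to the correlations it awaits."""
    awaiting = {}
    for event in bpmn_process_instance.waiting_events():
        if event["event_type"] == "MessageEventDefinition":
            # event["value"] carries the correlation properties the workflow
            # is currently waiting on, per the commit message above.
            awaiting[event["name"]] = event["value"]
    return awaiting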
 bin/recreate_db                                     |   8 +-
 conftest.py                                         |  11 +-
 keycloak/bin/add_test_users_to_keycloak             |   8 +-
 .../realm_exports/spiffworkflow-realm.json          | 680 ++++++++++++++++--
 keycloak/test_user_lists/status                     |  28 +-
 migrations/versions/9f0b1662a8af_.py                | 153 ++++
 migrations/versions/d6e5b3af0908_.py                |  28 +
 poetry.lock                                         |  56 +-
 pyproject.toml                                      |   3 +-
 src/spiffworkflow_backend/__init__.py               |  16 +-
 src/spiffworkflow_backend/api.yml                   |  34 +-
 src/spiffworkflow_backend/config/__init__.py        |   8 +
 src/spiffworkflow_backend/config/default.py         |   2 +
 .../config/permissions/example.yml                  |   1 +
 .../exceptions/api_error.py                         |   4 +-
 .../load_database_models.py                         |   4 -
 .../models/correlation_property_cache.py            |  24 +
 .../models/human_task.py                            |   9 +-
 .../models/message_correlation.py                   |  50 --
 .../message_correlation_message_instance.py         |  32 -
 .../models/message_correlation_property.py          |  25 -
 .../models/message_instance.py                      | 101 ++-
 .../models/message_instance_correlation.py          |  41 ++
 .../models/message_model.py                         |  13 -
 .../message_triggerable_process_model.py            |   8 +-
 .../models/process_instance.py                      |   5 +-
 src/spiffworkflow_backend/models/task.py            |  90 +--
 src/spiffworkflow_backend/models/user.py            |   8 +
 .../routes/messages_controller.py                   | 146 +---
 .../routes/process_instances_controller.py          |  37 +-
 .../routes/tasks_controller.py                      |  48 +-
 .../scripts/get_last_user_completing_task.py        |  48 ++
 .../scripts/get_process_initiator_user.py           |  36 +
 .../services/authorization_service.py               |  12 +-
 .../services/background_processing_service.py       |   2 +-
 .../services/error_handling_service.py              |  27 +-
 .../services/logging_service.py                     |  11 +-
 .../services/message_service.py                     | 304 ++++----
 .../services/process_instance_processor.py          | 245 ++-----
 .../services/process_instance_service.py            |  22 +-
 .../services/spec_file_service.py                   |  93 +--
 .../dynamic_enums_ask_for_color.bpmn                |   4 +-
 tests/data/error/instructions_error.bpmn            |   4 +-
 tests/data/get_localtime/get_localtime.bpmn         |   4 +-
 tests/data/manual_task/manual_task.bpmn             |   4 +-
 tests/data/message/message_send_receive.bpmn        | 153 ++++
 .../message_receiver.bpmn                           |  60 +-
 .../message_sender.bpmn                             | 119 +--
 .../message_receiver_two.bpmn                       |  12 +-
 .../message_sender.bpmn                             |  16 +-
 tests/data/model_with_lanes/lanes.bpmn              |  21 +-
 .../lanes_with_owner_dict.bpmn                      |   4 +-
 tests/data/simple_form/simple_form.bpmn             |   6 +-
 .../simple_form_with_error.bpmn                     |   4 +-
 tests/data/simple_script/simple_script.bpmn         |   4 +-
 .../integration/test_process_api.py                 | 122 ++--
 .../test_get_last_user_completing_task.py           |  69 ++
 .../test_get_process_initiator_user.py              |  62 ++
 .../unit/test_message_instance.py                   |  44 +-
 .../unit/test_message_service.py                    | 382 ++++++----
 60 files changed, 2253 insertions(+), 1322 deletions(-)
 create mode 100644 migrations/versions/9f0b1662a8af_.py
 create mode 100644 migrations/versions/d6e5b3af0908_.py
 create mode 100644 src/spiffworkflow_backend/models/correlation_property_cache.py
 delete mode 100644 src/spiffworkflow_backend/models/message_correlation.py
 delete mode 100644 src/spiffworkflow_backend/models/message_correlation_message_instance.py
 delete mode 100644 src/spiffworkflow_backend/models/message_correlation_property.py
 create mode 100644 src/spiffworkflow_backend/models/message_instance_correlation.py
 delete mode 100644 src/spiffworkflow_backend/models/message_model.py
 create mode 100644 src/spiffworkflow_backend/scripts/get_last_user_completing_task.py
 create mode 100644 src/spiffworkflow_backend/scripts/get_process_initiator_user.py
 create mode 100644 tests/data/message/message_send_receive.bpmn
 create mode 100644 tests/spiffworkflow_backend/scripts/test_get_last_user_completing_task.py
 create mode 100644 tests/spiffworkflow_backend/scripts/test_get_process_initiator_user.py

diff --git a/bin/recreate_db b/bin/recreate_db
index 8a78a9b8c..dd0bf2856 100755
--- a/bin/recreate_db
+++ b/bin/recreate_db
@@ -41,13 +41,13 @@ if [[ "${1:-}" == "clean" ]]; then
   # TODO: check to see if the db already exists and we can connect to it. also actually clean it up.
   # start postgres in background with one db
   if [[ "${SPIFFWORKFLOW_BACKEND_DATABASE_TYPE:-}" == "postgres" ]]; then
-    if ! docker exec -it postgres-spiff psql -U spiffworkflow_backend spiffworkflow_backend_testing -c "select 1"; then
-      docker run --name postgres-spiff -p 5432:5432 -e POSTGRES_PASSWORD=spiffworkflow_backend -e POSTGRES_USER=spiffworkflow_backend -e POSTGRES_DB=spiffworkflow_backend_testing -d postgres
+    if ! docker exec -it postgres-spiff psql -U spiffworkflow_backend spiffworkflow_backend_unit_testing -c "select 1"; then
+      docker run --name postgres-spiff -p 5432:5432 -e POSTGRES_PASSWORD=spiffworkflow_backend -e POSTGRES_USER=spiffworkflow_backend -e POSTGRES_DB=spiffworkflow_backend_unit_testing -d postgres
       sleep 4 # classy
     fi
     if ! docker exec -it postgres-spiff psql -U spiffworkflow_backend spiffworkflow_backend_local_development -c "select 1"; then
-      # create other db. spiffworkflow_backend_testing came with the docker run.
-      docker exec -it postgres-spiff psql -U spiffworkflow_backend spiffworkflow_backend_testing -c "create database spiffworkflow_backend_local_development;"
+      # create other db. spiffworkflow_backend_unit_testing came with the docker run.
+ docker exec -it postgres-spiff psql -U spiffworkflow_backend spiffworkflow_backend_unit_testing -c "create database spiffworkflow_backend_local_development;" fi fi elif [[ "${1:-}" == "migrate" ]]; then diff --git a/conftest.py b/conftest.py index 01fd9e732..9c6c242e4 100644 --- a/conftest.py +++ b/conftest.py @@ -8,8 +8,6 @@ from flask.testing import FlaskClient from tests.spiffworkflow_backend.helpers.base_test import BaseTest from spiffworkflow_backend.models.db import db -from spiffworkflow_backend.models.db import SpiffworkflowBaseDBModel -from spiffworkflow_backend.models.human_task_user import HumanTaskUserModel from spiffworkflow_backend.models.process_instance import ProcessInstanceModel from spiffworkflow_backend.models.user import UserModel from spiffworkflow_backend.services.process_instance_processor import ( @@ -46,11 +44,10 @@ def app() -> Flask: @pytest.fixture() def with_db_and_bpmn_file_cleanup() -> None: - """Process_group_resource.""" - db.session.query(HumanTaskUserModel).delete() - - for model in SpiffworkflowBaseDBModel._all_subclasses(): - db.session.query(model).delete() + """Do it cleanly!""" + meta = db.metadata + for table in reversed(meta.sorted_tables): + db.session.execute(table.delete()) db.session.commit() try: diff --git a/keycloak/bin/add_test_users_to_keycloak b/keycloak/bin/add_test_users_to_keycloak index 905823c32..08dd5177b 100755 --- a/keycloak/bin/add_test_users_to_keycloak +++ b/keycloak/bin/add_test_users_to_keycloak @@ -23,10 +23,14 @@ fi if [[ -z "${KEYCLOAK_BASE_URL:-}" ]]; then KEYCLOAK_BASE_URL=http://localhost:7002 fi +if [[ -z "${ADMIN_USERNAME:-}" ]]; then + ADMIN_USERNAME="admin" +fi +if [[ -z "${ADMIN_PASSWORD:-}" ]]; then + ADMIN_PASSWORD="admin" +fi REALM_NAME="$keycloak_realm" -ADMIN_USERNAME="admin" -ADMIN_PASSWORD="admin" SECURE=false KEYCLOAK_URL=$KEYCLOAK_BASE_URL/realms/$REALM_NAME/protocol/openid-connect/token diff --git a/keycloak/realm_exports/spiffworkflow-realm.json b/keycloak/realm_exports/spiffworkflow-realm.json index 78652bcf4..e68e696e7 100644 --- a/keycloak/realm_exports/spiffworkflow-realm.json +++ b/keycloak/realm_exports/spiffworkflow-realm.json @@ -484,21 +484,21 @@ "notBefore" : 0, "groups" : [ ] }, { - "id" : "27b5bdce-1c02-4249-b8ba-521f9bcae2d3", - "createdTimestamp" : 1676302139921, - "username" : "app.program.lead", + "id" : "d959fd73-92b5-43f4-a210-9457c0b89296", + "createdTimestamp" : 1677187934315, + "username" : "app.program-lead", "enabled" : true, "totp" : false, "emailVerified" : false, - "email" : "app.program.lead@status.im", + "email" : "app.program-lead@status.im", "attributes" : { "spiffworkflow-employeeid" : [ "121" ] }, "credentials" : [ { - "id" : "8cd62c66-7357-4c8f-ae57-e45a10150f2d", + "id" : "d959fd73-92b5-43f4-a210-9457c0b89296", "type" : "password", - "createdDate" : 1676302139956, - "secretData" : "{\"value\":\"NhRRaTaL4o8TLmLgFrfIlLo1lBGRgAcoQ+ct7ypw/osYNXcF1zIC7i0AYrwrSSWQ60Wxcx6RZTFRQsZobwCbUw==\",\"salt\":\"nOhBgYVO/Me08wmfOatRdQ==\",\"additionalParameters\":{}}", + "createdDate" : 1677187934366, + "secretData" : "{\"value\":\"6njfc7gdZ1NTsmiyMXOztog8H7yKDSYgBsCFjTod0IszE0zq3WrekGKuT3GDHTHE5xVLO0SZbDQ4V5uRm0auPQ==\",\"salt\":\"eNwudU7v/gvIFX/WNtPu9w==\",\"additionalParameters\":{}}", "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" } ], "disableableCredentialTypes" : [ ], @@ -531,6 +531,167 @@ }, "notBefore" : 0, "groups" : [ ] + }, { + "id" : "7721b278-b117-45c6-9e98-d66efa6272a4", + "createdTimestamp" : 1677187934488, + 
"username" : "codex.project-lead", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "codex.project-lead@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "153" ] + }, + "credentials" : [ { + "id" : "4ed0c40f-bd6f-41a2-87c0-f35e826d196c", + "type" : "password", + "createdDate" : 1677187934523, + "secretData" : "{\"value\":\"0xkk4BBlMNVl/xL2b4KLf25PP9h8uY1d2n9kTwEJVm0oOhqnaSEpyKTGlS+oV33DhpNnBDqME922xP+j8kYNgQ==\",\"salt\":\"g20ITxwFU1PnkD4LGdEeIA==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "5e2a535e-056e-485c-b0af-c49bf0d64106", + "createdTimestamp" : 1677181799609, + "username" : "codex.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "codex.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "185" ] + }, + "credentials" : [ { + "id" : "a2cf9daf-25d2-4cd2-b932-4706442a8437", + "type" : "password", + "createdDate" : 1677181799644, + "secretData" : "{\"value\":\"UY+PfYh5h48i40Klq0KEPVc0DBUrGRxI70BFcs98MD8R7ORJ5G6rWKA3Dq/5I8btu3CJI4PbFeTS/IopMhB7vQ==\",\"salt\":\"mtx4JqI61nsCni3s26PMJg==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "84e8eebf-59ca-466d-8523-2da0aef088ed", + "createdTimestamp" : 1677181799762, + "username" : "codex1.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "codex1.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "186" ] + }, + "credentials" : [ { + "id" : "cace5730-6cd3-4d19-b0e4-10078fc5024a", + "type" : "password", + "createdDate" : 1677181799797, + "secretData" : "{\"value\":\"QwHtrufirwh38UBlalAikD+dqDo3Bnsp5350OBClcmv7QSlPQ/MqVppRfZXLaseIBbzvnuAjCxmrwtE8ERoy2g==\",\"salt\":\"0LkJgwINFOuVQGvHFp7GVA==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "ffaa3c6f-d6bc-4920-81b8-39d842f57ac5", + "createdTimestamp" : 1677181799898, + "username" : "codex2.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "codex2.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "187" ] + }, + "credentials" : [ { + "id" : "8c8b872b-86cf-40c8-84a3-f432e0bebee4", + "type" : "password", + "createdDate" : 1677181799933, + "secretData" : "{\"value\":\"IGE1BnNopOP7OJIi5e8AUxT6ZUolat3TkheXZ030xqabu81VdAFYjRKKsrhSf39t9T9ze3d3wHZ0+xI76yxh5Q==\",\"salt\":\"KD8gdrC8seSWEPUJJHKLDw==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "7393f1d8-e58c-4b80-8664-6f80931deb7b", + "createdTimestamp" : 1677181800044, + "username" : "codex3.sme", + 
"enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "codex3.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "188" ] + }, + "credentials" : [ { + "id" : "ba8252cc-5900-4f5a-8c7e-590b2028ebd0", + "type" : "password", + "createdDate" : 1677181800080, + "secretData" : "{\"value\":\"HrlyO6uWQp615hB9eLdfl5W7ooTw8fZU+jwyFyUsUdIP+HJ2Es4Cu46bJ9Hgdnd7pmuGUma0C/xXR7EGNdvH9w==\",\"salt\":\"XVbQSX3HYRMIqCTyPJmQZw==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "a862432c-cf03-4282-b0af-7dff20bfaca6", + "createdTimestamp" : 1677181800213, + "username" : "codex4.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "codex4.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "189" ] + }, + "credentials" : [ { + "id" : "43f9f0dd-bae5-4e5b-9c8d-d067e203a1a3", + "type" : "password", + "createdDate" : 1677181800248, + "secretData" : "{\"value\":\"J56SkiE1uYDbA/3k1bFdQzauQG9AYWrR4gZoBTKT/acbOP+p5r0wpZ9BkotDc/R3X9q1KxYx3xU/8BjjZEebwQ==\",\"salt\":\"djpJqi+BXbc2jq+bnthlKw==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "b4239cc1-cc70-4224-bd1c-e89e7667dc5a", + "createdTimestamp" : 1677181800350, + "username" : "codex5.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "codex5.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "190" ] + }, + "credentials" : [ { + "id" : "01585949-5171-4bd6-8193-521c60a1c5b0", + "type" : "password", + "createdDate" : 1677181800384, + "secretData" : "{\"value\":\"VMRw0Z1VZn1vpObUDJu/sKqigkAmdClroJCMNh4msPa8gj13+3KLKrP0xvkFz52PI+3zneb21Mj1FDxlwfzBtg==\",\"salt\":\"+HaiDG8H7DC5XapT0PAARQ==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] }, { "id" : "6151b58a-ca4f-44e6-a82a-f13363234555", "createdTimestamp" : 1676302140070, @@ -669,6 +830,29 @@ "realmRoles" : [ "default-roles-spiffworkflow" ], "notBefore" : 0, "groups" : [ ] + }, { + "id" : "9b7820b2-ad02-431f-a603-2d9b7d4415c8", + "createdTimestamp" : 1677181801624, + "username" : "core6.contributor", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "core6.contributor@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "199" ] + }, + "credentials" : [ { + "id" : "b6cc5352-e173-44e2-a37d-3607b606ab1b", + "type" : "password", + "createdDate" : 1677181801659, + "secretData" : "{\"value\":\"ZIjW8sUAJ5AczMOy+3Jgq82F0hvXqWmcLsmVY88hgVr4rkdjMu0+oOv36OfLFeFNwJrNxQAAots7RGuAyPbZQg==\",\"salt\":\"y6SgpBIdSuEzeJpeFx7/GQ==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + 
"notBefore" : 0, + "groups" : [ ] }, { "id" : "7b9767ac-24dc-43b0-838f-29e16b4fd14e", "createdTimestamp" : 1675718483773, @@ -752,6 +936,29 @@ "realmRoles" : [ "default-roles-spiffworkflow" ], "notBefore" : 0, "groups" : [ ] + }, { + "id" : "fae611e8-cde1-4fa1-b653-c6bef8a8c26c", + "createdTimestamp" : 1677181800520, + "username" : "desktop.project-lead", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "desktop.project-lead@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "192" ] + }, + "credentials" : [ { + "id" : "8bc1602b-dceb-4a59-9809-68cb28ff8928", + "type" : "password", + "createdDate" : 1677181800557, + "secretData" : "{\"value\":\"MFB6lcRCnLoXHXMfPDFbDoQSSXmCsZUFetlI+VJVyMieBXesUrBsYC2XrBQX/bg/jI7569Z26ppsh1VtKxrBmw==\",\"salt\":\"f2CuJRGCdmB4QMguj4jMdQ==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] }, { "id" : "a71e29a7-678c-4a08-9273-5c8490577c98", "createdTimestamp" : 1676302141251, @@ -772,6 +979,144 @@ "realmRoles" : [ "default-roles-spiffworkflow" ], "notBefore" : 0, "groups" : [ ] + }, { + "id" : "78e0a909-3634-43f3-80b0-034aa1ddc01d", + "createdTimestamp" : 1677181800708, + "username" : "desktop.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "desktop.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "193" ] + }, + "credentials" : [ { + "id" : "cf167058-268f-42da-94bc-01b35a562f5f", + "type" : "password", + "createdDate" : 1677181800744, + "secretData" : "{\"value\":\"IaSxg2RlpOnwutRGE7QPNVJtmA3klsizOGJq/g+dxAtOYweS1gYlWBFX4EB5zzAfB3gsA3P6gq+2avSK+besNw==\",\"salt\":\"AiM8CxndaAemRW8BQ/r4fw==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "38924acc-ac03-4dca-8394-3917121f7509", + "createdTimestamp" : 1677181800877, + "username" : "desktop1.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "desktop1.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "194" ] + }, + "credentials" : [ { + "id" : "204a79e9-6912-4ba9-a0f9-f001ed343242", + "type" : "password", + "createdDate" : 1677181800914, + "secretData" : "{\"value\":\"id13Cma1swB0HDj61wGA7xEIjWN8YKC1qA1WEP4ccV9frIm75xlyBGzwerQg9acNeu1Cltt2m1PDa8pE5ehw+g==\",\"salt\":\"baZl2HLuriksSDppoo/VjA==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "68e45305-dfcc-4ecc-8e62-8d838c46cf56", + "createdTimestamp" : 1677181801035, + "username" : "desktop2.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "desktop2.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "195" ] + }, + "credentials" : [ { + "id" : "586b4314-bfc5-44c0-b1ec-bc8250a546e4", + "type" : "password", + "createdDate" : 1677181801070, + "secretData" : 
"{\"value\":\"B/7DfIn/ZzJMhzJKZnPQ6oFqQJv/jfRunWDu16TDcfCXXSOlJMmdn2R1yYSSL+hGgDYpaOT86woq0en67uFhnA==\",\"salt\":\"znRgPUHANthkIwXrcOnynQ==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "8b569875-5265-47bc-b4f9-74764e64fbb9", + "createdTimestamp" : 1677181801182, + "username" : "desktop3.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "desktop3.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "196" ] + }, + "credentials" : [ { + "id" : "b7444657-1937-49c4-b48d-15cd69caec47", + "type" : "password", + "createdDate" : 1677181801216, + "secretData" : "{\"value\":\"iqUzNvgmigp4hgRO4j9rKUvdC/Qa2tLjGJdf5Mf2UieQqBZlqTt0EF/FielwV+D4qYDswcf7Lx9Kyc6sDkOX7g==\",\"salt\":\"113PrU+Thd35/KNKcz1bBg==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "0da7c2a9-f41b-4fdf-b54b-d2c425b18994", + "createdTimestamp" : 1677181801321, + "username" : "desktop4.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "desktop4.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "197" ] + }, + "credentials" : [ { + "id" : "ac8cfe7e-4a46-436d-8a72-8a2a061e803b", + "type" : "password", + "createdDate" : 1677181801357, + "secretData" : "{\"value\":\"AxFY+VsvoLTKflDvg3cRMjXdOZVOHoRAVxlUVR2YktXsadpo2Jl0ixehU/BByIAs/+TKl8ECM/qQdYV7rZ3rHw==\",\"salt\":\"WV5MxscAoBdJEvSs2HzWAg==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "c1796e6c-1396-4d11-85c2-409225d0ccba", + "createdTimestamp" : 1677181801479, + "username" : "desktop5.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "desktop5.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "198" ] + }, + "credentials" : [ { + "id" : "5ca9a203-1a04-4be6-93fe-b98f566a6660", + "type" : "password", + "createdDate" : 1677181801516, + "secretData" : "{\"value\":\"WDBB8FDGzyzsjq+Dl+9NXDK7+/S+9VbRFcEyKPxuKe48JvI00s2ZKXE065VuiUAVMvg2RV1tbgw8m31o13m0wA==\",\"salt\":\"wSyEjFR+uWxSA9dc0SNuwQ==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] }, { "id" : "3873c0ba-349c-4bec-8be2-5ced8acd56ec", "createdTimestamp" : 1675718483992, @@ -993,6 +1338,52 @@ "realmRoles" : [ "default-roles-spiffworkflow" ], "notBefore" : 0, "groups" : [ ] + }, { + "id" : "c4bb6e6d-da8b-4c4f-9b83-fdf8516d6946", + "createdTimestamp" : 1677181798082, + "username" : "infra4.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "infra4.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ 
"175" ] + }, + "credentials" : [ { + "id" : "c7a26698-8d27-4d8f-a8dd-519f74a6d516", + "type" : "password", + "createdDate" : 1677181798173, + "secretData" : "{\"value\":\"k8GfsfeWZg8wfVikCTew3Pgfs/XmlyRl9duh5pe4obM8E+XzGQfgSgx1T4xEIlr/TYl0Hep9zRxEcEtoYNlz8g==\",\"salt\":\"TH94ZAwlFT9cuKgBtcLPzw==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "9f269f95-0a5e-4cad-91d5-7b61ee2c795c", + "createdTimestamp" : 1677181798337, + "username" : "infra5.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "infra5.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "176" ] + }, + "credentials" : [ { + "id" : "04faad56-de12-4a8f-ad54-0e8ef865b0ef", + "type" : "password", + "createdDate" : 1677181798373, + "secretData" : "{\"value\":\"5VJxVKz0uE0a8tZQMbBVaxcEqfdmJdsAdB6T8t0grY+L4etXZHnLlucKkCtQ9aJy1PcDMLjXu6ETrqoTuLkehA==\",\"salt\":\"a6PypYQwyD2Fv/e2UXzGvg==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] }, { "id" : "b8d0d90e-9a7e-446c-9984-082cb315af8f", "createdTimestamp" : 1675718484095, @@ -1271,6 +1662,75 @@ "realmRoles" : [ "default-roles-spiffworkflow" ], "notBefore" : 0, "groups" : [ ] + }, { + "id" : "60ba78a0-c346-4967-ad90-89b11d3e5e11", + "createdTimestamp" : 1677181798495, + "username" : "legal4.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "legal4.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "177" ] + }, + "credentials" : [ { + "id" : "b3efb51c-8dd7-451d-b213-05363588e461", + "type" : "password", + "createdDate" : 1677181798529, + "secretData" : "{\"value\":\"WE9bf/FrGPslQr6NW6Cfq/2U6LLorW8R7PVhIIBqbMC0Ndqqv18wHceyZvLCBUkjiTukPhhUHYYvPCZct0KQjw==\",\"salt\":\"OgtPrHOUoLVNiD8kjVo2fg==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "d481687d-8c76-456d-9e0c-d66075380bbd", + "createdTimestamp" : 1677181798643, + "username" : "legal5.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "legal5.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "178" ] + }, + "credentials" : [ { + "id" : "26804d36-5691-4ee2-8a03-ac0f69045d6a", + "type" : "password", + "createdDate" : 1677181798677, + "secretData" : "{\"value\":\"yAGa86rD7oVWAUjj2IApbBoIK1CevLxXiJQ3UDdHpJLVVDYRkCDF3qel111EqbsGsdOJ1g2cbc4ii2baM57Jog==\",\"salt\":\"2kzSBHUfFi+EHXJTVlnJ7w==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "8a03f00f-310d-4bae-b918-f6f128f98095", + "createdTimestamp" : 1677187934419, + "username" : 
"logos.program-lead", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "logos.program-lead@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "160" ] + }, + "credentials" : [ { + "id" : "57e95f47-feb4-4328-88a6-8c8abde98db9", + "type" : "password", + "createdDate" : 1677187934455, + "secretData" : "{\"value\":\"2JMhNDo3jhT8M5w38JLVHiAN/njcXc6moaa9d6L0LYe8yOCxoxmVSqejFDQTyESxeMChBU7qj2NXIGhJMIsBiw==\",\"salt\":\"O5NxbiEqrDNzN041mEz/8Q==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] }, { "id" : "588e69b9-7534-4073-861d-500475b12b24", "createdTimestamp" : 1675718484566, @@ -1427,8 +1887,8 @@ "notBefore" : 0, "groups" : [ ] }, { - "id" : "2a5d7caa-2c3e-4404-a133-ec220c0307db", - "createdTimestamp" : 1676566095780, + "id" : "6bc87bfb-6288-49df-a0f3-51db4e46201b", + "createdTimestamp" : 1677179612799, "username" : "peopleops.partner2.sme", "enabled" : true, "totp" : false, @@ -1438,10 +1898,10 @@ "spiffworkflow-employeeid" : [ "173" ] }, "credentials" : [ { - "id" : "64fc835c-b693-4fed-ab9f-952cbaadbbfd", + "id" : "c0c57e55-9d34-499f-80a8-0f0cd639e1ed", "type" : "password", - "createdDate" : 1676566095815, - "secretData" : "{\"value\":\"w5nUlwlH1Z46WGhfejPIiRW6OkE9bcjHNCVySUDzMIpkbCm3f78XfuvdGSDeCpJ/FQCJuFo5ciDJ7ExXLyLfnQ==\",\"salt\":\"nz1xSxci+NFsyPZPhFDtZQ==\",\"additionalParameters\":{}}", + "createdDate" : 1677179612835, + "secretData" : "{\"value\":\"xUGT/9b0xVMemt7C30eO/TZfOaf3sO3j/XaADPWV+bXb5yNt0Dc6Ao0KVA0yzrPzCeXVa4C2BlHdXpx4l/nNUw==\",\"salt\":\"7UAhQDr50I44pVegqsm4aw==\",\"additionalParameters\":{}}", "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" } ], "disableableCredentialTypes" : [ ], @@ -1450,8 +1910,8 @@ "notBefore" : 0, "groups" : [ ] }, { - "id" : "2df3aa5e-5e5b-4c4a-b9bc-3a916c651632", - "createdTimestamp" : 1676566095846, + "id" : "f8837814-21dc-475e-9067-41d1da670fff", + "createdTimestamp" : 1677179612959, "username" : "peopleops.partner3.sme", "enabled" : true, "totp" : false, @@ -1461,10 +1921,56 @@ "spiffworkflow-employeeid" : [ "174" ] }, "credentials" : [ { - "id" : "efaaec98-45c7-45cc-b4a4-32708882b72f", + "id" : "d83f8952-b7b7-4860-9af9-b697a84da13a", "type" : "password", - "createdDate" : 1676566095880, - "secretData" : "{\"value\":\"B9M+AGxXUX4/+ce0y6AgFBm4F7phl5+6zToumcfheXglqcag2jr7iqLTtvwVkz3w8x7rmxUrzs7rkJPhK+/Jpg==\",\"salt\":\"rLFkhDJLxRuCNw7PNswlSQ==\",\"additionalParameters\":{}}", + "createdDate" : 1677179612997, + "secretData" : "{\"value\":\"ZBH+k4nUWrpVJoyu4j8nNsYvWMA8fIrS3rxl+Pfi8XUp5QUPxMr2slopxBpdn5rCFxC422rGvE76z59+lsGHFw==\",\"salt\":\"AGjic4GY4x47sB0STHebYw==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "44c5d69d-f767-4f11-8d0b-8b6d42cfb1da", + "createdTimestamp" : 1677181799109, + "username" : "peopleops.partner4.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "peopleops.partner4.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "181" ] + }, + "credentials" : [ { + "id" : 
"eeb6aa42-0141-4a0e-9135-22e519fe2259", + "type" : "password", + "createdDate" : 1677181799173, + "secretData" : "{\"value\":\"hRXbF8Hv5ZbrLFXr2ceYHva6LV9Nl8R4rWzigTLPkkxKeF87iaifmStRxSWdJv4LZsq4+qwJF3wretnaav6VUw==\",\"salt\":\"ho19cRuxsUuCF5fVo2/fSw==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "abed16ea-ffb1-4ca4-a907-206f56d0c6d1", + "createdTimestamp" : 1677181799452, + "username" : "peopleops.partner5.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "peopleops.partner5.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "182" ] + }, + "credentials" : [ { + "id" : "f07e520a-b3eb-4b1e-95b3-51c64902dd7b", + "type" : "password", + "createdDate" : 1677181799489, + "secretData" : "{\"value\":\"F2Nr7V6xjBFXI8Siw6rLYAN3ToHKkcq8PLU4SI+T7M4Oj6no1Jf9jtT+pqvQV65GNJ9p1F5U023EENnITa6r+g==\",\"salt\":\"oz69O4w8vVKgjtm2hEglmA==\",\"additionalParameters\":{}}", "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" } ], "disableableCredentialTypes" : [ ], @@ -1702,6 +2208,52 @@ "realmRoles" : [ "default-roles-spiffworkflow" ], "notBefore" : 0, "groups" : [ ] + }, { + "id" : "8495cf5a-d592-4ef4-a25d-b7ab50e4682d", + "createdTimestamp" : 1677300032228, + "username" : "ppg.ba4.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "ppg.ba4.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "200" ] + }, + "credentials" : [ { + "id" : "690a07af-b356-4021-b012-dc28a52744f7", + "type" : "password", + "createdDate" : 1677300032281, + "secretData" : "{\"value\":\"cRjSpQ9plAFY3XMwDnBXG3uvc6GLnczJuC8b5er7XMy58CpryiRNmi4nzbQNw0IIbvpdcjCTETfMIDMapobXnw==\",\"salt\":\"P9SaAzdcGV4a4Rc57ki8OQ==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "31143c6e-5ea0-4c84-a94c-0215e96226d2", + "createdTimestamp" : 1677300032328, + "username" : "ppg.ba5.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "ppg.ba5.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "201" ] + }, + "credentials" : [ { + "id" : "6dc24a43-d541-4af5-9514-647a54ac09ee", + "type" : "password", + "createdDate" : 1677300032367, + "secretData" : "{\"value\":\"EAPcqH2t4w066csArNPWxT0pUKMR/RwDAYLdug9PPcmg4BFc71X3w+RXrXhNfcpDz8kTo/BMmjaxyVLDZGGODg==\",\"salt\":\"O+M+MVp1ETT3wyviAeUJnw==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] }, { "id" : "f56fe387-d153-42c2-880a-6726bd624bae", "createdTimestamp" : 1676302144802, @@ -1840,6 +2392,52 @@ "realmRoles" : [ "default-roles-spiffworkflow" ], "notBefore" : 0, "groups" : [ ] + }, { + "id" : "1b2dc2b1-9706-4b69-aba8-088551d56622", + "createdTimestamp" : 1677181798799, + "username" : "security4.sme", + "enabled" : true, + "totp" : 
false, + "emailVerified" : false, + "email" : "security4.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "179" ] + }, + "credentials" : [ { + "id" : "4b764d7f-8c3b-4978-93aa-a2dbe0caf71c", + "type" : "password", + "createdDate" : 1677181798833, + "secretData" : "{\"value\":\"kn+VDn4d6qlJBJdhLYuJq4/97vfmZmiL3WXmW1OnhzYYv35splfBEkY12j0R4pxZeZ1OWBR7MJs1kB8AeC9cKQ==\",\"salt\":\"K+0rpb4TJ7J6z0F99AAklA==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] + }, { + "id" : "f76b4ac7-3beb-4465-ad8d-0d4a513782b4", + "createdTimestamp" : 1677181798958, + "username" : "security5.sme", + "enabled" : true, + "totp" : false, + "emailVerified" : false, + "email" : "security5.sme@status.im", + "attributes" : { + "spiffworkflow-employeeid" : [ "180" ] + }, + "credentials" : [ { + "id" : "3c5493c3-f689-44b1-ae51-94e7d0dff4a0", + "type" : "password", + "createdDate" : 1677181798992, + "secretData" : "{\"value\":\"7kr/Rt3nzDMDky8SBKOro3+sbpcDe6XBemF2CGN2NrBaNPdR+BlH9cpHPlxaTGTcwYe0TbNJo9xQ3FQu7NUwJg==\",\"salt\":\"W/jkh3VF9L05hyGNzHR9Bw==\",\"additionalParameters\":{}}", + "credentialData" : "{\"hashIterations\":27500,\"algorithm\":\"pbkdf2-sha256\",\"additionalParameters\":{}}" + } ], + "disableableCredentialTypes" : [ ], + "requiredActions" : [ ], + "realmRoles" : [ "default-roles-spiffworkflow" ], + "notBefore" : 0, + "groups" : [ ] }, { "id" : "b768e3ef-f905-4493-976c-bc3408c04bec", "createdTimestamp" : 1675447832524, @@ -3175,7 +3773,7 @@ "subType" : "authenticated", "subComponents" : { }, "config" : { - "allowed-protocol-mapper-types" : [ "oidc-usermodel-attribute-mapper", "oidc-address-mapper", "oidc-full-name-mapper", "saml-user-property-mapper", "saml-user-attribute-mapper", "oidc-usermodel-property-mapper", "saml-role-list-mapper", "oidc-sha256-pairwise-sub-mapper" ] + "allowed-protocol-mapper-types" : [ "oidc-address-mapper", "oidc-usermodel-property-mapper", "oidc-usermodel-attribute-mapper", "oidc-full-name-mapper", "saml-user-property-mapper", "oidc-sha256-pairwise-sub-mapper", "saml-role-list-mapper", "saml-user-attribute-mapper" ] } }, { "id" : "d68e938d-dde6-47d9-bdc8-8e8523eb08cd", @@ -3193,7 +3791,7 @@ "subType" : "anonymous", "subComponents" : { }, "config" : { - "allowed-protocol-mapper-types" : [ "oidc-usermodel-attribute-mapper", "saml-user-attribute-mapper", "oidc-address-mapper", "saml-user-property-mapper", "oidc-sha256-pairwise-sub-mapper", "saml-role-list-mapper", "oidc-usermodel-property-mapper", "oidc-full-name-mapper" ] + "allowed-protocol-mapper-types" : [ "oidc-usermodel-property-mapper", "oidc-usermodel-attribute-mapper", "oidc-full-name-mapper", "oidc-sha256-pairwise-sub-mapper", "oidc-address-mapper", "saml-role-list-mapper", "saml-user-property-mapper", "saml-user-attribute-mapper" ] } }, { "id" : "3854361d-3fe5-47fb-9417-a99592e3dc5c", @@ -3283,7 +3881,7 @@ "internationalizationEnabled" : false, "supportedLocales" : [ ], "authenticationFlows" : [ { - "id" : "01b4b17c-bb82-41c3-b5b5-b9aadd21cb23", + "id" : "0e6ef523-0828-4847-9646-37c2833ad205", "alias" : "Account verification options", "description" : "Method with which to verity the existing account", "providerId" : "basic-flow", @@ -3305,7 +3903,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "57574e2d-3c3d-4286-9fd1-d7f4ab86c6c1", + "id" : 
"7edc2f58-0e95-4374-b49c-8589b0a7ee64", "alias" : "Authentication Options", "description" : "Authentication options.", "providerId" : "basic-flow", @@ -3334,7 +3932,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "1eb0e67c-2856-475e-8563-5eca431fd9d0", + "id" : "a4ad982f-def5-4845-840d-971205cae536", "alias" : "Browser - Conditional OTP", "description" : "Flow to determine if the OTP is required for the authentication", "providerId" : "basic-flow", @@ -3356,7 +3954,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "ff023867-aad5-4d19-a7da-60904727cd77", + "id" : "daa18225-9c2b-47b8-b31f-152cd64f4202", "alias" : "Direct Grant - Conditional OTP", "description" : "Flow to determine if the OTP is required for the authentication", "providerId" : "basic-flow", @@ -3378,7 +3976,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "c4f2f1e4-a32c-4559-9fe3-f88cc6cb63da", + "id" : "113bca83-78e1-4148-9124-27aeb9e278d3", "alias" : "First broker login - Conditional OTP", "description" : "Flow to determine if the OTP is required for the authentication", "providerId" : "basic-flow", @@ -3400,7 +3998,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "bfb28a5f-98d9-4ce0-ae8d-75a7ba1ad331", + "id" : "cd8c8c26-aa53-4cd4-a3e0-74a4a4376a98", "alias" : "Handle Existing Account", "description" : "Handle what to do if there is existing account with same email/username like authenticated identity provider", "providerId" : "basic-flow", @@ -3422,7 +4020,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "8b2075bd-9ad7-44c3-9a06-bc60a13beb7a", + "id" : "12cb511e-64b3-4506-8905-3e5c8f08fad9", "alias" : "Reset - Conditional OTP", "description" : "Flow to determine if the OTP should be reset or not. Set to REQUIRED to force.", "providerId" : "basic-flow", @@ -3444,7 +4042,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "1fdcbed7-e44b-4473-ab7b-25037309660b", + "id" : "89863115-cb99-4fbf-abfe-6a8a404b5148", "alias" : "User creation or linking", "description" : "Flow for the existing/non-existing user alternatives", "providerId" : "basic-flow", @@ -3467,7 +4065,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "2f6e9208-b0e6-4941-9bd5-8f83ebc25b6c", + "id" : "c90e6d81-9306-41d0-8376-8c237b8757c6", "alias" : "Verify Existing Account by Re-authentication", "description" : "Reauthentication of existing account", "providerId" : "basic-flow", @@ -3489,7 +4087,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "f059067e-d626-4be3-868f-4c8780318497", + "id" : "6d13fbf1-ba5d-4246-8085-5997f8d44941", "alias" : "browser", "description" : "browser based authentication", "providerId" : "basic-flow", @@ -3525,7 +4123,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "c35098b5-3785-4f52-90e3-39b8f3841f0c", + "id" : "b68f54f3-6361-4480-82ed-a508be0376c2", "alias" : "clients", "description" : "Base authentication for clients", "providerId" : "client-flow", @@ -3561,7 +4159,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "c78934b6-5386-49e7-89e8-9efe1088f5b2", + "id" : "8260dae3-441c-4d08-b96a-591ea07c10a6", "alias" : "direct grant", "description" : "OpenID Connect Resource Owner Grant", "providerId" : "basic-flow", @@ -3590,7 +4188,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "7a08791f-0c8b-4e11-a588-f5856b75337b", + "id" : "3a101262-fb6e-453a-94a4-9119c12d4577", "alias" : "docker auth", "description" : "Used by Docker clients to authenticate against the IDP", "providerId" : "basic-flow", @@ -3605,7 +4203,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "11e93dce-9673-4c99-ae7a-0edaf1c9b7e4", + "id" : 
"ef1643ac-cf03-41e8-bd89-659de5288339", "alias" : "first broker login", "description" : "Actions taken after first broker login with identity provider account, which is not yet linked to any Keycloak account", "providerId" : "basic-flow", @@ -3628,7 +4226,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "dbb50df7-ec6e-4a34-97f5-b484f1d8a76c", + "id" : "409616c0-64ab-4a9c-a286-a446ea717b53", "alias" : "forms", "description" : "Username, password, otp and other auth forms.", "providerId" : "basic-flow", @@ -3650,7 +4248,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "d7a3dff9-249b-4811-9f36-b78119a4ce3f", + "id" : "a90dd7dc-f6b6-4cd1-85f4-f5aec95e5c7b", "alias" : "http challenge", "description" : "An authentication flow based on challenge-response HTTP Authentication Schemes", "providerId" : "basic-flow", @@ -3672,7 +4270,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "ed4891ad-657c-45ac-9388-6c50d191124d", + "id" : "aa535b04-a256-4c0a-aad6-aaa6d053f821", "alias" : "registration", "description" : "registration flow", "providerId" : "basic-flow", @@ -3688,7 +4286,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "f7c308b0-58de-4ed2-bf69-394144698e5a", + "id" : "cbaa3dde-4b4b-4344-841f-ba7468734286", "alias" : "registration form", "description" : "registration form", "providerId" : "form-flow", @@ -3724,7 +4322,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "3fb75774-a3a5-4e01-bc4a-4e564451601d", + "id" : "62c55336-4753-4c4e-a4f9-03adb86f253f", "alias" : "reset credentials", "description" : "Reset credentials for a user if they forgot their password or something", "providerId" : "basic-flow", @@ -3760,7 +4358,7 @@ "userSetupAllowed" : false } ] }, { - "id" : "822d5c02-9ab3-4a9b-8fa4-1f020c5ffe08", + "id" : "35366a6a-8669-4110-9c62-a4f195243f2c", "alias" : "saml ecp", "description" : "SAML ECP Profile Authentication Flow", "providerId" : "basic-flow", @@ -3776,13 +4374,13 @@ } ] } ], "authenticatorConfig" : [ { - "id" : "0e613377-2aaa-4fed-bb7d-4dea69d5c340", + "id" : "0d2f25a1-c358-4f08-9b44-02559d1d2b5f", "alias" : "create unique user config", "config" : { "require.password.update.after.registration" : "false" } }, { - "id" : "ac6b9188-f0ec-48ec-852a-8e3b331b33a6", + "id" : "350789a4-bbaf-4cba-999d-f40f4cc632ea", "alias" : "review profile config", "config" : { "update.profile.on.first.login" : "missing" diff --git a/keycloak/test_user_lists/status b/keycloak/test_user_lists/status index d370b96a7..292bbb947 100644 --- a/keycloak/test_user_lists/status +++ b/keycloak/test_user_lists/status @@ -1,16 +1,31 @@ email,spiffworkflow-employeeid # admin@spiffworkflow.org amir@status.im -app.program.lead@status.im,121 +app.program-lead@status.im,121 +codex.project-lead@status.im,153 +codex.sme@status.im,185 +codex1.sme@status.im,186 +codex2.sme@status.im,187 +codex3.sme@status.im,188 +codex4.sme@status.im,189 +codex5.sme@status.im,190 core1.contributor@status.im,155 core2.contributor@status.im,156 core3.contributor@status.im,157 core4.contributor@status.im,158 core5.contributor@status.im,159 +core6.contributor@status.im,199 core@status.im,113 dao.project.lead@status.im desktop.program.lead@status.im +desktop.project-lead@status.im,192 desktop.project.lead@status.im +desktop.sme@status.im,193 +desktop1.sme@status.im,194 +desktop2.sme@status.im,195 +desktop3.sme@status.im,196 +desktop4.sme@status.im,197 +desktop5.sme@status.im,198 fin@status.im,118 finance.lead@status.im,128 finance_user1@status.im @@ -20,6 +35,8 @@ infra.sme@status.im,119 infra1.sme@status.im,131 infra2.sme@status.im,132 
infra3.sme@status.im,167 +infra4.sme@status.im,175 +infra5.sme@status.im,176 jakub@status.im jarrad@status.im lead@status.im,114 @@ -28,11 +45,16 @@ legal.sme@status.im,125 legal1.sme@status.im,134 legal2.sme@status.im,165 legal3.sme@status.im,166 +legal4.sme@status.im,177 +legal5.sme@status.im,178 +logos.program-lead@status.im,160 manuchehr@status.im,110 peopleops.partner.sme@status.im,148 peopleops.partner1.sme@status.im,149 peopleops.partner2.sme@status.im,173 peopleops.partner3.sme@status.im,174 +peopleops.partner4.sme@status.im,181 +peopleops.partner5.sme@status.im,182 peopleops.partner@status.im,150 peopleops.project-lead@status.im,147 peopleops.talent.sme@status.im,143 @@ -43,6 +65,8 @@ ppg.ba.sme@status.im,138 ppg.ba1.sme@status.im,170 ppg.ba2.sme@status.im,171 ppg.ba3.sme@status.im,172 +ppg.ba4.sme@status.im,200 +ppg.ba5.sme@status.im,201 ppg.ba@status.im,127 sasha@status.im,112 security.project-lead@status.im,151 @@ -50,4 +74,6 @@ security.sme@status.im,123 security1.sme@status.im,135 security2.sme@status.im,168 security3.sme@status.im,169 +security4.sme@status.im,179 +security5.sme@status.im,180 services.lead@status.im,122 diff --git a/migrations/versions/9f0b1662a8af_.py b/migrations/versions/9f0b1662a8af_.py new file mode 100644 index 000000000..6f3270f31 --- /dev/null +++ b/migrations/versions/9f0b1662a8af_.py @@ -0,0 +1,153 @@ +"""empty message + +Revision ID: 9f0b1662a8af +Revises: 63fc8d693b9f +Create Date: 2023-02-24 14:30:05.970959 + +""" +from alembic import op +import sqlalchemy as sa +from sqlalchemy.dialects import mysql + +# revision identifiers, used by Alembic. +revision = '9f0b1662a8af' +down_revision = '63fc8d693b9f' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! 
### + op.create_table('correlation_property_cache', + sa.Column('id', sa.Integer(), nullable=False), + sa.Column('name', sa.String(length=50), nullable=False), + sa.Column('message_name', sa.String(length=50), nullable=False), + sa.Column('process_model_id', sa.String(length=255), nullable=False), + sa.Column('retrieval_expression', sa.String(length=255), nullable=True), + sa.PrimaryKeyConstraint('id') + ) + op.create_table('message_instance_correlation_rule', + sa.Column('id', sa.Integer(), nullable=False), + sa.Column('message_instance_id', sa.Integer(), nullable=False), + sa.Column('name', sa.String(length=50), nullable=False), + sa.Column('retrieval_expression', sa.String(length=255), nullable=True), + sa.Column('updated_at_in_seconds', sa.Integer(), nullable=True), + sa.Column('created_at_in_seconds', sa.Integer(), nullable=True), + sa.ForeignKeyConstraint(['message_instance_id'], ['message_instance.id'], ), + sa.PrimaryKeyConstraint('id'), + sa.UniqueConstraint('message_instance_id', 'name', name='message_instance_id_name_unique') + ) + op.create_index(op.f('ix_message_instance_correlation_rule_message_instance_id'), 'message_instance_correlation_rule', ['message_instance_id'], unique=False) + op.drop_index('ix_message_model_identifier', table_name='message_model') + op.drop_index('ix_message_model_name', table_name='message_model') + op.drop_constraint('message_correlation_property_ibfk_1', 'message_correlation_property', type_='foreignkey') + op.drop_constraint('message_triggerable_process_model_ibfk_1', 'message_triggerable_process_model', type_='foreignkey') + op.drop_constraint('message_instance_ibfk_1', 'message_instance', type_='foreignkey') + op.drop_table('message_model') + op.drop_constraint('message_correlation_message_instance_ibfk_1', 'message_correlation_message_instance', type_='foreignkey') + op.drop_index('ix_message_correlation_name', table_name='message_correlation') + op.drop_index('ix_message_correlation_process_instance_id', table_name='message_correlation') + op.drop_index('ix_message_correlation_value', table_name='message_correlation') +# op.drop_index('message_instance_id_name_unique', table_name='message_correlation') +# op.drop_index('ix_message_correlation_message_correlation_property_id', table_name='message_correlation') + op.drop_table('message_correlation') + op.drop_index('ix_message_correlation_message_instance_message_correlation_id', table_name='message_correlation_message_instance') + op.drop_index('ix_message_correlation_message_instance_message_instance_id', table_name='message_correlation_message_instance') +# op.drop_index('message_correlation_message_instance_unique', table_name='message_correlation_message_instance') + op.drop_table('message_correlation_message_instance') + op.drop_index('ix_message_correlation_property_identifier', table_name='message_correlation_property') + op.drop_index('message_correlation_property_unique', table_name='message_correlation_property') + op.drop_table('message_correlation_property') + op.add_column('message_instance', sa.Column('name', sa.String(length=255), nullable=True)) + op.add_column('message_instance', sa.Column('correlation_keys', sa.JSON(), nullable=True)) + op.add_column('message_instance', sa.Column('user_id', sa.Integer(), nullable=False)) + op.add_column('message_instance', sa.Column('counterpart_id', sa.Integer(), nullable=True)) + op.alter_column('message_instance', 'process_instance_id', + existing_type=mysql.INTEGER(), + nullable=True) + op.create_foreign_key(None, 'message_instance', 
'user', ['user_id'], ['id']) + op.drop_column('message_instance', 'message_model_id') + op.add_column('message_triggerable_process_model', sa.Column('message_name', sa.String(length=255), nullable=True)) + op.drop_index('message_model_id', table_name='message_triggerable_process_model') + op.drop_column('message_triggerable_process_model', 'message_model_id') + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! ### + op.add_column('message_triggerable_process_model', sa.Column('message_model_id', mysql.INTEGER(), autoincrement=False, nullable=False)) + op.create_index('message_model_id', 'message_triggerable_process_model', ['message_model_id'], unique=False) + op.drop_column('message_triggerable_process_model', 'message_name') + op.add_column('message_instance', sa.Column('message_model_id', mysql.INTEGER(), autoincrement=False, nullable=False)) + op.drop_constraint(None, 'message_instance', type_='foreignkey') + op.create_foreign_key('message_instance_ibfk_1', 'message_instance', 'message_model', ['message_model_id'], ['id']) + op.alter_column('message_instance', 'process_instance_id', + existing_type=mysql.INTEGER(), + nullable=False) + op.drop_column('message_instance', 'counterpart_id') + op.drop_column('message_instance', 'user_id') + op.drop_column('message_instance', 'correlation_keys') + op.drop_column('message_instance', 'name') + op.create_table('message_correlation_property', + sa.Column('id', mysql.INTEGER(), autoincrement=True, nullable=False), + sa.Column('identifier', mysql.VARCHAR(length=50), nullable=True), + sa.Column('message_model_id', mysql.INTEGER(), autoincrement=False, nullable=False), + sa.Column('updated_at_in_seconds', mysql.INTEGER(), autoincrement=False, nullable=True), + sa.Column('created_at_in_seconds', mysql.INTEGER(), autoincrement=False, nullable=True), + sa.ForeignKeyConstraint(['message_model_id'], ['message_model.id'], name='message_correlation_property_ibfk_1'), + sa.PrimaryKeyConstraint('id'), + mysql_collate='utf8mb4_0900_ai_ci', + mysql_default_charset='utf8mb4', + mysql_engine='InnoDB' + ) + op.create_index('message_correlation_property_unique', 'message_correlation_property', ['identifier', 'message_model_id'], unique=False) + op.create_index('ix_message_correlation_property_identifier', 'message_correlation_property', ['identifier'], unique=False) + op.create_table('message_correlation_message_instance', + sa.Column('id', mysql.INTEGER(), autoincrement=True, nullable=False), + sa.Column('message_instance_id', mysql.INTEGER(), autoincrement=False, nullable=False), + sa.Column('message_correlation_id', mysql.INTEGER(), autoincrement=False, nullable=False), + sa.ForeignKeyConstraint(['message_correlation_id'], ['message_correlation.id'], name='message_correlation_message_instance_ibfk_1'), + sa.ForeignKeyConstraint(['message_instance_id'], ['message_instance.id'], name='message_correlation_message_instance_ibfk_2'), + sa.PrimaryKeyConstraint('id'), + mysql_collate='utf8mb4_0900_ai_ci', + mysql_default_charset='utf8mb4', + mysql_engine='InnoDB' + ) + op.create_index('message_correlation_message_instance_unique', 'message_correlation_message_instance', ['message_instance_id', 'message_correlation_id'], unique=False) + op.create_index('ix_message_correlation_message_instance_message_instance_id', 'message_correlation_message_instance', ['message_instance_id'], unique=False) + op.create_index('ix_message_correlation_message_instance_message_correlation_id', 
'message_correlation_message_instance', ['message_correlation_id'], unique=False) + op.create_table('message_correlation', + sa.Column('id', mysql.INTEGER(), autoincrement=True, nullable=False), + sa.Column('process_instance_id', mysql.INTEGER(), autoincrement=False, nullable=False), + sa.Column('message_correlation_property_id', mysql.INTEGER(), autoincrement=False, nullable=False), + sa.Column('name', mysql.VARCHAR(length=255), nullable=False), + sa.Column('value', mysql.VARCHAR(length=255), nullable=False), + sa.Column('updated_at_in_seconds', mysql.INTEGER(), autoincrement=False, nullable=True), + sa.Column('created_at_in_seconds', mysql.INTEGER(), autoincrement=False, nullable=True), + sa.ForeignKeyConstraint(['message_correlation_property_id'], ['message_correlation_property.id'], name='message_correlation_ibfk_1'), + sa.ForeignKeyConstraint(['process_instance_id'], ['process_instance.id'], name='message_correlation_ibfk_2'), + sa.PrimaryKeyConstraint('id'), + mysql_collate='utf8mb4_0900_ai_ci', + mysql_default_charset='utf8mb4', + mysql_engine='InnoDB' + ) + op.create_index('message_instance_id_name_unique', 'message_correlation', ['process_instance_id', 'message_correlation_property_id', 'name'], unique=False) + op.create_index('ix_message_correlation_value', 'message_correlation', ['value'], unique=False) + op.create_index('ix_message_correlation_process_instance_id', 'message_correlation', ['process_instance_id'], unique=False) + op.create_index('ix_message_correlation_name', 'message_correlation', ['name'], unique=False) + op.create_index('ix_message_correlation_message_correlation_property_id', 'message_correlation', ['message_correlation_property_id'], unique=False) + op.create_table('message_model', + sa.Column('id', mysql.INTEGER(), autoincrement=True, nullable=False), + sa.Column('identifier', mysql.VARCHAR(length=50), nullable=True), + sa.Column('name', mysql.VARCHAR(length=50), nullable=True), + sa.PrimaryKeyConstraint('id'), + mysql_collate='utf8mb4_0900_ai_ci', + mysql_default_charset='utf8mb4', + mysql_engine='InnoDB' + ) + op.create_index('ix_message_model_name', 'message_model', ['name'], unique=False) + op.create_index('ix_message_model_identifier', 'message_model', ['identifier'], unique=False) + op.drop_index(op.f('ix_message_instance_correlation_rule_message_instance_id'), table_name='message_instance_correlation_rule') + op.drop_table('message_instance_correlation_rule') + op.drop_table('correlation_property_cache') + # ### end Alembic commands ### diff --git a/migrations/versions/d6e5b3af0908_.py b/migrations/versions/d6e5b3af0908_.py new file mode 100644 index 000000000..f47b4b57e --- /dev/null +++ b/migrations/versions/d6e5b3af0908_.py @@ -0,0 +1,28 @@ +"""empty message + +Revision ID: d6e5b3af0908 +Revises: 9f0b1662a8af +Create Date: 2023-02-27 11:10:28.058014 + +""" +from alembic import op +import sqlalchemy as sa + + +# revision identifiers, used by Alembic. +revision = 'd6e5b3af0908' +down_revision = '9f0b1662a8af' +branch_labels = None +depends_on = None + + +def upgrade(): + # ### commands auto generated by Alembic - please adjust! ### + op.add_column('human_task', sa.Column('bpmn_process_identifier', sa.String(length=255), nullable=True)) + # ### end Alembic commands ### + + +def downgrade(): + # ### commands auto generated by Alembic - please adjust! 
### + op.drop_column('human_task', 'bpmn_process_identifier') + # ### end Alembic commands ### diff --git a/poetry.lock b/poetry.lock index a1df93676..005040906 100644 --- a/poetry.lock +++ b/poetry.lock @@ -1111,14 +1111,6 @@ python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.* [package.dependencies] setuptools = "*" -[[package]] -name = "orjson" -version = "3.8.0" -description = "Fast, correct Python JSON library supporting dataclasses, datetimes, and numpy" -category = "main" -optional = false -python-versions = ">=3.7" - [[package]] name = "packaging" version = "21.3" @@ -1825,7 +1817,7 @@ lxml = "*" type = "git" url = "https://github.com/sartography/SpiffWorkflow" reference = "main" -resolved_reference = "b439f69f23b547df4de1e8e0c636997f2fd4e33b" +resolved_reference = "b3235fad598ee3c4680a23f26adb09cdc8f2807b" [[package]] name = "SQLAlchemy" @@ -2204,7 +2196,7 @@ testing = ["flake8 (<5)", "func-timeout", "jaraco.functools", "jaraco.itertools" [metadata] lock-version = "1.1" python-versions = ">=3.9,<3.12" -content-hash = "b16e8fb0cf991bcba08c3ef1ddf205f5899c622a10c79a7f50fb55a36d53b179" +content-hash = "3876acb4e3d947787a3ba8e831844ca0b06bde34dc038be46cabc00aa2a4defe" [metadata.files] alabaster = [ @@ -2855,50 +2847,6 @@ nodeenv = [ {file = "nodeenv-1.7.0-py2.py3-none-any.whl", hash = "sha256:27083a7b96a25f2f5e1d8cb4b6317ee8aeda3bdd121394e5ac54e498028a042e"}, {file = "nodeenv-1.7.0.tar.gz", hash = "sha256:e0e7f7dfb85fc5394c6fe1e8fa98131a2473e04311a45afb6508f7cf1836fa2b"}, ] -orjson = [ - {file = "orjson-3.8.0-cp310-cp310-macosx_10_7_x86_64.whl", hash = "sha256:9a93850a1bdc300177b111b4b35b35299f046148ba23020f91d6efd7bf6b9d20"}, - {file = "orjson-3.8.0-cp310-cp310-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:7536a2a0b41672f824912aeab545c2467a9ff5ca73a066ff04fb81043a0a177a"}, - {file = "orjson-3.8.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:66c19399bb3b058e3236af7910b57b19a4fc221459d722ed72a7dc90370ca090"}, - {file = "orjson-3.8.0-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8b391d5c2ddc2f302d22909676b306cb6521022c3ee306c861a6935670291b2c"}, - {file = "orjson-3.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bdb1042970ca5f544a047d6c235a7eb4acdb69df75441dd1dfcbc406377ab37"}, - {file = "orjson-3.8.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:d189e2acb510e374700cb98cf11b54f0179916ee40f8453b836157ae293efa79"}, - {file = "orjson-3.8.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:6a23b40c98889e9abac084ce5a1fb251664b41da9f6bdb40a4729e2288ed2ed4"}, - {file = "orjson-3.8.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b68a42a31f8429728183c21fb440c21de1b62e5378d0d73f280e2d894ef8942e"}, - {file = "orjson-3.8.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:ff13410ddbdda5d4197a4a4c09969cb78c722a67550f0a63c02c07aadc624833"}, - {file = "orjson-3.8.0-cp310-none-win_amd64.whl", hash = "sha256:2d81e6e56bbea44be0222fb53f7b255b4e7426290516771592738ca01dbd053b"}, - {file = "orjson-3.8.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:e2defd9527651ad39ec20ae03c812adf47ef7662bdd6bc07dabb10888d70dc62"}, - {file = "orjson-3.8.0-cp37-cp37m-macosx_10_7_x86_64.whl", hash = "sha256:9e6ac22cec72d5b39035b566e4b86c74b84866f12b5b0b6541506a080fb67d6d"}, - {file = "orjson-3.8.0-cp37-cp37m-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = 
"sha256:e2f4a5542f50e3d336a18cb224fc757245ca66b1fd0b70b5dd4471b8ff5f2b0e"}, - {file = "orjson-3.8.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e1418feeb8b698b9224b1f024555895169d481604d5d884498c1838d7412794c"}, - {file = "orjson-3.8.0-cp37-cp37m-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6e3da2e4bd27c3b796519ca74132c7b9e5348fb6746315e0f6c1592bc5cf1caf"}, - {file = "orjson-3.8.0-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:896a21a07f1998648d9998e881ab2b6b80d5daac4c31188535e9d50460edfcf7"}, - {file = "orjson-3.8.0-cp37-cp37m-manylinux_2_28_aarch64.whl", hash = "sha256:4065906ce3ad6195ac4d1bddde862fe811a42d7be237a1ff762666c3a4bb2151"}, - {file = "orjson-3.8.0-cp37-cp37m-manylinux_2_28_x86_64.whl", hash = "sha256:5f856279872a4449fc629924e6a083b9821e366cf98b14c63c308269336f7c14"}, - {file = "orjson-3.8.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:1b1cd25acfa77935bb2e791b75211cec0cfc21227fe29387e553c545c3ff87e1"}, - {file = "orjson-3.8.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:3e2459d441ab8fd8b161aa305a73d5269b3cda13b5a2a39eba58b4dd3e394f49"}, - {file = "orjson-3.8.0-cp37-none-win_amd64.whl", hash = "sha256:d2b5dafbe68237a792143137cba413447f60dd5df428e05d73dcba10c1ea6fcf"}, - {file = "orjson-3.8.0-cp38-cp38-macosx_10_7_x86_64.whl", hash = "sha256:5b072ef8520cfe7bd4db4e3c9972d94336763c2253f7c4718a49e8733bada7b8"}, - {file = "orjson-3.8.0-cp38-cp38-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:e68c699471ea3e2dd1b35bfd71c6a0a0e4885b64abbe2d98fce1ef11e0afaff3"}, - {file = "orjson-3.8.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3c7225e8b08996d1a0c804d3a641a53e796685e8c9a9fd52bd428980032cad9a"}, - {file = "orjson-3.8.0-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8f687776a03c19f40b982fb5c414221b7f3d19097841571be2223d1569a59877"}, - {file = "orjson-3.8.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7990a9caf3b34016ac30be5e6cfc4e7efd76aa85614a1215b0eae4f0c7e3db59"}, - {file = "orjson-3.8.0-cp38-cp38-manylinux_2_28_aarch64.whl", hash = "sha256:02d638d43951ba346a80f0abd5942a872cc87db443e073f6f6fc530fee81e19b"}, - {file = "orjson-3.8.0-cp38-cp38-manylinux_2_28_x86_64.whl", hash = "sha256:f4b46dbdda2f0bd6480c39db90b21340a19c3b0fcf34bc4c6e465332930ca539"}, - {file = "orjson-3.8.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:655d7387a1634a9a477c545eea92a1ee902ab28626d701c6de4914e2ed0fecd2"}, - {file = "orjson-3.8.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:5edb93cdd3eb32977633fa7aaa6a34b8ab54d9c49cdcc6b0d42c247a29091b22"}, - {file = "orjson-3.8.0-cp38-none-win_amd64.whl", hash = "sha256:03ed95814140ff09f550b3a42e6821f855d981c94d25b9cc83e8cca431525d70"}, - {file = "orjson-3.8.0-cp39-cp39-macosx_10_7_x86_64.whl", hash = "sha256:7b0e72974a5d3b101226899f111368ec2c9824d3e9804af0e5b31567f53ad98a"}, - {file = "orjson-3.8.0-cp39-cp39-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:6ea5fe20ef97545e14dd4d0263e4c5c3bc3d2248d39b4b0aed4b84d528dfc0af"}, - {file = "orjson-3.8.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:6433c956f4a18112342a18281e0bec67fcd8b90be3a5271556c09226e045d805"}, - {file = "orjson-3.8.0-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:87462791dd57de2e3e53068bf4b7169c125c50960f1bdda08ed30c797cb42a56"}, - {file = 
"orjson-3.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:be02f6acee33bb63862eeff80548cd6b8a62e2d60ad2d8dfd5a8824cc43d8887"}, - {file = "orjson-3.8.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:a709c2249c1f2955dbf879506fd43fa08c31fdb79add9aeb891e3338b648bf60"}, - {file = "orjson-3.8.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:2065b6d280dc58f131ffd93393737961ff68ae7eb6884b68879394074cc03c13"}, - {file = "orjson-3.8.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:5fd6cac83136e06e538a4d17117eaeabec848c1e86f5742d4811656ad7ee475f"}, - {file = "orjson-3.8.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:25b5e48fbb9f0b428a5e44cf740675c9281dd67816149fc33659803399adbbe8"}, - {file = "orjson-3.8.0-cp39-none-win_amd64.whl", hash = "sha256:2058653cc12b90e482beacb5c2d52dc3d7606f9e9f5a52c1c10ef49371e76f52"}, - {file = "orjson-3.8.0.tar.gz", hash = "sha256:fb42f7cf57d5804a9daa6b624e3490ec9e2631e042415f3aebe9f35a8492ba6c"}, -] packaging = [ {file = "packaging-21.3-py3-none-any.whl", hash = "sha256:ef103e05f519cdc783ae24ea4e2e0f508a9c99b2d4969652eed6a2e1ea5bd522"}, {file = "packaging-21.3.tar.gz", hash = "sha256:dd47c42927d89ab911e606518907cc2d3a1f38bbd026385970643f9c5b8ecfeb"}, diff --git a/pyproject.toml b/pyproject.toml index cbf0b7ade..843738617 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -28,7 +28,7 @@ flask-migrate = "*" flask-restful = "*" werkzeug = "*" SpiffWorkflow = {git = "https://github.com/sartography/SpiffWorkflow", rev = "main"} -# SpiffWorkflow = {develop = true, path = "../SpiffWorkflow" } +#SpiffWorkflow = {develop = true, path = "../SpiffWorkflow" } sentry-sdk = "^1.10" sphinx-autoapi = "^2.0" flask-bpmn = {git = "https://github.com/sartography/flask-bpmn", rev = "main"} @@ -48,7 +48,6 @@ APScheduler = "*" Jinja2 = "^3.1.2" RestrictedPython = "^6.0" Flask-SQLAlchemy = "^3" -orjson = "^3.8.0" # type hinting stuff # these need to be in the normal (non dev-dependencies) section diff --git a/src/spiffworkflow_backend/__init__.py b/src/spiffworkflow_backend/__init__.py index 92c11037a..3ad842654 100644 --- a/src/spiffworkflow_backend/__init__.py +++ b/src/spiffworkflow_backend/__init__.py @@ -1,6 +1,7 @@ """__init__.""" import faulthandler import os +import sys from typing import Any import connexion # type: ignore @@ -94,14 +95,6 @@ def create_app() -> flask.app.Flask: app.config["CONNEXION_APP"] = connexion_app app.config["SESSION_TYPE"] = "filesystem" - if os.environ.get("FLASK_SESSION_SECRET_KEY") is None: - raise KeyError( - "Cannot find the secret_key from the environment. Please set" - " FLASK_SESSION_SECRET_KEY" - ) - - app.secret_key = os.environ.get("FLASK_SESSION_SECRET_KEY") - setup_config(app) db.init_app(app) migrate.init_app(app, db) @@ -174,10 +167,9 @@ def traces_sampler(sampling_context: Any) -> Any: # tasks_controller.task_submit # this is the current pain point as of 31 jan 2023. 
- if ( - path_info - and path_info.startswith("/v1.0/tasks/") - and request_method == "PUT" + if path_info and ( + (path_info.startswith("/v1.0/tasks/") and request_method == "PUT") + or (path_info.startswith("/v1.0/task-data/") and request_method == "GET") ): return 1 diff --git a/src/spiffworkflow_backend/api.yml b/src/spiffworkflow_backend/api.yml index 842491c24..c6a04ed05 100755 --- a/src/spiffworkflow_backend/api.yml +++ b/src/spiffworkflow_backend/api.yml @@ -1518,7 +1518,7 @@ paths: items: $ref: "#/components/schemas/Task" - /task-data/{modified_process_model_identifier}/{process_instance_id}: + /task-data/{modified_process_model_identifier}/{process_instance_id}/{spiff_step}: parameters: - name: modified_process_model_identifier in: path @@ -1532,32 +1532,24 @@ paths: description: The unique id of an existing process instance. schema: type: integer - - name: all_tasks - in: query - required: false - description: If true, this wil return all tasks associated with the process instance and not just user tasks. - schema: - type: boolean - name: spiff_step - in: query - required: false + in: path + required: true description: If set will return the tasks as they were during a specific step of execution. schema: type: integer get: + operationId: spiffworkflow_backend.routes.tasks_controller.task_data_show + summary: Get task data for a single task in a spiff step. tags: - Process Instances - operationId: spiffworkflow_backend.routes.process_instances_controller.process_instance_task_list_with_task_data - summary: returns the list of all user tasks associated with process instance with the task data responses: "200": description: list of tasks content: application/json: schema: - type: array - items: - $ref: "#/components/schemas/Task" + $ref: "#/components/schemas/Task" /task-data/{modified_process_model_identifier}/{process_instance_id}/{task_id}: parameters: @@ -1579,6 +1571,12 @@ paths: description: The unique id of the task. schema: type: string + - name: spiff_step + in: query + required: false + description: If set will return the tasks as they were during a specific step of execution. + schema: + type: integer put: operationId: spiffworkflow_backend.routes.process_api_blueprint.task_data_update summary: Update the task data for requested instance and task @@ -1880,19 +1878,19 @@ paths: schema: $ref: "#/components/schemas/Workflow" - /messages/{message_identifier}: + /messages/{message_name}: parameters: - - name: message_identifier + - name: message_name in: path required: true - description: The unique identifier of the message model. + description: The unique name of the message. schema: type: string post: tags: - Messages operationId: spiffworkflow_backend.routes.messages_controller.message_send - summary: Instantiate and run a given process model with a message start event matching given identifier + summary: Instantiate and run a given process model with a message start event matching given name requestBody: content: application/json: diff --git a/src/spiffworkflow_backend/config/__init__.py b/src/spiffworkflow_backend/config/__init__.py index ad5dcb0f5..a9d99b950 100644 --- a/src/spiffworkflow_backend/config/__init__.py +++ b/src/spiffworkflow_backend/config/__init__.py @@ -129,6 +129,14 @@ def setup_config(app: Flask) -> None: "SPIFFWORKFLOW_BACKEND_BPMN_SPEC_ABSOLUTE_DIR config must be set" ) + if app.config["FLASK_SESSION_SECRET_KEY"] is None: + raise KeyError( + "Cannot find the secret_key from the environment. 
Please set" + " FLASK_SESSION_SECRET_KEY" + ) + + app.secret_key = os.environ.get("FLASK_SESSION_SECRET_KEY") + app.config["PROCESS_UUID"] = uuid.uuid4() setup_database_uri(app) diff --git a/src/spiffworkflow_backend/config/default.py b/src/spiffworkflow_backend/config/default.py index a08ef3e01..8bb0c1919 100644 --- a/src/spiffworkflow_backend/config/default.py +++ b/src/spiffworkflow_backend/config/default.py @@ -6,6 +6,8 @@ from os import environ # and from_prefixed_env(), though we want to ensure that these variables are all documented, so that # is a benefit of the status quo and having them all in this file explicitly. +FLASK_SESSION_SECRET_KEY = environ.get("FLASK_SESSION_SECRET_KEY") + SPIFFWORKFLOW_BACKEND_BPMN_SPEC_ABSOLUTE_DIR = environ.get( "SPIFFWORKFLOW_BACKEND_BPMN_SPEC_ABSOLUTE_DIR" ) diff --git a/src/spiffworkflow_backend/config/permissions/example.yml b/src/spiffworkflow_backend/config/permissions/example.yml index 048f9de64..77d87ca8d 100644 --- a/src/spiffworkflow_backend/config/permissions/example.yml +++ b/src/spiffworkflow_backend/config/permissions/example.yml @@ -28,6 +28,7 @@ groups: users: [ admin@spiffworkflow.org, + nelson@spiffworkflow.org ] permissions: diff --git a/src/spiffworkflow_backend/exceptions/api_error.py b/src/spiffworkflow_backend/exceptions/api_error.py index 5fff05c2d..39ab1335a 100644 --- a/src/spiffworkflow_backend/exceptions/api_error.py +++ b/src/spiffworkflow_backend/exceptions/api_error.py @@ -157,7 +157,7 @@ class ApiError(Exception): error_line=exp.error_line, task_trace=exp.task_trace, ) - elif isinstance(exp, WorkflowException): + elif isinstance(exp, WorkflowException) and exp.task_spec: return ApiError.from_task_spec(error_code, message, exp.task_spec) else: return ApiError("workflow_error", str(exp)) @@ -253,7 +253,7 @@ def handle_exception(exception: Exception) -> flask.wrappers.Response: else: api_exception = ApiError( error_code=error_code, - message=f"{exception.__class__.__name__}", + message=f"{exception.__class__.__name__} {str(exception)}", sentry_link=sentry_link, status_code=status_code, ) diff --git a/src/spiffworkflow_backend/load_database_models.py b/src/spiffworkflow_backend/load_database_models.py index 2ebd11347..8e79e0135 100644 --- a/src/spiffworkflow_backend/load_database_models.py +++ b/src/spiffworkflow_backend/load_database_models.py @@ -21,13 +21,9 @@ from spiffworkflow_backend.models.human_task import HumanTaskModel # noqa: F401 from spiffworkflow_backend.models.spec_reference import ( SpecReferenceCache, ) # noqa: F401 -from spiffworkflow_backend.models.message_correlation_property import ( - MessageCorrelationPropertyModel, -) # noqa: F401 from spiffworkflow_backend.models.message_instance import ( MessageInstanceModel, ) # noqa: F401 -from spiffworkflow_backend.models.message_model import MessageModel # noqa: F401 from spiffworkflow_backend.models.message_triggerable_process_model import ( MessageTriggerableProcessModel, ) # noqa: F401 diff --git a/src/spiffworkflow_backend/models/correlation_property_cache.py b/src/spiffworkflow_backend/models/correlation_property_cache.py new file mode 100644 index 000000000..dd311b934 --- /dev/null +++ b/src/spiffworkflow_backend/models/correlation_property_cache.py @@ -0,0 +1,24 @@ +"""Message_correlation.""" +from dataclasses import dataclass + +from spiffworkflow_backend.models.db import db +from spiffworkflow_backend.models.db import SpiffworkflowBaseDBModel + + +@dataclass +class CorrelationPropertyCache(SpiffworkflowBaseDBModel): + """A list of known correlation 
properties as read from BPMN files. + + These correlation properties are not directly linked to anything, + but they provide a way to know which processes are talking about + which messages and correlation keys. This could be useful as an + API endpoint if you wanted to know what another process model + is using. + """ + + __tablename__ = "correlation_property_cache" + id = db.Column(db.Integer, primary_key=True) + name: str = db.Column(db.String(50), nullable=False) + message_name: str = db.Column(db.String(50), nullable=False) + process_model_id: str = db.Column(db.String(255), nullable=False) + retrieval_expression: str = db.Column(db.String(255)) diff --git a/src/spiffworkflow_backend/models/human_task.py b/src/spiffworkflow_backend/models/human_task.py index 3317f7732..5cc208b19 100644 --- a/src/spiffworkflow_backend/models/human_task.py +++ b/src/spiffworkflow_backend/models/human_task.py @@ -34,6 +34,10 @@ class HumanTaskModel(SpiffworkflowBaseDBModel): lane_assignment_id: int | None = db.Column(ForeignKey(GroupModel.id)) completed_by_user_id: int = db.Column(ForeignKey(UserModel.id), nullable=True) # type: ignore + completed_by_user = relationship( + "UserModel", foreign_keys=[completed_by_user_id], viewonly=True + ) + actual_owner_id: int = db.Column(ForeignKey(UserModel.id)) # type: ignore # actual_owner: RelationshipProperty[UserModel] = relationship(UserModel) @@ -49,6 +53,7 @@ class HumanTaskModel(SpiffworkflowBaseDBModel): task_type: str = db.Column(db.String(50)) task_status: str = db.Column(db.String(50)) process_model_display_name: str = db.Column(db.String(255)) + bpmn_process_identifier: str = db.Column(db.String(255)) completed: bool = db.Column(db.Boolean, default=False, nullable=False, index=True) human_task_users = relationship("HumanTaskUserModel", cascade="delete") @@ -74,8 +79,8 @@ class HumanTaskModel(SpiffworkflowBaseDBModel): new_task.process_model_display_name = task.process_model_display_name if hasattr(task, "process_group_identifier"): new_task.process_group_identifier = task.process_group_identifier - if hasattr(task, "process_model_identifier"): - new_task.process_model_identifier = task.process_model_identifier + if hasattr(task, "bpmn_process_identifier"): + new_task.bpmn_process_identifier = task.bpmn_process_identifier # human tasks only have status when getting the list on the home page # and it comes from the process_instance. it should not be confused with task_status.
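The message rework in the rest of this patch (see the new correlates() and payload_matches_expected_values() methods on MessageInstanceModel below) reduces to one rule: a receive message carries named retrieval expressions plus expected correlation values, and a send message matches when evaluating each required expression against its payload reproduces the expected value. Here is a minimal, self-contained sketch of that rule; the names SimpleRule and payload_matches are illustrative only, and plain eval() stands in for the restricted PythonScriptEngine the backend actually uses.

from dataclasses import dataclass


@dataclass
class SimpleRule:
    name: str                  # correlation property name, e.g. "invoice_id"
    retrieval_expression: str  # expression evaluated against a send message's payload


def payload_matches(payload: dict, expected_values: dict, rules: list) -> bool:
    # A send payload matches when every expected value is reproduced by its
    # retrieval expression; properties with no expected value are skipped.
    for rule in rules:
        expected = expected_values.get(rule.name)
        if expected is None:
            continue
        try:
            result = eval(rule.retrieval_expression, {}, payload)  # sketch only
        except Exception:
            return False  # a failed evaluation is a non-match, not an error
        if result != expected:
            return False
    return True


rules = [SimpleRule(name="invoice_id", retrieval_expression="invoice['id']")]
send_payload = {"invoice": {"id": 42, "total": 100}}
assert payload_matches(send_payload, {"invoice_id": 42}, rules)
assert not payload_matches(send_payload, {"invoice_id": 99}, rules)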
diff --git a/src/spiffworkflow_backend/models/message_correlation.py b/src/spiffworkflow_backend/models/message_correlation.py deleted file mode 100644 index e913938f5..000000000 --- a/src/spiffworkflow_backend/models/message_correlation.py +++ /dev/null @@ -1,50 +0,0 @@ -"""Message_correlation.""" -from dataclasses import dataclass -from typing import TYPE_CHECKING - -from sqlalchemy import ForeignKey -from sqlalchemy.orm import relationship - -from spiffworkflow_backend.models.db import db -from spiffworkflow_backend.models.db import SpiffworkflowBaseDBModel -from spiffworkflow_backend.models.message_correlation_property import ( - MessageCorrelationPropertyModel, -) -from spiffworkflow_backend.models.process_instance import ProcessInstanceModel - -if TYPE_CHECKING: - from spiffworkflow_backend.models.message_correlation_message_instance import ( # noqa: F401 - MessageCorrelationMessageInstanceModel, - ) - - -@dataclass -class MessageCorrelationModel(SpiffworkflowBaseDBModel): - """Message Correlations to relate queued messages together.""" - - __tablename__ = "message_correlation" - __table_args__ = ( - db.UniqueConstraint( - "process_instance_id", - "message_correlation_property_id", - "name", - name="message_instance_id_name_unique", - ), - ) - - id = db.Column(db.Integer, primary_key=True) - process_instance_id = db.Column( - ForeignKey(ProcessInstanceModel.id), nullable=False, index=True # type: ignore - ) - message_correlation_property_id = db.Column( - ForeignKey(MessageCorrelationPropertyModel.id), nullable=False, index=True - ) - name = db.Column(db.String(255), nullable=False, index=True) - value = db.Column(db.String(255), nullable=False, index=True) - updated_at_in_seconds: int = db.Column(db.Integer) - created_at_in_seconds: int = db.Column(db.Integer) - - message_correlation_property = relationship("MessageCorrelationPropertyModel") - message_correlations_message_instances = relationship( - "MessageCorrelationMessageInstanceModel", cascade="delete" - ) diff --git a/src/spiffworkflow_backend/models/message_correlation_message_instance.py b/src/spiffworkflow_backend/models/message_correlation_message_instance.py deleted file mode 100644 index 58ded838b..000000000 --- a/src/spiffworkflow_backend/models/message_correlation_message_instance.py +++ /dev/null @@ -1,32 +0,0 @@ -"""Message_correlation_message_instance.""" -from dataclasses import dataclass - -from sqlalchemy import ForeignKey - -from spiffworkflow_backend.models.db import db -from spiffworkflow_backend.models.db import SpiffworkflowBaseDBModel -from spiffworkflow_backend.models.message_correlation import MessageCorrelationModel -from spiffworkflow_backend.models.message_instance import MessageInstanceModel - - -@dataclass -class MessageCorrelationMessageInstanceModel(SpiffworkflowBaseDBModel): - """MessageCorrelationMessageInstanceModel.""" - - __tablename__ = "message_correlation_message_instance" - - __table_args__ = ( - db.UniqueConstraint( - "message_instance_id", - "message_correlation_id", - name="message_correlation_message_instance_unique", - ), - ) - - id = db.Column(db.Integer, primary_key=True) - message_instance_id = db.Column( - ForeignKey(MessageInstanceModel.id), nullable=False, index=True # type: ignore - ) - message_correlation_id = db.Column( - ForeignKey(MessageCorrelationModel.id), nullable=False, index=True - ) diff --git a/src/spiffworkflow_backend/models/message_correlation_property.py b/src/spiffworkflow_backend/models/message_correlation_property.py deleted file mode 100644 index 
1e09dc0c4..000000000 --- a/src/spiffworkflow_backend/models/message_correlation_property.py +++ /dev/null @@ -1,25 +0,0 @@ -"""Message_correlation_property.""" -from sqlalchemy import ForeignKey - -from spiffworkflow_backend.models.db import db -from spiffworkflow_backend.models.db import SpiffworkflowBaseDBModel -from spiffworkflow_backend.models.message_model import MessageModel - - -class MessageCorrelationPropertyModel(SpiffworkflowBaseDBModel): - """MessageCorrelationPropertyModel.""" - - __tablename__ = "message_correlation_property" - __table_args__ = ( - db.UniqueConstraint( - "identifier", - "message_model_id", - name="message_correlation_property_unique", - ), - ) - - id = db.Column(db.Integer, primary_key=True) - identifier = db.Column(db.String(50), index=True) - message_model_id = db.Column(ForeignKey(MessageModel.id), nullable=False) - updated_at_in_seconds: int = db.Column(db.Integer) - created_at_in_seconds: int = db.Column(db.Integer) diff --git a/src/spiffworkflow_backend/models/message_instance.py b/src/spiffworkflow_backend/models/message_instance.py index c9ea515e6..99168ec37 100644 --- a/src/spiffworkflow_backend/models/message_instance.py +++ b/src/spiffworkflow_backend/models/message_instance.py @@ -5,6 +5,7 @@ from typing import Any from typing import Optional from typing import TYPE_CHECKING +from SpiffWorkflow.bpmn.PythonScriptEngine import PythonScriptEngine # type: ignore from sqlalchemy import ForeignKey from sqlalchemy.event import listens_for from sqlalchemy.orm import relationship @@ -13,12 +14,12 @@ from sqlalchemy.orm import validates from spiffworkflow_backend.models.db import db from spiffworkflow_backend.models.db import SpiffworkflowBaseDBModel -from spiffworkflow_backend.models.message_model import MessageModel from spiffworkflow_backend.models.process_instance import ProcessInstanceModel +from spiffworkflow_backend.models.user import UserModel if TYPE_CHECKING: - from spiffworkflow_backend.models.message_correlation_message_instance import ( # noqa: F401 - MessageCorrelationMessageInstanceModel, + from spiffworkflow_backend.models.message_instance_correlation import ( # noqa: F401 + MessageInstanceCorrelationRuleModel, ) @@ -45,21 +46,25 @@ class MessageInstanceModel(SpiffworkflowBaseDBModel): __tablename__ = "message_instance" id: int = db.Column(db.Integer, primary_key=True) - process_instance_id: int = db.Column(ForeignKey(ProcessInstanceModel.id), nullable=False) # type: ignore - message_model_id: int = db.Column(ForeignKey(MessageModel.id), nullable=False) - message_model = relationship("MessageModel") - message_correlations_message_instances = relationship( - "MessageCorrelationMessageInstanceModel", cascade="delete" - ) - + process_instance_id: int = db.Column(ForeignKey(ProcessInstanceModel.id), nullable=True) # type: ignore + name: str = db.Column(db.String(255)) message_type: str = db.Column(db.String(20), nullable=False) - payload: str = db.Column(db.JSON) + # Only Send Messages have a payload + payload: dict = db.Column(db.JSON) + # The correlation keys of the process at the time the message was created. + correlation_keys: dict = db.Column(db.JSON) status: str = db.Column(db.String(20), nullable=False, default="ready") + user_id: int = db.Column(ForeignKey(UserModel.id), nullable=False) # type: ignore + user = relationship("UserModel") + counterpart_id: int = db.Column( + db.Integer + ) # Not enforcing self-referential foreign key so we can delete messages. 
failure_cause: str = db.Column(db.Text()) updated_at_in_seconds: int = db.Column(db.Integer) created_at_in_seconds: int = db.Column(db.Integer) - - message_correlations: Optional[dict] = None + correlation_rules = relationship( + "MessageInstanceCorrelationRuleModel", back_populates="message_instance" + ) @validates("message_type") def validate_message_type(self, key: str, value: Any) -> Any: @@ -71,12 +76,82 @@ class MessageInstanceModel(SpiffworkflowBaseDBModel): """Validate_status.""" return self.validate_enum_field(key, value, MessageStatuses) + def correlates(self, other: Any, expression_engine: PythonScriptEngine) -> bool: + """Returns True if this message correlates with the given message. + + This must be a 'receive' message, and the other must be a 'send' or vice versa. + If both messages have identical correlation_keys, they are a match. Otherwise + we check through this message's correlation properties and use the retrieval expressions + to extract the correlation keys from the send's payload, and verify that these + match up with the correlation keys on this message. + """ + if self.is_send() and other.is_receive(): + # Flip the call. + return other.correlates(self, expression_engine) # type: ignore + + if self.name != other.name: + return False + if not self.is_receive(): + return False + if ( + isinstance(self.correlation_keys, dict) + and self.correlation_keys == other.correlation_keys + ): + # We know we have a match, and we can just return if we don't have to figure out the key + return True + + if self.correlation_keys == {}: + # Then there is nothing more to match on -- we accept any message with the given name. + return True + + # Loop over the receive's correlation keys - if any of the keys fully match, then we match. + for expected_values in self.correlation_keys.values(): + if self.payload_matches_expected_values( + other.payload, expected_values, expression_engine + ): + return True + return False + + def is_receive(self) -> bool: + return self.message_type == MessageTypes.receive.value + + def is_send(self) -> bool: + return self.message_type == MessageTypes.send.value + + def payload_matches_expected_values( + self, + payload: dict, + expected_values: dict, + expression_engine: PythonScriptEngine, + ) -> bool: + """Compares the payload of a 'send' message against a single correlation key's expected values.""" + for correlation_key in self.correlation_rules: + expected_value = expected_values.get(correlation_key.name, None) + if ( + expected_value is None + ): # This key is not required for this instance to match. + continue + try: + result = expression_engine._evaluate( + correlation_key.retrieval_expression, payload + ) + except Exception: + # the failure of a payload evaluation may not mean that matches for these + # message instances can't happen with other messages. So don't error up. + # fixme: Perhaps log some sort of error.
+ return False + if result != expected_value: + return False + return True + # This runs for ALL db flushes for ANY model, not just this one even if it's in the MessageInstanceModel class # so this may not be worth it or there may be a better way to do it # # https://stackoverflow.com/questions/32555829/flask-validates-decorator-multiple-fields-simultaneously/33025472#33025472 # https://docs.sqlalchemy.org/en/14/orm/session_events.html#before-flush + + @listens_for(Session, "before_flush") # type: ignore def ensure_failure_cause_is_set_if_message_instance_failed( session: Any, _flush_context: Optional[Any], _instances: Optional[Any] diff --git a/src/spiffworkflow_backend/models/message_instance_correlation.py b/src/spiffworkflow_backend/models/message_instance_correlation.py new file mode 100644 index 000000000..7431a273f --- /dev/null +++ b/src/spiffworkflow_backend/models/message_instance_correlation.py @@ -0,0 +1,41 @@ +"""Message_correlation.""" +from dataclasses import dataclass + +from sqlalchemy import ForeignKey +from sqlalchemy.orm import relationship + +from spiffworkflow_backend.models.db import db +from spiffworkflow_backend.models.db import SpiffworkflowBaseDBModel +from spiffworkflow_backend.models.message_instance import MessageInstanceModel + + +@dataclass +class MessageInstanceCorrelationRuleModel(SpiffworkflowBaseDBModel): + """These are the correlations of a specific Message Instance. + + These will only exist on receive messages. Each rule provides the expression to run on + a send message's payload, which must match the receive message's correlation_keys dictionary + to be considered a valid match. If the expected value is null, then it does not need to + match, but the expression should still evaluate and produce a result. + """ + + __tablename__ = "message_instance_correlation_rule" + __table_args__ = ( + db.UniqueConstraint( + "message_instance_id", + "name", + name="message_instance_id_name_unique", + ), + ) + + id = db.Column(db.Integer, primary_key=True) + message_instance_id = db.Column( + ForeignKey(MessageInstanceModel.id), nullable=False, index=True # type: ignore + ) + name: str = db.Column(db.String(50), nullable=False) + retrieval_expression: str = db.Column(db.String(255)) + updated_at_in_seconds: int = db.Column(db.Integer) + created_at_in_seconds: int = db.Column(db.Integer) + message_instance = relationship( + "MessageInstanceModel", back_populates="correlation_rules" + ) diff --git a/src/spiffworkflow_backend/models/message_model.py b/src/spiffworkflow_backend/models/message_model.py deleted file mode 100644 index 8ebd15c56..000000000 --- a/src/spiffworkflow_backend/models/message_model.py +++ /dev/null @@ -1,13 +0,0 @@ -"""Message_model.""" -from spiffworkflow_backend.models.db import db -from spiffworkflow_backend.models.db import SpiffworkflowBaseDBModel - - -class MessageModel(SpiffworkflowBaseDBModel): - """MessageModel.""" - - __tablename__ = "message_model" - - id = db.Column(db.Integer, primary_key=True) - identifier = db.Column(db.String(50), unique=True, index=True) - name = db.Column(db.String(50), unique=True, index=True) diff --git a/src/spiffworkflow_backend/models/message_triggerable_process_model.py b/src/spiffworkflow_backend/models/message_triggerable_process_model.py index edf648218..24a66f3a2 100644 --- a/src/spiffworkflow_backend/models/message_triggerable_process_model.py +++ b/src/spiffworkflow_backend/models/message_triggerable_process_model.py @@ -1,9 +1,6 @@ """Message_correlation_property.""" -from sqlalchemy import ForeignKey - from
spiffworkflow_backend.models.db import db from spiffworkflow_backend.models.db import SpiffworkflowBaseDBModel -from spiffworkflow_backend.models.message_model import MessageModel class MessageTriggerableProcessModel(SpiffworkflowBaseDBModel): @@ -12,10 +9,7 @@ class MessageTriggerableProcessModel(SpiffworkflowBaseDBModel): __tablename__ = "message_triggerable_process_model" id = db.Column(db.Integer, primary_key=True) - message_model_id = db.Column( - ForeignKey(MessageModel.id), nullable=False, unique=True - ) + message_name: str = db.Column(db.String(255)) process_model_identifier: str = db.Column(db.String(50), nullable=False, index=True) - updated_at_in_seconds: int = db.Column(db.Integer) created_at_in_seconds: int = db.Column(db.Integer) diff --git a/src/spiffworkflow_backend/models/process_instance.py b/src/spiffworkflow_backend/models/process_instance.py index 95930958b..62f5af3f3 100644 --- a/src/spiffworkflow_backend/models/process_instance.py +++ b/src/spiffworkflow_backend/models/process_instance.py @@ -74,7 +74,6 @@ class ProcessInstanceModel(SpiffworkflowBaseDBModel): overlaps="active_human_tasks", ) # type: ignore message_instances = relationship("MessageInstanceModel", cascade="delete") # type: ignore - message_correlations = relationship("MessageCorrelationModel", cascade="delete") # type: ignore process_metadata = relationship( "ProcessInstanceMetadataModel", cascade="delete", @@ -144,6 +143,10 @@ class ProcessInstanceModel(SpiffworkflowBaseDBModel): """Can_submit_task.""" return not self.has_terminal_status() and self.status != "suspended" + def can_receive_message(self) -> bool: + """If this process can currently accept messages.""" + return not self.has_terminal_status() and self.status != "suspended" + def has_terminal_status(self) -> bool: """Has_terminal_status.""" return self.status in self.terminal_statuses() diff --git a/src/spiffworkflow_backend/models/task.py b/src/spiffworkflow_backend/models/task.py index 413be5e5a..148df2310 100644 --- a/src/spiffworkflow_backend/models/task.py +++ b/src/spiffworkflow_backend/models/task.py @@ -22,78 +22,7 @@ class MultiInstanceType(enum.Enum): class Task: """Task.""" - ########################################################################## - # Custom properties and validations defined in Camunda form fields # - ########################################################################## - - # Custom task title - PROP_EXTENSIONS_TITLE = "display_name" - PROP_EXTENSIONS_CLEAR_DATA = "clear_data" - - # Field Types - FIELD_TYPE_STRING = "string" - FIELD_TYPE_LONG = "long" - FIELD_TYPE_BOOLEAN = "boolean" - FIELD_TYPE_DATE = "date" - FIELD_TYPE_ENUM = "enum" - FIELD_TYPE_TEXTAREA = "textarea" # textarea: Multiple lines of text - FIELD_TYPE_AUTO_COMPLETE = "autocomplete" - FIELD_TYPE_FILE = "file" - FIELD_TYPE_FILES = "files" # files: Multiple files - FIELD_TYPE_TEL = "tel" # tel: Phone number - FIELD_TYPE_EMAIL = "email" # email: Email address - FIELD_TYPE_URL = "url" # url: Website address - - FIELD_PROP_AUTO_COMPLETE_MAX = ( # Not used directly, passed in from the front end. 
- "autocomplete_num" - ) - - # Required field - FIELD_CONSTRAINT_REQUIRED = "required" - - # Field properties and expressions Expressions - FIELD_PROP_REPEAT = "repeat" - FIELD_PROP_READ_ONLY = "read_only" - FIELD_PROP_LDAP_LOOKUP = "ldap.lookup" - FIELD_PROP_READ_ONLY_EXPRESSION = "read_only_expression" - FIELD_PROP_HIDE_EXPRESSION = "hide_expression" - FIELD_PROP_REQUIRED_EXPRESSION = "required_expression" - FIELD_PROP_LABEL_EXPRESSION = "label_expression" - FIELD_PROP_REPEAT_HIDE_EXPRESSION = "repeat_hide_expression" - FIELD_PROP_VALUE_EXPRESSION = "value_expression" - - # Enum field options - FIELD_PROP_SPREADSHEET_NAME = "spreadsheet.name" - FIELD_PROP_DATA_NAME = "data.name" - FIELD_PROP_VALUE_COLUMN = "value.column" - FIELD_PROP_LABEL_COLUMN = "label.column" - - # Enum field options values pulled from task data - - # Group and Repeat functions - FIELD_PROP_GROUP = "group" - FIELD_PROP_REPLEAT = "repeat" - FIELD_PROP_REPLEAT_TITLE = "repeat_title" - FIELD_PROP_REPLEAT_BUTTON = "repeat_button_label" - - # File specific field properties - FIELD_PROP_DOC_CODE = "doc_code" # to associate a file upload field with a doc code - FIELD_PROP_FILE_DATA = ( # to associate a bit of data with a specific file upload file. - "file_data" - ) - - # Additional properties - FIELD_PROP_ENUM_TYPE = "enum_type" - FIELD_PROP_BOOLEAN_TYPE = "boolean_type" - FIELD_PROP_TEXT_AREA_ROWS = "rows" - FIELD_PROP_TEXT_AREA_COLS = "cols" - FIELD_PROP_TEXT_AREA_AUTO = "autosize" - FIELD_PROP_PLACEHOLDER = "placeholder" - FIELD_PROP_DESCRIPTION = "description" - FIELD_PROP_MARKDOWN_DESCRIPTION = "markdown_description" - FIELD_PROP_HELP = "help" - - ########################################################################## + HUMAN_TASK_TYPES = ["User Task", "Manual Task"] def __init__( self, @@ -116,6 +45,7 @@ class Task: process_model_display_name: Union[str, None] = None, process_group_identifier: Union[str, None] = None, process_model_identifier: Union[str, None] = None, + bpmn_process_identifier: Union[str, None] = None, form_schema: Union[dict, None] = None, form_ui_schema: Union[dict, None] = None, parent: Optional[str] = None, @@ -147,6 +77,7 @@ class Task: self.process_instance_status = process_instance_status self.process_group_identifier = process_group_identifier self.process_model_identifier = process_model_identifier + self.bpmn_process_identifier = bpmn_process_identifier self.process_model_display_name = process_model_display_name self.form_schema = form_schema self.form_ui_schema = form_ui_schema @@ -193,6 +124,7 @@ class Task: "process_model_display_name": self.process_model_display_name, "process_group_identifier": self.process_group_identifier, "process_model_identifier": self.process_model_identifier, + "bpmn_process_identifier": self.bpmn_process_identifier, "form_schema": self.form_schema, "form_ui_schema": self.form_ui_schema, "parent": self.parent, @@ -202,20 +134,6 @@ class Task: "task_spiff_step": self.task_spiff_step, } - @classmethod - def valid_property_names(cls) -> list[str]: - """Valid_property_names.""" - return [ - value for name, value in vars(cls).items() if name.startswith("FIELD_PROP") - ] - - @classmethod - def valid_field_types(cls) -> list[str]: - """Valid_field_types.""" - return [ - value for name, value in vars(cls).items() if name.startswith("FIELD_TYPE") - ] - @classmethod def task_state_name_to_int(cls, task_state_name: str) -> int: task_state_integers = {v: k for k, v in TaskStateNames.items()} diff --git a/src/spiffworkflow_backend/models/user.py 
b/src/spiffworkflow_backend/models/user.py index 464bdc8b2..f32a35d79 100644 --- a/src/spiffworkflow_backend/models/user.py +++ b/src/spiffworkflow_backend/models/user.py @@ -2,6 +2,7 @@ from __future__ import annotations from dataclasses import dataclass +from typing import Any import jwt import marshmallow @@ -82,6 +83,13 @@ class UserModel(SpiffworkflowBaseDBModel): # # return instance + def as_dict(self) -> dict[str, Any]: + # dump the user using our json encoder and then load it back up as a dict + # to remove unwanted field types + user_as_json_string = current_app.json.dumps(self) + user_dict: dict[str, Any] = current_app.json.loads(user_as_json_string) + return user_dict + class UserModelSchema(Schema): """UserModelSchema.""" diff --git a/src/spiffworkflow_backend/routes/messages_controller.py b/src/spiffworkflow_backend/routes/messages_controller.py index 0db93a4a6..1c86fddbb 100644 --- a/src/spiffworkflow_backend/routes/messages_controller.py +++ b/src/spiffworkflow_backend/routes/messages_controller.py @@ -10,19 +10,11 @@ from flask import jsonify from flask import make_response from flask.wrappers import Response +from spiffworkflow_backend import db from spiffworkflow_backend.exceptions.api_error import ApiError -from spiffworkflow_backend.models.message_correlation import MessageCorrelationModel from spiffworkflow_backend.models.message_instance import MessageInstanceModel -from spiffworkflow_backend.models.message_model import MessageModel -from spiffworkflow_backend.models.message_triggerable_process_model import ( - MessageTriggerableProcessModel, -) from spiffworkflow_backend.models.process_instance import ProcessInstanceModel from spiffworkflow_backend.models.process_instance import ProcessInstanceModelSchema -from spiffworkflow_backend.models.process_instance import ProcessInstanceStatus -from spiffworkflow_backend.routes.process_api_blueprint import ( - _find_process_instance_by_id_or_raise, -) from spiffworkflow_backend.services.message_service import MessageService @@ -45,36 +37,14 @@ def message_instance_list( MessageInstanceModel.created_at_in_seconds.desc(), # type: ignore MessageInstanceModel.id.desc(), # type: ignore ) - .join(MessageModel, MessageModel.id == MessageInstanceModel.message_model_id) - .join(ProcessInstanceModel) + .outerjoin(ProcessInstanceModel) # Not all messages were created by a process .add_columns( - MessageModel.identifier.label("message_identifier"), ProcessInstanceModel.process_model_identifier, ProcessInstanceModel.process_model_display_name, ) .paginate(page=page, per_page=per_page, error_out=False) ) - for message_instance in message_instances: - message_correlations: dict = {} - for ( - mcmi - ) in ( - message_instance.MessageInstanceModel.message_correlations_message_instances - ): - mc = MessageCorrelationModel.query.filter_by( - id=mcmi.message_correlation_id - ).all() - for m in mc: - if m.name not in message_correlations: - message_correlations[m.name] = {} - message_correlations[m.name][ - m.message_correlation_property.identifier - ] = m.value - message_instance.MessageInstanceModel.message_correlations = ( - message_correlations - ) - response_json = { "results": message_instances.items, "pagination": { @@ -92,104 +62,58 @@ def message_instance_list( # process_instance_id: Optional[int], # } def message_send( - message_identifier: str, + message_name: str, body: Dict[str, Any], ) -> flask.wrappers.Response: """Message_start.""" - message_model = MessageModel.query.filter_by(identifier=message_identifier).first() - if 
message_model is None: - raise ( - ApiError( - error_code="unknown_message", - message=f"Could not find message with identifier: {message_identifier}", - status_code=404, - ) - ) - if "payload" not in body: raise ( ApiError( error_code="missing_payload", - message="Body is missing payload.", + message=( + "Please include a 'payload' in the JSON body that contains the" + " message contents." + ), status_code=400, ) ) process_instance = None - if "process_instance_id" in body: - # to make sure we have a valid process_instance_id - process_instance = _find_process_instance_by_id_or_raise( - body["process_instance_id"] - ) - if process_instance.status == ProcessInstanceStatus.suspended.value: - raise ApiError( - error_code="process_instance_is_suspended", + # Create the send message + message_instance = MessageInstanceModel( + message_type="send", + name=message_name, + payload=body["payload"], + user_id=g.user.id, + ) + db.session.add(message_instance) + db.session.commit() + try: + receiver_message = MessageService.correlate_send_message(message_instance) + except Exception as e: + db.session.delete(message_instance) + db.session.commit() + raise e + if not receiver_message: + db.session.delete(message_instance) + db.session.commit() + raise ( + ApiError( + error_code="message_not_accepted", message=( - f"Process Instance '{process_instance.id}' is suspended and cannot" - " accept messages.'" + "No running process instances correlate with the given message" + f" name of '{message_name}', and the message name is not" + " currently associated with any process Start Event, so there is" + " nothing to do." ), status_code=400, ) - - if process_instance.status == ProcessInstanceStatus.terminated.value: - raise ApiError( - error_code="process_instance_is_terminated", - message=( - f"Process Instance '{process_instance.id}' is terminated and cannot" - " accept messages.'" - ), - status_code=400, - ) - - message_instance = MessageInstanceModel.query.filter_by( - process_instance_id=process_instance.id, - message_model_id=message_model.id, - message_type="receive", - status="ready", - ).first() - if message_instance is None: - raise ( - ApiError( - error_code="cannot_find_waiting_message", - message=( - "Could not find waiting message for identifier" - f" {message_identifier} and process instance" - f" {process_instance.id}" - ), - status_code=400, - ) - ) - MessageService.process_message_receive( - message_instance, message_model.name, body["payload"] - ) - - else: - message_triggerable_process_model = ( - MessageTriggerableProcessModel.query.filter_by( - message_model_id=message_model.id - ).first() - ) - - if message_triggerable_process_model is None: - raise ( - ApiError( - error_code="cannot_start_message", - message=( - "Message with identifier cannot be start with message:" - f" {message_identifier}" - ), - status_code=400, - ) - ) - - process_instance = MessageService.process_message_triggerable_process_model( - message_triggerable_process_model, - message_model.name, - body["payload"], - g.user, ) + process_instance = ProcessInstanceModel.query.filter_by( + id=receiver_message.process_instance_id + ).first() return Response( json.dumps(ProcessInstanceModelSchema().dump(process_instance)), status=200, diff --git a/src/spiffworkflow_backend/routes/process_instances_controller.py b/src/spiffworkflow_backend/routes/process_instances_controller.py index 7d0b48379..94e6c0cf0 100644 --- a/src/spiffworkflow_backend/routes/process_instances_controller.py +++
b/src/spiffworkflow_backend/routes/process_instances_controller.py @@ -137,7 +137,7 @@ def process_instance_run( processor.unlock_process_instance("Web") if not current_app.config["SPIFFWORKFLOW_BACKEND_RUN_BACKGROUND_SCHEDULER"]: - MessageService.process_message_instances() + MessageService.correlate_all_message_instances() process_instance_api = ProcessInstanceService.processor_to_process_instance_api( processor @@ -514,7 +514,6 @@ def process_instance_task_list_without_task_data_for_me( process_instance, all_tasks, spiff_step, - get_task_data=False, ) @@ -531,24 +530,6 @@ def process_instance_task_list_without_task_data( process_instance, all_tasks, spiff_step, - get_task_data=False, - ) - - -def process_instance_task_list_with_task_data( - modified_process_model_identifier: str, - process_instance_id: int, - all_tasks: bool = False, - spiff_step: int = 0, -) -> flask.wrappers.Response: - """Process_instance_task_list_with_task_data.""" - process_instance = _find_process_instance_by_id_or_raise(process_instance_id) - return process_instance_task_list( - modified_process_model_identifier, - process_instance, - all_tasks, - spiff_step, - get_task_data=True, ) @@ -557,7 +538,6 @@ def process_instance_task_list( process_instance: ProcessInstanceModel, all_tasks: bool = False, spiff_step: int = 0, - get_task_data: bool = False, ) -> flask.wrappers.Response: """Process_instance_task_list.""" step_detail_query = db.session.query(SpiffStepDetailsModel).filter( @@ -579,25 +559,12 @@ def process_instance_task_list( subprocess_state_overrides = {} for step_detail in step_details: if step_detail.task_id in tasks: - task_data = ( - step_detail.task_json["task_data"] | step_detail.task_json["python_env"] - ) - if task_data is None: - task_data = {} - tasks[step_detail.task_id]["data"] = task_data tasks[step_detail.task_id]["state"] = Task.task_state_name_to_int( step_detail.task_state ) else: for subprocess_id, subprocess_info in subprocesses.items(): if step_detail.task_id in subprocess_info["tasks"]: - task_data = ( - step_detail.task_json["task_data"] - | step_detail.task_json["python_env"] - ) - if task_data is None: - task_data = {} - subprocess_info["tasks"][step_detail.task_id]["data"] = task_data subprocess_info["tasks"][step_detail.task_id]["state"] = ( Task.task_state_name_to_int(step_detail.task_state) ) @@ -654,8 +621,6 @@ def process_instance_task_list( calling_subprocess_task_id=calling_subprocess_task_id, task_spiff_step=task_spiff_step, ) - if get_task_data: - task.data = spiff_task.data tasks.append(task) return make_response(jsonify(tasks), 200) diff --git a/src/spiffworkflow_backend/routes/tasks_controller.py b/src/spiffworkflow_backend/routes/tasks_controller.py index c5dab9546..6c449ea32 100644 --- a/src/spiffworkflow_backend/routes/tasks_controller.py +++ b/src/spiffworkflow_backend/routes/tasks_controller.py @@ -36,6 +36,7 @@ from spiffworkflow_backend.models.human_task_user import HumanTaskUserModel from spiffworkflow_backend.models.process_instance import ProcessInstanceModel from spiffworkflow_backend.models.process_instance import ProcessInstanceStatus from spiffworkflow_backend.models.process_model import ProcessModelInfo +from spiffworkflow_backend.models.spiff_step_details import SpiffStepDetailsModel from spiffworkflow_backend.models.task import Task from spiffworkflow_backend.models.user import UserModel from spiffworkflow_backend.routes.process_api_blueprint import ( @@ -72,8 +73,6 @@ class ReactJsonSchemaSelectOption(TypedDict): enum: list[str] -# TODO: see comment 
for before_request - # @process_api_blueprint.route("/v1.0/tasks", methods=["GET"]) def task_list_my_tasks( process_instance_id: Optional[int] = None, page: int = 1, per_page: int = 100 ) -> flask.wrappers.Response: @@ -108,6 +107,11 @@ def task_list_my_tasks( _get_potential_owner_usernames(assigned_user) ) + # FIXME: this breaks postgres. Look at commit c147cdb47b1481f094b8c3d82dc502fe961f4977 for + # the postgres fix but it breaks the method for mysql. + # error in postgres: + # psycopg2.errors.GroupingError) column \"process_instance.process_model_identifier\" must + # appear in the GROUP BY clause or be used in an aggregate function human_tasks = human_task_query.add_columns( HumanTaskModel.task_id.label("id"), # type: ignore HumanTaskModel.task_name, @@ -171,6 +175,46 @@ def task_list_for_my_groups( ) + +def task_data_show( + modified_process_model_identifier: str, + process_instance_id: int, + spiff_step: int = 0, +) -> flask.wrappers.Response: + process_instance = _find_process_instance_by_id_or_raise(process_instance_id) + step_detail = ( + db.session.query(SpiffStepDetailsModel) + .filter( + SpiffStepDetailsModel.process_instance_id == process_instance.id, + SpiffStepDetailsModel.spiff_step == spiff_step, + ) + .first() + ) + + if step_detail is None: + raise ApiError( + error_code="spiff_step_for_process_instance_not_found", + message=( + "The given spiff step for the given process instance could not be" + " found." + ), + status_code=400, + ) + + processor = ProcessInstanceProcessor(process_instance) + spiff_task = processor.__class__.get_task_by_bpmn_identifier( + step_detail.bpmn_task_identifier, processor.bpmn_process_instance + ) + task_data = step_detail.task_json["task_data"] | step_detail.task_json["python_env"] + task = ProcessInstanceService.spiff_task_to_api_task( + processor, + spiff_task, + task_spiff_step=spiff_step, + ) + task.data = task_data + + return make_response(jsonify(task), 200) + + def _munge_form_ui_schema_based_on_hidden_fields_in_task_data(task: Task) -> None: if task.form_ui_schema is None: task.form_ui_schema = {} diff --git a/src/spiffworkflow_backend/scripts/get_last_user_completing_task.py b/src/spiffworkflow_backend/scripts/get_last_user_completing_task.py new file mode 100644 index 000000000..8d63610bb --- /dev/null +++ b/src/spiffworkflow_backend/scripts/get_last_user_completing_task.py @@ -0,0 +1,48 @@ +"""Get last user completing task.""" +from typing import Any + +from spiffworkflow_backend.models.human_task import HumanTaskModel +from spiffworkflow_backend.models.script_attributes_context import ( + ScriptAttributesContext, +) +from spiffworkflow_backend.models.user import UserModel +from spiffworkflow_backend.scripts.script import Script + + +class GetLastUserCompletingTask(Script): + @staticmethod + def requires_privileged_permissions() -> bool: + """We have deemed this function safe to run without elevated permissions.""" + return False + + def get_description(self) -> str: + return """Return the last user who completed the given task.""" + + def run( + self, + script_attributes_context: ScriptAttributesContext, + *_args: Any, + **kwargs: Any, + ) -> Any: + """Run.""" + # arguments may be passed positionally or as keyword arguments + if len(_args) == 2: + bpmn_process_identifier = _args[0] + task_name = _args[1] + else: + bpmn_process_identifier = kwargs["bpmn_process_identifier"] + task_name = kwargs["task_bpmn_identifier"] + + human_task = ( + HumanTaskModel.query.filter_by(
+                process_instance_id=script_attributes_context.process_instance_id,
+                bpmn_process_identifier=bpmn_process_identifier,
+                task_name=task_name,
+            )
+            .order_by(HumanTaskModel.id.desc())  # type: ignore
+            .join(UserModel, UserModel.id == HumanTaskModel.completed_by_user_id)
+            .first()
+        )
+
+        return human_task.completed_by_user.as_dict()
diff --git a/src/spiffworkflow_backend/scripts/get_process_initiator_user.py b/src/spiffworkflow_backend/scripts/get_process_initiator_user.py
new file mode 100644
index 000000000..266fa57ba
--- /dev/null
+++ b/src/spiffworkflow_backend/scripts/get_process_initiator_user.py
@@ -0,0 +1,36 @@
+"""Get process initiator user."""
+from typing import Any
+
+from spiffworkflow_backend.models.process_instance import ProcessInstanceModel
+from spiffworkflow_backend.models.script_attributes_context import (
+    ScriptAttributesContext,
+)
+from spiffworkflow_backend.models.user import UserModel
+from spiffworkflow_backend.scripts.script import Script
+
+
+class GetProcessInitiatorUser(Script):
+    @staticmethod
+    def requires_privileged_permissions() -> bool:
+        """We have deemed this function safe to run without elevated permissions."""
+        return False
+
+    def get_description(self) -> str:
+        return """Return the user that initiated the process instance."""
+
+    def run(
+        self,
+        script_attributes_context: ScriptAttributesContext,
+        *_args: Any,
+        **kwargs: Any,
+    ) -> Any:
+        """Run."""
+        process_instance = (
+            ProcessInstanceModel.query.filter_by(
+                id=script_attributes_context.process_instance_id
+            )
+            .join(UserModel, UserModel.id == ProcessInstanceModel.process_initiator_id)
+            .first()
+        )
+
+        return process_instance.process_initiator.as_dict()
diff --git a/src/spiffworkflow_backend/services/authorization_service.py b/src/spiffworkflow_backend/services/authorization_service.py
index 9d2f80cb4..1e7c3ee9e 100644
--- a/src/spiffworkflow_backend/services/authorization_service.py
+++ b/src/spiffworkflow_backend/services/authorization_service.py
@@ -482,11 +482,6 @@ class AuthorizationService:
         """Profile, picture, website, gender, birthdate, zoneinfo, locale, and updated_at.
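        (For orientation: a typical OpenID user_info payload carries at least
        "iss" and "sub", and usually "email" - e.g. {"iss":
        "http://localhost:7002/realms/spiffworkflow", "sub": "abc-123",
        "email": "user@example.com"}; these values are illustrative, not from
        the commit.)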
""" """Email.""" is_new_user = False - user_model = ( - UserModel.query.filter(UserModel.service == user_info["iss"]) - .filter(UserModel.service_id == user_info["sub"]) - .first() - ) user_attributes = {} if "email" in user_info: @@ -515,6 +510,13 @@ class AuthorizationService: tenant_specific_field ] + # example value for service: http://localhost:7002/realms/spiffworkflow (keycloak url) + user_model = ( + UserModel.query.filter(UserModel.service == user_attributes["service"]) + .filter(UserModel.username == user_attributes["username"]) + .first() + ) + if user_model is None: current_app.logger.debug("create_user in login_return") is_new_user = True diff --git a/src/spiffworkflow_backend/services/background_processing_service.py b/src/spiffworkflow_backend/services/background_processing_service.py index 1771c2c8b..0a2b287b9 100644 --- a/src/spiffworkflow_backend/services/background_processing_service.py +++ b/src/spiffworkflow_backend/services/background_processing_service.py @@ -22,4 +22,4 @@ class BackgroundProcessingService: def process_message_instances_with_app_context(self) -> None: """Since this runs in a scheduler, we need to specify the app context as well.""" with self.app.app_context(): - MessageService.process_message_instances() + MessageService.correlate_all_message_instances() diff --git a/src/spiffworkflow_backend/services/error_handling_service.py b/src/spiffworkflow_backend/services/error_handling_service.py index 740aa1069..f1d7f1bc5 100644 --- a/src/spiffworkflow_backend/services/error_handling_service.py +++ b/src/spiffworkflow_backend/services/error_handling_service.py @@ -8,7 +8,7 @@ from flask.wrappers import Response from spiffworkflow_backend.exceptions.api_error import ApiError from spiffworkflow_backend.models.db import db -from spiffworkflow_backend.models.message_model import MessageModel +from spiffworkflow_backend.models.message_instance import MessageInstanceModel from spiffworkflow_backend.models.message_triggerable_process_model import ( MessageTriggerableProcessModel, ) @@ -80,22 +80,27 @@ class ErrorHandlingService: f" Error:\n{error.__repr__()}" ) message_payload = {"message_text": message_text, "recipients": recipients} - message_identifier = current_app.config[ + message_name = current_app.config[ "SPIFFWORKFLOW_BACKEND_SYSTEM_NOTIFICATION_PROCESS_MODEL_MESSAGE_ID" ] - message_model = MessageModel.query.filter_by( - identifier=message_identifier - ).first() message_triggerable_process_model = ( MessageTriggerableProcessModel.query.filter_by( - message_model_id=message_model.id + message_name=message_name ).first() ) - process_instance = MessageService.process_message_triggerable_process_model( - message_triggerable_process_model, - message_identifier, - message_payload, - g.user, + + # Create the send message + message_instance = MessageInstanceModel( + message_type="send", + name=message_name, + payload=message_payload, + user_id=g.user.id, + ) + db.session.add(message_instance) + db.session.commit() + + process_instance = MessageService.start_process_with_message( + message_triggerable_process_model, message_instance ) return Response( diff --git a/src/spiffworkflow_backend/services/logging_service.py b/src/spiffworkflow_backend/services/logging_service.py index 8d7fa9d02..36d1ea77c 100644 --- a/src/spiffworkflow_backend/services/logging_service.py +++ b/src/spiffworkflow_backend/services/logging_service.py @@ -10,6 +10,7 @@ from flask.app import Flask from spiffworkflow_backend.models.db import db from spiffworkflow_backend.models.spiff_logging 
import SpiffLoggingModel +from spiffworkflow_backend.models.task import Task # flask logging formats: @@ -218,9 +219,13 @@ class DBHandler(logging.Handler): bpmn_task_type = record.task_type if hasattr(record, "task_type") else None # type: ignore timestamp = record.created message = record.msg if hasattr(record, "msg") else None - current_user_id = ( - record.current_user_id if hasattr(record, "current_user_id") else None # type: ignore - ) + + current_user_id = None + if bpmn_task_type in Task.HUMAN_TASK_TYPES and hasattr( + record, "current_user_id" + ): + current_user_id = record.current_user_id # type: ignore + spiff_step = ( record.spiff_step # type: ignore if hasattr(record, "spiff_step") and record.spiff_step is not None # type: ignore diff --git a/src/spiffworkflow_backend/services/message_service.py b/src/spiffworkflow_backend/services/message_service.py index 0e4799ca0..fb9ef6c46 100644 --- a/src/spiffworkflow_backend/services/message_service.py +++ b/src/spiffworkflow_backend/services/message_service.py @@ -1,22 +1,15 @@ """Message_service.""" -from typing import Any -from typing import Optional - -from sqlalchemy import and_ -from sqlalchemy import or_ -from sqlalchemy import select - from spiffworkflow_backend.models.db import db -from spiffworkflow_backend.models.message_correlation import MessageCorrelationModel -from spiffworkflow_backend.models.message_correlation_message_instance import ( - MessageCorrelationMessageInstanceModel, -) from spiffworkflow_backend.models.message_instance import MessageInstanceModel +from spiffworkflow_backend.models.message_instance import MessageStatuses +from spiffworkflow_backend.models.message_instance import MessageTypes from spiffworkflow_backend.models.message_triggerable_process_model import ( MessageTriggerableProcessModel, ) from spiffworkflow_backend.models.process_instance import ProcessInstanceModel -from spiffworkflow_backend.models.user import UserModel +from spiffworkflow_backend.services.process_instance_processor import ( + CustomBpmnScriptEngine, +) from spiffworkflow_backend.services.process_instance_processor import ( ProcessInstanceProcessor, ) @@ -33,114 +26,133 @@ class MessageService: """MessageService.""" @classmethod - def process_message_instances(cls) -> None: - """Process_message_instances.""" + def correlate_send_message( + cls, message_instance_send: MessageInstanceModel + ) -> MessageInstanceModel | None: + """Connects the given send message to a 'receive' message if possible. + + :param message_instance_send: + :return: the message instance that received this message. 
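+        A minimal usage sketch (names from this module; the message name and
+        payload values are illustrative, borrowed from the test data):
+
+            send = MessageInstanceModel(
+                message_type="send",
+                name="Request Approval",
+                payload={"customer_id": "sartography", "po_number": "1001"},
+                user_id=g.user.id,
+            )
+            db.session.add(send)
+            db.session.commit()
+            receive = MessageService.correlate_send_message(send)
+            # receive may be None when nothing correlates yet; in that case
+            # the send goes back to "ready" and is retried on a later pass.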
+ """ + # Thread safe via db locking - don't try to progress the same send message over multiple instances + if message_instance_send.status != MessageStatuses.ready.value: + return None + message_instance_send.status = MessageStatuses.running.value + db.session.add(message_instance_send) + db.session.commit() + + # Find available messages that might match + available_receive_messages = MessageInstanceModel.query.filter_by( + name=message_instance_send.name, + status=MessageStatuses.ready.value, + message_type=MessageTypes.receive.value, + ).all() + message_instance_receive: MessageInstanceModel | None = None + try: + for message_instance in available_receive_messages: + if message_instance.correlates( + message_instance_send, CustomBpmnScriptEngine() + ): + message_instance_receive = message_instance + + if message_instance_receive is None: + # Check for a message triggerable process and start that to create a new message_instance_receive + message_triggerable_process_model = ( + MessageTriggerableProcessModel.query.filter_by( + message_name=message_instance_send.name + ).first() + ) + if message_triggerable_process_model: + receiving_process = MessageService.start_process_with_message( + message_triggerable_process_model, message_instance_send + ) + message_instance_receive = MessageInstanceModel.query.filter_by( + process_instance_id=receiving_process.id, + message_type="receive", + status="ready", + ).first() + else: + receiving_process = ( + MessageService.get_process_instance_for_message_instance( + message_instance_receive + ) + ) + + # Assure we can send the message, otherwise keep going. + if ( + message_instance_receive is None + or not receiving_process.can_receive_message() + ): + message_instance_send.status = "ready" + message_instance_send.status = "ready" + db.session.add(message_instance_send) + db.session.commit() + return None + + # Set the receiving message to running, so it is not altered elswhere ... 
+ message_instance_receive.status = "running" + + cls.process_message_receive( + receiving_process, + message_instance_receive, + message_instance_send.name, + message_instance_send.payload, + ) + message_instance_receive.status = "completed" + message_instance_receive.counterpart_id = message_instance_send.id + db.session.add(message_instance_receive) + message_instance_send.status = "completed" + message_instance_send.counterpart_id = message_instance_receive.id + db.session.add(message_instance_send) + db.session.commit() + return message_instance_receive + + except Exception as exception: + db.session.rollback() + message_instance_send.status = "failed" + message_instance_send.failure_cause = str(exception) + db.session.add(message_instance_send) + if message_instance_receive: + message_instance_receive.status = "failed" + message_instance_receive.failure_cause = str(exception) + db.session.add(message_instance_receive) + db.session.commit() + raise exception + + @classmethod + def correlate_all_message_instances(cls) -> None: + """Look at ALL the Send and Receive Messages and attempt to find correlations.""" message_instances_send = MessageInstanceModel.query.filter_by( message_type="send", status="ready" ).all() - message_instances_receive = MessageInstanceModel.query.filter_by( - message_type="receive", status="ready" - ).all() + for message_instance_send in message_instances_send: - # check again in case another background process picked up the message - # while the previous one was running - if message_instance_send.status != "ready": - continue - - message_instance_send.status = "running" - db.session.add(message_instance_send) - db.session.commit() - - message_instance_receive = None - try: - message_instance_receive = cls.get_message_instance_receive( - message_instance_send, message_instances_receive - ) - if message_instance_receive is None: - message_triggerable_process_model = ( - MessageTriggerableProcessModel.query.filter_by( - message_model_id=message_instance_send.message_model_id - ).first() - ) - if message_triggerable_process_model: - process_instance_send = ProcessInstanceModel.query.filter_by( - id=message_instance_send.process_instance_id, - ).first() - # TODO: use the correct swimlane user when that is set up - cls.process_message_triggerable_process_model( - message_triggerable_process_model, - message_instance_send.message_model.name, - message_instance_send.payload, - process_instance_send.process_initiator, - ) - message_instance_send.status = "completed" - else: - # if we can't get a queued message then put it back in the queue - message_instance_send.status = "ready" - - else: - if message_instance_receive.status != "ready": - continue - message_instance_receive.status = "running" - - cls.process_message_receive( - message_instance_receive, - message_instance_send.message_model.name, - message_instance_send.payload, - ) - message_instance_receive.status = "completed" - db.session.add(message_instance_receive) - message_instance_send.status = "completed" - - db.session.add(message_instance_send) - db.session.commit() - except Exception as exception: - db.session.rollback() - message_instance_send.status = "failed" - message_instance_send.failure_cause = str(exception) - db.session.add(message_instance_send) - - if message_instance_receive: - message_instance_receive.status = "failed" - message_instance_receive.failure_cause = str(exception) - db.session.add(message_instance_receive) - - db.session.commit() - raise exception + 
cls.correlate_send_message(message_instance_send) @staticmethod - def process_message_triggerable_process_model( + def start_process_with_message( message_triggerable_process_model: MessageTriggerableProcessModel, - message_model_name: str, - message_payload: dict, - user: UserModel, + message_instance: MessageInstanceModel, ) -> ProcessInstanceModel: - """Process_message_triggerable_process_model.""" + """Start up a process instance, so it is ready to catch the event.""" process_instance_receive = ProcessInstanceService.create_process_instance_from_process_model_identifier( message_triggerable_process_model.process_model_identifier, - user, + message_instance.user, ) processor_receive = ProcessInstanceProcessor(process_instance_receive) - processor_receive.do_engine_steps(save=False) - processor_receive.bpmn_process_instance.catch_bpmn_message( - message_model_name, - message_payload, - correlations={}, - ) processor_receive.do_engine_steps(save=True) - return process_instance_receive @staticmethod - def process_message_receive( + def get_process_instance_for_message_instance( message_instance_receive: MessageInstanceModel, - message_model_name: str, - message_payload: dict, - ) -> None: + ) -> ProcessInstanceModel: """Process_message_receive.""" - process_instance_receive = ProcessInstanceModel.query.filter_by( - id=message_instance_receive.process_instance_id - ).first() + process_instance_receive: ProcessInstanceModel = ( + ProcessInstanceModel.query.filter_by( + id=message_instance_receive.process_instance_id + ).first() + ) if process_instance_receive is None: raise MessageServiceError( ( @@ -151,83 +163,21 @@ class MessageService: ), ) ) + return process_instance_receive + @staticmethod + def process_message_receive( + process_instance_receive: ProcessInstanceModel, + message_instance_receive: MessageInstanceModel, + message_model_name: str, + message_payload: dict, + ) -> None: + """process_message_receive.""" processor_receive = ProcessInstanceProcessor(process_instance_receive) processor_receive.bpmn_process_instance.catch_bpmn_message( - message_model_name, - message_payload, - correlations={}, + message_model_name, message_payload ) processor_receive.do_engine_steps(save=True) - - @staticmethod - def get_message_instance_receive( - message_instance_send: MessageInstanceModel, - message_instances_receive: list[MessageInstanceModel], - ) -> Optional[MessageInstanceModel]: - """Get_message_instance_receive.""" - message_correlations_send = ( - MessageCorrelationModel.query.join(MessageCorrelationMessageInstanceModel) - .filter_by(message_instance_id=message_instance_send.id) - .all() - ) - - message_correlation_filter = [] - for message_correlation_send in message_correlations_send: - message_correlation_filter.append( - and_( - MessageCorrelationModel.name == message_correlation_send.name, - MessageCorrelationModel.value == message_correlation_send.value, - MessageCorrelationModel.message_correlation_property_id - == message_correlation_send.message_correlation_property_id, - ) - ) - - for message_instance_receive in message_instances_receive: - # sqlalchemy supports select / where statements like active record apparantly - # https://docs.sqlalchemy.org/en/14/core/tutorial.html#conjunctions - message_correlation_select = ( - select([db.func.count()]) - .select_from(MessageCorrelationModel) # type: ignore - .where( - and_( - MessageCorrelationModel.process_instance_id - == message_instance_receive.process_instance_id, - or_(*message_correlation_filter), - ) - ) - 
.join(MessageCorrelationMessageInstanceModel) # type: ignore - .filter_by( - message_instance_id=message_instance_receive.id, - ) - ) - message_correlations_receive = db.session.execute( - message_correlation_select - ) - - # since the query matches on name, value, and message_instance_receive.id, if the counts - # message correlations found are the same, then this should be the relevant message - if ( - message_correlations_receive.scalar() == len(message_correlations_send) - and message_instance_receive.message_model_id - == message_instance_send.message_model_id - ): - return message_instance_receive - - return None - - @staticmethod - def get_process_instance_for_message_instance( - message_instance: MessageInstanceModel, - ) -> Any: - """Get_process_instance_for_message_instance.""" - process_instance = ProcessInstanceModel.query.filter_by( - id=message_instance.process_instance_id - ).first() - if process_instance is None: - raise MessageServiceError( - f"Process instance cannot be found for message: {message_instance.id}." - f"Tried with id {message_instance.process_instance_id}" - ) - - return process_instance + message_instance_receive.status = MessageStatuses.completed.value + db.session.add(message_instance_receive) + db.session.commit() diff --git a/src/spiffworkflow_backend/services/process_instance_processor.py b/src/spiffworkflow_backend/services/process_instance_processor.py index f85da7108..8c76370d2 100644 --- a/src/spiffworkflow_backend/services/process_instance_processor.py +++ b/src/spiffworkflow_backend/services/process_instance_processor.py @@ -60,15 +60,10 @@ from spiffworkflow_backend.models.file import FileType from spiffworkflow_backend.models.group import GroupModel from spiffworkflow_backend.models.human_task import HumanTaskModel from spiffworkflow_backend.models.human_task_user import HumanTaskUserModel -from spiffworkflow_backend.models.message_correlation import MessageCorrelationModel -from spiffworkflow_backend.models.message_correlation_message_instance import ( - MessageCorrelationMessageInstanceModel, -) -from spiffworkflow_backend.models.message_correlation_property import ( - MessageCorrelationPropertyModel, -) from spiffworkflow_backend.models.message_instance import MessageInstanceModel -from spiffworkflow_backend.models.message_instance import MessageModel +from spiffworkflow_backend.models.message_instance_correlation import ( + MessageInstanceCorrelationRuleModel, +) from spiffworkflow_backend.models.process_instance import ProcessInstanceModel from spiffworkflow_backend.models.process_instance import ProcessInstanceStatus from spiffworkflow_backend.models.process_instance_metadata import ( @@ -449,53 +444,6 @@ class ProcessInstanceProcessor: ) = ProcessInstanceProcessor.get_process_model_and_subprocesses( process_instance_model.process_model_identifier ) - else: - bpmn_json_length = len(process_instance_model.bpmn_json.encode("utf-8")) - megabyte = float(1024**2) - json_size = bpmn_json_length / megabyte - if json_size > 1: - wf_json = json.loads(process_instance_model.bpmn_json) - if "spec" in wf_json and "tasks" in wf_json: - task_tree = wf_json["tasks"] - test_spec = wf_json["spec"] - task_size = "{:.2f}".format( - len(json.dumps(task_tree).encode("utf-8")) / megabyte - ) - spec_size = "{:.2f}".format( - len(json.dumps(test_spec).encode("utf-8")) / megabyte - ) - message = ( - "Workflow " - + process_instance_model.process_model_identifier - + f" JSON Size is over 1MB:{json_size:.2f} MB" - ) - message += f"\n Task Size: {task_size}" - 
message += f"\n Spec Size: {spec_size}" - current_app.logger.warning(message) - - def check_sub_specs( - test_spec: dict, indent: int = 0, show_all: bool = False - ) -> None: - """Check_sub_specs.""" - for my_spec_name in test_spec["task_specs"]: - my_spec = test_spec["task_specs"][my_spec_name] - my_spec_size = ( - len(json.dumps(my_spec).encode("utf-8")) / megabyte - ) - if my_spec_size > 0.1 or show_all: - current_app.logger.warning( - (" " * indent) - + "Sub-Spec " - + my_spec["name"] - + " :" - + f"{my_spec_size:.2f}" - ) - if "spec" in my_spec: - if my_spec["name"] == "Call_Emails_Process_Email": - pass - check_sub_specs(my_spec["spec"], indent + 5) - - check_sub_specs(test_spec, 5) self.process_model_identifier = process_instance_model.process_model_identifier self.process_model_display_name = ( @@ -776,12 +724,12 @@ class ProcessInstanceProcessor: Rerturns: {process_name: [task_1, task_2, ...], ...} """ - serialized_data = json.loads(self.serialize()) - processes: dict[str, list[str]] = {serialized_data["spec"]["name"]: []} - for task_name, _task_spec in serialized_data["spec"]["task_specs"].items(): - processes[serialized_data["spec"]["name"]].append(task_name) - if "subprocess_specs" in serialized_data: - for subprocess_name, subprocess_details in serialized_data[ + bpmn_json = json.loads(self.process_instance_model.bpmn_json or "{}") + processes: dict[str, list[str]] = {bpmn_json["spec"]["name"]: []} + for task_name, _task_spec in bpmn_json["spec"]["task_specs"].items(): + processes[bpmn_json["spec"]["name"]].append(task_name) + if "subprocess_specs" in bpmn_json: + for subprocess_name, subprocess_details in bpmn_json[ "subprocess_specs" ].items(): processes[subprocess_name] = [] @@ -816,7 +764,7 @@ class ProcessInstanceProcessor: ################################################################# - def get_all_task_specs(self) -> dict[str, dict]: + def get_all_task_specs(self, bpmn_json: dict) -> dict[str, dict]: """This looks both at top level task_specs and subprocess_specs in the serialized data. It returns a dict of all task specs based on the task name like it is in the serialized form. @@ -824,10 +772,9 @@ class ProcessInstanceProcessor: NOTE: this may not fully work for tasks that are NOT call activities since their task_name may not be unique but in our current use case we only care about the call activities here. """ - serialized_data = json.loads(self.serialize()) - spiff_task_json = serialized_data["spec"]["task_specs"] or {} - if "subprocess_specs" in serialized_data: - for _subprocess_name, subprocess_details in serialized_data[ + spiff_task_json = bpmn_json["spec"]["task_specs"] or {} + if "subprocess_specs" in bpmn_json: + for _subprocess_name, subprocess_details in bpmn_json[ "subprocess_specs" ].items(): if "task_specs" in subprocess_details: @@ -849,8 +796,8 @@ class ProcessInstanceProcessor: Also note that subprocess_task_id might in fact be a call activity, because spiff treats call activities like subprocesses in terms of the serialization. 
""" - bpmn_json = json.loads(self.serialize()) - spiff_task_json = self.get_all_task_specs() + bpmn_json = json.loads(self.process_instance_model.bpmn_json or "{}") + spiff_task_json = self.get_all_task_specs(bpmn_json) subprocesses_by_child_task_ids = {} task_typename_by_task_id = {} @@ -922,6 +869,7 @@ class ProcessInstanceProcessor: process_instance_id=self.process_instance_model.id, completed=False ).all() ready_or_waiting_tasks = self.get_all_ready_or_waiting_tasks() + process_model_display_name = "" process_model_info = self.process_model_service.get_process_model( self.process_instance_model.process_model_identifier @@ -940,6 +888,10 @@ class ProcessInstanceProcessor: ) extensions = task_spec.extensions + # in the xml, it's the id attribute. this identifies the process where the activity lives. + # if it's in a subprocess, it's the inner process. + bpmn_process_identifier = ready_or_waiting_task.workflow.name + form_file_name = None ui_form_file_name = None if "properties" in extensions: @@ -959,6 +911,7 @@ class ProcessInstanceProcessor: human_task = HumanTaskModel( process_instance_id=self.process_instance_model.id, process_model_display_name=process_model_display_name, + bpmn_process_identifier=bpmn_process_identifier, form_file_name=form_file_name, ui_form_file_name=ui_form_file_name, task_id=str(ready_or_waiting_task.id), @@ -1348,157 +1301,56 @@ class ProcessInstanceProcessor: db.session.add(self.process_instance_model) db.session.commit() - # messages have one correlation key (possibly wrong) - # correlation keys may have many correlation properties def process_bpmn_messages(self) -> None: """Process_bpmn_messages.""" bpmn_messages = self.bpmn_process_instance.get_bpmn_messages() for bpmn_message in bpmn_messages: - # only message sends are in get_bpmn_messages - message_model = MessageModel.query.filter_by(name=bpmn_message.name).first() - if message_model is None: - raise ApiError( - "invalid_message_name", - f"Invalid message name: {bpmn_message.name}.", - ) - - if not bpmn_message.correlations: - raise ApiError( - "message_correlations_missing", - ( - "Could not find any message correlations bpmn_message:" - f" {bpmn_message.name}" - ), - ) - - message_correlations = [] - for ( - message_correlation_key, - message_correlation_properties, - ) in bpmn_message.correlations.items(): - for ( - message_correlation_property_identifier, - message_correlation_property_value, - ) in message_correlation_properties.items(): - message_correlation_property = ( - MessageCorrelationPropertyModel.query.filter_by( - identifier=message_correlation_property_identifier, - ).first() - ) - if message_correlation_property is None: - raise ApiError( - "message_correlations_missing_from_process", - ( - "Could not find a known message correlation with" - f" identifier:{message_correlation_property_identifier}" - ), - ) - message_correlations.append( - { - "message_correlation_property": ( - message_correlation_property - ), - "name": message_correlation_key, - "value": message_correlation_property_value, - } - ) message_instance = MessageInstanceModel( process_instance_id=self.process_instance_model.id, + user_id=self.process_instance_model.process_initiator_id, # TODO: use the correct swimlane user when that is set up message_type="send", - message_model_id=message_model.id, + name=bpmn_message.name, payload=bpmn_message.payload, + correlation_keys=self.bpmn_process_instance.correlations, ) db.session.add(message_instance) db.session.commit() - for message_correlation in message_correlations: - 
message_correlation = MessageCorrelationModel( - process_instance_id=self.process_instance_model.id, - message_correlation_property_id=message_correlation[ - "message_correlation_property" - ].id, - name=message_correlation["name"], - value=message_correlation["value"], - ) - db.session.add(message_correlation) - db.session.commit() - message_correlation_message_instance = ( - MessageCorrelationMessageInstanceModel( - message_instance_id=message_instance.id, - message_correlation_id=message_correlation.id, - ) - ) - db.session.add(message_correlation_message_instance) - db.session.commit() - def queue_waiting_receive_messages(self) -> None: """Queue_waiting_receive_messages.""" - waiting_tasks = self.get_all_waiting_tasks() - for waiting_task in waiting_tasks: - # if it's not something that can wait for a message, skip it - if waiting_task.task_spec.__class__.__name__ not in [ - "IntermediateCatchEvent", - "ReceiveTask", - ]: - continue + waiting_events = self.bpmn_process_instance.waiting_events() + waiting_message_events = filter( + lambda e: e["event_type"] == "Message", waiting_events + ) - # timer events are not related to messaging, so ignore them for these purposes - if waiting_task.task_spec.event_definition.__class__.__name__.endswith( - "TimerEventDefinition" + for event in waiting_message_events: + # Ensure we are only creating one message instance for each waiting message + if ( + MessageInstanceModel.query.filter_by( + process_instance_id=self.process_instance_model.id, + message_type="receive", + name=event["name"], + ).count() + > 0 ): continue - message_model = MessageModel.query.filter_by( - name=waiting_task.task_spec.event_definition.name - ).first() - if message_model is None: - raise ApiError( - "invalid_message_name", - ( - "Invalid message name:" - f" {waiting_task.task_spec.event_definition.name}." 
- ), - ) - - # Ensure we are only creating one message instance for each waiting message - message_instance = MessageInstanceModel.query.filter_by( - process_instance_id=self.process_instance_model.id, - message_type="receive", - message_model_id=message_model.id, - ).first() - if message_instance: - continue - + # Create a new Message Instance message_instance = MessageInstanceModel( process_instance_id=self.process_instance_model.id, + user_id=self.process_instance_model.process_initiator_id, message_type="receive", - message_model_id=message_model.id, + name=event["name"], + correlation_keys=self.bpmn_process_instance.correlations, ) + for correlation_property in event["value"]: + message_correlation = MessageInstanceCorrelationRuleModel( + message_instance_id=message_instance.id, + name=correlation_property.name, + retrieval_expression=correlation_property.retrieval_expression, + ) + message_instance.correlation_rules.append(message_correlation) db.session.add(message_instance) - - for ( - spiff_correlation_property - ) in waiting_task.task_spec.event_definition.correlation_properties: - # NOTE: we may have to cycle through keys here - # not sure yet if it's valid for a property to be associated with multiple keys - correlation_key_name = spiff_correlation_property.correlation_keys[0] - message_correlation = ( - MessageCorrelationModel.query.filter_by( - process_instance_id=self.process_instance_model.id, - name=correlation_key_name, - ) - .join(MessageCorrelationPropertyModel) - .filter_by(identifier=spiff_correlation_property.name) - .first() - ) - message_correlation_message_instance = ( - MessageCorrelationMessageInstanceModel( - message_instance_id=message_instance.id, - message_correlation_id=message_correlation.id, - ) - ) - db.session.add(message_correlation_message_instance) - db.session.commit() def increment_spiff_step(self) -> None: @@ -1789,7 +1641,6 @@ class ProcessInstanceProcessor: details_model.end_in_seconds = time.time() details_model.task_json = self.get_task_json_from_spiff_task(task) db.session.add(details_model) - # this is the thing that actually commits the db transaction (on behalf of the other updates above as well) self.save() diff --git a/src/spiffworkflow_backend/services/process_instance_service.py b/src/spiffworkflow_backend/services/process_instance_service.py index 7b74ef502..13c40cfe3 100644 --- a/src/spiffworkflow_backend/services/process_instance_service.py +++ b/src/spiffworkflow_backend/services/process_instance_service.py @@ -8,8 +8,8 @@ import sentry_sdk from flask import current_app from SpiffWorkflow.task import Task as SpiffTask # type: ignore +from spiffworkflow_backend import db from spiffworkflow_backend.exceptions.api_error import ApiError -from spiffworkflow_backend.models.db import db from spiffworkflow_backend.models.human_task import HumanTaskModel from spiffworkflow_backend.models.process_instance import ProcessInstanceApi from spiffworkflow_backend.models.process_instance import ProcessInstanceModel @@ -41,13 +41,14 @@ class ProcessInstanceService: user: UserModel, ) -> ProcessInstanceModel: """Get_process_instance_from_spec.""" + db.session.commit() try: current_git_revision = GitService.get_current_revision() except GitCommandError: current_git_revision = "" process_instance_model = ProcessInstanceModel( status=ProcessInstanceStatus.not_started.value, - process_initiator=user, + process_initiator_id=user.id, process_model_identifier=process_model.id, process_model_display_name=process_model.display_name, 
start_in_seconds=round(time.time()), @@ -234,23 +235,6 @@ class ProcessInstanceService: # maybe move this out once we have the interstitial page since this is here just so we can get the next human task processor.do_engine_steps(save=True) - @staticmethod - def extract_form_data(latest_data: dict, task: SpiffTask) -> dict: - """Extracts data from the latest_data that is directly related to the form that is being submitted.""" - data = {} - - if hasattr(task.task_spec, "form"): - for field in task.task_spec.form.fields: - if field.has_property(Task.FIELD_PROP_REPEAT): - group = field.get_property(Task.FIELD_PROP_REPEAT) - if group in latest_data: - data[group] = latest_data[group] - else: - value = ProcessInstanceService.get_dot_value(field.id, latest_data) - if value is not None: - ProcessInstanceService.set_dot_value(field.id, value, data) - return data - @staticmethod def create_dot_dict(data: dict) -> dict[str, Any]: """Create_dot_dict.""" diff --git a/src/spiffworkflow_backend/services/spec_file_service.py b/src/spiffworkflow_backend/services/spec_file_service.py index 13cc41243..4a36fe11f 100644 --- a/src/spiffworkflow_backend/services/spec_file_service.py +++ b/src/spiffworkflow_backend/services/spec_file_service.py @@ -8,14 +8,13 @@ from typing import Optional from lxml import etree # type: ignore from SpiffWorkflow.bpmn.parser.BpmnParser import BpmnValidator # type: ignore +from spiffworkflow_backend.models.correlation_property_cache import ( + CorrelationPropertyCache, +) from spiffworkflow_backend.models.db import db from spiffworkflow_backend.models.file import File from spiffworkflow_backend.models.file import FileType from spiffworkflow_backend.models.file import SpecReference -from spiffworkflow_backend.models.message_correlation_property import ( - MessageCorrelationPropertyModel, -) -from spiffworkflow_backend.models.message_model import MessageModel from spiffworkflow_backend.models.message_triggerable_process_model import ( MessageTriggerableProcessModel, ) @@ -175,8 +174,8 @@ class SpecFileService(FileSystemService): """Validate_bpmn_xml.""" file_type = FileSystemService.file_type(file_name) if file_type.value == FileType.bpmn.value: - validator = BpmnValidator() - parser = MyCustomParser(validator=validator) + BpmnValidator() + parser = MyCustomParser() try: parser.add_bpmn_xml( cls.get_etree_from_xml_bytes(binary_data), filename=file_name @@ -336,39 +335,30 @@ class SpecFileService(FileSystemService): @staticmethod def update_message_cache(ref: SpecReference) -> None: """Assure we have a record in the database of all possible message ids and names.""" - for message_model_identifier in ref.messages.keys(): - message_model = MessageModel.query.filter_by( - identifier=message_model_identifier - ).first() - if message_model is None: - message_model = MessageModel( - identifier=message_model_identifier, - name=ref.messages[message_model_identifier], - ) - db.session.add(message_model) - db.session.commit() + # for message_model_identifier in ref.messages.keys(): + # message_model = MessageModel.query.filter_by( + # identifier=message_model_identifier + # ).first() + # if message_model is None: + # message_model = MessageModel( + # identifier=message_model_identifier, + # name=ref.messages[message_model_identifier], + # ) + # db.session.add(message_model) + # db.session.commit() @staticmethod def update_message_trigger_cache(ref: SpecReference) -> None: """Assure we know which messages can trigger the start of a process.""" - for message_model_identifier in 
ref.start_messages: - message_model = MessageModel.query.filter_by( - identifier=message_model_identifier - ).first() - if message_model is None: - raise ProcessModelFileInvalidError( - "Could not find message model with identifier" - f" '{message_model_identifier}'Required by a Start Event in :" - f" {ref.file_name}" - ) + for message_name in ref.start_messages: message_triggerable_process_model = ( MessageTriggerableProcessModel.query.filter_by( - message_model_id=message_model.id, + message_name=message_name, ).first() ) if message_triggerable_process_model is None: message_triggerable_process_model = MessageTriggerableProcessModel( - message_model_id=message_model.id, + message_name=message_name, process_model_identifier=ref.process_model_id, ) db.session.add(message_triggerable_process_model) @@ -386,33 +376,28 @@ class SpecFileService(FileSystemService): @staticmethod def update_correlation_cache(ref: SpecReference) -> None: """Update_correlation_cache.""" - for correlation_identifier in ref.correlations.keys(): - correlation_property_retrieval_expressions = ref.correlations[ - correlation_identifier - ]["retrieval_expressions"] + for name in ref.correlations.keys(): + correlation_property_retrieval_expressions = ref.correlations[name][ + "retrieval_expressions" + ] for cpre in correlation_property_retrieval_expressions: - message_model_identifier = cpre["messageRef"] - message_model = MessageModel.query.filter_by( - identifier=message_model_identifier + message_name = ref.messages.get(cpre["messageRef"], None) + retrieval_expression = cpre["expression"] + process_model_id = ref.process_model_id + + existing = CorrelationPropertyCache.query.filter_by( + name=name, + message_name=message_name, + process_model_id=process_model_id, + retrieval_expression=retrieval_expression, ).first() - if message_model is None: - raise ProcessModelFileInvalidError( - "Could not find message model with identifier" - f" '{message_model_identifier}'specified by correlation" - f" property: {cpre}" + if existing is None: + new_cache = CorrelationPropertyCache( + name=name, + message_name=message_name, + process_model_id=process_model_id, + retrieval_expression=retrieval_expression, ) - # fixme: I think we are currently ignoring the correction properties. 
- message_correlation_property = ( - MessageCorrelationPropertyModel.query.filter_by( - identifier=correlation_identifier, - message_model_id=message_model.id, - ).first() - ) - if message_correlation_property is None: - message_correlation_property = MessageCorrelationPropertyModel( - identifier=correlation_identifier, - message_model_id=message_model.id, - ) - db.session.add(message_correlation_property) + db.session.add(new_cache) db.session.commit() diff --git a/tests/data/dynamic_enum_select_fields/dynamic_enums_ask_for_color.bpmn b/tests/data/dynamic_enum_select_fields/dynamic_enums_ask_for_color.bpmn index d4f1aa5d2..9b15cb09a 100644 --- a/tests/data/dynamic_enum_select_fields/dynamic_enums_ask_for_color.bpmn +++ b/tests/data/dynamic_enum_select_fields/dynamic_enums_ask_for_color.bpmn @@ -1,6 +1,6 @@ - + Flow_1my9ag5 @@ -28,7 +28,7 @@ form_ui_hidden_fields = ["veryImportantFieldButOnlySometimes", "building.floor"] - + diff --git a/tests/data/error/instructions_error.bpmn b/tests/data/error/instructions_error.bpmn index 24039bbbd..1db55f390 100644 --- a/tests/data/error/instructions_error.bpmn +++ b/tests/data/error/instructions_error.bpmn @@ -1,6 +1,6 @@ - + Flow_0smvjir @@ -21,7 +21,7 @@ Department: {{ department }} - + diff --git a/tests/data/get_localtime/get_localtime.bpmn b/tests/data/get_localtime/get_localtime.bpmn index 5660ba0bc..2efa2fa68 100644 --- a/tests/data/get_localtime/get_localtime.bpmn +++ b/tests/data/get_localtime/get_localtime.bpmn @@ -1,6 +1,6 @@ - + Flow_0ijucqh @@ -40,7 +40,7 @@ localtime = get_localtime(some_time, timezone) - + diff --git a/tests/data/manual_task/manual_task.bpmn b/tests/data/manual_task/manual_task.bpmn index aefbb376b..4f0fba72c 100644 --- a/tests/data/manual_task/manual_task.bpmn +++ b/tests/data/manual_task/manual_task.bpmn @@ -1,6 +1,6 @@ - + Flow_1xlck7g @@ -18,7 +18,7 @@ - + diff --git a/tests/data/message/message_send_receive.bpmn b/tests/data/message/message_send_receive.bpmn new file mode 100644 index 000000000..e62517f72 --- /dev/null +++ b/tests/data/message/message_send_receive.bpmn @@ -0,0 +1,153 @@ + + + + + + + + + The messages sent here are about an Invoice that can be uniquely identified by the customer_id ("sartography") and a purchase order number (1001) + +It will fire a message connected to the invoice keys above, starting another process, which can communicate back to this specific process instance using the correct key. 
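(At run time, the correlation declarations below boil down to evaluating each
property's retrieval expression against the message payload; a hedged Python
sketch - the retrieval_expressions dict and the plain eval() call are
illustrative stand-ins for the backend's script engine:)

    payload = {"customer_id": "sartography", "po_number": "1001"}
    retrieval_expressions = {"customer_id": "customer_id", "po_number": "po_number"}
    # evaluate each expression with the payload as the local namespace
    correlation_values = {
        name: eval(expression, {}, dict(payload))
        for name, expression in retrieval_expressions.items()
    }
    assert correlation_values == {"customer_id": "sartography", "po_number": "1001"}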
+ + + + po_number + customer_id + + + + + po_number + + + po_number + + + + + customer_id + + + customer_id + + + + + Flow_10conab + + + + + + Flow_1qgz6p0 + + + Flow_037vpjk + Flow_1qgz6p0 + + + + + the_topic = "first_conversation" + + Flow_02lw0q9 + Flow_037vpjk + + + + + + + + + + Flow_10conab + Flow_02lw0q9 + + + + + { +"customer_id": customer_id, +"po_number": po_number, +"amount": amount, +"description": description, +} + + + + + the_payload + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/tests/data/message_send_one_conversation/message_receiver.bpmn b/tests/data/message_send_one_conversation/message_receiver.bpmn index 828f7d2ec..53eadc415 100644 --- a/tests/data/message_send_one_conversation/message_receiver.bpmn +++ b/tests/data/message_send_one_conversation/message_receiver.bpmn @@ -1,42 +1,39 @@ - + - - message_correlation_property_topica - message_correlation_property_topicb + + customer_id + po_number - - - topica + + + customer_id - - the_payload.topica + + customer_id - - - topicb + + + po_number - - the_payload.topicb + + po_number - + - the_payload + invoice - + - { "the_payload": { -"topica": the_payload.topica, -"topicb": the_payload.topicb, -}} + invoice @@ -45,28 +42,21 @@ Flow_11r9uiw - + Flow_0fruoax Flow_11r9uiw Flow_0fruoax - + + - - - - - - - - @@ -79,6 +69,14 @@ + + + + + + + + diff --git a/tests/data/message_send_one_conversation/message_sender.bpmn b/tests/data/message_send_one_conversation/message_sender.bpmn index 7bda31eb7..e62517f72 100644 --- a/tests/data/message_send_one_conversation/message_sender.bpmn +++ b/tests/data/message_send_one_conversation/message_sender.bpmn @@ -5,25 +5,31 @@ - - message_correlation_property_topica - message_correlation_property_topicb + + The messages sent here are about an Invoice that can be uniquely identified by the customer_id ("sartography") and a purchase order number (1001) + +It will fire a message connected to the invoice keys above, starting another process, which can communicate back to this specific process instance using the correct key. 
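(On the receive side, a message is accepted only when every retrieval
expression, evaluated against the incoming payload, matches the value the
sender carries; a rough rendering of that check - correlates() here is a
hypothetical standalone helper, not the MessageInstanceModel.correlates
method from the commit:)

    def correlates(send_values: dict, receive_rules: dict, payload: dict) -> bool:
        # hypothetical helper mirroring the backend's correlation comparison
        return all(
            eval(expression, {}, dict(payload)) == send_values.get(name)
            for name, expression in receive_rules.items()
        )

    assert correlates(
        {"po_number": "1001", "customer_id": "sartography"},
        {"po_number": "po_number", "customer_id": "customer_id"},
        {"po_number": "1001", "customer_id": "sartography"},
    )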
+ + + + po_number + customer_id - - - topica + + + po_number - - the_payload.topica + + po_number - - - topicb + + + customer_id - - the_payload.topicb + + customer_id @@ -32,42 +38,45 @@ - + Flow_1qgz6p0 - + Flow_037vpjk Flow_1qgz6p0 - + - + the_topic = "first_conversation" - Flow_1ihr88m + Flow_02lw0q9 Flow_037vpjk - - + + + + + + + + Flow_10conab - Flow_1ihr88m - -timestamp = time.time() -the_topica = f"first_conversation_a_{timestamp}" -the_topicb = f"first_conversation_b_{timestamp}" -del time - + Flow_02lw0q9 + - + { -"topica": the_topica, -"topicb": the_topicb, +"customer_id": customer_id, +"po_number": po_number, +"amount": amount, +"description": description, } - + the_payload @@ -78,22 +87,6 @@ del time - - - - - - - - - - - - - - - - @@ -103,20 +96,44 @@ del time - + - - + + + + + + + + + + + + + + + + + + + + + + + + + + @@ -128,7 +145,7 @@ del time - + diff --git a/tests/data/message_send_two_conversations/message_receiver_two.bpmn b/tests/data/message_send_two_conversations/message_receiver_two.bpmn index 67e856a5e..a1bbe6f01 100644 --- a/tests/data/message_send_two_conversations/message_receiver_two.bpmn +++ b/tests/data/message_send_two_conversations/message_receiver_two.bpmn @@ -12,18 +12,18 @@ - topica_two + topica_two - topica_two + topic_two_a - topicb_two + topicb_two - topicb_two + topic_two_b @@ -34,8 +34,8 @@ { -"topica_two": payload_var_two.topica_two, -"topicb_two": payload_var_two.topicb_two, +"topic_two_a": payload_var_two.topica_two, +"topic_two_b": payload_var_two.topicb_two, "second_var_two": second_var_two } diff --git a/tests/data/message_send_two_conversations/message_sender.bpmn b/tests/data/message_send_two_conversations/message_sender.bpmn index 160517055..61b06a1c6 100644 --- a/tests/data/message_send_two_conversations/message_sender.bpmn +++ b/tests/data/message_send_two_conversations/message_sender.bpmn @@ -19,18 +19,18 @@ - topica_one + topica_one - payload_var_one.topica_one + topica_one - topicb_one + topicb_one - payload_var_one.topicb + topicb_one @@ -117,18 +117,18 @@ del time - topica_two + topica_two - topica_two + topic_two_a - topicb_two + topicb_two - topicb_two + topic_two_b diff --git a/tests/data/model_with_lanes/lanes.bpmn b/tests/data/model_with_lanes/lanes.bpmn index 3ee435013..b396bf714 100644 --- a/tests/data/model_with_lanes/lanes.bpmn +++ b/tests/data/model_with_lanes/lanes.bpmn @@ -1,13 +1,13 @@ - + - + StartEvent_1 - initator_one + initiator_one Event_06f4e68 initiator_two @@ -18,18 +18,18 @@ Flow_1tbyols - - - + + + - This is initiator user? + This is for the initiator user Flow_1tbyols Flow_16ppta1 - This is finance user? + This is for a Finance Team user Flow_16ppta1 Flow_1cfcauf @@ -41,7 +41,8 @@ - This is initiator again? 
+ This is initiator again + Flow_1cfcauf Flow_0x92f7d @@ -63,7 +64,7 @@ - + diff --git a/tests/data/model_with_lanes/lanes_with_owner_dict.bpmn b/tests/data/model_with_lanes/lanes_with_owner_dict.bpmn index 0c2af8d48..9d0f2a307 100644 --- a/tests/data/model_with_lanes/lanes_with_owner_dict.bpmn +++ b/tests/data/model_with_lanes/lanes_with_owner_dict.bpmn @@ -1,9 +1,9 @@ - + - + StartEvent_1 diff --git a/tests/data/simple_form/simple_form.bpmn b/tests/data/simple_form/simple_form.bpmn index 410561738..a2f29fd3b 100644 --- a/tests/data/simple_form/simple_form.bpmn +++ b/tests/data/simple_form/simple_form.bpmn @@ -1,6 +1,6 @@ - + Flow_0smvjir @@ -14,6 +14,7 @@ Hello {{ name }} Department: {{ department }} + user_completing_task = get_last_user_completing_task("Process_WithForm", "Activity_SimpleForm") Flow_1ly1khd Flow_1boyhcj @@ -25,13 +26,14 @@ Department: {{ department }} + process_initiator_user = get_process_initiator_user() Flow_0smvjir Flow_1ly1khd - + diff --git a/tests/data/simple_form_with_error/simple_form_with_error.bpmn b/tests/data/simple_form_with_error/simple_form_with_error.bpmn index 351d53a65..43d3d1167 100644 --- a/tests/data/simple_form_with_error/simple_form_with_error.bpmn +++ b/tests/data/simple_form_with_error/simple_form_with_error.bpmn @@ -1,6 +1,6 @@ - + Flow_0smvjir @@ -31,7 +31,7 @@ Department: {{ department }} - + diff --git a/tests/data/simple_script/simple_script.bpmn b/tests/data/simple_script/simple_script.bpmn index 6e14807fa..f5efba61d 100644 --- a/tests/data/simple_script/simple_script.bpmn +++ b/tests/data/simple_script/simple_script.bpmn @@ -1,6 +1,6 @@ - + Flow_0r3ua0i @@ -48,7 +48,7 @@ b = 2 - + diff --git a/tests/spiffworkflow_backend/integration/test_process_api.py b/tests/spiffworkflow_backend/integration/test_process_api.py index a72796e0f..881f11ca5 100644 --- a/tests/spiffworkflow_backend/integration/test_process_api.py +++ b/tests/spiffworkflow_backend/integration/test_process_api.py @@ -583,7 +583,7 @@ class TestProcessApi(BaseTest): # We should get 5 back, as one of the items in the cache is a decision. assert len(response.json) == 5 simple_form = next( - p for p in response.json if p["identifier"] == "Proccess_WithForm" + p for p in response.json if p["identifier"] == "Process_WithForm" ) assert simple_form["display_name"] == "Process With Form" assert simple_form["process_model_id"] == "test_group_one/simple_form" @@ -1347,11 +1347,12 @@ class TestProcessApi(BaseTest): bpmn_file_location=bpmn_file_location, ) - message_model_identifier = "message_send" + message_model_identifier = "Request Approval" payload = { - "topica": "the_topica_string", - "topicb": "the_topicb_string", - "andThis": "another_item_non_key", + "customer_id": "sartography", + "po_number": "1001", + "amount": "One Billion Dollars! 
Mwhahahahahaha", + "description": "But seriously.", } response = client.post( f"/v1.0/messages/{message_model_identifier}", @@ -1372,7 +1373,7 @@ class TestProcessApi(BaseTest): processor = ProcessInstanceProcessor(process_instance) process_instance_data = processor.get_data() assert process_instance_data - assert process_instance_data["the_payload"] == payload + assert process_instance_data["invoice"] == payload def test_message_send_when_providing_message_to_running_process_instance( self, @@ -1395,13 +1396,12 @@ class TestProcessApi(BaseTest): bpmn_file_location=bpmn_file_location, ) - message_model_identifier = "message_response" + message_model_identifier = "Approval Result" payload = { - "the_payload": { - "topica": "the_payload.topica_string", - "topicb": "the_payload.topicb_string", - "andThis": "another_item_non_key", - } + "customer_id": "sartography", + "po_number": "1001", + "amount": "One Billion Dollars! Mwhahahahahaha", + "description": "Ya!, a-ok bud!", } response = self.create_process_instance_from_process_model_id_with_api( client, @@ -1415,9 +1415,25 @@ class TestProcessApi(BaseTest): f"/v1.0/process-instances/{self.modify_process_identifier_for_path_param(process_model_identifier)}/{process_instance_id}/run", headers=self.logged_in_headers(with_super_admin_user), ) - assert response.json is not None + process_instance = ProcessInstanceModel.query.filter_by( + id=process_instance_id + ).first() + processor = ProcessInstanceProcessor(process_instance) + processor.do_engine_steps(save=True) + task = processor.get_all_user_tasks()[0] + human_task = process_instance.active_human_tasks[0] + + ProcessInstanceService.complete_form_task( + processor, + task, + payload, + with_super_admin_user, + human_task, + ) + processor.save() + response = client.post( f"/v1.0/messages/{message_model_identifier}", content_type="application/json", @@ -1462,14 +1478,14 @@ class TestProcessApi(BaseTest): bpmn_file_location=bpmn_file_location, ) - message_model_identifier = "message_response" + message_model_identifier = "Approval Result" payload = { - "the_payload": { - "topica": "the_payload.topica_string", - "topicb": "the_payload.topicb_string", - "andThis": "another_item_non_key", - } + "customer_id": "sartography", + "po_number": "1001", + "amount": "One Billion Dollars! 
Mwhahahahahaha", + "description": "But seriously.", } + response = self.create_process_instance_from_process_model_id_with_api( client, process_model_identifier, @@ -1478,20 +1494,25 @@ class TestProcessApi(BaseTest): assert response.json is not None process_instance_id = response.json["id"] - response = client.post( - f"/v1.0/process-instances/{self.modify_process_identifier_for_path_param(process_model_identifier)}/{process_instance_id}/run", - headers=self.logged_in_headers(with_super_admin_user), - ) - - assert response.status_code == 200 - assert response.json is not None - process_instance = ProcessInstanceModel.query.filter_by( id=process_instance_id ).first() processor = ProcessInstanceProcessor(process_instance) + processor.do_engine_steps(save=True) + task = processor.get_all_user_tasks()[0] + human_task = process_instance.active_human_tasks[0] + + ProcessInstanceService.complete_form_task( + processor, + task, + payload, + with_super_admin_user, + human_task, + ) + processor.save() processor.suspend() + payload["description"] = "Message To Suspended" response = client.post( f"/v1.0/messages/{message_model_identifier}", content_type="application/json", @@ -1502,16 +1523,15 @@ class TestProcessApi(BaseTest): ) assert response.status_code == 400 assert response.json - assert response.json["error_code"] == "process_instance_is_suspended" + assert response.json["error_code"] == "message_not_accepted" processor.resume() + payload["description"] = "Message To Resumed" response = client.post( f"/v1.0/messages/{message_model_identifier}", content_type="application/json", headers=self.logged_in_headers(with_super_admin_user), - data=json.dumps( - {"payload": payload, "process_instance_id": process_instance_id} - ), + data=json.dumps({"payload": payload}), ) assert response.status_code == 200 json_data = response.json @@ -1538,7 +1558,7 @@ class TestProcessApi(BaseTest): ) assert response.status_code == 400 assert response.json - assert response.json["error_code"] == "process_instance_is_terminated" + assert response.json["error_code"] == "message_not_accepted" def test_process_instance_can_be_terminated( self, @@ -2293,11 +2313,12 @@ class TestProcessApi(BaseTest): # process_model_source_directory="message_send_one_conversation", # bpmn_file_name="message_receiver", # ) - message_model_identifier = "message_send" + message_model_identifier = "Request Approval" payload = { - "topica": "the_topica_string", - "topicb": "the_topicb_string", - "andThis": "another_item_non_key", + "customer_id": "sartography", + "po_number": "1001", + "amount": "One Billion Dollars! 
Mwhahahahahaha", + "description": "But seriously.", } response = client.post( f"/v1.0/messages/{message_model_identifier}", @@ -2309,6 +2330,7 @@ class TestProcessApi(BaseTest): assert response.json is not None process_instance_id_one = response.json["id"] + payload["po_number"] = "1002" response = client.post( f"/v1.0/messages/{message_model_identifier}", content_type="application/json", @@ -2325,7 +2347,9 @@ class TestProcessApi(BaseTest): ) assert response.status_code == 200 assert response.json is not None - assert len(response.json["results"]) == 1 + assert ( + len(response.json["results"]) == 2 + ) # Two messages, one is the completed receive, the other is new send assert ( response.json["results"][0]["process_instance_id"] == process_instance_id_one @@ -2337,7 +2361,7 @@ class TestProcessApi(BaseTest): ) assert response.status_code == 200 assert response.json is not None - assert len(response.json["results"]) == 1 + assert len(response.json["results"]) == 2 assert ( response.json["results"][0]["process_instance_id"] == process_instance_id_two @@ -2349,8 +2373,14 @@ class TestProcessApi(BaseTest): ) assert response.status_code == 200 assert response.json is not None - assert len(response.json["results"]) == 2 + # 4 -Two messages for each process (a record of the completed receive, and then a send created) + # + 2 -Two messages logged for the API Calls used to create the processes. + assert len(response.json["results"]) == 6 + @pytest.mark.skipif( + os.environ.get("SPIFFWORKFLOW_BACKEND_DATABASE_TYPE") == "postgres", + reason="look at comment in tasks_controller method task_list_my_tasks", + ) def test_correct_user_can_get_and_update_a_task( self, app: Flask, @@ -2762,12 +2792,18 @@ class TestProcessApi(BaseTest): assert response.json["status"] == "complete" response = client.get( - f"/v1.0/task-data/{self.modify_process_identifier_for_path_param(process_model_identifier)}/{process_instance_id}?all_tasks=true", + f"/v1.0/process-instances/{self.modify_process_identifier_for_path_param(process_model_identifier)}/{process_instance_id}/task-info?all_tasks=true", headers=self.logged_in_headers(with_super_admin_user), ) assert response.status_code == 200 - end = next(task for task in response.json if task["type"] == "End Event") - assert end["data"]["result"] == {"message": "message 1"} + end_task = next(task for task in response.json if task["type"] == "End Event") + response = client.get( + f"/v1.0/task-data/{self.modify_process_identifier_for_path_param(process_model_identifier)}/{process_instance_id}/{end_task['task_spiff_step']}", + headers=self.logged_in_headers(with_super_admin_user), + ) + assert response.status_code == 200 + task = response.json + assert task["data"]["result"] == {"message": "message 1"} def test_manual_complete_task( self, @@ -2828,7 +2864,7 @@ class TestProcessApi(BaseTest): ) response = client.get( - f"/v1.0/task-data/{self.modify_process_identifier_for_path_param(process_model_identifier)}/{process_instance_id}", + f"/v1.0/process-instances/{self.modify_process_identifier_for_path_param(process_model_identifier)}/{process_instance_id}/task-info", headers=self.logged_in_headers(with_super_admin_user), ) assert len(response.json) == 1 diff --git a/tests/spiffworkflow_backend/scripts/test_get_last_user_completing_task.py b/tests/spiffworkflow_backend/scripts/test_get_last_user_completing_task.py new file mode 100644 index 000000000..d6533eaec --- /dev/null +++ b/tests/spiffworkflow_backend/scripts/test_get_last_user_completing_task.py @@ -0,0 +1,69 @@ 
+"""Test_get_localtime.""" +from flask.app import Flask +from flask.testing import FlaskClient +from tests.spiffworkflow_backend.helpers.base_test import BaseTest +from tests.spiffworkflow_backend.helpers.test_data import load_test_spec + +from spiffworkflow_backend.models.user import UserModel +from spiffworkflow_backend.services.authorization_service import AuthorizationService +from spiffworkflow_backend.services.process_instance_processor import ( + ProcessInstanceProcessor, +) +from spiffworkflow_backend.services.process_instance_service import ( + ProcessInstanceService, +) + + +class TestGetLastUserCompletingTask(BaseTest): + def test_get_last_user_completing_task_script_works( + self, + app: Flask, + client: FlaskClient, + with_db_and_bpmn_file_cleanup: None, + with_super_admin_user: UserModel, + ) -> None: + """Test_sets_permission_correctly_on_human_task.""" + self.create_process_group( + client, with_super_admin_user, "test_group", "test_group" + ) + initiator_user = self.find_or_create_user("initiator_user") + assert initiator_user.principal is not None + AuthorizationService.import_permissions_from_yaml_file() + + process_model = load_test_spec( + process_model_id="misc/category_number_one/simple_form", + # bpmn_file_name="simp.bpmn", + process_model_source_directory="simple_form", + ) + process_instance = self.create_process_instance_from_process_model( + process_model=process_model, user=initiator_user + ) + processor = ProcessInstanceProcessor(process_instance) + processor.do_engine_steps(save=True) + + assert len(process_instance.active_human_tasks) == 1 + human_task = process_instance.active_human_tasks[0] + assert len(human_task.potential_owners) == 1 + assert human_task.potential_owners[0] == initiator_user + + spiff_task = processor.__class__.get_task_by_bpmn_identifier( + human_task.task_name, processor.bpmn_process_instance + ) + ProcessInstanceService.complete_form_task( + processor, spiff_task, {"name": "HEY"}, initiator_user, human_task + ) + + assert len(process_instance.active_human_tasks) == 1 + human_task = process_instance.active_human_tasks[0] + spiff_task = processor.__class__.get_task_by_bpmn_identifier( + human_task.task_name, processor.bpmn_process_instance + ) + ProcessInstanceService.complete_form_task( + processor, spiff_task, {}, initiator_user, human_task + ) + + assert spiff_task is not None + assert ( + initiator_user.username + == spiff_task.get_data("user_completing_task")["username"] + ) diff --git a/tests/spiffworkflow_backend/scripts/test_get_process_initiator_user.py b/tests/spiffworkflow_backend/scripts/test_get_process_initiator_user.py new file mode 100644 index 000000000..5e7342278 --- /dev/null +++ b/tests/spiffworkflow_backend/scripts/test_get_process_initiator_user.py @@ -0,0 +1,62 @@ +"""Test_get_localtime.""" +from spiffworkflow_backend.services.authorization_service import AuthorizationService +from tests.spiffworkflow_backend.helpers.test_data import load_test_spec + +from flask.app import Flask +from flask.testing import FlaskClient +from tests.spiffworkflow_backend.helpers.base_test import BaseTest + +from spiffworkflow_backend.models.user import UserModel +from spiffworkflow_backend.services.process_instance_processor import ( + ProcessInstanceProcessor, +) +from spiffworkflow_backend.services.process_instance_service import ( + ProcessInstanceService, +) + + +class TestGetProcessInitiatorUser(BaseTest): + + def test_get_process_initiator_user( + self, + app: Flask, + client: FlaskClient, + with_db_and_bpmn_file_cleanup: None, 
+ with_super_admin_user: UserModel, + ) -> None: + """Test_get_process_initiator_user.""" + self.create_process_group( + client, with_super_admin_user, "test_group", "test_group" + ) + initiator_user = self.find_or_create_user("initiator_user") + assert initiator_user.principal is not None + AuthorizationService.import_permissions_from_yaml_file() + + process_model = load_test_spec( + process_model_id="misc/category_number_one/simple_form", + # bpmn_file_name="simp.bpmn", + process_model_source_directory="simple_form", + ) + process_instance = self.create_process_instance_from_process_model( + process_model=process_model, user=initiator_user + ) + processor = ProcessInstanceProcessor(process_instance) + processor.do_engine_steps(save=True) + + assert len(process_instance.active_human_tasks) == 1 + human_task = process_instance.active_human_tasks[0] + assert len(human_task.potential_owners) == 1 + assert human_task.potential_owners[0] == initiator_user + + spiff_task = processor.__class__.get_task_by_bpmn_identifier( + human_task.task_name, processor.bpmn_process_instance + ) + ProcessInstanceService.complete_form_task( + processor, spiff_task, {"name": "HEY"}, initiator_user, human_task + ) + + assert spiff_task is not None + assert ( + initiator_user.username + == spiff_task.get_data("process_initiator_user")["username"] + ) diff --git a/tests/spiffworkflow_backend/unit/test_message_instance.py b/tests/spiffworkflow_backend/unit/test_message_instance.py index 6c90eb254..b48bc239c 100644 --- a/tests/spiffworkflow_backend/unit/test_message_instance.py +++ b/tests/spiffworkflow_backend/unit/test_message_instance.py @@ -6,7 +6,6 @@ from tests.spiffworkflow_backend.helpers.base_test import BaseTest from spiffworkflow_backend.models.db import db from spiffworkflow_backend.models.message_instance import MessageInstanceModel -from spiffworkflow_backend.models.message_model import MessageModel from spiffworkflow_backend.models.user import UserModel from spiffworkflow_backend.services.process_model_service import ProcessModelService @@ -38,8 +37,7 @@ class TestMessageInstance(BaseTest): with_super_admin_user: UserModel, ) -> None: """Test_can_create_message_instance.""" - message_model_identifier = "message_model_one" - message_model = self.create_message_model(message_model_identifier) + message_name = "Message Model One" process_model_identifier = self.setup_message_tests( client, with_super_admin_user ) @@ -53,8 +51,10 @@ class TestMessageInstance(BaseTest): queued_message = MessageInstanceModel( process_instance_id=process_instance.id, + user_id=process_instance.process_initiator_id, message_type="send", - message_model_id=message_model.id, + name=message_name, + payload={"Word": "Eat At Mashita's, it's delicious!"}, ) db.session.add(queued_message) db.session.commit() @@ -75,12 +75,10 @@ class TestMessageInstance(BaseTest): with_super_admin_user: UserModel, ) -> None: """Test_cannot_set_invalid_status.""" - message_model_identifier = "message_model_one" - message_model = self.create_message_model(message_model_identifier) + message_name = "message_model_one" process_model_identifier = self.setup_message_tests( client, with_super_admin_user ) - process_model = ProcessModelService.get_process_model( process_model_id=process_model_identifier ) @@ -91,8 +89,9 @@ class TestMessageInstance(BaseTest): with pytest.raises(ValueError) as exception: MessageInstanceModel( process_instance_id=process_instance.id, + user_id=process_instance.process_initiator_id, message_type="send", -
message_model_id=message_model.id, + name=message_name, status="BAD_STATUS", ) assert ( @@ -101,8 +100,9 @@ class TestMessageInstance(BaseTest): queued_message = MessageInstanceModel( process_instance_id=process_instance.id, + user_id=process_instance.process_initiator_id, message_type="send", - message_model_id=message_model.id, + name=message_name, ) db.session.add(queued_message) db.session.commit() @@ -121,8 +121,7 @@ class TestMessageInstance(BaseTest): with_super_admin_user: UserModel, ) -> None: """Test_cannot_set_invalid_message_type.""" - message_model_identifier = "message_model_one" - message_model = self.create_message_model(message_model_identifier) + message_name = "message_model_one" process_model_identifier = self.setup_message_tests( client, with_super_admin_user ) @@ -137,8 +136,9 @@ class TestMessageInstance(BaseTest): with pytest.raises(ValueError) as exception: MessageInstanceModel( process_instance_id=process_instance.id, + user_id=process_instance.process_initiator_id, message_type="BAD_MESSAGE_TYPE", - message_model_id=message_model.id, + name=message_name, ) assert ( str(exception.value) @@ -147,8 +147,9 @@ class TestMessageInstance(BaseTest): queued_message = MessageInstanceModel( process_instance_id=process_instance.id, + user_id=process_instance.process_initiator_id, message_type="send", - message_model_id=message_model.id, + name=message_name, ) db.session.add(queued_message) db.session.commit() @@ -168,8 +169,7 @@ class TestMessageInstance(BaseTest): with_super_admin_user: UserModel, ) -> None: """Test_force_failure_cause_if_status_is_failure.""" - message_model_identifier = "message_model_one" - message_model = self.create_message_model(message_model_identifier) + message_name = "message_model_one" process_model_identifier = self.setup_message_tests( client, with_super_admin_user ) @@ -183,8 +183,9 @@ class TestMessageInstance(BaseTest): queued_message = MessageInstanceModel( process_instance_id=process_instance.id, + user_id=process_instance.process_initiator_id, message_type="send", - message_model_id=message_model.id, + name=message_name, status="failed", ) db.session.add(queued_message) @@ -199,8 +200,9 @@ class TestMessageInstance(BaseTest): queued_message = MessageInstanceModel( process_instance_id=process_instance.id, + user_id=process_instance.process_initiator_id, message_type="send", - message_model_id=message_model.id, + name=message_name, ) db.session.add(queued_message) db.session.commit() @@ -211,11 +213,3 @@ class TestMessageInstance(BaseTest): db.session.commit() assert queued_message.id is not None assert queued_message.failure_cause == "THIS TEST FAILURE" - - @staticmethod - def create_message_model(message_model_identifier: str) -> MessageModel: - """Create_message_model.""" - message_model = MessageModel(identifier=message_model_identifier) - db.session.add(message_model) - db.session.commit() - return message_model diff --git a/tests/spiffworkflow_backend/unit/test_message_service.py b/tests/spiffworkflow_backend/unit/test_message_service.py index c012e287a..266004397 100644 --- a/tests/spiffworkflow_backend/unit/test_message_service.py +++ b/tests/spiffworkflow_backend/unit/test_message_service.py @@ -1,16 +1,15 @@ """Test_message_service.""" +import pytest from flask import Flask from flask.testing import FlaskClient from tests.spiffworkflow_backend.helpers.base_test import BaseTest from tests.spiffworkflow_backend.helpers.test_data import load_test_spec -from spiffworkflow_backend.models.message_correlation import 
MessageCorrelationModel -from spiffworkflow_backend.models.message_correlation_message_instance import ( - MessageCorrelationMessageInstanceModel, -) +from spiffworkflow_backend.exceptions.api_error import ApiError from spiffworkflow_backend.models.message_instance import MessageInstanceModel from spiffworkflow_backend.models.process_instance import ProcessInstanceModel from spiffworkflow_backend.models.user import UserModel +from spiffworkflow_backend.routes.messages_controller import message_send from spiffworkflow_backend.services.message_service import MessageService from spiffworkflow_backend.services.process_instance_processor import ( ProcessInstanceProcessor, @@ -23,105 +22,209 @@ from spiffworkflow_backend.services.process_instance_service import ( class TestMessageService(BaseTest): """TestMessageService.""" - def test_can_send_message_to_waiting_message( + def test_message_from_api_into_running_process( self, app: Flask, client: FlaskClient, with_db_and_bpmn_file_cleanup: None, with_super_admin_user: UserModel, ) -> None: - """Test_can_send_message_to_waiting_message.""" - process_group_id = "test_group" + """Test sending a message to a running process via the API. + + This example workflow will send a message called 'Request Approval' and then wait for a response message + of 'Approval Result'. This test assures that it will fire the message with the correct correlation properties + and will respond only to an 'Approval Result' message that has the matching correlation properties, + as sent by an API call. + """ + self.payload = { + "customer_id": "Sartography", + "po_number": 1001, + "description": "We built a new feature for messages!", + "amount": "100.00", + } + + self.start_sender_process(client, with_super_admin_user, "test_from_api") + self.assure_a_message_was_sent() + self.assure_there_is_a_process_waiting_on_a_message() + + # Make an API call to the service endpoint, but use the wrong po number + with pytest.raises(ApiError): + message_send("Approval Result", {"payload": {"po_number": 5001}}) + + # Should also return an error when making an API call with the right po number but the wrong client + with pytest.raises(ApiError): + message_send( + "Approval Result", + {"payload": {"po_number": 1001, "customer_id": "jon"}}, + ) + + # No error when calling with the correct parameters + message_send( + "Approval Result", + {"payload": {"po_number": 1001, "customer_id": "Sartography"}}, + ) + + # There is no longer a waiting message + waiting_messages = ( + MessageInstanceModel.query.filter_by(message_type="receive") + .filter_by(status="ready") + .filter_by(process_instance_id=self.process_instance.id) + .all() + ) + assert len(waiting_messages) == 0 + + # The process has completed + assert self.process_instance.status == "complete" + + def test_single_conversation_between_two_processes( + self, + app: Flask, + client: FlaskClient, + with_super_admin_user: UserModel, + ) -> None: + """Test messages between two different running processes using a single conversation.
+ + Assure that communication between two processes works the same as making a call through the API. Here + we have two process instances that are communicating with each other in one conversation about an + invoice whose details are defined in the following message payload. + """ + self.payload = { + "customer_id": "Sartography", + "po_number": 1001, + "description": "We built a new feature for messages!", + "amount": "100.00", + } + + # Load up the definition for the receiving process (it has a message start event that should cause it to + # fire when a unique message comes through). + # It is not started here; the first correlation pass below will instantiate it. + load_test_spec( + "test_group/message_receive", + process_model_source_directory="message_send_one_conversation", + bpmn_file_name="message_receiver.bpmn", + ) + + # Now start the main process + self.start_sender_process( + client, with_super_admin_user, "test_between_processes" ) + self.assure_a_message_was_sent() + + # This is typically called in a background cron process, so we will manually call it + # here in the tests. + # The first time it is called, it will instantiate a new instance of the message_receive process + MessageService.correlate_all_message_instances() + + # The sender process should still be waiting on a message to be returned to it ... + self.assure_there_is_a_process_waiting_on_a_message() + + # The second time we call correlate_all_message_instances (again, it would typically be running on cron), + # it will deliver the message that was sent from the receiver back to the original sender. + MessageService.correlate_all_message_instances() + + # But there should be no send message waiting for delivery, because + # the message receiving process should pick it up instantly via + # its start event. + waiting_messages = ( + MessageInstanceModel.query.filter_by(message_type="receive") + .filter_by(status="ready") + .filter_by(process_instance_id=self.process_instance.id) + .order_by(MessageInstanceModel.id) + .all() + ) + assert len(waiting_messages) == 0 + MessageService.correlate_all_message_instances() + MessageService.correlate_all_message_instances() + MessageService.correlate_all_message_instances() + assert len(waiting_messages) == 0 + + # The message sender process is complete + assert self.process_instance.status == "complete" + + # The message receiver process is also complete + message_receiver_process = ( + ProcessInstanceModel.query.filter_by( + process_model_identifier="test_group/message_receive" + ) + .order_by(ProcessInstanceModel.id) + .first() + ) + assert message_receiver_process.status == "complete" + + def start_sender_process( + self, + client: FlaskClient, + with_super_admin_user: UserModel, + group_name: str = "test_group", + ) -> None: + process_group_id = group_name self.create_process_group( client, with_super_admin_user, process_group_id, process_group_id ) - load_test_spec( - "test_group/message_receiver", + process_model = load_test_spec( + "test_group/message", process_model_source_directory="message_send_one_conversation", - bpmn_file_name="message_receiver.bpmn", - ) - process_model_sender = load_test_spec( - "test_group/message_sender", - process_model_source_directory="message_send_one_conversation", - bpmn_file_name="message_sender.bpmn", + bpmn_file_name="message_sender.bpmn", # Slightly misnamed, it sends and receives ) - process_instance_sender = ProcessInstanceService.create_process_instance_from_process_model_identifier( - process_model_sender.id, + self.process_instance =
ProcessInstanceService.create_process_instance_from_process_model_identifier( + process_model.id, with_super_admin_user, ) + processor_send_receive = ProcessInstanceProcessor(self.process_instance) + processor_send_receive.do_engine_steps(save=True) + task = processor_send_receive.get_all_user_tasks()[0] + human_task = self.process_instance.active_human_tasks[0] - processor_sender = ProcessInstanceProcessor(process_instance_sender) - processor_sender.do_engine_steps() - processor_sender.save() + ProcessInstanceService.complete_form_task( + processor_send_receive, + task, + self.payload, + with_super_admin_user, + human_task, + ) + processor_send_receive.save() - message_instance_result = MessageInstanceModel.query.all() - assert len(message_instance_result) == 2 - # ensure both message instances are for the same process instance - # it will be send_message and receive_message_response + def assure_a_message_was_sent(self) -> None: + # There should be one new send message for the given process instance. + send_messages = ( + MessageInstanceModel.query.filter_by(message_type="send") + .filter_by(process_instance_id=self.process_instance.id) + .order_by(MessageInstanceModel.id) + .all() + ) + assert len(send_messages) == 1 + send_message = send_messages[0] assert ( - message_instance_result[0].process_instance_id - == message_instance_result[1].process_instance_id + send_message.payload == self.payload + ), "The send message should match up with the payload" + assert send_message.name == "Request Approval" + assert send_message.status == "ready" + + def assure_there_is_a_process_waiting_on_a_message(self) -> None: + # There should be one waiting receive message for the given process instance. + waiting_messages = ( + MessageInstanceModel.query.filter_by(message_type="receive") + .filter_by(status="ready") + .filter_by(process_instance_id=self.process_instance.id) + .order_by(MessageInstanceModel.id) + .all() ) + assert len(waiting_messages) == 1 + waiting_message = waiting_messages[0] + self.assure_correlation_properties_are_right(waiting_message) - message_instance_sender = message_instance_result[0] - assert message_instance_sender.process_instance_id == process_instance_sender.id - message_correlations = MessageCorrelationModel.query.all() - assert len(message_correlations) == 2 - assert message_correlations[0].process_instance_id == process_instance_sender.id - message_correlations_message_instances = ( - MessageCorrelationMessageInstanceModel.query.all() + def assure_correlation_properties_are_right( + self, message: MessageInstanceModel + ) -> None: + # Correlation Properties should match up + po_curr = next(c for c in message.correlation_rules if c.name == "po_number") + customer_curr = next( + c for c in message.correlation_rules if c.name == "customer_id" ) - assert len(message_correlations_message_instances) == 4 - assert ( - message_correlations_message_instances[0].message_instance_id - == message_instance_sender.id - ) - assert ( - message_correlations_message_instances[1].message_instance_id - == message_instance_sender.id - ) - assert ( - message_correlations_message_instances[2].message_instance_id - == message_instance_result[1].id - ) - assert ( - message_correlations_message_instances[3].message_instance_id - == message_instance_result[1].id - ) - - # process first message - MessageService.process_message_instances() - assert message_instance_sender.status == "completed" - - process_instance_result = ProcessInstanceModel.query.all() - - assert len(process_instance_result) == 2 -
process_instance_receiver = process_instance_result[1] - - # just make sure it's a different process instance - assert process_instance_receiver.id != process_instance_sender.id - assert process_instance_receiver.status == "complete" - - message_instance_result = MessageInstanceModel.query.all() - assert len(message_instance_result) == 3 - message_instance_receiver = message_instance_result[1] - assert message_instance_receiver.id != message_instance_sender.id - assert message_instance_receiver.status == "ready" - - # process second message - MessageService.process_message_instances() - - message_instance_result = MessageInstanceModel.query.all() - assert len(message_instance_result) == 3 - for message_instance in message_instance_result: - assert message_instance.status == "completed" - - process_instance_result = ProcessInstanceModel.query.all() - assert len(process_instance_result) == 2 - for process_instance in process_instance_result: - assert process_instance.status == "complete" + assert po_curr is not None + assert customer_curr is not None def test_can_send_message_to_multiple_process_models( self, @@ -131,7 +234,7 @@ class TestMessageService(BaseTest): with_super_admin_user: UserModel, ) -> None: """Test_can_send_message_to_multiple_process_models.""" - process_group_id = "test_group" + process_group_id = "test_group_multi" self.create_process_group( client, with_super_admin_user, process_group_id, process_group_id ) @@ -155,64 +258,55 @@ class TestMessageService(BaseTest): user = self.find_or_create_user() process_instance_sender = ProcessInstanceService.create_process_instance_from_process_model_identifier( - process_model_sender.id, - user, - # process_group_identifier=process_model_sender.process_group_id, + process_model_sender.id, user ) processor_sender = ProcessInstanceProcessor(process_instance_sender) processor_sender.do_engine_steps() processor_sender.save() - message_instance_result = MessageInstanceModel.query.all() - assert len(message_instance_result) == 3 - # ensure both message instances are for the same process instance - # it will be send_message and receive_message_response + # At this point, the message_sender process has fired two different messages but those + # processes have not started, and it is now paused, waiting to receive a message, so + # we should have two sends and a receive.
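# Editor's note: an illustrative sketch, not part of the original patch. The
# breakdown described in the comment above (two sends, one receive) could also
# be checked compactly with a collections.Counter over message_type, using only
# the MessageInstanceModel columns the surrounding assertions already rely on.
from collections import Counter

type_counts = Counter(
    message_instance.message_type
    for message_instance in MessageInstanceModel.query.filter_by(
        process_instance_id=process_instance_sender.id
    ).all()
)
# Two messages fired by the sender, plus the receive it is now waiting on.
assert type_counts == Counter({"send": 2, "receive": 1})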
assert ( - message_instance_result[0].process_instance_id - == message_instance_result[1].process_instance_id + MessageInstanceModel.query.filter_by( + process_instance_id=process_instance_sender.id + ).count() + == 3 ) + assert ( + MessageInstanceModel.query.count() == 3 + ) # all messages are related to the instance + orig_send_messages = MessageInstanceModel.query.filter_by( + message_type="send" + ).all() + assert len(orig_send_messages) == 2 + assert MessageInstanceModel.query.filter_by(message_type="receive").count() == 1 - message_instance_sender = message_instance_result[0] - assert message_instance_sender.process_instance_id == process_instance_sender.id - message_correlations = MessageCorrelationModel.query.all() - assert len(message_correlations) == 4 - assert message_correlations[0].process_instance_id == process_instance_sender.id - message_correlations_message_instances = ( - MessageCorrelationMessageInstanceModel.query.all() - ) - assert len(message_correlations_message_instances) == 6 - assert ( - message_correlations_message_instances[0].message_instance_id - == message_instance_sender.id - ) - assert ( - message_correlations_message_instances[1].message_instance_id - == message_instance_sender.id - ) - assert ( - message_correlations_message_instances[2].message_instance_id - == message_instance_result[1].id - ) - assert ( - message_correlations_message_instances[3].message_instance_id - == message_instance_result[1].id - ) - - # process first message - MessageService.process_message_instances() - assert message_instance_sender.status == "completed" + # process message instances + MessageService.correlate_all_message_instances() + # Once complete, the original send messages should be completed and two new instances + # should now exist, one for each of the process instances ...
+ # for osm in orig_send_messages: + # assert osm.status == "completed" process_instance_result = ProcessInstanceModel.query.all() - assert len(process_instance_result) == 3 - process_instance_receiver_one = ProcessInstanceModel.query.filter_by( - process_model_identifier="test_group/message_receiver_one" - ).first() + process_instance_receiver_one = ( + ProcessInstanceModel.query.filter_by( + process_model_identifier="test_group/message_receiver_one" + ) + .order_by(ProcessInstanceModel.id) + .first() + ) assert process_instance_receiver_one is not None - process_instance_receiver_two = ProcessInstanceModel.query.filter_by( - process_model_identifier="test_group/message_receiver_two" - ).first() + process_instance_receiver_two = ( + ProcessInstanceModel.query.filter_by( + process_model_identifier="test_group/message_receiver_two" + ) + .order_by(ProcessInstanceModel.id) + .first() + ) assert process_instance_receiver_two is not None # just make sure it's a different process instance @@ -229,8 +323,12 @@ class TestMessageService(BaseTest): assert process_instance_receiver_two.id != process_instance_sender.id assert process_instance_receiver_two.status == "complete" - message_instance_result = MessageInstanceModel.query.all() - assert len(message_instance_result) == 5 + message_instance_result = ( + MessageInstanceModel.query.order_by(MessageInstanceModel.id) + .all() + ) + assert len(message_instance_result) == 7 message_instance_receiver_one = [ x @@ -244,21 +342,25 @@ ... ][0] assert message_instance_receiver_one is not None assert message_instance_receiver_two is not None - assert message_instance_receiver_one.id != message_instance_sender.id - assert message_instance_receiver_one.status == "ready" - assert message_instance_receiver_two.id != message_instance_sender.id - assert message_instance_receiver_two.status == "ready" - # process second message - MessageService.process_message_instances() - MessageService.process_message_instances() + # Cause a correlation event + MessageService.correlate_all_message_instances() + # We have to run it a second time because instances are firing + # more messages that need to be picked up. + MessageService.correlate_all_message_instances() - message_instance_result = MessageInstanceModel.query.all() - assert len(message_instance_result) == 6 + message_instance_result = ( + MessageInstanceModel.query.order_by(MessageInstanceModel.id) + .all() + ) + assert len(message_instance_result) == 8 for message_instance in message_instance_result: assert message_instance.status == "completed" - process_instance_result = ProcessInstanceModel.query.all() + process_instance_result = ProcessInstanceModel.query.order_by( + ProcessInstanceModel.id + ).all() assert len(process_instance_result) == 3 for process_instance in process_instance_result: assert process_instance.status == "complete"
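Editor's note: the following is an illustrative sketch, not part of the patch, showing how an external client might deliver the "Approval Result" message that the tests above exercise through the /v1.0/messages/{message name} endpoint. The base URL and port are assumptions, as is the use of the requests library, and authentication headers (the tests use logged_in_headers) are omitted. The correlation fields in the payload (po_number, customer_id) must match a waiting receive message; the tests above show a 400 response with error_code "message_not_accepted" when a message cannot be accepted.

import json

import requests  # assumption: any HTTP client would work

# The message name goes in the URL path ("Approval Result", URL-encoded);
# the correlation values ride along inside the payload.
response = requests.post(
    "http://localhost:7000/v1.0/messages/Approval%20Result",  # hypothetical base URL
    headers={"Content-Type": "application/json"},  # plus auth headers in practice
    data=json.dumps({"payload": {"po_number": 1001, "customer_id": "Sartography"}}),
)
assert response.status_code == 200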