mirror of
https://github.com/status-im/spiff-arena.git
synced 2025-02-21 22:18:21 +00:00
0bfe4400f Merge branch 'main' of github.com:sartography/spiff-arena into main fbe97dd90 Point to the latest spiffworkflow 805400312 fixed conflict with db migrations w/ burnettk 6e48c8205 Merge remote-tracking branch 'origin/main' into feature/script_get_last_user_completing_task 3a6cc9c2b added script to get process initiator w/ burnettk 37770dbca run_pyl 8a068cdb2 Merging main bf25c1cef run_pyl 4ca6e3f27 Needed an additional check for empty correlation keys - which on a RECEIVE message, should always match anything. 0fb88c50d remove unwanted test files w/ burnettk 89d92687d script to get last user completing a task is working w/ burnettk 98f056e5c Merge remote-tracking branch 'origin/main' into feature/script_get_last_user_completing_task 46af55a60 poetry remove orjson 28a0db097 wip for get_last_user_completing_task script task 2596cfeb1 postgres really will just order however it wants if you do not specify an order_by clause 5e339b2fb add ppg.ba4.sme and ba5 873cfdfc1 fix postgres db name and comment out debug job f54d6ae4d removed some unused code from task and fixed the logs table a bit w/ burnettk 7c74e216a run_pyl f53c85924 skip failing test if postgres and added comment about cause w/ burnettk fbe2237f1 # SpiffWorkflow: 1) Type Safe checking on correlation properties (no more str()) 2) A running workflows Correlations are once again at the key level. 2677736c2 look users up by service and username instead of service_id since usernames have to be unique anyway w/ burnettk 22d1f8bbb avoid using task-data endpoint for task data and only use it to get tasks based on spiff step instead 698fb5d5c put back the task data code when getting tasks b52f97453 Merge remote-tracking branch 'origin/main' into feature/task_data_api_refactor 9c4a17c93 removed commented out code w/ burnettk 3357fbef4 removed task-data endpoints since we no longer need them w/ burnettk 3b64afb65 turn off profiling again fcbcfd0ae add two users and update one 2ee7cba09 BPMN Parser was returning all retrieval expressions, rather than the ones specific to a correlation property, as was intended. Adding a correlation cache - so we have a reference of all the messages and properties (though still lacking a description of keys) Adding yet another migration, maybe should squash em. 6658e26bb added api to get task data and do not return from task data list anymore w/ burnettk 1fd3261d7 run_pyl (part 2) f6c63eb1c Merge branch 'main' of github.com:sartography/spiff-arena 5c5262a31 added comment about refactoring getting task data w/ burnettk jbirddog 6d4aa9043 lint e4c0ed7e1 add test users adfb0644f Adding Migration. ec36290f2 remove task size check since it can take a long time to run and we do not do anything with it w/ burnettk jbirddog 9b0b95f95 Merge remote-tracking branch 'origin/main' into feature/message_fixes f3d124a70 run_pyl be6ac8743 BPMN.io -- Just show the message names not the ids - to assure we are only exposing the names. SpiffWorkflow - - start_messages function should return message names, not ids. - don't catch external thrown messages within the same workflow process - add an expected value to the Correlation Property Model so we can use this well defined class as an external communication tool (rather than building an arbitrary dictionary) - Added a "get_awaiting_correlations" to an event, so we can get a list of the correlation properties related to the workflows currently defined correlation values. - workflows.waiting_events() function now returns the above awaiting correlations as the value on returned message events Backend - Dropping MessageModel and MessageCorrelationProperties - at least for now. We don't need them to send / receive messages though we may eventually want to track the messages and correlations defined across the system - these things (which are ever changing) should not be directly connected to the Messages which may be in flux - and the cross relationships between the tables could cause unexpected and unceissary errors. Commented out the caching logic so we can turn this back on later. - Slight improvement to API Errors - MessageInstances are no longer in a many-to-many relationship with Correlations - Each message instance has a unique set of message correlations specific to the instance. - Message Instances have users, and can be linked through a "counterpart_id" so you can see what send is connected to what recieve. - Message Correlations are connected to recieving message instances. It is not to a process instance, and not to a message model. They now include the expected value and retrieval expression required to validate an incoming message. - A process instance is not connected to message correlations. - Message Instances are not always tied to a process instance (for example, a Send Message from an API) - API calls to create a message use the same logic as all other message catching code. - Make use of the new waiting_events() method to check for any new recieve messages in the workflow (much easier than churning through all of the tasks) - One giant mother of a migration. ce449971a do not call serialize if we can use the cached bpmn_json instead w/ burnettk 15d720f94 Merge branch 'main' of github.com:sartography/spiff-arena 8d8347068 turn on sentry detailed tracing for task-data w/ burnettk cdf5f4053 update spiff 5c1ea3c93 run_pyl 384c272af * SpiffWorkflow event_definitions wanted to return a message event's correlation properties mested within correlation keys. But messages are directly related to properties, not to keys - and it forced a number of conversions that made for tricky code. So Messages now contain a dictionary of correlation properties only. * SpiffWorkflow did not serialize correlations - so they were lost between save and retrieve. f45103406 Allow people to run commands like "flask db upgrade" without setting specific environment variables like FLASK_SESSION_SECRET_KEY everytime - they just need to add in their own /instance/config.py with their local configuration. b169c3a87 * Re-work message tests so I could wrap my simple head around what was happening - just needed an example that made sense to me. * Clear out complex get_message_instance_receive how that many-to-many works. * Create decent error messages when correlations fail * Move correlation checks into the MessageInstance class * The APIError could bomb out ugly if it hit a workflow exception with not Task Spec. a39590912 failing test. 6db600caa Merge branch 'main' into feature/message_fixes 4942a728b work in progress - * Link between message instance and correlations is now a link table and many-to-many relationships as recommended by SQLAlchemy * Use the correlation keys, not the process id when accepting api messages. git-subtree-dir: spiffworkflow-backend git-subtree-split: 0bfe4400f4191214b8972977438ceb35a9f5b3c3
125 lines
3.8 KiB
Python
125 lines
3.8 KiB
Python
"""Conftest."""
|
|
import os
|
|
import shutil
|
|
|
|
import pytest
|
|
from flask.app import Flask
|
|
from flask.testing import FlaskClient
|
|
from tests.spiffworkflow_backend.helpers.base_test import BaseTest
|
|
|
|
from spiffworkflow_backend.models.db import db
|
|
from spiffworkflow_backend.models.process_instance import ProcessInstanceModel
|
|
from spiffworkflow_backend.models.user import UserModel
|
|
from spiffworkflow_backend.services.process_instance_processor import (
|
|
ProcessInstanceProcessor,
|
|
)
|
|
from spiffworkflow_backend.services.process_instance_service import (
|
|
ProcessInstanceService,
|
|
)
|
|
from spiffworkflow_backend.services.process_model_service import ProcessModelService
|
|
|
|
# from tests.spiffworkflow_backend.helpers.test_data import load_test_spec
|
|
|
|
|
|
# We need to call this before importing spiffworkflow_backend
|
|
# otherwise typeguard cannot work. hence the noqa: E402
|
|
if os.environ.get("RUN_TYPEGUARD") == "true":
|
|
from typeguard.importhook import install_import_hook
|
|
|
|
install_import_hook(packages="spiffworkflow_backend")
|
|
|
|
|
|
from spiffworkflow_backend import create_app # noqa: E402
|
|
|
|
|
|
@pytest.fixture(scope="session")
|
|
def app() -> Flask:
|
|
"""App."""
|
|
os.environ["SPIFFWORKFLOW_BACKEND_ENV"] = "unit_testing"
|
|
os.environ["FLASK_SESSION_SECRET_KEY"] = "super_secret_key"
|
|
app = create_app()
|
|
|
|
return app
|
|
|
|
|
|
@pytest.fixture()
|
|
def with_db_and_bpmn_file_cleanup() -> None:
|
|
"""Do it cleanly!"""
|
|
meta = db.metadata
|
|
for table in reversed(meta.sorted_tables):
|
|
db.session.execute(table.delete())
|
|
db.session.commit()
|
|
|
|
try:
|
|
yield
|
|
finally:
|
|
process_model_service = ProcessModelService()
|
|
if os.path.exists(process_model_service.root_path()):
|
|
shutil.rmtree(process_model_service.root_path())
|
|
|
|
|
|
@pytest.fixture()
|
|
def with_super_admin_user() -> UserModel:
|
|
"""With_super_admin_user."""
|
|
return BaseTest.create_user_with_permission("super_admin")
|
|
|
|
|
|
@pytest.fixture()
|
|
def setup_process_instances_for_reports(
|
|
client: FlaskClient, with_super_admin_user: UserModel
|
|
) -> list[ProcessInstanceModel]:
|
|
"""Setup_process_instances_for_reports."""
|
|
user = with_super_admin_user
|
|
process_group_id = "runs_without_input"
|
|
process_model_id = "sample"
|
|
# bpmn_file_name = "sample.bpmn"
|
|
bpmn_file_location = "sample"
|
|
process_model_identifier = BaseTest().create_group_and_model_with_bpmn(
|
|
client,
|
|
with_super_admin_user,
|
|
process_group_id=process_group_id,
|
|
process_model_id=process_model_id,
|
|
# bpmn_file_name=bpmn_file_name,
|
|
bpmn_file_location=bpmn_file_location,
|
|
)
|
|
|
|
# BaseTest().create_process_group(
|
|
# client=client, user=user, process_group_id=process_group_id, display_name=process_group_id
|
|
# )
|
|
# process_model_id = "runs_without_input/sample"
|
|
# load_test_spec(
|
|
# process_model_id=f"{process_group_id}/{process_model_id}",
|
|
# process_model_source_directory="sample"
|
|
# )
|
|
process_instances = []
|
|
for data in [kay(), ray(), jay()]:
|
|
process_instance = ProcessInstanceService.create_process_instance_from_process_model_identifier(
|
|
# process_group_identifier=process_group_id,
|
|
process_model_identifier=process_model_identifier,
|
|
user=user,
|
|
)
|
|
processor = ProcessInstanceProcessor(process_instance)
|
|
processor.slam_in_data(data)
|
|
process_instance.status = "complete"
|
|
db.session.add(process_instance)
|
|
db.session.commit()
|
|
|
|
process_instances.append(process_instance)
|
|
|
|
return process_instances
|
|
|
|
|
|
def kay() -> dict:
|
|
"""Kay."""
|
|
return {"name": "kay", "grade_level": 2, "test_score": 10}
|
|
|
|
|
|
def ray() -> dict:
|
|
"""Ray."""
|
|
return {"name": "ray", "grade_level": 1, "test_score": 9}
|
|
|
|
|
|
def jay() -> dict:
|
|
"""Jay."""
|
|
return {"name": "jay", "grade_level": 2, "test_score": 8}
|