spiff-arena/tests/SpiffWorkflow/bpmn/CollaborationTest.py

from SpiffWorkflow.bpmn.specs.SubWorkflowTask import CallActivity
from SpiffWorkflow.bpmn.workflow import BpmnWorkflow
from SpiffWorkflow.task import TaskState

from tests.SpiffWorkflow.bpmn.BpmnWorkflowTestCase import BpmnWorkflowTestCase


class CollaborationTest(BpmnWorkflowTestCase):
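    """Tests of BPMN collaborations: message and correlation parsing,
    cross-process messaging, correlation-based routing, and serialization."""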

    def testParserProvidesInfoOnMessagesAndCorrelations(self):
        parser = self.get_parser('collaboration.bpmn')
        self.assertEqual(list(parser.messages.keys()), ['love_letter', 'love_letter_response'])
        self.assertEqual(parser.correlations,
                         {'lover_name': {'name': "Lover's Name",
                                         'retrieval_expressions': [
                                             {'expression': 'lover_name',
                                              'messageRef': 'love_letter'},
                                             {'expression': 'from_name',
                                              'messageRef': 'love_letter_response'}]}}
                         )

    def testCollaboration(self):
        spec, subprocesses = self.load_collaboration('collaboration.bpmn', 'my_collaboration')
        # Only executable processes should be started
        self.assertIn('process_buddy', subprocesses)
        self.assertNotIn('random_person_process', subprocesses)
        self.workflow = BpmnWorkflow(spec, subprocesses)
        start = self.workflow.get_tasks_from_spec_name('Start')[0]
        # Set up some data to be evaluated so that the workflow can proceed
        start.data['lover_name'] = 'Peggy'
        self.workflow.do_engine_steps()
        # Call activities should be created for executable processes and be reachable
        buddy = self.workflow.get_tasks_from_spec_name('process_buddy')[0]
        self.assertIsInstance(buddy.task_spec, CallActivity)
        self.assertEqual(buddy.task_spec.spec, 'process_buddy')
        self.assertEqual(buddy.state, TaskState.WAITING)

    def testBpmnMessage(self):
        spec, subprocesses = self.load_workflow_spec('collaboration.bpmn', 'process_buddy')
        workflow = BpmnWorkflow(spec, subprocesses)
        start = workflow.get_tasks_from_spec_name('Start')[0]
        # Set up some data to be evaluated so that the workflow can proceed
        start.data['lover_name'] = 'Peggy'
        workflow.do_engine_steps()
        # An external message should be created
        messages = workflow.get_bpmn_messages()
        self.assertEqual(len(messages), 1)
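        # get_bpmn_messages() hands the pending messages to the caller and clears the
        # workflow's internal list, so bpmn_messages should now be empty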
        self.assertEqual(len(workflow.bpmn_messages), 0)
        receive = workflow.get_tasks_from_spec_name('EventReceiveLetter')[0]
        # Waiting events should contain details about what we are now waiting on
        events = workflow.waiting_events()
        self.assertEqual(1, len(events))
        self.assertEqual("Message", events[0]['event_type'])
        self.assertEqual("Love Letter Response", events[0]['name'])
        self.assertEqual(['lover'], events[0]['value'][0].correlation_keys)
        self.assertEqual('from_name', events[0]['value'][0].retrieval_expression)
        self.assertEqual('lover_name', events[0]['value'][0].name)
        workflow.catch_bpmn_message('Love Letter Response', messages[0].payload,
                                    messages[0].correlations)
        workflow.do_engine_steps()
        # The external message created above should be caught
        self.assertEqual(receive.state, TaskState.COMPLETED)
        self.assertEqual(receive.data, messages[0].payload)
        self.assertEqual(workflow.is_completed(), True)

    def testCorrelation(self):
        specs = self.get_all_specs('correlation.bpmn')
        proc_1 = specs['proc_1']
        workflow = BpmnWorkflow(proc_1, specs)
        workflow.do_engine_steps()
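        # Completing each ready task sends a message that starts another process; tag each
        # task with its index so we can later verify the responses were routed back to the
        # correct instance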
        for idx, task in enumerate(workflow.get_ready_user_tasks()):
            task.data['task_num'] = idx
            task.complete()
        workflow.do_engine_steps()
        ready_tasks = workflow.get_ready_user_tasks()
        waiting = workflow.get_tasks_from_spec_name('get_response')
        # Two processes should have been started and two corresponding catch events should be waiting
        self.assertEqual(len(ready_tasks), 2)
        self.assertEqual(len(waiting), 2)
        for task in waiting:
            self.assertEqual(task.state, TaskState.WAITING)
        # Now copy the task_num that was sent into a new variable
        for task in ready_tasks:
            task.data.update(init_id=task.data['task_num'])
            task.complete()
        workflow.do_engine_steps()
        # If the messages were routed properly, the id should match
        for task in workflow.get_tasks_from_spec_name('subprocess_end'):
            self.assertEqual(task.data['task_num'], task.data['init_id'])

    def testTwoCorrelationKeys(self):
        specs = self.get_all_specs('correlation_two_conversations.bpmn')
        proc_1 = specs['proc_1']
        workflow = BpmnWorkflow(proc_1, specs)
        workflow.do_engine_steps()
        for idx, task in enumerate(workflow.get_ready_user_tasks()):
            task.data['task_num'] = idx
            task.complete()
        workflow.do_engine_steps()
        # Two processes should have been started and two corresponding catch events should be waiting
        ready_tasks = workflow.get_ready_user_tasks()
        waiting = workflow.get_tasks_from_spec_name('get_response_one')
        self.assertEqual(len(ready_tasks), 2)
        self.assertEqual(len(waiting), 2)
        for task in waiting:
            self.assertEqual(task.state, TaskState.WAITING)
        # Now copy the task_num that was sent into a new variable
        for task in ready_tasks:
            task.data.update(init_id=task.data['task_num'])
            task.complete()
        workflow.do_engine_steps()
        # Complete dummy tasks
        for task in workflow.get_ready_user_tasks():
            task.complete()
        workflow.do_engine_steps()
        # Repeat for the other process, using a different mapped name
        ready_tasks = workflow.get_ready_user_tasks()
        waiting = workflow.get_tasks_from_spec_name('get_response_two')
        self.assertEqual(len(ready_tasks), 2)
        self.assertEqual(len(waiting), 2)
        for task in ready_tasks:
            task.data.update(subprocess=task.data['task_num'])
            task.complete()
        workflow.do_engine_steps()
        # If the messages were routed properly, the id should match
        for task in workflow.get_tasks_from_spec_name('subprocess_end'):
            self.assertEqual(task.data['task_num'], task.data['init_id'])
            self.assertEqual(task.data['task_num'], task.data['subprocess'])

    def testSerialization(self):
        spec, subprocesses = self.load_collaboration('collaboration.bpmn', 'my_collaboration')
        self.workflow = BpmnWorkflow(spec, subprocesses)
        start = self.workflow.get_tasks_from_spec_name('Start')[0]
        start.data['lover_name'] = 'Peggy'
        self.workflow.do_engine_steps()
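        # save_restore serializes the workflow and reloads it, verifying that the
        # collaboration (including its subprocesses) survives a round trip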
        self.save_restore()