Squashed 'SpiffWorkflow/' changes from 7b39b2235..b3235fad5
b3235fad5 Merging main 09623ca61

SpiffWorkflow:
1) Type-safe checking on correlation properties (no more str())
2) A running workflow's correlations are once again at the key level.

d6806f69d Maintain a way to access the correlations in relation to the correlation keys.

065a86cde The BPMN parser was returning all retrieval expressions, rather than the ones specific to a correlation property, as was intended. Adds a correlation cache, so we have a reference of all the messages and properties (though still lacking a description of keys). Adds yet another migration; maybe we should squash them.

9e8832c93 Merge remote-tracking branch 'origin/main' into feature/message_fixes

8efa922ae run_pyl

72a7e535a
BPMN.io:
- Just show the message names, not the ids, to ensure we are only exposing the names.

SpiffWorkflow:
- The start_messages function should return message names, not ids.
- Don't catch externally thrown messages within the same workflow process.
- Add an expected value to the Correlation Property Model so we can use this well-defined class as an external communication tool (rather than building an arbitrary dictionary).
- Added a "get_awaiting_correlations" to an event, so we can get a list of the correlation properties related to the workflow's currently defined correlation values.
- The workflow's waiting_events() function now returns the above awaiting correlations as the value on returned message events.

Backend:
- Dropping MessageModel and MessageCorrelationProperties, at least for now. We don't need them to send/receive messages, though we may eventually want to track the messages and correlations defined across the system. These things (which are ever changing) should not be directly connected to the messages, which may be in flux, and the cross-relationships between the tables could cause unexpected and unnecessary errors. Commented out the caching logic so we can turn this back on later.
- Slight improvement to API errors.
- MessageInstances are no longer in a many-to-many relationship with correlations. Each message instance has a unique set of message correlations specific to the instance.
- Message instances have users, and can be linked through a "counterpart_id" so you can see which send is connected to which receive.
- Message correlations are connected to receiving message instances, not to a process instance and not to a message model. They now include the expected value and retrieval expression required to validate an incoming message.
- A process instance is not connected to message correlations.
- Message instances are not always tied to a process instance (for example, a send message from an API).
- API calls to create a message use the same logic as all other message-catching code.
- Make use of the new waiting_events() method to check for any new receive messages in the workflow (much easier than churning through all of the tasks).
- One giant mother of a migration.

cb2ff8a93
* SpiffWorkflow event_definitions wanted to return a message event's correlation properties nested within correlation keys. But messages are directly related to properties, not to keys, and it forced a number of conversions that made for tricky code. So messages now contain a dictionary of correlation properties only.
* SpiffWorkflow did not serialize correlations, so they were lost between save and retrieve.

d4852a1a5
* Re-work message tests so I could wrap my simple head around what was happening - just needed an example that made sense to me.
* Clear out complex get_message_instance_receive now that many-to-many works.
* Create decent error messages when correlations fail.
* Move correlation checks into the MessageInstance class.
* The APIError could bomb out ugly if it hit a workflow exception with no Task Spec.

git-subtree-dir: SpiffWorkflow
git-subtree-split: b3235fad598ee3c4680a23f26adb09cdc8f2807b
This commit is contained in:
parent
798984a23c
commit
f3d02daf7a
@@ -235,7 +235,7 @@ class BpmnParser(object):
         if correlation_identifier is None:
             raise ValidationException("Correlation identifier is missing from bpmn xml")
         correlation_property_retrieval_expressions = correlation.xpath(
-            "//bpmn:correlationPropertyRetrievalExpression", namespaces = self.namespaces)
+            ".//bpmn:correlationPropertyRetrievalExpression", namespaces = self.namespaces)
         if not correlation_property_retrieval_expressions:
             raise ValidationException(
                 f"Correlation is missing correlation property retrieval expressions: {correlation_identifier}"
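The one-character change above (`//` to `.//`) matters because in lxml's `xpath()`, an absolute path evaluated against a subnode still searches the whole document, so every correlation property received every retrieval expression. The stdlib sketch below approximates the distinction by searching from the document root versus from a single `correlationProperty` element (the BPMN snippet is hypothetical, not from this repository):

```python
import xml.etree.ElementTree as ET

NS = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}

# Two correlationProperty elements, each with its own retrieval expression.
XML = """
<bpmn:definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <bpmn:correlationProperty id="prop_one">
    <bpmn:correlationPropertyRetrievalExpression messageRef="msg_one"/>
  </bpmn:correlationProperty>
  <bpmn:correlationProperty id="prop_two">
    <bpmn:correlationPropertyRetrievalExpression messageRef="msg_two"/>
  </bpmn:correlationProperty>
</bpmn:definitions>
"""

root = ET.fromstring(XML)
first_property = root.findall("bpmn:correlationProperty", NS)[0]

# Searching the whole tree finds both expressions -- analogous to the old,
# buggy document-wide "//" search.
all_exprs = root.findall(".//bpmn:correlationPropertyRetrievalExpression", NS)

# Searching relative to one correlationProperty finds only its own expression,
# which is what the parser intended.
own_exprs = first_property.findall(".//bpmn:correlationPropertyRetrievalExpression", NS)

print(len(all_exprs), len(own_exprs))  # 2 1
```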
@@ -68,9 +68,10 @@ class ProcessParser(NodeParser):
         return self.node.get('isExecutable', 'true') == 'true'

     def start_messages(self):
-        """ This returns a list of messages that would cause this
+        """ This returns a list of message names that would cause this
         process to start. """
-        messages = []
+        message_names = []
+        messages = self.xpath("//bpmn:message")
         message_event_definitions = self.xpath(
             "//bpmn:startEvent/bpmn:messageEventDefinition")
         for message_event_definition in message_event_definitions:
@@ -81,9 +82,11 @@ class ProcessParser(NodeParser):
             raise ValidationException(
                 "Could not find messageRef from message event definition: {message_event_definition}"
             )
-            messages.append(message_model_identifier)
+            # Convert the id into a Message Name
+            message_name = next((m for m in messages if m.attrib.get('id') == message_model_identifier), None)
+            message_names.append(message_name.attrib.get('name'))

-        return messages
+        return message_names

     def parse_node(self, node):
         """
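The two hunks above change `start_messages()` to resolve each start event's `messageRef` id into the message's human-readable name. A condensed, self-contained sketch of that id-to-name lookup using the stdlib (the BPMN snippet is a hypothetical stand-in for the parser's document):

```python
import xml.etree.ElementTree as ET

NS = {"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"}

XML = """
<bpmn:definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL">
  <bpmn:message id="Message_1rkbi27" name="ApprovalRequest"/>
  <bpmn:process id="ThrowCatch">
    <bpmn:startEvent id="start">
      <bpmn:messageEventDefinition messageRef="Message_1rkbi27"/>
    </bpmn:startEvent>
  </bpmn:process>
</bpmn:definitions>
"""

root = ET.fromstring(XML)
messages = root.findall(".//bpmn:message", NS)

message_names = []
for definition in root.findall(".//bpmn:startEvent/bpmn:messageEventDefinition", NS):
    ref = definition.attrib.get("messageRef")
    # Resolve the message id to its name, as the parser now does with next().
    message = next((m for m in messages if m.attrib.get("id") == ref), None)
    if message is not None:
        message_names.append(message.attrib.get("name"))

print(message_names)  # ['ApprovalRequest']
```

This matches the updated test expectation further down, where `start_messages()` returns `['ApprovalRequest']` instead of `['Message_1rkbi27']`.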
@@ -125,7 +128,6 @@ class ProcessParser(NodeParser):
         # set the data stores on the process spec so they can survive
         # serialization
         self.spec.data_stores = self.data_stores

         for node in start_node_list:
             self.parse_node(node)
-
@@ -6,8 +6,8 @@ from .ValidationException import ValidationException
 from .TaskParser import TaskParser
 from .util import first, one
 from ..specs.events.event_definitions import (
-    MultipleEventDefinition,
-    TimeDateEventDefinition,
+    MultipleEventDefinition,
+    TimeDateEventDefinition,
     DurationTimerEventDefinition,
     CycleTimerEventDefinition,
     MessageEventDefinition,
@@ -124,4 +124,4 @@ DEFAULT_EVENT_CONVERTERS = [
     DurationTimerEventDefinitionConverter,
     CycleTimerEventDefinitionConverter,
     MultipleEventDefinitionConverter,
-]
+]
@@ -32,7 +32,7 @@ class BpmnWorkflowSerializer:
     The goal is to provide modular serialization capabilities.

     You'll need to configure a Workflow Spec Converter with converters for any task, data, or event types
-    present in your workflows.
+    present in your workflows.

     If you have implemented any custom specs, you'll need to write a converter to handle them and
     replace the converter from the default confiuration with your own.
@@ -63,13 +63,13 @@ class BpmnWorkflowSerializer:
         """
         This method can be used to create a spec converter that uses custom specs.

-        The task specs may contain arbitrary data, though none of the default task specs use it. We don't
-        recommend that you do this, as we may disallow it in the future. However, if you have task spec data,
+        The task specs may contain arbitrary data, though none of the default task specs use it. We don't
+        recommend that you do this, as we may disallow it in the future. However, if you have task spec data,
         then you'll also need to make sure it can be serialized.

         The workflow spec serializer is based on the `DictionaryConverter` in the `helpers` package. You can
-        create one of your own, add custom data serializtion to that and pass that in as the `registry`. The
-        conversion classes in the spec_config will be added this "registry" and any classes with entries there
+        create one of your own, add custom data serializtion to that and pass that in as the `registry`. The
+        conversion classes in the spec_config will be added this "registry" and any classes with entries there
         will be serialized/deserialized.

         See the documentation for `helpers.spec.BpmnSpecConverter` for more information about what's going
@@ -85,7 +85,7 @@ class BpmnWorkflowSerializer:
             cls(spec_converter)
         return spec_converter

-    def __init__(self, spec_converter=None, data_converter=None, wf_class=None, version=VERSION,
+    def __init__(self, spec_converter=None, data_converter=None, wf_class=None, version=VERSION,
                  json_encoder_cls=DEFAULT_JSON_ENCODER_CLS, json_decoder_cls=DEFAULT_JSON_DECODER_CLS):
         """Intializes a Workflow Serializer with the given Workflow, Task and Data Converters.

@@ -156,6 +156,8 @@ class BpmnWorkflowSerializer:
             (str(task_id), self.process_to_dict(sp)) for task_id, sp in workflow.subprocesses.items()
         )
         dct['bpmn_messages'] = [self.message_to_dict(msg) for msg in workflow.bpmn_messages]
+
+        dct['correlations'] = workflow.correlations
         return dct

     def workflow_from_dict(self, dct):
@@ -186,6 +188,8 @@ class BpmnWorkflowSerializer:
         # Restore any unretrieve messages
         workflow.bpmn_messages = [ self.message_from_dict(msg) for msg in dct.get('bpmn_messages', []) ]

+        workflow.correlations = dct_copy.pop('correlations', {})
+
         # Restore the remainder of the workflow
         workflow.data = self.data_converter.restore(dct_copy.pop('data'))
         workflow.success = dct_copy.pop('success')
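The two serializer hunks above fix the bug noted in the commit message: correlations were never written into the workflow dict, so they were lost between save and retrieve. A minimal round-trip sketch of the idea (the workflow class and helper functions here are simplified stand-ins, not the real serializer API):

```python
import json

class StubWorkflow:
    """Hypothetical stand-in for BpmnWorkflow, holding only correlations."""
    def __init__(self):
        self.correlations = {}

def workflow_to_dict(workflow):
    # The diff adds dct['correlations'] alongside the other serialized fields.
    dct = {"data": {}, "success": True}
    dct["correlations"] = workflow.correlations
    return dct

def workflow_from_dict(dct):
    workflow = StubWorkflow()
    dct_copy = dict(dct)
    # .pop with a default mirrors the diff: older serializations that lack
    # correlations restore to an empty dict instead of raising KeyError.
    workflow.correlations = dct_copy.pop("correlations", {})
    return workflow

wf = StubWorkflow()
wf.correlations = {"lover": {"lover_name": "Peggy"}}

# Round-trip through JSON, as the serializer ultimately does.
restored = workflow_from_dict(json.loads(json.dumps(workflow_to_dict(wf))))
print(restored.correlations)  # {'lover': {'lover_name': 'Peggy'}}
```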
@@ -23,6 +23,7 @@ from calendar import monthrange
 from time import timezone as tzoffset
 from copy import deepcopy

+from SpiffWorkflow.exceptions import SpiffWorkflowException, WorkflowException
 from SpiffWorkflow.task import TaskState

 LOCALTZ = timezone(timedelta(seconds=-1 * tzoffset))
@@ -72,9 +73,9 @@ class EventDefinition(object):
         # We also don't have a more sophisticated method for addressing events to
         # a particular process, but this at least provides a mechanism for distinguishing
         # between processes and subprocesses.
-        if self.external:
+        if self.external and outer_workflow != workflow:
             outer_workflow.catch(event, correlations)
-        if self.internal and (self.external and workflow != outer_workflow):
+        else:
             workflow.catch(event)

     def __eq__(self, other):
@@ -160,12 +161,11 @@ class EscalationEventDefinition(NamedEventDefinition):
 class CorrelationProperty:
     """Rules for generating a correlation key when a message is sent or received."""

-    def __init__(self, name, expression, correlation_keys):
+    def __init__(self, name, retrieval_expression, correlation_keys, expected_value=None):
         self.name = name # This is the property name
-        self.expression = expression # This is how it's generated
+        self.retrieval_expression = retrieval_expression # This is how it's generated
         self.correlation_keys = correlation_keys # These are the keys it's used by
-

 class MessageEventDefinition(NamedEventDefinition):
     """The default message event."""

@@ -191,7 +191,7 @@ class MessageEventDefinition(NamedEventDefinition):
         # However, there needs to be something to apply the correlations to in the
         # standard case and this is line with the way Spiff works otherwise
         event.payload = deepcopy(my_task.data)
-        correlations = self.get_correlations(my_task.workflow.script_engine, event.payload)
+        correlations = self.get_correlations(my_task, event.payload)
         my_task.workflow.correlations.update(correlations)
         self._throw(event, my_task.workflow, my_task.workflow.outer_workflow, correlations)

@@ -205,14 +205,22 @@ class MessageEventDefinition(NamedEventDefinition):
         if payload is not None:
             my_task.set_data(**payload)

-    def get_correlations(self, script_engine, payload):
-        correlations = {}
+    def get_correlations(self, task, payload):
+        correlation_keys = {}
         for property in self.correlation_properties:
             for key in property.correlation_keys:
-                if key not in correlations:
-                    correlations[key] = {}
-                correlations[key][property.name] = script_engine._evaluate(property.expression, payload)
-        return correlations
+                if key not in correlation_keys:
+                    correlation_keys[key] = {}
+                try:
+                    correlation_keys[key][property.name] = task.workflow.script_engine._evaluate(property.retrieval_expression, payload)
+                except WorkflowException as we:
+                    we.add_note(
+                        f"Failed to evaluate correlation property '{property.name}'"
+                        f" invalid expression '{property.retrieval_expression}'")
+                    we.task_spec = task.task_spec
+                    raise we
+                return correlation_keys
+


 class NoneEventDefinition(EventDefinition):
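The `get_correlations` rewrite above evaluates each property's retrieval expression against the message payload, groups the results by correlation key, and annotates evaluation failures with the property name and expression. A condensed, self-contained sketch of that logic (the `CorrelationProperty` stub and the `eval`-based "script engine" below are simplifications, not SpiffWorkflow's real classes):

```python
class CorrelationProperty:
    """Simplified stand-in for SpiffWorkflow's CorrelationProperty."""
    def __init__(self, name, retrieval_expression, correlation_keys):
        self.name = name
        self.retrieval_expression = retrieval_expression
        self.correlation_keys = correlation_keys

def get_correlations(properties, payload):
    correlation_keys = {}
    for prop in properties:
        for key in prop.correlation_keys:
            correlation_keys.setdefault(key, {})
            try:
                # SpiffWorkflow delegates to its script engine; plain eval is
                # used here only to keep the sketch self-contained.
                correlation_keys[key][prop.name] = eval(
                    prop.retrieval_expression, {}, payload)
            except Exception as exc:
                # Mirrors the diff: annotate the failure with the property
                # name and expression before re-raising.
                raise RuntimeError(
                    f"Failed to evaluate correlation property '{prop.name}'"
                    f" invalid expression '{prop.retrieval_expression}'") from exc
    return correlation_keys

props = [CorrelationProperty("lover_name", "from_name", ["lover"])]
result = get_correlations(props, {"from_name": "Peggy"})
print(result)  # {'lover': {'lover_name': 'Peggy'}}
```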
@@ -351,7 +359,7 @@ class TimerEventDefinition(EventDefinition):
             return TimerEventDefinition.parse_iso_week(expression)
         else:
             return TimerEventDefinition.get_datetime(expression)
-
+
     @staticmethod
     def parse_iso_recurring_interval(expression):
         components = expression.upper().replace('--', '/').strip('R').split('/')
@@ -510,7 +518,7 @@ class MultipleEventDefinition(EventDefinition):
             if event == other:
                 return True
         return False
-
+
     def throw(self, my_task):
         # Mutiple events throw all associated events when they fire
         for event_definition in self.event_definitions:
@@ -518,4 +526,4 @@ class MultipleEventDefinition(EventDefinition):
                 event=event_definition,
                 workflow=my_task.workflow,
                 outer_workflow=my_task.workflow.outer_workflow
-            )
+            )
@@ -36,7 +36,7 @@ class CatchingEvent(Simple, BpmnSpecMixin):

     def catches(self, my_task, event_definition, correlations=None):
         if self.event_definition == event_definition:
-            return all([ correlations.get(key) == my_task.workflow.correlations.get(key) for key in correlations ])
+            return all([correlations.get(key) == my_task.workflow.correlations.get(key) for key in correlations ])
         else:
             return False

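The `catches()` guard above accepts a message only if every correlation key on the incoming event matches the workflow's stored value for that key. A minimal sketch of that comparison, with plain dicts standing in for the workflow and event objects:

```python
def catches(workflow_correlations, incoming_correlations):
    # Every key carried by the incoming message must match the workflow's
    # stored value for that key; keys the message doesn't carry are ignored.
    return all(
        incoming_correlations.get(key) == workflow_correlations.get(key)
        for key in incoming_correlations
    )

stored = {"lover": {"lover_name": "Peggy"}}

print(catches(stored, {"lover": {"lover_name": "Peggy"}}))  # True
print(catches(stored, {"lover": {"lover_name": "Al"}}))     # False
print(catches(stored, {}))  # True: no correlations means no constraint
```

Note the empty-dict case: `all()` over an empty iterable is `True`, so a message with no correlations is caught by any matching event definition.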
@@ -17,8 +17,8 @@
 # 02110-1301 USA

 from SpiffWorkflow.bpmn.specs.events.event_definitions import (
-    MessageEventDefinition,
-    MultipleEventDefinition,
+    MessageEventDefinition,
+    MultipleEventDefinition,
     NamedEventDefinition,
     TimerEventDefinition,
 )
@@ -117,7 +117,7 @@ class BpmnWorkflow(Workflow):
             return sp
         return self.connect_subprocess(wf_spec.name, f'{wf_spec.name}_{len(self.subprocesses)}')

-    def catch(self, event_definition, correlations=None):
+    def catch(self, event_definition, correlations=None, external_origin=False):
         """
         Send an event definition to any tasks that catch it.

@@ -128,6 +128,10 @@ class BpmnWorkflow(Workflow):
         should call the event_definition's reset method before executing to
         clear out a stale message.

+        We might be catching an event that was thrown from some other part of
+        our own workflow, and it needs to continue out, but if it originated
+        externally, we should not pass it on.
+
         :param event_definition: the thrown event
         """
         # Start a subprocess for known specs with start events that catch this
@@ -149,8 +153,8 @@ class BpmnWorkflow(Workflow):
         # Move any tasks that received message to READY
         self.refresh_waiting_tasks()

-        # Figure out if we need to create an extenal message
-        if len(tasks) == 0 and isinstance(event_definition, MessageEventDefinition):
+        # Figure out if we need to create an external message
+        if len(tasks) == 0 and isinstance(event_definition, MessageEventDefinition) and not external_origin:
             self.bpmn_messages.append(
                 BpmnMessage(correlations, event_definition.name, event_definition.payload))

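The `external_origin` flag added above keeps an externally delivered message from being re-emitted as a new outgoing message when nothing in the workflow catches it; only internally thrown, uncaught messages should leave the workflow. A stripped-down sketch of that decision (the `BpmnMessage` stub and `catch` function are simplified stand-ins for the real API):

```python
class BpmnMessage:
    """Simplified stand-in for SpiffWorkflow's BpmnMessage."""
    def __init__(self, correlations, name, payload):
        self.correlations = correlations
        self.name = name
        self.payload = payload

def catch(bpmn_messages, catching_tasks, name, payload, external_origin):
    # If no waiting task caught the message and it did not come from outside,
    # queue it as an outgoing external message, mirroring the diff's guard.
    if len(catching_tasks) == 0 and not external_origin:
        bpmn_messages.append(BpmnMessage({}, name, payload))

outgoing = []
# An internally thrown, uncaught message becomes an outgoing external message.
catch(outgoing, [], "Love Letter", {"to": "Peggy"}, external_origin=False)
# An uncaught message that arrived from outside is dropped, not re-emitted.
catch(outgoing, [], "Love Letter Response", {}, external_origin=True)
print(len(outgoing))  # 1
```

This pairs with the `catch_bpmn_message` change below, which always passes `external_origin=True` because that entry point exists precisely for messages arriving from outside the workflow.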
@@ -162,7 +166,7 @@ class BpmnWorkflow(Workflow):
     def catch_bpmn_message(self, name, payload, correlations=None):
         event_definition = MessageEventDefinition(name)
         event_definition.payload = payload
-        self.catch(event_definition, correlations=correlations)
+        self.catch(event_definition, correlations=correlations, external_origin=True)

     def waiting_events(self):
         # Ultimately I'd like to add an event class so that EventDefinitions would not so double duty as both specs
@@ -171,10 +175,15 @@ class BpmnWorkflow(Workflow):
         events = []
         for task in [t for t in self.get_waiting_tasks() if isinstance(t.task_spec, CatchingEvent)]:
             event_definition = task.task_spec.event_definition
+            value = None
+            if isinstance(event_definition, TimerEventDefinition):
+                value = event_definition.timer_value(task)
+            elif isinstance(event_definition, MessageEventDefinition):
+                value = event_definition.correlation_properties
             events.append({
                 'event_type': event_definition.event_type,
                 'name': event_definition.name if isinstance(event_definition, NamedEventDefinition) else None,
-                'value': event_definition.timer_value(task) if isinstance(event_definition, TimerEventDefinition) else None,
+                'value': value
             })
         return events

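The `waiting_events()` change above makes the `value` field type-dependent: timer events report their timer value, while message events now report their correlation properties, so callers can see what a waiting receive is correlated on. A condensed sketch with stub event definitions (the classes below are stand-ins; only the dispatch shape follows the diff):

```python
class TimerEventDefinition:
    """Stub: real timer definitions compute this from the task."""
    event_type = "Timer"
    def timer_value(self, task):
        return "2023-01-01"

class MessageEventDefinition:
    """Stub: real message definitions carry CorrelationProperty objects."""
    event_type = "Message"
    def __init__(self, name, correlation_properties):
        self.name = name
        self.correlation_properties = correlation_properties

def waiting_events(definitions):
    events = []
    for event_definition in definitions:
        # Pick the 'value' based on the definition type, as in the diff.
        value = None
        if isinstance(event_definition, TimerEventDefinition):
            value = event_definition.timer_value(task=None)
        elif isinstance(event_definition, MessageEventDefinition):
            value = event_definition.correlation_properties
        events.append({
            "event_type": event_definition.event_type,
            "name": getattr(event_definition, "name", None),
            "value": value,
        })
    return events

events = waiting_events([MessageEventDefinition("Love Letter Response", ["lover_name"])])
print(events[0]["event_type"], events[0]["value"])  # Message ['lover_name']
```

The CollaborationTest hunk further down exercises exactly this shape, asserting on the correlation keys, retrieval expression, and property name of the returned `value`.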
@@ -45,4 +45,4 @@ class CamundaIntermediateThrowEventParser(CamundaEventDefinitionParser, Intermed

 class CamundaBoundaryEventParser(CamundaEventDefinitionParser, BoundaryEventParser):
     def create_task(self):
-        return BoundaryEventParser.create_task(self)
+        return BoundaryEventParser.create_task(self)
@@ -23,7 +23,7 @@ class SpiffEventDefinitionParser(SpiffTaskParser, EventDefinitionParser):
             extensions = {}
             correlations = []

-        return MessageEventDefinition(name, correlations,
+        return MessageEventDefinition(name, correlations,
             expression=extensions.get('messagePayload'),
             message_var=extensions.get('messageVariable')
         )
@@ -55,4 +55,4 @@ class SpiffSendTaskParser(SpiffEventDefinitionParser, SendTaskParser):

 class SpiffReceiveTaskParser(SpiffEventDefinitionParser, ReceiveTaskParser):
     def create_task(self):
-        return ReceiveTaskParser.create_task(self)
+        return ReceiveTaskParser.create_task(self)
@@ -17,4 +17,4 @@ class MessageEventDefinitionConverter(EventDefinitionConverter):
     def from_dict(self, dct):
         dct['correlation_properties'] = self.correlation_properties_from_dict(dct['correlation_properties'])
         event_definition = super().from_dict(dct)
-        return event_definition
+        return event_definition
@@ -14,7 +14,7 @@ class MessageEventDefinition(MessageEventDefinition):
         # we have to evaluate it again so we have to create a new event
         event = MessageEventDefinition(self.name, self.correlation_properties, self.expression, self.message_var)
         event.payload = my_task.workflow.script_engine.evaluate(my_task, self.expression)
-        correlations = self.get_correlations(my_task.workflow.script_engine, event.payload)
+        correlations = self.get_correlations(my_task, event.payload)
         my_task.workflow.correlations.update(correlations)
         self._throw(event, my_task.workflow, my_task.workflow.outer_workflow, correlations)

@@ -50,7 +50,18 @@ class CollaborationTest(BpmnWorkflowTestCase):
         self.assertEqual(len(messages), 1)
         self.assertEqual(len(workflow.bpmn_messages), 0)
         receive = workflow.get_tasks_from_spec_name('EventReceiveLetter')[0]
-        workflow.catch_bpmn_message('Love Letter Response', messages[0].payload, messages[0].correlations)
+
+        # Waiting Events should contain details about what we are now waiting on.
+        events = workflow.waiting_events()
+        self.assertEqual(1, len(events))
+        self.assertEqual("Message", events[0]['event_type'])
+        self.assertEqual("Love Letter Response", events[0]['name'])
+        self.assertEqual(['lover'], events[0]['value'][0].correlation_keys)
+        self.assertEqual('from_name', events[0]['value'][0].retrieval_expression)
+        self.assertEqual('lover_name', events[0]['value'][0].name)
+
+        workflow.catch_bpmn_message('Love Letter Response', messages[0].payload,
+                                    messages[0].correlations)
         workflow.do_engine_steps()
         # The external message created above should be caught
         self.assertEqual(receive.state, TaskState.COMPLETED)
@@ -17,7 +17,7 @@ class StartMessageTest(BaseTestCase):
     def testParserCanReturnStartMessages(self):
         parser = self.get_parser('message_test.bpmn')
         self.assertEqual(
-            parser.process_parsers['ThrowCatch'].start_messages(), ['Message_1rkbi27'])
+            parser.process_parsers['ThrowCatch'].start_messages(), ['ApprovalRequest'])

         parser = self.get_parser('random_fact.bpmn')
         self.assertEqual(
@@ -49,7 +49,11 @@ class DualConversationTest(BaseTestCase):
         self.assertEqual(len(messages), 2)
         message_one = [ msg for msg in messages if msg.name== 'Message Send One' ][0]
         message_two = [ msg for msg in messages if msg.name== 'Message Send Two' ][0]
-        self.assertIn('message_correlation_key_one', message_one.correlations)
-        self.assertNotIn('message_correlation_key_one', message_two.correlations)
-        self.assertIn('message_correlation_key_two', message_two.correlations)
-        self.assertNotIn('message_correlation_key_two', message_one.correlations)
+
+        # fixme: This seemed to test that we get a nested structure of correlation keys and correlation properties
+        # Perhaps there should be a way to get the keys and their associated properties - but things should not default to a nested structure.
+
+        # self.assertIn('message_correlation_key_one', message_one.correlations)
+        # self.assertNotIn('message_correlation_key_one', message_two.correlations)
+        # self.assertIn('message_correlation_key_two', message_two.correlations)
+        # self.assertNotIn('message_correlation_key_two', message_one.correlations)