Merge commit '05e226fcabf9bbe6d9ca9816cd2df827926b619a' into main
@@ -46,7 +46,7 @@ class NodeParser:
             if ref is not None and ref.get('dataObjectRef') in self.process_parser.spec.data_objects:
                 specs.append(self.process_parser.spec.data_objects[ref.get('dataObjectRef')])
             else:
-                raise ValidationException(f'Cannot resolve dataInputAssociation {name}', self.node, self.file_name)
+                raise ValidationException(f'Cannot resolve dataInputAssociation {name}', self.node, self.filename)
         return specs

     def parse_outgoing_data_references(self):
@@ -56,7 +56,7 @@ class NodeParser:
             if ref is not None and ref.get('dataObjectRef') in self.process_parser.spec.data_objects:
                 specs.append(self.process_parser.spec.data_objects[ref.get('dataObjectRef')])
             else:
-                raise ValidationException(f'Cannot resolve dataOutputAssociation {name}', self.node, self.file_name)
+                raise ValidationException(f'Cannot resolve dataOutputAssociation {name}', self.node, self.filename)
         return specs

     def parse_extensions(self, node=None):
@@ -7,7 +7,7 @@ Filtering Tasks
 In our earlier example, all we did was check the lane a task was in and display
 it along with the task name and state.

-Lets take a look at a sample workflow with lanes:
+Let's take a look at a sample workflow with lanes:

 .. figure:: figures/lanes.png
    :scale: 30%
@@ -15,7 +15,7 @@ Lets take a look at a sample workflow with lanes:

    Workflow with lanes

-To get all of the tasks that are ready for the 'Customer' workflow, we could
+To get all the tasks that are ready for the 'Customer' workflow, we could
 specify the lane when retrieving ready user tasks:

 .. code:: python
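The body of that snippet is truncated by the hunk above. As a hedged sketch (assuming the :code:`get_ready_user_tasks` helper of this era accepts a :code:`lane` keyword), lane-based filtering looks like:

.. code:: python

    # Only user tasks in the 'Customer' lane; omitting the argument
    # returns ready user tasks from every lane.
    ready_tasks = workflow.get_ready_user_tasks(lane='Customer')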
@@ -50,14 +50,14 @@ Logging
 Spiff provides several loggers:

 - the :code:`spiff` logger, which emits messages when a workflow is initialized and when tasks change state
 - the :code:`spiff.metrics` logger, which emits messages containing the elapsed duration of tasks
-- the :code:`spiff.data` logger, which emits message when task or workflow data is updated.
+- the :code:`spiff.data` logger, which emits a message when task or workflow data is updated.

 Log level :code:`INFO` will provide reasonably detailed information about state changes.

 As usual, log level :code:`DEBUG` will probably provide more logs than you really want
 to see, but the logs will contain the task and task internal data.

-Data can be included at any level less than :code:`INFO`. In our exmple application,
+Data can be included at any level less than :code:`INFO`. In our example application,
 we define a custom log level

 .. code:: python
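The custom level definition itself is truncated by the hunk; a minimal sketch (the numeric value 15, sitting between :code:`DEBUG` at 10 and :code:`INFO` at 20, is an assumption for illustration):

.. code:: python

    import logging

    # A level below INFO but above DEBUG captures data updates without
    # enabling the full DEBUG firehose.
    logging.addLevelName(15, 'DATA_LOG')
    logging.getLogger('spiff').setLevel(15)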
@@ -76,7 +76,7 @@ Serialization

 Serialization Changed in Version 1.1.7.
 Support for pre-1.1.7 serialization will be dropped in a future release.
-The old serialization method still works but it is deprecated.
+The old serialization method still works, but it is deprecated.
 To migrate your system to the new version, see "Migrating between
 serialization versions" below.

@@ -131,7 +131,7 @@ To restore the workflow:
     with open(args.restore) as state:
         wf = serializer.deserialize_json(state.read())

-The workflow serializer is designed to be flexible and modular and as such is a little complicated. It has
+The workflow serializer is designed to be flexible and modular, and as such is a little complicated. It has
 two components:

 - a workflow spec converter (which handles workflow and task specs)
@@ -141,7 +141,7 @@ The default workflow spec converter is likely to meet your needs, either on its own
 :code:`UserTask` and :code:`BusinessRuleTask` in the :code:`camunda` or :code:`spiff` and :code:`dmn` subpackages
 of this library, and all you'll need to do is add them to the list of task converters, as we did above.

-However, he default data converter is very simple, adding only JSON-serializable conversions of :code:`datetime`
+However, the default data converter is very simple, adding only JSON-serializable conversions of :code:`datetime`
 and :code:`timedelta` objects (we make these available in our default script engine) and UUIDs. If your
 workflow or task data contains objects that are not JSON-serializable, you'll need to extend ours, or extend
 its base class to create one of your own.
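To make the idea concrete, here is a standalone sketch (not the library's own converter class) of the kind of JSON-safe conversions described above:

.. code:: python

    import json
    from datetime import datetime
    from uuid import UUID

    # Illustrative only: turn non-JSON-serializable values into tagged dicts,
    # the same general strategy the default data converter uses.
    def to_serializable(obj):
        if isinstance(obj, datetime):
            return {'typename': 'datetime', 'value': obj.isoformat()}
        if isinstance(obj, UUID):
            return {'typename': 'uuid', 'value': str(obj)}
        raise TypeError(f'Cannot serialize {type(obj).__name__}')

    print(json.dumps({'ts': datetime(2022, 1, 1)}, default=to_serializable))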
@@ -245,7 +245,7 @@ The code would then look more like this:

 Because the serializer is highly customizable, we've made it possible for you to manage your own versions of the
 serialization. You can do this by passing a version number into the serializer, which will be embedded in the
-json of all workflows. This allow you to modify the serialization and customize it over time, and still manage
+json of all workflows. This allows you to modify the serialization and customize it over time, and still manage
 the different forms as you make adjustments without leaving people behind.

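A hedged sketch of that construction (the :code:`version` keyword follows the text above; the import path and the :code:`wf_spec_converter` variable are assumed to match the earlier serializer setup):

.. code:: python

    from SpiffWorkflow.bpmn.serializer import BpmnWorkflowSerializer

    # The version string is embedded in every serialized workflow, so a
    # migration script can dispatch on it later.
    serializer = BpmnWorkflowSerializer(wf_spec_converter, version='1.0')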
 Versioned Serializer

@@ -273,7 +273,7 @@ security reasons.
 and :code:`exec`! If you have security concerns, you should definitely investigate
 replacing the default with your own implementation.

-We'll cover a simple extension of custom script engine here. There is also an examples of
+We'll cover a simple extension of a custom script engine here. There is also an example of
 a similar engine based on `RestrictedPython <https://restrictedpython.readthedocs.io/en/latest/>`_
 included alongside this example.

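As a loose sketch of such an extension (the :code:`scripting_additions` keyword name is an assumption based on the 1.1.x-era constructor; verify it against the release you are using):

.. code:: python

    from SpiffWorkflow.bpmn.PythonScriptEngine import PythonScriptEngine

    def lookup_product_info(product_name):
        # Hypothetical helper exposed to workflow scripts.
        return {'name': product_name}

    script_engine = PythonScriptEngine(scripting_additions={
        'lookup_product_info': lookup_product_info,
    })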
@@ -31,7 +31,7 @@ We'll include examples of all of these types in this section.
 Transactions
 ^^^^^^^^^^^^

-We also need to introduce the concept of a Transaction, bceause certain events
+We also need to introduce the concept of a Transaction because certain events
 can only be used in that context. A Transaction is essentially a subprocess, but
 it must fully complete before it affects its outer workflow.

@@ -147,7 +147,7 @@ this tutorial.
 We ask the Employee to verify that they were able to retrieve the product; if they
 were unable to do so, then we generate an Error End Event, which we will handle
-with an Interrupting Error Boundary Event (Error events are *always* Interrupting).
+with an Interrupting Error Boundary Event (Error events are *always* interrupting).

 If the product is unavailable, our Manager will notify the customer, issue a refund,
 and cancel the order.
@@ -161,7 +161,7 @@ Event, you'll have to use Escalation, because BPMN does not allow Intermediate E
 and that Error Events cannot be Non-Interrupting.

 In our example, we'll assume that if we failed to ship the product, we can try again later,
-so we will not end the Subprocess (Escalation events can be either Interrupting or
+so, we will not end the Subprocess (Escalation events can be either Interrupting or
 Non-Interrupting).

 However, we still want to notify our customer of a delay, so we use a Non-Interrupting
@@ -23,7 +23,7 @@ Exclusive Gateway
 Exclusive gateways are used when exactly one alternative can be selected.

 Suppose our products are T-shirts and we offer product C in several colors. After
 the user selects a product, we check to see if it is customizable. Our default
 branch will be 'Not Customizable', but we'll direct the user to a second form
 if they select 'C'; our condition for choosing this branch is a simple Python
 expression.

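Such a condition is just an expression evaluated against the task data; assuming the chosen product is stored in a :code:`product_name` variable, it might read:

.. code:: python

    product_name == 'product_c'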
@@ -1,7 +1,7 @@
 BPMN Workflows
 ==============

 The basic idea of SpiffWorkflow is that you can use it to write an interpreter
 in Python that creates business applications from BPMN models. In this section,
 we'll develop a model of an example process as well as a
 simple workflow runner.
@@ -11,8 +11,8 @@ We expect that readers will fall into two general categories:
 - People with a background in BPMN who might not be very familiar with Python
 - Python developers who might not know much about BPMN

 This section of the documentation provides an example that (hopefully) serves
 the needs of both groups. We will introduce the BPMN elements that SpiffWorkflow
 supports and show how to build a simple workflow runner around them.

 SpiffWorkflow does the heavy lifting, such as keeping track of task dependencies and
@@ -29,7 +29,7 @@ Quickstart
 Check out the code in `spiff-example-cli <https://github.com/sartography/spiff-example-cli>`_
 and follow the instructions to set up an environment to run it in.

 Run the sample workflow we built up using our example application with the following
 command:

 .. code-block:: console
@@ -28,7 +28,7 @@ selections in a collection.

    Selecting more than one product

-We'll also need to update our element docmentation to display all products.
+We'll also need to update our element documentation to display all products.

 .. figure:: figures/documentation_multi.png
    :scale: 30%
@@ -1,4 +1,4 @@
 Organizing More Complex Workflows
 =================================

 BPMN Model
@@ -43,20 +43,20 @@ For a simple code example of displaying a task's lane, see `Handling Lanes`_
 Subprocesses
 ^^^^^^^^^^^^

 In general, subprocesses are a way of grouping work into smaller units. This, in
 theory, will help us to re-use sections of business logic, but it will also allow
 us to treat groups of work as a unit.

 Subprocesses come in two different flavors. In this workflow we see an Expanded
 Subprocess. Unfortunately, we can't collapse an expanded subprocess within BPMN.js,
 so expanded subprocesses are mainly useful for conceptualizing a group of tasks as
 a unit.

 It is also possible to refer to external subprocesses via a Call Activity Task. This
 allows us to 'call' a separate workflow in a different file by referencing the ID of
 the called workflow, which can simplify business logic and make it re-usable.

 We'll expand 'Fulfill Order' into sub tasks -- retrieving the product and shipping
 the order -- and create an Expanded Subprocess.

 We'll also expand our selection of products, adding several new products and the ability
@@ -68,14 +68,14 @@ to customize certain products by size and style in addition to color.

    Updated Product List

 .. note::

    I've added what customizations are available for each product in the 'Annotations'
    column of the DMN table. This is not actually used by Spiff; it simply provides
    the option of documenting the decisions contained in the table.

 Since adding gateways for navigating the new options will add a certain amount of
 clutter to our diagram, we'll create a separate workflow around selecting and
 customizing products and refer to that in our main workflow.

 .. figure:: figures/call_activity.png
@@ -116,7 +116,7 @@ our sample application, we'll simply display which lane a task belongs to.
 .. code:: python

     if hasattr(task.task_spec, 'lane') and task.task_spec.lane is not None:
         lane = f'[{task.task_spec.lane}]'
     else:
         lane = ''

@@ -17,7 +17,7 @@ instead of the `run.py <https://github.com/sartography/spiff-example-clie/blob/m
 Camunda's BPMN editor does not handle data objects in the expected way. You can create data object
 references, but there is no way to re-use data objects.

-It also does not support Message Correlations, and the inteface for generating a message payload doesn't work
+It also does not support Message Correlations, and the interface for generating a message payload doesn't work
 well in a Python environment.

 We have extended BPMN.js to correct some of these issues. The examples in this section were created using our
@@ -25,7 +25,7 @@ custom BPMN editor, `bpmn-js-spiffworkflow <https://github.com/sartography/bpmn-

 Data Objects
 ^^^^^^^^^^^^

 Data objects exist at a process level and are not visible in the diagram, but when you create a data object
 reference, you can choose what data object it points to.

@@ -35,8 +35,8 @@ Data Objects

    Configuring a data object reference

 When a data output association (a line) is drawn from a task to a data object reference, the value is copied
 from the task data to the workflow data and removed from the task. If a data input association is created from
 a data object reference, the value is temporarily copied into the task data while the task is being executed,
 and immediately removed afterwards.

@@ -59,7 +59,7 @@ the 'Enter Payment Info' has been completed.
 Configuring Messages
 ^^^^^^^^^^^^^^^^^^^^

-Messages are handled slightly differently in Spiff Message Events. On an Message Throw Event or Send Task,
+Messages are handled slightly differently in Spiff Message Events. On a Message Throw Event or Send Task,
 we define a payload, which is simply a bit of Python code that will be evaluated against the task data and
 sent along with the message. In the corresponding Message Catch Event or Receive Task, we define a
 variable name where we'll store the result.
@@ -4,13 +4,13 @@ Putting it All Together
 In this section we'll be discussing the overall structure of the workflow
 runner we developed in `spiff-example-cli <https://github.com/sartography/spiff-example-cli>`_.

 Our example application contains two different workflow runners, one that uses tasks with
 Camunda extensions
 (`run.py <https://github.com/sartography/spiff-example-cli/blob/main/run.py>`_) and one
 that uses tasks with Spiff extensions
 (`run-spiff.py <https://github.com/sartography/spiff-example-cli/blob/main/run.py>`_).

 Most of the workflow operations will not change, so shared functions are defined in
 `utils.py <https://github.com/sartography/spiff-example-cli/blob/main/utils.py>`_.

 The primary difference is handling user tasks. Spiff User Tasks define an extensions
@@ -23,7 +23,7 @@ Loading a Workflow
 -------------------

 The :code:`CamundaParser` extends the base :code:`BpmnParser`, adding functionality for
 parsing forms defined in Camunda User Tasks and decision tables defined in Camunda
 Business Rule Tasks. (There is a similar :code:`SpiffBpmnParser` used by the alternate
 runner.)

@@ -52,23 +52,23 @@ Our workflow parser looks like this:
 We'll obtain the workflow specification from the parser for the top level process
 using :code:`parser.get_spec()`.

 We have two options for finding subprocess specs. The method :code:`parser.find_all_specs()`
 will create specs for all executable processes found in every file supplied. The method
 :code:`parser.get_subprocess_specs(process)` will create specs only for processes used by
 the specified process. Both search recursively for subprocesses; the only difference is
 that the latter method limits the starting point of the search to the specified process.

-Our examples are pretty simple and we're not loading any extraneous stuff, so we'll
+Our examples are pretty simple, and we're not loading any extraneous stuff, so we'll
 just always load everything. If your entire workflow is contained in your top-level
 process, you can omit the :code:`subprocess` argument, but if your workflow contains
 call activities, you'll need to use one of these methods to find the models for any
 called processes.

 We also provide an enhanced script engine to our workflow. More information about how and
 why you might want to do this is covered in :doc:`advanced`. The :code:`script_engine`
 argument is optional and the default will be used if none is supplied.

 We return a :code:`BpmnWorkflow` that runs our top-level workflow and contains specs for any
 subprocesses defined by that workflow.

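Pulling these pieces together, loading might look like the hedged sketch below (import paths follow the 1.1.x layout, and :code:`CustomScriptEngine` stands in for the enhanced engine discussed above):

.. code:: python

    from SpiffWorkflow.camunda.parser.CamundaParser import CamundaParser
    from SpiffWorkflow.bpmn.workflow import BpmnWorkflow

    def parse(process, bpmn_files, dmn_files=None):
        parser = CamundaParser()
        if dmn_files:
            parser.add_dmn_files(dmn_files)
        parser.add_bpmn_files(bpmn_files)
        # Specs for the top-level process and anything it calls.
        top_level = parser.get_spec(process)
        subprocesses = parser.get_subprocess_specs(process)
        return BpmnWorkflow(top_level, subprocesses, script_engine=CustomScriptEngine)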
 Defining Task Handlers

@@ -91,7 +91,7 @@ We create a mapping of task type to handler, which we'll pass to our workflow ru

 This might not be a step you would need to do in an application you build, since
 you would likely have only one set of task specs that need to be parsed, handled, and
-serialized; however our `run` method is an awful lot of code to maintain in two separate
+serialized; however, our `run` method is an awful lot of code to maintain in two separate
 files.

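The mapping referred to in the hunk's context line is a plain dictionary; a hedged sketch (the handler functions and import paths are illustrative assumptions, not confirmed API):

.. code:: python

    from SpiffWorkflow.camunda.specs.UserTask import UserTask
    from SpiffWorkflow.bpmn.specs.ManualTask import ManualTask

    # Keys are task spec classes; values are the functions (defined in the
    # example app) that know how to present and complete that kind of task.
    handlers = {
        UserTask: complete_user_task,
        ManualTask: complete_manual_task,
    }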
 Running a Workflow

@@ -180,10 +180,10 @@ Examining the Workflow State
 ----------------------------

 When this application is run and we want to present steps to the user, we'll need
 to be able to examine the workflow and task states and associated data. We'll cover
 the basics of this in this section.

 The code below is a simple method for displaying information about a task. We use
 this in two ways:

 - presenting a list of tasks to a user (in this case the state will always be ready, so we won't include it)
@@ -233,7 +233,7 @@ We'll print information about our task as described above, as well as a dump of
 We can get a list of all tasks regardless of type or state with :code:`workflow.get_tasks()`.

 The actual list of tasks will get quite long (some tasks are expanded internally by Spiff into
 multiple tasks, and all gateways and events are also treated as "tasks"). So we're filtering
 the tasks to display only the ones that are meaningful to a user here.

 We'll further filter those tasks for :code:`READY` and :code:`WAITING` tasks for a more
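State-based retrieval can be sketched as follows (in 1.1.x-era releases the state masks live on :code:`Task`; later versions moved them to a :code:`TaskState` class):

.. code:: python

    from SpiffWorkflow.task import Task

    # The state values are bit flags, so they can be OR'd together.
    tasks = workflow.get_tasks(Task.READY | Task.WAITING)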
@@ -171,7 +171,7 @@ Our :code:`select_option` function simply repeats the prompt until the user
 enters a value contained in the option list.

 For other fields, we'll just store whatever the user enters, although in the case
-where they data type was specified to be a :code:`long`, we'll convert it to a
+where the data type was specified to be a :code:`long`, we'll convert it to a
 number.

 Finally, we need to explicitly store the user-provided response in a variable
|
||||||
|
|
||||||
As noted above, our template class comes from Jinja. We render the template
|
As noted above, our template class comes from Jinja. We render the template
|
||||||
using the task data, which is just a dictionary.
|
using the task data, which is just a dictionary.
|
||||||
|
|
||||||
|
|
|
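Concretely, the render step is a one-liner with the Jinja2 API:

.. code:: python

    from jinja2 import Template

    template = Template(task.task_spec.documentation)
    # task.data is a plain dictionary, which is exactly what render() expects.
    print(template.render(task.data))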
@@ -64,7 +64,7 @@ The following example also has one task, represented by the rectangle with curve

 The sequence flow is represented with a solid line connector. When the node at
 the tail of a sequence flow completes, the node at the arrowhead is enabled to start.

 A More Complicated Workflow
|
||||||
|
|
||||||
|
|
||||||
In this example, the diamond shape is called a gateway. It represents a branch
|
In this example, the diamond shape is called a gateway. It represents a branch
|
||||||
point in our flow. This gateway is an exclusive data-based gateway (also
|
point in our flow. This gateway is an exclusive data-based gateway (also
|
||||||
called an XOR gateway). With an exclusive gateway, you must take one path or
|
called an XOR gateway). With an exclusive gateway, you must take one path or
|
||||||
the other based on some data condition. BPMN has other gateway types.
|
the other based on some data condition. BPMN has other gateway types.
|
||||||
|
|
||||||
|
@@ -122,4 +122,3 @@ attached to will be cancelled if the event is received) or Non-Interrupting (in
 which case the task will continue). In both cases, flows may emanate from the
 Boundary Event, which will trigger those paths if the events occur while the task
 is being executed.
@@ -4,7 +4,7 @@ Implementing Custom Tasks
 Introduction
 ------------

-In this second tutorial we are going to implement our own task, and
+In this second tutorial, we are going to implement our own task, and
 use serialization and deserialization to store and restore it.

 If you haven't already, you should complete the first
@@ -1,7 +1,7 @@
 Non-BPMN support
 ================

 We have maintained support for legacy non-BPMN workflows, but we recommend using
 SpiffWorkflow with BPMN, as this is where current development is focused.

 .. toctree::
@@ -9,4 +9,4 @@ SpiffWorkflow with BPMN, as this is where current development is focused.

    tutorial/index
    custom-tasks/index
    patterns

@@ -1,77 +1,77 @@
 .. _patterns:

 Supported Workflow Patterns
 ===========================

 .. HINT::
    All examples are located
    `here <https://github.com/knipknap/SpiffWorkflow/blob/master/tests/SpiffWorkflow/data/spiff/>`_.

 Control-Flow Patterns
 ---------------------

 1. Sequence [control-flow/sequence.xml]
 2. Parallel Split [control-flow/parallel_split.xml]
 3. Synchronization [control-flow/synchronization.xml]
 4. Exclusive Choice [control-flow/exclusive_choice.xml]
 5. Simple Merge [control-flow/simple_merge.xml]
 6. Multi-Choice [control-flow/multi_choice.xml]
 7. Structured Synchronizing Merge [control-flow/structured_synchronizing_merge.xml]
 8. Multi-Merge [control-flow/multi_merge.xml]
 9. Structured Discriminator [control-flow/structured_discriminator.xml]
 10. Arbitrary Cycles [control-flow/arbitrary_cycles.xml]
 11. Implicit Termination [control-flow/implicit_termination.xml]
 12. Multiple Instances without Synchronization [control-flow/multi_instance_without_synch.xml]
 13. Multiple Instances with a Priori Design-Time Knowledge [control-flow/multi_instance_with_a_priori_design_time_knowledge.xml]
 14. Multiple Instances with a Priori Run-Time Knowledge [control-flow/multi_instance_with_a_priori_run_time_knowledge.xml]
 15. Multiple Instances without a Priori Run-Time Knowledge [control-flow/multi_instance_without_a_priori.xml]
 16. Deferred Choice [control-flow/deferred_choice.xml]
 17. Interleaved Parallel Routing [control-flow/interleaved_parallel_routing.xml]
 18. Milestone [control-flow/milestone.xml]
 19. Cancel Task [control-flow/cancel_task.xml]
 20. Cancel Case [control-flow/cancel_case.xml]
 21. *NOT IMPLEMENTED*
 22. Recursion [control-flow/recursion.xml]
 23. Transient Trigger [control-flow/transient_trigger.xml]
 24. Persistent Trigger [control-flow/persistent_trigger.xml]
 25. Cancel Region [control-flow/cancel_region.xml]
 26. Cancel Multiple Instance Task [control-flow/cancel_multi_instance_task.xml]
 27. Complete Multiple Instance Task [control-flow/complete_multiple_instance_activity.xml]
 28. Blocking Discriminator [control-flow/blocking_discriminator.xml]
 29. Cancelling Discriminator [control-flow/cancelling_discriminator.xml]
 30. Structured Partial Join [control-flow/structured_partial_join.xml]
 31. Blocking Partial Join [control-flow/blocking_partial_join.xml]
 32. Cancelling Partial Join [control-flow/cancelling_partial_join.xml]
 33. Generalized AND-Join [control-flow/generalized_and_join.xml]
 34. Static Partial Join for Multiple Instances [control-flow/static_partial_join_for_multi_instance.xml]
 35. Cancelling Partial Join for Multiple Instances [control-flow/cancelling_partial_join_for_multi_instance.xml]
 36. Dynamic Partial Join for Multiple Instances [control-flow/dynamic_partial_join_for_multi_instance.xml]
 37. Acyclic Synchronizing Merge [control-flow/acyclic_synchronizing_merge.xml]
 38. General Synchronizing Merge [control-flow/general_synchronizing_merge.xml]
 39. Critical Section [control-flow/critical_section.xml]
 40. Interleaved Routing [control-flow/interleaved_routing.xml]
 41. Thread Merge [control-flow/thread_merge.xml]
 42. Thread Split [control-flow/thread_split.xml]
 43. Explicit Termination [control-flow/explicit_termination.xml]

 Workflow Data Patterns
 ----------------------

 1. Task Data [data/task_data.xml]
 2. Block Data [data/block_data.xml]
 3. *NOT IMPLEMENTED*
 4. *NOT IMPLEMENTED*
 5. *NOT IMPLEMENTED*
 6. *NOT IMPLEMENTED*
 7. *NOT IMPLEMENTED*
 8. *NOT IMPLEMENTED*
 9. Task to Task [data/task_to_task.xml]
 10. Block Task to Sub-Workflow Decomposition [data/block_to_subworkflow.xml]
 11. Sub-Workflow Decomposition to Block Task [data/subworkflow_to_block.xml]

 Specs that have no corresponding workflow pattern on workflowpatterns.com
 -------------------------------------------------------------------------

 - Execute - spawns a subprocess and waits for the results
 - Transform - executes commands that can be used for data transforms
 - Celery - executes a Celery task (see http://celeryproject.org/)