Squashed 'spiffworkflow-backend/' changes from 55f5c8113..52fad891c

52fad891c update spiff
3e21b7e2e Support reporting on task data (#47)
915fde676 Merge pull request #43 from sartography/send_filters
5c4e633bb Searching for call activities seems to be working now. I had the clear_caches in the wrong place previously - fixing.
fa4c8cfbf Pre-pr cleanup
59c71401d Get ./bin/pyl to pass
66e8ad455 Fix status not being a list
62da25810 Add flag to indicate if user filtered
5cb7dce9d It's good to clear out the cache, but we can't do that with correlation properties due to foreign key constraints that are built up over time. So we'll just leave bad messages and correlations in the database for now -- we aren't doing a lot with messages yet anyway.
55020ed38 Merge branch 'main' of github.com:sartography/spiff-arena into send_filters
3334c7f9d update spiff
93526b03e pyl w/ burnettk cullerton
073ffb2ad added development permission for test user w/ burnettk cullerton
53a3f4c97 added some permissions for tasks
7415710d5 fixed broken test w/ burnettk
f648f3999 pyl w/ burnettk
b7ee3d24b added some permissions to the process model show page w/ burnettk
a77ee31b4 Merge pull request #39 from sartography/feature/call_activity_selection
324f58d3f need to filter on process here, or a DMN with the same name will obscure the BPMN.
4ed558470 use id_for_file_path when using the process model id as a path for windows and added some more permission stuff to the frontend w/ burnettk
5d06af147 Assure changes to process ids are updated in the cache on a file save, and remove old references that no longer exist.  Still some work to do here.
320a9c41a Getting ./bin/pyl to pass
51f02b20c Send filters back to client
167021e23 filters to_dict
ec68bcb24 Merge pull request #38 from sartography/feature/call_activity_selection
71b4f65b9 pyl
866c3699e Merge process_instance_list query filters with report filters (#37)
ed9936bfe Fixing a bug in SpiffWorkflow (new version in poetry.lock). Fixing a test.
89c17b7aa Fixing a bug in SpiffWorkflow (new version in poetry.lock). Don't explode when back-filling process models and hitting an error. Assure processes are executable when setting them as the default primary process. The SpecReferenceCache now uses a unique constraint across two fields (requires a new db).
18f45e90f added permission service to frontend to allow checking for permissions w/ burnettk
bd37cfefa added configuration nav item to help reduce nav items w/ burnettk
5fcc6fc87 fix mypy typing stuff. w/ jasquat
5008626b1 run_pyl
a3dcae016 Merge remote-tracking branch 'origin/main' into feature/call_activity_selection
2df8dd32d A little quick code cleanup.
58b702fa6 Adding a very simple api endpoint that just returns a list of every process known to the system.
a2a0ccac2 Assure that the list of cached Spec References includes all Process Instances and DMNs (even those that are not primary)
5dcdc225a fixed failing tests w/ burnettk
c121361d7 more refactoring for process instance list w/ burnettk
9bfb0f9e8 refactored pagination table to allow prefixing page options w/ burnettk
b03d531ab Mostly a name change from BpmnProcessIdLookup to SpecReferenceCache.  I landed on this unfortunate name because:
f7698c0ed use the same file path method in spec file service w/ burnettk
2482827c4 some fixes for windows and python 3.9 w/ burnettk
822c40525 merged in main and resolved pyl issues w/ burnettk
1cdf2af95 Merge remote-tracking branch 'origin/main' into feature/task_page
66b994efd added remaining task tables w/ burnettk
b7b6a97df work in spiff is approved and merged, updating dependency
37d9f7cc6 pyl w/ burnettk
d96aaa846 added message correlations to message instance list api call w/ burnettk
828e41c4d Adding a display name to the BPMN Process ID Lookup Table. Removing (very nearly, except for script unit tests) all the XML parsing we were doing; see related PR on SpiffWorkflow. Moved the Custom Parser into its own file to solve some circular import issues.
53d5a7b44 Merge branch 'main' into feature/call_activity_selection
5161fb6e7 Merge branch 'main' of github.com:sartography/spiff-arena into main
a2a203e8c add a search button to the call activity to allow finding a process id through some sort of admin interface.
430c03a58 noop, testing pre-commit hook.

git-subtree-dir: spiffworkflow-backend
git-subtree-split: 52fad891c58132ec8b64f56ac814409bb9e4ea4f
burnettk 2022-11-20 19:48:30 -05:00
parent ac6706197c
commit 9275b67b0d
38 changed files with 2073 additions and 1267 deletions


@@ -1,9 +1,6 @@
Spiffworkflow Backend
=====================
|Tests| |Codecov|
|pre-commit| |Black|
|Tests| |Codecov| |pre-commit| |Black|
.. |Tests| image:: https://github.com/sartography/spiffworkflow-backend/workflows/Tests/badge.svg
:target: https://github.com/sartography/spiffworkflow-backend/actions?workflow=Tests
@@ -90,5 +87,3 @@ This project was generated from `@cjolowicz`_'s `Hypermodern Python Cookiecutter
.. github-only
.. _Contributor Guide: CONTRIBUTING.rst
.. _Usage: https://spiffworkflow-backend.readthedocs.io/en/latest/usage.html
(test)


@@ -1,9 +1,10 @@
import logging
from logging.config import fileConfig
from alembic import context
from flask import current_app
from alembic import context
# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config
@@ -11,17 +12,17 @@ config = context.config
# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
logger = logging.getLogger("alembic.env")
logger = logging.getLogger('alembic.env')
# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
config.set_main_option(
"sqlalchemy.url",
str(current_app.extensions["migrate"].db.get_engine().url).replace("%", "%%"),
)
target_metadata = current_app.extensions["migrate"].db.metadata
'sqlalchemy.url',
str(current_app.extensions['migrate'].db.get_engine().url).replace(
'%', '%%'))
target_metadata = current_app.extensions['migrate'].db.metadata
# other values from the config, defined by the needs of env.py,
# can be acquired:
@@ -42,7 +43,9 @@ def run_migrations_offline():
"""
url = config.get_main_option("sqlalchemy.url")
context.configure(url=url, target_metadata=target_metadata, literal_binds=True)
context.configure(
url=url, target_metadata=target_metadata, literal_binds=True
)
with context.begin_transaction():
context.run_migrations()
@@ -60,20 +63,20 @@ def run_migrations_online():
# when there are no changes to the schema
# reference: http://alembic.zzzcomputing.com/en/latest/cookbook.html
def process_revision_directives(context, revision, directives):
if getattr(config.cmd_opts, "autogenerate", False):
if getattr(config.cmd_opts, 'autogenerate', False):
script = directives[0]
if script.upgrade_ops.is_empty():
directives[:] = []
logger.info("No changes in schema detected.")
logger.info('No changes in schema detected.')
connectable = current_app.extensions["migrate"].db.get_engine()
connectable = current_app.extensions['migrate'].db.get_engine()
with connectable.connect() as connection:
context.configure(
connection=connection,
target_metadata=target_metadata,
process_revision_directives=process_revision_directives,
**current_app.extensions["migrate"].configure_args
**current_app.extensions['migrate'].configure_args
)
with context.begin_transaction():


@@ -0,0 +1,322 @@
"""empty message
Revision ID: b7790c9c8174
Revises:
Create Date: 2022-11-15 14:11:47.309399
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = 'b7790c9c8174'
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table('group',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=255), nullable=True),
sa.Column('identifier', sa.String(length=255), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_table('message_model',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('identifier', sa.String(length=50), nullable=True),
sa.Column('name', sa.String(length=50), nullable=True),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_message_model_identifier'), 'message_model', ['identifier'], unique=True)
op.create_index(op.f('ix_message_model_name'), 'message_model', ['name'], unique=True)
op.create_table('permission_target',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('uri', sa.String(length=255), nullable=False),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('uri')
)
op.create_table('spec_reference_cache',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('identifier', sa.String(length=255), nullable=True),
sa.Column('display_name', sa.String(length=255), nullable=True),
sa.Column('process_model_id', sa.String(length=255), nullable=True),
sa.Column('type', sa.String(length=255), nullable=True),
sa.Column('file_name', sa.String(length=255), nullable=True),
sa.Column('relative_path', sa.String(length=255), nullable=True),
sa.Column('has_lanes', sa.Boolean(), nullable=True),
sa.Column('is_executable', sa.Boolean(), nullable=True),
sa.Column('is_primary', sa.Boolean(), nullable=True),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('identifier', 'type', name='_identifier_type_unique')
)
op.create_index(op.f('ix_spec_reference_cache_display_name'), 'spec_reference_cache', ['display_name'], unique=False)
op.create_index(op.f('ix_spec_reference_cache_identifier'), 'spec_reference_cache', ['identifier'], unique=False)
op.create_index(op.f('ix_spec_reference_cache_type'), 'spec_reference_cache', ['type'], unique=False)
op.create_table('spiff_logging',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('process_instance_id', sa.Integer(), nullable=False),
sa.Column('bpmn_process_identifier', sa.String(length=255), nullable=False),
sa.Column('bpmn_task_identifier', sa.String(length=255), nullable=False),
sa.Column('bpmn_task_name', sa.String(length=255), nullable=True),
sa.Column('bpmn_task_type', sa.String(length=255), nullable=True),
sa.Column('spiff_task_guid', sa.String(length=50), nullable=False),
sa.Column('timestamp', sa.DECIMAL(precision=17, scale=6), nullable=False),
sa.Column('message', sa.String(length=255), nullable=True),
sa.Column('current_user_id', sa.Integer(), nullable=True),
sa.Column('spiff_step', sa.Integer(), nullable=False),
sa.PrimaryKeyConstraint('id')
)
op.create_table('user',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('username', sa.String(length=255), nullable=False),
sa.Column('uid', sa.String(length=50), nullable=True),
sa.Column('service', sa.String(length=50), nullable=False),
sa.Column('service_id', sa.String(length=255), nullable=False),
sa.Column('name', sa.String(length=255), nullable=True),
sa.Column('email', sa.String(length=255), nullable=True),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('service', 'service_id', name='service_key'),
sa.UniqueConstraint('uid'),
sa.UniqueConstraint('username')
)
op.create_table('message_correlation_property',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('identifier', sa.String(length=50), nullable=True),
sa.Column('message_model_id', sa.Integer(), nullable=False),
sa.Column('updated_at_in_seconds', sa.Integer(), nullable=True),
sa.Column('created_at_in_seconds', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(['message_model_id'], ['message_model.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('identifier', 'message_model_id', name='message_correlation_property_unique')
)
op.create_index(op.f('ix_message_correlation_property_identifier'), 'message_correlation_property', ['identifier'], unique=False)
op.create_table('message_triggerable_process_model',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('message_model_id', sa.Integer(), nullable=False),
sa.Column('process_model_identifier', sa.String(length=50), nullable=False),
sa.Column('process_group_identifier', sa.String(length=50), nullable=False),
sa.Column('updated_at_in_seconds', sa.Integer(), nullable=True),
sa.Column('created_at_in_seconds', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(['message_model_id'], ['message_model.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('message_model_id')
)
op.create_index(op.f('ix_message_triggerable_process_model_process_group_identifier'), 'message_triggerable_process_model', ['process_group_identifier'], unique=False)
op.create_index(op.f('ix_message_triggerable_process_model_process_model_identifier'), 'message_triggerable_process_model', ['process_model_identifier'], unique=False)
op.create_table('principal',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=True),
sa.Column('group_id', sa.Integer(), nullable=True),
sa.CheckConstraint('NOT(user_id IS NULL AND group_id IS NULL)'),
sa.ForeignKeyConstraint(['group_id'], ['group.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['user.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('group_id'),
sa.UniqueConstraint('user_id')
)
op.create_table('process_instance',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('process_model_identifier', sa.String(length=255), nullable=False),
sa.Column('process_group_identifier', sa.String(length=50), nullable=False),
sa.Column('process_initiator_id', sa.Integer(), nullable=False),
sa.Column('bpmn_json', sa.JSON(), nullable=True),
sa.Column('start_in_seconds', sa.Integer(), nullable=True),
sa.Column('end_in_seconds', sa.Integer(), nullable=True),
sa.Column('updated_at_in_seconds', sa.Integer(), nullable=True),
sa.Column('created_at_in_seconds', sa.Integer(), nullable=True),
sa.Column('status', sa.String(length=50), nullable=True),
sa.Column('bpmn_version_control_type', sa.String(length=50), nullable=True),
sa.Column('bpmn_version_control_identifier', sa.String(length=255), nullable=True),
sa.Column('spiff_step', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(['process_initiator_id'], ['user.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_index(op.f('ix_process_instance_process_group_identifier'), 'process_instance', ['process_group_identifier'], unique=False)
op.create_index(op.f('ix_process_instance_process_model_identifier'), 'process_instance', ['process_model_identifier'], unique=False)
op.create_table('process_instance_report',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('identifier', sa.String(length=50), nullable=False),
sa.Column('report_metadata', sa.JSON(), nullable=True),
sa.Column('created_by_id', sa.Integer(), nullable=False),
sa.Column('created_at_in_seconds', sa.Integer(), nullable=True),
sa.Column('updated_at_in_seconds', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(['created_by_id'], ['user.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('created_by_id', 'identifier', name='process_instance_report_unique')
)
op.create_index(op.f('ix_process_instance_report_created_by_id'), 'process_instance_report', ['created_by_id'], unique=False)
op.create_index(op.f('ix_process_instance_report_identifier'), 'process_instance_report', ['identifier'], unique=False)
op.create_table('refresh_token',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('token', sa.String(length=1024), nullable=False),
sa.ForeignKeyConstraint(['user_id'], ['user.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('user_id')
)
op.create_table('secret',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('key', sa.String(length=50), nullable=False),
sa.Column('value', sa.Text(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('updated_at_in_seconds', sa.Integer(), nullable=True),
sa.Column('created_at_in_seconds', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(['user_id'], ['user.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('key')
)
op.create_table('spiff_step_details',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('process_instance_id', sa.Integer(), nullable=False),
sa.Column('spiff_step', sa.Integer(), nullable=False),
sa.Column('task_json', sa.JSON(), nullable=False),
sa.Column('timestamp', sa.DECIMAL(precision=17, scale=6), nullable=False),
sa.Column('completed_by_user_id', sa.Integer(), nullable=True),
sa.Column('lane_assignment_id', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(['lane_assignment_id'], ['group.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('user_group_assignment',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.Column('group_id', sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(['group_id'], ['group.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['user.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('user_id', 'group_id', name='user_group_assignment_unique')
)
op.create_table('active_task',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('process_instance_id', sa.Integer(), nullable=False),
sa.Column('actual_owner_id', sa.Integer(), nullable=True),
sa.Column('lane_assignment_id', sa.Integer(), nullable=True),
sa.Column('form_file_name', sa.String(length=50), nullable=True),
sa.Column('ui_form_file_name', sa.String(length=50), nullable=True),
sa.Column('updated_at_in_seconds', sa.Integer(), nullable=True),
sa.Column('created_at_in_seconds', sa.Integer(), nullable=True),
sa.Column('task_id', sa.String(length=50), nullable=True),
sa.Column('task_name', sa.String(length=50), nullable=True),
sa.Column('task_title', sa.String(length=50), nullable=True),
sa.Column('task_type', sa.String(length=50), nullable=True),
sa.Column('task_status', sa.String(length=50), nullable=True),
sa.Column('process_model_display_name', sa.String(length=255), nullable=True),
sa.ForeignKeyConstraint(['actual_owner_id'], ['user.id'], ),
sa.ForeignKeyConstraint(['lane_assignment_id'], ['group.id'], ),
sa.ForeignKeyConstraint(['process_instance_id'], ['process_instance.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('task_id', 'process_instance_id', name='active_task_unique')
)
op.create_table('message_correlation',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('process_instance_id', sa.Integer(), nullable=False),
sa.Column('message_correlation_property_id', sa.Integer(), nullable=False),
sa.Column('name', sa.String(length=255), nullable=False),
sa.Column('value', sa.String(length=255), nullable=False),
sa.Column('updated_at_in_seconds', sa.Integer(), nullable=True),
sa.Column('created_at_in_seconds', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(['message_correlation_property_id'], ['message_correlation_property.id'], ),
sa.ForeignKeyConstraint(['process_instance_id'], ['process_instance.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('process_instance_id', 'message_correlation_property_id', 'name', name='message_instance_id_name_unique')
)
op.create_index(op.f('ix_message_correlation_message_correlation_property_id'), 'message_correlation', ['message_correlation_property_id'], unique=False)
op.create_index(op.f('ix_message_correlation_name'), 'message_correlation', ['name'], unique=False)
op.create_index(op.f('ix_message_correlation_process_instance_id'), 'message_correlation', ['process_instance_id'], unique=False)
op.create_index(op.f('ix_message_correlation_value'), 'message_correlation', ['value'], unique=False)
op.create_table('message_instance',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('process_instance_id', sa.Integer(), nullable=False),
sa.Column('message_model_id', sa.Integer(), nullable=False),
sa.Column('message_type', sa.String(length=20), nullable=False),
sa.Column('payload', sa.JSON(), nullable=True),
sa.Column('status', sa.String(length=20), nullable=False),
sa.Column('failure_cause', sa.Text(), nullable=True),
sa.Column('updated_at_in_seconds', sa.Integer(), nullable=True),
sa.Column('created_at_in_seconds', sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(['message_model_id'], ['message_model.id'], ),
sa.ForeignKeyConstraint(['process_instance_id'], ['process_instance.id'], ),
sa.PrimaryKeyConstraint('id')
)
op.create_table('permission_assignment',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('principal_id', sa.Integer(), nullable=False),
sa.Column('permission_target_id', sa.Integer(), nullable=False),
sa.Column('grant_type', sa.String(length=50), nullable=False),
sa.Column('permission', sa.String(length=50), nullable=False),
sa.ForeignKeyConstraint(['permission_target_id'], ['permission_target.id'], ),
sa.ForeignKeyConstraint(['principal_id'], ['principal.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('principal_id', 'permission_target_id', 'permission', name='permission_assignment_uniq')
)
op.create_table('active_task_user',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('active_task_id', sa.Integer(), nullable=False),
sa.Column('user_id', sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(['active_task_id'], ['active_task.id'], ),
sa.ForeignKeyConstraint(['user_id'], ['user.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('active_task_id', 'user_id', name='active_task_user_unique')
)
op.create_index(op.f('ix_active_task_user_active_task_id'), 'active_task_user', ['active_task_id'], unique=False)
op.create_index(op.f('ix_active_task_user_user_id'), 'active_task_user', ['user_id'], unique=False)
op.create_table('message_correlation_message_instance',
sa.Column('id', sa.Integer(), nullable=False),
sa.Column('message_instance_id', sa.Integer(), nullable=False),
sa.Column('message_correlation_id', sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(['message_correlation_id'], ['message_correlation.id'], ),
sa.ForeignKeyConstraint(['message_instance_id'], ['message_instance.id'], ),
sa.PrimaryKeyConstraint('id'),
sa.UniqueConstraint('message_instance_id', 'message_correlation_id', name='message_correlation_message_instance_unique')
)
op.create_index(op.f('ix_message_correlation_message_instance_message_correlation_id'), 'message_correlation_message_instance', ['message_correlation_id'], unique=False)
op.create_index(op.f('ix_message_correlation_message_instance_message_instance_id'), 'message_correlation_message_instance', ['message_instance_id'], unique=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(op.f('ix_message_correlation_message_instance_message_instance_id'), table_name='message_correlation_message_instance')
op.drop_index(op.f('ix_message_correlation_message_instance_message_correlation_id'), table_name='message_correlation_message_instance')
op.drop_table('message_correlation_message_instance')
op.drop_index(op.f('ix_active_task_user_user_id'), table_name='active_task_user')
op.drop_index(op.f('ix_active_task_user_active_task_id'), table_name='active_task_user')
op.drop_table('active_task_user')
op.drop_table('permission_assignment')
op.drop_table('message_instance')
op.drop_index(op.f('ix_message_correlation_value'), table_name='message_correlation')
op.drop_index(op.f('ix_message_correlation_process_instance_id'), table_name='message_correlation')
op.drop_index(op.f('ix_message_correlation_name'), table_name='message_correlation')
op.drop_index(op.f('ix_message_correlation_message_correlation_property_id'), table_name='message_correlation')
op.drop_table('message_correlation')
op.drop_table('active_task')
op.drop_table('user_group_assignment')
op.drop_table('spiff_step_details')
op.drop_table('secret')
op.drop_table('refresh_token')
op.drop_index(op.f('ix_process_instance_report_identifier'), table_name='process_instance_report')
op.drop_index(op.f('ix_process_instance_report_created_by_id'), table_name='process_instance_report')
op.drop_table('process_instance_report')
op.drop_index(op.f('ix_process_instance_process_model_identifier'), table_name='process_instance')
op.drop_index(op.f('ix_process_instance_process_group_identifier'), table_name='process_instance')
op.drop_table('process_instance')
op.drop_table('principal')
op.drop_index(op.f('ix_message_triggerable_process_model_process_model_identifier'), table_name='message_triggerable_process_model')
op.drop_index(op.f('ix_message_triggerable_process_model_process_group_identifier'), table_name='message_triggerable_process_model')
op.drop_table('message_triggerable_process_model')
op.drop_index(op.f('ix_message_correlation_property_identifier'), table_name='message_correlation_property')
op.drop_table('message_correlation_property')
op.drop_table('user')
op.drop_table('spiff_logging')
op.drop_index(op.f('ix_spec_reference_cache_type'), table_name='spec_reference_cache')
op.drop_index(op.f('ix_spec_reference_cache_identifier'), table_name='spec_reference_cache')
op.drop_index(op.f('ix_spec_reference_cache_display_name'), table_name='spec_reference_cache')
op.drop_table('spec_reference_cache')
op.drop_table('permission_target')
op.drop_index(op.f('ix_message_model_name'), table_name='message_model')
op.drop_index(op.f('ix_message_model_identifier'), table_name='message_model')
op.drop_table('message_model')
op.drop_table('group')
# ### end Alembic commands ###
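
The most consequential table above is spec_reference_cache, whose _identifier_type_unique constraint spans (identifier, type) rather than identifier alone. A minimal sketch of what that buys, assuming the SpecReferenceCache model and flask-bpmn db session that appear later in this diff; this is illustrative, not part of the migration:

from flask_bpmn.models.db import db
from spiffworkflow_backend.models.spec_reference import SpecReferenceCache

# A BPMN process and a DMN decision may now share an identifier, because
# uniqueness is enforced on (identifier, type), not on identifier alone.
db.session.add(SpecReferenceCache(identifier="Process_1234", type="process"))
db.session.add(SpecReferenceCache(identifier="Process_1234", type="decision"))
db.session.commit()  # succeeds

# A second row with the same identifier AND type violates
# _identifier_type_unique, so this commit raises sqlalchemy.exc.IntegrityError.
db.session.add(SpecReferenceCache(identifier="Process_1234", type="process"))
db.session.commit()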


@@ -1,549 +0,0 @@
"""empty message
Revision ID: fd00c59e1f60
Revises:
Create Date: 2022-11-09 14:04:14.169379
"""
import sqlalchemy as sa
from alembic import op
# revision identifiers, used by Alembic.
revision = "fd00c59e1f60"
down_revision = None
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.create_table(
"bpmn_process_id_lookup",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("bpmn_process_identifier", sa.String(length=255), nullable=True),
sa.Column("bpmn_file_relative_path", sa.String(length=255), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_bpmn_process_id_lookup_bpmn_process_identifier"),
"bpmn_process_id_lookup",
["bpmn_process_identifier"],
unique=True,
)
op.create_table(
"group",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("name", sa.String(length=255), nullable=True),
sa.Column("identifier", sa.String(length=255), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
op.create_table(
"message_model",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("identifier", sa.String(length=50), nullable=True),
sa.Column("name", sa.String(length=50), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_message_model_identifier"),
"message_model",
["identifier"],
unique=True,
)
op.create_index(
op.f("ix_message_model_name"), "message_model", ["name"], unique=True
)
op.create_table(
"permission_target",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("uri", sa.String(length=255), nullable=False),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("uri"),
)
op.create_table(
"spiff_logging",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("process_instance_id", sa.Integer(), nullable=False),
sa.Column("bpmn_process_identifier", sa.String(length=255), nullable=False),
sa.Column("bpmn_task_identifier", sa.String(length=255), nullable=False),
sa.Column("bpmn_task_name", sa.String(length=255), nullable=True),
sa.Column("bpmn_task_type", sa.String(length=255), nullable=True),
sa.Column("spiff_task_guid", sa.String(length=50), nullable=False),
sa.Column("timestamp", sa.DECIMAL(precision=17, scale=6), nullable=False),
sa.Column("message", sa.String(length=255), nullable=True),
sa.Column("current_user_id", sa.Integer(), nullable=True),
sa.Column("spiff_step", sa.Integer(), nullable=False),
sa.PrimaryKeyConstraint("id"),
)
op.create_table(
"spiff_step_details",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("process_instance_id", sa.Integer(), nullable=False),
sa.Column("spiff_step", sa.Integer(), nullable=False),
sa.Column("task_json", sa.JSON(), nullable=False),
sa.Column("timestamp", sa.DECIMAL(precision=17, scale=6), nullable=False),
sa.Column("completed_by_user_id", sa.Integer(), nullable=True),
sa.PrimaryKeyConstraint("id"),
)
op.create_table(
"user",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("username", sa.String(length=255), nullable=False),
sa.Column("uid", sa.String(length=50), nullable=True),
sa.Column("service", sa.String(length=50), nullable=False),
sa.Column("service_id", sa.String(length=255), nullable=False),
sa.Column("name", sa.String(length=255), nullable=True),
sa.Column("email", sa.String(length=255), nullable=True),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("service", "service_id", name="service_key"),
sa.UniqueConstraint("uid"),
sa.UniqueConstraint("username"),
)
op.create_table(
"message_correlation_property",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("identifier", sa.String(length=50), nullable=True),
sa.Column("message_model_id", sa.Integer(), nullable=False),
sa.Column("updated_at_in_seconds", sa.Integer(), nullable=True),
sa.Column("created_at_in_seconds", sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(
["message_model_id"],
["message_model.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint(
"identifier", "message_model_id", name="message_correlation_property_unique"
),
)
op.create_index(
op.f("ix_message_correlation_property_identifier"),
"message_correlation_property",
["identifier"],
unique=False,
)
op.create_table(
"message_triggerable_process_model",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("message_model_id", sa.Integer(), nullable=False),
sa.Column("process_model_identifier", sa.String(length=50), nullable=False),
sa.Column("process_group_identifier", sa.String(length=50), nullable=False),
sa.Column("updated_at_in_seconds", sa.Integer(), nullable=True),
sa.Column("created_at_in_seconds", sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(
["message_model_id"],
["message_model.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("message_model_id"),
)
op.create_index(
op.f("ix_message_triggerable_process_model_process_group_identifier"),
"message_triggerable_process_model",
["process_group_identifier"],
unique=False,
)
op.create_index(
op.f("ix_message_triggerable_process_model_process_model_identifier"),
"message_triggerable_process_model",
["process_model_identifier"],
unique=False,
)
op.create_table(
"principal",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=True),
sa.Column("group_id", sa.Integer(), nullable=True),
sa.CheckConstraint("NOT(user_id IS NULL AND group_id IS NULL)"),
sa.ForeignKeyConstraint(
["group_id"],
["group.id"],
),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("group_id"),
sa.UniqueConstraint("user_id"),
)
op.create_table(
"process_instance",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("process_model_identifier", sa.String(length=255), nullable=False),
sa.Column("process_group_identifier", sa.String(length=50), nullable=False),
sa.Column("process_initiator_id", sa.Integer(), nullable=False),
sa.Column("bpmn_json", sa.JSON(), nullable=True),
sa.Column("start_in_seconds", sa.Integer(), nullable=True),
sa.Column("end_in_seconds", sa.Integer(), nullable=True),
sa.Column("updated_at_in_seconds", sa.Integer(), nullable=True),
sa.Column("created_at_in_seconds", sa.Integer(), nullable=True),
sa.Column("status", sa.String(length=50), nullable=True),
sa.Column("bpmn_version_control_type", sa.String(length=50), nullable=True),
sa.Column(
"bpmn_version_control_identifier", sa.String(length=255), nullable=True
),
sa.Column("spiff_step", sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(
["process_initiator_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
)
op.create_index(
op.f("ix_process_instance_process_group_identifier"),
"process_instance",
["process_group_identifier"],
unique=False,
)
op.create_index(
op.f("ix_process_instance_process_model_identifier"),
"process_instance",
["process_model_identifier"],
unique=False,
)
op.create_table(
"process_instance_report",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("identifier", sa.String(length=50), nullable=False),
sa.Column("report_metadata", sa.JSON(), nullable=True),
sa.Column("created_by_id", sa.Integer(), nullable=False),
sa.Column("created_at_in_seconds", sa.Integer(), nullable=True),
sa.Column("updated_at_in_seconds", sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(
["created_by_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint(
"created_by_id", "identifier", name="process_instance_report_unique"
),
)
op.create_index(
op.f("ix_process_instance_report_created_by_id"),
"process_instance_report",
["created_by_id"],
unique=False,
)
op.create_index(
op.f("ix_process_instance_report_identifier"),
"process_instance_report",
["identifier"],
unique=False,
)
op.create_table(
"refresh_token",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("token", sa.String(length=1024), nullable=False),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("user_id"),
)
op.create_table(
"secret",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("key", sa.String(length=50), nullable=False),
sa.Column("value", sa.Text(), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("updated_at_in_seconds", sa.Integer(), nullable=True),
sa.Column("created_at_in_seconds", sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("key"),
)
op.create_table(
"user_group_assignment",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.Column("group_id", sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(
["group_id"],
["group.id"],
),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint("user_id", "group_id", name="user_group_assignment_unique"),
)
op.create_table(
"active_task",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("process_instance_id", sa.Integer(), nullable=False),
sa.Column("actual_owner_id", sa.Integer(), nullable=True),
sa.Column("lane_assignment_id", sa.Integer(), nullable=True),
sa.Column("form_file_name", sa.String(length=50), nullable=True),
sa.Column("ui_form_file_name", sa.String(length=50), nullable=True),
sa.Column("updated_at_in_seconds", sa.Integer(), nullable=True),
sa.Column("created_at_in_seconds", sa.Integer(), nullable=True),
sa.Column("task_id", sa.String(length=50), nullable=True),
sa.Column("task_name", sa.String(length=50), nullable=True),
sa.Column("task_title", sa.String(length=50), nullable=True),
sa.Column("task_type", sa.String(length=50), nullable=True),
sa.Column("task_status", sa.String(length=50), nullable=True),
sa.Column("process_model_display_name", sa.String(length=255), nullable=True),
sa.ForeignKeyConstraint(
["actual_owner_id"],
["user.id"],
),
sa.ForeignKeyConstraint(
["lane_assignment_id"],
["group.id"],
),
sa.ForeignKeyConstraint(
["process_instance_id"],
["process_instance.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint(
"task_id", "process_instance_id", name="active_task_unique"
),
)
op.create_table(
"message_correlation",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("process_instance_id", sa.Integer(), nullable=False),
sa.Column("message_correlation_property_id", sa.Integer(), nullable=False),
sa.Column("name", sa.String(length=255), nullable=False),
sa.Column("value", sa.String(length=255), nullable=False),
sa.Column("updated_at_in_seconds", sa.Integer(), nullable=True),
sa.Column("created_at_in_seconds", sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(
["message_correlation_property_id"],
["message_correlation_property.id"],
),
sa.ForeignKeyConstraint(
["process_instance_id"],
["process_instance.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint(
"process_instance_id",
"message_correlation_property_id",
"name",
name="message_instance_id_name_unique",
),
)
op.create_index(
op.f("ix_message_correlation_message_correlation_property_id"),
"message_correlation",
["message_correlation_property_id"],
unique=False,
)
op.create_index(
op.f("ix_message_correlation_name"),
"message_correlation",
["name"],
unique=False,
)
op.create_index(
op.f("ix_message_correlation_process_instance_id"),
"message_correlation",
["process_instance_id"],
unique=False,
)
op.create_index(
op.f("ix_message_correlation_value"),
"message_correlation",
["value"],
unique=False,
)
op.create_table(
"message_instance",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("process_instance_id", sa.Integer(), nullable=False),
sa.Column("message_model_id", sa.Integer(), nullable=False),
sa.Column("message_type", sa.String(length=20), nullable=False),
sa.Column("payload", sa.JSON(), nullable=True),
sa.Column("status", sa.String(length=20), nullable=False),
sa.Column("failure_cause", sa.Text(), nullable=True),
sa.Column("updated_at_in_seconds", sa.Integer(), nullable=True),
sa.Column("created_at_in_seconds", sa.Integer(), nullable=True),
sa.ForeignKeyConstraint(
["message_model_id"],
["message_model.id"],
),
sa.ForeignKeyConstraint(
["process_instance_id"],
["process_instance.id"],
),
sa.PrimaryKeyConstraint("id"),
)
op.create_table(
"permission_assignment",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("principal_id", sa.Integer(), nullable=False),
sa.Column("permission_target_id", sa.Integer(), nullable=False),
sa.Column("grant_type", sa.String(length=50), nullable=False),
sa.Column("permission", sa.String(length=50), nullable=False),
sa.ForeignKeyConstraint(
["permission_target_id"],
["permission_target.id"],
),
sa.ForeignKeyConstraint(
["principal_id"],
["principal.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint(
"principal_id",
"permission_target_id",
"permission",
name="permission_assignment_uniq",
),
)
op.create_table(
"active_task_user",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("active_task_id", sa.Integer(), nullable=False),
sa.Column("user_id", sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(
["active_task_id"],
["active_task.id"],
),
sa.ForeignKeyConstraint(
["user_id"],
["user.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint(
"active_task_id", "user_id", name="active_task_user_unique"
),
)
op.create_index(
op.f("ix_active_task_user_active_task_id"),
"active_task_user",
["active_task_id"],
unique=False,
)
op.create_index(
op.f("ix_active_task_user_user_id"),
"active_task_user",
["user_id"],
unique=False,
)
op.create_table(
"message_correlation_message_instance",
sa.Column("id", sa.Integer(), nullable=False),
sa.Column("message_instance_id", sa.Integer(), nullable=False),
sa.Column("message_correlation_id", sa.Integer(), nullable=False),
sa.ForeignKeyConstraint(
["message_correlation_id"],
["message_correlation.id"],
),
sa.ForeignKeyConstraint(
["message_instance_id"],
["message_instance.id"],
),
sa.PrimaryKeyConstraint("id"),
sa.UniqueConstraint(
"message_instance_id",
"message_correlation_id",
name="message_correlation_message_instance_unique",
),
)
op.create_index(
op.f("ix_message_correlation_message_instance_message_correlation_id"),
"message_correlation_message_instance",
["message_correlation_id"],
unique=False,
)
op.create_index(
op.f("ix_message_correlation_message_instance_message_instance_id"),
"message_correlation_message_instance",
["message_instance_id"],
unique=False,
)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_index(
op.f("ix_message_correlation_message_instance_message_instance_id"),
table_name="message_correlation_message_instance",
)
op.drop_index(
op.f("ix_message_correlation_message_instance_message_correlation_id"),
table_name="message_correlation_message_instance",
)
op.drop_table("message_correlation_message_instance")
op.drop_index(op.f("ix_active_task_user_user_id"), table_name="active_task_user")
op.drop_index(
op.f("ix_active_task_user_active_task_id"), table_name="active_task_user"
)
op.drop_table("active_task_user")
op.drop_table("permission_assignment")
op.drop_table("message_instance")
op.drop_index(
op.f("ix_message_correlation_value"), table_name="message_correlation"
)
op.drop_index(
op.f("ix_message_correlation_process_instance_id"),
table_name="message_correlation",
)
op.drop_index(op.f("ix_message_correlation_name"), table_name="message_correlation")
op.drop_index(
op.f("ix_message_correlation_message_correlation_property_id"),
table_name="message_correlation",
)
op.drop_table("message_correlation")
op.drop_table("active_task")
op.drop_table("user_group_assignment")
op.drop_table("secret")
op.drop_table("refresh_token")
op.drop_index(
op.f("ix_process_instance_report_identifier"),
table_name="process_instance_report",
)
op.drop_index(
op.f("ix_process_instance_report_created_by_id"),
table_name="process_instance_report",
)
op.drop_table("process_instance_report")
op.drop_index(
op.f("ix_process_instance_process_model_identifier"),
table_name="process_instance",
)
op.drop_index(
op.f("ix_process_instance_process_group_identifier"),
table_name="process_instance",
)
op.drop_table("process_instance")
op.drop_table("principal")
op.drop_index(
op.f("ix_message_triggerable_process_model_process_model_identifier"),
table_name="message_triggerable_process_model",
)
op.drop_index(
op.f("ix_message_triggerable_process_model_process_group_identifier"),
table_name="message_triggerable_process_model",
)
op.drop_table("message_triggerable_process_model")
op.drop_index(
op.f("ix_message_correlation_property_identifier"),
table_name="message_correlation_property",
)
op.drop_table("message_correlation_property")
op.drop_table("user")
op.drop_table("spiff_step_details")
op.drop_table("spiff_logging")
op.drop_table("permission_target")
op.drop_index(op.f("ix_message_model_name"), table_name="message_model")
op.drop_index(op.f("ix_message_model_identifier"), table_name="message_model")
op.drop_table("message_model")
op.drop_table("group")
op.drop_index(
op.f("ix_bpmn_process_id_lookup_bpmn_process_identifier"),
table_name="bpmn_process_id_lookup",
)
op.drop_table("bpmn_process_id_lookup")
# ### end Alembic commands ###

poetry.lock (generated)

@@ -643,7 +643,7 @@ werkzeug = "*"
type = "git"
url = "https://github.com/sartography/flask-bpmn"
reference = "main"
resolved_reference = "df9ab9a12078e4f908c87778371725e0af414a11"
resolved_reference = "860f2387bebdaa9220e9fbf6f8fa7f74e805d0d4"
[[package]]
name = "Flask-Cors"
@@ -825,7 +825,7 @@ python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*"
[[package]]
name = "importlib-metadata"
version = "5.0.0"
version = "4.13.0"
description = "Read metadata from Python packages"
category = "main"
optional = false
@@ -1876,7 +1876,7 @@ lxml = "*"
type = "git"
url = "https://github.com/sartography/SpiffWorkflow"
reference = "main"
resolved_reference = "580939cc8cb0b7ade1571483bd1e28f554434ac4"
resolved_reference = "46f410a2852baeedc8f9ac5165347ce6d4470594"
[[package]]
name = "SQLAlchemy"
@@ -2596,6 +2596,7 @@ greenlet = [
{file = "greenlet-2.0.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d5b0ff9878333823226d270417f24f4d06f235cb3e54d1103b71ea537a6a86ce"},
{file = "greenlet-2.0.1-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:be9e0fb2ada7e5124f5282d6381903183ecc73ea019568d6d63d33f25b2a9000"},
{file = "greenlet-2.0.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0b493db84d124805865adc587532ebad30efa68f79ad68f11b336e0a51ec86c2"},
{file = "greenlet-2.0.1-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:0459d94f73265744fee4c2d5ec44c6f34aa8a31017e6e9de770f7bcf29710be9"},
{file = "greenlet-2.0.1-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:a20d33124935d27b80e6fdacbd34205732660e0a1d35d8b10b3328179a2b51a1"},
{file = "greenlet-2.0.1-cp37-cp37m-win32.whl", hash = "sha256:ea688d11707d30e212e0110a1aac7f7f3f542a259235d396f88be68b649e47d1"},
{file = "greenlet-2.0.1-cp37-cp37m-win_amd64.whl", hash = "sha256:afe07421c969e259e9403c3bb658968702bc3b78ec0b6fde3ae1e73440529c23"},
@@ -2604,6 +2605,7 @@ greenlet = [
{file = "greenlet-2.0.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:659f167f419a4609bc0516fb18ea69ed39dbb25594934bd2dd4d0401660e8a1e"},
{file = "greenlet-2.0.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:356e4519d4dfa766d50ecc498544b44c0249b6de66426041d7f8b751de4d6b48"},
{file = "greenlet-2.0.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:811e1d37d60b47cb8126e0a929b58c046251f28117cb16fcd371eed61f66b764"},
{file = "greenlet-2.0.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:d38ffd0e81ba8ef347d2be0772e899c289b59ff150ebbbbe05dc61b1246eb4e0"},
{file = "greenlet-2.0.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:0109af1138afbfb8ae647e31a2b1ab030f58b21dd8528c27beaeb0093b7938a9"},
{file = "greenlet-2.0.1-cp38-cp38-win32.whl", hash = "sha256:88c8d517e78acdf7df8a2134a3c4b964415b575d2840a2746ddb1cc6175f8608"},
{file = "greenlet-2.0.1-cp38-cp38-win_amd64.whl", hash = "sha256:d6ee1aa7ab36475035eb48c01efae87d37936a8173fc4d7b10bb02c2d75dd8f6"},
@@ -2612,6 +2614,7 @@ greenlet = [
{file = "greenlet-2.0.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:505138d4fa69462447a562a7c2ef723c6025ba12ac04478bc1ce2fcc279a2db5"},
{file = "greenlet-2.0.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:cce1e90dd302f45716a7715517c6aa0468af0bf38e814ad4eab58e88fc09f7f7"},
{file = "greenlet-2.0.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9e9744c657d896c7b580455e739899e492a4a452e2dd4d2b3e459f6b244a638d"},
{file = "greenlet-2.0.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:662e8f7cad915ba75d8017b3e601afc01ef20deeeabf281bd00369de196d7726"},
{file = "greenlet-2.0.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:41b825d65f31e394b523c84db84f9383a2f7eefc13d987f308f4663794d2687e"},
{file = "greenlet-2.0.1-cp39-cp39-win32.whl", hash = "sha256:db38f80540083ea33bdab614a9d28bcec4b54daa5aff1668d7827a9fc769ae0a"},
{file = "greenlet-2.0.1-cp39-cp39-win_amd64.whl", hash = "sha256:b23d2a46d53210b498e5b701a1913697671988f4bf8e10f935433f6e7c332fb6"},
@@ -2634,8 +2637,8 @@ imagesize = [
{file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"},
]
importlib-metadata = [
{file = "importlib_metadata-5.0.0-py3-none-any.whl", hash = "sha256:ddb0e35065e8938f867ed4928d0ae5bf2a53b7773871bfe6bcc7e4fcdc7dea43"},
{file = "importlib_metadata-5.0.0.tar.gz", hash = "sha256:da31db32b304314d044d3c12c79bd59e307889b287ad12ff387b3500835fc2ab"},
{file = "importlib_metadata-4.13.0-py3-none-any.whl", hash = "sha256:8a8a81bcf996e74fee46f0d16bd3eaa382a7eb20fd82445c3ad11f4090334116"},
{file = "importlib_metadata-4.13.0.tar.gz", hash = "sha256:dd0173e8f150d6815e098fd354f6414b0f079af4644ddfe90c71e2fc6174346d"},
]
inflection = [
{file = "inflection-0.5.1-py2.py3-none-any.whl", hash = "sha256:f38b2b640938a4f35ade69ac3d053042959b62a0f1076a5bbaa1b9526605a8a2"},
@@ -2940,10 +2943,7 @@ orjson = [
{file = "orjson-3.8.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:b68a42a31f8429728183c21fb440c21de1b62e5378d0d73f280e2d894ef8942e"},
{file = "orjson-3.8.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:ff13410ddbdda5d4197a4a4c09969cb78c722a67550f0a63c02c07aadc624833"},
{file = "orjson-3.8.0-cp310-none-win_amd64.whl", hash = "sha256:2d81e6e56bbea44be0222fb53f7b255b4e7426290516771592738ca01dbd053b"},
{file = "orjson-3.8.0-cp311-cp311-macosx_10_7_x86_64.whl", hash = "sha256:200eae21c33f1f8b02a11f5d88d76950cd6fd986d88f1afe497a8ae2627c49aa"},
{file = "orjson-3.8.0-cp311-cp311-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:9529990f3eab54b976d327360aa1ff244a4b12cb5e4c5b3712fcdd96e8fe56d4"},
{file = "orjson-3.8.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:e2defd9527651ad39ec20ae03c812adf47ef7662bdd6bc07dabb10888d70dc62"},
{file = "orjson-3.8.0-cp311-none-win_amd64.whl", hash = "sha256:b21c7af0ff6228ca7105f54f0800636eb49201133e15ddb80ac20c1ce973ef07"},
{file = "orjson-3.8.0-cp37-cp37m-macosx_10_7_x86_64.whl", hash = "sha256:9e6ac22cec72d5b39035b566e4b86c74b84866f12b5b0b6541506a080fb67d6d"},
{file = "orjson-3.8.0-cp37-cp37m-macosx_10_9_x86_64.macosx_11_0_arm64.macosx_10_9_universal2.whl", hash = "sha256:e2f4a5542f50e3d336a18cb224fc757245ca66b1fd0b70b5dd4471b8ff5f2b0e"},
{file = "orjson-3.8.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e1418feeb8b698b9224b1f024555895169d481604d5d884498c1838d7412794c"},
@@ -3056,7 +3056,18 @@ py = [
{file = "py-1.11.0.tar.gz", hash = "sha256:51c75c4126074b472f746a24399ad32f6053d1b34b68d2fa41e558e6f4a98719"},
]
pyasn1 = [
{file = "pyasn1-0.4.8-py2.4.egg", hash = "sha256:fec3e9d8e36808a28efb59b489e4528c10ad0f480e57dcc32b4de5c9d8c9fdf3"},
{file = "pyasn1-0.4.8-py2.5.egg", hash = "sha256:0458773cfe65b153891ac249bcf1b5f8f320b7c2ce462151f8fa74de8934becf"},
{file = "pyasn1-0.4.8-py2.6.egg", hash = "sha256:5c9414dcfede6e441f7e8f81b43b34e834731003427e5b09e4e00e3172a10f00"},
{file = "pyasn1-0.4.8-py2.7.egg", hash = "sha256:6e7545f1a61025a4e58bb336952c5061697da694db1cae97b116e9c46abcf7c8"},
{file = "pyasn1-0.4.8-py2.py3-none-any.whl", hash = "sha256:39c7e2ec30515947ff4e87fb6f456dfc6e84857d34be479c9d4a4ba4bf46aa5d"},
{file = "pyasn1-0.4.8-py3.1.egg", hash = "sha256:78fa6da68ed2727915c4767bb386ab32cdba863caa7dbe473eaae45f9959da86"},
{file = "pyasn1-0.4.8-py3.2.egg", hash = "sha256:08c3c53b75eaa48d71cf8c710312316392ed40899cb34710d092e96745a358b7"},
{file = "pyasn1-0.4.8-py3.3.egg", hash = "sha256:03840c999ba71680a131cfaee6fab142e1ed9bbd9c693e285cc6aca0d555e576"},
{file = "pyasn1-0.4.8-py3.4.egg", hash = "sha256:7ab8a544af125fb704feadb008c99a88805126fb525280b2270bb25cc1d78a12"},
{file = "pyasn1-0.4.8-py3.5.egg", hash = "sha256:e89bf84b5437b532b0803ba5c9a5e054d21fec423a89952a74f87fa2c9b7bce2"},
{file = "pyasn1-0.4.8-py3.6.egg", hash = "sha256:014c0e9976956a08139dc0712ae195324a75e142284d5f87f1a87ee1b068a359"},
{file = "pyasn1-0.4.8-py3.7.egg", hash = "sha256:99fcc3c8d804d1bc6d9a099921e39d827026409a58f2a720dcdb89374ea0c776"},
{file = "pyasn1-0.4.8.tar.gz", hash = "sha256:aef77c9fb94a3ac588e87841208bdec464471d9871bd5050a287cc9a475cd0ba"},
]
pycodestyle = [


@@ -258,7 +258,6 @@ paths:
description: The number of models to show per page. Defaults to 10.
schema:
type: integer
# process_model_list
get:
operationId: spiffworkflow_backend.routes.process_api_blueprint.process_model_list
summary: Return a list of process models for a given process group
@@ -273,9 +272,10 @@ paths:
type: array
items:
$ref: "#/components/schemas/ProcessModel"
# process_model_add
/process-models/{modified_process_group_id}:
post:
operationId: spiffworkflow_backend.routes.process_api_blueprint.process_model_add
operationId: spiffworkflow_backend.routes.process_api_blueprint.process_model_create
summary: Creates a new process model with the given parameters.
tags:
- Process Models
@@ -372,6 +372,23 @@ paths:
schema:
$ref: "#/components/schemas/OkTrue"
/processes:
get:
operationId: spiffworkflow_backend.routes.process_api_blueprint.process_list
summary: Return a list of all processes (not just primary process of a process model)
useful for finding processes for call activities.
tags:
- Process Models
responses:
"200":
description: Successfully return the requested processes
content:
application/json:
schema:
type: array
items:
$ref: "#/components/schemas/Process"
/process-instances:
parameters:
- name: process_model_identifier
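
A minimal client sketch for the /processes endpoint added above, useful for the call-activity search mentioned in the commit messages. The base URL and token are assumptions, not part of this diff:

import requests

# List every process known to the system, not just each model's primary one.
response = requests.get(
    "http://localhost:7000/v1.0/processes",  # host and port are assumptions
    headers={"Authorization": "Bearer <access_token>"},
)
for process in response.json():  # array of Process per the schema above
    print(process["identifier"], process["display_name"], process["file_name"])
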
@@ -422,6 +439,12 @@ paths:
description: For filtering - not_started, user_input_required, waiting, complete, error, or suspended
schema:
type: string
- name: user_filter
in: query
required: false
description: For filtering - indicates the user has manually entered a query
schema:
type: boolean
# process_instance_list
get:
operationId: spiffworkflow_backend.routes.process_api_blueprint.process_instance_list
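
The user_filter flag added above marks the query as user-entered rather than coming from a saved report, pairing with the "Send filters back to client" commits. A hedged sketch; the base URL, token, and the process_status parameter name are assumptions:

import requests

response = requests.get(
    "http://localhost:7000/v1.0/process-instances",
    params={
        "process_status": "complete",  # status filter; parameter name assumed
        "user_filter": "true",  # flag these as user-entered filters
        "per_page": 25,
    },
    headers={"Authorization": "Bearer <access_token>"},
)
# Per the send_filters commits, the response is expected to echo the applied
# filters back so the client can render them.
print(response.json())
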
@@ -901,7 +924,7 @@ paths:
items:
$ref: "#/components/schemas/Task"
/tasks/for-processes-started-by-others:
/tasks/for-me:
parameters:
- name: page
in: query
@@ -918,7 +941,7 @@ paths:
get:
tags:
- Process Instances
operationId: spiffworkflow_backend.routes.process_api_blueprint.task_list_for_processes_started_by_others
operationId: spiffworkflow_backend.routes.process_api_blueprint.task_list_for_me
summary: Returns the list of tasks for the given user's open process instances
responses:
"200":
@@ -930,7 +953,36 @@ paths:
items:
$ref: "#/components/schemas/Task"
/process-instance/{process_instance_id}/tasks:
/tasks/for-my-groups:
parameters:
- name: page
in: query
required: false
description: The page number to return. Defaults to page 1.
schema:
type: integer
- name: per_page
in: query
required: false
description: The number of items to show per page. Defaults to 10.
schema:
type: integer
get:
tags:
- Process Instances
operationId: spiffworkflow_backend.routes.process_api_blueprint.task_list_for_my_groups
summary: Returns the list of tasks for the given user's open process instances
responses:
"200":
description: list of tasks
content:
application/json:
schema:
type: array
items:
$ref: "#/components/schemas/Task"
/process-instances/{modified_process_model_id}/{process_instance_id}/tasks:
parameters:
- name: process_instance_id
in: path
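
Above, /tasks/for-processes-started-by-others is renamed to /tasks/for-me, and a /tasks/for-my-groups endpoint is added beside it. A small sketch paging through both; base URL and token are assumptions:

import requests

BASE_URL = "http://localhost:7000/v1.0"  # assumption, not part of this diff
HEADERS = {"Authorization": "Bearer <access_token>"}

for path in ("/tasks/for-me", "/tasks/for-my-groups"):
    response = requests.get(
        f"{BASE_URL}{path}", params={"page": 1, "per_page": 10}, headers=HEADERS
    )
    for task in response.json():  # array of Task per the spec above
        print(path, task.get("task_title") or task.get("task_name"))
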
@@ -1530,7 +1582,26 @@ components:
type: string
x-nullable: true
example: Some Value
Process:
properties:
identifier:
type: string
display_name:
type: string
process_group_id:
type: string
process_model_id:
type: string
type:
type: string
file_name:
type: string
has_lanes:
type: boolean
is_executable:
type: boolean
is_primary:
type: boolean
ProcessModel:
properties:
id:


@@ -51,6 +51,24 @@ permissions:
allowed_permissions: [create, read, update, delete]
uri: /v1.0/tasks/*
process-model-read-all:
groups: [everybody]
users: []
allowed_permissions: [read]
uri: /v1.0/process-models/*
process-group-read-all:
groups: [everybody]
users: []
allowed_permissions: [read]
uri: /v1.0/process-groups/*
process-instance-list:
groups: [everybody]
users: []
allowed_permissions: [read]
uri: /v1.0/process-instances
# TODO: all uris should really have the same structure
finance-admin-group:
groups: ["Finance Team"]
@@ -69,3 +87,9 @@ permissions:
users: []
allowed_permissions: [read]
uri: /*
invoice-approval-tasks-read:
groups: ["Finance Team"]
users: []
allowed_permissions: [read]
uri: /v1.0/process-instances/category_number_one:lanes/*
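
Each permission entry above pairs a target uri (optionally ending in *) with groups and allowed_permissions. A rough sketch of the wildcard rule those uris imply; this is illustrative only, not the backend's actual permission service:

def uri_matches(target_uri: str, requested_uri: str) -> bool:
    """A trailing '*' grants the permission for any path under the prefix."""
    if target_uri.endswith("*"):
        return requested_uri.startswith(target_uri[:-1])
    return requested_uri == target_uri

# "finance:invoice" is a hypothetical modified process model id.
assert uri_matches("/v1.0/process-models/*", "/v1.0/process-models/finance:invoice")
assert uri_matches("/v1.0/process-instances", "/v1.0/process-instances")
assert not uri_matches("/v1.0/process-instances", "/v1.0/process-instances/42")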


@@ -18,8 +18,8 @@ from spiffworkflow_backend.models.principal import PrincipalModel # noqa: F401
from spiffworkflow_backend.models.active_task import ActiveTaskModel # noqa: F401
from spiffworkflow_backend.models.bpmn_process_id_lookup import (
BpmnProcessIdLookup,
from spiffworkflow_backend.models.spec_reference import (
SpecReferenceCache,
) # noqa: F401
from spiffworkflow_backend.models.message_correlation_property import (
MessageCorrelationPropertyModel,


@@ -1,13 +0,0 @@
"""Message_model."""
from flask_bpmn.models.db import db
from flask_bpmn.models.db import SpiffworkflowBaseDBModel
class BpmnProcessIdLookup(SpiffworkflowBaseDBModel):
"""BpmnProcessIdLookup."""
__tablename__ = "bpmn_process_id_lookup"
id = db.Column(db.Integer, primary_key=True)
bpmn_process_identifier = db.Column(db.String(255), unique=True, index=True)
bpmn_file_relative_path = db.Column(db.String(255))


@@ -9,6 +9,7 @@ from marshmallow import INCLUDE
from marshmallow import Schema
from spiffworkflow_backend.helpers.spiff_enum import SpiffEnum
from spiffworkflow_backend.models.spec_reference import SpecReference
class FileType(SpiffEnum):
@@ -62,20 +63,6 @@ CONTENT_TYPES = {
}
@dataclass()
class FileReference:
"""File Reference Information.
Includes items such as the process id and name for a BPMN,
or the Decision id and Decision name for a DMN file. There may be more than
one reference that points to a particular file.
"""
id: str
name: str
type: str # can be 'process', 'decision', or just 'file'
@dataclass(order=True)
class File:
"""File."""
@@ -87,7 +74,7 @@ class File:
type: str
last_modified: datetime
size: int
references: Optional[list[FileReference]] = None
references: Optional[list[SpecReference]] = None
file_contents: Optional[bytes] = None
process_model_id: Optional[str] = None
process_group_id: Optional[str] = None
@@ -140,16 +127,5 @@ class FileSchema(Schema):
]
unknown = INCLUDE
references = marshmallow.fields.List(
marshmallow.fields.Nested("FileReferenceSchema")
marshmallow.fields.Nested("SpecReferenceSchema")
)
class FileReferenceSchema(Schema):
"""FileSchema."""
class Meta:
"""Meta."""
model = FileReference
fields = ["id", "name", "type"]
unknown = INCLUDE

View File

@ -44,6 +44,7 @@ class MessageCorrelationModel(SpiffworkflowBaseDBModel):
updated_at_in_seconds: int = db.Column(db.Integer)
created_at_in_seconds: int = db.Column(db.Integer)
message_correlation_property = relationship("MessageCorrelationPropertyModel")
message_correlations_message_instances = relationship(
"MessageCorrelationMessageInstanceModel", cascade="delete"
)

View File

@ -59,6 +59,8 @@ class MessageInstanceModel(SpiffworkflowBaseDBModel):
updated_at_in_seconds: int = db.Column(db.Integer)
created_at_in_seconds: int = db.Column(db.Integer)
message_correlations: Optional[dict] = None
@validates("message_type")
def validate_message_type(self, key: str, value: Any) -> Any:
"""Validate_message_type."""

View File

@ -16,6 +16,7 @@ class MessageTriggerableProcessModel(SpiffworkflowBaseDBModel):
ForeignKey(MessageModel.id), nullable=False, unique=True
)
process_model_identifier: str = db.Column(db.String(50), nullable=False, index=True)
# fixme: Maybe we don't need this anymore?
process_group_identifier: str = db.Column(db.String(50), nullable=False, index=True)
updated_at_in_seconds: int = db.Column(db.Integer)

View File

@ -0,0 +1,87 @@
"""Message_model."""
from dataclasses import dataclass
from flask_bpmn.models.db import db
from flask_bpmn.models.db import SpiffworkflowBaseDBModel
from flask_marshmallow import Schema # type: ignore
from marshmallow import INCLUDE
from sqlalchemy import UniqueConstraint
@dataclass()
class SpecReference:
"""File Reference Information.
Includes items such as the process id and name for a BPMN,
or the Decision id and Decision name for a DMN file. There may be more than
one reference that points to a particular file; for instance, there may be
three executable processes in a collaboration within a BPMN diagram.
"""
identifier: str # The id of the process or decision. "Process_1234"
display_name: str # The name of the process or decision. "Invoice Submission"
process_model_id: str
type: str # can be 'process' or 'decision'
file_name: str # The name of the file where this process or decision is defined.
relative_path: str # The path to the file.
has_lanes: bool # If this is a process, whether it has lanes or not.
is_executable: bool # Whether this process or decision is designated as executable.
is_primary: bool # Whether this is the primary process of a process model
messages: dict # Any messages defined in the same file where this process is defined.
correlations: dict # Any correlations defined in the same file with this process.
start_messages: list # The names of any messages that would start this process.
class SpecReferenceCache(SpiffworkflowBaseDBModel):
"""A cache of information about all the Processes and Decisions defined in all files."""
__tablename__ = "spec_reference_cache"
__table_args__ = (
UniqueConstraint("identifier", "type", name="_identifier_type_unique"),
)
id = db.Column(db.Integer, primary_key=True)
identifier = db.Column(db.String(255), index=True)
display_name = db.Column(db.String(255), index=True)
process_model_id = db.Column(db.String(255))
type = db.Column(db.String(255), index=True) # either 'process' or 'decision'
file_name = db.Column(db.String(255))
relative_path = db.Column(db.String(255))
has_lanes = db.Column(db.Boolean())
is_executable = db.Column(db.Boolean())
is_primary = db.Column(db.Boolean())
@classmethod
def from_spec_reference(cls, ref: SpecReference) -> "SpecReferenceCache":
"""From_spec_reference."""
return cls(
identifier=ref.identifier,
display_name=ref.display_name,
process_model_id=ref.process_model_id,
type=ref.type,
file_name=ref.file_name,
has_lanes=ref.has_lanes,
is_executable=ref.is_executable,
is_primary=ref.is_primary,
relative_path=ref.relative_path,
)
class SpecReferenceSchema(Schema): # type: ignore
"""FileSchema."""
class Meta:
"""Meta."""
model = SpecReference
fields = [
"identifier",
"display_name",
"process_group_id",
"process_model_id",
"type",
"file_name",
"has_lanes",
"is_executable",
"is_primary",
]
unknown = INCLUDE
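A minimal sketch of what from_spec_reference persists, using a plain dataclass in place of the real SQLAlchemy model (the values below are made up): the nine columns above are copied over, while messages, correlations, and start_messages are deliberately left out because they are tracked by the separate message and correlation caches.

from dataclasses import dataclass, asdict

@dataclass
class Ref:  # illustrative stand-in for SpecReference, trimmed to the cached fields
    identifier: str
    display_name: str
    process_model_id: str
    type: str
    file_name: str
    relative_path: str
    has_lanes: bool
    is_executable: bool
    is_primary: bool

def cache_columns(ref: Ref) -> dict:
    # mirrors SpecReferenceCache.from_spec_reference: only these columns are
    # stored; messages/correlations/start_messages live in other caches
    return asdict(ref)

row = cache_columns(Ref(
    "Process_1234", "Invoice Submission", "finance/invoice", "process",
    "invoice.bpmn", "finance/invoice/invoice.bpmn", True, True, True))
assert row["type"] == "process"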

View File

@ -1,10 +1,14 @@
"""Spiff_step_details."""
from dataclasses import dataclass
from typing import Optional
from flask_bpmn.models.db import db
from flask_bpmn.models.db import SpiffworkflowBaseDBModel
from sqlalchemy import ForeignKey
from sqlalchemy.orm import deferred
from spiffworkflow_backend.models.group import GroupModel
@dataclass
class SpiffStepDetailsModel(SpiffworkflowBaseDBModel):
@ -17,3 +21,6 @@ class SpiffStepDetailsModel(SpiffworkflowBaseDBModel):
task_json: str = deferred(db.Column(db.JSON, nullable=False)) # type: ignore
timestamp: float = db.Column(db.DECIMAL(17, 6), nullable=False)
completed_by_user_id: int = db.Column(db.Integer, nullable=True)
lane_assignment_id: Optional[int] = db.Column(
ForeignKey(GroupModel.id), nullable=True
)

View File

@ -39,6 +39,7 @@ from spiffworkflow_backend.models.active_task import ActiveTaskModel
from spiffworkflow_backend.models.active_task_user import ActiveTaskUserModel
from spiffworkflow_backend.models.file import FileSchema
from spiffworkflow_backend.models.group import GroupModel
from spiffworkflow_backend.models.message_correlation import MessageCorrelationModel
from spiffworkflow_backend.models.message_instance import MessageInstanceModel
from spiffworkflow_backend.models.message_model import MessageModel
from spiffworkflow_backend.models.message_triggerable_process_model import (
@ -58,19 +59,25 @@ from spiffworkflow_backend.models.process_model import ProcessModelInfo
from spiffworkflow_backend.models.process_model import ProcessModelInfoSchema
from spiffworkflow_backend.models.secret_model import SecretModel
from spiffworkflow_backend.models.secret_model import SecretModelSchema
from spiffworkflow_backend.models.spec_reference import SpecReferenceCache
from spiffworkflow_backend.models.spec_reference import SpecReferenceSchema
from spiffworkflow_backend.models.spiff_logging import SpiffLoggingModel
from spiffworkflow_backend.models.spiff_step_details import SpiffStepDetailsModel
from spiffworkflow_backend.models.user import UserModel
from spiffworkflow_backend.routes.user import verify_token
from spiffworkflow_backend.services.authorization_service import AuthorizationService
from spiffworkflow_backend.services.error_handling_service import ErrorHandlingService
from spiffworkflow_backend.services.file_system_service import FileSystemService
from spiffworkflow_backend.services.git_service import GitService
from spiffworkflow_backend.services.message_service import MessageService
from spiffworkflow_backend.services.process_instance_processor import MyCustomParser
from spiffworkflow_backend.services.process_instance_processor import (
ProcessInstanceProcessor,
)
from spiffworkflow_backend.services.process_instance_report_service import (
ProcessInstanceReportFilter,
)
from spiffworkflow_backend.services.process_instance_report_service import (
ProcessInstanceReportService,
)
from spiffworkflow_backend.services.process_instance_service import (
ProcessInstanceService,
)
@ -228,11 +235,17 @@ def process_group_show(
return make_response(jsonify(process_group), 200)
def process_model_add(
body: Dict[str, Union[str, bool, int]]
def process_model_create(
modified_process_group_id: str, body: Dict[str, Union[str, bool, int]]
) -> flask.wrappers.Response:
"""Add_process_model."""
"""Process_model_create."""
process_model_info = ProcessModelInfoSchema().load(body)
if modified_process_group_id is None:
raise ApiError(
error_code="process_group_id_not_specified",
message="Process Model could not be created when process_group_id path param is unspecified",
status_code=400,
)
if process_model_info is None:
raise ApiError(
error_code="process_model_could_not_be_created",
@ -308,9 +321,7 @@ def process_model_show(modified_process_model_identifier: str) -> Any:
files = sorted(SpecFileService.get_files(process_model))
process_model.files = files
for file in process_model.files:
file.references = SpecFileService.get_references_for_file(
file, process_model, MyCustomParser
)
file.references = SpecFileService.get_references_for_file(file, process_model)
process_model_json = ProcessModelInfoSchema().dump(process_model)
return process_model_json
@ -337,10 +348,19 @@ def process_model_list(
"pages": pages,
},
}
return Response(json.dumps(response_json), status=200, mimetype="application/json")
def process_list() -> Any:
"""Returns a list of all known processes.
This includes processes that are not the
primary process - helpful for finding possible call activities.
"""
references = SpecReferenceCache.query.filter_by(type="process").all()
return SpecReferenceSchema(many=True).dump(references)
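For illustration only (the route path is defined in the API spec, not shown in this hunk), the serialized output is a plain list of SpecReferenceSchema dicts along these lines, with made-up values:

example_process_list_response = [
    {
        "identifier": "Process_1234",
        "display_name": "Invoice Submission",
        "process_model_id": "finance/invoice",
        "type": "process",
        "file_name": "invoice.bpmn",
        "has_lanes": False,
        "is_executable": True,
        "is_primary": True,
    },
]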
def get_file(modified_process_model_id: str, file_name: str) -> Any:
"""Get_file."""
process_model_identifier = modified_process_model_id.replace(":", "/")
@ -581,16 +601,35 @@ def message_instance_list(
MessageInstanceModel.created_at_in_seconds.desc(), # type: ignore
MessageInstanceModel.id.desc(), # type: ignore
)
.join(MessageModel)
.join(MessageModel, MessageModel.id == MessageInstanceModel.message_model_id)
.join(ProcessInstanceModel)
.add_columns(
MessageModel.identifier.label("message_identifier"),
ProcessInstanceModel.process_model_identifier,
ProcessInstanceModel.process_group_identifier,
)
.paginate(page=page, per_page=per_page, error_out=False)
)
for message_instance in message_instances:
message_correlations: dict = {}
for (
mcmi
) in (
message_instance.MessageInstanceModel.message_correlations_message_instances
):
mc = MessageCorrelationModel.query.filter_by(
id=mcmi.message_correlation_id
).all()
for m in mc:
if m.name not in message_correlations:
message_correlations[m.name] = {}
message_correlations[m.name][
m.message_correlation_property.identifier
] = m.value
message_instance.MessageInstanceModel.message_correlations = (
message_correlations
)
response_json = {
"results": message_instances.items,
"pagination": {
@ -696,13 +735,38 @@ def process_instance_list(
end_from: Optional[int] = None,
end_to: Optional[int] = None,
process_status: Optional[str] = None,
user_filter: Optional[bool] = False,
) -> flask.wrappers.Response:
"""Process_instance_list."""
process_instance_report = ProcessInstanceReportModel.default_report(g.user)
if user_filter:
report_filter = ProcessInstanceReportFilter(
process_model_identifier,
start_from,
start_to,
end_from,
end_to,
process_status.split(",") if process_status else None,
)
else:
report_filter = (
ProcessInstanceReportService.filter_from_metadata_with_overrides(
process_instance_report,
process_model_identifier,
start_from,
start_to,
end_from,
end_to,
process_status,
)
)
# process_model_identifier = un_modify_modified_process_model_id(modified_process_model_identifier)
process_instance_query = ProcessInstanceModel.query
if process_model_identifier is not None:
if report_filter.process_model_identifier is not None:
process_model = get_process_model(
f"{process_model_identifier}",
f"{report_filter.process_model_identifier}",
)
process_instance_query = process_instance_query.filter_by(
@ -722,53 +786,43 @@ def process_instance_list(
)
)
if start_from is not None:
if report_filter.start_from is not None:
process_instance_query = process_instance_query.filter(
ProcessInstanceModel.start_in_seconds >= start_from
ProcessInstanceModel.start_in_seconds >= report_filter.start_from
)
if start_to is not None:
if report_filter.start_to is not None:
process_instance_query = process_instance_query.filter(
ProcessInstanceModel.start_in_seconds <= start_to
ProcessInstanceModel.start_in_seconds <= report_filter.start_to
)
if end_from is not None:
if report_filter.end_from is not None:
process_instance_query = process_instance_query.filter(
ProcessInstanceModel.end_in_seconds >= end_from
ProcessInstanceModel.end_in_seconds >= report_filter.end_from
)
if end_to is not None:
if report_filter.end_to is not None:
process_instance_query = process_instance_query.filter(
ProcessInstanceModel.end_in_seconds <= end_to
ProcessInstanceModel.end_in_seconds <= report_filter.end_to
)
if process_status is not None:
process_status_array = process_status.split(",")
if report_filter.process_status is not None:
process_instance_query = process_instance_query.filter(
ProcessInstanceModel.status.in_(process_status_array) # type: ignore
ProcessInstanceModel.status.in_(report_filter.process_status) # type: ignore
)
process_instances = process_instance_query.order_by(
ProcessInstanceModel.start_in_seconds.desc(), ProcessInstanceModel.id.desc() # type: ignore
).paginate(page=page, per_page=per_page, error_out=False)
process_instance_report = ProcessInstanceReportModel.default_report(g.user)
# TODO need to look into this more - how the filter here interacts with the
# one defined in the report.
# TODO need to look into test failures when the results from result_dict is
# used instead of the process instances
# substitution_variables = request.args.to_dict()
# result_dict = process_instance_report.generate_report(
# process_instances.items, substitution_variables
# )
# results = result_dict["results"]
# report_metadata = result_dict["report_metadata"]
results = process_instances.items
results = list(
map(
ProcessInstanceService.serialize_flat_with_task_data,
process_instances.items,
)
)
report_metadata = process_instance_report.report_metadata
response_json = {
"report_metadata": report_metadata,
"results": results,
"filters": report_filter.to_dict(),
"pagination": {
"count": len(results),
"total": process_instances.total,
@ -1009,7 +1063,17 @@ def task_list_for_my_open_processes(
return get_tasks(page=page, per_page=per_page)
def task_list_for_processes_started_by_others(
def task_list_for_me(page: int = 1, per_page: int = 100) -> flask.wrappers.Response:
"""Task_list_for_processes_started_by_others."""
return get_tasks(
processes_started_by_user=False,
has_lane_assignment_id=False,
page=page,
per_page=per_page,
)
def task_list_for_my_groups(
page: int = 1, per_page: int = 100
) -> flask.wrappers.Response:
"""Task_list_for_processes_started_by_others."""
@ -1017,14 +1081,21 @@ def task_list_for_processes_started_by_others(
def get_tasks(
processes_started_by_user: bool = True, page: int = 1, per_page: int = 100
processes_started_by_user: bool = True,
has_lane_assignment_id: bool = True,
page: int = 1,
per_page: int = 100,
) -> flask.wrappers.Response:
"""Get_tasks."""
user_id = g.user.id
# use distinct to ensure we only get one row per active task; otherwise
# we can get back multiple rows for the same active task, which throws off
# pagination later on
# https://stackoverflow.com/q/34582014/6090676
active_tasks_query = (
ActiveTaskModel.query.outerjoin(
GroupModel, GroupModel.id == ActiveTaskModel.lane_assignment_id
)
ActiveTaskModel.query.distinct()
.outerjoin(GroupModel, GroupModel.id == ActiveTaskModel.lane_assignment_id)
.join(ProcessInstanceModel)
.join(UserModel, UserModel.id == ProcessInstanceModel.process_initiator_id)
)
@ -1032,11 +1103,29 @@ def get_tasks(
if processes_started_by_user:
active_tasks_query = active_tasks_query.filter(
ProcessInstanceModel.process_initiator_id == user_id
).outerjoin(ActiveTaskUserModel, and_(ActiveTaskUserModel.user_id == user_id))
).outerjoin(
ActiveTaskUserModel,
and_(
ActiveTaskUserModel.user_id == user_id,
ActiveTaskModel.id == ActiveTaskUserModel.active_task_id,
),
)
else:
active_tasks_query = active_tasks_query.filter(
ProcessInstanceModel.process_initiator_id != user_id
).join(ActiveTaskUserModel, and_(ActiveTaskUserModel.user_id == user_id))
).join(
ActiveTaskUserModel,
and_(
ActiveTaskUserModel.user_id == user_id,
ActiveTaskModel.id == ActiveTaskUserModel.active_task_id,
),
)
if has_lane_assignment_id:
active_tasks_query = active_tasks_query.filter(
ActiveTaskModel.lane_assignment_id.is_not(None) # type: ignore
)
else:
active_tasks_query = active_tasks_query.filter(ActiveTaskModel.lane_assignment_id.is_(None)) # type: ignore
active_tasks = active_tasks_query.add_columns(
ProcessInstanceModel.process_model_identifier,
@ -1064,7 +1153,10 @@ def get_tasks(
def process_instance_task_list(
process_instance_id: int, all_tasks: bool = False, spiff_step: int = 0
modified_process_model_id: str,
process_instance_id: int,
all_tasks: bool = False,
spiff_step: int = 0,
) -> flask.wrappers.Response:
"""Process_instance_task_list."""
process_instance = find_process_instance_by_id_or_raise(process_instance_id)
@ -1127,26 +1219,8 @@ def task_show(process_instance_id: int, task_id: str) -> flask.wrappers.Response
task = ProcessInstanceService.spiff_task_to_api_task(spiff_task)
task.data = spiff_task.data
task.process_model_display_name = process_model.display_name
task.process_model_identifier = process_model.id
process_model_with_form = process_model
all_processes = SpecFileService.get_all_bpmn_process_identifiers_for_process_model(
process_model
)
if task.process_name not in all_processes:
bpmn_file_full_path = (
ProcessInstanceProcessor.bpmn_file_full_path_from_bpmn_process_identifier(
task.process_name
)
)
relative_path = os.path.relpath(
bpmn_file_full_path, start=FileSystemService.root_path()
)
process_model_relative_path = os.path.dirname(relative_path)
process_model_with_form = (
ProcessModelService.get_process_model_from_relative_path(
process_model_relative_path
)
)
if task.type == "User Task":
if not form_schema_file_name:
@ -1232,7 +1306,25 @@ def task_submit(
if terminate_loop and spiff_task.is_looping():
spiff_task.terminate_loop()
ProcessInstanceService.complete_form_task(processor, spiff_task, body, g.user)
active_task = ActiveTaskModel.query.filter_by(
process_instance_id=process_instance_id, task_id=task_id
).first()
if active_task is None:
raise (
ApiError(
error_code="no_active_task",
message="Cannot find an active task with task id '{task_id}' for process instance {process_instance_id}.",
status_code=500,
)
)
ProcessInstanceService.complete_form_task(
processor=processor,
spiff_task=spiff_task,
data=body,
user=g.user,
active_task=active_task,
)
# If we need to update all tasks, then get the next ready task, and if it is a multi-instance with the same
# task spec, complete that form as well.
@ -1283,9 +1375,7 @@ def script_unit_test_create(
# TODO: move this to an xml service or something
file_contents = SpecFileService.get_data(process_model, file.name)
bpmn_etree_element = SpecFileService.get_etree_element_from_binary_data(
file_contents, file.name
)
bpmn_etree_element = etree.fromstring(file_contents)
nsmap = bpmn_etree_element.nsmap
spiff_element_maker = ElementMaker(
@ -1538,7 +1628,7 @@ def add_secret(body: Dict) -> Response:
def update_secret(key: str, body: dict) -> Response:
"""Update secret."""
SecretService().update_secret(key, body["value"], body["user_id"])
SecretService().update_secret(key, body["value"], g.user.id)
return Response(json.dumps({"ok": True}), status=200, mimetype="application/json")

View File

@ -11,6 +11,7 @@ from flask import request
from flask_bpmn.api.api_error import ApiError
from flask_bpmn.models.db import db
from SpiffWorkflow.task import Task as SpiffTask # type: ignore
from sqlalchemy import or_
from sqlalchemy import text
from spiffworkflow_backend.models.active_task import ActiveTaskModel
@ -57,7 +58,14 @@ class AuthorizationService:
)
.filter_by(permission=permission)
.join(PermissionTargetModel)
.filter(text(f"'{target_uri}' LIKE permission_target.uri"))
.filter(
or_(
text(f"'{target_uri}' LIKE permission_target.uri"),
# to check for exact matches as well
# see test_user_can_access_base_path_when_given_wildcard_permission unit test
text(f"'{target_uri}' = replace(permission_target.uri, '/%', '')"),
)
)
.all()
)
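A plain-Python restatement of that matching rule, for intuition only (the real check runs in SQL, and the stand-in below ignores LIKE's "_" wildcard): a stored target like /v1.0/process-groups/% should match a longer request path via LIKE, and the bare /v1.0/process-groups via the exact-match branch.

import fnmatch

def uri_matches(target_uri: str, permission_uri: str) -> bool:
    # SQL LIKE '%' roughly corresponds to the glob '*'
    if fnmatch.fnmatch(target_uri, permission_uri.replace("%", "*")):
        return True
    # the replace(uri, '/%', '') branch: exact match once the wildcard
    # suffix is stripped
    return target_uri == permission_uri.removesuffix("/%")

assert uri_matches("/v1.0/process-groups/finance", "/v1.0/process-groups/%")
assert uri_matches("/v1.0/process-groups", "/v1.0/process-groups/%")
assert not uri_matches("/v1.0/secrets", "/v1.0/process-groups/%")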

View File

@ -0,0 +1,10 @@
"""Custom_parser."""
from SpiffWorkflow.dmn.parser.BpmnDmnParser import BpmnDmnParser # type: ignore
from SpiffWorkflow.spiff.parser.process import SpiffBpmnParser # type: ignore
class MyCustomParser(BpmnDmnParser): # type: ignore
"""A BPMN and DMN parser that can also parse spiffworkflow-specific extensions."""
OVERRIDE_PARSER_CLASSES = BpmnDmnParser.OVERRIDE_PARSER_CLASSES
OVERRIDE_PARSER_CLASSES.update(SpiffBpmnParser.OVERRIDE_PARSER_CLASSES)
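A rough usage sketch (the file path is hypothetical, and SpiffWorkflow must be installed): this is essentially what SpecFileService.get_references_for_file does with the parser.

parser = MyCustomParser()
parser.add_bpmn_file("finance/invoice/invoice.bpmn")  # hypothetical path
for sub_parser in parser.process_parsers.values():
    print(sub_parser.get_id(), sub_parser.get_name(), sub_parser.has_lanes())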

View File

@ -1,5 +1,6 @@
"""Data_setup_service."""
from flask import current_app
from flask_bpmn.models.db import db
from spiffworkflow_backend.services.process_model_service import ProcessModelService
from spiffworkflow_backend.services.spec_file_service import SpecFileService
@ -15,81 +16,42 @@ class DataSetupService:
@classmethod
def save_all_process_models(cls) -> list:
"""Save_all."""
"""Build a cache of all processes, messages, correlation keys, and start events.
These all exist in files on the file system; caching them in the database
lets us reference them quickly.
"""
# Clear out all of the cached data.
SpecFileService.clear_caches()
current_app.logger.debug("DataSetupService.save_all_process_models() start")
failing_process_models = []
process_models = ProcessModelService().get_process_models()
SpecFileService.clear_caches()
for process_model in process_models:
process_model_files = SpecFileService.get_files(
process_model, extension_filter=".bpmn"
)
for process_model_file in process_model_files:
bpmn_xml_file_contents = SpecFileService.get_data(
process_model, process_model_file.name
)
bad_files = [
"B.1.0.bpmn",
"C.1.0.bpmn",
"C.2.0.bpmn",
"C.6.0.bpmn",
"TC-5.1.bpmn",
]
if process_model_file.name in bad_files:
continue
current_app.logger.debug(
f"primary_file_name: {process_model_file.name}"
)
try:
SpecFileService.update_file(
process_model,
process_model_file.name,
bpmn_xml_file_contents,
)
except Exception as ex:
failing_process_models.append(
(
f"{process_model.process_group}/{process_model.id}/{process_model_file.name}",
str(ex),
current_app.logger.debug(f"Process Model: {process_model.display_name}")
try:
refs = SpecFileService.get_references_for_process(process_model)
for ref in refs:
try:
SpecFileService.update_caches(ref)
except Exception as ex:
failing_process_models.append(
(
f"{ref.process_model_id}/{ref.file_name}",
str(ex),
)
)
)
# files = SpecFileService.get_files(
# process_model, extension_filter="bpmn"
# )
# bpmn_etree_element: EtreeElement = (
# SpecFileService.get_etree_element_from_binary_data(
# bpmn_xml_file_contents, process_model.primary_file_name
# )
# )
# if len(files) == 1:
# try:
# new_bpmn_process_identifier = (
# SpecFileService.get_bpmn_process_identifier(
# bpmn_etree_element
# )
# )
# if (
# process_model.primary_process_id
# != new_bpmn_process_identifier
# ):
# print(
# "primary_process_id: ", process_model.primary_process_id
# )
# # attributes_to_update = {
# # "primary_process_id": new_bpmn_process_identifier
# # }
# # ProcessModelService().update_spec(
# # process_model, attributes_to_update
# # )
# # except Exception as exception:
# except Exception:
# print(f"BAD ONE: {process_model.id}")
# # raise exception
else:
except Exception as ex2:
failing_process_models.append(
(
f"{process_model.process_group}/{process_model.id}",
"primary_file_name not set",
f"{process_model.id}",
str(ex2),
)
)
current_app.logger.debug("DataSetupService.save_all_process_models() end")
current_app.logger.debug(
"DataSetupService.save_all_process_models() end"
)
db.session.commit()
return failing_process_models
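A rough usage sketch: the return value is a list of (path, error message) tuples for any references that failed to cache, so a caller might log them like this.

failures = DataSetupService.save_all_process_models()
for path, error_message in failures:
    current_app.logger.warning(f"could not cache {path}: {error_message}")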

View File

@ -54,14 +54,16 @@ class FileSystemService:
@staticmethod
def process_group_path_for_spec(spec: ProcessModelInfo) -> str:
"""Category_path_for_spec."""
process_group_id, _ = os.path.split(spec.id)
# os.path.split returns a 2-element tuple like: (first/path, last_item)
process_group_id, _ = os.path.split(spec.id_for_file_path())
return FileSystemService.process_group_path(process_group_id)
@staticmethod
def workflow_path(spec: ProcessModelInfo) -> str:
"""Workflow_path."""
process_model_path = os.path.join(FileSystemService.root_path(), spec.id)
# process_group_path = FileSystemService.process_group_path_for_spec(spec)
process_model_path = os.path.join(
FileSystemService.root_path(), spec.id_for_file_path()
)
return process_model_path
@staticmethod

View File

@ -38,7 +38,6 @@ from SpiffWorkflow.dmn.parser.BpmnDmnParser import BpmnDmnParser # type: ignore
from SpiffWorkflow.dmn.serializer.task_spec_converters import BusinessRuleTaskConverter # type: ignore
from SpiffWorkflow.exceptions import WorkflowException # type: ignore
from SpiffWorkflow.serializer.exceptions import MissingSpecError # type: ignore
from SpiffWorkflow.spiff.parser.process import SpiffBpmnParser # type: ignore
from SpiffWorkflow.spiff.serializer.task_spec_converters import BoundaryEventConverter # type: ignore
from SpiffWorkflow.spiff.serializer.task_spec_converters import (
CallActivityTaskConverter,
@ -68,7 +67,6 @@ from SpiffWorkflow.util.deep_merge import DeepMerge # type: ignore
from spiffworkflow_backend.models.active_task import ActiveTaskModel
from spiffworkflow_backend.models.active_task_user import ActiveTaskUserModel
from spiffworkflow_backend.models.bpmn_process_id_lookup import BpmnProcessIdLookup
from spiffworkflow_backend.models.file import File
from spiffworkflow_backend.models.file import FileType
from spiffworkflow_backend.models.group import GroupModel
@ -87,16 +85,15 @@ from spiffworkflow_backend.models.process_model import ProcessModelInfo
from spiffworkflow_backend.models.script_attributes_context import (
ScriptAttributesContext,
)
from spiffworkflow_backend.models.spec_reference import SpecReferenceCache
from spiffworkflow_backend.models.spiff_step_details import SpiffStepDetailsModel
from spiffworkflow_backend.models.user import UserModel
from spiffworkflow_backend.models.user import UserModelSchema
from spiffworkflow_backend.scripts.script import Script
from spiffworkflow_backend.services.custom_parser import MyCustomParser
from spiffworkflow_backend.services.file_system_service import FileSystemService
from spiffworkflow_backend.services.process_model_service import ProcessModelService
from spiffworkflow_backend.services.service_task_service import ServiceTaskDelegate
from spiffworkflow_backend.services.spec_file_service import (
ProcessModelFileNotFoundError,
)
from spiffworkflow_backend.services.spec_file_service import SpecFileService
from spiffworkflow_backend.services.user_service import UserService
@ -239,13 +236,6 @@ class CustomBpmnScriptEngine(PythonScriptEngine): # type: ignore
)
class MyCustomParser(BpmnDmnParser): # type: ignore
"""A BPMN and DMN parser that can also parse spiffworkflow-specific extensions."""
OVERRIDE_PARSER_CLASSES = BpmnDmnParser.OVERRIDE_PARSER_CLASSES
OVERRIDE_PARSER_CLASSES.update(SpiffBpmnParser.OVERRIDE_PARSER_CLASSES)
IdToBpmnProcessSpecMapping = NewType(
"IdToBpmnProcessSpecMapping", dict[str, BpmnProcessSpec]
)
@ -590,9 +580,10 @@ class ProcessInstanceProcessor:
)
return details_model
def save_spiff_step_details(self) -> None:
def save_spiff_step_details(self, active_task: ActiveTaskModel) -> None:
"""SaveSpiffStepDetails."""
details_model = self.spiff_step_details()
details_model.lane_assignment_id = active_task.lane_assignment_id
db.session.add(details_model)
db.session.commit()
@ -680,41 +671,24 @@ class ProcessInstanceProcessor:
return parser
@staticmethod
def backfill_missing_bpmn_process_id_lookup_records(
def backfill_missing_spec_reference_records(
bpmn_process_identifier: str,
) -> Optional[str]:
"""Backfill_missing_bpmn_process_id_lookup_records."""
"""Backfill_missing_spec_reference_records."""
process_models = ProcessModelService().get_process_models()
for process_model in process_models:
if process_model.primary_file_name:
try:
etree_element = SpecFileService.get_etree_element_from_file_name(
process_model, process_model.primary_file_name
)
bpmn_process_identifiers = []
except ProcessModelFileNotFoundError:
# if primary_file_name doesn't actually exist on disk, then just go on to the next process_model
continue
try:
bpmn_process_identifiers = (
SpecFileService.get_executable_bpmn_process_identifiers(
etree_element
)
)
except ValidationException:
# ignore validation errors here
pass
try:
refs = SpecFileService.reference_map(
SpecFileService.get_references_for_process(process_model)
)
bpmn_process_identifiers = refs.keys()
if bpmn_process_identifier in bpmn_process_identifiers:
SpecFileService.store_bpmn_process_identifiers(
process_model,
process_model.primary_file_name,
etree_element,
)
SpecFileService.update_process_cache(refs[bpmn_process_identifier])
return FileSystemService.full_path_to_process_model_file(
process_model
)
except Exception:
current_app.logger.warning(f"Failed to parse process {process_model.id}")
return None
@staticmethod
@ -727,18 +701,22 @@ class ProcessInstanceProcessor:
"bpmn_file_full_path_from_bpmn_process_identifier: bpmn_process_identifier is unexpectedly None"
)
bpmn_process_id_lookup = BpmnProcessIdLookup.query.filter_by(
bpmn_process_identifier=bpmn_process_identifier
).first()
spec_reference = (
SpecReferenceCache.query.filter_by(identifier=bpmn_process_identifier)
.filter_by(type="process")
.first()
)
bpmn_file_full_path = None
if bpmn_process_id_lookup is None:
bpmn_file_full_path = ProcessInstanceProcessor.backfill_missing_bpmn_process_id_lookup_records(
bpmn_process_identifier
if spec_reference is None:
bpmn_file_full_path = (
ProcessInstanceProcessor.backfill_missing_spec_reference_records(
bpmn_process_identifier
)
)
else:
bpmn_file_full_path = os.path.join(
FileSystemService.root_path(),
bpmn_process_id_lookup.bpmn_file_relative_path,
spec_reference.relative_path,
)
if bpmn_file_full_path is None:
raise (
@ -1161,11 +1139,11 @@ class ProcessInstanceProcessor:
)
return user_tasks # type: ignore
def complete_task(self, task: SpiffTask) -> None:
def complete_task(self, task: SpiffTask, active_task: ActiveTaskModel) -> None:
"""Complete_task."""
self.increment_spiff_step()
self.bpmn_process_instance.complete_task_from_id(task.id)
self.save_spiff_step_details()
self.save_spiff_step_details(active_task)
def get_data(self) -> dict[str, Any]:
"""Get_data."""

View File

@ -0,0 +1,118 @@
"""Process_instance_report_service."""
from dataclasses import dataclass
from typing import Optional
from spiffworkflow_backend.models.process_instance_report import (
ProcessInstanceReportModel,
)
@dataclass
class ProcessInstanceReportFilter:
"""ProcessInstanceReportFilter."""
process_model_identifier: Optional[str] = None
start_from: Optional[int] = None
start_to: Optional[int] = None
end_from: Optional[int] = None
end_to: Optional[int] = None
process_status: Optional[list[str]] = None
def to_dict(self) -> dict[str, str]:
"""To_dict."""
d = {}
if self.process_model_identifier is not None:
d["process_model_identifier"] = self.process_model_identifier
if self.start_from is not None:
d["start_from"] = str(self.start_from)
if self.start_to is not None:
d["start_to"] = str(self.start_to)
if self.end_from is not None:
d["end_from"] = str(self.end_from)
if self.end_to is not None:
d["end_to"] = str(self.end_to)
if self.process_status is not None:
d["process_status"] = ",".join(self.process_status)
return d
class ProcessInstanceReportService:
"""ProcessInstanceReportService."""
@classmethod
def filter_by_to_dict(
cls, process_instance_report: ProcessInstanceReportModel
) -> dict[str, str]:
"""Filter_by_to_dict."""
metadata = process_instance_report.report_metadata
filter_by = metadata.get("filter_by", [])
filters = {
d["field_name"]: d["field_value"]
for d in filter_by
if "field_name" in d and "field_value" in d
}
return filters
@classmethod
def filter_from_metadata(
cls, process_instance_report: ProcessInstanceReportModel
) -> ProcessInstanceReportFilter:
"""Filter_from_metadata."""
filters = cls.filter_by_to_dict(process_instance_report)
def int_value(key: str) -> Optional[int]:
"""Int_value."""
return int(filters[key]) if key in filters else None
def list_value(key: str) -> Optional[list[str]]:
"""List_value."""
return filters[key].split(",") if key in filters else None
process_model_identifier = filters.get("process_model_identifier")
start_from = int_value("start_from")
start_to = int_value("start_to")
end_from = int_value("end_from")
end_to = int_value("end_to")
process_status = list_value("process_status")
report_filter = ProcessInstanceReportFilter(
process_model_identifier,
start_from,
start_to,
end_from,
end_to,
process_status,
)
return report_filter
@classmethod
def filter_from_metadata_with_overrides(
cls,
process_instance_report: ProcessInstanceReportModel,
process_model_identifier: Optional[str] = None,
start_from: Optional[int] = None,
start_to: Optional[int] = None,
end_from: Optional[int] = None,
end_to: Optional[int] = None,
process_status: Optional[str] = None,
) -> ProcessInstanceReportFilter:
"""Filter_from_metadata_with_overrides."""
report_filter = cls.filter_from_metadata(process_instance_report)
if process_model_identifier is not None:
report_filter.process_model_identifier = process_model_identifier
if start_from is not None:
report_filter.start_from = start_from
if start_to is not None:
report_filter.start_to = start_to
if end_from is not None:
report_filter.end_from = end_from
if end_to is not None:
report_filter.end_to = end_to
if process_status is not None:
report_filter.process_status = process_status.split(",")
return report_filter
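A small worked example of the filter dataclass (values are made up): to_dict serializes only the fields that are set, turning the status list back into the comma-separated form used on the wire.

report_filter = ProcessInstanceReportFilter(
    process_model_identifier="finance/invoice",
    process_status=["complete", "error"],
)
assert report_filter.to_dict() == {
    "process_model_identifier": "finance/invoice",
    "process_status": "complete,error",
}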

View File

@ -8,6 +8,7 @@ from flask_bpmn.api.api_error import ApiError
from flask_bpmn.models.db import db
from SpiffWorkflow.task import Task as SpiffTask # type: ignore
from spiffworkflow_backend.models.active_task import ActiveTaskModel
from spiffworkflow_backend.models.process_instance import ProcessInstanceApi
from spiffworkflow_backend.models.process_instance import ProcessInstanceModel
from spiffworkflow_backend.models.process_instance import ProcessInstanceStatus
@ -188,6 +189,7 @@ class ProcessInstanceService:
spiff_task: SpiffTask,
data: dict[str, Any],
user: UserModel,
active_task: ActiveTaskModel,
) -> None:
"""All the things that need to happen when we complete a form.
@ -201,7 +203,7 @@ class ProcessInstanceService:
dot_dct = ProcessInstanceService.create_dot_dict(data)
spiff_task.update_data(dot_dct)
# ProcessInstanceService.post_process_form(spiff_task) # some properties may update the data store.
processor.complete_task(spiff_task)
processor.complete_task(spiff_task, active_task)
processor.do_engine_steps(save=True)
@staticmethod
@ -313,3 +315,22 @@ class ProcessInstanceService:
)
return task
@staticmethod
def serialize_flat_with_task_data(
process_instance: ProcessInstanceModel,
) -> dict[str, Any]:
"""serialize_flat_with_task_data."""
results = {}
try:
original_status = process_instance.status
processor = ProcessInstanceProcessor(process_instance)
process_instance.data = processor.get_current_data()
results = process_instance.serialized_flat
# this process seems to mutate the status of the process_instance which
# can result in different results than expected from process_instance_list,
# so set the status back to the expected value
results["status"] = original_status
except ApiError:
results = process_instance.serialized
return results
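Rough usage, mirroring process_instance_list above (process_instances is assumed to be an iterable of ProcessInstanceModel rows):

results = [
    ProcessInstanceService.serialize_flat_with_task_data(process_instance)
    for process_instance in process_instances
]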

View File

@ -2,21 +2,15 @@
import os
import shutil
from datetime import datetime
from typing import Any
from typing import List
from typing import Optional
from flask_bpmn.api.api_error import ApiError
from flask_bpmn.models.db import db
from lxml import etree # type: ignore
from lxml.etree import _Element # type: ignore
from lxml.etree import Element as EtreeElement
from SpiffWorkflow.bpmn.parser.ValidationException import ValidationException # type: ignore
from spiffworkflow_backend.models.bpmn_process_id_lookup import BpmnProcessIdLookup
from spiffworkflow_backend.models.file import File
from spiffworkflow_backend.models.file import FileReference
from spiffworkflow_backend.models.file import FileType
from spiffworkflow_backend.models.file import SpecReference
from spiffworkflow_backend.models.message_correlation_property import (
MessageCorrelationPropertyModel,
)
@ -25,6 +19,8 @@ from spiffworkflow_backend.models.message_triggerable_process_model import (
MessageTriggerableProcessModel,
)
from spiffworkflow_backend.models.process_model import ProcessModelInfo
from spiffworkflow_backend.models.spec_reference import SpecReferenceCache
from spiffworkflow_backend.services.custom_parser import MyCustomParser
from spiffworkflow_backend.services.file_system_service import FileSystemService
from spiffworkflow_backend.services.process_model_service import ProcessModelService
@ -49,7 +45,9 @@ class SpecFileService(FileSystemService):
) -> List[File]:
"""Return all files associated with a workflow specification."""
# path = SpecFileService.workflow_path(process_model_info)
path = os.path.join(FileSystemService.root_path(), process_model_info.id)
path = os.path.join(
FileSystemService.root_path(), process_model_info.id_for_file_path()
)
files = SpecFileService._get_files(path, file_name)
if extension_filter != "":
files = list(
@ -57,37 +55,86 @@ class SpecFileService(FileSystemService):
)
return files
@staticmethod
def reference_map(references: list[SpecReference]) -> dict[str, SpecReference]:
"""Creates a dict with provided references organized by id."""
ref_map = {}
for ref in references:
ref_map[ref.identifier] = ref
return ref_map
@staticmethod
def get_references_for_process(
process_model_info: ProcessModelInfo,
) -> list[SpecReference]:
"""Get_references_for_process."""
files = SpecFileService.get_files(process_model_info)
references = []
for file in files:
references.extend(
SpecFileService.get_references_for_file(file, process_model_info)
)
return references
@staticmethod
def get_references_for_file(
file: File, process_model_info: ProcessModelInfo, parser_class: Any
) -> list[FileReference]:
file: File, process_model_info: ProcessModelInfo
) -> list[SpecReference]:
"""Uses spiffworkflow to parse BPMN and DMN files to determine how they can be externally referenced.
Returns a list of SpecReference objects that contain the type of reference, the identifier, and the display name.
Ex.
identifier = {str} 'Level3'
display_name = {str} 'Level 3'
type = {str} 'process'
type = {str} 'process' / 'decision'
"""
references: list[FileReference] = []
file_path = SpecFileService.file_path(process_model_info, file.name)
parser = parser_class()
references: list[SpecReference] = []
full_file_path = SpecFileService.full_file_path(process_model_info, file.name)
file_path = os.path.join(process_model_info.id_for_file_path(), file.name)
parser = MyCustomParser()
parser_type = None
sub_parser = None
has_lanes = False
is_executable = True
is_primary = False
messages = {}
correlations = {}
start_messages = []
if file.type == FileType.bpmn.value:
parser.add_bpmn_file(file_path)
parser.add_bpmn_file(full_file_path)
parser_type = "process"
sub_parsers = list(parser.process_parsers.values())
messages = parser.messages
correlations = parser.correlations
elif file.type == FileType.dmn.value:
parser.add_dmn_file(file_path)
parser.add_dmn_file(full_file_path)
sub_parsers = list(parser.dmn_parsers.values())
parser_type = "decision"
else:
return references
for sub_parser in sub_parsers:
if parser_type == "process":
has_lanes = sub_parser.has_lanes()
is_executable = sub_parser.process_executable
start_messages = sub_parser.start_messages()
is_primary = (
sub_parser.get_id() == process_model_info.primary_process_id
)
references.append(
FileReference(
id=sub_parser.get_id(), name=sub_parser.get_name(), type=parser_type
SpecReference(
identifier=sub_parser.get_id(),
display_name=sub_parser.get_name(),
process_model_id=process_model_info.id,
type=parser_type,
file_name=file.name,
relative_path=file_path,
has_lanes=has_lanes,
is_executable=is_executable,
messages=messages,
is_primary=is_primary,
correlations=correlations,
start_messages=start_messages,
)
)
return references
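A rough usage sketch mirroring process_model_show (process_model is assumed to be a ProcessModelInfo loaded elsewhere): every file in a model can be annotated with its references in one pass.

for file in SpecFileService.get_files(process_model):
    file.references = SpecFileService.get_references_for_file(file, process_model)
    for ref in file.references:
        print(ref.type, ref.identifier, ref.display_name, ref.is_primary)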
@ -106,61 +153,63 @@ class SpecFileService(FileSystemService):
) -> File:
"""Update_file."""
SpecFileService.assert_valid_file_name(file_name)
# file_path = SpecFileService.file_path(process_model_info, file_name)
file_path = os.path.join(
FileSystemService.root_path(), process_model_info.id, file_name
full_file_path = SpecFileService.full_file_path(process_model_info, file_name)
SpecFileService.write_file_data_to_system(full_file_path, binary_data)
file = SpecFileService.to_file_object(file_name, full_file_path)
references = SpecFileService.get_references_for_file(file, process_model_info)
primary_process_ref = next(
(ref for ref in references if ref.is_primary and ref.is_executable), None
)
SpecFileService.write_file_data_to_system(file_path, binary_data)
file = SpecFileService.to_file_object(file_name, file_path)
if file.type == FileType.bpmn.value:
set_primary_file = False
if (
process_model_info.primary_file_name is None
or file_name == process_model_info.primary_file_name
):
# If no primary process exists, make this primary process.
set_primary_file = True
SpecFileService.process_bpmn_file(
process_model_info,
file_name,
binary_data,
set_primary_file=set_primary_file,
)
SpecFileService.clear_caches_for_file(file_name, process_model_info)
for ref in references:
# If no valid primary process is defined, default to the first process in the
# updated file.
if not primary_process_ref and ref.type == "process" and ref.is_executable:
ref.is_primary = True
if ref.is_primary:
ProcessModelService().update_spec(
process_model_info,
{
"primary_process_id": ref.identifier,
"primary_file_name": file_name,
"is_review": ref.has_lanes,
},
)
SpecFileService.update_caches(ref)
return file
@staticmethod
def get_data(process_model_info: ProcessModelInfo, file_name: str) -> bytes:
"""Get_data."""
# file_path = SpecFileService.file_path(process_model_info, file_name)
file_path = os.path.join(
FileSystemService.root_path(), process_model_info.id, file_name
)
if not os.path.exists(file_path):
full_file_path = SpecFileService.full_file_path(process_model_info, file_name)
if not os.path.exists(full_file_path):
raise ProcessModelFileNotFoundError(
f"No file found with name {file_name} in {process_model_info.display_name}"
)
with open(file_path, "rb") as f_handle:
with open(full_file_path, "rb") as f_handle:
spec_file_data = f_handle.read()
return spec_file_data
@staticmethod
def file_path(spec: ProcessModelInfo, file_name: str) -> str:
def full_file_path(spec: ProcessModelInfo, file_name: str) -> str:
"""File_path."""
return os.path.join(SpecFileService.workflow_path(spec), file_name)
@staticmethod
def last_modified(spec: ProcessModelInfo, file_name: str) -> datetime:
"""Last_modified."""
path = SpecFileService.file_path(spec, file_name)
return FileSystemService._last_modified(path)
full_file_path = SpecFileService.full_file_path(spec, file_name)
return FileSystemService._last_modified(full_file_path)
@staticmethod
def timestamp(spec: ProcessModelInfo, file_name: str) -> float:
"""Timestamp."""
path = SpecFileService.file_path(spec, file_name)
return FileSystemService._timestamp(path)
full_file_path = SpecFileService.full_file_path(spec, file_name)
return FileSystemService._timestamp(full_file_path)
@staticmethod
def delete_file(spec: ProcessModelInfo, file_name: str) -> None:
@ -170,9 +219,8 @@ class SpecFileService(FileSystemService):
# for lf in lookup_files:
# session.query(LookupDataModel).filter_by(lookup_file_model_id=lf.id).delete()
# session.query(LookupFileModel).filter_by(id=lf.id).delete()
# file_path = SpecFileService.file_path(spec, file_name)
file_path = os.path.join(FileSystemService.root_path(), spec.id, file_name)
os.remove(file_path)
full_file_path = SpecFileService.full_file_path(spec, file_name)
os.remove(full_file_path)
@staticmethod
def delete_all_files(spec: ProcessModelInfo) -> None:
@ -181,354 +229,141 @@ class SpecFileService(FileSystemService):
if os.path.exists(dir_path):
shutil.rmtree(dir_path)
@staticmethod
def get_etree_element_from_file_name(
process_model_info: ProcessModelInfo, file_name: str
) -> EtreeElement:
"""Get_etree_element_from_file_name."""
binary_data = SpecFileService.get_data(process_model_info, file_name)
return SpecFileService.get_etree_element_from_binary_data(
binary_data, file_name
)
# fixme: Place all the caching stuff in a different service.
@staticmethod
def get_etree_element_from_binary_data(
binary_data: bytes, file_name: str
) -> EtreeElement:
"""Get_etree_element_from_binary_data."""
try:
return etree.fromstring(binary_data)
except etree.XMLSyntaxError as xse:
raise ApiError(
"invalid_xml",
"Failed to parse xml: " + str(xse),
file_name=file_name,
) from xse
def update_caches(ref: SpecReference) -> None:
"""Update_caches."""
SpecFileService.update_process_cache(ref)
SpecFileService.update_message_cache(ref)
SpecFileService.update_message_trigger_cache(ref)
SpecFileService.update_correlation_cache(ref)
@staticmethod
def process_bpmn_file(
process_model_info: ProcessModelInfo,
file_name: str,
binary_data: Optional[bytes] = None,
set_primary_file: Optional[bool] = False,
def clear_caches_for_file(
file_name: str, process_model_info: ProcessModelInfo
) -> None:
"""Set_primary_bpmn."""
# If this is a BPMN, extract the process id and determine if it contains swim lanes.
extension = SpecFileService.get_extension(file_name)
file_type = FileType[extension]
if file_type == FileType.bpmn:
if not binary_data:
binary_data = SpecFileService.get_data(process_model_info, file_name)
"""Clear all caches related to a file."""
db.session.query(SpecReferenceCache).filter(
SpecReferenceCache.file_name == file_name
).filter(SpecReferenceCache.process_model_id == process_model_info.id).delete()
# fixme: likely the other caches should be cleared as well, but we don't have a clean way to do so yet.
bpmn_etree_element: EtreeElement = (
SpecFileService.get_etree_element_from_binary_data(
binary_data, file_name
)
)
@staticmethod
def clear_caches() -> None:
"""Clear_caches."""
db.session.query(SpecReferenceCache).delete()
# fixme: likely the other caches should be cleared as well, but we don't have a clean way to do so yet.
try:
if set_primary_file:
attributes_to_update = {
"primary_process_id": (
SpecFileService.get_bpmn_process_identifier(
bpmn_etree_element
)
),
"primary_file_name": file_name,
"is_review": SpecFileService.has_swimlane(bpmn_etree_element),
}
ProcessModelService().update_spec(
process_model_info, attributes_to_update
)
SpecFileService.check_for_message_models(
bpmn_etree_element, process_model_info
)
SpecFileService.store_bpmn_process_identifiers(
process_model_info, file_name, bpmn_etree_element
)
except ValidationException as ve:
if ve.args[0].find("No executable process tag found") >= 0:
raise ApiError(
error_code="missing_executable_option",
message="No executable process tag found. Please make sure the Executable option is set in the workflow.",
) from ve
else:
raise ApiError(
error_code="validation_error",
message=f"There was an error validating your workflow. Original message is: {ve}",
) from ve
@staticmethod
def update_process_cache(ref: SpecReference) -> None:
"""Update_process_cache."""
process_id_lookup = (
SpecReferenceCache.query.filter_by(identifier=ref.identifier)
.filter_by(type=ref.type)
.first()
)
if process_id_lookup is None:
process_id_lookup = SpecReferenceCache.from_spec_reference(ref)
db.session.add(process_id_lookup)
db.session.commit()
else:
raise ApiError(
"invalid_xml",
"Only a BPMN can be the primary file.",
file_name=file_name,
)
@staticmethod
def has_swimlane(et_root: _Element) -> bool:
"""Look through XML and determine if there are any lanes present that have a label."""
elements = et_root.xpath(
"//bpmn:lane",
namespaces={"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"},
)
retval = False
for el in elements:
if el.get("name"):
retval = True
return retval
@staticmethod
def append_identifier_of_process_to_array(
process_element: _Element, process_identifiers: list[str]
) -> None:
"""Append_identifier_of_process_to_array."""
process_id_key = "id"
if "name" in process_element.attrib:
process_id_key = "name"
process_identifiers.append(process_element.attrib[process_id_key])
@staticmethod
def get_all_bpmn_process_identifiers_for_process_model(
process_model_info: ProcessModelInfo,
) -> list[str]:
"""Get_all_bpmn_process_identifiers_for_process_model."""
if process_model_info.primary_file_name is None:
return []
binary_data = SpecFileService.get_data(
process_model_info, process_model_info.primary_file_name
)
et_root: EtreeElement = SpecFileService.get_etree_element_from_binary_data(
binary_data, process_model_info.primary_file_name
)
process_identifiers: list[str] = []
for child in et_root:
if child.tag.endswith("process") and child.attrib.get(
"isExecutable", False
):
subprocesses = child.xpath(
"//bpmn:subProcess",
namespaces={"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"},
if ref.relative_path != process_id_lookup.relative_path:
full_bpmn_file_path = SpecFileService.full_path_from_relative_path(
process_id_lookup.relative_path
)
for subprocess in subprocesses:
SpecFileService.append_identifier_of_process_to_array(
subprocess, process_identifiers
# if the old relative bpmn file no longer exists, then assume things were moved around
# on the file system. Otherwise, assume it is a duplicate process id and error.
if os.path.isfile(full_bpmn_file_path):
raise ValidationException(
f"Process id ({ref.identifier}) has already been used for "
f"{process_id_lookup.relative_path}. It cannot be reused."
)
SpecFileService.append_identifier_of_process_to_array(
child, process_identifiers
)
if len(process_identifiers) == 0:
raise ValidationException("No executable process tag found")
return process_identifiers
else:
process_id_lookup.relative_path = ref.relative_path
db.session.add(process_id_lookup)
db.session.commit()
@staticmethod
def get_executable_process_elements(et_root: _Element) -> list[_Element]:
"""Get_executable_process_elements."""
process_elements = []
for child in et_root:
if child.tag.endswith("process") and child.attrib.get(
"isExecutable", False
):
process_elements.append(child)
if len(process_elements) == 0:
raise ValidationException("No executable process tag found")
return process_elements
@staticmethod
def get_executable_bpmn_process_identifiers(et_root: _Element) -> list[str]:
"""Get_executable_bpmn_process_identifiers."""
process_elements = SpecFileService.get_executable_process_elements(et_root)
bpmn_process_identifiers = [pe.attrib["id"] for pe in process_elements]
return bpmn_process_identifiers
@staticmethod
def get_bpmn_process_identifier(et_root: _Element) -> str:
"""Get_bpmn_process_identifier."""
process_elements = SpecFileService.get_executable_process_elements(et_root)
# There are multiple root elements
if len(process_elements) > 1:
# Look for the element that has the startEvent in it
for e in process_elements:
this_element: EtreeElement = e
for child_element in list(this_element):
if child_element.tag.endswith("startEvent"):
# coerce Any to string
return str(this_element.attrib["id"])
raise ValidationException(
"No start event found in %s" % et_root.attrib["id"]
)
return str(process_elements[0].attrib["id"])
@staticmethod
def store_bpmn_process_identifiers(
process_model_info: ProcessModelInfo, bpmn_file_name: str, et_root: _Element
) -> None:
"""Store_bpmn_process_identifiers."""
relative_process_model_path = process_model_info.id_for_file_path()
relative_bpmn_file_path = os.path.join(
relative_process_model_path, bpmn_file_name
)
bpmn_process_identifiers = (
SpecFileService.get_executable_bpmn_process_identifiers(et_root)
)
for bpmn_process_identifier in bpmn_process_identifiers:
process_id_lookup = BpmnProcessIdLookup.query.filter_by(
bpmn_process_identifier=bpmn_process_identifier
def update_message_cache(ref: SpecReference) -> None:
"""Assure we have a record in the database of all possible message ids and names."""
for message_model_identifier in ref.messages.keys():
message_model = MessageModel.query.filter_by(
identifier=message_model_identifier
).first()
if process_id_lookup is None:
process_id_lookup = BpmnProcessIdLookup(
bpmn_process_identifier=bpmn_process_identifier,
bpmn_file_relative_path=relative_bpmn_file_path,
if message_model is None:
message_model = MessageModel(
identifier=message_model_identifier,
name=ref.messages[message_model_identifier],
)
db.session.add(process_id_lookup)
db.session.add(message_model)
db.session.commit()
@staticmethod
def update_message_trigger_cache(ref: SpecReference) -> None:
"""Assure we know which messages can trigger the start of a process."""
for message_model_identifier in ref.start_messages:
message_model = MessageModel.query.filter_by(
identifier=message_model_identifier
).first()
if message_model is None:
raise ValidationException(
f"Could not find message model with identifier '{message_model_identifier}'"
f"Required by a Start Event in : {ref.file_name}"
)
message_triggerable_process_model = (
MessageTriggerableProcessModel.query.filter_by(
message_model_id=message_model.id,
).first()
)
if message_triggerable_process_model is None:
message_triggerable_process_model = MessageTriggerableProcessModel(
message_model_id=message_model.id,
process_model_identifier=ref.process_model_id,
process_group_identifier="process_group_identifier",
)
db.session.add(message_triggerable_process_model)
db.session.commit()
else:
if relative_bpmn_file_path != process_id_lookup.bpmn_file_relative_path:
full_bpmn_file_path = SpecFileService.full_path_from_relative_path(
process_id_lookup.bpmn_file_relative_path
if (
message_triggerable_process_model.process_model_identifier
!= ref.process_model_id
# or message_triggerable_process_model.process_group_identifier
# != process_model_info.process_group_id
):
raise ValidationException(
f"Message model is already used to start process model {ref.process_model_id}"
)
# if the old relative bpmn file no longer exists, then assume things were moved around
# on the file system. Otherwise, assume it is a duplicate process id and error.
if os.path.isfile(full_bpmn_file_path):
raise ValidationException(
f"Process id ({bpmn_process_identifier}) has already been used for "
f"{process_id_lookup.bpmn_file_relative_path}. It cannot be reused."
)
else:
process_id_lookup.bpmn_file_relative_path = (
relative_bpmn_file_path
)
db.session.add(process_id_lookup)
db.session.commit()
@staticmethod
def check_for_message_models(
et_root: _Element, process_model_info: ProcessModelInfo
) -> None:
"""Check_for_message_models."""
for child in et_root:
if child.tag.endswith("message"):
message_model_identifier = child.attrib.get("id")
message_name = child.attrib.get("name")
if message_model_identifier is None:
raise ValidationException(
"Message identifier is missing from bpmn xml"
)
def update_correlation_cache(ref: SpecReference) -> None:
"""Update_correlation_cache."""
for correlation_identifier in ref.correlations.keys():
correlation_property_retrieval_expressions = ref.correlations[
correlation_identifier
]["retrieval_expressions"]
for cpre in correlation_property_retrieval_expressions:
message_model_identifier = cpre["messageRef"]
message_model = MessageModel.query.filter_by(
identifier=message_model_identifier
).first()
if message_model is None:
message_model = MessageModel(
identifier=message_model_identifier, name=message_name
raise ValidationException(
f"Could not find message model with identifier '{message_model_identifier}'"
f"specified by correlation property: {cpre}"
)
db.session.add(message_model)
# fixme: I think we are currently ignoring the correlation properties.
message_correlation_property = (
MessageCorrelationPropertyModel.query.filter_by(
identifier=correlation_identifier,
message_model_id=message_model.id,
).first()
)
if message_correlation_property is None:
message_correlation_property = MessageCorrelationPropertyModel(
identifier=correlation_identifier,
message_model_id=message_model.id,
)
db.session.add(message_correlation_property)
db.session.commit()
for child in et_root:
if child.tag.endswith("}process"):
message_event_definitions = child.xpath(
"//bpmn:startEvent/bpmn:messageEventDefinition",
namespaces={"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"},
)
if message_event_definitions:
message_event_definition = message_event_definitions[0]
message_model_identifier = message_event_definition.attrib.get(
"messageRef"
)
if message_model_identifier is None:
raise ValidationException(
"Could not find messageRef from message event definition: {message_event_definition}"
)
message_model = MessageModel.query.filter_by(
identifier=message_model_identifier
).first()
if message_model is None:
raise ValidationException(
f"Could not find message model with identifier '{message_model_identifier}'"
f"specified by message event definition: {message_event_definition}"
)
message_triggerable_process_model = (
MessageTriggerableProcessModel.query.filter_by(
message_model_id=message_model.id,
).first()
)
if message_triggerable_process_model is None:
message_triggerable_process_model = (
MessageTriggerableProcessModel(
message_model_id=message_model.id,
process_model_identifier=process_model_info.id,
process_group_identifier="process_group_identifier",
)
)
db.session.add(message_triggerable_process_model)
db.session.commit()
else:
if (
message_triggerable_process_model.process_model_identifier
!= process_model_info.id
# or message_triggerable_process_model.process_group_identifier
# != process_model_info.process_group_id
):
raise ValidationException(
f"Message model is already used to start process model {process_model_info.id}"
)
for child in et_root:
if child.tag.endswith("correlationProperty"):
correlation_identifier = child.attrib.get("id")
if correlation_identifier is None:
raise ValidationException(
"Correlation identifier is missing from bpmn xml"
)
correlation_property_retrieval_expressions = child.xpath(
"//bpmn:correlationPropertyRetrievalExpression",
namespaces={"bpmn": "http://www.omg.org/spec/BPMN/20100524/MODEL"},
)
if not correlation_property_retrieval_expressions:
raise ValidationException(
f"Correlation is missing correlation property retrieval expressions: {correlation_identifier}"
)
for cpre in correlation_property_retrieval_expressions:
message_model_identifier = cpre.attrib.get("messageRef")
if message_model_identifier is None:
raise ValidationException(
f"Message identifier is missing from correlation property: {correlation_identifier}"
)
message_model = MessageModel.query.filter_by(
identifier=message_model_identifier
).first()
if message_model is None:
raise ValidationException(
f"Could not find message model with identifier '{message_model_identifier}'"
f"specified by correlation property: {cpre}"
)
message_correlation_property = (
MessageCorrelationPropertyModel.query.filter_by(
identifier=correlation_identifier,
message_model_id=message_model.id,
).first()
)
if message_correlation_property is None:
message_correlation_property = MessageCorrelationPropertyModel(
identifier=correlation_identifier,
message_model_id=message_model.id,
)
db.session.add(message_correlation_property)
db.session.commit()
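
Both caches above follow the same find-or-create shape: look the identifier up, raise if a referenced message is unknown, insert if a correlation property is new. A minimal runnable sketch of that pattern, with dicts standing in for the SQLAlchemy models and session (illustrative names throughout):

message_models: dict[str, dict] = {}  # identifier -> "row"
correlation_properties: dict[tuple[str, str], dict] = {}

def find_or_create_correlation_property(
    correlation_identifier: str, message_ref: str
) -> dict:
    message_model = message_models.get(message_ref)
    if message_model is None:
        # The message cache is populated first, so a miss here means the
        # BPMN XML references a message that was never declared.
        raise ValueError(
            f"Could not find message model with identifier '{message_ref}'"
        )
    key = (correlation_identifier, message_ref)
    if key not in correlation_properties:
        correlation_properties[key] = {
            "identifier": correlation_identifier,
            "message": message_ref,
        }
    return correlation_properties[key]

message_models["order_total"] = {"identifier": "order_total"}
prop = find_or_create_correlation_property("invoice_number", "order_total")
assert prop["identifier"] == "invoice_number"
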

View File

@ -40,7 +40,7 @@
}</spiffworkflow:messagePayload>
</bpmn:extensionElements>
</bpmn:message>
<bpmn:process id="message_receiver_process_one" name="Message Receiver Process" isExecutable="true">
<bpmn:process id="message_receiver_process_one" name="Message Receiver Process One" isExecutable="true">
<bpmn:sequenceFlow id="Flow_11r9uiw" sourceRef="send_message_response" targetRef="Event_0q5otqd" />
<bpmn:endEvent id="Event_0q5otqd">
<bpmn:incoming>Flow_11r9uiw</bpmn:incoming>

View File

@ -40,7 +40,7 @@
}</spiffworkflow:messagePayload>
</bpmn:extensionElements>
</bpmn:message>
<bpmn:process id="message_receiver_process_two" name="Message Receiver Process" isExecutable="true">
<bpmn:process id="message_receiver_process_two" name="Message Receiver Process Two" isExecutable="true">
<bpmn:sequenceFlow id="Flow_11r9uiw" sourceRef="send_message_response" targetRef="Event_0q5otqd" />
<bpmn:endEvent id="Event_0q5otqd">
<bpmn:incoming>Flow_11r9uiw</bpmn:incoming>

View File

@ -136,6 +136,7 @@ class BaseTest:
# make sure we have a group
process_group_id, _ = os.path.split(process_model_id)
modified_process_group_id = process_group_id.replace("/", ":")
process_group_path = f"{FileSystemService.root_path()}/{process_group_id}"
if ProcessModelService().is_group(process_group_path):
@ -156,11 +157,12 @@ class BaseTest:
user = self.find_or_create_user()
response = client.post(
"/v1.0/process-models",
f"/v1.0/process-models/{modified_process_group_id}",
content_type="application/json",
data=json.dumps(ProcessModelInfoSchema().dump(model)),
headers=self.logged_in_headers(user),
)
assert response.status_code == 201
return response
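
Note the route change above: process models are now created under their modified process group id, with path slashes flattened to colons. A small sketch of that id munging; the helper name is hypothetical, only the replace convention comes from the test itself:

def modify_process_identifier_for_path_param(identifier: str) -> str:
    # Process group ids are directory paths on disk; the API flattens them
    # so they fit into a single path segment.
    return identifier.replace("/", ":")

assert (
    modify_process_identifier_for_path_param("test_group/sub_group")
    == "test_group:sub_group"
)
# e.g. POST /v1.0/process-models/test_group:sub_group
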

View File

@ -25,7 +25,6 @@ class ExampleDataLoader:
"""Assumes that process_model_source_directory exists in static/bpmn and contains bpmn_file_name.
further assumes that bpmn_file_name is the primary file for the process model.
if bpmn_file_name is None we load all files in process_model_source_directory,
otherwise, we only load bpmn_file_name
"""
@ -80,15 +79,16 @@ class ExampleDataLoader:
try:
file = open(file_path, "rb")
data = file.read()
SpecFileService.add_file(
file_info = SpecFileService.add_file(
process_model_info=spec, file_name=filename, binary_data=data
)
if is_primary:
SpecFileService.process_bpmn_file(
spec, filename, data, set_primary_file=True
references = SpecFileService.get_references_for_file(
file_info, spec
)
workflow_spec_service = ProcessModelService()
workflow_spec_service.save_process_model(spec)
spec.primary_process_id = references[0].identifier
spec.primary_file_name = filename
ProcessModelService().save_process_model(spec)
finally:
if file:
file.close()
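
The new loader flow above is: add the file, ask SpecFileService which references it contains, and promote the first reference of the primary file. A runnable sketch under the assumption that simple dataclasses can stand in for the real models:

from dataclasses import dataclass

@dataclass
class Reference:
    identifier: str

@dataclass
class Spec:
    primary_process_id: str = ""
    primary_file_name: str = ""

def set_primary(spec: Spec, filename: str, references: list[Reference]) -> None:
    # The loader assumes the first reference in the primary file is the
    # process to start; a multi-process file would need a smarter choice.
    spec.primary_process_id = references[0].identifier
    spec.primary_file_name = filename

spec = Spec()
set_primary(spec, "simple_form.bpmn", [Reference("Proccess_WithForm")])
assert spec.primary_process_id == "Proccess_WithForm"
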

View File

@ -143,7 +143,6 @@ class TestNestedGroups(BaseTest):
response = client.get( # noqa: F841
target_uri, headers=self.logged_in_headers(user)
)
print("test_nested_groups")
def test_add_nested_group(
self,
@ -153,10 +152,6 @@ class TestNestedGroups(BaseTest):
with_super_admin_user: UserModel,
) -> None:
"""Test_add_nested_group."""
# user = self.find_or_create_user()
# self.add_permissions_to_user(
# user, target_uri=target_uri, permission_names=["read", "create"]
# )
process_group_a = ProcessGroup(
id="group_a",
display_name="Group A",
@ -194,16 +189,14 @@ class TestNestedGroups(BaseTest):
data=json.dumps(ProcessGroupSchema().dump(process_group_c)),
)
print("test_add_nested_group")
def test_process_model_add(
def test_process_model_create(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Test_process_model_add."""
"""Test_process_model_create."""
process_group_a = ProcessGroup(
id="group_a",
display_name="Group A",
@ -242,7 +235,6 @@ class TestNestedGroups(BaseTest):
content_type="application/json",
data=json.dumps(ProcessModelInfoSchema().dump(process_model)),
)
print("test_process_model_add")
def test_process_group_show(
self,

View File

@ -25,6 +25,7 @@ from spiffworkflow_backend.models.process_instance_report import (
)
from spiffworkflow_backend.models.process_model import NotificationType
from spiffworkflow_backend.models.process_model import ProcessModelInfoSchema
from spiffworkflow_backend.models.spec_reference import SpecReferenceCache
from spiffworkflow_backend.models.user import UserModel
from spiffworkflow_backend.services.authorization_service import AuthorizationService
from spiffworkflow_backend.services.file_system_service import FileSystemService
@ -104,14 +105,14 @@ class TestProcessApi(BaseTest):
assert response.json is not None
assert response.json == expected_response_body
def test_process_model_add(
def test_process_model_create(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Test_add_new_process_model."""
"""Test_process_model_create."""
process_group_id = "test_process_group"
process_group_display_name = "Test Process Group"
# creates the group directory, and the json file
@ -439,6 +440,49 @@ class TestProcessApi(BaseTest):
assert response.json["pagination"]["total"] == 5
assert response.json["pagination"]["pages"] == 2
def test_process_list(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""It should be possible to get a list of all processes known to the system."""
load_test_spec(
"test_group_one/simple_form",
process_model_source_directory="simple_form",
bpmn_file_name="simple_form",
)
# When adding a process model with one Process, no decisions, and some json files, only one process is recorded.
assert len(SpecReferenceCache.query.all()) == 1
self.create_group_and_model_with_bpmn(
client=client,
user=with_super_admin_user,
process_group_id="test_group_two",
process_model_id="call_activity_nested",
bpmn_file_location="call_activity_nested",
)
# When adding a process model with 4 processes and a decision, 5 new records will be in the cache.
assert len(SpecReferenceCache.query.all()) == 6
# get the results
response = client.get(
"/v1.0/processes",
headers=self.logged_in_headers(with_super_admin_user),
)
assert response.json is not None
# We should get 5 back, as one of the items in the cache is a decision.
assert len(response.json) == 5
simple_form = next(
p for p in response.json if p["identifier"] == "Proccess_WithForm"
)
assert simple_form["display_name"] == "Process With Form"
assert simple_form["process_model_id"] == "test_group_one/simple_form"
assert simple_form["has_lanes"] is False
assert simple_form["is_executable"] is True
assert simple_form["is_primary"] is True
def test_process_group_add(
self,
app: Flask,

View File

@ -74,7 +74,11 @@ class TestGetLocaltime(BaseTest):
)
ProcessInstanceService.complete_form_task(
processor, spiff_task, {"timezone": "US/Pacific"}, initiator_user
processor,
spiff_task,
{"timezone": "US/Pacific"},
initiator_user,
active_task,
)
active_task = process_instance.active_tasks[0]

View File

@ -126,7 +126,7 @@ class TestAuthorizationService(BaseTest):
active_task.task_name, processor.bpmn_process_instance
)
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, initiator_user
processor, spiff_task, {}, initiator_user, active_task
)
active_task = process_instance.active_tasks[0]
@ -137,5 +137,5 @@ class TestAuthorizationService(BaseTest):
{"username": "testuser2", "sub": "open_id"}
)
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, finance_user
processor, spiff_task, {}, finance_user, active_task
)

View File

@ -47,6 +47,7 @@ class TestDotNotation(BaseTest):
processor = ProcessInstanceProcessor(process_instance)
processor.do_engine_steps(save=True)
active_task = process_instance.active_tasks[0]
user_task = processor.get_ready_user_tasks()[0]
form_data = {
@ -57,7 +58,7 @@ class TestDotNotation(BaseTest):
"invoice.dueDate": "09/30/2022",
}
ProcessInstanceService.complete_form_task(
processor, user_task, form_data, with_super_admin_user
processor, user_task, form_data, with_super_admin_user, active_task
)
expected = {

View File

@ -163,3 +163,29 @@ class TestPermissions(BaseTest):
self.assert_user_has_permission(
group_a_admin, "update", f"/{process_group_b_id}"
)
def test_user_can_access_base_path_when_given_wildcard_permission(
self, app: Flask, with_db_and_bpmn_file_cleanup: None
) -> None:
"""Test_user_can_access_base_path_when_given_wildcard_permission."""
group_a_admin = self.find_or_create_user()
permission_target = PermissionTargetModel(uri="/process-models/%")
db.session.add(permission_target)
db.session.commit()
permission_assignment = PermissionAssignmentModel(
permission_target_id=permission_target.id,
principal_id=group_a_admin.principal.id,
permission="update",
grant_type="permit",
)
db.session.add(permission_assignment)
db.session.commit()
self.assert_user_has_permission(group_a_admin, "update", "/process-models/hey")
self.assert_user_has_permission(group_a_admin, "update", "/process-models/")
self.assert_user_has_permission(group_a_admin, "update", "/process-models")
self.assert_user_has_permission(
group_a_admin, "update", "/process-modelshey", expected_result=False
)
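
The four assertions above pin down the wildcard semantics: "/process-models/%" covers the subtree and the bare base path, but not a mere string prefix. A runnable sketch of one matching rule that satisfies them; the real service presumably resolves this with a SQL LIKE against PermissionTargetModel.uri, so this is an illustration only:

def uri_matches(target: str, uri: str) -> bool:
    # A trailing "/%" grants the base path itself plus everything under it.
    if target.endswith("/%"):
        base = target[: -len("/%")]
        return uri == base or uri.startswith(base + "/")
    return uri == target

target = "/process-models/%"
assert uri_matches(target, "/process-models/hey")
assert uri_matches(target, "/process-models/")
assert uri_matches(target, "/process-models")
assert not uri_matches(target, "/process-modelshey")
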

View File

@ -91,10 +91,10 @@ class TestProcessInstanceProcessor(BaseTest):
)
with pytest.raises(UserDoesNotHaveAccessToTaskError):
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, finance_user
processor, spiff_task, {}, finance_user, active_task
)
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, initiator_user
processor, spiff_task, {}, initiator_user, active_task
)
assert len(process_instance.active_tasks) == 1
@ -108,11 +108,11 @@ class TestProcessInstanceProcessor(BaseTest):
)
with pytest.raises(UserDoesNotHaveAccessToTaskError):
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, initiator_user
processor, spiff_task, {}, initiator_user, active_task
)
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, finance_user
processor, spiff_task, {}, finance_user, active_task
)
assert len(process_instance.active_tasks) == 1
active_task = process_instance.active_tasks[0]
@ -124,7 +124,7 @@ class TestProcessInstanceProcessor(BaseTest):
active_task.task_name, processor.bpmn_process_instance
)
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, initiator_user
processor, spiff_task, {}, initiator_user, active_task
)
assert process_instance.status == ProcessInstanceStatus.complete.value
@ -173,10 +173,10 @@ class TestProcessInstanceProcessor(BaseTest):
)
with pytest.raises(UserDoesNotHaveAccessToTaskError):
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, finance_user_three
processor, spiff_task, {}, finance_user_three, active_task
)
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, initiator_user
processor, spiff_task, {}, initiator_user, active_task
)
assert len(process_instance.active_tasks) == 1
@ -190,12 +190,12 @@ class TestProcessInstanceProcessor(BaseTest):
)
with pytest.raises(UserDoesNotHaveAccessToTaskError):
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, initiator_user
processor, spiff_task, {}, initiator_user, active_task
)
g.user = finance_user_three
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, finance_user_three
processor, spiff_task, {}, finance_user_three, active_task
)
assert len(process_instance.active_tasks) == 1
active_task = process_instance.active_tasks[0]
@ -208,11 +208,11 @@ class TestProcessInstanceProcessor(BaseTest):
)
with pytest.raises(UserDoesNotHaveAccessToTaskError):
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, initiator_user
processor, spiff_task, {}, initiator_user, active_task
)
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, finance_user_four
processor, spiff_task, {}, finance_user_four, active_task
)
assert len(process_instance.active_tasks) == 1
active_task = process_instance.active_tasks[0]
@ -224,7 +224,7 @@ class TestProcessInstanceProcessor(BaseTest):
active_task.task_name, processor.bpmn_process_instance
)
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, initiator_user
processor, spiff_task, {}, initiator_user, active_task
)
assert len(process_instance.active_tasks) == 1
@ -234,8 +234,10 @@ class TestProcessInstanceProcessor(BaseTest):
)
with pytest.raises(UserDoesNotHaveAccessToTaskError):
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, initiator_user
processor, spiff_task, {}, initiator_user, active_task
)
ProcessInstanceService.complete_form_task(processor, spiff_task, {}, testadmin1)
ProcessInstanceService.complete_form_task(
processor, spiff_task, {}, testadmin1, active_task
)
assert process_instance.status == ProcessInstanceStatus.complete.value
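
complete_form_task now takes the active_task as a fifth argument, which is what lets the pytest.raises blocks above expect UserDoesNotHaveAccessToTaskError for the wrong user. A self-contained sketch of that guard; everything below is illustrative, not ProcessInstanceService's real internals:

from types import SimpleNamespace

class UserDoesNotHaveAccessToTaskError(Exception):
    pass

def complete_form_task(processor, spiff_task, data, user, active_task):
    # Reject the submission unless the user is among the task's owners.
    allowed_ids = {u.id for u in active_task.potential_owners}
    if user.id not in allowed_ids:
        raise UserDoesNotHaveAccessToTaskError(
            f"{user.username} may not complete task {active_task.task_name}"
        )
    spiff_task.update_data(data)  # the real service then completes the task

alice = SimpleNamespace(id=1, username="alice")
task = SimpleNamespace(task_name="review", potential_owners=[alice])
spiff_task = SimpleNamespace(update_data=lambda data: None)
complete_form_task(None, spiff_task, {"ok": True}, alice, task)  # no error
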

View File

@ -0,0 +1,745 @@
"""Test_process_instance_report_service."""
from typing import Optional
from flask import Flask
from flask.testing import FlaskClient
from tests.spiffworkflow_backend.helpers.base_test import BaseTest
from spiffworkflow_backend.models.process_instance_report import (
ProcessInstanceReportModel,
)
from spiffworkflow_backend.models.user import UserModel
from spiffworkflow_backend.services.process_instance_report_service import (
ProcessInstanceReportFilter,
)
from spiffworkflow_backend.services.process_instance_report_service import (
ProcessInstanceReportService,
)
class TestProcessInstanceReportFilter(BaseTest):
"""TestProcessInstanceReportFilter."""
def test_empty_filter_to_dict(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
d = ProcessInstanceReportFilter().to_dict()
assert d == {}
def test_string_value_filter_to_dict(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
d = ProcessInstanceReportFilter(process_model_identifier="bob").to_dict()
assert d == {"process_model_identifier": "bob"}
def test_int_value_filter_to_dict(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
d = ProcessInstanceReportFilter(
start_from=1,
start_to=2,
end_from=3,
end_to=4,
).to_dict()
assert d == {
"start_from": "1",
"start_to": "2",
"end_from": "3",
"end_to": "4",
}
def test_list_single_value_filter_to_dict(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
d = ProcessInstanceReportFilter(process_status=["bob"]).to_dict()
assert d == {"process_status": "bob"}
def test_list_multiple_value_filter_to_dict(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
d = ProcessInstanceReportFilter(process_status=["joe", "bob", "sue"]).to_dict()
assert d == {"process_status": "joe,bob,sue"}
class TestProcessInstanceReportService(BaseTest):
"""TestProcessInstanceReportService."""
def _filter_from_metadata(
self, report_metadata: dict
) -> ProcessInstanceReportFilter:
"""Docstring."""
report = ProcessInstanceReportModel(
identifier="test",
created_by_id=1,
report_metadata=report_metadata,
)
return ProcessInstanceReportService.filter_from_metadata(report)
def _filter_from_metadata_with_overrides(
self,
report_metadata: dict,
process_model_identifier: Optional[str] = None,
start_from: Optional[int] = None,
start_to: Optional[int] = None,
end_from: Optional[int] = None,
end_to: Optional[int] = None,
process_status: Optional[str] = None,
) -> ProcessInstanceReportFilter:
"""Docstring."""
report = ProcessInstanceReportModel(
identifier="test",
created_by_id=1,
report_metadata=report_metadata,
)
return ProcessInstanceReportService.filter_from_metadata_with_overrides(
report,
process_model_identifier,
start_from,
start_to,
end_from,
end_to,
process_status,
)
def _filter_by_dict_from_metadata(self, report_metadata: dict) -> dict[str, str]:
"""Docstring."""
report = ProcessInstanceReportModel(
identifier="test",
created_by_id=1,
report_metadata=report_metadata,
)
return ProcessInstanceReportService.filter_by_to_dict(report)
def test_filter_by_to_dict_no_filter_by(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
filters = self._filter_by_dict_from_metadata(
{
"columns": [],
}
)
assert filters == {}
def test_filter_by_to_dict_empty_filter_by(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
filters = self._filter_by_dict_from_metadata(
{
"columns": [],
"filter_by": [],
}
)
assert filters == {}
def test_filter_by_to_dict_single_filter_by(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
filters = self._filter_by_dict_from_metadata(
{
"columns": [],
"filter_by": [{"field_name": "end_to", "field_value": "1234"}],
}
)
assert filters == {"end_to": "1234"}
def test_filter_by_to_dict_multiple_filter_by(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
filters = self._filter_by_dict_from_metadata(
{
"columns": [],
"filter_by": [
{"field_name": "end_to", "field_value": "1234"},
{"field_name": "end_from", "field_value": "4321"},
],
}
)
assert filters == {"end_to": "1234", "end_from": "4321"}
def test_report_with_no_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_with_empty_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_with_unknown_filter_field_name(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [{"field_name": "bob", "field_value": "joe"}],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_with_unknown_filter_keys(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [{"_name": "bob", "_value": "joe"}],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_with_process_model_identifier_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [
{"field_name": "process_model_identifier", "field_value": "bob"}
],
}
)
assert report_filter.process_model_identifier == "bob"
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_with_start_from_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [{"field_name": "start_from", "field_value": "1234"}],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from == 1234
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_with_start_to_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [{"field_name": "start_to", "field_value": "1234"}],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to == 1234
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_with_end_from_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [{"field_name": "end_from", "field_value": "1234"}],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from == 1234
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_with_end_to_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [{"field_name": "end_to", "field_value": "1234"}],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to == 1234
assert report_filter.process_status is None
def test_report_with_single_status_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [{"field_name": "process_status", "field_value": "ready"}],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status == ["ready"]
def test_report_with_multiple_status_filters(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [
{
"field_name": "process_status",
"field_value": "ready,completed,other",
}
],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status == ["ready", "completed", "other"]
def test_report_with_multiple_filters(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata(
{
"columns": [],
"filter_by": [
{"field_name": "start_from", "field_value": "44"},
{"field_name": "end_from", "field_value": "55"},
{"field_name": "process_status", "field_value": "ready"},
],
}
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from == 44
assert report_filter.start_to is None
assert report_filter.end_from == 55
assert report_filter.end_to is None
assert report_filter.process_status == ["ready"]
def test_report_no_override_with_no_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
},
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_override_with_no_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
},
end_to=54321,
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to == 54321
assert report_filter.process_status is None
def test_report_override_process_model_identifier_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
"filter_by": [
{"field_name": "process_model_identifier", "field_value": "bob"}
],
},
process_model_identifier="joe",
)
assert report_filter.process_model_identifier == "joe"
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_override_start_from_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
"filter_by": [{"field_name": "start_from", "field_value": "123"}],
},
start_from=321,
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from == 321
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_override_start_to_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
"filter_by": [{"field_name": "start_to", "field_value": "123"}],
},
start_to=321,
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to == 321
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_override_end_from_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
"filter_by": [{"field_name": "end_from", "field_value": "123"}],
},
end_from=321,
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from == 321
assert report_filter.end_to is None
assert report_filter.process_status is None
def test_report_override_end_to_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
"filter_by": [{"field_name": "end_to", "field_value": "123"}],
},
end_to=321,
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to == 321
assert report_filter.process_status is None
def test_report_override_process_status_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
"filter_by": [
{"field_name": "process_status", "field_value": "joe,bob"}
],
},
process_status="sue",
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status == ["sue"]
def test_report_override_multiple_process_status_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
"filter_by": [{"field_name": "process_status", "field_value": "sue"}],
},
process_status="joe,bob",
)
assert report_filter.process_model_identifier is None
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status == ["joe", "bob"]
def test_report_override_does_not_override_other_filters(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
"filter_by": [
{"field_name": "process_model_identifier", "field_value": "sue"},
{"field_name": "process_status", "field_value": "sue"},
],
},
process_status="joe,bob",
)
assert report_filter.process_model_identifier == "sue"
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status == ["joe", "bob"]
def test_report_override_of_none_does_not_override_filter(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""Docstring."""
report_filter = self._filter_from_metadata_with_overrides(
{
"columns": [],
"filter_by": [
{"field_name": "process_model_identifier", "field_value": "sue"},
{"field_name": "process_status", "field_value": "sue"},
],
},
process_status=None,
)
assert report_filter.process_model_identifier == "sue"
assert report_filter.start_from is None
assert report_filter.start_to is None
assert report_filter.end_from is None
assert report_filter.end_to is None
assert report_filter.process_status == ["sue"]

View File

@ -5,8 +5,8 @@ from flask_bpmn.models.db import db
from tests.spiffworkflow_backend.helpers.base_test import BaseTest
from tests.spiffworkflow_backend.helpers.test_data import load_test_spec
from spiffworkflow_backend.models.bpmn_process_id_lookup import BpmnProcessIdLookup
from spiffworkflow_backend.models.process_model import ProcessModelInfo
from spiffworkflow_backend.models.spec_reference import SpecReferenceCache
from spiffworkflow_backend.models.user import UserModel
from spiffworkflow_backend.services.process_instance_processor import (
ProcessInstanceProcessor,
@ -116,7 +116,7 @@ class TestProcessModel(BaseTest):
# delete all of the id lookup items to force to processor to find the correct
# process model when running the process
db.session.query(BpmnProcessIdLookup).delete()
db.session.query(SpecReferenceCache).delete()
db.session.commit()
processor = ProcessInstanceProcessor(process_instance)
processor.do_engine_steps(save=True)

View File

@ -4,13 +4,12 @@ import os
import pytest
from flask import Flask
from flask.testing import FlaskClient
from flask_bpmn.api.api_error import ApiError
from flask_bpmn.models.db import db
from SpiffWorkflow.dmn.parser.BpmnDmnParser import BpmnDmnParser # type: ignore
from SpiffWorkflow.bpmn.parser.ValidationException import ValidationException # type: ignore
from tests.spiffworkflow_backend.helpers.base_test import BaseTest
from tests.spiffworkflow_backend.helpers.test_data import load_test_spec
from spiffworkflow_backend.models.bpmn_process_id_lookup import BpmnProcessIdLookup
from spiffworkflow_backend.models.spec_reference import SpecReferenceCache
from spiffworkflow_backend.models.user import UserModel
from spiffworkflow_backend.services.process_model_service import ProcessModelService
from spiffworkflow_backend.services.spec_file_service import SpecFileService
@ -43,11 +42,11 @@ class TestSpecFileService(BaseTest):
bpmn_file_name=self.bpmn_file_name,
bpmn_file_location="call_activity_nested",
)
bpmn_process_id_lookups = BpmnProcessIdLookup.query.all()
bpmn_process_id_lookups = SpecReferenceCache.query.all()
assert len(bpmn_process_id_lookups) == 1
assert bpmn_process_id_lookups[0].bpmn_process_identifier == "Level1"
assert bpmn_process_id_lookups[0].identifier == "Level1"
assert (
bpmn_process_id_lookups[0].bpmn_file_relative_path
bpmn_process_id_lookups[0].relative_path
== self.call_activity_nested_relative_file_path
)
@ -68,25 +67,23 @@ class TestSpecFileService(BaseTest):
bpmn_file_name=self.bpmn_file_name,
bpmn_file_location=self.process_model_id,
)
bpmn_process_id_lookups = BpmnProcessIdLookup.query.all()
bpmn_process_id_lookups = SpecReferenceCache.query.all()
assert len(bpmn_process_id_lookups) == 1
assert bpmn_process_id_lookups[0].identifier == bpmn_process_identifier
assert (
bpmn_process_id_lookups[0].bpmn_process_identifier
== bpmn_process_identifier
)
assert (
bpmn_process_id_lookups[0].bpmn_file_relative_path
bpmn_process_id_lookups[0].relative_path
== self.call_activity_nested_relative_file_path
)
with pytest.raises(ApiError) as exception:
with pytest.raises(ValidationException) as exception:
load_test_spec(
"call_activity_nested_duplicate",
process_model_source_directory="call_activity_duplicate",
bpmn_file_name="call_activity_nested_duplicate",
)
assert f"Process id ({bpmn_process_identifier}) has already been used" in str(
exception.value
)
assert (
f"Process id ({bpmn_process_identifier}) has already been used"
in str(exception.value)
)
def test_updates_relative_file_path_when_appropriate(
self,
@ -97,9 +94,10 @@ class TestSpecFileService(BaseTest):
) -> None:
"""Test_updates_relative_file_path_when_appropriate."""
bpmn_process_identifier = "Level1"
process_id_lookup = BpmnProcessIdLookup(
bpmn_process_identifier=bpmn_process_identifier,
bpmn_file_relative_path=self.call_activity_nested_relative_file_path,
process_id_lookup = SpecReferenceCache(
identifier=bpmn_process_identifier,
relative_path=self.call_activity_nested_relative_file_path,
type="process",
)
db.session.add(process_id_lookup)
db.session.commit()
@ -113,14 +111,48 @@ class TestSpecFileService(BaseTest):
bpmn_file_location=self.process_model_id,
)
bpmn_process_id_lookups = BpmnProcessIdLookup.query.all()
bpmn_process_id_lookups = SpecReferenceCache.query.all()
assert len(bpmn_process_id_lookups) == 1
assert bpmn_process_id_lookups[0].identifier == bpmn_process_identifier
assert (
bpmn_process_id_lookups[0].bpmn_process_identifier
== bpmn_process_identifier
bpmn_process_id_lookups[0].relative_path
== self.call_activity_nested_relative_file_path
)
def test_change_the_identifier_cleans_up_cache(
self,
app: Flask,
client: FlaskClient,
with_db_and_bpmn_file_cleanup: None,
with_super_admin_user: UserModel,
) -> None:
"""When a BPMN processes identifier is changed in a file, the old id is removed from the cache."""
old_identifier = "ye_old_identifier"
process_id_lookup = SpecReferenceCache(
identifier=old_identifier,
relative_path=self.call_activity_nested_relative_file_path,
file_name=self.bpmn_file_name,
process_model_id=f"{self.process_group_id}/{self.process_model_id}",
type="process",
)
db.session.add(process_id_lookup)
db.session.commit()
self.create_group_and_model_with_bpmn(
client=client,
user=with_super_admin_user,
process_group_id=self.process_group_id,
process_model_id=self.process_model_id,
bpmn_file_name=self.bpmn_file_name,
bpmn_file_location=self.process_model_id,
)
bpmn_process_id_lookups = SpecReferenceCache.query.all()
assert len(bpmn_process_id_lookups) == 1
assert bpmn_process_id_lookups[0].identifier != old_identifier
assert bpmn_process_id_lookups[0].identifier == "Level1"
assert (
bpmn_process_id_lookups[0].bpmn_file_relative_path
bpmn_process_id_lookups[0].relative_path
== self.call_activity_nested_relative_file_path
)
@ -162,19 +194,15 @@ class TestSpecFileService(BaseTest):
files = SpecFileService.get_files(process_model_info)
file = next(filter(lambda f: f.name == "call_activity_level_3.bpmn", files))
ca_3 = SpecFileService.get_references_for_file(
file, process_model_info, BpmnDmnParser
)
ca_3 = SpecFileService.get_references_for_file(file, process_model_info)
assert len(ca_3) == 1
assert ca_3[0].name == "Level 3"
assert ca_3[0].id == "Level3"
assert ca_3[0].display_name == "Level 3"
assert ca_3[0].identifier == "Level3"
assert ca_3[0].type == "process"
file = next(filter(lambda f: f.name == "level2c.dmn", files))
dmn1 = SpecFileService.get_references_for_file(
file, process_model_info, BpmnDmnParser
)
dmn1 = SpecFileService.get_references_for_file(file, process_model_info)
assert len(dmn1) == 1
assert dmn1[0].name == "Decision 1"
assert dmn1[0].id == "Decision_0vrtcmk"
assert dmn1[0].display_name == "Decision 1"
assert dmn1[0].identifier == "Decision_0vrtcmk"
assert dmn1[0].type == "decision"