Merge branch 'main' into deploy-app-dev
This commit is contained in:
commit 32376865c0
@@ -0,0 +1,28 @@
+name: Build Docs
+
+on:
+  - push
+  - pull_request
+
+defaults:
+  run:
+    working-directory: docs
+
+jobs:
+  build:
+    name: build-docs
+    runs-on: ubuntu-latest
+    steps:
+      - name: Check out the repository
+        uses: actions/checkout@v3.3.0
+
+      - name: Set up Python
+        uses: actions/setup-python@v4.6.1
+        with:
+          python-version: 3.11
+      - name: Pip Install
+        run: |
+          pip install -r requirements.txt
+      - name: Build
+        run: |
+          ./bin/build --ci
@@ -2,7 +2,7 @@ name: Slack Notification

 on:
   workflow_run:
-    workflows: ["Backend Tests", "Frontend Tests", "Docker Image For Main Builds", "Release Builds"]
+    workflows: ["Backend Tests", "Frontend Tests", "Docker Image For Main Builds", "Release Builds", "Build Docs"]
     types: [completed]

 jobs:
@@ -0,0 +1,27 @@
+#!/usr/bin/env bash
+
+function error_handler() {
+  >&2 echo "Exited with BAD EXIT CODE '${2}' in ${0} script at line: ${1}."
+  exit "$2"
+}
+trap 'error_handler ${LINENO} $?' ERR
+set -o errtrace -o errexit -o nounset -o pipefail
+
+run_ci="false"
+if grep -qE -- "--ci\>" <<<"$@" ; then
+  run_ci="true"
+fi
+
+rm -rf _build/html
+
+sphinx_command="sphinx-autobuild"
+if [[ "$run_ci" == "true" ]]; then
+  sphinx_command="sphinx-build"
+fi
+
+#>> sphinx-build --help 2>&1 | grep -E '^ +\-[aWn]\>'
+# -a      write all files (default: only write new and changed files)
+# -j N    build in parallel with N processes where possible
+# -n      nit-picky mode, warn about all missing references
+# -W      turn warnings into errors
+"$sphinx_command" . _build/html -W -a -n -j auto
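The script above selects its flag with `grep -qE -- "--ci\>"`, where `\>` is a GNU grep word boundary, so `--ci` matches but a longer token such as `--cifoo` does not. As a sketch only (the function name is illustrative, not from the repository), the same check could be written in Python with the `\b` boundary:

```python
import re

def wants_ci(args: list) -> bool:
    """Mimic `grep -qE -- '--ci\\>' <<<"$@"`: match --ci only at a word boundary."""
    joined = " ".join(args)
    return re.search(r"--ci\b", joined) is not None
```

With that boundary, `wants_ci(["--ci"])` is true while `wants_ci(["--cifoo"])` is false, mirroring the bash behavior.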
@@ -31,9 +31,3 @@ If the Instance cannot be found by searching the list, follow these steps to search

 [How to find an Instance assigned to someone else](https://github.com/sartography/spiff-arena/blob/main/docs/how_to/find_an_Instance_assigned_to_someone_else.md)

-## **Outcome**
-
-| ✅ Success | 🚫 Error |
-| ------------------------------------------------------------ | :------------------------------------------------------------ |
-| The system will then display the parent process that contains the active instance searched for. [How to view process variables](https://github.com/sartography/spiff-arena/blob/main/docs/how_to/view_process_variables.md) | Process instance Id does not exist ![suspend_status](images/process_instance_not_found.png) Repeat Step 2 with the correct Id. Note: if the instance is no longer active, you should be able to search for it. |
-
@@ -48,10 +48,7 @@ If the desired task or process you would like to view is not found in the parent

 A pop-up menu will appear. Select ‘View Call Activity Diagram’ to navigate to the sub-process.

-```{image} images/call_activity_popup.png
-```
+![Untitled](images/call_activity_popup.png)

----
-
 ## **Outcome**
@@ -4,5 +4,4 @@
 | ⚙ How do I get there \| Menu hierarchy |
 | --- |
 | Find an **active** Process Instance |
----
@@ -4,5 +4,4 @@
 | ⚙ How do I get there \| Menu hierarchy |
 | --- |
 | Find an **active** Process Instance |
----
@@ -1,8 +1,9 @@
 # Welcome to SpiffWorkflow's documentation

 ```{toctree}
-:maxdepth: 2
+:maxdepth: 3
 :caption: Getting Started
+UsingSpiffdemo/Getting_Started.md
 learn_basics/learn_basics.md
 quick_start/quick_start.md
 ```
@@ -51,6 +52,20 @@ learn_basics/bpmn_terminology.md

 ```

+```{toctree}
+:maxdepth: 3
+:caption: Fix me unlinked files
+how_to/complete_a_task_on_behalf_of_another_user.md
+how_to/edit_process_variables.md
+how_to/find_an_Instance_assigned_to_myself.md
+how_to/find_an_Instance_assigned_to_someone_else.md
+how_to/navigate_to_an_active_process_instance.md
+how_to/resume_a_process.md
+how_to/suspend_a_process.md
+how_to/view_process_variables.md
+spiffworkflow/process_instance.md
+```
+
 ## Indices and tables
@@ -36,7 +36,7 @@ Tasks represent activities or work that needs to be performed as part of a process

 | **Task** | **Symbol** | **Description** |
 |---------------|------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
-| Service |![Untitled](images/Service_task.png) | Task that uses a Web service, an automated application, or other kinds of service in completing the task. |
+| Service |<div style="height:50px;width:50px"> ![Untitled](images/Service_task.png) | Task that uses a Web service, an automated application, or other kinds of service in completing the task. |
 | Send |![Untitled](images/Send.png) | Task that sends a Message to another pool. The Task is completed once the Message has been sent. |
 | Receive | ![Untitled](images/Receive.png) | A Receive Task indicates that the process has to wait for a message to arrive in order to continue. The Task is completed once the message has been received. |
 | User | ![Untitled](images/User.png) | A User Task represents that a human performer performs the Task with the use of a software application. |
@@ -50,7 +50,7 @@ Tasks represent activities or work that needs to be performed as part of a process
 Artifacts are used to provide additional information or documentation within a process. They include data objects (representing information or data needed for the process), annotations (providing explanatory or descriptive text), and groups (used to visually group related elements).
 | **Artifact** | **Symbol** | **Description** |
 |---------------|------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
-| Data Object |![Untitled](images/Data_Object.png) | Data objects can represent data placed to the process, data resulting from the process, data that needs to be collected |
+| Data Object |<div style="height:50px;width:50px"> ![Untitled](images/Data_Object.png) | Data objects can represent data placed to the process, data resulting from the process, data that needs to be collected |
 | Data Storage |![Untitled](images/Data_Storage.png) | Data storage provides the ability to store or access data that is associated with a business model. If your process outputs any data, it will become necessary to store that data. |
 | Group | <div style="height:50px;width:50px"> ![Untitled](images/Group.png) | Groups organize tasks or processes that have significance in the overall process. |
 | Annotation | <div style="height:50px;width:50px"> ![Untitled](images/Annotation.png) | Annotations allow you to describe the business process and flow objects in more detail. |
@@ -1 +0,0 @@
-# Navigating Spiffworkflow
@@ -94,3 +94,10 @@ if [[ -n "${SPIFFWORKFLOW_BACKEND_ENV:-}" ]] && ! grep -Eq '^(local_development|
   fi
   FLASK_APP=src/spiffworkflow_backend poetry run flask db upgrade
 fi
+
+# for ./bin/tests-par (parallel tests with xdist)
+if [[ -f "./src/instance/db_unit_testing.sqlite3" ]] ; then
+  for i in $(seq 0 16); do
+    cp ./src/instance/db_unit_testing.sqlite3 ./src/instance/db_unit_testing_gw$i.sqlite3
+  done
+fi
@@ -0,0 +1,21 @@
+#!/usr/bin/env bash
+
+function error_handler() {
+  >&2 echo "Exited with BAD EXIT CODE '${2}' in ${0} script at line: ${1}."
+  exit "$2"
+}
+trap 'error_handler ${LINENO} $?' ERR
+set -o errtrace -o errexit -o nounset -o pipefail
+
+if [[ ! -f ./src/instance/db_unit_testing_gw0.sqlite3 ]] ; then
+  >&2 echo -e "ERROR: please run the following command first in order to set up and migrate the sqlite unit_testing database:\n\n\tSPIFFWORKFLOW_BACKEND_DATABASE_TYPE=sqlite ./bin/recreate_db clean"
+  exit 1
+fi
+
+# check if the python package pytest-xdist is installed
+if ! python -c "import xdist" &>/dev/null; then
+  >&2 echo -e "ERROR: please install the python package pytest-xdist by running poetry install"
+  exit 1
+fi
+
+SPIFFWORKFLOW_BACKEND_DATABASE_TYPE=sqlite poet test -n auto -x --ff
@@ -643,6 +643,21 @@ files = [
 [package.extras]
 test = ["pytest (>=6)"]

+[[package]]
+name = "execnet"
+version = "1.9.0"
+description = "execnet: rapid multi-Python deployment"
+category = "main"
+optional = false
+python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
+files = [
+    {file = "execnet-1.9.0-py2.py3-none-any.whl", hash = "sha256:a295f7cc774947aac58dde7fdc85f4aa00c42adf5d8f5468fc630c1acf30a142"},
+    {file = "execnet-1.9.0.tar.gz", hash = "sha256:8f694f3ba9cc92cab508b152dcfe322153975c29bda272e2fd7f3f00f36e47c5"},
+]
+
+[package.extras]
+testing = ["pre-commit"]
+
 [[package]]
 name = "filelock"
 version = "3.11.0"
@@ -1852,6 +1867,27 @@ pytest = ">=5.0"
 [package.extras]
 dev = ["pre-commit", "pytest-asyncio", "tox"]

+[[package]]
+name = "pytest-xdist"
+version = "3.3.1"
+description = "pytest xdist plugin for distributed testing, most importantly across multiple CPUs"
+category = "main"
+optional = false
+python-versions = ">=3.7"
+files = [
+    {file = "pytest-xdist-3.3.1.tar.gz", hash = "sha256:d5ee0520eb1b7bcca50a60a518ab7a7707992812c578198f8b44fdfac78e8c93"},
+    {file = "pytest_xdist-3.3.1-py3-none-any.whl", hash = "sha256:ff9daa7793569e6a68544850fd3927cd257cc03a7ef76c95e86915355e82b5f2"},
+]
+
+[package.dependencies]
+execnet = ">=1.1"
+pytest = ">=6.2.0"
+
+[package.extras]
+psutil = ["psutil (>=3.0)"]
+setproctitle = ["setproctitle"]
+testing = ["filelock"]
+
 [[package]]
 name = "python-dateutil"
 version = "2.8.2"
@@ -2839,4 +2875,4 @@ tests-strict = ["codecov (==2.0.15)", "pytest (==4.6.0)", "pytest (==4.6.0)", "p
 [metadata]
 lock-version = "2.0"
 python-versions = ">=3.10,<3.12"
-content-hash = "67863394f8de94eaddd20964ae383c6dc3416bbdec623e399b5a8a0d163e5178"
+content-hash = "de301503903ea357212400d7cf27feffe5a73b2733c0b4f2c39cabf3de3b9bc7"
@@ -82,6 +82,7 @@ prometheus-flask-exporter = "^0.22.3"
 sqlalchemy = "^2.0.7"
 marshmallow-sqlalchemy = "^0.29.0"
 spiff-element-units = "^0.3.0"
+pytest-xdist = "^3.3.1"

 [tool.poetry.dev-dependencies]
 pytest = "^7.1.2"
@@ -18,11 +18,16 @@ class ConfigurationError(Exception):


 def setup_database_configs(app: Flask) -> None:
+    worker_id = os.environ.get("PYTEST_XDIST_WORKER")
+    parallel_test_suffix = ""
+    if worker_id is not None:
+        parallel_test_suffix = f"_{worker_id}"
+
     if app.config.get("SPIFFWORKFLOW_BACKEND_DATABASE_URI") is None:
         database_name = f"spiffworkflow_backend_{app.config['ENV_IDENTIFIER']}"
         if app.config.get("SPIFFWORKFLOW_BACKEND_DATABASE_TYPE") == "sqlite":
             app.config["SQLALCHEMY_DATABASE_URI"] = (
-                f"sqlite:///{app.instance_path}/db_{app.config['ENV_IDENTIFIER']}.sqlite3"
+                f"sqlite:///{app.instance_path}/db_{app.config['ENV_IDENTIFIER']}{parallel_test_suffix}.sqlite3"
             )
         elif app.config.get("SPIFFWORKFLOW_BACKEND_DATABASE_TYPE") == "postgres":
             app.config["SQLALCHEMY_DATABASE_URI"] = (
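pytest-xdist exposes the current worker's name (gw0, gw1, ...) in the `PYTEST_XDIST_WORKER` environment variable, and the diff above uses it to give each worker its own sqlite file. A self-contained sketch of that naming scheme (the function name and its parameters are illustrative, not from the repository):

```python
import os

def database_uri_for_worker(instance_path: str, env_identifier: str, environ=os.environ) -> str:
    """Append _gwN to the sqlite filename when running under a pytest-xdist worker,
    so parallel test workers never share one database file."""
    worker_id = environ.get("PYTEST_XDIST_WORKER")
    parallel_test_suffix = f"_{worker_id}" if worker_id is not None else ""
    return f"sqlite:///{instance_path}/db_{env_identifier}{parallel_test_suffix}.sqlite3"
```

Outside xdist the variable is unset, so the suffix is empty and the URI is unchanged, which keeps normal (non-parallel) runs on the original filename.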
@@ -15,6 +15,10 @@ SPIFFWORKFLOW_BACKEND_GIT_COMMIT_ON_SAVE = False

 # NOTE: set this here since nox shoves tests and src code to
 # different places and this allows us to know exactly where we are at the start
+worker_id = environ.get("PYTEST_XDIST_WORKER")
+parallel_test_suffix = ""
+if worker_id is not None:
+    parallel_test_suffix = f"_{worker_id}"
 SPIFFWORKFLOW_BACKEND_BPMN_SPEC_ABSOLUTE_DIR = os.path.join(
     os.path.dirname(__file__),
     "..",
@@ -23,5 +27,5 @@ SPIFFWORKFLOW_BACKEND_BPMN_SPEC_ABSOLUTE_DIR = os.path.join(
     "tests",
     "spiffworkflow_backend",
     "files",
-    "bpmn_specs",
+    f"bpmn_specs{parallel_test_suffix}",
 )