* tech docs

* more tech docs

* add process doc

* deploy docs

* doc deployment

---------

Co-authored-by: burnettk <burnettk@users.noreply.github.com>
Kevin Burnett 2024-05-21 19:03:48 +00:00 committed by GitHub
parent da255adbc2
commit fc74924a6d
10 changed files with 271 additions and 5 deletions


@@ -7,9 +7,9 @@ This powerful feature ensures adaptability to various business needs, from custo
 Here are some of the key aspects of using Extensions:
 - Extensions are implemented within the process model repository.
-- Once an extension is created, it can be made accessible via various UI elements which can be specified in its `extension-uischema.json` file.
+- Once an extension is created, it can be made accessible via various UI elements which can be specified in its `extension_uischema.json` file.
 - Access to an extension can be set up via permissions.
-- Configuration for an extension can be found and modified in its `extension-uischema.json` file.
+- Configuration for an extension can be found and modified in its `extension_uischema.json` file.
 ![Extensions](images/Extensions_dashboard.png)
@@ -18,8 +18,7 @@ Here are some of the key aspects of using Extensions:
 ### Environment Variable Activation
 To utilize extensions, an environment variable must be set.
-This variable activates the extensions feature in the SpiffWorkflow backend.
-Here is the environmental variable:
+This variable activates the extensions feature in the SpiffWorkflow backend:
 SPIFFWORKFLOW_BACKEND_EXTENSIONS_API_ENABLED=true
@@ -34,7 +33,7 @@ To create your own custom extension, follow these steps:
 ![Extension Process Group](images/Extension1.png)
-- Create a process model in this group. You can give it whatever name you want. Then create a file inside the process model called `extension-uischema.json`. This will control how the extension will work.
+- Create a process model in this group. You can give it whatever name you want. Then create a file inside the process model called `extension_uischema.json`. This will control how the extension will work.
 ![Extension](images/Extension_UI_schema.png)


@@ -0,0 +1,29 @@
# Deployment
The minimal deployment mimics the docker-compose.yml file at the root of spiff-arena.
Steps for a more hardened production setup after that baseline include:
1. Setting up a MySQL or PostgreSQL database for Backend persistence (instead of SQLite).
2. Setting up a Redis/Valkey or RabbitMQ server as a Celery broker (see the sketch at the end of this page).
3. Separating out the Backend deployment into three deployments: 1) API, 2) Background, and 3) Celery worker.
```mermaid
graph TD;
subgraph Backend
A[API]
B[Background]
Ce[Celery Worker]
end
F[Frontend]
Co[Connector Proxy]
D[Database]
A --> D
B --> D
Ce --> D
F -- Communicates with --> Backend
Backend -- delegates to --> Co
```
The API, Celery Worker, Connector Proxy, and Frontend can each run any number of replicas.
The Background container acts like a cron container, so it should run only one replica.
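For readers unfamiliar with the Celery pieces named above, here is a minimal, generic sketch of how an API process and a Celery worker share a broker. The module name, task name, and broker URL are illustrative assumptions, not the actual spiff-arena configuration.
```python
# Generic Celery wiring sketch (names and broker URL are illustrative only).
from celery import Celery

celery_app = Celery("backend_tasks", broker="redis://localhost:6379/0")


@celery_app.task
def run_background_step(process_instance_id: int) -> None:
    # The Celery worker deployment picks this up from the broker and runs it.
    print(f"processing instance {process_instance_id}")


# The API deployment only enqueues work, e.g.:
#   run_background_step.delay(42)
# while a separate worker deployment consumes it:
#   celery -A backend_tasks worker
```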


@@ -19,6 +19,7 @@ release = "0.1"
 # https://www.sphinx-doc.org/en/master/usage/configuration.html#general-configuration
 extensions = ["myst_parser", "sphinxcontrib.mermaid"]
+myst_fence_as_directive = ["mermaid"]
 templates_path = ["_templates"]
 exclude_patterns = ["_build", "Thumbs.db", ".DS_Store", ".venv"]

docs/dev/backend.md Normal file

@@ -0,0 +1,59 @@
# Backend / API
spiffworkflow-backend is a Python Flask application that functions purely as a REST API.
API documentation, generated from its OpenAPI specification, is available at `/v1.0/ui` ([example hosted docs](https://api.spiffdemo.org/v1.0/ui)).
## Layers
```mermaid
graph LR
Controllers -- delegate work to --> Services
Services -- use --> Models
Models -- access --> DB[(DB)]
```
### Controllers / routes
Requests come in through the controller layer.
An example controller is `src/spiffworkflow_backend/routes/health_controller.py`.
To know which controller a request should go to, the connexion library uses `src/spiffworkflow_backend/api.yml`.
For example, in api.yml, a GET of `/status` is mapped to `spiffworkflow_backend.routes.health_controller.status`, where `status` is a function in health_controller.py.
Controllers can use services (preferred) and models (allowed) to do their work, but they can never use another controller.
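As a rough illustration of that mapping (the function body below is a simplified assumption, not copied from the real health_controller), a connexion-routed controller can be a plain module-level function:
```python
# Hypothetical controller sketch: connexion resolves an operationId such as
# spiffworkflow_backend.routes.health_controller.status to a function like this.
from flask import Response, jsonify


def status() -> Response:
    """Handle GET /status by reporting that the API is up."""
    return jsonify({"ok": True})
```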
### Services
Services are where most of the business logic lives.
Services are called by controllers, and they can use other services, but only in one direction: if serviceA uses serviceB, then serviceB cannot use serviceA.
Services are allowed to use models, but models are not allowed to use services (again, to avoid circular dependencies).
Services cannot use controllers.
Keeping calls flowing in a single direction makes the code easier to understand and avoids circular imports.
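A minimal sketch of the one-way rule, using the hypothetical serviceA/serviceB names from above:
```python
# ServiceA may call ServiceB (one direction only); ServiceB must not import
# or call ServiceA, and neither service calls a controller.
class ServiceB:
    @classmethod
    def count_things(cls, process_instance_id: int) -> int:
        # would typically query models here
        return 0


class ServiceA:
    @classmethod
    def summarize(cls, process_instance_id: int) -> dict:
        return {
            "id": process_instance_id,
            "things": ServiceB.count_things(process_instance_id),
        }
```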
### Models
Models are how the Flask code interacts with the database.
## Database
The backend uses the SQLAlchemy library to connect to a required relational database.
This database can be one of MySQL, PostgreSQL, or SQLite.
All of these database engines are tested in CI.
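As an illustration (the credentials and database names below are placeholders, and the real application assembles its connection string from environment variables), the SQLAlchemy URIs for the three supported engines look like this:
```python
# Placeholder SQLAlchemy connection URIs for the supported engines.
EXAMPLE_DATABASE_URIS = {
    "mysql": "mysql+mysqldb://user:password@localhost/spiffworkflow_backend",
    "postgresql": "postgresql://user:password@localhost:5432/spiffworkflow_backend",
    "sqlite": "sqlite:///instance/db.sqlite3",
}
```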
## Serialization
When serializing models to JSON (see the sketch after this list):
* Avoid `json.dumps` when you are creating JSON; use `jsonify` (a Flask helper) instead.
* Avoid marshmallow when possible and instead use `@dataclass` on your model.
* If you need to represent your object in a very custom way (the default dataclass columns are not working out), write a method called `serialized` on your model; this is used by the default serializer.
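A sketch of that preference, with a hypothetical model name (the real models and routes differ):
```python
# Hypothetical model and route showing the preferred serialization style:
# a @dataclass model, an optional `serialized` method for custom shapes,
# and Flask's jsonify instead of json.dumps.
from dataclasses import dataclass

from flask import jsonify


@dataclass
class WidgetModel:
    id: int
    name: str

    def serialized(self) -> dict:
        # Only needed when the default dataclass columns are not enough.
        return {"id": self.id, "display_name": self.name.title()}


def widget_show(widget_id: int):
    widget = WidgetModel(id=widget_id, name="example")
    return jsonify(widget.serialized())
```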
## Exceptions
Do not define exception classes (e.g., `BlahError`) inside other classes.
All exception classes should be defined either 1) in one file, if there are not too many, or 2) in files that contain only exception class definitions, again to avoid circular imports.
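For example, a standalone exceptions module might look like this (the file and class names are hypothetical):
```python
# exceptions.py: a module containing only exception class definitions, so that
# any other module can import these without risking circular imports.
class WidgetNotFoundError(Exception):
    pass


class WidgetPermissionError(Exception):
    pass
```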
## Deployment
The gunicorn web server is used to serve the application in the default Dockerfile.
Many of the environment variables that can be set are documented in `src/spiffworkflow_backend/config/default.py`.


@@ -0,0 +1,4 @@
# Connector Proxy
A connector-proxy is an application that is generally deployed alongside the frontend and backend.
Please see [Connector Proxy in 5 mins](https://github.com/sartography/spiff-arena/wiki/Connector-Proxy-in-5-mins).

docs/dev/frontend.md Normal file

@@ -0,0 +1,44 @@
# Frontend
spiffworkflow-frontend is a React application that relies on the spiffworkflow-backend REST API.
We try to keep library bloat to a minimum.
## Libraries
### bpmn-js
The bpmn-js library is used to render and edit BPMN diagrams.
This library is maintained by the Camunda team.
### bpmn-js-spiffworkflow
These are SpiffWorkflow extensions to bpmn-js to make for a better experience when executing BPMN diagrams using the SpiffWorkflow execution engine.
### @rjsf/core, @rjsf/utils, etc.
React JSON Schema Form is used to build forms from JSON schemas.
You can attach JSON schemas to tasks that are meant to be completed by people (User Tasks), and the frontend will render a form for that task.
You can specify what data is required as well as how the form should look.
### @tanstack/react-query
We haven't deeply integrated this library, but it is used by the system that caches permission calls.
## Layers
```mermaid
graph LR
Routes -- delegate work to --> Services
```
### Routes
When the browser sees a URL like /hithere, it will render a route component to handle the request.
### Services
The route component may or may not delegate some of its work to a service.
## Deployment
The generated Docker image uses nginx to serve the static HTML/CSS/JS files generated by the Vite build process.
These files can also be hosted on a CDN.

docs/dev/index.md Normal file

@@ -0,0 +1,34 @@
# Technical Overview
## Components
```mermaid
graph TD
subgraph spiff-arena
Backend
Frontend
end
subgraph Backend
subgraph SpiffWorkflow lib
end
end
subgraph Frontend
subgraph bpmn-js-spiffworkflow lib
end
end
Frontend -- uses REST API --> Backend
Backend -- delegates to --> C
Backend -- persists to --> DB
DB[(mysql/postgres)]
C[Connector Proxy]
```
SpiffArena is a system that allows users to build and execute BPMN diagrams.
It is composed of three applications: [spiffworkflow-frontend](frontend), [spiffworkflow-backend](backend), and, optionally, a [connector proxy](connector_proxy).
## Source code layout
From a source code perspective, there are three repositories that may be of interest:
* [spiff-arena](https://github.com/sartography/spiff-arena) - Includes spiffworkflow-frontend, spiffworkflow-backend, and connector-proxy-demo
* [SpiffWorkflow](https://github.com/sartography/SpiffWorkflow) - The core SpiffWorkflow library: 10 years old, Python, awesome, and [well-documented](https://spiffworkflow.readthedocs.io/).
* [bpmn-js-spiffworkflow](https://github.com/sartography/bpmn-js-spiffworkflow) - The frontend library that extends bpmn-js to work with SpiffWorkflow

docs/dev/process.md Normal file

@@ -0,0 +1,49 @@
# Process
## Ticket management
We use GitHub Projects to manage our work.
You can find our board [here](https://github.com/orgs/sartography/projects/3), and issues can be filed [here](https://github.com/sartography/spiff-arena/issues).
## CI
We use GitHub Actions for CI.
The workflows are defined in the `.github/workflows` directory.
The main things that happen, not necessarily in this order, are represented in this chart:
```mermaid
flowchart LR
subgraph "backend_tests"[Backend Tests]
pytest[Pytest]
mypy[MyPy]
safety[Safety]
black[Black]
typeguard[TypeGuard]
end
subgraph "frontend_tests"[Frontend Tests]
vitest[Vitest]
eslint[ESLint]
prettier[Prettier]
cypress[Cypress]
end
backend_tests --> frontend_tests
frontend_tests --> docker_images
```
## Security
We have security checks in place for both the backend and the frontend.
These include a security library in the backend and Snyk in both the frontend and backend.
Two independent security reviews have been performed on the codebase and mitigations have been implemented to the satisfaction of the reviewers.
## Contributing
It would be great to have you contributing to the project.
There is a [Contributing doc](https://github.com/sartography/spiff-arena/blob/main/CONTRIBUTING.rst) that you can follow.
You can find other like-minded people in our [Discord](https://discord.gg/F6Kb7HNK7B).
```mermaid
graph LR
code[Hammer code] --> PR
PR --> Profit
```

docs/dev/setup.md Normal file

@@ -0,0 +1,35 @@
# Developer Setup
There are a few options here:
1. The make-based setup will spin up docker containers and allow you to edit based on latest source.
2. The docker-compose-based setup will spin up docker containers based on the latest release and not allow you to edit.
3. The non-docker setup will allow you to run the python and react apps from your machine directly.
Please pick the one that best fits your needs.
## 1. Use the default make task
You can set up a full development environment for SpiffWorkflow like this:
```sh
git clone https://github.com/sartography/spiff-arena.git
cd spiff-arena
make
```
[This video](https://youtu.be/BvLvGt0fYJU?si=0zZSkzA1ZTotQxDb) shows what you can expect from the `make` setup.
## 2. Run the docker compose setup
```sh
mkdir spiffworkflow
cd spiffworkflow
wget https://raw.githubusercontent.com/sartography/spiff-arena/main/docker-compose.yml
docker-compose pull
docker-compose up
```
There is a [Running SpiffWorkflow Locally with Docker](https://www.spiffworkflow.org/posts/articles/get_started_docker) blog post that accompanies this setup.
## 3. Non-docker setup
Please see the instructions in the [spiff-arena README](https://github.com/sartography/spiff-arena/?tab=readme-ov-file#backend-setup-local).


@@ -6,6 +6,17 @@
 Getting_Started/quick_start.md
 ```
+```{toctree}
+:maxdepth: 1
+:caption: Technical Docs
+dev/index.md
+dev/setup.md
+dev/backend.md
+dev/frontend.md
+dev/connector_proxy.md
+dev/process.md
+```
 ```{toctree}
 :maxdepth: 1
 :caption: Building Diagrams
@@ -42,6 +53,7 @@ Debugging_Diagrams/Private_data.md
 ```{toctree}
 :maxdepth: 1
 :caption: DevOps - Installation & Integration
+DevOps_installation_integration/deployment.md
 DevOps_installation_integration/admin_and_permissions.md
 DevOps_installation_integration/permission_url.md
 DevOps_installation_integration/configure_connector_proxy.md