CR Connect Workflow Microservice

Development Setup

Tools

These instructions assume you're using the following development tools:

  • IDE: PyCharm Professional Edition
  • Operating System: Ubuntu

Environment Setup

Make sure all of the following are properly installed on your system:

  1. python3 & pip3
  2. pipenv
  3. Install dependencies (see the example commands after this list):
    • pipenv install
    • pipenv install --dev (for development dependencies)
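
On Ubuntu, that setup often looks something like the following (a rough sketch only; package names and the pipenv install location can vary between systems):

sudo apt install python3 python3-pip    # python3 & pip3
pip3 install --user pipenv              # installs pipenv into ~/.local/bin
cd cr-connect-workflow                  # the cloned repository
pipenv install --dev                    # runtime + development dependencies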

Running Postgres
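
One convenient way to get a local Postgres instance for development is to run it in Docker. The sketch below uses the official postgres image with placeholder credentials and database name; the project may also provide its own database image or compose setup, so adjust to match your configuration:

docker run --name crc-postgres \
  -e POSTGRES_USER=crc_user \
  -e POSTGRES_PASSWORD=crc_pass \
  -e POSTGRES_DB=crc_dev \
  -p 5432:5432 -d postgres:11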

Project Initialization

  1. Clone this repository (see the example git command after this list).

  2. In PyCharm:

    • Go to File > New Project...
    • Click Pure Python (NOT Flask!!)
    • Click the folder icon in the Location field.
    • Select the directory where you cloned this repository and click Ok.
    • Expand the Project Interpreter section.
    • Select the New environment using radio button and choose Pipenv in the dropdown.
    • Under Base interpreter, select Python 3.7
    • In the Pipenv executable field, enter /home/your_username_goes_here/.local/bin/pipenv
    • Click Create Project Interpreter
  3. PyCharm should automatically install the necessary packages via pipenv. In some cases the project interpreter does not get set up correctly on the first attempt. If that happens, go to File -> Settings -> Project Interpreter and point the project at the correct Pipenv environment again, or click the gear icon to re-add the interpreter. Be sure your settings look similar to the Project Interpreter Settings screenshot.

  4. With the interpreter properly set up, you can now right-click on run.py and create a new run configuration that looks like the Run Configuration screenshot (be sure to save this run configuration so it doesn't go away).
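
For step 1, cloning the repository from the command line would look something like this (assuming the canonical GitHub location of the project):

git clone https://github.com/sartography/cr-connect-workflow.git
cd cr-connect-workflow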

Running the Web API

Just click the "Play" button next to RUN in the top right corner of the screen. The Swagger-based view of the API will be available at http://0.0.0.0:5000/v1.0/ui/
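
If you prefer the command line, you can also start the app directly through pipenv (assuming run.py as the entry point, as in the run configuration above):

pipenv run python run.py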

Running Tests

We use pytest to execute tests. You can run this from the command line with:

pipenv run coverage run -m pytest
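
Afterwards, the collected coverage can be viewed with the standard coverage.py commands (the HTML output lands in coverage.py's default htmlcov directory):

pipenv run coverage report -m    # summary in the terminal
pipenv run coverage html         # browsable report in htmlcov/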

To run the tests within PyCharm, set up a run configuration using pytest: go to Run > Edit Configurations, click the plus icon, select Python Tests > pytest. The defaults should work fine with no additional edits.

Documentation

Additional Documentation is available on ReadTheDocs

Additional Reading

  1. BPMN is the notation we are using to create diagrams of our business processes. It's a beautiful, concise diagramming tool. We strongly recommend you read this complete tutorial, as this notation is the foundation on which this project, as well as many other business software systems, is built. Know it well.

Notes on Creating Good BPMN Diagrams in Camunda

  1. Be sure to give each task a thoughtful (and unique!) id. This will make the command line and debugging far easier. I've tended to prefix these with task_, so a task that asks a riddle would be task_ask_riddle, for instance.