* added basic model for new reference cache w/ burnettk
* switched out SpecReferenceCache for ReferenceCacheModel w/ burnettk jbirddog
* pyl w/ burnettk jbirddog
* save items to the db using the new cache with generation table w/ burnettk (see the sketch after this list)
* bulk save for performance
* tests are passing
* actually use the new generation table - we still need a test to ensure we are using it
* added test to ensure using new cache generation
* corrected reference interface on frontend w/ burnettk
* do not perform git pull in webhook if the revision is the same as the current one w/ burnettk jbirddog
---------
Co-authored-by: jasquat <jasquat@users.noreply.github.com>
Co-authored-by: burnettk <burnettk@users.noreply.github.com>
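
A minimal sketch of the generation-table idea referenced above, assuming SQLAlchemy; the model and column names here are illustrative placeholders, not the actual schema. Each rebuild inserts a fresh generation row and bulk-saves all references against it, so readers simply filter on the newest generation and a rebuild never mutates rows that are being read:

```python
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class CacheGenerationModel(Base):
    __tablename__ = "cache_generation"
    id = Column(Integer, primary_key=True)

class ReferenceCacheModel(Base):
    __tablename__ = "reference_cache"
    id = Column(Integer, primary_key=True)
    identifier = Column(String(255), index=True)
    generation_id = Column(Integer, ForeignKey("cache_generation.id"), index=True)

def rebuild_cache(session: Session, references: list[dict]) -> None:
    """Write a new generation, then bulk-insert every reference against it."""
    generation = CacheGenerationModel()
    session.add(generation)
    session.flush()  # assigns generation.id before the bulk insert
    session.bulk_save_objects(
        [ReferenceCacheModel(generation_id=generation.id, **ref) for ref in references]
    )
    session.commit()
```

The single bulk insert is what the "bulk save for performance" item above is getting at: one round trip instead of one INSERT per reference.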
* some initial work to support user api keys w/ burnettk
* some updates to store and use service accounts - migrations do not work in sqlite atm
* pyl
* minor tweak to the migration
* refactored user route
* this is working when returning the user that created the service account
* put back migrations from main w/ burnettk
* tests pass with new migration w/ burnettk
* do not remove service account permissions on refresh_permissions w/ burnettk
* added new component to make some api calls to populate child components and routes w/ burnettk
* allow displaying extensions in configuration tab w/ burnettk
* removed service accounts controller in favor of an extension and encrypted the api keys (see the sketch after this list)
* add fuzz to username to make deleting and recreating service accounts easier
* allow specifying the process id to use when running an extension w/ burnettk
* allow extensions to navigate to each other on form submit w/ burnettk
* removed commented out debug code
---------
Co-authored-by: jasquat <jasquat@users.noreply.github.com>
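
A hedged sketch of the two ideas above: encrypting stored API keys and adding "fuzz" to the service-account username. Fernet, SPIFF_SECRET as a Fernet key, and the helper name are assumptions for illustration; the backend's actual cipher, config wiring, and naming scheme may differ:

```python
import secrets
from cryptography.fernet import Fernet

SPIFF_SECRET = Fernet.generate_key()  # in practice this would come from config

def create_service_account(base_name: str) -> tuple[str, str, bytes]:
    # The random suffix ("fuzz") lets a deleted service account be recreated
    # under the same base name without colliding with the old row.
    username = f"service_account_{base_name}_{secrets.token_hex(4)}"
    api_key = secrets.token_urlsafe(32)
    encrypted_api_key = Fernet(SPIFF_SECRET).encrypt(api_key.encode())
    # Persist username + encrypted_api_key; hand the plaintext key to the
    # caller once and never store it.
    return username, api_key, encrypted_api_key
```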
* added ability to display navigation items in user profile toggle
* updated naming of some extension elements
* added user property table and updates for extensions to use it w/ burnettk
* moved extension ui interfaces to their own file and fixed linting issues
* some updates to render markdown results on load w/ burnettk
* added migration merge file w/ burnettk
* moved code to fix linting issues w/ burnettk
* resolved db migration conflict
* removed unnecessary migrations and added just one w/ burnettk
---------
Co-authored-by: jasquat <jasquat@users.noreply.github.com>
* Adding dependencies
* Disconnect from /v1/auths for the auth list; hardcoded for now.
* Revert changes
* WIP
* Getting hardcoded v2 auths into the frontend
* Better url for v2 oauth
* Pass the auth token from the frontend; don't verify the token to start the auth process
* Manually verify the token from the querystring
* WIP
* WIP
* WIP, refactor SPIFF_SECRET handling, move dependencies
* Construct remote_app (see the sketch at the end of this list)
* WIP
* WIP
* WIP
* WIP
* Ugly but getting the grant screen
* WIP
* WIP
* Github oauth ok
* Verify token, save access token
* Let secret name work with regex
* Getting bin_pyl to pass
* New component
* Load up the current config in an editor
* Getting bin_pyl to pass
* End point to update auth config
* Linting
* Adding configuration model
* Adding configuration model
* Prep to read config from db
* Read config from the db
* Save/reload poor man's styling
* Getting bin_pyl to pass
* Getting bin_pyl to pass
* Getting bin_pyl to pass
* Better handling of invalid json
* Getting bin_pyl to pass
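
For context on the "Construct remote_app" and grant-screen steps above, here is one plausible shape for the GitHub flow, sketched with Authlib's Flask client. The library choice, client registration URLs, scope, and credential placeholders are assumptions for illustration, not details confirmed by these notes:

```python
from authlib.integrations.flask_client import OAuth
from flask import Flask, url_for

app = Flask(__name__)
app.secret_key = "dev-only-secret"  # placeholder
oauth = OAuth(app)

github = oauth.register(
    name="github",
    client_id="GITHUB_CLIENT_ID",          # placeholder
    client_secret="GITHUB_CLIENT_SECRET",  # placeholder
    access_token_url="https://github.com/login/oauth/access_token",
    authorize_url="https://github.com/login/oauth/authorize",
    api_base_url="https://api.github.com/",
    client_kwargs={"scope": "user:email"},
)

@app.route("/login")
def login():
    # Send the browser to GitHub's grant screen.
    return github.authorize_redirect(url_for("authorize", _external=True))

@app.route("/authorize")
def authorize():
    # Exchange the code for an access token; a real app would persist it.
    token = github.authorize_access_token()
    return {"token_type": token.get("token_type", "bearer")}
```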
* added a new model to store task draft data in a join table
* cleaned up usage of the join table for draft data w/ burnettk
* created new single migration for changes w/ burnettk
* added a hidden form which autosaves without validations w/ burnettk (see the sketch after this list)
* change close button name since it does indeed save on close now
---------
Co-authored-by: jasquat <jasquat@users.noreply.github.com>
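
A rough sketch of what a draft-data join table like the one described above could look like, assuming SQLAlchemy; the table and column names are illustrative, not the backend's actual schema. The unique constraint means each autosave from the hidden form overwrites the single draft row for that task rather than piling up history:

```python
from sqlalchemy import Column, Integer, String, Text, UniqueConstraint
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class TaskDraftDataModel(Base):
    """One draft row per (process instance, task); autosaves overwrite it."""
    __tablename__ = "task_draft_data"
    __table_args__ = (
        UniqueConstraint("process_instance_id", "task_definition_id_path"),
    )
    id = Column(Integer, primary_key=True)
    # A foreign key to the process instance table in the real schema.
    process_instance_id = Column(Integer, nullable=False, index=True)
    task_definition_id_path = Column(String(255), nullable=False)
    saved_form_data = Column(Text)  # JSON-encoded form contents, unvalidated
```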
* Wiring up the datastore
* Writes into the data store
* Bulk save needs the timestamps
* Prep to do the local query
* Local typeahead working (see the sketch after this list)
* Pre pr cleanup
* ignore migrations dir in pre-commit for ruff w/ burnettk
* Getting ./bin/pyl to pass
---------
Co-authored-by: jasquat <jasquat@users.noreply.github.com>
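
A hedged sketch of the local typeahead query described above: prefix-match against rows previously written into the data store instead of calling out to a remote service. Model and column names are illustrative:

```python
from sqlalchemy import Column, Integer, String, Text, select
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class TypeaheadModel(Base):
    __tablename__ = "typeahead"
    id = Column(Integer, primary_key=True)
    category = Column(String(255), index=True)     # e.g. a dataset name
    search_term = Column(String(255), index=True)  # the text to match on
    result = Column(Text)                          # JSON payload for the caller

def typeahead_query(session: Session, category: str, prefix: str, limit: int = 10) -> list[str]:
    statement = (
        select(TypeaheadModel.result)
        .where(TypeaheadModel.category == category)
        .where(TypeaheadModel.search_term.startswith(prefix))  # LIKE 'prefix%'
        .limit(limit)
    )
    return list(session.scalars(statement))
```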
SpiffWorkflow -
- start_messages function should return message names, not ids.
- don't catch external thrown messages within the same workflow process
- add an expected value to the Correlation Property Model so we can use this well-defined class as an external communication tool (rather than building an arbitrary dictionary)
- Added a "get_awaiting_correlations" method to an event, so we can get a list of the correlation properties related to the workflow's currently defined correlation values.
- the workflows.waiting_events() function now returns the above awaiting correlations as the value on returned message events (consumed in the sketch below)
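
An illustrative consumer of the reworked API, to make the flow concrete. The attribute names here (event.name, event.value, prop.name, prop.expected_value) are paraphrased from the notes above rather than a verbatim SpiffWorkflow interface:

```python
def matching_receives(workflow, message_name, incoming_correlations):
    """Return waiting message events whose awaiting correlation properties
    all match the values carried by an incoming message."""
    matches = []
    for event in workflow.waiting_events():
        if event.name != message_name:
            continue
        # Per the notes, waiting_events() now puts the awaiting
        # correlations on the event's value.
        awaiting = event.value
        if all(
            incoming_correlations.get(prop.name) == prop.expected_value
            for prop in awaiting
        ):
            matches.append(event)
    return matches
```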
Backend
- Dropping MessageModel and MessageCorrelationProperties - at least for now. We don't need them to send / receive messages, though we may eventually want to track the messages and correlations defined across the system. Those definitions (which are ever changing) should not be directly connected to the messages that may be in flux, and the cross-relationships between the tables could cause unexpected and unnecessary errors. Commented out the caching logic so we can turn this back on later.
- Slight improvement to API Errors
- MessageInstances are no longer in a many-to-many relationship with Correlations - each message instance has a unique set of message correlations specific to the instance (see the sketch at the end of these notes).
- Message Instances have users, and can be linked through a "counterpart_id" so you can see which send is connected to which receive.
- Message Correlations are connected to receiving message instances, not to a process instance and not to a message model. They now include the expected value and retrieval expression required to validate an incoming message.
- A process instance is not connected to message correlations.
- Message Instances are not always tied to a process instance (for example, a Send Message from an API)
- API calls to create a message use the same logic as all other message catching code.
- Make use of the new waiting_events() method to check for any new receive messages in the workflow (much easier than churning through all of the tasks).
- One giant mother of a migration.
1. It's not just processes; it contains the list of all DMN Decisions as well.
2. It is closely linked to the SpecReference object that can be generated by looking through all the Spec files to find the processes and decisions they contain.
3. It is a cache of information; the file system is the source of truth. It seems likely we will cache more things in the future, so setting things up this way made sense.
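
A stripped-down sketch of the reshaped message tables described above: each message instance owns its correlation rules outright (no many-to-many), carries a user, is only optionally tied to a process instance, and can point at its counterpart. Assuming SQLAlchemy; column names are illustrative:

```python
from sqlalchemy import Column, ForeignKey, Integer, String
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class MessageInstanceModel(Base):
    __tablename__ = "message_instance"
    id = Column(Integer, primary_key=True)
    name = Column(String(255), index=True)
    message_type = Column(String(20))  # "send" or "receive"
    process_instance_id = Column(Integer, nullable=True)  # nullable: API sends
    user_id = Column(Integer, nullable=False)  # who sent or awaits the message
    counterpart_id = Column(Integer, nullable=True)  # links a send to its receive
    correlations = relationship("MessageInstanceCorrelationModel")

class MessageInstanceCorrelationModel(Base):
    __tablename__ = "message_instance_correlation"
    id = Column(Integer, primary_key=True)
    message_instance_id = Column(Integer, ForeignKey("message_instance.id"), nullable=False)
    name = Column(String(50), nullable=False)
    retrieval_expression = Column(String(255))  # pulls the value from a payload
    expected_value = Column(String(255))        # what the receive is waiting for
```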