# Redis Celery Broker
SpiffWorkflow can be configured to use Celery for background processing. Redis can serve as both the broker and the result backend for Celery.
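As a rough sketch of what such a configuration looks like (the app name and Redis URL here are illustrative assumptions, not the project's actual wiring), a Celery app can point at Redis for both roles:

```python
from celery import Celery

# NOTE: the app name and URLs are illustrative assumptions,
# not spiff-arena's actual configuration.
app = Celery(
    "spiff_sketch",
    broker="redis://localhost:6379/0",   # Redis as the message broker
    backend="redis://localhost:6379/0",  # Redis as the result backend
)
```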
If configured in this way, there will be a queue called "celery", and you can inspect it from `redis-cli` like this:

```sh
redis-cli LLEN celery         # how many queued entries
redis-cli LRANGE celery 0 -1  # get all queued entries; be careful if the queue is large
```
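Note that in `LRANGE celery 0 -1` the end index `-1` is inclusive, so it returns the whole list; this differs from Python slicing. A quick pure-Python illustration of the difference:

```python
# A Python list standing in for the Redis "celery" list.
queue = ["task-a", "task-b", "task-c"]

# LLEN celery
print(len(queue))    # 3

# LRANGE celery 0 -1 returns the WHOLE list: the -1 end index is inclusive.
print(queue[0:])     # ['task-a', 'task-b', 'task-c']

# A naive Python translation using -1 would drop the last element:
print(queue[0:-1])   # ['task-a', 'task-b']
```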
If you want to purge all entries from the queue:

```sh
poetry run celery -A src.spiffworkflow_backend.background_processing.celery_worker purge
```
If you want to inspect jobs that are currently being processed by workers:

```sh
poetry run celery -A src.spiffworkflow_backend.background_processing.celery_worker inspect active
```
When we publish a message to the queue, we log a line like this at the INFO log level:

```
Queueing process instance (3) for celery (9622ff55-9f23-4a94-b4a0-4e0a615a8d14)
```
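If you want to feed that task id into the `redis-cli` commands below programmatically, one way (a sketch based on the log line above) is to pull the trailing UUID out of the line with a regex:

```python
import re

log_line = "Queueing process instance (3) for celery (9622ff55-9f23-4a94-b4a0-4e0a615a8d14)"

# The celery task id is the 36-character UUID in the final parentheses.
match = re.search(r"\(([0-9a-f-]{36})\)\s*$", log_line)
task_id = match.group(1)
print(task_id)                         # 9622ff55-9f23-4a94-b4a0-4e0a615a8d14
print(f"celery-task-meta-{task_id}")   # the Redis key holding the result
```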
If you want to get the result of this job after the worker processes it, you would run a query like this:

```sh
redis-cli get celery-task-meta-9622ff55-9f23-4a94-b4a0-4e0a615a8d14
```
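Celery's Redis result backend stores task metadata as JSON, so the value you get back can be decoded directly. A sketch with an illustrative payload (the exact fields depend on your Celery version and settings):

```python
import json

# Illustrative payload; the real value comes from
# `redis-cli get celery-task-meta-<task-id>`.
raw = '{"status": "SUCCESS", "result": null, "task_id": "9622ff55-9f23-4a94-b4a0-4e0a615a8d14"}'

meta = json.loads(raw)
print(meta["status"])    # SUCCESS
print(meta["task_id"])   # 9622ff55-9f23-4a94-b4a0-4e0a615a8d14
```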
If you wanted to get all of the results at once, you could pipe the matching keys back into a second `redis-cli`:

```sh
echo 'keys celery-task-meta-*' | redis-cli | sed 's/^/get /' | redis-cli
```
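The pipeline above amounts to globbing for the `celery-task-meta-*` keys and then fetching each one. A pure-Python simulation of that matching logic (using a dict in place of Redis, purely for illustration):

```python
from fnmatch import fnmatch

# A dict standing in for Redis key/value storage; contents are illustrative.
store = {
    "celery-task-meta-9622ff55-9f23-4a94-b4a0-4e0a615a8d14": '{"status": "SUCCESS"}',
    "celery-task-meta-0b1c2d3e-1111-2222-3333-444455556666": '{"status": "FAILURE"}',
    "unrelated-key": "ignore me",
}

# KEYS celery-task-meta-*  ... then GET each match.
results = {k: v for k, v in store.items() if fnmatch(k, "celery-task-meta-*")}
print(len(results))  # 2
```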