Description
This repo contains implementations of Airflow workflows and tasks, called DAGs and Operators respectively.
- DAGs - Directed Acyclic Graphs - Python scripts defining workflows in a way that reflects the relationships between their tasks.
- Operators - Python functions that define the individual tasks executed as part of a DAG run.
To learn how to write DAGs and Operators, read about the core concepts and follow the official tutorial.
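The "reflects their relationships" part means a DAG is essentially a dependency graph that the scheduler resolves into a valid execution order. As a minimal illustration of that idea only (plain Python with the standard library's graphlib, no Airflow API involved, and hypothetical task names):

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: extract must run before transform,
# and transform before load (a classic ELT-style pipeline).
dag = TopologicalSorter()
dag.add("transform", "extract")   # "transform" depends on "extract"
dag.add("load", "transform")      # "load" depends on "transform"

# static_order() yields the tasks in an order that respects all dependencies.
order = list(dag.static_order())
print(order)  # → ['extract', 'transform', 'load']
```

In a real Airflow DAG the same relationships are declared between operators (e.g. with the `>>` operator), and the scheduler performs this ordering for you.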
DAGs
This repository contains:
- website_sync: DAG to launch the Airbyte jobs for the status-website charts.
- spiff_sync: DAG to synchronize Spiff workflows data.
- dbt: DAG to run all the dbt models.
- gh_sync: DAG to synchronize data from repositories (logos, waku, codex).
The dbt models run by some of these DAGs are stored in dbt-models.
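For context on what a sync DAG such as website_sync does: Airbyte connections are typically triggered through Airbyte's HTTP API (`POST /api/v1/connections/sync` with a `connectionId` body). The snippet below is a hedged sketch of that request, not code from this repo; the host and connection ID are made-up placeholders, and the actual DAGs would normally use the Airbyte provider's operator rather than raw HTTP:

```python
import json
from urllib import request

def build_sync_request(host: str, connection_id: str) -> request.Request:
    """Build a POST request asking Airbyte to run a connection sync.

    Targets Airbyte's /api/v1/connections/sync endpoint, which expects
    a JSON body of the form {"connectionId": "..."}.
    """
    return request.Request(
        url=f"{host}/api/v1/connections/sync",
        data=json.dumps({"connectionId": connection_id}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Placeholder values -- a real DAG would read these from an Airflow connection.
req = build_sync_request("http://airbyte.example:8000", "00000000-0000-0000-0000-000000000000")
# request.urlopen(req) would actually launch the sync; omitted here.
```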
Continuous Integration
Changes pushed to master are automatically fetched by our Airflow instance via the airflow-webhook service.
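The airflow-webhook service itself lives outside this repo, but the pattern is straightforward: a small HTTP endpoint receives the push event and pulls the code only when the event targets the deployed branch. A minimal, hypothetical sketch of that filtering step (the payload shape follows GitHub's push-event format; none of this is the actual service code):

```python
def should_deploy(event: dict, deploy_ref: str = "refs/heads/master") -> bool:
    """Return True when a push event targets the branch we auto-deploy."""
    return event.get("ref") == deploy_ref

# Example push-event payloads, trimmed to the single field we inspect.
print(should_deploy({"ref": "refs/heads/master"}))   # → True
print(should_deploy({"ref": "refs/heads/example"}))  # → False
```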
Branches
This repo has 3 working branches:
- prod: used by https://airflow.bi.status.im
- test: used by https://airflow.test.bi.status.im to test DAG modifications.
- example: contains examples of DAGs.
Infrastructure
All Airflow infrastructure is managed in the infra-bi repository.