Status BI Python DAGs for Airflow

Description

This repo contains implementations of Airflow workflows and tasks, called DAGs and Operators respectively.

  • DAGs - Directed Acyclic Graphs - Python scripts defining workflows as tasks and the dependencies between them.
  • Operators - Python classes defining the individual tasks that are executed as part of a DAG run.

To learn how to write DAGs and Operators, read the Airflow core concepts documentation and follow the official tutorial.

DAGs

This repository contains:

  • website_sync: DAG launching the Airbyte jobs for the status-website charts.
  • spiff_sync: DAG synchronizing Spiff workflow data.
  • dbt: DAG running all the dbt models.
  • gh_sync: DAG synchronizing data from repositories (logos, waku, codex).

The dbt models run by some of these DAGs are stored in the dbt-models repository.

Continuous Integration

Changes pushed to master are automatically pulled into our Airflow instance by the airflow-webhook service.

Branches

This repo has 3 working branches:

Infrastructure

All Airflow infrastructure is managed in the infra-bi repository.