ethereum-etl-airflow
Add DAG that queries data from BigQuery and saves it to csv
The DAG should run daily: query the previous day's partition, save the result to a temporary table, and extract it to a GCS bucket as gzipped CSVs. The following tables should be exported:
- blocks
- transactions
- traces
- logs
- token_transfers
- contracts
The DAG should have a sensor that waits for a checkpoint file in GCS, as implemented here: https://github.com/blockchain-etl/ethereum-etl-airflow/pull/142/files.
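A minimal sketch of the per-table pieces the DAG would need: the partition-filtered query and the wildcard GCS URI for the gzipped CSV shards. The project, dataset, and bucket names below are placeholders, not values from this repo, and the timestamp-column convention (`timestamp` on `blocks`, `block_timestamp` elsewhere) is an assumption about the BigQuery schema.

```python
from datetime import date

# Hypothetical configuration; the real project, dataset, and bucket
# come from the DAG's environment/config, not from this issue.
PROJECT = "my-project"
DATASET = "crypto_ethereum"
BUCKET = "my-export-bucket"

TABLES = ["blocks", "transactions", "traces", "logs", "token_transfers", "contracts"]


def partition_query(table: str, ds: date) -> str:
    """SQL selecting a single day's partition of one table.

    Assumes blocks is partitioned on `timestamp` and the other tables
    on `block_timestamp`.
    """
    ts_column = "timestamp" if table == "blocks" else "block_timestamp"
    return (
        f"SELECT * FROM `{PROJECT}.{DATASET}.{table}` "
        f"WHERE DATE({ts_column}) = '{ds.isoformat()}'"
    )


def export_uri(table: str, ds: date) -> str:
    """Wildcard GCS URI for the gzipped CSV shards of one table/day.

    BigQuery splits large extracts across multiple files, so the URI
    must contain a `*` wildcard.
    """
    return f"gs://{BUCKET}/export/{table}/{ds.isoformat()}/{table}-*.csv.gz"
```

In the DAG itself, each query could feed a `BigQueryInsertJobOperator` writing to a temporary table, followed by a `BigQueryToGCSOperator` with `export_format="CSV"` and `compression="GZIP"` pointing at the wildcard URI, with a `GCSObjectExistenceSensor` upstream waiting for the checkpoint file; exact operator wiring is a sketch, not the implementation from PR 142.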