data-pipelines-with-apache-airflow
Chapter 2 Rocket Launch DAG
Running the tutorial on a DigitalOcean Droplet, I had to make the following edits to the code to get it to run:
- Update the import statements to:
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator
- Update the curl statement in the bash command passed to the BashOperator:
bash_command = "curl -Lk -o /tmp/launches.json 'https://ll.thespacedevs.com/2.0.0/launch/upcoming'"
I confirmed the same issue when running the Airflow docker image https://hub.docker.com/r/puckel/docker-airflow (Airflow version 1.10.9). The fix above works.
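Putting both edits together, the top of the chapter's DAG file would look roughly like this. This is a sketch for Airflow 1.10.x (on Airflow 2.x the operators live under airflow.operators.bash and airflow.operators.python instead), and it only shows the download task, not the full three-task DAG:

```python
import airflow.utils.dates
from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # 1.10.x import path
from airflow.operators.python_operator import PythonOperator  # 1.10.x import path

dag = DAG(
    dag_id="download_rocket_launches",
    start_date=airflow.utils.dates.days_ago(14),
    schedule_interval=None,
)

# -L follows the redirect the launch-library URL now returns;
# -k skips TLS certificate verification (needed on some older base images)
download_launches = BashOperator(
    task_id="download_launches",
    bash_command=(
        "curl -Lk -o /tmp/launches.json "
        "'https://ll.thespacedevs.com/2.0.0/launch/upcoming'"
    ),
    dag=dag,
)
```

This fragment only defines the DAG; it has to be dropped into an Airflow dags folder and picked up by a scheduler to actually run.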
-
When I run
docker-compose up -d
on macOS 14.2.1, I get the error "no matching manifest for linux/arm64/v8 in the manifest list entries". Any suggestions for a fix?
-
If I run Airflow in a Python 3.8 virtual environment, the first task completes successfully but the DAG doesn't proceed to the second. I can't locate the launches.json file created in the first step. Could it be that the /tmp directory is not persisted for the second task to use? Any ideas on how to fix this?
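On a single machine /tmp normally does persist between tasks, but each task runs as a separate process, and under docker-compose possibly in a separate container with its own /tmp, so the file from the first task may genuinely not be visible. A debugging sketch the second task's callable could run first; check_download is a hypothetical helper, not code from the book:

```python
import os

# Path written by the first (download) task in the book's DAG
DOWNLOAD_PATH = "/tmp/launches.json"


def check_download(path=DOWNLOAD_PATH):
    """Fail fast with a clear message if the upstream task's file is missing.

    If tasks run in separate containers (e.g. docker-compose without a
    shared volume), /tmp is per-container and the file won't be visible
    here even though the first task succeeded.
    """
    if not os.path.exists(path):
        raise FileNotFoundError(
            f"{path} not found - did the download task run on the same "
            "filesystem as this task?"
        )
    return path
```

If this raises under docker-compose, mounting a shared volume (e.g. a dedicated data directory) into all Airflow services fixes it.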
Updating line 13 of the docker-compose.yml file to:
x-airflow-image: &airflow_image apache/airflow:2.8.0-python3.10
got docker-compose up -d working successfully.
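For context, the change pins an image that publishes a linux/arm64 manifest, which is why it also resolves the Apple-silicon "no matching manifest" error above. The surrounding docker-compose.yml would look roughly like this; the service name and other keys are illustrative, not copied from the repo:

```yaml
# apache/airflow images from 2.x onward are published multi-arch,
# including linux/arm64 for Apple-silicon Macs
x-airflow-image: &airflow_image apache/airflow:2.8.0-python3.10

services:
  webserver:
    image: *airflow_image
```

The &airflow_image anchor lets every Airflow service reference the same image tag via *airflow_image, so only line 13 needs to change.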
Updating the bash command as suggested by @NomadAgile got all three tasks of the DAG to run successfully.