
Flower

Open edbizarro opened this issue 5 years ago • 6 comments

edbizarro avatar Mar 27 '19 02:03 edbizarro

FYI - in the latest run, Flower didn't start. We probably need to pass -D to the webserver to daemonize it, so the second part of the command runs.
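Something along these lines, as a rough sketch (the exact commands in the module's entrypoint script may differ):

# Hypothetical entrypoint fragment: -D daemonizes the webserver and returns,
# so the next command (Flower, in this case) actually gets to run.
airflow webserver -D
airflow flower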

domdip avatar Apr 12 '19 23:04 domdip

Thanks! Yeah, you're right. I'll change the script.

edbizarro avatar Apr 12 '19 23:04 edbizarro

Found out that Flower does not support SQS.

So I'm moving to Redis instead.
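As a rough sketch, pointing the Celery broker at Redis could look something like this via Airflow's environment-variable overrides (the hostnames and credentials below are placeholders, not the module's actual values):

# Hypothetical broker settings; key names follow Airflow's AIRFLOW__SECTION__KEY convention.
export AIRFLOW__CELERY__BROKER_URL="redis://my-redis-host:6379/0"
export AIRFLOW__CELERY__RESULT_BACKEND="db+postgresql://airflow:airflow@my-db-host:5432/airflow"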

edbizarro avatar Apr 13 '19 00:04 edbizarro

FYI, what actually needs to happen is something like this (used in my case, where I'm putting the scheduler and webserver on one host).

#!/usr/bin/env bash
# Pick which Airflow process this host runs based on the AIRFLOW_ROLE env var.
if [ "$AIRFLOW_ROLE" == "SCHEDULER" ]; then
    exec airflow scheduler -n 10
elif [ "$AIRFLOW_ROLE" == "WEBSERVER" ]; then
    exec airflow webserver -D
    exec airflow scheduler -D -n 10
elif [ "$AIRFLOW_ROLE" == "WORKER" ]; then
    exec airflow worker
else
    echo "AIRFLOW_ROLE value unknown" && exit 1
fi

exec replaces the current process with the given command, so you can't chain anything after it with bash helpers like &&; it has to be called once per command.
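If you actually want the webserver branch to start both processes, one hedged variant (a sketch, not the module's script) is to let the daemonized webserver return and reserve exec for the command that should own the process:

# Hypothetical webserver-branch variant: -D makes the webserver daemonize and
# return, so the script continues; the scheduler then runs in the foreground
# as the process the service manager watches.
airflow webserver -D
exec airflow scheduler -n 10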

domdip avatar Apr 13 '19 02:04 domdip

Yes, I did that, but Flower throws errors, which led me to that issue saying Flower doesn't support SQS.

edbizarro avatar Apr 13 '19 02:04 edbizarro

Actually, be careful using that approach. This may not be an issue with Flower, but the scheduler quits frequently and is managed by systemctl. Putting two things under the same script means systemctl will wait for all of them to exit before restarting either.

I had to rewrite this a lot for my use case (scheduler + webserver on the same host), ultimately setting up a separate systemd unit for each of them, with slightly different service environments.
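For illustration, a rough sketch of what one of those per-component units could look like, written as a provisioning snippet (the user, paths, and AIRFLOW_HOME value are placeholders; the webserver unit would be analogous with its own ExecStart):

# Hypothetical provisioning snippet: one unit per component so systemd can
# restart the scheduler on its own, without waiting for the webserver.
sudo tee /etc/systemd/system/airflow-scheduler.service > /dev/null <<'EOF'
[Unit]
Description=Airflow scheduler
After=network.target

[Service]
User=ec2-user
Environment=AIRFLOW_HOME=/home/ec2-user/airflow
ExecStart=/usr/local/bin/airflow scheduler -n 10
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target
EOF

sudo systemctl daemon-reload
sudo systemctl enable --now airflow-scheduler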

domdip avatar Apr 15 '19 21:04 domdip