
docker-compose-CeleryExecutor executes SequentialExecutor

tejas20shah opened this issue on Jul 23, 2020 · 0 comments

Hello, I have a strange issue.

I have built my own docker using the same image and same entrypoint.sh file.

When I run docker-compose, it doesn't show any errors and all the containers start successfully.

However, I notice that the webserver is still using SequentialExecutor. Also, it picks up the DAGs from /root/airflow/dags. Following is the log of the webserver process:

```
[2020-07-22 17:41:39,834] {dagbag.py:403} INFO - Filling up the DagBag from /root/airflow/dags
[2020-07-22 17:41:41 +0000] [20] [INFO] Handling signal: ttou
[2020-07-22 17:41:41 +0000] [294] [INFO] Worker exiting (pid: 294)
[2020-07-22 17:42:11 +0000] [20] [INFO] Handling signal: ttin
[2020-07-22 17:42:11 +0000] [335] [INFO] Booting worker with pid: 335
[2020-07-22 17:42:11,929] {__init__.py:51} INFO - Using executor SequentialExecutor
```

When I launch `airflow list_dags`, it shows all the example DAGs even though I have disabled the examples. It doesn't even show my DAG, which seems to be copied successfully inside /usr/local/airflow.

I am puzzled why, despite the successful launch of docker-compose-CeleryExecutor.yml, it is still using SequentialExecutor and why none of the settings related to CeleryExecutor are being taken into account.
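For context, my understanding is that the CeleryExecutor compose file has to hand the executor choice to the webserver container through environment variables, roughly as in the sketch below. This is only a sketch, not my exact file: `my-airflow-image` is a placeholder for the image I built, and `EXECUTOR=Celery` is what I believe the stock entrypoint.sh expands into `AIRFLOW__CORE__EXECUTOR`, which itself is the standard Airflow environment override.

```yaml
# Sketch of the relevant part of the webserver service, not my exact file.
# "my-airflow-image" is a placeholder for the image I built; EXECUTOR=Celery
# is what I expect the stock entrypoint.sh to map to CeleryExecutor.
webserver:
  image: my-airflow-image
  restart: always
  depends_on:
    - postgres
    - redis
  environment:
    - LOAD_EX=n
    - EXECUTOR=Celery
    # Setting the Airflow variable directly should have the same effect:
    # - AIRFLOW__CORE__EXECUTOR=CeleryExecutor
  ports:
    - "8080:8080"
  command: webserver
```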

I am using the following versions:

- docker-compose version: 1.26.2, build eefe0d31
- docker version: 19.03.12, build 48a66213fe
- docker-compose-CeleryExecutor.yml file format version: 2.1
- airflow version: 1.10.9

I have also mapped a volume in docker-compose-CeleryExecutor.yml to use a different version of entrypoint.sh than the one present in the image, roughly as in the sketch below. It is not very different from the standard one; I have just added a few echo statements to check that everything is getting executed. No surprises here: everything gets executed as per the script.
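The mapping itself looks roughly like this. The host paths are examples from my layout rather than from the stock compose file, and /entrypoint.sh is assumed to be the path the image's ENTRYPOINT points at:

```yaml
# Fragment of the webserver service showing the entrypoint override.
# Host paths are examples from my layout; /entrypoint.sh is assumed to be
# the path the image's ENTRYPOINT uses.
webserver:
  volumes:
    - ./dags:/usr/local/airflow/dags
    - ./script/entrypoint.sh:/entrypoint.sh
```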

I tried enabling more logs by setting the following variable:

AIRFLOW__CORE__LOGGING_LEVEL=DEBUG

LOAD_EX is also set to n.
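Both are set in the environment section of the services, roughly like this fragment, with the values exactly as I have them. LOAD_EX is the flag I understand the stock entrypoint.sh uses to decide whether the example DAGs are loaded; AIRFLOW__CORE__LOGGING_LEVEL is the standard Airflow environment override.

```yaml
# Fragment of the environment section as I have it; LOAD_EX is the flag I
# understand entrypoint.sh uses to skip loading the example DAGs, and
# AIRFLOW__CORE__LOGGING_LEVEL is the standard Airflow env override.
environment:
  - LOAD_EX=n
  - AIRFLOW__CORE__LOGGING_LEVEL=DEBUG
```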

I notice that none of the config variables I set in entrypoint.sh are taken into consideration. It always runs the default Airflow configuration, which is SequentialExecutor with all the example DAGs loaded. I am not sure what is happening.

Please help.
