docker-airflow
initdb but table schema is unknown
I use Airflow 1.10.9; below is part of the exception:
sqlalchemy.exc.OperationalError: (_mysql_exceptions.OperationalError) (1054, "Unknown column 'dag.root_dag_id' in 'field list'") [SQL: SELECT dag.dag_id AS dag_dag_id, dag.root_dag_id AS dag_root_dag_id, dag.is_paused AS dag_is_paused, dag.is_subdag AS dag_is_subdag, dag.is_active AS dag_is_active, dag.last_scheduler_run AS dag_last_scheduler_run, dag.last_pickled AS dag_last_pickled, dag.last_expired AS dag_last_expired, dag.scheduler_lock AS dag_scheduler_lock, dag.pickle_id AS dag_pickle_id, dag.fileloc AS dag_fileloc, dag.owners AS dag_owners, dag.description AS dag_description, dag.default_view AS dag_default_view, dag.schedule_interval AS dag_schedule_interval FROM dag WHERE dag.is_subdag = 0 AND dag.is_active = 1 ORDER BY dag.dag_id LIMIT %s, %s] [parameters: (0, 100)] (Background on this error at: http://sqlalche.me/e/e3q8)
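The missing dag.root_dag_id column usually means the metadata tables were created by an older Airflow version than the one now running. A quick way to confirm the mismatch (a sketch; the host, user, and database name airflow are assumptions, substitute your own connection settings):

mysql -h localhost -u airflow -p airflow -e "SHOW COLUMNS FROM dag LIKE 'root_dag_id';"
# An empty result means the dag table predates the schema that 1.10.9 expects.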
When does the error occur?
After executing "airflow initdb"
I had to create a user first.
# Runs initdb and creates the user only when list_users fails, i.e. the metadata DB is not initialized yet:
airflow list_users || (airflow initdb \
  && airflow create_user --role Admin --username airflow --password airflow -e [email protected] -f airflow -l airflow)
See: https://github.com/apache/airflow/issues/8605#issuecomment-623182960
Issue solved?
To fix it:
- if you are using "MySQL" as Airflow backend, set explicit_defaults_for_timestamp=1 or "on" in Google Cloud SQL
- "Clear" dag folder before run airflow initdb
- You can create admin user on this step Done
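A combined sketch of those steps, assuming a MySQL backend, AIRFLOW_HOME=/usr/local/airflow, and root access to the database (all assumptions; on Google Cloud SQL, set explicit_defaults_for_timestamp as a database flag in the instance settings instead of SET GLOBAL):

# Airflow's MySQL backend requires explicit_defaults_for_timestamp to be ON.
mysql -u root -p -e "SET GLOBAL explicit_defaults_for_timestamp = 1;"
# Clear the dags folder (path is an assumption; adjust to your install).
rm -rf /usr/local/airflow/dags/*
airflow initdb
# admin@example.com is a placeholder; substitute your own address.
airflow create_user --role Admin --username airflow --password airflow -e admin@example.com -f airflow -l airflow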