
Python is not logging in the task logs

Open · AxelFurlanF opened this issue on Mar 11 '20 · 3 comments

Context

I'm using Airflow with the Celery executor. I'm trying to move from a BashOperator that runs python script.py to the actual PythonOperator, but I'm running into a logging issue.

I have the following DAG to test this:

import logging
import os
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from airflow.operators.latest_only_operator import LatestOnlyOperator

logger = logging.getLogger('dag')
logger.setLevel(logging.INFO)

default_args = {
    'owner': 'axelfurlan',
    'depends_on_past': False,
    'start_date': datetime(2019, 9, 23),
    'email_on_failure': True,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5)
}

dag = DAG(
    'test', catchup=False, default_args=default_args, schedule_interval='@daily')

scriptdir = os.environ.get('HOME', '/usr/local/airflow')

latest_only = LatestOnlyOperator(task_id='latest_only', dag=dag)

test_log = PythonOperator(
    task_id='test_log',
    python_callable=lambda: logger.info("this works"),
    dag=dag
)

latest_only >> test_log
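
For comparison, here is a variant that logs through Airflow's own airflow.task logger instead of my custom one (the task id is just for illustration, and it reuses the dag object and imports from the DAG above; as far as I understand, the default 1.10.x logging config attaches the per-task file handler to that logger):

# Illustrative variant, not part of the original DAG: send the record to the
# "airflow.task" logger, whose handler (in the stock 1.10.x logging config)
# writes the per-task log file. A custom logger like 'dag' only propagates to
# the root/console handler of the worker process.
task_logger = logging.getLogger('airflow.task')

test_log_via_task_logger = PythonOperator(
    task_id='test_log_via_task_logger',
    python_callable=lambda: task_logger.info("this should appear in the task log"),
    dag=dag
)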

Missing INFO log

The logs for the test_log task look like this:

[2020-03-11 18:43:08,045] {standard_task_runner.py:52} INFO - Started process 234 to run task
[2020-03-11 18:43:08,166] {python_operator.py:105} INFO - Exporting the following env vars:
AIRFLOW_CTX_DAG_EMAIL=
AIRFLOW_CTX_DAG_OWNER=axelfurlan
AIRFLOW_CTX_DAG_ID=test
AIRFLOW_CTX_TASK_ID=test
AIRFLOW_CTX_EXECUTION_DATE=2020-03-11T18:42:51.521606+00:00
AIRFLOW_CTX_DAG_RUN_ID=manual__2020-03-11T18:42:51.521606+00:00
[2020-03-11 18:43:08,166] {python_operator.py:114} INFO - Done. Returned value was: None

But in the worker container's docker logs I can see my statement:

Running %s on host %s <TaskInstance: test.test_log 2020-03-11T15:23:42.080673+00:00 [queued]> 168afb73b515
INFO this works
INFO Task exited with return code 0
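
This split looks like plain Python logging propagation: a record sent to my 'dag' logger bubbles up to the root logger's console handler (the worker's stdout) and never passes through the handler that, as far as I can tell, Airflow attaches to airflow.task for the per-task log file. A minimal, Airflow-free sketch of that behaviour:

import logging
import sys

# Standalone sketch, no Airflow involved: a record only reaches the handlers of
# the loggers it propagates through.
root = logging.getLogger()
root.addHandler(logging.StreamHandler(sys.stdout))       # stand-in for the worker console
task = logging.getLogger('airflow.task')
task.setLevel(logging.INFO)
task.addHandler(logging.FileHandler('/tmp/task.log'))    # stand-in for the per-task file handler
task.propagate = False                                   # mirrors Airflow's task logger config

dag_logger = logging.getLogger('dag')
dag_logger.setLevel(logging.INFO)
dag_logger.info("only shows up on stdout")               # bubbles to root, never touches airflow.task
task.info("only shows up in /tmp/task.log")              # handled by the file handler, not by root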

My airflow.cfg has:

# Logging level
logging_level = INFO
fab_logging_level = WARN

This is happening with other DAGs as well. I'd appreciate some insight.
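
If it helps to narrow this down, here is a throwaway diagnostic task (the task id and function name are made up, and it reuses the dag object and PythonOperator import from the DAG above) that prints which handlers each relevant logger actually has on the worker:

def dump_logging_setup():
    # Hypothetical diagnostic: show the handlers, level and propagation flag of
    # the custom 'dag' logger, Airflow's 'airflow.task' logger and the root
    # logger, to see where records can actually end up. The print output itself
    # may land in the worker's stdout rather than the task log, which is part of
    # the same symptom.
    for name in ('dag', 'airflow.task'):
        lg = logging.getLogger(name)
        print(name, '->', lg.handlers, 'level:', lg.level, 'propagate:', lg.propagate)
    root = logging.getLogger()
    print('root ->', root.handlers, 'level:', root.level)

debug_logging = PythonOperator(
    task_id='debug_logging',
    python_callable=dump_logging_setup,
    dag=dag
)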

AxelFurlanF · Mar 11 '20

This works with the LocalExecutor using Puckel's docker-compose.yml.

AxelFurlanF · Mar 11 '20

Did you ever get this figured out? As far as I knew, the PythonOperator logged output just like the BashOperator does.

BrainMonkey · Oct 12 '20

Did you ever get this figured out? As far as I knew, the PythonOperator logged output just like the BashOperator does.

@BrainMonkey nope, I never came up with a solution. I'm still using the BashOperator, sadly.

AxelFurlanF · Oct 12 '20