
empty Logfile

chsa21 opened this issue 7 months ago · 5 comments

Apache Airflow version

3.0.1

If "Other Airflow 2 version" selected, which one?

No response

What happened?

The problem in our system is that task logging does not work: the log files are created empty in the specified folder. I get the following message in the Airflow UI: Log message source details: sources=["/.../run_id=manual__2025-05-27T07:46:05.353328+00:00/task_id=say_hi/attempt=1.log"] ::group::Log message source details: sources=["/.../run_id=manual__2025-05-27T07:46:05.353328+00:00/task_id=say_hi/attempt=1.log"]

The log level is set to DEBUG. Upgrading to version 3.0.1 did not solve the problem either.

What you think should happen instead?

The log file should contain the task information and, in particular, an error message if the task fails.

How to reproduce

We test with the following test DAG:

from airflow.decorators import dag, task
from datetime import datetime

@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def log_test():
    @task
    def say_hi():
        import logging
        print("hello from print()")
        logging.getLogger("airflow.task").info("hello from logging.info()")

    say_hi()

dag = log_test()

The logfile is empty.
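As a plain-stdlib illustration (not Airflow's actual handler wiring), level filtering alone can produce exactly this "file exists but stays empty" symptom: a handler is attached and creates output, but records below the logger's effective level are dropped before any handler sees them. The logger name mirrors the DAG above; the in-memory buffer is a stand-in for the per-attempt log file:

```python
import io
import logging

# Stand-in for Airflow's per-attempt log file: a handler attached to
# the "airflow.task" logger, writing into an in-memory buffer.
log_buffer = io.StringIO()
handler = logging.StreamHandler(log_buffer)
handler.setFormatter(logging.Formatter("%(levelname)s - %(message)s"))

task_logger = logging.getLogger("airflow.task")
task_logger.propagate = False  # keep records out of the root logger for this demo
task_logger.addHandler(handler)

# With the effective level at WARNING, info() records are filtered out
# before reaching the handler -- the "file" stays empty:
task_logger.setLevel(logging.WARNING)
task_logger.info("hello from logging.info()")
print(repr(log_buffer.getvalue()))  # -> ''

# Lowering the level to INFO lets the record through:
task_logger.setLevel(logging.INFO)
task_logger.info("hello from logging.info()")
print(repr(log_buffer.getvalue()))  # -> 'INFO - hello from logging.info()\n'
```

This is only a sketch of the symptom; whether Airflow 3's file task handler actually applies the level this way in a given deployment is what the thread below narrows down.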

Operating System

Red Hat 9.5

Versions of Apache Airflow Providers

No response

Deployment

Official Apache Airflow Helm Chart

Deployment details

No response

Anything else?

No response

Are you willing to submit PR?

  • [ ] Yes I am willing to submit a PR!

Code of Conduct

chsa21 avatar May 27 '25 08:05 chsa21

Thanks for opening your first issue here! Be sure to follow the issue template! If you are willing to raise PR to address this issue please do so, no need to wait for approval.

boring-cyborg[bot] avatar May 27 '25 08:05 boring-cyborg[bot]

Hey! Some quick questions to help me get an idea of your issue:

  1. Has this logging ever worked for you on your cluster?
  2. Which executor is the deployment using? (executor in values.yaml)
  3. Can you share how you've configured your logs in the helm chart? (logs: section)
  4. Can you check what permissions the log folder has inside one of your worker pods?
kubectl exec -it <worker-pod> -- ls -ld /<path-to-airflow-folder>/logs

jameshyphen avatar May 27 '25 18:05 jameshyphen

Hey, thanks for your feedback!

  1. Yes, logging worked in version 2.10.5.
  2. We use the LocalExecutor
  3. config:
  • base_log_folder = /logs
  • remote_logging = False
  • remote_log_conn_id =
  • delete_local_logs = False
  • google_key_path =
  • remote_base_log_folder =
  • remote_task_handler_kwargs =
  • encrypt_s3_logs = False
  • logging_level = DEBUG
  • fab_logging_level = WARNING
  • logging_config_class =
  • colored_console_log = True
  • colored_log_format = [%%(blue)s%%(asctime)s%%(reset)s] {%%(blue)s%%(filename)s:%%(reset)s%%(lineno)d} %%(log_color)s%%(levelname)s%%(reset)s - %%(log_color)s%%(message)s%%(reset)s
  • colored_formatter_class = airflow.utils.log.colored_log.CustomTTYColoredFormatter
  • log_format = [%%(asctime)s] {%%(filename)s:%%(lineno)d} %%(levelname)s - %%(message)s
  • simple_log_format = %%(asctime)s %%(levelname)s - %%(message)s
  • dag_processor_log_target = file
  • dag_processor_log_format = [%%(asctime)s] [SOURCE:DAG_PROCESSOR] {%%(filename)s:%%(lineno)d} %%(levelname)s - %%(message)s
  • dag_processor_child_process_log_directory = /dag_processor (these logs are not empty)
  • log_formatter_class = airflow.utils.log.timezone_aware.TimezoneAware
  • secret_mask_adapter =
  • min_length_masked_secret = 5
  • task_log_prefix_template =
  • log_filename_template = dag_id={{ ti.dag_id }}/run_id={{ ti.run_id }}/task_id={{ ti.task_id }}/{%% if ti.map_index >= 0 %%}map_index={{ ti.map_index }}/{%% endif %%}attempt={{ try_number|default(ti.try_number) }}.log (these logs are empty)
  • task_log_reader = task
  • extra_logger_names =
  • worker_log_server_port = 8793
  • trigger_log_server_port = 8794
  • file_task_handler_new_folder_permissions = 0o775
  • file_task_handler_new_file_permissions = 0o664
  • celery_stdout_stderr_separation = False
  • color_log_error_keywords = error,exception
  • color_log_warning_keywords = warn
  4. permissions: 755

Many thanks for your help and best regards

chsa21 avatar Jun 03 '25 11:06 chsa21

I am having the same issue on 3.0.1. DAG run log files are being created, but nothing is being written to them, and I'm getting the same response in the UI.

Log message source details: sources=["/usr/local/airflow/logs/dag_id=dbt_debug/run_id=manual__2025-06-09T19:33:42.198500+00:00/task_id=print_hello/attempt=1.log"] ::group::Log message source details: sources=["/usr/local/airflow/logs/dag_id=dbt_debug/run_id=manual__2025-06-09T19:33:42.198500+00:00/task_id=print_hello/attempt=1.log"]

The dag_processor logs in logs/dag_processor/ are being created and written to.

Logging worked prior to upgrading to Airflow 3, but this seems to be preventing any DAGs from running.

trahouston avatar Jun 09 '25 19:06 trahouston

I am working with @chsa21 on this project and installed Airflow on the system. Do you need any further information?

martinuphoff avatar Jun 17 '25 05:06 martinuphoff

This issue has been automatically marked as stale because it has been open for 14 days with no response from the author. It will be closed in next 7 days if no further activity occurs from the issue author.

github-actions[bot] avatar Jul 02 '25 00:07 github-actions[bot]

This issue has been closed because it has not received response from the issue author.

github-actions[bot] avatar Jul 09 '25 00:07 github-actions[bot]

I am facing the same issue on 3.0.6 (Windows 10, Docker Compose, CeleryExecutor).

birddevelper avatar Sep 20 '25 15:09 birddevelper

@birddevelper I have the same problem. Could you help me find a solution? Thank you very much in advance.

dmanikhine avatar Nov 20 '25 13:11 dmanikhine

@martinuphoff I have the same problem. Could you help me find a solution? Thank you very much in advance.

dmanikhine avatar Nov 20 '25 13:11 dmanikhine

@birddevelper I have the same problem. Could you help me find a solution? Thank you very much in advance.

I removed this log-level environment variable from my Docker Compose file:

AIRFLOW__LOGGING__LOGGING_LEVEL: WARNING

and it worked. The same setting was fine in version 2, but it seems Airflow's logging behavior changed in version 3.
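For anyone hitting this with Docker Compose, the workaround above amounts to a fragment like the following (the service name is illustrative; `INFO` is Airflow's default logging level, and this is a sketch based on the report above, not an official fix):

```yaml
services:
  airflow-worker:                # illustrative service name
    environment:
      # Per the report above, WARNING here also suppresses task-level
      # records in Airflow 3, leaving the per-attempt log files empty.
      # Either delete the line entirely or set it back to the default:
      AIRFLOW__LOGGING__LOGGING_LEVEL: INFO
```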

birddevelper avatar Nov 20 '25 16:11 birddevelper