
Add on_kill equivalent to Databricks SQL Hook to cancel timed out queries

R7L208 opened this issue 1 year ago • 6 comments

The Databricks Provider did not implement a mechanism to cancel SQL queries submitted by DatabricksSqlHook. This led to data quality issues: Airflow would report the task as cancelled due to timeout, while the corresponding SQL query continued to run on Databricks.

This PR uses threading to cancel SQL queries submitted by DatabricksSqlHook.run() once the timeout is exceeded.
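The approach can be sketched roughly as follows. This is a minimal illustration, not the actual hook code: `run_with_timeout` is a hypothetical helper, and the cursor's `cancel()` call stands in for however the real PR interrupts the query inside `DatabricksSqlHook.run()`.

```python
import threading


def run_with_timeout(cursor, sql, timeout):
    """Execute `sql` on `cursor`; if execution runs past `timeout`
    seconds, a watchdog thread calls cursor.cancel() so the query is
    also cancelled server-side, not just abandoned by Airflow."""
    timer = threading.Timer(timeout, cursor.cancel)
    timer.start()
    try:
        cursor.execute(sql)
    finally:
        timer.cancel()  # no-op if the watchdog already fired
```

The key point is the `finally` clause: when the query finishes in time, the watchdog is cancelled so a fast query is never interrupted.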


R7L208 avatar Oct 02 '24 14:10 R7L208

@Lee-W - Could you help me understand why the new exceptions are not being found in airflow/exceptions.py for jobs:

  • Tests / Provider checks / Compat 2.8.4:P3.8 provider check (pull_request)
  • Tests / Provider checks / Compat 2.9.3:P3.8 provider check (pull_request)
  • Tests / Provider checks / Compat 2.10.1:P3.8 provider check (pull_request)

The exceptions are present in the file and can be imported successfully when I run breeze testing tests --test-type "Providers[databricks]" locally, so I'm struggling to understand why they would cause an import error here, specifically from /usr/local/lib/python3.8/site-packages/airflow/exceptions.py.

__ ERROR collecting tests/providers/databricks/sensors/test_databricks_sql.py __
ImportError while importing test module '/opt/airflow/tests/providers/databricks/sensors/test_databricks_sql.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/usr/local/lib/python3.8/importlib/__init__.py:127: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/providers/databricks/sensors/test_databricks_sql.py:28: in <module>
    from airflow.providers.databricks.sensors.databricks_sql import DatabricksSqlSensor
/usr/local/lib/python3.8/site-packages/airflow/providers/databricks/sensors/databricks_sql.py:28: in <module>
    from airflow.providers.databricks.hooks.databricks_sql import DatabricksSqlHook
/usr/local/lib/python3.8/site-packages/airflow/providers/databricks/hooks/databricks_sql.py:41: in <module>
    from airflow.exceptions import (
E   ImportError: cannot import name 'AirflowTaskExecutionError' from 'airflow.exceptions' (/usr/local/lib/python3.8/site-packages/airflow/exceptions.py)
-- generated xml file: /files/test_result-providers_-amazon_google-sqlite.xml --

Also, the DB tests seem to be failing because of tests unrelated to my changes, or I'm missing how they are connected.
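One common way to keep provider code importable against older Airflow cores is to guard the import with a fallback. This is a hedged sketch only: the exception name mirrors the traceback above, but the fallback class is purely illustrative.

```python
try:
    # Only present in newer Airflow cores; the 2.8/2.9 compat images
    # used by these CI jobs ship an older airflow.exceptions module.
    from airflow.exceptions import AirflowTaskExecutionError
except ImportError:
    # Local stand-in so the provider module still imports on
    # older Airflow versions (illustrative fallback).
    class AirflowTaskExecutionError(Exception):
        """Raised when task execution fails."""
```

This pattern explains why the compat jobs fail while local tests pass: locally, the installed Airflow core already contains the new name, but the compat jobs run against pinned older releases.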

R7L208 avatar Oct 04 '24 22:10 R7L208

Sure. These exceptions should not be added to airflow.exceptions but somewhere inside the Databricks provider. Airflow core and Airflow providers are released separately, which is why we have these tests to check providers against older Airflow versions.
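The suggestion above can be sketched as a provider-local exceptions module. The module path and class name below are hypothetical, and the try/except exists only so this snippet stays importable without Airflow installed:

```python
# Hypothetical provider-local module, e.g.
# airflow/providers/databricks/exceptions.py
try:
    from airflow.exceptions import AirflowException
except ImportError:  # stand-in so the sketch runs without Airflow
    class AirflowException(Exception):
        """Local substitute for Airflow's base exception."""


class DatabricksSqlExecutionError(AirflowException):
    """Raised when a Databricks SQL statement fails or is cancelled."""
```

Because the class ships with the provider package itself, it exists regardless of which Airflow core version is installed, so the compat jobs no longer need the name to be present in airflow/exceptions.py.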

Lee-W avatar Oct 06 '24 23:10 Lee-W

@Lee-W @uranusjr - Apologies for all of the back-and-forth on testing, but could you help me understand why the job Tests / Non-DB tests / Non-DB::3.8: Always Providers[common.sql,databricks] (pull_request) is failing after 2m? The failed tests in the logs seem unrelated to my changes, and I've been having trouble getting the logs to appear for the GH job.

I'm unable to reproduce the failing tests running breeze testing tests --test-type "Providers[common.sql,databricks]". Is there a better breeze command that would mirror this test locally?

R7L208 avatar Oct 08 '24 15:10 R7L208

Rebase to get fix from https://github.com/apache/airflow/pull/42828

dstandish avatar Oct 08 '24 15:10 dstandish

Mostly good from my end, left a few nitpicks

Lee-W avatar Oct 09 '24 01:10 Lee-W

A LOT of conflicts. We moved providers to another directory, so you need to resolve the conflicts and move your changes.

potiuk avatar Oct 17 '24 22:10 potiuk

@uranusjr @Lee-W - any additional feedback or is this ok to get merged?

R7L208 avatar Oct 23 '24 15:10 R7L208

LGTM

Lee-W avatar Oct 28 '24 08:10 Lee-W

hey @uranusjr - Can you please review as your requested changes should be addressed now? 🙏

R7L208 avatar Oct 31 '24 21:10 R7L208

Wanted to nudge on this again. @uranusjr - Is there any timeline for when you could re-review?

R7L208 avatar Nov 06 '24 21:11 R7L208