databricks-sql-python

UNBOUND_SQL_PARAMETER when running the example with 14.2

Open Falydoor opened this issue 2 years ago • 14 comments

Hello,

When running the example from the doc:

from databricks import sql

connection = sql.connect(
  server_hostname=HOSTNAME,
  http_path=HTTP_PATH,
  access_token=TOKEN)

cursor = connection.cursor()
cursor.execute('SELECT :param `p`, * FROM RANGE(10)', {"param": "foo"})
result = cursor.fetchall()
for row in result:
  print(row)

cursor.close()
connection.close()

I'm getting this error:

Traceback (most recent call last):
  File "/Users/theo/Documents/perso/dbx/main.py", line 34, in <module>
    cursor.execute('SELECT :param `p`, * FROM RANGE(10)', {"param": "foo"})
  File "/Users/theo/Documents/perso/dbx/.env/lib/python3.11/site-packages/databricks/sql/client.py", line 761, in execute
    execute_response = self.thrift_backend.execute_command(
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/theo/Documents/perso/dbx/.env/lib/python3.11/site-packages/databricks/sql/thrift_backend.py", line 867, in execute_command
    return self._handle_execute_response(resp, cursor)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/theo/Documents/perso/dbx/.env/lib/python3.11/site-packages/databricks/sql/thrift_backend.py", line 959, in _handle_execute_response
    final_operation_state = self._wait_until_command_done(
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/theo/Documents/perso/dbx/.env/lib/python3.11/site-packages/databricks/sql/thrift_backend.py", line 792, in _wait_until_command_done
    self._check_command_not_in_error_or_closed_state(
  File "/Users/theo/Documents/perso/dbx/.env/lib/python3.11/site-packages/databricks/sql/thrift_backend.py", line 596, in _check_command_not_in_error_or_closed_state
    raise ServerOperationError(
databricks.sql.exc.ServerOperationError: [UNBOUND_SQL_PARAMETER] Found the unbound parameter: param. Please, fix `args` and provide a mapping of the parameter to either a SQL literal or collection constructor functions such as `map()`, `array()`, `struct()`. SQLSTATE: 42P02; line 1 pos 7

It looks like native query parameters aren't working correctly. I'm using an all-purpose cluster on runtime 14.2 with Photon acceleration disabled.

Thank you!

Falydoor avatar Nov 21 '23 16:11 Falydoor

Thanks for this report. I'm trying to reproduce right now.

susodapop avatar Nov 21 '23 17:11 susodapop

I've reproduced the same issue internally. I'll follow up with our internal engineers to figure out what's happening here, since I'm using the latest 14.2 snapshot with an all-purpose cluster.

susodapop avatar Nov 21 '23 18:11 susodapop

Just following up to say that this exact code snippet does work when connected to a SQL warehouse with DBR 14.2

susodapop avatar Nov 21 '23 18:11 susodapop

Thanks, same here! I'll try with the 14.2 beta.

vlasvlasvlas avatar Nov 21 '23 18:11 vlasvlasvlas

I'm puzzled about why the parameter designed to help optimize the query, specifically the 'get the first n rows so I can check the data,' isn't functioning as intended. Any insights on this issue?

url: https://api.python.langchain.com/en/latest/_modules/langchain/utilities/sql_database.html#SQLDatabase.get_table_info_no_throw

If `sample_rows_in_table_info`, the specified number of sample rows will be appended to each table description. This can increase performance as demonstrated in the paper.

vlasvlasvlas avatar Nov 21 '23 19:11 vlasvlasvlas

I'm puzzled about why the parameter designed to help optimize the query, specifically the 'get the first n rows so I can check the data,' isn't functioning as intended. Any insights on this issue?

This looks like a question for the langchain repo. I don't have any insight about their codebase.

susodapop avatar Nov 21 '23 20:11 susodapop

Until this issue is fixed for DBR 14.2 clusters, I suppose the workaround is to use Inline Parameters, correct?
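
Something like this is what I have in mind from the Inline Parameters section of the docs (untested sketch; the use_inline_params flag and the pyformat-style markers are just my reading of those docs):

from databricks import sql

# Inline approach: values are interpolated client-side instead of being
# bound server-side, so the marker style changes from :name to %(name)s.
connection = sql.connect(
  server_hostname=HOSTNAME,
  http_path=HTTP_PATH,
  access_token=TOKEN,
  use_inline_params=True)

cursor = connection.cursor()
cursor.execute("SELECT %(param)s `p`, * FROM RANGE(10)", {"param": "foo"})
print(cursor.fetchall())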

Is there a way to use those with SQLAlchemy? The SQLAlchemy docs do describe inlining parameters, but I'm not sure how to use that information alongside the parameter docs for this library here.

nchammas avatar Nov 27 '23 20:11 nchammas

@nchammas Try using a SQL warehouse instead; that sounds easier than using inline parameters.
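
The snippet from the issue works unchanged against a warehouse; only the http_path changes (the warehouse ID below is a placeholder):

from databricks import sql

connection = sql.connect(
  server_hostname=HOSTNAME,
  # A SQL warehouse path looks like /sql/1.0/warehouses/<warehouse-id>,
  # while an all-purpose cluster path looks like sql/protocolv1/o/<org-id>/<cluster-id>.
  http_path="/sql/1.0/warehouses/<warehouse-id>",
  access_token=TOKEN)

cursor = connection.cursor()
cursor.execute('SELECT :param `p`, * FROM RANGE(10)', {"param": "foo"})
print(cursor.fetchall())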

Falydoor avatar Nov 28 '23 04:11 Falydoor

@Falydoor is correct: using a SQL warehouse is the only true workaround for the time being. I'm following up with the engineers about when this will be fixed on all-purpose clusters.

Is there a way to use those with SQLAlchemy?

No, databricks-sql-connector's inline parameters are not supported with sqlalchemy (because sqlalchemy doesn't know how to write out the correct variable markers). As you found in the sqlalchemy docs, you can force sqlalchemy to write parameter values as strings in emitted queries - this uses a completely different code path from databricks-sql-connector's inline behaviour, though.
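
For completeness, here's a rough sketch of what that sqlalchemy-side inlining looks like with literal_binds (the table and column names are made up, and this isn't an API of this connector):

from sqlalchemy import MetaData, Table, create_engine, select

engine = create_engine(
    f"databricks://token:{TOKEN}@{HOSTNAME}?http_path={HTTP_PATH}"
)
metadata = MetaData()
# Hypothetical table with an `id` column, reflected from the workspace
some_table = Table("some_table", metadata, autoload_with=engine)

stmt = select(some_table).where(some_table.c.id == 42)
# Ask sqlalchemy to render the bound values into the SQL text itself
compiled = stmt.compile(dialect=engine.dialect, compile_kwargs={"literal_binds": True})

with engine.connect() as conn:
    rows = conn.exec_driver_sql(str(compiled)).fetchall()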

susodapop avatar Nov 28 '23 15:11 susodapop

@nchammas What was your inline workaround? I'm facing the same issue and need a workaround while the Databricks team fixes this. I can't use a SQL warehouse for cost reasons.

auschoi96 avatar Dec 06 '23 00:12 auschoi96

@nchammas What was your inline workaround? I'm facing the same issue and need a workaround while the Databricks team fixes this. I can't use a SQL warehouse for cost reasons.

Changing the Databricks cluster version to 14.2 beta did it for me.

vlasvlasvlas avatar Dec 06 '23 13:12 vlasvlasvlas

I ended up using a warehouse. As @susodapop explained, inline parameters are not supported for SQLAlchemy, and it didn't seem worth trying to figure out some complex workaround if, presumably, an upcoming release of DBR (hopefully 14.3) is going to fix the issue anyway.

nchammas avatar Dec 06 '23 15:12 nchammas

Hi everyone in this thread! Can you please let me know if this issue is still relevant for any of you? Thank you so much!

kravets-levko avatar Apr 17 '24 17:04 kravets-levko

Hey @kravets-levko,

I just tried with a cluster using DBR 15.0 with databricks-sql-connector==3.1.1 and everything works fine 👍

Falydoor avatar Apr 19 '24 02:04 Falydoor