
Error Creating Tables - Is Databricks Unity Catalog Required?

Open ryanbales opened this issue 2 years ago • 11 comments

I'm trying to run the SQLAlchemy example from this repo on a new instance of Databricks.

I'm getting the following error:

databricks.sql.exc.ServerOperationError: [UC_COMMAND_NOT_SUPPORTED] Create sample tables/views is not supported in Unity Catalog.

For context, we're not currently using the Unity Catalog. Is that required to use the latest version of the Databricks SQL Connector?

ryanbales avatar Mar 27 '23 20:03 ryanbales

While not an explicit restriction, we only test the dialect on warehouses / clusters with unity catalog enabled. Can you share which line of the example script raises this exception?

susodapop avatar Mar 28 '23 02:03 susodapop

Sorry for the delay. I re-ran that sample and, interestingly, got a different error. Here's the full error and stack trace:

base = declarative_base(bind=engine)

Traceback (most recent call last):
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1900, in _execute_context
    self.dialect.do_execute(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
    cursor.execute(statement, parameters)
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/client.py", line 472, in execute
    execute_response = self.thrift_backend.execute_command(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 833, in execute_command
    return self._handle_execute_response(resp, cursor)
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 925, in _handle_execute_response
    final_operation_state = self._wait_until_command_done(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 767, in _wait_until_command_done
    self._check_command_not_in_error_or_closed_state(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 484, in _check_command_not_in_error_or_closed_state
    raise ServerOperationError(
databricks.sql.exc.ServerOperationError: Table constraints are only supported in Unity Catalog.

ryanbales avatar Apr 26 '23 21:04 ryanbales

Ah, I see why. The first error was run against Catalog: samples, Schema: default.

The most recent error was run against Catalog: hive_metastore, Schema: default.

ryanbales avatar Apr 26 '23 21:04 ryanbales

Here's the trace for the initial error message:

base = declarative_base(bind=engine)

Traceback (most recent call last):
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1900, in _execute_context
    self.dialect.do_execute(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 736, in do_execute
    cursor.execute(statement, parameters)
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/client.py", line 472, in execute
    execute_response = self.thrift_backend.execute_command(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 833, in execute_command
    return self._handle_execute_response(resp, cursor)
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 925, in _handle_execute_response
    final_operation_state = self._wait_until_command_done(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 767, in _wait_until_command_done
    self._check_command_not_in_error_or_closed_state(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 484, in _check_command_not_in_error_or_closed_state
    raise ServerOperationError(
databricks.sql.exc.ServerOperationError: [UC_COMMAND_NOT_SUPPORTED] Create sample tables/views is not supported in Unity Catalog.

ryanbales avatar Apr 26 '23 21:04 ryanbales

Okay, I don't see a bug here. There are two issues at play:

  1. You can't create tables or views in the samples catalog. That catalog exists for demonstration purposes (read about it in the Databricks docs here) and is effectively read-only.

  2. The error with hive_metastore is because that catalog is special -- it maintains backward compatibility with catalogs created before Unity Catalog was available. So although the hive_metastore catalog is included in the Unity Catalog, it doesn't have any UC features enabled. It's there purely for backward compatibility.

To move past this, create a new catalog in your Unity Catalog and point SQLAlchemy at it. Many organisations simply call this catalog main.

susodapop avatar Apr 26 '23 23:04 susodapop
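
For reference, here is a minimal sketch of pointing the SQLAlchemy dialect at a specific catalog and schema through the connection URL. The host/path/token variables and the main/default names are placeholders, and the exact query parameters can vary by connector version, so check the README for the version you have installed.

import os

from sqlalchemy import create_engine, text

# Placeholders: fill these in for your workspace.
host = os.getenv("DATABRICKS_SERVER_HOSTNAME")
http_path = os.getenv("DATABRICKS_HTTP_PATH")
access_token = os.getenv("DATABRICKS_TOKEN")

# Point the dialect at a UC catalog/schema you own (e.g. main/default)
# instead of samples or hive_metastore.
engine = create_engine(
    f"databricks://token:{access_token}@{host}"
    f"?http_path={http_path}&catalog=main&schema=default"
)

# Quick connectivity check before running the rest of the example.
with engine.connect() as conn:
    print(conn.execute(text("SELECT 1")).scalar())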

Thanks @susodapop. After further review, we don't have UC set up yet but are working on that now. It seems like UC is required to use the databricks-sql-connector. Is that accurate?

ryanbales avatar Apr 27 '23 13:04 ryanbales

No, UC is not required to use databricks-sql-connector.

The implicit restriction I mentioned above only applies to the SQLAlchemy dialect.

susodapop avatar May 12 '23 21:05 susodapop
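
To illustrate the distinction, here is a minimal sketch of using the raw connector without Unity Catalog; server_hostname, http_path, and access_token are placeholders for your workspace details.

from databricks import sql

with sql.connect(
    server_hostname=server_hostname,
    http_path=http_path,
    access_token=access_token,
) as connection:
    with connection.cursor() as cursor:
        # Plain queries (and DDL that avoids UC-only features such as
        # CREATE CATALOG or table constraints) run fine without UC.
        cursor.execute("SELECT 1 AS ok")
        print(cursor.fetchall())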

Thanks @susodapop. Interesting: when trying to create a new catalog/schema with the sql connector, I'm getting this error: [UC_NOT_ENABLED] Unity Catalog is not enabled on this cluster.

create_catalog_stmt = f"CREATE CATALOG IF NOT EXISTS {catalog_name}"
create_schema_stmt = f"CREATE SCHEMA IF NOT EXISTS {catalog_name}.{schema_name}"

with sql.connect(
    server_hostname=server_hostname,
    http_path=http_path,
    access_token=access_token,
) as connection:
    with connection.cursor() as cursor:
        # Create Catalog
        cursor.execute(create_catalog_stmt)
        logger.info("Created Catalog %r", catalog_name)

        # Create Schema
        cursor.execute(create_schema_stmt)
        logger.info("Created Schema %r in Catalog %r", catalog_name, schema_name)

Trace:

Traceback (most recent call last):
  File "scripts/create_catalog_schema.py", line 39, in <module>
    create_catalog_schema("rbales_test_catalog", "rbales_test_schema")
  File "scripts/create_catalog_schema.py", line 28, in create_catalog_schema
    cursor.execute(create_catalog_stmt)
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/client.py", line 472, in execute
    execute_response = self.thrift_backend.execute_command(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 833, in execute_command
    return self._handle_execute_response(resp, cursor)
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 925, in _handle_execute_response
    final_operation_state = self._wait_until_command_done(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 767, in _wait_until_command_done
    self._check_command_not_in_error_or_closed_state(
  File "/Users/rbales/Library/Caches/pypoetry/virtualenvs/active-ai-api-NyKgQ79e-py3.8/lib/python3.8/site-packages/databricks/sql/thrift_backend.py", line 484, in _check_command_not_in_error_or_closed_state
    raise ServerOperationError(
databricks.sql.exc.ServerOperationError: [UC_NOT_ENABLED] Unity Catalog is not enabled on this cluster.

ryanbales avatar May 23 '23 21:05 ryanbales

That error comes directly from the cluster (i.e. it's not an issue with the python connector). Have you confirmed that UC is enabled for this cluster?

susodapop avatar May 23 '23 21:05 susodapop
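
One way to sanity-check this from the connector itself is sketched below. SHOW CATALOGS is a standard Databricks SQL statement; the connection variables are placeholders for your workspace details.

from databricks import sql

with sql.connect(
    server_hostname=server_hostname,
    http_path=http_path,
    access_token=access_token,
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SHOW CATALOGS")
        print([row[0] for row in cursor.fetchall()])
        # Non-UC compute typically only lists the legacy catalog
        # (hive_metastore / spark_catalog); UC-enabled compute also
        # lists Unity Catalog catalogs such as main.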

It's not; UC is not enabled on this cluster.

ryanbales avatar Jun 06 '23 17:06 ryanbales

Hi @ryanbales! Were you able to enable UC for your cluster and check if your issue reproduces there?

kravets-levko avatar Apr 17 '24 18:04 kravets-levko

@ryanbales I'm closing this issue as stale, but if you have more questions, feel free to reopen it or open a new one.

kravets-levko avatar Oct 02 '24 17:10 kravets-levko