semantic-link-labs

Bug: Lakehouse SQL endpoint not returned in various functions

Open marclelijveld opened this issue 6 months ago • 5 comments

Describe the bug
I tried to run labs.directlake.add_table_to_direct_lake_semantic_model to automate adding tables to my semantic model. However, it returns the error "SQL Endpoint not found". I then tried alternatives such as labs.directlake.get_direct_lake_lakehouse, which return the same error. The underlying function labs.get_direct_lake_sql_endpoint is causing it and does not return a valid SQL endpoint.
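
For completeness, a minimal snippet that hits the same error with just the helper functions (a sketch; only dataset/workspace are passed, all other defaults assumed):

```python
import sempy_labs as labs
import sempy_labs.directlake  # makes labs.directlake available, matching the calls above

# Both of these raise "SQL Endpoint not found" for my model.
# 'Sales Analysis' is the Direct Lake model described above; argument names assumed from the docs.
labs.directlake.get_direct_lake_lakehouse(dataset="Sales Analysis", workspace=None)
labs.get_direct_lake_sql_endpoint(dataset="Sales Analysis", workspace=None)
```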

I'm sure the semantic model is operating in Direct Lake mode, as I can manually add tables from the desired lakehouse to it through live edit in PBI Desktop.

To Reproduce
Steps to reproduce the behavior:

    labs.directlake.add_table_to_direct_lake_semantic_model(
        dataset='Sales Analysis',
        table_name=target_table,
        lakehouse_table_name=target_table,
        refresh=True,
        workspace=None
    )

Expected behavior
The table is added to the Direct Lake semantic model without the "SQL Endpoint not found" error.

Screenshots

[Screenshot: error output]

[Screenshot: error output]

marclelijveld commented on May 23 '25

The semantic model I tried to work with was created through labs as well, using this code:

    labs.create_blank_semantic_model(
        dataset=semanticmodel_name,
        # compatibility_level=1605,  # if not specified, defaults to 1605
        workspace=None,              # if not specified, uses the same workspace the notebook runs in
        overwrite=True
    )

When I manually create a semantic model, I noticed that it connects directly to the SQL endpoint, whereas my previously created one (Sales Analysis) connects directly to the lakehouse.

[Screenshot: connection details of both models]
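
For reference, the shared expression can also be checked without a screenshot, roughly like this (a sketch using the sempy_labs TOM wrapper; property access assumed):

```python
from sempy_labs.tom import connect_semantic_model

# Print each shared expression so you can see whether it references
# the SQL endpoint (Sql.Database ...) or the lakehouse directly.
with connect_semantic_model(dataset="Sales Analysis", readonly=True, workspace=None) as tom:
    for expr in tom.model.Expressions:
        print(expr.Name)
        print(expr.Expression)
```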

marclelijveld commented on May 23 '25

Most likely this is because the semantic model was not in Direct Lake mode when I created it with the previous setup. This was not clear to me from the documentation, as a similar function (below) worked fine.

    labs.directlake.generate_direct_lake_semantic_model(
        dataset=semanticmodel_name,
        lakehouse_tables=None,
        workspace=None,
        lakehouse='LH_STORE_Gold',
        lakehouse_workspace=None,
        schema='dbo',
        overwrite=True,
        refresh=False
    )

marclelijveld commented on May 23 '25

Is this semantic model using Direct Lake on OneLake or Direct Lake on SQL? When you create a semantic model using create_blank_semantic_model, it just creates a blank model. To be in Direct Lake mode it must have at least one table in Direct Lake mode.
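
Something along these lines (an untested sketch; property and helper names assumed from the TOM wrapper) shows whether a model already has a Direct Lake partition:

```python
from sempy_labs.tom import connect_semantic_model

with connect_semantic_model(dataset="Sales Analysis", readonly=True, workspace=None) as tom:
    # A blank model has no partitions at all, so it is not yet in Direct Lake mode.
    modes = {str(p.Mode) for t in tom.model.Tables for p in t.Partitions}
    print(modes)                 # e.g. set() for a blank model, {'DirectLake'} afterwards
    print(tom.is_direct_lake())  # convenience check, if available in your version
```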

m-kovalsky commented on May 26 '25

I'm running into the same problem.

If we look at the actual function, it seems that get_direct_lake_sql_endpoint returns the lakehouse_name instead of the SQL endpoint id when the shared expression is defined with a SQLEndpoint M query.

[Screenshot: source of get_direct_lake_sql_endpoint]

I would have expected the function to return the SQL endpoint (matches[0]), but instead I get the lakehouse_name.

I think this is what is incorrect.
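
As a workaround until this is fixed, the SQL endpoint id can be resolved from the lakehouse item itself instead of parsing the expression, roughly like this (a sketch; "My workspace" is a placeholder and the response property names are assumed from the Fabric REST "Get Lakehouse" API):

```python
import sempy.fabric as fabric
import sempy_labs as labs

workspace_name = "My workspace"  # placeholder for your workspace
workspace_id = fabric.resolve_workspace_id(workspace_name)
lakehouse_id = labs.resolve_lakehouse_id(lakehouse="LH_STORE_Gold", workspace=workspace_name)

# GET .../lakehouses/{id} returns sqlEndpointProperties, which carries the endpoint id.
client = fabric.FabricRestClient()
resp = client.get(f"/v1/workspaces/{workspace_id}/lakehouses/{lakehouse_id}")
sql_endpoint_id = resp.json()["properties"]["sqlEndpointProperties"]["id"]
print(sql_endpoint_id)
```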

jvl-9alt commented on Jun 04 '25

> Is this semantic model using Direct Lake on OneLake or Direct Lake on SQL? When you create a semantic model using create_blank_semantic_model, it just creates a blank model. To be in Direct Lake mode it must have at least one table in Direct Lake mode.

I got it to work with the other command, labs.directlake.generate_direct_lake_semantic_model, so I think I just picked the wrong function. Is it correct to say that labs.create_blank_semantic_model is not intended for Direct Lake but only for Import/DirectQuery models, to which you then have to add partitions yourself?

If so, I think the name of labs.directlake.generate_direct_lake_semantic_model is confusing and makes it harder to find (for me at least). Suggestion: labs.create_blank_semantic_model_directlake, so the functions that create models sort together and are nicely grouped. :)

marclelijveld commented on Jun 16 '25