semantic-link-labs

Add notebook to list large semantic models

Opened by andrewluotechnologies 11 months ago · 2 comments

When moving workspaces between capacities in different regions, extra steps are required if the workspace contains large semantic models (large storage format). It would be useful to have a notebook that lists those semantic models.

andrewluotechnologies · Jan 07 '25

I don't really think this warrants its own notebook; it can be done in two lines of code. Adding it to a script repository might be a better idea.
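For reference, a minimal sketch of what that two-liner could look like. This assumes `labs.admin.list_datasets()` returns a pandas DataFrame with a `Target Storage Mode` column; stand-in rows are used here in place of a live Fabric tenant:

```python
import pandas as pd

# Stand-in for labs.admin.list_datasets(); in a Fabric notebook you would call:
#   import sempy_labs as labs
#   df = labs.admin.list_datasets()
df = pd.DataFrame({
    "Dataset Name": ["Sales", "Finance", "Ops"],
    "Target Storage Mode": ["Abf", "PremiumFiles", "Abf"],
})

# Large storage format models report "PremiumFiles" as their target storage mode
large_models = df[df["Target Storage Mode"] == "PremiumFiles"]
print(large_models["Dataset Name"].tolist())
```

The filter itself is the whole operation; everything else is fetching the inventory.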

m-kovalsky · Jan 09 '25

I do this in my daily inventory notebook (Get All Datasets):

import sempy_labs as labs

# Get a list of all the datasets (semantic models) in the entire tenant
dfDatasets = spark.createDataFrame(labs.admin.list_datasets())
display(dfDatasets)

# Filter that list down to only the "large storage format" datasets
# (their Target Storage Mode is "PremiumFiles")
dfLSFDatasets = dfDatasets[dfDatasets["Target Storage Mode"] == "PremiumFiles"]
lsfdatasetscnt = dfLSFDatasets.count()
print(f"There are {lsfdatasetscnt} large storage format datasets in the tenant.")
display(dfLSFDatasets)

I just write this out as a separate Delta table, but you could easily add another column that flags each dataset as "Large" or "Small" instead.

JeridInsight · Apr 04 '25