Ben Cassell
@dbph I would file a ticket with your company's Databricks contact, because that scenario is already supposed to be supported, and we'd need investigation from the Jobs team to know...
> So my first question is: can dbt-databricks also use the latest databricks-sdk once released or is there a reason to pin it to 0.17.0? We've been waiting for some...
@fivetran-andreymogilev you say Azure M2M was fixed in 0.18, but the reason I set the pin at 0.17.0 is that 0.18 broke all of our tests, which use Azure M2M....
For what it's worth, I have a branch where I'm testing 0.28.0, https://github.com/databricks/dbt-databricks/tree/auth_testing, if anyone wants to test it out and report back. I'm also talking to the owner of the...
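If you want to try it, something like this in a scratch virtualenv should do it (the branch name comes from the link above; the rest is just standard pip usage, adjust for your own environment):

```shell
# Hypothetical throwaway-venv test setup, not an official install path.
pip install "git+https://github.com/databricks/dbt-databricks.git@auth_testing"
pip show databricks-sdk   # confirm which sdk version the branch resolved to
```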
@fivetran-andreymogilev Ahhh, so, I think what happened here is that the recommended Azure M2M flow changed after implementation. So, what I've been working towards is using the client_secret from Azure,...
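For context, this is roughly what the Azure client-secret (M2M) path looks like when you go through the databricks-sdk directly; it's a sketch with placeholder host/IDs, not the dbt-databricks connection code itself:

```python
from databricks.sdk import WorkspaceClient

# Placeholder host and IDs -- none of these values come from the thread.
w = WorkspaceClient(
    host="https://adb-1234567890123456.7.azuredatabricks.net",
    azure_client_id="<application-id>",      # Azure AD app registration (service principal) ID
    azure_client_secret="<client-secret>",   # the Azure-issued secret, not a Databricks OAuth secret
    azure_tenant_id="<tenant-id>",
)

# If the azure-client-secret credential strategy resolves, this call
# round-trips a token against the workspace.
print(w.current_user.me().user_name)
```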
@fivetran-andreymogilev thank you for sharing! Your comment led to a breakthrough for me :P.
Investigating now. Thanks for the report.
So I dug into this and we may be stuck for now. The syntax for declaring comments at table creation requires knowing the list of all columns and their...
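To make the constraint concrete, here's a hypothetical helper (not dbt-databricks code) showing why inline column comments force you to know the full schema up front, which a CTAS-style materialization doesn't have:

```python
from typing import List, Tuple

def create_table_ddl(relation: str, columns: List[Tuple[str, str, str]]) -> str:
    # Inline comments are part of each column definition, so rendering them at
    # CREATE time means already knowing every column name and type.
    cols = ",\n  ".join(
        f"{name} {dtype} COMMENT '{comment}'" for name, dtype, comment in columns
    )
    return f"CREATE TABLE {relation} (\n  {cols}\n)"

print(create_table_ddl(
    "main.demo.orders",
    [("order_id", "BIGINT", "Primary key"),
     ("amount", "DECIMAL(18,2)", "Order total in USD")],
))
```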
> Don't use persist_docs and instead parse the comments from dbt model and use unity catalog rest api

@jordandakota which API do you have in mind? I went looking for...
I have a change I'm working on now that will filter to only changed comments on incremental table updates. While that doesn't address the core issue (no bulk update), it...
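Sketching the idea (hypothetical helper names, not the actual change): diff the model's column descriptions against what's already on the table and only emit ALTERs for the columns that differ.

```python
from typing import Dict, Iterator

def changed_comment_statements(
    relation: str,
    model_comments: Dict[str, str],     # column -> description from the dbt model
    existing_comments: Dict[str, str],  # column -> comment already on the table
) -> Iterator[str]:
    # Yield one ALTER per column whose comment actually changed; unchanged or
    # empty descriptions produce no statement at all.
    for column, desired in model_comments.items():
        current = existing_comments.get(column, "")
        if desired and desired != current:
            escaped = desired.replace("'", "\\'")
            yield f"ALTER TABLE {relation} ALTER COLUMN {column} COMMENT '{escaped}'"

# Example: only `amount` gets a statement because `order_id` is unchanged.
stmts = list(changed_comment_statements(
    "main.demo.orders",
    {"order_id": "Primary key", "amount": "Order total in USD"},
    {"order_id": "Primary key", "amount": "Order total"},
))
print(stmts)
```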