Results 18 comments of mucio

We ended up using `spark_version = "10.4.x-aarch64-photon-scala2.12"`, but I think the provider should expose a `runtime_engine` field. It would make things simpler.
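A minimal sketch of the workaround, for context: pinning the Photon runtime directly through `spark_version` on a `databricks_cluster` resource. The cluster name, node type, and worker count below are illustrative placeholders, not values from the original thread.

```hcl
resource "databricks_cluster" "example" {
  cluster_name  = "photon-cluster" # illustrative name
  # Workaround: encode the Photon engine in the runtime version string,
  # since the provider does not expose a dedicated runtime_engine field here.
  spark_version = "10.4.x-aarch64-photon-scala2.12"
  node_type_id  = "m6g.large" # illustrative ARM node type, adjust per cloud
  num_workers   = 1
  autotermination_minutes = 30
}
```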

It makes more sense if the `databricks_cluster` resource mirrors the documentation and the manual configuration that people can perform in the UI (e.g. setting a specific runtime). Using the `databricks_node_type`...

👍 on the **UDF (but not only) management** point; it is definitely a needed feature. I ended up on this thread because it is the most recent one on the topic...

👍 on this issue, we are hitting this with Databricks too

Would it be possible to contribute this serializer to the repo?

@McKnight-42 sorry for the late reply, I will try to work on this today. BTW, I reached out to Databricks support and they told me that the problem is...

@McKnight-42 @jtcohen6 sorry for the delay, I finally found the time to fix this PR. I guess this will go into 1.2.0, not 1.1.1.

@McKnight-42 rebased and moved the fix description to 1.3; I hope I did it correctly.