Jolan Rensen
> @Jolanrensen Can you explain a little bit more about what ScalaReflection.scala and KotlinReflection.scala do, what they're for, and why the latter is a blocker to Spark 3.4 support? And...
> @Jolanrensen we should probably take a look at the connect API: https://spark.apache.org/docs/latest/spark-connect-overview.html Allowing the Spark driver and code to use different versions from the application code might indeed solve...
@asm0dey A compiler plugin could do that :)
@gregfriis I'm sorry, no, we currently don't have the resources to figure that out. What could help is if someone from the community could provide a proof of concept solution....
Small weekend/hobby update regarding the issue: I tried [Spark Connect](https://spark.apache.org/docs/latest/spark-connect-overview.html) but locally on my machine I couldn't get it to work reliably yet. Plus it requires running Spark locally with...
@asm0dey you mean using `Encoders.bean()`? That can indeed be done relatively easily, also generated, but this limits us in other ways again: nullability/default arguments are needed, and, for instance, nested...
Java bean support requires an empty constructor plus getters/setters, so yeah :/. That's what `@JvmOverloads` achieves. Actually, we can do it with `lateinit var`s
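To illustrate the point above, here is a minimal sketch in plain Kotlin (no Spark dependency; the `Person` class is a hypothetical example, not from the API): `lateinit var` properties compile to exactly the Java-bean shape that `Encoders.bean()` expects, i.e. a no-arg constructor plus getters/setters, which we can verify via reflection.

```kotlin
// Hypothetical data holder. `lateinit var` means no value is required at
// construction time, so Kotlin emits a usable no-arg constructor, and every
// `var` property gets a generated getter/setter pair.
class Person {
    lateinit var name: String
    var age: Int = 0
}

fun main() {
    val clazz = Person::class.java

    // The empty (no-arg) constructor exists, as the bean encoder requires.
    val person = clazz.getConstructor().newInstance()

    // Getters and setters were generated for each property.
    clazz.getMethod("setName", String::class.java).invoke(person, "Jolan")
    clazz.getMethod("setAge", Int::class.java).invoke(person, 42)

    println(clazz.getMethod("getName").invoke(person)) // prints "Jolan"
    println(clazz.getMethod("getAge").invoke(person))  // prints "42"
}
```

The trade-off, as noted above, is that this bean shape carries no nullability or default-argument information, which is part of why it limits us compared to a real encoder.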
Two updates. First: we can successfully define a UDT for a class, (en/de)coding it using the `ExpressionEncoder` of a different class. No generics are supported, AFAIK, though. So we cannot...
> Wow, man, this sounds stunning! How? Well if it walks like a `case class` and quacks like a `case class` XD. But jokes aside, this still requires building a...
> I mean the fact that you did it without released K2 is a big deal! We actually have every right to say that the future versions of Kotlin API...