Michael Schulte
btw - deriving the schema from the case class `T` works like this:

```scala
List.empty[T].toDS.schema
```
I added a [PR](https://github.com/vincenzobaz/spark-scala3/pull/45/) which gives better error messages than the current code - so the `collect` wouldn't work, because the `as` would fail before that. However I couldn't adapt...
No, I can't get the `map` function working (because that is what I would use to get the example running) - and it has nothing to do with `Option` -...
I tried with `spark.sparkContext.setLogLevel("DEBUG")` but didn't get much more information (except the generated code). But I can't see where exactly it hangs. I remember investigating further for UDFs that lambdas couldn't...
Hey @MrPowers - that sounds like a cool project! However, I'm not sure how much effort we would have to put into that. On the other hand it is...
@jberkel the error message is misleading - not sure how to fix that, but Spark does _not_ support recursive (circular) types: it fails when trying to compile the encoding/decoding of a recursive class...
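For illustration, a minimal sketch of the kind of recursive type that trips up Spark's encoder derivation (the `Node` class here is hypothetical, not from the project):

```scala
// Hypothetical recursive (circular) case class: a node that optionally
// references another node of the same type.
case class Node(value: Int, next: Option[Node])

// Plain Scala builds and traverses such values without any trouble.
val chain: Node = Node(1, Some(Node(2, None)))

// Spark, however, cannot derive an Encoder for Node: schema generation
// would have to recurse into `next: Option[Node]` indefinitely, so
// something like `spark.createDataset(Seq(chain))` fails.
```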
Correction on my part - it is not a compile-time error but a runtime error. So I'm not sure if we can do it in this library. Maybe using...