Peng Cheng
+1, affects me as well. Is there a working patch?
Affects me as well (v3.2.10 on Scala 2.10.6); it only appears when the job is submitted to Spark and uses its class loader. I have full logs for both cases to demonstrate the difference.
+1: Also affects me.
I'm also looking for this feature. +1 on merging and closing.
+1. BTW @seratch: this is not caused by mixing minor versions. My dependencies differ only in patch versions (3.2.11 vs 3.2.10), and I still get the same error.
@Jasper-M are you saying that once I move those definitions into a class or an object instead of a function, the error will disappear?
Yes, I agree it is a free type, but it is also a type alias that can be immediately de-aliased to a dependent type. Theoretically this is not a type variable...
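For anyone else hitting this: here is a minimal sketch of the workaround being discussed, i.e. lifting a type definition out of a method body and into an object. The names `Defs`, `Row`, and `describe` are hypothetical, invented for illustration; the point is only that a type defined inside a function has no stable path and shows up as a free type to runtime reflection, while an object-level alias can be tagged and de-aliased normally.

```scala
import scala.reflect.runtime.universe._

// Hypothetical example: a type alias defined inside a method body is a
// local (free) type as far as scala-reflect is concerned, so TypeTag-based
// code can fail or warn about free types there. Lifting the alias to an
// object gives it a stable prefix, so a TypeTag materializes normally.
object Defs {
  type Row = Map[String, Int] // stable, object-level alias
}

object FreeTypeDemo {
  // Works: Defs.Row has a stable prefix, so an implicit TypeTag is available.
  // dealias resolves the alias to its underlying type.
  def describe[T: TypeTag]: String = typeOf[T].dealias.toString

  def main(args: Array[String]): Unit = {
    println(describe[Defs.Row])
  }
}
```

This matches the suggestion above: the error tends to go away once the definitions live in a class or object rather than in a function, because reflection then sees a path-dependent type instead of a free one.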
Do you have version numbers (Spark? Scala? Java? IPython)? The latest build is for 1.3.1; I haven't maintained it for some time (busy busy). But I'll try to reproduce it. Also,...
Oooooooh, I remember not including 1.6.x in my dependency list (which means I'm using a deprecated API). This will probably take some time to fix; would you mind taking a look at...
Thanks a lot for your suggestion! A few words as my humble opinion: ISpark is built to be simple, lightweight, and as close to spark-shell as possible; it...