ZEPPELIN-3551. Upgrade Scala to 2.11.12
What is this PR for?
This is just to update Scala to 2.11.12 to be consistent with Spark (SPARK-24418).
What type of PR is it?
[Improvement]
Todos
- [ ] - Task
What is the Jira issue?
- https://issues.apache.org/jira/browse/ZEPPELIN-3551
How should this be tested?
- CI pass
Screenshots (if appropriate)
Questions:
- Do the license files need to be updated? No
- Are there breaking changes for older versions? No
- Does this need documentation? No
@Leemoonsoo @felixcheung Could you help review it? Thanks
Will merge it if there are no more comments.
LGTM
Specifically, in SPARK-24418 there are code changes needed that have not been released yet (Spark 2.4.0?).
I tested it against the Spark master branch, and it works. What kind of problems does Spark have on 2.11.12? I notice SPARK-24418 has already been merged.
Something in SparkILoop. Given that, I don't think we can/should upgrade Scala broadly - it might only work for Spark 2.4.0 (unreleased) but not older/current releases.
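For context, a minimal sketch of how the Scala 2.11 REPL is usually embedded (generic ILoop usage, not Zeppelin's or Spark's actual code); SparkILoop extends ILoop and hooks into this startup path, which is roughly the area touched by the REPL changes mentioned below:

```scala
// Minimal sketch (not Zeppelin/Spark code): embedding the Scala 2.11 REPL via
// ILoop. SparkILoop subclasses ILoop, so changes to ILoop's startup/API between
// 2.11.8 and 2.11.12 can break it. Requires scala-compiler on the classpath.
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.ILoop

object ReplSketch {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    settings.usejavacp.value = true // let the REPL see the application classpath
    new ILoop().process(settings)   // start the interactive loop
  }
}
```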
Thanks @felixcheung, I will revert the change in the root pom file and only update the Scala version in the spark module.
? It looks like spark.version is 2.2.0, or 2.3.0 under a profile. As far as I know, SPARK-24418 is fixed only in the (unreleased) master branch.
It is fine to compile with Spark 2.2; the main change is in SparkScala211Interpreter.scala, which is due to Scala REPL API changes in Scala 2.11.12.
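For reviewers of the interpreter change, a hedged sketch of the programmatic REPL surface (IMain) that an interpreter module like SparkScala211Interpreter.scala builds on; the object and snippet below are illustrative only, not the actual Zeppelin code:

```scala
// Illustrative only (not the actual SparkScala211Interpreter code): driving the
// Scala 2.11 compiler's IMain programmatically, the kind of REPL API whose
// details shifted in 2.11.12. Requires scala-compiler on the classpath.
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

object InterpretSketch {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    settings.usejavacp.value = true // expose the host classpath to the interpreter
    val intp = new IMain(settings)
    val res = intp.interpret("""val x = 1 + 1; println("x = " + x)""")
    println(res)                    // Success / Error / Incomplete
    intp.close()
  }
}
```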
Perhaps you could elaborate on why we should do this now since Spark doesn’t actually support it? Is there a reason that I have missed?
This is for the next release of Zeppelin, which I suppose should support the latest Spark 2.4.0, which will use Scala 2.11.12. I just want to catch up with Spark.
Well, ok. Though it might be hard to tell - we are juggling multiple releases (2.1.3, 2.2.2, 2.3.2) - it might take some time to get to 2.4.0.
Anyway, this PR is just for the next release of Spark that supports Scala 2.11.12; I can hold it if you have any concerns.
Hi, @zjffdu , @felixcheung , @Leemoonsoo .
Since Spark 2.1.3/2.2.2/2.3.2 are out and Spark 2.4.0 RC3 has started, can we restart this?
Hey @zjffdu, busy? I can take this over - it looks like there's only a minor conflict.
Thanks @HyukjinKwon, feel free to take this over.