[ZEPPELIN-3551] Upgrade Scala to 2.11.12
What is this PR for?
This updates Scala to 2.11.12 so that Zeppelin stays consistent with Spark (SPARK-24418). This PR takes over and closes #3033.
There was a minor conflict with a change my PR (https://github.com/apache/zeppelin/pull/3206) introduced. That change is compatible with both Scala 2.11.8 and 2.11.12, so it does not need to be modified further.
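As a quick sanity check of the compatibility claim above (a hedged sketch, not part of this PR), the Scala runtime the Spark interpreter actually uses can be printed from a %spark paragraph:

```scala
// Minimal sketch: confirm which Scala runtime the Spark interpreter is running on.
// scala.util.Properties and org.apache.spark.SPARK_VERSION are standard APIs.
println(scala.util.Properties.versionNumberString) // expected "2.11.12" after this upgrade
println(org.apache.spark.SPARK_VERSION)            // the Spark build the interpreter is bound to
```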
What type of PR is it?
[Improvement]
Todos
- [x] - None
What is the Jira issue?
- https://issues.apache.org/jira/browse/ZEPPELIN-3551
How should this be tested?
- CI pass
Screenshots (if appropriate)
Questions:
- Do the license files need updating? No
- Are there breaking changes for older versions? No
- Does this need documentation? No
Hi @HyukjinKwon, any news about this PR? Does it already support the Spark 2.4.0 (Scala 2.11.x) interpreter? I created a new branch that builds a customized Zeppelin Docker image on top of the BDE Spark Docker image, and while testing it on SANSA via SANSA-Notebooks I found that it does not work with Spark 2.4.0 (see the stack trace below):
java.lang.NoSuchMethodException: scala.tools.nsc.interpreter.ILoop.scala$tools$nsc$interpreter$ILoop$$loopPostInit()
at java.lang.Class.getMethod(Class.java:1786)
at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.callMethod(BaseSparkScalaInterpreter.scala:268)
at org.apache.zeppelin.spark.BaseSparkScalaInterpreter.callMethod(BaseSparkScalaInterpreter.scala:262)
at org.apache.zeppelin.spark.SparkScala211Interpreter.open(SparkScala211Interpreter.scala:84)
at org.apache.zeppelin.spark.NewSparkInterpreter.open(NewSparkInterpreter.java:102)
at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:62)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:617)
at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
Is there a plan to add that support soon in the next release, or do we have to downgrade and use Spark 2.3.x instead?
Looking forward to hearing from you.
Best regards,
Spark 2.4 support was added in https://github.com/apache/zeppelin/pull/3206, which addresses exactly the issue you faced; see this line: https://github.com/apache/zeppelin/pull/3206/files#diff-b935226b71a3cfbabfb5324b9264c430L84. It will be available in the next release of Zeppelin.
I'm not aware of Zeppelin's release plan since I'm just one of the contributors. For now, Spark should be downgraded as far as I can tell.
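For context, the top frame of the stack trace above is a reflective lookup: the Scala 2.11 interpreter resolves ILoop's loopPostInit by its mangled bytecode name, and that name no longer exists in the ILoop shipped with Scala 2.11.12, so Class.getMethod throws NoSuchMethodException. A rough sketch of that pattern (not Zeppelin's exact code) is:

```scala
// Hedged sketch of the failure mode: looking up a compiler-internal method by its
// mangled name is brittle across Scala patch releases.
import scala.tools.nsc.interpreter.ILoop

def callLoopPostInit(iloop: ILoop): Unit = {
  // This mangled name exists in the ILoop bundled with Scala 2.11.8, but Scala 2.11.12
  // reworked ILoop's startup, so getMethod throws java.lang.NoSuchMethodException --
  // the top frame of the stack trace above.
  val m = iloop.getClass.getMethod("scala$tools$nsc$interpreter$ILoop$$loopPostInit")
  m.invoke(iloop)
}
```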
Many thanks for your response! For now I will downgrade the Spark version and look forward to the new release of Zeppelin.
Best regards,
Hmm, not sure why. This is the error, but I don't see any recent changes that might have broken it:
09:52:10,841 INFO org.apache.zeppelin.notebook.Paragraph:381 - Run paragraph [paragraph_id: paragraph_1546509130838_525871799, interpreter: org.apache.zeppelin.spark.SparkInterpreter, note_id: 2DZC2NGPW, user: anonymous]
09:52:10,841 DEBUG org.apache.zeppelin.interpreter.remote.RemoteInterpreter:206 - st:
z.run(1)
then the test timed out
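For reference, z in the log above is Zeppelin's ZeppelinContext, and z.run(1) triggers the paragraph at index 1 of the current note, so the timeout points at the downstream paragraph (or the interpreter opening it) rather than the call itself. A rough sketch of that kind of paragraph (indices and ids are hypothetical):

```scala
// Hedged sketch of a %spark paragraph driving another paragraph via ZeppelinContext.
z.put("input", 42)  // share a value with other paragraphs
z.run(1)            // trigger the paragraph at index 1 of this note
```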
Yup.. let me take a look