spark-notebook
Failures in stages
java.lang.NullPointerException
at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
at org.apache.hadoop.util.Shell.run(Shell.java:418)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:873)
at org.apache.hadoop.fs.FileUtil.chmod(FileUtil.java:853)
at org.apache.spark.util.Utils$.fetchFile(Utils.scala:381)
at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:405)
at org.apache.spark.executor.Executor$$anonfun$org$apache$spark$executor$Executor$$updateDependencies$5.apply(Executor.scala:397)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:772)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:98)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
This is a result of Spark relying on Hadoop utilities even when running in local mode. The fix is to put a file named winutils.exe in the Hadoop bin folder (I think actually running Hadoop also requires hadoop.dll or similar).
See this: http://stackoverflow.com/questions/27201505/hadoop-exception-in-thread-main-java-lang-nullpointerexception
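As a minimal sketch (the C:\hadoop path below is only an assumption, adjust it to wherever you placed winutils.exe): Hadoop's Shell helper resolves the Hadoop home from either the hadoop.home.dir system property or the HADOOP_HOME environment variable, and then expects bin\winutils.exe underneath it, so setting the property before the first Spark/Hadoop call should avoid the NPE.

// Sketch: set hadoop.home.dir before Spark touches any Hadoop code.
// Assumed layout: C:\hadoop\bin\winutils.exe
object WinutilsSetup {
  def main(args: Array[String]): Unit = {
    if (sys.props.get("hadoop.home.dir").isEmpty && sys.env.get("HADOOP_HOME").isEmpty) {
      System.setProperty("hadoop.home.dir", "C:\\hadoop")
    }
    // ... create the SparkContext / start the notebook session afterwards
  }
}

Passing -Dhadoop.home.dir=C:\hadoop on the JVM command line should have the same effect, as long as it is set before the first Hadoop class loads.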
Where should I put winutils.exe, and how do I set the hadoop.home.dir property (-D?)
So, I ran with -Dhadoop.home.dir=c:\path\above\bin
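If it helps to double-check that the flag was picked up, here is a small sketch (not part of spark-notebook) that verifies the resolved directory actually contains bin\winutils.exe:

// Sanity check: does ${hadoop.home.dir}\bin\winutils.exe exist?
object CheckWinutils {
  def main(args: Array[String]): Unit = {
    val home = sys.props.getOrElse("hadoop.home.dir", sys.env.getOrElse("HADOOP_HOME", ""))
    val winutils = new java.io.File(new java.io.File(home, "bin"), "winutils.exe")
    println(s"hadoop home = '$home', winutils.exe present: ${winutils.isFile}")
  }
}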
Okay, did that work?
If so, I'd love to add your info about that to the troubleshooting section :-D
yes
@ittayd Could you please explain how you set hadoop.home.dir?