docker-cloudera-quickstart

spark-shell error

Open · scheung38 opened this issue on Sep 2, 2016 · 3 comments

spark-shell

16/09/02 14:35:51 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/02 14:35:51 INFO spark.SecurityManager: Changing view acls to: root
16/09/02 14:35:51 INFO spark.SecurityManager: Changing modify acls to: root
16/09/02 14:35:51 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/09/02 14:35:52 INFO spark.HttpServer: Starting HTTP Server
16/09/02 14:35:52 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/09/02 14:35:52 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:42113
16/09/02 14:35:52 INFO util.Utils: Successfully started service 'HTTP class server' on port 42113.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.3.0
      /_/

Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_111)
Type in expressions to have them evaluated.
Type :help for more information.
16/09/02 14:36:05 INFO spark.SparkContext: Running Spark version 1.3.0
16/09/02 14:36:05 INFO spark.SecurityManager: Changing view acls to: root
16/09/02 14:36:05 INFO spark.SecurityManager: Changing modify acls to: root
16/09/02 14:36:05 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/09/02 14:36:06 INFO slf4j.Slf4jLogger: Slf4jLogger started
16/09/02 14:36:06 INFO Remoting: Starting remoting
16/09/02 14:36:06 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@3dc57b8ad2c9:43978]
16/09/02 14:36:06 INFO Remoting: Remoting now listens on addresses: [akka.tcp://sparkDriver@3dc57b8ad2c9:43978]
16/09/02 14:36:06 INFO util.Utils: Successfully started service 'sparkDriver' on port 43978.
16/09/02 14:36:06 INFO spark.SparkEnv: Registering MapOutputTracker
16/09/02 14:36:06 INFO spark.SparkEnv: Registering BlockManagerMaster
16/09/02 14:36:07 INFO storage.DiskBlockManager: Created local directory at /tmp/spark-7e93485c-83ac-4a27-a0a2-06a2606b5da6/blockmgr-5dcd6dae-1aca-4026-88ff-779ecd7d4f6b
16/09/02 14:36:07 INFO storage.MemoryStore: MemoryStore started with capacity 265.4 MB
16/09/02 14:36:07 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-56469c93-d69a-494d-aae4-cc6f71761887/httpd-aa5d16a6-d820-4d35-ad98-6a6c6d07a200
16/09/02 14:36:07 INFO spark.HttpServer: Starting HTTP Server
16/09/02 14:36:07 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/09/02 14:36:07 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:46051
16/09/02 14:36:07 INFO util.Utils: Successfully started service 'HTTP file server' on port 46051.
16/09/02 14:36:07 INFO spark.SparkEnv: Registering OutputCommitCoordinator
16/09/02 14:36:07 INFO server.Server: jetty-8.y.z-SNAPSHOT
16/09/02 14:36:07 INFO server.AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/09/02 14:36:07 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
16/09/02 14:36:07 INFO ui.SparkUI: Started SparkUI at http://3dc57b8ad2c9:4040
16/09/02 14:36:07 INFO executor.Executor: Starting executor ID <driver> on host localhost
16/09/02 14:36:08 INFO executor.Executor: Using REPL class URI: http://172.17.0.2:42113
16/09/02 14:36:08 INFO util.AkkaUtils: Connecting to HeartbeatReceiver: akka.tcp://sparkDriver@3dc57b8ad2c9:43978/user/HeartbeatReceiver
16/09/02 14:36:08 INFO netty.NettyBlockTransferService: Server created on 42516
16/09/02 14:36:08 INFO storage.BlockManagerMaster: Trying to register BlockManager
16/09/02 14:36:08 INFO storage.BlockManagerMasterActor: Registering block manager localhost:42516 with 265.4 MB RAM, BlockManagerId(<driver>, localhost, 42516)
16/09/02 14:36:08 INFO storage.BlockManagerMaster: Registered BlockManager
16/09/02 14:36:10 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
java.io.FileNotFoundException: /user/spark/applicationHistory/local-1472826967921.inprogress (No such file or directory)
    at java.io.FileOutputStream.open(Native Method)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:221)
    at java.io.FileOutputStream.<init>(FileOutputStream.java:110)
    at org.apache.spark.scheduler.EventLoggingListener.start(EventLoggingListener.scala:117)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:399)
    at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1016)
    at $iwC$$iwC.<init>(<console>:9)
    at $iwC.<init>(<console>:18)
    at <init>(<console>:20)
    at .<init>(<console>:24)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:973)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
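[Editor's note] The FileNotFoundException above comes from Spark's event logging: spark.eventLog.dir points at /user/spark/applicationHistory, and since the path carries no scheme it is opened as a plain local file (note the java.io.FileOutputStream frames), which does not exist inside the container. A minimal workaround sketch, assuming nothing else depends on the history files:

```sh
# Sketch only: the path is taken from the stack trace; flags are standard Spark 1.x.
# Because the path is opened as a local file, creating the directory may be enough:
mkdir -p /user/spark/applicationHistory

# Alternatively, start the shell with event logging disabled for this session:
spark-shell --conf spark.eventLog.enabled=false
```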

java.lang.NoClassDefFoundError: org/apache/hadoop/hive/conf/HiveConf
    at java.lang.Class.getDeclaredConstructors0(Native Method)
    at java.lang.Class.privateGetDeclaredConstructors(Class.java:2595)
    at java.lang.Class.getConstructor0(Class.java:2895)
    at java.lang.Class.getConstructor(Class.java:1731)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1026)
    at $iwC$$iwC.<init>(<console>:9)
    at $iwC.<init>(<console>:18)
    at <init>(<console>:20)
    at .<init>(<console>:24)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:130)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:973)
    at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
    at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
    at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:990)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)
    at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
    at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)
    at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)
    at org.apache.spark.repl.Main$.main(Main.scala:31)
    at org.apache.spark.repl.Main.main(Main.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 52 more
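[Editor's note] The NoClassDefFoundError is a separate problem: the REPL's createSQLContext tries to build a Hive-backed context, but org.apache.hadoop.hive.conf.HiveConf is not on the driver classpath. The two "not found: value sqlContext" errors that follow are just downstream of this failure, since sqlContext was never defined. A hedged workaround sketch; /usr/lib/hive/lib is the conventional CDH location and has not been verified against this particular image:

```sh
# Assumption: the CDH Hive packages are installed under /usr/lib/hive.
# HiveConf ships in hive-common*.jar, so first check that it is actually there:
ls /usr/lib/hive/lib/hive-common*.jar

# If so, add the Hive jars to the driver classpath when launching the shell:
spark-shell --driver-class-path "/usr/lib/hive/lib/*"
```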

<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql
              ^

scala>

scheung38 · Sep 02 '16 14:09

Did you build the latest version yourself, or use the Docker Hub version? What steps can I take to reproduce this?

caioquirino · Sep 02 '16 15:09

I pulled this version with docker pull. Then, when I ran spark-shell, it produced the output shown at the top.

scheung38 · Sep 02 '16 16:09
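[Editor's note] The reproduction described above amounts to the following; the image name is inferred from the repository title, so adjust it if the published Docker Hub tag differs:

```sh
# Image name assumed from the repo title; not verified against Docker Hub.
docker pull caioquirino/docker-cloudera-quickstart
docker run -i -t caioquirino/docker-cloudera-quickstart bash

# Inside the container; both errors appear while the shell initializes:
spark-shell
```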

Ahh ok. It would be better if we hard-coded the Cloudera version...

caioquirino · Nov 12 '16 21:11
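[Editor's note] A hypothetical illustration of what pinning could look like; the actual Dockerfile is not shown here, so the package name and version string below are placeholders only:

```sh
# Placeholder sketch: install an exact CDH package version during the image
# build instead of whatever the Cloudera repository currently serves.
CDH_SPARK_VERSION="1.3.0+cdh5.4.2+22-1.cdh5.4.2.p0.53"   # hypothetical pin
apt-get install -y spark-core="${CDH_SPARK_VERSION}"
```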