CaffeOnSpark

There are some problems when training the example

Open jq460494839 opened this issue 8 years ago • 4 comments

When I run this:

spark@master:~$ spark-submit --master yarn --deploy-mode cluster \
    --num-executors ${SPARK_WORKER_INSTANCES} \
    --files ${CAFFE_ON_SPARK}/data/lenet_memory_solver.prototxt,${CAFFE_ON_SPARK}/data/lenet_memory_train_test.prototxt \
    --conf spark.driver.extraLibraryPath="${LD_LIBRARY_PATH}" \
    --conf spark.executorEnv.LD_LIBRARY_PATH="${LD_LIBRARY_PATH}" \
    --class com.yahoo.ml.caffe.CaffeOnSpark \
    ${CAFFE_ON_SPARK}/caffe-grid/target/caffe-grid-0.1-SNAPSHOT-jar-with-dependencies.jar \
    -train \
    -features accuracy,loss -label label \
    -conf lenet_memory_solver.prototxt \
    -devices ${DEVICES} \
    -connection ethernet \
    -model hdfs:///mnist.model \
    -output hdfs:///mnist_features_result

Then it reports this:

16/12/08 13:54:06 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032 16/12/08 13:54:06 INFO yarn.Client: Requesting a new application from cluster with 4 NodeManagers 16/12/08 13:54:06 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container) 16/12/08 13:54:06 INFO yarn.Client: Will allocate AM container, with 1408 MB memory including 384 MB overhead 16/12/08 13:54:06 INFO yarn.Client: Setting up container launch context for our AM 16/12/08 13:54:06 INFO yarn.Client: Setting up the launch environment for our AM container 16/12/08 13:54:06 INFO yarn.Client: Preparing resources for our AM container 16/12/08 13:54:06 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME. 16/12/08 13:54:08 INFO yarn.Client: Uploading resource file:/home/spark/work/spark/spark-04cb9995-7f78-48c8-a7b0-9d371737e259/__spark_libs__8148212977399012375.zip -> hdfs://master:9000/user/spark/.sparkStaging/application_1481173656093_0005/__spark_libs__8148212977399012375.zip 16/12/08 13:54:10 INFO yarn.Client: Uploading resource file:/home/spark/CaffeOnSpark/caffe-grid/target/caffe-grid-0.1-SNAPSHOT-jar-with-dependencies.jar -> hdfs://master:9000/user/spark/.sparkStaging/application_1481173656093_0005/caffe-grid-0.1-SNAPSHOT-jar-with-dependencies.jar 16/12/08 13:54:10 INFO yarn.Client: Uploading resource file:/home/spark/CaffeOnSpark/data/lenet_memory_solver.prototxt -> hdfs://master:9000/user/spark/.sparkStaging/application_1481173656093_0005/lenet_memory_solver.prototxt 16/12/08 13:54:10 INFO yarn.Client: Uploading resource file:/home/spark/CaffeOnSpark/data/lenet_memory_train_test.prototxt -> hdfs://master:9000/user/spark/.sparkStaging/application_1481173656093_0005/lenet_memory_train_test.prototxt 16/12/08 13:54:10 INFO yarn.Client: Uploading resource 
file:/home/spark/work/spark/spark-04cb9995-7f78-48c8-a7b0-9d371737e259/__spark_conf__7301206583520651716.zip -> hdfs://master:9000/user/spark/.sparkStaging/application_1481173656093_0005/spark_conf.zip 16/12/08 13:54:10 INFO spark.SecurityManager: Changing view acls to: spark 16/12/08 13:54:10 INFO spark.SecurityManager: Changing modify acls to: spark 16/12/08 13:54:10 INFO spark.SecurityManager: Changing view acls groups to: 16/12/08 13:54:10 INFO spark.SecurityManager: Changing modify acls groups to: 16/12/08 13:54:10 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(spark); groups with view permissions: Set(); users with modify permissions: Set(spark); groups with modify permissions: Set() 16/12/08 13:54:10 INFO yarn.Client: Submitting application application_1481173656093_0005 to ResourceManager 16/12/08 13:54:10 INFO impl.YarnClientImpl: Submitted application application_1481173656093_0005 16/12/08 13:54:11 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:11 INFO yarn.Client: client token: N/A diagnostics: N/A ApplicationMaster host: N/A ApplicationMaster RPC port: -1 queue: default start time: 1481176450695 final status: UNDEFINED tracking URL: http://master:8088/proxy/application_1481173656093_0005/ user: spark 16/12/08 13:54:12 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:13 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:14 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:15 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:16 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:17 INFO yarn.Client: Application report for application_1481173656093_0005 
(state: ACCEPTED) 16/12/08 13:54:18 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:19 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:20 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:21 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:22 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:23 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:24 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:25 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:26 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:27 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:28 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:29 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:30 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:31 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:32 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:33 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:34 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:35 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:36 INFO yarn.Client: 
Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:37 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:38 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:39 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:40 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:41 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:42 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:43 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:44 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:45 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:46 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:47 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:48 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:49 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:50 INFO yarn.Client: Application report for application_1481173656093_0005 (state: ACCEPTED) 16/12/08 13:54:51 INFO yarn.Client: Application report for application_1481173656093_0005 (state: FAILED) 16/12/08 13:54:51 INFO yarn.Client: client token: N/A diagnostics: Application application_1481173656093_0005 failed 2 times due to AM Container for appattempt_1481173656093_0005_000002 exited with exitCode: -1 For more detailed output, check application tracking 
page:http://master:8088/cluster/app/application_1481173656093_0005Then, click on links to logs of each attempt. Diagnostics: File /tmp/hadoop-spark/nm-local-dir/usercache/spark/appcache/application_1481173656093_0005/container_1481173656093_0005_02_000001 does not exist Failing this attempt. Failing the application. ApplicationMaster host: N/A ApplicationMaster RPC port: -1 queue: default start time: 1481176450695 final status: FAILED tracking URL: http://master:8088/cluster/app/application_1481173656093_0005 user: spark 16/12/08 13:54:51 INFO yarn.Client: Deleting staging directory hdfs://master:9000/user/spark/.sparkStaging/application_1481173656093_0005 Exception in thread "main" org.apache.spark.SparkException: Application application_1481173656093_0005 finished with failed status at org.apache.spark.deploy.yarn.Client.run(Client.scala:1132) at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1178) at org.apache.spark.deploy.yarn.Client.main(Client.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736) at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185) at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210) at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124) at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 16/12/08 13:54:51 INFO util.ShutdownHookManager: Shutdown hook called 16/12/08 13:54:51 INFO util.ShutdownHookManager: Deleting directory /home/spark/work/spark/spark-04cb9995-7f78-48c8-a7b0-9d371737e259

What's wrong?

jq460494839 avatar Dec 08 '16 05:12 jq460494839

Please use "yarn logs -applicationId" to get the more detailed executor logs.
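For example (the application ID below is taken from the failure output above; substitute your own, and note that log aggregation must be enabled on the cluster for this to work):

```shell
# Fetch the aggregated container logs for the failed application.
yarn logs -applicationId application_1481173656093_0005 > app.log

# The real failure is usually in the executor/AM stderr; search for it.
grep -i -B 2 -A 10 "exception" app.log | less
```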

junshi15 avatar Dec 21 '16 21:12 junshi15

YARN executor launch context: env: CLASSPATH -> {{PWD}}<CPS>{{PWD}}/spark_conf<CPS>{{PWD}}/spark_libs/<CPS>$HADOOP_CONF_DIR<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/<CPS>$HADOOP_COMMON_HOME/share/hadoop/common/lib/<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/<CPS>$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/<CPS>$HADOOP_YARN_HOME/share/hadoop/yarn/lib/<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/<CPS>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/* SPARK_LOG_URL_STDERR -> http://master:8042/node/containerlogs/container_1482386109230_0001_02_000003/spark/stderr?start=-4096 SPARK_YARN_STAGING_DIR -> hdfs://master:9000/user/spark/.sparkStaging/application_1482386109230_0001 SPARK_USER -> spark SPARK_YARN_MODE -> true LD_LIBRARY_PATH -> /usr/local/lib: SPARK_LOG_URL_STDOUT -> http://master:8042/node/containerlogs/container_1482386109230_0001_02_000003/spark/stdout?start=-4096

command: {{JAVA_HOME}}/bin/java -server -Xmx1024m -Djava.io.tmpdir={{PWD}}/tmp '-Dspark.driver.port=34334' '-Dspark.ui.port=0' -Dspark.yarn.app.container.log.dir=<LOG_DIR> -XX:MaxPermSize=256m -XX:OnOutOfMemoryError='kill %p' org.apache.spark.executor.CoarseGrainedExecutorBackend --driver-url spark://[email protected]:34334 --executor-id 2 --hostname master --cores 1 --app-id application_1482386109230_0001 --user-class-path file:$PWD/app.jar 1> <LOG_DIR>/stdout 2> <LOG_DIR>/stderr

===============================================================================

16/12/22 13:56:22 INFO impl.ContainerManagementProtocolProxy: Opening proxy : master:38420 16/12/22 13:56:22 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(null) (172.16.49.102:55406) with ID 1 16/12/22 13:56:22 INFO storage.BlockManagerMasterEndpoint: Registering block manager master:41234 with 366.3 MB RAM, BlockManagerId(1, master, 41234) 16/12/22 13:56:23 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Registered executor NettyRpcEndpointRef(null) (172.16.49.102:55410) with ID 2 16/12/22 13:56:23 INFO storage.BlockManagerMasterEndpoint: Registering block manager master:40744 with 366.3 MB RAM, BlockManagerId(2, master, 40744) 16/12/22 13:56:24 INFO cluster.YarnClusterSchedulerBackend: SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 1.0 16/12/22 13:56:24 INFO cluster.YarnClusterScheduler: YarnClusterScheduler.postStartHook done 16/12/22 13:56:25 INFO caffe.DataSource$: Source data layer:0 16/12/22 13:56:25 INFO caffe.LMDB: Batch size:64 16/12/22 13:56:25 INFO caffe.DataSource$: Source data layer:1 16/12/22 13:56:25 INFO caffe.LMDB: Batch size:100 16/12/22 13:56:25 INFO caffe.CaffeOnSpark: interleave 16/12/22 13:56:26 ERROR yarn.ApplicationMaster: User class threw exception: java.lang.UnsatisfiedLinkError: no lmdbjni in java.library.path java.lang.UnsatisfiedLinkError: no lmdbjni in java.library.path at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886) at java.lang.Runtime.loadLibrary0(Runtime.java:849) at java.lang.System.loadLibrary(System.java:1088) at com.yahoo.ml.caffe.LmdbRDD$.com$yahoo$ml$caffe$LmdbRDD$$loadLibrary(LmdbRDD.scala:244) at com.yahoo.ml.caffe.LmdbRDD.com$yahoo$ml$caffe$LmdbRDD$$openDB(LmdbRDD.scala:198) at com.yahoo.ml.caffe.LmdbRDD.getPartitions(LmdbRDD.scala:45) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246) at 
scala.Option.getOrElse(Option.scala:121) at org.apache.spark.rdd.RDD.partitions(RDD.scala:246) at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:248) at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:246) at scala.Option.getOrElse(Option.scala:121) at org.apache.spark.rdd.RDD.partitions(RDD.scala:246) at org.apache.spark.SparkContext.runJob(SparkContext.scala:1913) at org.apache.spark.rdd.RDD.count(RDD.scala:1134) at com.yahoo.ml.caffe.CaffeOnSpark.trainWithValidation(CaffeOnSpark.scala:257) at com.yahoo.ml.caffe.CaffeOnSpark$.main(CaffeOnSpark.scala:42) at com.yahoo.ml.caffe.CaffeOnSpark.main(CaffeOnSpark.scala) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:606) at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627) 16/12/22 13:56:26 INFO yarn.ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.UnsatisfiedLinkError: no lmdbjni in java.library.path) 16/12/22 13:56:26 INFO spark.SparkContext: Invoking stop() from shutdown hook 16/12/22 13:56:26 INFO server.ServerConnector: Stopped ServerConnector@bd50c47{HTTP/1.1}{0.0.0.0:0} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@30028ac8{/stages/stage/kill,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@54748db1{/api,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@1cf3f568{/,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3b7e080{/static,null,UNAVAILABLE} 16/12/22 13:56:26 INFO 
handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3a018bc{/executors/threadDump/json,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@180f0b13{/executors/threadDump,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2e13d527{/executors/json,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@63e47e1d{/executors,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4d122b7a{/environment/json,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@4013b21b{/environment,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@6c37bfd6{/storage/rdd/json,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@62513d0{/storage/rdd,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@11f759bf{/storage/json,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@27a82c58{/storage,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@30726a69{/stages/pool/json,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@43117a45{/stages/pool,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@496b81d{/stages/stage/json,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@748db9a3{/stages/stage,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@c02720d{/stages/json,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped 
o.s.j.s.ServletContextHandler@20142abf{/stages,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@15152879{/jobs/job/json,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@27fa1b48{/jobs/job,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@2eaa32ff{/jobs/json,null,UNAVAILABLE} 16/12/22 13:56:26 INFO handler.ContextHandler: Stopped o.s.j.s.ServletContextHandler@3a43a39b{/jobs,null,UNAVAILABLE} 16/12/22 13:56:26 INFO ui.SparkUI: Stopped Spark web UI at http://172.16.49.102:38425 16/12/22 13:56:26 INFO yarn.YarnAllocator: Driver requested a total number of 0 executor(s). 16/12/22 13:56:26 INFO cluster.YarnClusterSchedulerBackend: Shutting down all executors 16/12/22 13:56:26 INFO cluster.YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down 16/12/22 13:56:26 INFO cluster.SchedulerExtensionServices: Stopping SchedulerExtensionServices (serviceOption=None, services=List(), started=false) 16/12/22 13:56:26 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped! 16/12/22 13:56:26 INFO memory.MemoryStore: MemoryStore cleared 16/12/22 13:56:26 INFO storage.BlockManager: BlockManager stopped 16/12/22 13:56:26 INFO storage.BlockManagerMaster: BlockManagerMaster stopped 16/12/22 13:56:26 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped! 16/12/22 13:56:26 INFO spark.SparkContext: Successfully stopped SparkContext 16/12/22 13:56:26 INFO yarn.ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: java.lang.UnsatisfiedLinkError: no lmdbjni in java.library.path) 16/12/22 13:56:26 INFO impl.AMRMClientImpl: Waiting for application to be successfully unregistered. 
16/12/22 13:56:26 INFO yarn.ApplicationMaster: Deleting staging directory hdfs://master:9000/user/spark/.sparkStaging/application_1482386109230_0001 16/12/22 13:56:26 INFO util.ShutdownHookManager: Shutdown hook called 16/12/22 13:56:26 INFO util.ShutdownHookManager: Deleting directory /tmp/hadoop-spark/nm-local-dir/usercache/spark/appcache/application_1482386109230_0001/spark-2b81d232-b553-4068-b5f3-213a34e0f7bb End of LogType:stderr

LogType:stdout Log Upload Time:Thu Dec 22 13:56:27 +0800 2016 LogLength:0 Log Contents: End of LogType:stdout

jq460494839 avatar Dec 22 '16 06:12 jq460494839

It looks like your program could not find lmdbjni.so: either the library is missing, or LD_LIBRARY_PATH is not set properly on the executor nodes.
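A quick way to check both possibilities (the library path below is where the CaffeOnSpark build typically places the native libraries; adjust it to your checkout):

```shell
# 1. Verify the native library actually exists on each node.
find ${CAFFE_ON_SPARK} -name "liblmdbjni*" 2>/dev/null

# 2. Make sure LD_LIBRARY_PATH includes that directory, then pass it to
#    both the driver and the executors at submit time.
export LD_LIBRARY_PATH=${CAFFE_ON_SPARK}/caffe-distri/distribute/lib:${LD_LIBRARY_PATH}
spark-submit --master yarn --deploy-mode cluster \
    --conf spark.driver.extraLibraryPath="${LD_LIBRARY_PATH}" \
    --conf spark.executorEnv.LD_LIBRARY_PATH="${LD_LIBRARY_PATH}" \
    ...
```

In cluster deploy mode the driver also runs inside YARN, so the environment must be correct on every node, not just the one you submit from.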

junshi15 avatar Dec 22 '16 13:12 junshi15

I've commented regarding this just now here. On Ubuntu-based systems it is wise to add the LD_LIBRARY_PATH exports to the .bashrc file, just so we don't forget them next time :)
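Something along these lines appended to ~/.bashrc (the paths are examples; point them at your actual CaffeOnSpark checkout and, if applicable, your CUDA install):

```shell
# CaffeOnSpark native libraries (lmdbjni, caffe, etc.)
export CAFFE_ON_SPARK=/home/spark/CaffeOnSpark
export LD_LIBRARY_PATH=${CAFFE_ON_SPARK}/caffe-distri/distribute/lib:${CAFFE_ON_SPARK}/caffe-public/distribute/lib:${LD_LIBRARY_PATH}

# For GPU builds, the CUDA libraries must be on the path as well.
export LD_LIBRARY_PATH=/usr/local/cuda/lib64:${LD_LIBRARY_PATH}
```

Remember that .bashrc only affects interactive shells on that machine; for YARN containers you still need to forward the value via spark.executorEnv.LD_LIBRARY_PATH as shown in the original spark-submit command.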

arundasan91 avatar Mar 09 '17 13:03 arundasan91