BigDL-2.x
Error: Warning: Skip remote jar analytics-zoo
Hello,
I am trying to make Analytics-Zoo work with my cluster, and I specified all the parameters in the sparkmagic/config.json file, including extraClassPath, spark.jars, pyFiles, etc.
The problem is that the session does not even start, due to the following error:
Warning: Skip remote jar /analytics-zoo-bigdl_0.12.2-spark_2.3.1-0.10.0-jar-with-dependencies.jar.
I am able to load BigDL with the same method, though. Can you help me fix the issue?
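For reference, a minimal sketch of the relevant part of my sparkmagic config.json (the HDFS path and the extraClassPath entries below are placeholders standing in for my actual cluster paths, which I have cut out):

```json
{
  "session_configs": {
    "jars": [
      "hdfs:///user/someuser/jars/analytics-zoo-bigdl_0.12.2-spark_2.3.1-0.10.0-jar-with-dependencies.jar"
    ],
    "conf": {
      "spark.driver.extraClassPath": "analytics-zoo-bigdl_0.12.2-spark_2.3.1-0.10.0-jar-with-dependencies.jar",
      "spark.executor.extraClassPath": "analytics-zoo-bigdl_0.12.2-spark_2.3.1-0.10.0-jar-with-dependencies.jar"
    }
  }
}
```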
Warning: Skip remote jar
Is the jar stored in a distributed file system (e.g., HDFS)? I think you can only do that in cluster mode for Spark.
Thanks @jason-dai for your reply. Actually, I am running notebooks on JupyterLab as Spark applications on YARN.
I tried both local and HDFS paths; the weird thing is that the problem occurs only with this JAR, as I am able to run BigDL with more or less the same configuration.
In the Spark docs, a bare / path is not an option; please try hdfs:// or file://. application-jar: Path to a bundled jar including your application and all dependencies. The URL must be globally visible inside of your cluster, for instance an hdfs:// path or a file:// path that is present on all nodes. https://spark.apache.org/docs/2.3.1/submitting-applications.html
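The distinction the docs draw can be illustrated with a short sketch (my own illustration, not Spark's actual code): a jar path with no URL scheme is ambiguous on a cluster, because there is no guarantee every node can resolve it, while hdfs:// and file:// URLs name a location that is globally visible.

```python
from urllib.parse import urlparse

def jar_uri_kind(path: str) -> str:
    """Classify a jar path the way the Spark docs describe application-jar URLs.

    Illustrative helper only, not Spark's implementation: a path with no
    scheme is ambiguous on a cluster, while hdfs:// and file:// URLs are
    resolvable by every node.
    """
    scheme = urlparse(path).scheme
    if scheme == "":
        return "no scheme (ambiguous on a cluster)"
    return f"scheme '{scheme}' (resolvable by every node)"

print(jar_uri_kind("/analytics-zoo-bigdl_0.12.2-spark_2.3.1-0.10.0-jar-with-dependencies.jar"))
print(jar_uri_kind("hdfs:///user/hadoop/jars/analytics-zoo.jar"))
print(jar_uri_kind("file:///opt/jars/analytics-zoo.jar"))
```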
@glorysdj I know; the base path I used starts with hdfs://. I simply had to cut the full path, as it contained my workplace directories.
Cool. @guidiandrea Have you fixed the issue?