node2vec
Exception in thread "main" java.lang.NoSuchMethodError
127:node2vec ani.das$ /Users/ani.das/spark-2.4.0-bin-hadoop2.7/bin/spark-submit --class com.navercorp.Main ./node2vec_spark/target/node2vec-0.0.1-SNAPSHOT.jar --cmd randomwalk --p 10.0 --q 10.0 --walkLength 50 --numWalks 5 --weighted --window 6 --input /Users/ani.das/Projects/phase2/data/runa/tf_idf_1.txt --output /Users/ani.das/Projects/phase2/data/runa/emb/tf_idf_1.emb
2019-01-26 22:29:11 WARN Utils:66 - Your hostname, 127.0.0.1 resolves to a loopback address: 127.0.0.1; using 192.168.1.73 instead (on interface en0)
2019-01-26 22:29:11 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2019-01-26 22:29:11 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaMirrors$JavaMirror;
at com.navercorp.Main$Params.&lt;init&gt;
@anirband You should change your Spark version. I also found that this program does not work with Spark 2.4.0; it works well with Spark 2.3.0 and 2.3.2.
Any suggestion on what to do with Spark 2.4.0?
I've tried Spark 2.3.2, 2.4.0, and 2.1, and all of them give me this error. This package seems effectively unusable.
Had the same issue. I managed to solve it by running a Spark version that supports Scala 2.10. In my case I was able to run it with Spark 1.6.0.
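The `NoSuchMethodError` on `scala.reflect.api.JavaUniverse.runtimeMirror` is the classic symptom of a Scala binary-version mismatch between the jar and the Spark runtime. A quick way to check is `spark-submit --version`, which prints a line like "Using Scala version 2.11.8, ..."; the minor-version prefix of that line is what must match the Scala suffix the jar was built for. A minimal sketch of extracting it (the `VERSION_LINE` string here is an illustrative sample, not output from your machine):

```shell
# Sample line as printed by `spark-submit --version` (illustrative):
VERSION_LINE='Using Scala version 2.11.8, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_181'

# Extract the Scala binary version (major.minor) that the jar must be built against:
SCALA_BIN=$(printf '%s' "$VERSION_LINE" | sed -n 's/.*Scala version \([0-9][0-9]*\.[0-9][0-9]*\).*/\1/p')
echo "$SCALA_BIN"   # prints 2.11
```

If this prints `2.11` but the jar was compiled against Scala 2.10 (or vice versa), you will see exactly this error at class-loading time.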
Maybe you should change the other dependency and plugin versions in pom.xml to make them work with the newer Spark version; you can view my pom.xml.
Can you show me your pom.xml, since I encountered the same error? Thanks!
@q463746583 , you can try this:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.navercorp</groupId>
  <artifactId>node2vec</artifactId>
  <packaging>jar</packaging>
  <version>0.1.2-SNAPSHOT</version>
  <name>node2vec_spark</name>
  <url>https://github.com/superPershing</url>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <shadedClassifier>bin</shadedClassifier>
    <maven-shade-plugin.version>3.2.1</maven-shade-plugin.version>
    <exec-maven-plugin.version>1.6.0</exec-maven-plugin.version>
    <java.version>1.8</java.version>
    <scala.binary.version>2.11</scala.binary.version>
  </properties>
  <build>
    <plugins>
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <version>2.15.2</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-dependency-plugin</artifactId>
        <version>3.1.1</version>
        <executions>
          <execution>
            <id>copy-dependencies</id>
            <phase>package</phase>
            <goals>
              <goal>copy-dependencies</goal>
            </goals>
            <configuration>
              <outputDirectory>${project.build.directory}/lib</outputDirectory>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>3.2.1</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.8.0</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
          <encoding>UTF-8</encoding>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <configuration>
          <skip>false</skip>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>2.7.3</version>
    </dependency>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.binary.version}.8</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_${scala.binary.version}</artifactId>
      <version>2.3.2</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-mllib_${scala.binary.version}</artifactId>
      <version>2.3.2</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>com.github.scopt</groupId>
      <artifactId>scopt_${scala.binary.version}</artifactId>
      <version>3.3.0</version>
      <exclusions>
        <exclusion>
          <groupId>org.scala-lang</groupId>
          <artifactId>scala-library</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>27.0.1-jre</version>
    </dependency>
  </dependencies>
</project>
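The key line in this pom is `<scala.binary.version>2.11</scala.binary.version>`: Maven interpolates it into the Spark artifact IDs, so every Spark dependency resolves against the Scala 2.11 build. A small sketch of how that interpolation plays out (plain shell variable expansion standing in for Maven's property substitution):

```shell
# Maven substitutes ${scala.binary.version} into artifact IDs, so the pom above
# pulls Scala-2.11 builds of Spark; shell expansion mimics the same substitution:
SCALA_BINARY_VERSION=2.11
ARTIFACT="spark-core_${SCALA_BINARY_VERSION}"
echo "$ARTIFACT"   # prints spark-core_2.11

# Rebuild the shaded jar from inside node2vec_spark/ with:
#   mvn clean package
```

Because one property drives all the suffixes, switching the whole build between Scala 2.11 and 2.12 is a one-line change.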
Thank you! But I face another problem:
19/07/08 16:33:59 DEBUG storage.DiskBlockManager: Adding shutdown hook
19/07/08 16:33:59 INFO memory.MemoryStore: MemoryStore started with capacity 366.3 MB
19/07/08 16:33:59 DEBUG scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: init
19/07/08 16:33:59 DEBUG spark.SecurityManager: Created SSL options for ui: SSLOptions{enabled=false, port=None, keyStore=None, keyStorePassword=None, trustStore=None, trustStorePassword=None, protocol=None, enabledAlgorithms=Set()}
19/07/08 16:34:00 DEBUG util.log: Logging to org.slf4j.impl.Log4jLoggerAdapter(org.spark_project.jetty.util.log) via org.spark_project.jetty.util.log.Slf4jLog
19/07/08 16:34:00 INFO util.log: Logging initialized @6961ms
19/07/08 16:34:00 DEBUG util.Jetty:
java.lang.NumberFormatException: For input string: "unknown"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:589)
at java.lang.Long.valueOf(Long.java:803)
at org.spark_project.jetty.util.Jetty.formatTimestamp(Jetty.java:89)
at org.spark_project.jetty.util.Jetty.<clinit>(Jetty.java:61)
at org.spark_project.jetty.server.Server.getVersion(Server.java:159)
at org.spark_project.jetty.server.handler.ContextHandler.<clinit>(ContextHandler.java:128)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:143)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:130)
at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:83)
at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:65)
at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:65)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:65)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:62)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:80)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:175)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:444)
at com.navercorp.Main$$anonfun$main$1.apply(Main.scala:95)
at com.navercorp.Main$$anonfun$main$1.apply(Main.scala:93)
at scala.Option.map(Option.scala:146)
at com.navercorp.Main$.main(Main.scala:93)
at com.navercorp.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
19/07/08 16:34:00 DEBUG util.DecoratedObjectFactory: Adding Decorator: org.spark_project.jetty.util.DeprecationWarning@16c587de
19/07/08 16:34:00 DEBUG component.ContainerLifeCycle: o.s.j.s.ServletContextHandler@18460128{/,null,null} added {org.spark_project.jetty.servlet.ServletHandler@74d3b638,MANAGED}
Do you have any idea? Thanks a lot.
@q463746583 It seems the error happens while parsing the command parameters (at Main.scala:93). There may be something wrong with your command-line input.
Remember to CHANGE the package version to 0.1.2-SNAPSHOT when submitting the task.
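This reminder matters because the spark-submit command at the top of the thread points at `node2vec-0.0.1-SNAPSHOT.jar`, while the pom above produces a jar named after version `0.1.2-SNAPSHOT`. A sketch of the corrected jar path (directory layout taken from the original command; adjust to your checkout):

```shell
# The pom's <version> determines the jar file name under target/, so the
# 0.0.1-SNAPSHOT path from the original spark-submit command must be updated:
VERSION=0.1.2-SNAPSHOT
JAR="./node2vec_spark/target/node2vec-${VERSION}.jar"
echo "$JAR"   # prints ./node2vec_spark/target/node2vec-0.1.2-SNAPSHOT.jar

# then, e.g.:
#   spark-submit --class com.navercorp.Main "$JAR" --cmd randomwalk ...
```

Submitting the stale 0.0.1-SNAPSHOT jar would silently run the old build and reproduce the original error.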