wikihadoop
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.fs.FileStatus.isDirectory()Z
Hello, I am running the jar on hadoop-1.0.4 against the latest uncompressed 42 GB dump. The following is the exception I am getting:
[ravisg@topsail-sn bin]$ hadoop jar /var/hadoop-1.0.4/contrib/streaming/hadoop-streaming-1.0.4.jar -libjars /export/home/ravisg/wikihadoop-0.2.jar -D mapreduce.input.fileinputformat.split.minsize=300000000 -D mapreduce.task.timeout=6000000 -input /user/ravisg/enwiki-latest-pages-articles.xml -output /user/ravisg/out -inputformat org.wikimedia.wikihadoop.StreamWikiDumpInputFormat -mapper /bin/cat

packageJobJar: [/state/partition1/tmp/hadoop-unjar1677292213229350177/] [] /tmp/streamjob6373180723230676896.jar tmpDir=null
13/06/12 12:41:24 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/06/12 12:41:24 WARN snappy.LoadSnappy: Snappy native library not loaded
13/06/12 12:41:24 INFO mapred.FileInputFormat: StreamWikiDumpInputFormat.getSplits job=Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, hdfs-default.xml, hdfs-site.xml n=2
13/06/12 12:41:24 INFO mapred.FileInputFormat: Total input paths to process : 1
13/06/12 12:41:24 INFO mapred.FileInputFormat: Total input paths to process : 1
13/06/12 12:41:24 INFO mapred.JobClient: Cleaning up the staging area hdfs://topsail-sn/state/partition1/tmp/mapred/staging/ravisg/.staging/job_201305061436_0541
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.fs.FileStatus.isDirectory()Z
    at org.wikimedia.wikihadoop.StreamWikiDumpInputFormat.getSplits(StreamWikiDumpInputFormat.java:155)
    at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:989)
    at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:981)
    at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:897)
    at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:850)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:850)
    at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:824)
    at org.apache.hadoop.streaming.StreamJob.submitAndMonitorJob(StreamJob.java:913)
    at org.apache.hadoop.streaming.StreamJob.run(StreamJob.java:121)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    at org.apache.hadoop.streaming.HadoopStreaming.main(HadoopStreaming.java:50)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
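For context on what a `NoSuchMethodError` like this usually means: the wikihadoop jar appears to have been compiled against a Hadoop API in which `FileStatus.isDirectory()` exists, while the Hadoop 1.x runtime on the cluster only exposes the older `isDir()` method, so the JVM cannot resolve the call at run time. One common workaround for this kind of binary incompatibility is a small reflection-based shim that tries the new method name and falls back to the old one. The sketch below is hypothetical (it is not part of wikihadoop); `OldStyleStatus` is a stand-in for a `FileStatus` class that only has `isDir()`:

```java
import java.lang.reflect.Method;

// Hypothetical compatibility helper: resolves the directory check at run
// time instead of link time, so the same jar works whether the runtime
// FileStatus class provides isDirectory() (newer API) or isDir() (older API).
public class FileStatusCompat {

    public static boolean isDirectoryCompat(Object status) {
        // Try the newer method name first, then fall back to the older one.
        for (String name : new String[] {"isDirectory", "isDir"}) {
            try {
                Method m = status.getClass().getMethod(name);
                return (Boolean) m.invoke(status);
            } catch (ReflectiveOperationException ignored) {
                // Method not present (or not invokable) under this name; try the next.
            }
        }
        throw new IllegalStateException("Neither isDirectory() nor isDir() found");
    }

    // Stand-in for an old-API FileStatus that only exposes isDir().
    public static class OldStyleStatus {
        public boolean isDir() { return true; }
    }

    public static void main(String[] args) {
        // isDirectory() is missing here, so the shim falls back to isDir().
        System.out.println(isDirectoryCompat(new OldStyleStatus())); // prints "true"
    }
}
```

The simpler practical fix, if rebuilding is an option, is to recompile wikihadoop against the same Hadoop version that runs on the cluster, so the generated bytecode references a method that actually exists there.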
I am facing the same issue on hadoop 1.2.1 (stable). Is it something to do with the Hadoop version? I have tried this on two different clusters, but both are running hadoop 1.2.1.
I also have this error. It seems to be related to the Hadoop version; I'm also running 1.2.1.
I am running Hadoop 2.6.0 and HBase 0.94.27 and am having the same issue.