
Unable to write in CDH Hadoop distribution

mnanirban opened this issue on Feb 25, 2015 · 0 comments

Hi, using this GitHub project we are able to write into Apache Hadoop HDFS, but when we try to write into Cloudera (CDH) HDFS it fails. The file is created, but with zero bytes; we are not able to write anything into it.

Exception:

2015-02-24 16:57:27 o.a.h.h.DFSClient [INFO] Exception in createBlockOutputStream
java.net.ConnectException: Connection timed out
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.6.0_31]
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567) ~[na:1.6.0_31]
    at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[hadoop-common-2.5.1.jar:na]
    at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529) ~[hadoop-common-2.5.1.jar:na]
    at org.apache.hadoop.hdfs.DFSOutputStream.createSocketForPipeline(DFSOutputStream.java:1515) ~[hadoop-hdfs-2.3.0-cdh5.1.2.jar:na]
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1318) [hadoop-hdfs-2.3.0-cdh5.1.2.jar:na]
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1271) [hadoop-hdfs-2.3.0-cdh5.1.2.jar:na]
    at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:525) [hadoop-hdfs-2.3.0-cdh5.1.2.jar:na]
2015-02-24 16:57:27 o.a.h.h.DFSClient [INFO] Abandoning BP-284372577-192.168.192.1-1420735649037:blk_1073915189_174366
2015-02-24 16:57:27 o.a.h.h.DFSClient [INFO] Excluding datanode 192.168.192.6:50010
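The zero-byte file together with the ConnectException in createBlockOutputStream suggests the client can reach the NameNode (the file is created) but cannot open the write pipeline to the DataNode at 192.168.192.6:50010. Below is a minimal standalone write test we can run from the Storm worker host to check HDFS connectivity independently of the bolt; it is only a sketch, the NameNode URL and target path are placeholders, and the dfs.client.use.datanode.hostname line is a possible workaround to try if the DataNodes advertise addresses that are not routable from the worker host.

```java
import java.nio.charset.Charset;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

/**
 * Standalone HDFS write test, run from the same host as the Storm worker.
 * The fs.defaultFS URL and the target path are placeholders.
 */
public class HdfsWriteTest {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // NameNode URL of the CDH cluster (placeholder host/port).
        conf.set("fs.defaultFS", "hdfs://cdh-namenode:8020");
        // Uncomment if the DataNodes advertise addresses (e.g. 192.168.192.x)
        // that are not reachable from this host; the client will then connect
        // to DataNodes by hostname instead of IP.
        // conf.setBoolean("dfs.client.use.datanode.hostname", true);

        FileSystem fs = FileSystem.get(conf);
        FSDataOutputStream out = fs.create(new Path("/tmp/storm-hdfs-write-test.txt"));
        try {
            // If this write hangs or throws ConnectException, the problem is
            // between this host and the DataNodes, not in the Storm bolt.
            out.write("hello from the storm worker host\n".getBytes(Charset.forName("UTF-8")));
            out.hsync();
        } finally {
            out.close();
            fs.close();
        }
        System.out.println("write succeeded");
    }
}
```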
