docker-hadoop-spark-workbench
Failed to connect to namenode:8020
I used this repo to spin up a Docker Swarm cluster, followed everything in the swarm directory step by step, and also modified the Makefile in the main directory as follows:
get-example:
	if [ ! -f example/SparkWriteApplication.jar ]; then \
		wget -O example/SparkWriteApplication.jar https://www.dropbox.com/s/7dn0horm8ocbu0p/SparkWriteApplication.jar; \
	fi
example: get-example
	docker run --rm -it --network workbench --env-file ./swarm/hadoop.env \
		-e SPARK_MASTER=spark://spark-master:7077 \
		--volume $(shell pwd)/example:/example \
		bde2020/spark-base:2.2.0-hadoop2.8-hive-java8 \
		/spark/bin/spark-submit --master spark://spark-master:7077 /example/SparkWriteApplication.jar
	docker exec -it namenode hadoop fs -cat /tmp/numbers-as-text/part-00000
When I execute make example, it reads the file and throws an exception: Failed to connect to server: namenode/10.0.0.102:8020: try once and fail. java.net.ConnectException: Connection refused
I have opened all of the necessary ports and still cannot connect to the namenode. Any suggestions?
Hi!
The application from the root Makefile is not intended for Swarm. To test your setup, run one of the provided examples from the Spark distribution instead.
Also, from what I can see in the execution, the namenode exposes port 9000 by default, not 8020. Did you change the port?
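For reference, a minimal sketch of the relevant setting (the file name swarm/hadoop.env and the variable name are assumptions, based on the bde2020 images' convention of mapping CORE_CONF_* environment variables into core-site.xml):

```ini
# swarm/hadoop.env (sketch; variable name assumed from the bde2020 CORE_CONF_* convention)
# This maps to fs.defaultFS in core-site.xml; HDFS clients must use the same host and port.
CORE_CONF_fs_defaultFS=hdfs://namenode:9000
```

If SparkWriteApplication.jar hardcodes hdfs://namenode:8020, either change the application to use port 9000 or adjust fs.defaultFS so that the client and the NameNode agree on the RPC port.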