atomobianco
+1 for this. Indeed, nothing is listening on port 6066 from a Spark 2.3.2 container:
```
root@01a1f568c397:/# netstat -apn | grep LISTEN
tcp        0      0 172.18.0.4:7077     0.0.0.0:*    LISTEN    100/java
tcp        0...
```
@earthquakesan Thanks for the tip. I have not tried it yet, but to answer your question: no, I only changed the image version. @jgoodman8 Not solved yet.
Hello, coming back to this problem as I am updating the stack to Spark 2.3.2. As far as I understood, @earthquakesan, you suggest submitting the job with `spark.driver.bindAddress`; the...
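For reference, passing the bind address at submit time would look roughly like this; a sketch only, where the master URL and IP are the illustrative values from this thread and `my-app.jar` is a placeholder:
```
spark-submit \
  --master spark://e0e2e4fea039:7077 \
  --conf spark.driver.bindAddress=0.0.0.0 \
  --conf spark.driver.host=172.24.0.5 \
  my-app.jar
```
`spark.driver.bindAddress` controls the local address the driver binds to, while `spark.driver.host` is what gets advertised to the workers, so inside a container the two often need to differ.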
For information, here are two Spark properties (from the Spark UI Environment tab), taken from the previous version and the current version:

with bde2020/spark-base:2.2.0-hadoop2.8-hive-java8
```
spark.driver.host    172.24.0.5
spark.master         spark://e0e2e4fea039:7077
```
with bde2020/spark-base:2.3.2-hadoop2.8
```
spark.driver.host    afe3b572290b...
```
The workaround I used for the moment is to rewrite `/etc/hosts` on the worker nodes: the old entry was `172.24.0.5 afe3b572290b`, and the new one is `172.24.0.5 172.24.0.5`. This rewriting is...
This line within `worker.sh` should work:
```
# Substitute container id with current IP in /etc/hosts
# (sed -i does not work on /etc/hosts inside a container, hence the intermediate file)
sed -r "s/$(hostname -i).+/$(hostname -i) $(hostname -i)/" /etc/hosts > /tmp/.intermediate-file-2431
cp /tmp/.intermediate-file-2431 /etc/hosts
```
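The substitution itself can be sanity-checked in isolation before baking it into the image; a minimal sketch using the example IP and container id from this thread (the file paths are illustrative):
```shell
# Build a sample hosts file with the values seen earlier in this thread
HOST_IP="172.24.0.5"
printf '127.0.0.1 localhost\n%s afe3b572290b\n' "$HOST_IP" > /tmp/hosts.sample

# Replace "<ip> <container-id>" with "<ip> <ip>", as worker.sh does
sed -r "s/${HOST_IP}.+/${HOST_IP} ${HOST_IP}/" /tmp/hosts.sample > /tmp/hosts.sample.out
cat /tmp/hosts.sample.out
# -> 127.0.0.1 localhost
# -> 172.24.0.5 172.24.0.5
```
Lines that do not start with the host IP (like the `localhost` entry) are left untouched, which is why the broad `.+` after the IP is safe here.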