docker-spark
cannot submit tasks to master
Hi, I've run into an issue. Can anyone help? Thanks in advance.
After deploying docker-spark to a server (192.168.10.8), I tried to test it from another server (192.168.10.7). The same version of Spark is installed on 192.168.10.7. Command steps:
spark-shell --master spark://192.168.10.8:7077 --total-executor-cores 1 --executor-memory 512M
# xxxx
# some output here
# xxxx
val textFile = sc.textFile("file:///opt/spark/README.md");
textFile.first();
I get the error below (the message repeats in an infinite loop):
WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
Hey @pzg250 ,
thanks a lot for reporting this. Could you tell us how you are running the spark-shell
command? Within docker exec,
or from outside the Docker network? Have you tried using one of our Docker templates as an example?
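For example, running the shell inside the master container would look roughly like this (the container name spark-master and the /spark install path are assumptions here; adjust them to your actual compose file and image):

docker exec -it spark-master /spark/bin/spark-shell --master spark://spark-master:7077 --total-executor-cores 1 --executor-memory 512M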
Best,
I have the same issue. Can anyone help? Thanks a lot.
Hi @pzg250 ,
I get the same error too. Can I use the configuration below from outside the Spark cluster?
spark = SparkSession.builder.appName("SparkSample2").master("spark://192.XX.X.XX:7077").getOrCreate()
I'd like to run this application from the client side.
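From what I understand, when the driver runs on the client machine the workers must be able to connect back to it, so I suspect something like the sketch below is needed (the driver host and the port numbers are guesses for my own network, not values anyone here has confirmed):

from pyspark.sql import SparkSession

# The driver runs on the client machine, so advertise an address the workers
# can reach and pin the callback ports so they can be opened in the firewall.
spark = (SparkSession.builder
         .appName("SparkSample2")
         .master("spark://192.XX.X.XX:7077")
         .config("spark.driver.host", "192.XX.X.YY")   # client machine's IP, reachable from the workers (placeholder)
         .config("spark.driver.port", "40000")         # example port, assumed open
         .config("spark.blockManager.port", "40001")   # example port, assumed open
         .getOrCreate())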
Thank you for your great support.
Best,