
cannot submit tasks to master

Open · pzg250 opened this issue 3 years ago · 3 comments

Hi, I ran into an issue. Can anyone help? Thanks in advance.

After deploying docker-spark to a server (192.168.10.8), I tried to test it from another server (192.168.10.7), which has the same version of Spark installed. Command steps:

spark-shell --master spark://192.168.10.8:7077 --total-executor-cores 1 --executor-memory 512M
# xxxx
# some output here
# xxxx
val textFile = sc.textFile("file:///opt/spark/README.md");
textFile.first();

I got the error below (the message repeats endlessly):

WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

pzg250 avatar Jun 30 '21 10:06 pzg250
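For reference: this warning means the driver has registered with the master but the job has been granted no executors. Besides workers not being registered or lacking memory/cores, a common cause when the driver runs outside the Docker network is that the workers cannot open connections back to the driver. A minimal sketch of pinning the driver address explicitly, reusing the IPs from the report above; spark.driver.host and spark.driver.bindAddress are standard Spark properties, but whether they resolve this particular setup is an assumption:

spark-shell --master spark://192.168.10.8:7077 \
  --total-executor-cores 1 \
  --executor-memory 512M \
  --conf spark.driver.host=192.168.10.7 \
  --conf spark.driver.bindAddress=0.0.0.0

The worker containers must also be able to reach 192.168.10.7 on the driver's ports; if a firewall is involved, the driver port can be fixed with spark.driver.port and opened explicitly.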

Hey @pzg250 ,

thanks a lot for reporting this. Could you tell us how you are running the spark-shell command: within docker exec, or from outside the Docker network? Have you tried using one of our docker templates as an example?

Best,

GezimSejdiu avatar Nov 21 '21 20:11 GezimSejdiu
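For reference, a minimal sketch of such a docker-compose template, loosely modeled on the ones shipped with this repository (the image tags and service names here are assumptions and may differ from the current README):

version: "3"
services:
  spark-master:
    image: bde2020/spark-master:3.1.1-hadoop3.2
    ports:
      - "8080:8080"  # master web UI
      - "7077:7077"  # master RPC port that spark-shell / spark-submit connect to
  spark-worker:
    image: bde2020/spark-worker:3.1.1-hadoop3.2
    depends_on:
      - spark-master
    environment:
      - SPARK_MASTER=spark://spark-master:7077

Running the shell inside the compose network, e.g. via docker exec into the master container, sidesteps the driver-reachability problem entirely.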

I have the same issue. Can anyone help? Thanks a lot.

nguacon90 avatar Dec 10 '21 04:12 nguacon90

Hi @pzg250 ,

I got the same error too. Can I use the configuration below from outside the Spark cluster?

spark = SparkSession.builder.appName("SparkSample2").master("spark://192.XX.X.XX:7077").getOrCreate()

I'd like to run this application from the client side. Thank you for your great support.

Best,

rilakgg avatar Mar 30 '22 05:03 rilakgg
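For reference, a PySpark sketch of connecting from a client outside the cluster. It reuses the addresses from the original report (master 192.168.10.8, client 192.168.10.7); the fixed driver port 7078 is an assumption, chosen so a firewall rule can be added for it:

from pyspark.sql import SparkSession

# Connect to the standalone master from a driver running outside the
# Docker network (addresses taken from this thread).
spark = (
    SparkSession.builder
    .appName("SparkSample2")
    .master("spark://192.168.10.8:7077")
    # Address the workers can use to reach back to this driver.
    .config("spark.driver.host", "192.168.10.7")
    # Pin the driver port instead of using a random ephemeral one
    # (assumption: 7078 is free on the client machine).
    .config("spark.driver.port", "7078")
    .getOrCreate()
)

# Sanity check: this only succeeds once executors can reach the driver.
print(spark.range(10).count())
spark.stop()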