
spark task is not working

joyatcloudfall opened this issue · 2 comments

I tried to use PySpark to run a count task. [screenshot of the code]

The task was submitted to Spark, however, it was never scheduled to run. [screenshot of the Spark UI]

Any idea what's going wrong?
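For concreteness, a minimal PySpark count job of this shape might look like the sketch below; the master URL spark://spark-master:7077 and the resource settings are assumptions for illustration, not values taken from the report:

```python
# Minimal sketch of a count job against a standalone master.
# spark://spark-master:7077 is an assumed hostname; adjust to your setup.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("count-smoke-test")
    .master("spark://spark-master:7077")   # assumed standalone master URL
    # Request less than the workers advertise so the scheduler can place executors.
    .config("spark.executor.memory", "512m")
    .config("spark.cores.max", "1")
    .getOrCreate()
)

# A trivial count: if this hangs, the application is not being granted resources.
print(spark.sparkContext.parallelize(range(1000)).count())
spark.stop()
```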

joyatcloudfall · Mar 01 '22 09:03

Hi, I've submitted an application to the master from a remote server and I'm hitting the same problem. The log loops with this message:

WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources

Do you have any suggestions?

Image: Spark 3.2.0 for Hadoop 3.2 with OpenJDK 8 and Scala 2.12. Thanks.
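That warning usually means the master cannot place executors for the application: either the requested resources exceed what the registered workers advertise, or, common when the driver runs outside the Docker network, the workers cannot connect back to the driver. Below is a hedged sketch of driver-side settings that address both causes; the master URL and the driver address 192.168.1.50 are placeholders, not values from this thread:

```python
# Sketch: running the driver outside the Docker network.
# Workers must be able to reach the driver back, so spark.driver.host
# must be an address routable from the containers (placeholder below).
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("remote-driver-test")
    .master("spark://spark-master:7077")            # assumed master URL
    .config("spark.driver.host", "192.168.1.50")    # address reachable from workers
    .config("spark.driver.bindAddress", "0.0.0.0")  # listen on all interfaces
    # Keep requests below what the workers advertise in the master UI.
    .config("spark.executor.memory", "512m")
    .config("spark.cores.max", "1")
    .getOrCreate()
)

print(spark.range(100).count())
spark.stop()
```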

rilakgg · Mar 31 '22 04:03

Hi, I am having the same issue. Did it ever get resolved?

kiran-jayaram · Sep 26 '22 22:09