docker-spark
Spark task is not running
I tried to use PySpark to run a count task.
The task was submitted to Spark; however, it never started running.
Any idea what's going wrong?
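For reference, this is roughly the kind of job I mean. A minimal sketch of a count task against a standalone master; the master URL and host name below are assumptions, adjust them to your own setup:

```python
# Minimal PySpark count job against a standalone master.
# "spark://spark-master:7077" is a placeholder, not from the original setup.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("count-example")
    .master("spark://spark-master:7077")
    .getOrCreate()
)

# Parallelize a small range and count the elements.
rdd = spark.sparkContext.parallelize(range(1000))
print(rdd.count())

spark.stop()
```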
Hi,
I've submitted an application to the master from a remote server and I'm hitting the same problem.
This message keeps looping:
WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
Do you have any suggestions?
Image: Spark 3.2.0 for Hadoop 3.2 with OpenJDK 8 and Scala 2.12. Thanks.
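In case it helps anyone hitting the same warning: it usually means either that no worker accepted the job's resource request, or that the executors cannot connect back to a remote driver. Below is a hedged configuration sketch for remote submission to a Dockerized standalone cluster; all host names, ports, and resource values are placeholders, not values from this thread:

```python
# Sketch of settings commonly adjusted when the
# "Initial job has not accepted any resources" warning loops.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("remote-submit-check")
    .master("spark://spark-master:7077")          # placeholder master URL
    # Keep the request within what the workers actually advertise in the
    # cluster UI; if it exceeds available memory/cores, no worker accepts it.
    .config("spark.executor.memory", "1g")
    .config("spark.cores.max", "2")
    # When the driver runs outside the Docker network, executors must be able
    # to reach it; advertise an address and fixed ports they can connect to.
    .config("spark.driver.host", "192.168.1.10")  # placeholder driver address
    .config("spark.driver.port", "7078")
    .config("spark.blockManager.port", "7079")
    .getOrCreate()
)

# Quick sanity check that tasks are actually scheduled and run.
print(spark.sparkContext.parallelize(range(10)).count())
spark.stop()
```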
Hi, I'm having the same issue. Did it get resolved?