
Worker does not start

0neday opened this issue 4 years ago · 1 comment

Output from starting the worker:

bash-5.0# ./start-worker.sh    spark://spark:7077 
rsync from spark://spark:7077
/spark/sbin/spark-daemon.sh: line 177: rsync: command not found
starting org.apache.spark.deploy.worker.Worker, logging to /spark/logs/spark--org.apache.spark.deploy.worker.Worker-1-6f7782b9b0d5.out
ps: unrecognized option: p
BusyBox v1.30.1 (2020-05-30 09:44:53 UTC) multi-call binary.

Usage: ps [-o COL1,COL2=HEADER]

Show list of processes

        -o COL1,COL2=HEADER     Select columns for display
ps: unrecognized option: p
BusyBox v1.30.1 (2020-05-30 09:44:53 UTC) multi-call binary.

Usage: ps [-o COL1,COL2=HEADER]

Show list of processes
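It looks like the base image uses BusyBox, so rsync and a full-featured ps are not available. A possible workaround (just a sketch, assuming the image is Alpine-based with apk available; the base image name and tag below are only examples) is to install the missing tools in a derived image:

# Hypothetical derived image; adjust the base image and tag to whatever you actually use.
FROM bde2020/spark-base:3.1.1-hadoop3.2
# procps provides a full ps (BusyBox ps does not support -p),
# and rsync is invoked by /spark/sbin/spark-daemon.sh when SPARK_MASTER is set.
RUN apk add --no-cache procps rsync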

0neday, Oct 15 '21 09:10

Hi @0neday,

Could you tell me a bit more about how you are trying to start a worker, and from which image? If you set it up via our example docker-compose file, the worker is started automatically; the same applies if you start it with a normal docker run using our worker image.
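For illustration, starting a worker with a plain docker run might look roughly like this (a sketch, assuming the bde2020/spark-worker image; the tag, network name, and master URL are example values to adjust to your setup):

docker run -d --name spark-worker-1 \
  --network spark-net \
  -e "SPARK_MASTER=spark://spark-master:7077" \
  bde2020/spark-worker:3.1.1-hadoop3.2

The container must be on the same Docker network as the master so that the master hostname in SPARK_MASTER resolves.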

Do let me know a bit more so that we can resolve this.

Best regards,

GezimSejdiu, Nov 21 '21 22:11