apache-spark-internals
Spark high-memory configuration
I am currently wondering what a good Spark configuration looks like when a lot of memory is available on a single node: http://stackoverflow.com/questions/43262870/spark-with-high-memory-use-multiple-executors-per-node Maybe you could add some clarifying hints to your great book.
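For context, here is a minimal sketch of the kind of configuration I have in mind. All values are illustrative assumptions (a single-node standalone master on localhost, ~30g executors, 5 cores each), not recommendations; the question is whether splitting a large-memory node into several mid-sized executors like this is the right approach.

```scala
import org.apache.spark.sql.SparkSession

// Illustrative sizing for a single large-memory node: several mid-sized
// executors instead of one huge heap. The numbers below are assumptions
// for the sake of the example, not tuned recommendations.
object HighMemorySingleNode {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("high-memory-single-node")
      // Assumes a standalone master running on the same machine.
      .master("spark://localhost:7077")
      // Keep each executor heap well under ~32 GB so the JVM can still use
      // compressed object pointers and GC pauses stay manageable.
      .config("spark.executor.memory", "30g")
      // A handful of cores per executor; with a fixed worker size this
      // yields multiple executors on the one node.
      .config("spark.executor.cores", "5")
      // Cap the total cores this application takes from the standalone cluster.
      .config("spark.cores.max", "40")
      .getOrCreate()

    // Trivial job just to confirm the session comes up with this layout.
    println(spark.range(1000000L).count())
    spark.stop()
  }
}
```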