almond
The cell keeps running even though the Spark job has completed successfully
Hello, I start Spark in the Scala kernel with the following code:
import org.apache.log4j.{Level, Logger}
Logger.getLogger("org").setLevel(Level.OFF)
import org.apache.spark.sql._
import org.apache.spark.{SparkConf, SparkContext}
// Spark configuration for running on YARN
val conf = {
  new SparkConf()
    .setAppName("Jupyter")
    .setMaster("yarn")
    .set("spark.yarn.queue", "foo")
    ...
}

// NotebookSparkSession is provided by almond-spark (covered by the org.apache.spark.sql._ import)
val spark = {
  NotebookSparkSession
    .builder
    .config(conf)
    .enableHiveSupport
    .getOrCreate
}
But sometimes when I run Spark SQL, the cell keeps running indefinitely even though the Spark UI shows that the job has completed successfully. It looks like a bug; how can I solve it?
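
For reference, a cell of the kind that triggers this looks roughly like the sketch below (illustrative only; the database and table names are hypothetical placeholders, not from the original post):

// Illustrative only: any Spark SQL action run from a notebook cell.
// "some_db.some_table" is a hypothetical placeholder.
val df = spark.sql("SELECT count(*) AS n FROM some_db.some_table")
df.show()  // the Spark UI reports the job as finished, but the cell still shows as running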