SpatialSpark
Multiple running SparkContexts detected in the same JVM!
I am migrating from Spark 1.3.0 to Spark 2.1.1. On Spark 1.3.0 I was able to create a SparkContext in Java with the following:

```java
SparkConf conf = new SparkConf()
        .setAppName(dsp.getAppName())
        .setMaster("local")
        .set("spark.driver.allowMultipleContexts", "true");
JavaSparkContext java_sc = new JavaSparkContext(conf);
```
However, on Spark 2.1.1 I get the errors "Multiple running SparkContexts detected in the same JVM!" and "org.apache.spark.SparkException: Failed to get broadcast_0_piece0 of broadcast_0".
Can someone help me out?
You can't use multiple SparkContexts in the same JVM: Spark only supports one active SparkContext per JVM, and `spark.driver.allowMultipleContexts` was never a reliable workaround. Create a single context (or session) and reuse it everywhere, and stop it before creating a new one.
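As a sketch of the usual Spark 2.x pattern (the app name here is illustrative, not from the original post): build one `SparkSession`, which wraps the single allowed SparkContext, and rely on `getOrCreate()` to return the existing session instead of constructing a second context.

```java
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class SingleContextExample {
    public static void main(String[] args) {
        // getOrCreate() reuses an already-running session rather than
        // creating a second SparkContext in the same JVM.
        SparkSession spark = SparkSession.builder()
                .appName("MyApp")   // illustrative name
                .master("local")
                .getOrCreate();

        // The underlying SparkContext, wrapped for the Java RDD API.
        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        // ... use spark / jsc ...

        spark.stop(); // stop the one context when finished
    }
}
```

This example requires the Spark 2.x dependencies on the classpath; it is a sketch of the pattern, not a drop-in replacement for the original application code.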