SpatialSpark

Multiple running SparkContexts detected in the same JVM!

Open ankur1000 opened this issue 7 years ago • 1 comment

I am migrating from Spark 1.3.0 to Spark 2.1.1. Earlier I was able to create a SparkContext on Spark 1.3.0 in Java using the following:

```java
SparkConf conf = new SparkConf()
        .setAppName(dsp.getAppName())
        .setMaster("local")
        .set("spark.driver.allowMultipleContexts", "true");
JavaSparkContext java_sc = new JavaSparkContext(conf);
```

However, on Spark 2.1.1 I get the errors "Multiple running SparkContexts detected in the same JVM!" and "org.apache.spark.SparkException: Failed to get broadcast_0_piece0 of broadcast_0".

Can someone help me out?

ankur1000 avatar Oct 16 '17 11:10 ankur1000

We can't use multiple contexts; Spark supports only one active SparkContext per JVM.
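Since only one SparkContext per JVM is supported, the usual migration path for Spark 2.x is to go through `SparkSession` and let `getOrCreate()` reuse whatever context already exists, instead of constructing a second `JavaSparkContext`. A minimal sketch (the app name is a placeholder standing in for the issue's `dsp.getAppName()`):

```java
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class SingleContextExample {
    public static void main(String[] args) {
        // SparkSession is the Spark 2.x entry point; it wraps the single
        // SparkContext. getOrCreate() returns the existing session/context
        // if one is already running, which avoids the
        // "Multiple running SparkContexts detected in the same JVM!" error.
        SparkSession spark = SparkSession.builder()
                .appName("MyApp")   // placeholder for dsp.getAppName()
                .master("local")
                .getOrCreate();

        // APIs that still want a JavaSparkContext can wrap the existing
        // context rather than creating a new one:
        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        System.out.println("Spark version: " + spark.version());
        spark.stop();
    }
}
```

Note that relying on `spark.driver.allowMultipleContexts` is not a fix: it was only ever an escape hatch, and sharing one context per JVM is the supported model.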

sivapriyaMsp avatar Sep 27 '18 09:09 sivapriyaMsp