Mobius
Running Pi example
I am getting this error:
17/09/15 00:01:05 INFO YarnClientSchedulerBackend: Application application_1505118143285_0040 has started running.
17/09/15 00:01:05 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 44869.
17/09/15 00:01:05 INFO NettyBlockTransferService: Server created on 10.0.0.15:44869
17/09/15 00:01:05 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/09/15 00:01:05 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 10.0.0.15, 44869, None)
17/09/15 00:01:05 INFO BlockManagerMasterEndpoint: Registering block manager 10.0.0.15:44869 with 366.3 MB RAM, BlockManagerId(driver, 10.0.0.15, 44869, None)
17/09/15 00:01:05 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 10.0.0.15, 44869, None)
17/09/15 00:01:05 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 10.0.0.15, 44869, None)
17/09/15 00:01:05 INFO ContextHandler: Started o.s.j.s.ServletContextHandler@6fa0a452{/metrics/json,null,AVAILABLE,@Spark}
17/09/15 00:01:06 INFO EventLoggingListener: Logging events to adl:///hdp/spark2-events/application_1505118143285_0040
17/09/15 00:01:27 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
[2017-09-15 00:01:28,706] [1] [INFO ] [Microsoft.Spark.CSharp.Core.SparkContext] - Parallelizing 300001 items to form RDD in the cluster with 3 partitions
[2017-09-15 00:01:29,573] [1] [INFO ] [Microsoft.Spark.CSharp.Core.RDD`1[[System.Int32, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]]] - Executing Map operation on RDD (preservesPartitioning=False)
[2017-09-15 00:01:29,577] [1] [INFO ] [Microsoft.Spark.CSharp.Core.RDD`1[[System.Int32, mscorlib, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089]]] - Executing Reduce operation on RDD
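For context: at this point the Pi example has parallelized its sample points and is about to run the map/reduce pipeline that fails below. A rough Scala equivalent of what the C# PiExample computes (illustrative only, not the Mobius source; the item count and partition count match the log above):

import org.apache.spark.{SparkConf, SparkContext}
import scala.util.Random

object PiSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("PiSketch"))
    val n = 300001 // matches "Parallelizing 300001 items" in the log
    // Monte-Carlo estimate: count random points that land inside the unit circle.
    val hits = sc.parallelize(0 until n, 3).map { _ =>
      val x = Random.nextDouble() * 2 - 1
      val y = Random.nextDouble() * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * hits / n}")
    sc.stop()
  }
}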
17/09/15 00:01:29 ERROR CSharpBackendHandler: methods:
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.csharp.CSharpRDD.saveStringRddAsTextFile(org.apache.spark.api.java.JavaRDD,java.lang.String,java.lang.Class)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.csharp.CSharpRDD.saveStringRddAsTextFile(org.apache.spark.api.java.JavaRDD,java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.csharp.CSharpRDD.currentStageId_$eq(int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static int org.apache.spark.api.csharp.CSharpRDD.currentStageId()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.api.java.JavaRDD org.apache.spark.api.csharp.CSharpRDD.createRDDFromArray(org.apache.spark.SparkContext,byte[][],int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.csharp.CSharpRDD.csharpWorkerWriteBufferSize_$eq(int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static int org.apache.spark.api.csharp.CSharpRDD.csharpWorkerWriteBufferSize()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.csharp.CSharpRDD.csharpWorkerReadBufferSize_$eq(int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static int org.apache.spark.api.csharp.CSharpRDD.csharpWorkerReadBufferSize()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.csharp.CSharpRDD.csharpWorkerSocketType_$eq(java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static java.lang.String org.apache.spark.api.csharp.CSharpRDD.csharpWorkerSocketType()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.csharp.CSharpRDD.maxCSharpWorkerProcessCount_$eq(int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static int org.apache.spark.api.csharp.CSharpRDD.maxCSharpWorkerProcessCount()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static int org.apache.spark.api.csharp.CSharpRDD.nextSeqNum()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.csharp.CSharpRDD.nextSeqNum_$eq(int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static int org.apache.spark.api.csharp.CSharpRDD.executorCores()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.csharp.CSharpRDD.executorCores_$eq(int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.collection.Iterator org.apache.spark.api.csharp.CSharpRDD.compute(org.apache.spark.Partition,org.apache.spark.TaskContext)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.Option org.apache.spark.api.python.PythonRDD.partitioner()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.Partition[] org.apache.spark.api.python.PythonRDD.getPartitions()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.api.java.JavaRDD org.apache.spark.api.python.PythonRDD.hadoopRDD(org.apache.spark.api.java.JavaSparkContext,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.util.HashMap,int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.api.java.JavaRDD org.apache.spark.api.python.PythonRDD.hadoopFile(org.apache.spark.api.java.JavaSparkContext,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.util.HashMap,int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.api.java.JavaRDD org.apache.spark.api.python.PythonRDD.newAPIHadoopFile(org.apache.spark.api.java.JavaSparkContext,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.util.HashMap,int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.api.java.JavaRDD org.apache.spark.api.python.PythonRDD.newAPIHadoopRDD(org.apache.spark.api.java.JavaSparkContext,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.util.HashMap,int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.api.java.JavaRDD org.apache.spark.api.python.PythonRDD.sequenceFile(org.apache.spark.api.java.JavaSparkContext,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,int,int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static int org.apache.spark.api.python.PythonRDD.runJob(org.apache.spark.SparkContext,org.apache.spark.api.java.JavaRDD,java.util.ArrayList)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.python.PythonRDD.saveAsHadoopFile(org.apache.spark.api.java.JavaRDD,boolean,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.util.HashMap,java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.python.PythonRDD.saveAsNewAPIHadoopFile(org.apache.spark.api.java.JavaRDD,boolean,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.lang.String,java.util.HashMap)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.python.PythonRDD.saveAsHadoopDataset(org.apache.spark.api.java.JavaRDD,boolean,java.util.HashMap,java.lang.String,java.lang.String,boolean)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static scala.collection.mutable.Set org.apache.spark.api.python.PythonRDD.getWorkerBroadcasts(java.net.Socket)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.api.python.PythonRDD.reuse_worker()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.api.java.JavaRDD org.apache.spark.api.python.PythonRDD.asJavaRDD()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.python.PythonRDD.saveAsSequenceFile(org.apache.spark.api.java.JavaRDD,boolean,java.lang.String,java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static int org.apache.spark.api.python.PythonRDD.serveIterator(scala.collection.Iterator,java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.python.PythonRDD.writeIteratorToStream(scala.collection.Iterator,java.io.DataOutputStream)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.broadcast.Broadcast org.apache.spark.api.python.PythonRDD.readBroadcastFromFile(org.apache.spark.api.java.JavaSparkContext,java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.api.java.JavaRDD org.apache.spark.api.python.PythonRDD.readRDDFromFile(org.apache.spark.api.java.JavaSparkContext,java.lang.String,int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static int org.apache.spark.api.python.PythonRDD.toLocalIteratorAndServe(org.apache.spark.rdd.RDD)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static int org.apache.spark.api.python.PythonRDD.collectAndServe(org.apache.spark.rdd.RDD)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.api.java.JavaRDD org.apache.spark.api.python.PythonRDD.valueOfPair(org.apache.spark.api.java.JavaPairRDD)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public int org.apache.spark.api.python.PythonRDD.bufferSize()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static void org.apache.spark.api.python.PythonRDD.writeUTF(java.lang.String,java.io.DataOutputStream)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.String org.apache.spark.rdd.RDD.toDebugString()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.collection.Seq org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies_()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.Partition[] org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$partitions_()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.util.CallSite org.apache.spark.rdd.RDD.creationSite()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.Option org.apache.spark.rdd.RDD.checkpointData()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$checkpointAllMarkedAncestors()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$doCheckpointCalled()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static scala.runtime.Null$ org.apache.spark.rdd.RDD.rddToPairRDDFunctions$default$4(org.apache.spark.rdd.RDD)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.rdd.DoubleRDDFunctions org.apache.spark.rdd.RDD.numericRDDToDoubleRDDFunctions(org.apache.spark.rdd.RDD,scala.math.Numeric)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.rdd.DoubleRDDFunctions org.apache.spark.rdd.RDD.doubleRDDToDoubleRDDFunctions(org.apache.spark.rdd.RDD)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.rdd.OrderedRDDFunctions org.apache.spark.rdd.RDD.rddToOrderedRDDFunctions(org.apache.spark.rdd.RDD,scala.math.Ordering,scala.reflect.ClassTag,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.rdd.SequenceFileRDDFunctions org.apache.spark.rdd.RDD.rddToSequenceFileRDDFunctions(org.apache.spark.rdd.RDD,scala.reflect.ClassTag,scala.reflect.ClassTag,org.apache.spark.WritableFactory,org.apache.spark.WritableFactory)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.rdd.AsyncRDDActions org.apache.spark.rdd.RDD.rddToAsyncRDDActions(org.apache.spark.rdd.RDD,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public static org.apache.spark.rdd.PairRDDFunctions org.apache.spark.rdd.RDD.rddToPairRDDFunctions(org.apache.spark.rdd.RDD,scala.reflect.ClassTag,scala.reflect.ClassTag,scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.SparkContext org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$sc()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.collection.Seq org.apache.spark.rdd.RDD.getPreferredLocations(org.apache.spark.Partition)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.SparkContext org.apache.spark.rdd.RDD.sparkContext()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.persist()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.persist(org.apache.spark.storage.StorageLevel)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.isLocallyCheckpointed()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.unpersist(boolean)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.unpersist$default$1()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.storage.StorageLevel org.apache.spark.rdd.RDD.getStorageLevel()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$dependencies__$eq(scala.collection.Seq)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$partitions__$eq(org.apache.spark.Partition[])
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final int org.apache.spark.rdd.RDD.getNumPartitions()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final scala.collection.Seq org.apache.spark.rdd.RDD.preferredLocations(org.apache.spark.Partition)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.collection.Iterator org.apache.spark.rdd.RDD.computeOrReadCheckpoint(org.apache.spark.Partition,org.apache.spark.TaskContext)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.collection.Iterator org.apache.spark.rdd.RDD.getOrCompute(org.apache.spark.Partition,org.apache.spark.TaskContext)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.collection.Seq org.apache.spark.rdd.RDD.getNarrowAncestors()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final void org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$visit$1(org.apache.spark.rdd.RDD,scala.collection.mutable.HashSet)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.isCheckpointedAndMaterialized()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.firstParent(scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.reflect.ClassTag org.apache.spark.rdd.RDD.elementClassTag()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.math.Ordering org.apache.spark.rdd.RDD.distinct$default$2(int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.repartition(int,scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.math.Ordering org.apache.spark.rdd.RDD.repartition$default$2(int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.coalesce$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.Option org.apache.spark.rdd.RDD.coalesce$default$3()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.math.Ordering org.apache.spark.rdd.RDD.coalesce$default$4(int,boolean,scala.Option)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.sample(boolean,double,long)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public long org.apache.spark.rdd.RDD.sample$default$3()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD[] org.apache.spark.rdd.RDD.randomSplit(double[],long)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public long org.apache.spark.rdd.RDD.randomSplit$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.randomSampleWithRange(double,double,long)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.withScope(scala.Function0)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.doCheckpoint()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final org.apache.spark.Partition[] org.apache.spark.rdd.RDD.partitions()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.mapPartitionsWithIndex(scala.Function2,boolean,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.takeSample(boolean,int,long)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.sortBy$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public int org.apache.spark.rdd.RDD.sortBy$default$3()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.math.Ordering org.apache.spark.rdd.RDD.intersection$default$3(org.apache.spark.rdd.RDD,org.apache.spark.Partitioner)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.glom()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.cartesian(org.apache.spark.rdd.RDD,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.runtime.Null$ org.apache.spark.rdd.RDD.groupBy$default$4(scala.Function1,org.apache.spark.Partitioner)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.pipe(java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.pipe(java.lang.String,scala.collection.Map)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.pipe(scala.collection.Seq,scala.collection.Map,scala.Function1,scala.Function2,boolean,int,java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.collection.Map org.apache.spark.rdd.RDD.pipe$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.Function1 org.apache.spark.rdd.RDD.pipe$default$3()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.Function2 org.apache.spark.rdd.RDD.pipe$default$4()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.pipe$default$5()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public int org.apache.spark.rdd.RDD.pipe$default$6()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.String org.apache.spark.rdd.RDD.pipe$default$7()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.mapPartitions(scala.Function1,boolean,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.mapPartitions$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.mapPartitionsWithIndexInternal(scala.Function2,boolean,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.mapPartitionsInternal(scala.Function1,boolean,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.mapPartitionsInternal$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.mapPartitionsWithIndex$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.mapPartitionsWithIndexInternal$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.zipPartitions(org.apache.spark.rdd.RDD,org.apache.spark.rdd.RDD,boolean,scala.Function3,scala.reflect.ClassTag,scala.reflect.ClassTag,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.zipPartitions(org.apache.spark.rdd.RDD,boolean,scala.Function2,scala.reflect.ClassTag,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.zipPartitions(org.apache.spark.rdd.RDD,scala.Function2,scala.reflect.ClassTag,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.zipPartitions(org.apache.spark.rdd.RDD,org.apache.spark.rdd.RDD,org.apache.spark.rdd.RDD,boolean,scala.Function4,scala.reflect.ClassTag,scala.reflect.ClassTag,scala.reflect.ClassTag,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.zipPartitions(org.apache.spark.rdd.RDD,org.apache.spark.rdd.RDD,scala.Function3,scala.reflect.ClassTag,scala.reflect.ClassTag,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.zipPartitions(org.apache.spark.rdd.RDD,org.apache.spark.rdd.RDD,org.apache.spark.rdd.RDD,scala.Function4,scala.reflect.ClassTag,scala.reflect.ClassTag,scala.reflect.ClassTag,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.foreachPartition(scala.Function1)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.collection.Iterator org.apache.spark.rdd.RDD.toLocalIterator()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.math.Ordering org.apache.spark.rdd.RDD.subtract$default$3(org.apache.spark.rdd.RDD,org.apache.spark.Partitioner)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.treeReduce(scala.Function2,int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public int org.apache.spark.rdd.RDD.treeReduce$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.treeAggregate(java.lang.Object,scala.Function2,scala.Function2,int,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public int org.apache.spark.rdd.RDD.treeAggregate$default$4(java.lang.Object)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.partial.PartialResult org.apache.spark.rdd.RDD.countApprox(long,double)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public double org.apache.spark.rdd.RDD.countApprox$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.collection.Map org.apache.spark.rdd.RDD.countByValue(scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.math.Ordering org.apache.spark.rdd.RDD.countByValue$default$1()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.partial.PartialResult org.apache.spark.rdd.RDD.countByValueApprox(long,double,scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public double org.apache.spark.rdd.RDD.countByValueApprox$default$2()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.math.Ordering org.apache.spark.rdd.RDD.countByValueApprox$default$3(long,double)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public long org.apache.spark.rdd.RDD.countApproxDistinct(int,int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public long org.apache.spark.rdd.RDD.countApproxDistinct(double)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public double org.apache.spark.rdd.RDD.countApproxDistinct$default$1()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.zipWithUniqueId()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public long org.apache.spark.rdd.RDD.takeSample$default$3()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.takeOrdered(int,scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.saveAsTextFile(java.lang.String,java.lang.Class)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.saveAsTextFile(java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.saveAsObjectFile(java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.keyBy(scala.Function1)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object[] org.apache.spark.rdd.RDD.collectPartitions()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.checkpointData_$eq(scala.Option)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.localCheckpoint()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.isCheckpointed()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.Option org.apache.spark.rdd.RDD.getCheckpointFile()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.String org.apache.spark.rdd.RDD.getCreationSite()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.retag(scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.retag(java.lang.Class)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$doCheckpointCalled_$eq(boolean)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.markCheckpointed()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.clearDependencies()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.api.java.JavaRDD org.apache.spark.rdd.RDD.toJavaRDD()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final scala.collection.Seq org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$debugString$1(org.apache.spark.rdd.RDD,java.lang.String,boolean,boolean)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final boolean org.apache.spark.rdd.RDD.org$apache$spark$rdd$RDD$$debugString$default$4$1()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.checkpoint()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.coalesce(int,boolean,scala.Option,scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.union(org.apache.spark.rdd.RDD)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.intersection(org.apache.spark.rdd.RDD,int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.intersection(org.apache.spark.rdd.RDD)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.intersection(org.apache.spark.rdd.RDD,org.apache.spark.Partitioner,scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.Option org.apache.spark.rdd.RDD.scope()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.fold(java.lang.Object,scala.Function2)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.aggregate(java.lang.Object,scala.Function2,scala.Function2,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.groupBy(scala.Function1,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.groupBy(scala.Function1,int,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.groupBy(scala.Function1,org.apache.spark.Partitioner,scala.reflect.ClassTag,scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.take(int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.$plus$plus(org.apache.spark.rdd.RDD)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.foreach(scala.Function1)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.flatMap(scala.Function1,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.zipWithIndex()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.sortBy(scala.Function1,boolean,int,scala.math.Ordering,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.distinct()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.distinct(int,scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.name_$eq(java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.subtract(org.apache.spark.rdd.RDD)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.subtract(org.apache.spark.rdd.RDD,int)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.subtract(org.apache.spark.rdd.RDD,org.apache.spark.Partitioner,scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.top(int,scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.slf4j.Logger org.apache.spark.rdd.RDD.org$apache$spark$internal$Logging$$log_()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.org$apache$spark$internal$Logging$$log__$eq(org.slf4j.Logger)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.String org.apache.spark.rdd.RDD.logName()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.logInfo(scala.Function0)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.logInfo(scala.Function0,java.lang.Throwable)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.logDebug(scala.Function0)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.logDebug(scala.Function0,java.lang.Throwable)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.logTrace(scala.Function0)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.logTrace(scala.Function0,java.lang.Throwable)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.logWarning(scala.Function0)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.logWarning(scala.Function0,java.lang.Throwable)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.logError(scala.Function0)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.logError(scala.Function0,java.lang.Throwable)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.isTraceEnabled()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public void org.apache.spark.rdd.RDD.initializeLogIfNecessary(boolean)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.SparkConf org.apache.spark.rdd.RDD.conf()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public scala.collection.Seq org.apache.spark.rdd.RDD.getDependencies()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.String org.apache.spark.rdd.RDD.name()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.parent(int,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.SparkContext org.apache.spark.rdd.RDD.context()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.cache()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public long org.apache.spark.rdd.RDD.count()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.String org.apache.spark.rdd.RDD.toString()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final scala.collection.Seq org.apache.spark.rdd.RDD.dependencies()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.slf4j.Logger org.apache.spark.rdd.RDD.log()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.min(scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.max(scala.math.Ordering)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean org.apache.spark.rdd.RDD.isEmpty()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final scala.collection.Iterator org.apache.spark.rdd.RDD.iterator(org.apache.spark.Partition,org.apache.spark.TaskContext)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.collect()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.collect(scala.PartialFunction,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.zip(org.apache.spark.rdd.RDD,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.first()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.setName(java.lang.String)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.filter(scala.Function1)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public org.apache.spark.rdd.RDD org.apache.spark.rdd.RDD.map(scala.Function1,scala.reflect.ClassTag)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public int org.apache.spark.rdd.RDD.id()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public java.lang.Object org.apache.spark.rdd.RDD.reduce(scala.Function2)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final void java.lang.Object.wait(long,int) throws java.lang.InterruptedException
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final native void java.lang.Object.wait(long) throws java.lang.InterruptedException
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final void java.lang.Object.wait() throws java.lang.InterruptedException
17/09/15 00:01:29 ERROR CSharpBackendHandler: public boolean java.lang.Object.equals(java.lang.Object)
17/09/15 00:01:29 ERROR CSharpBackendHandler: public native int java.lang.Object.hashCode()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final native java.lang.Class java.lang.Object.getClass()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final native void java.lang.Object.notify()
17/09/15 00:01:29 ERROR CSharpBackendHandler: public final native void java.lang.Object.notifyAll()
17/09/15 00:01:29 ERROR CSharpBackendHandler: args:
17/09/15 00:01:29 ERROR CSharpBackendHandler: argType: org.apache.spark.rdd.ParallelCollectionRDD, argValue: ParallelCollectionRDD[0] at parallelize at CSharpRDD.scala:237
17/09/15 00:01:29 ERROR CSharpBackendHandler: argType: byte[], argValue: [B@ef0f60f
17/09/15 00:01:29 ERROR CSharpBackendHandler: argType: java.util.Hashtable, argValue: {}
17/09/15 00:01:29 ERROR CSharpBackendHandler: argType: java.util.ArrayList, argValue: []
17/09/15 00:01:29 ERROR CSharpBackendHandler: argType: java.lang.Boolean, argValue: false
17/09/15 00:01:29 ERROR CSharpBackendHandler: argType: java.lang.String, argValue: CSharpWorker.exe
17/09/15 00:01:29 ERROR CSharpBackendHandler: argType: java.lang.String, argValue: 1.0
17/09/15 00:01:29 ERROR CSharpBackendHandler: argType: java.util.ArrayList, argValue: []
17/09/15 00:01:29 ERROR CSharpBackendHandler: arg: NULL
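The long "methods:" dump above is the backend's diagnostic output: when it cannot match the requested CSharpRDD constructor against the argument types it was handed, it prints every candidate member it found, followed by the "args:" list. A minimal sketch of that kind of reflective lookup (assumed behavior for illustration, not the Mobius source):

object ConstructorLookupSketch {
  // Try to resolve a constructor whose parameter types accept the given
  // arguments; on failure, dump the candidates the way the error log does.
  def findAndReport(cls: Class[_], args: Array[AnyRef]): Unit = {
    val matched = cls.getConstructors.find { c =>
      c.getParameterCount == args.length &&
        c.getParameterTypes.zip(args).forall { case (p, a) =>
          a == null || p.isAssignableFrom(a.getClass)
        }
    }
    matched match {
      case Some(c) => println(s"matched: $c")
      case None =>
        println("methods:")
        cls.getMethods.foreach(m => println(m))
        println("args:")
        args.foreach { a =>
          println(if (a == null) "arg: NULL"
                  else s"argType: ${a.getClass.getName}, argValue: $a")
        }
    }
  }

  def main(args: Array[String]): Unit =
    // java.lang.Integer has no constructor taking an Object, so this dumps.
    findAndReport(classOf[java.lang.Integer], Array(new Object))
}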
[2017-09-15 00:01:29,745] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.api.csharp.CSharpRDD when called with 9 parameters ([Index=1, Type=JvmObjectReference, Value=8], [Index=2, Type=Byte[], Value=System.Byte[]], [Index=3, Type=JvmObjectReference, Value=5], [Index=4, Type=JvmObjectReference, Value=6], [Index=5, Type=Boolean, Value=False], [Index=6, Type=String, Value=CSharpWorker.exe], [Index=7, Type=String, Value=1.0], [Index=8, Type=JvmObjectReference, Value=7], [Index=9, Type=null, Value=null], )
[2017-09-15 00:01:29,745] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - java.lang.NoSuchMethodError: org.apache.spark.api.python.PythonFunction.<init>
[2017-09-15 00:01:29,747] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] - JVM method execution failed: Constructor failed for class org.apache.spark.api.csharp.CSharpRDD when called with 9 parameters ([Index=1, Type=JvmObjectReference, Value=8], [Index=2, Type=Byte[], Value=System.Byte[]], [Index=3, Type=JvmObjectReference, Value=5], [Index=4, Type=JvmObjectReference, Value=6], [Index=5, Type=Boolean, Value=False], [Index=6, Type=String, Value=CSharpWorker.exe], [Index=7, Type=String, Value=1.0], [Index=8, Type=JvmObjectReference, Value=7], [Index=9, Type=null, Value=null], )
[2017-09-15 00:01:29,747] [1] [ERROR] [Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge] -
at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod (System.Boolean isStatic, System.Object classNameOrJvmObjectReference, System.String methodName, System.Object[] parameters) [0x0005f] in <6f66514957744af8a393c7667e586f58>:0
[2017-09-15 00:01:29,747] [1] [ERROR] [Microsoft.Spark.CSharp.Examples.PiExample] - Error calculating Pi
[2017-09-15 00:01:29,747] [1] [ERROR] [Microsoft.Spark.CSharp.Examples.PiExample] - JVM method execution failed: Constructor failed for class org.apache.spark.api.csharp.CSharpRDD when called with 9 parameters ([Index=1, Type=JvmObjectReference, Value=8], [Index=2, Type=Byte[], Value=System.Byte[]], [Index=3, Type=JvmObjectReference, Value=5], [Index=4, Type=JvmObjectReference, Value=6], [Index=5, Type=Boolean, Value=False], [Index=6, Type=String, Value=CSharpWorker.exe], [Index=7, Type=String, Value=1.0], [Index=8, Type=JvmObjectReference, Value=7], [Index=9, Type=null, Value=null], )
[2017-09-15 00:01:29,748] [1] [ERROR] [Microsoft.Spark.CSharp.Examples.PiExample] -
at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallJavaMethod (System.Boolean isStatic, System.Object classNameOrJvmObjectReference, System.String methodName, System.Object[] parameters) [0x00144] in <6f66514957744af8a393c7667e586f58>:0
at Microsoft.Spark.CSharp.Interop.Ipc.JvmBridge.CallConstructor (System.String className, System.Object[] parameters) [0x00000] in <6f66514957744af8a393c7667e586f58>:0
at Microsoft.Spark.CSharp.Proxy.Ipc.SparkContextIpcProxy.CreateCSharpRdd (Microsoft.Spark.CSharp.Proxy.IRDDProxy prevJvmRddReference, System.Byte[] command, System.Collections.Generic.Dictionary`2[TKey,TValue] environmentVariables, System.Collections.Generic.List`1[T] pythonIncludes, System.Boolean preservesPartitioning, System.Collections.Generic.List`1[T] broadcastVariables, System.Collections.Generic.List`1[T] accumulator) [0x00094] in <6f66514957744af8a393c7667e586f58>:0
at Microsoft.Spark.CSharp.Core.PipelinedRDD`1[U].get_RddProxy () [0x0003c] in <6f66514957744af8a393c7667e586f58>:0
at Microsoft.Spark.CSharp.Core.RDD`1[T].Collect () [0x00000] in <6f66514957744af8a393c7667e586f58>:0
at Microsoft.Spark.CSharp.Core.RDD`1[T].Reduce (System.Func`3[T1,T2,TResult] f) [0x0002a] in <6f66514957744af8a393c7667e586f58>:0
at Microsoft.Spark.CSharp.Examples.PiExample.CalculatePiUsingAnonymousMethod (System.Int32 n, Microsoft.Spark.CSharp.Core.RDD`1[T] rdd) [0x00026] in <2098c8d21cdf4d3681deec7106c5f8a6>:0
at Microsoft.Spark.CSharp.Examples.PiExample.Main (System.String[] args) [0x00065] in <2098c8d21cdf4d3681deec7106c5f8a6>:0
[2017-09-15 00:01:29,748] [1] [INFO ] [Microsoft.Spark.CSharp.Core.SparkContext] - Stopping SparkContext
[2017-09-15 00:01:29,748] [1] [INFO ] [Microsoft.Spark.CSharp.Core.SparkContext] - Note that there might be error in Spark logs on the failure to delete userFiles directory under Spark temp directory (spark.local.dir config value in local mode)
[2017-09-15 00:01:29,749] [1] [INFO ] [Microsoft.Spark.CSharp.Core.SparkContext] - This error may be ignored for now. See https://issues.apache.org/jira/browse/SPARK-8333 for details
17/09/15 00:01:29 INFO ServerConnector: Stopped Spark@7bf6a51e{HTTP/1.1}{0.0.0.0:4040}
17/09/15 00:01:29 INFO SparkUI: Stopped Spark web UI at http://10.0.0.15:4040
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.BlockManager.disk.diskSpaceUsed_MB, value=0
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.BlockManager.memory.maxMem_MB, value=366
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.BlockManager.memory.memUsed_MB, value=0
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.BlockManager.memory.remainingMem_MB, value=366
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.DAGScheduler.job.activeJobs, value=0
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.DAGScheduler.job.allJobs, value=0
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.DAGScheduler.stage.failedStages, value=0
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.DAGScheduler.stage.runningStages, value=0
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.DAGScheduler.stage.waitingStages, value=0
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.PS-MarkSweep.count, value=2
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.PS-MarkSweep.time, value=60
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.PS-Scavenge.count, value=5
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.PS-Scavenge.time, value=71
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.heap.committed, value=607125504
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.heap.init, value=461373440
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.heap.max, value=954728448
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.heap.usage, value=0.16835208832072007
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.heap.used, value=160730528
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.non-heap.committed, value=80084992
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.non-heap.init, value=2555904
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.non-heap.max, value=-1
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.non-heap.usage, value=-7.7432448E7
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.non-heap.used, value=77432512
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Code-Cache.committed, value=13238272
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Code-Cache.init, value=2555904
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Code-Cache.max, value=251658240
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Code-Cache.usage, value=0.046904500325520834
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Code-Cache.used, value=11811840
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Compressed-Class-Space.committed, value=8126464
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Compressed-Class-Space.init, value=0
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Compressed-Class-Space.max, value=1073741824
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Compressed-Class-Space.usage, value=0.007335491478443146
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Compressed-Class-Space.used, value=7876424
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Metaspace.committed, value=58720256
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Metaspace.init, value=0
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Metaspace.max, value=-1
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Metaspace.usage, value=0.9837374005998883
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.Metaspace.used, value=57765312
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Eden-Space.committed, value=319815680
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Eden-Space.init, value=115867648
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Eden-Space.max, value=319815680
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Eden-Space.usage, value=0.3431463272845159
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Eden-Space.used, value=109743576
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Old-Gen.committed, value=268435456
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Old-Gen.init, value=307757056
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Old-Gen.max, value=716177408
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Old-Gen.usage, value=0.04624148099349149
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Old-Gen.used, value=33117104
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Survivor-Space.committed, value=18874368
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Survivor-Space.init, value=18874368
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Survivor-Space.max, value=18874368
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Survivor-Space.usage, value=0.9989191691080729
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.pools.PS-Survivor-Space.used, value=18853968
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.total.committed, value=687210496
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.total.init, value=463929344
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.total.max, value=954728447
17/09/15 00:01:29 INFO metrics: type=GAUGE, name=application_1505118143285_0040.driver.jvm.total.used, value=239750488
17/09/15 00:01:29 INFO metrics: type=COUNTER, name=application_1505118143285_0040.driver.HiveExternalCatalog.fileCacheHits, count=0
17/09/15 00:01:29 INFO metrics: type=COUNTER, name=application_1505118143285_0040.driver.HiveExternalCatalog.filesDiscovered, count=0
17/09/15 00:01:29 INFO metrics: type=COUNTER, name=application_1505118143285_0040.driver.HiveExternalCatalog.hiveClientCalls, count=0
17/09/15 00:01:29 INFO metrics: type=COUNTER, name=application_1505118143285_0040.driver.HiveExternalCatalog.parallelListingJobCount, count=0
17/09/15 00:01:29 INFO metrics: type=COUNTER, name=application_1505118143285_0040.driver.HiveExternalCatalog.partitionsFetched, count=0
17/09/15 00:01:29 INFO metrics: type=HISTOGRAM, name=application_1505118143285_0040.driver.CodeGenerator.compilationTime, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
17/09/15 00:01:29 INFO metrics: type=HISTOGRAM, name=application_1505118143285_0040.driver.CodeGenerator.generatedClassSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
17/09/15 00:01:29 INFO metrics: type=HISTOGRAM, name=application_1505118143285_0040.driver.CodeGenerator.generatedMethodSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
17/09/15 00:01:29 INFO metrics: type=HISTOGRAM, name=application_1505118143285_0040.driver.CodeGenerator.sourceCodeSize, count=0, min=0, max=0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0
17/09/15 00:01:29 INFO metrics: type=TIMER, name=application_1505118143285_0040.driver.DAGScheduler.messageProcessingTime, count=0, min=0.0, max=0.0, mean=0.0, stddev=0.0, median=0.0, p75=0.0, p95=0.0, p98=0.0, p99=0.0, p999=0.0, mean_rate=0.0, m1=0.0, m5=0.0, m15=0.0, rate_unit=events/second, duration_unit=milliseconds
17/09/15 00:01:30 INFO YarnClientSchedulerBackend: Interrupting monitor thread
17/09/15 00:01:30 INFO YarnClientSchedulerBackend: Shutting down all executors
17/09/15 00:01:30 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
17/09/15 00:01:30 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices (serviceOption=None, services=List(), started=false)
17/09/15 00:01:30 INFO YarnClientSchedulerBackend: Stopped
17/09/15 00:01:30 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
17/09/15 00:01:30 INFO MemoryStore: MemoryStore cleared
17/09/15 00:01:30 INFO BlockManager: BlockManager stopped
17/09/15 00:01:30 INFO BlockManagerMaster: BlockManagerMaster stopped
17/09/15 00:01:30 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/09/15 00:01:30 INFO SparkContext: Successfully stopped SparkContext
17/09/15 00:01:30 INFO CSharpBackend: Requesting to close all call back sockets.
17/09/15 00:01:30 INFO CSharpRunner: Closing CSharpBackend
17/09/15 00:01:30 INFO CSharpBackend: Requesting to close all call back sockets.
17/09/15 00:01:30 INFO CSharpRunner: Return CSharpBackend code 0
17/09/15 00:01:30 INFO Utils: Utils.exit() with status: 0, maxDelayMillis: 1000
17/09/15 00:01:30 INFO ShutdownHookManager: Shutdown hook called
17/09/15 00:01:30 INFO ShutdownHookManager: Deleting directory /tmp/spark-6d5c4806-b26a-4dd4-a4f4-b2b5432beccc
This is most likely due to an incompatibility between the Mobius and Spark versions. Make sure you are using the Mobius build that matches the Spark version you are running.
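The key line is the java.lang.NoSuchMethodError for the org.apache.spark.api.python.PythonFunction constructor: the log shows the CSharpRDD constructor failing while building a PythonFunction, and that constructor's parameter list has changed across Spark releases, so a Mobius jar built against one Spark version cannot resolve it on another. A minimal sketch for confirming what the cluster actually runs before picking a Mobius release (match the printed version against the Mobius release notes):

import org.apache.spark.sql.SparkSession

object VersionCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("VersionCheck").getOrCreate()
    // Mobius releases target specific Spark versions; a mismatch surfaces as
    // NoSuchMethodError at the JVM boundary, as in the log above.
    println(s"Spark version: ${spark.version}")
    spark.stop()
  }
}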