
No need to sc.stop with latest spark-cassandra-connector


If you add spark.cassandra.connection.host to spark-defaults.conf, you can skip the sc.stop step when using the spark-shell:

cat > spark-defaults.conf <<EOF
spark.master            spark://node0.pc.datastax.com:7077
spark.executor.memory   512m
spark.eventLog.enabled  true
spark.serializer        org.apache.spark.serializer.KryoSerializer
# the host value may also be a comma-separated list of nodes
spark.cassandra.connection.host node1.pc.datastax.com
EOF
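
To double-check that spark-shell picked the property up, you can query the implicit context's configuration. A quick sketch; the value shown assumes the conf above:

scala> sc.getConf.get("spark.cassandra.connection.host")
res0: String = node1.pc.datastax.com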

Now that spark.cassandra.connection.host is set, spark-shell adds it to the SparkContext automatically, which means the step below is no longer needed:

scala> sc.stop
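
For comparison, here is roughly what the old workflow looked like without the default set. This is only a sketch, assuming the same hosts as in the conf above: the default context is stopped and rebuilt by hand with the connection host configured.

scala> sc.stop
scala> import org.apache.spark.{SparkConf, SparkContext}
scala> val conf = new SparkConf(true).set("spark.cassandra.connection.host", "node1.pc.datastax.com")
scala> val sc = new SparkContext("spark://node0.pc.datastax.com:7077", "shell", conf)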

Finally, the value cassandra.connection.host is no longer valid; it has been replaced by the spark.cassandra.connection.host property.
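
With the property in place, the implicit sc can talk to Cassandra right away. A minimal sketch, assuming a keyspace test with a table kv (both made-up names) and the connector jar on the shell's classpath:

scala> import com.datastax.spark.connector._
scala> val rdd = sc.cassandraTable("test", "kv")
scala> rdd.count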

-Todd

tsindot · Aug 23 '14 01:08

Cool! Will test today. Thanks!

tobert · Aug 24 '14 20:08