SpatialSpark
Big Spatial Data Processing using Spark
I am migrating from Spark 1.3.0 to Spark 2.1.1. On Spark 1.3.0 I was able to create a SparkContext in Java as follows:

```
SparkConf conf = new SparkConf().setAppName(dsp.getAppName()).setMaster("local").set("spark.driver.allowMultipleContexts", "true");...
```
Is this library compatible with Spark 2.1.1?
It would be awesome to add Spark SQL wrapper functions to SpatialSpark, so it would be possible to write something like the complex spatial queries in Oracle Spatial: https://docs.oracle.com/database/122/SPATL/complex-spatial-queries.htm#SPATL-GUID-C4BE15BF-04AC-41EC-BEE1-F6DDA2A29E79 ``` SELECT c.city, sdo_nn_distance...
Hi, it would be nice if the join supported some kind of custom ids, because otherwise we need to broadcast all the data around if we want for...
Remove the `Wrapped` class to make the code clearer, since we can use the **tuple** itself, which already sorts well.
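To illustrate the point above: a plain tuple of keys can be sorted directly with a chained comparator, so a dedicated wrapper class adds no ordering capability. This is only a hedged Java sketch, not SpatialSpark's actual code; the `(partitionId, geometryId)` pair below is a hypothetical stand-in for whatever the `Wrapped` class currently carries.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class TupleSortSketch {
    // Sorts hypothetical (partitionId, geometryId) pairs without any wrapper
    // class, using a chained comparator on the raw tuple components.
    static List<String> sortedPairs() {
        List<long[]> pairs = new ArrayList<>(Arrays.asList(
            new long[]{2, 10},
            new long[]{1, 20},
            new long[]{1, 5}
        ));
        // Order by first component, then second -- the same ordering a
        // wrapper's compareTo would have to implement by hand.
        pairs.sort(Comparator.<long[]>comparingLong(p -> p[0])
                             .thenComparingLong(p -> p[1]));
        List<String> out = new ArrayList<>();
        for (long[] p : pairs) out.add(p[0] + "," + p[1]);
        return out;
    }

    public static void main(String[] args) {
        System.out.println(sortedPairs()); // lexicographic order of the pairs
    }
}
```

In Scala, where SpatialSpark itself is written, this is even simpler: tuples such as `(Long, Long)` get an implicit `Ordering` for free, so a `sortBy` on the tuple works with no extra code.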