Apache Sedona
                        A cluster computing framework for processing large-scale geospatial data
 
Click and play with the interactive Sedona Python Jupyter Notebook immediately!
Apache Sedona™ (incubating) is a cluster computing system for processing large-scale spatial data. Sedona equips cluster computing systems such as Apache Spark and Apache Flink with a set of out-of-the-box distributed Spatial Datasets and Spatial SQL that efficiently load, process, and analyze large-scale spatial data across machines.
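To give a concrete sense of what that looks like in practice, here is a minimal sketch of Spatial SQL on Spark through the Python binding (assuming the apache-sedona 1.x package; the app name, configuration, and geometries below are illustrative, not prescribed):

```python
# A minimal, illustrative sketch assuming the apache-sedona Python package
# (Sedona 1.x) and PySpark; the app name and geometries are made up.
from pyspark.sql import SparkSession
from sedona.register import SedonaRegistrator
from sedona.utils import SedonaKryoRegistrator, KryoSerializer

spark = (
    SparkSession.builder
    .appName("sedona-sql-sketch")
    .config("spark.serializer", KryoSerializer.getName)
    .config("spark.kryo.registrator", SedonaKryoRegistrator.getName)
    .getOrCreate()
)

# Register Sedona's spatial types and ST_* functions with Spark SQL.
SedonaRegistrator.registerAll(spark)

# Build geometries from WKT and evaluate a spatial predicate in plain SQL.
spark.sql("""
    SELECT ST_Contains(
        ST_GeomFromWKT('POLYGON ((0 0, 0 10, 10 10, 10 0, 0 0))'),
        ST_Point(5.0, 5.0)
    ) AS inside
""").show()
```

After registration, the ST_* constructors and predicates shown above are available to any Spark SQL query or DataFrame expression in the session.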
| Download statistics | Maven | PyPI | CRAN |
|---|---|---|---|
| Apache Sedona | 80k/month | | |
| Archived GeoSpark releases | 300k/month | | |
System architecture
Our users and code contributors are from ...
 
Modules in the source code
| Name | API | Introduction | 
|---|---|---|
| Core | Scala/Java | Distributed Spatial Datasets and Query Operators | 
| SQL | Spark RDD/DataFrame in Scala/Java/SQL | Geospatial data processing on Apache Spark | 
| Flink | Flink DataStream/Table in Scala/Java/SQL | Geospatial data processing on Apache Flink | 
| Viz | Spark RDD/DataFrame in Scala/Java/SQL | Geospatial data visualization on Apache Spark | 
| Python | Spark RDD/DataFrame in Python | Python wrapper for Sedona | 
| R | Spark RDD/DataFrame in R | R wrapper for Sedona | 
| Zeppelin | Apache Zeppelin | Plugin for Apache Zeppelin 0.8.1+ | 
Sedona supports several programming languages: Scala, Java, SQL, Python and R.
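As a sketch of how the SQL module is typically driven from one of these languages, the Python snippet below expresses a spatial join declaratively; the view names, columns, and coordinates are invented for the example, and the setup mirrors the registration step shown earlier:

```python
# A hedged, illustrative sketch of a spatial join via Sedona's Spatial SQL;
# assumes the apache-sedona Python package. All data and names are invented.
from pyspark.sql import SparkSession
from sedona.register import SedonaRegistrator

spark = SparkSession.builder.appName("sedona-join-sketch").getOrCreate()
SedonaRegistrator.registerAll(spark)

# Two small example tables: point geometries and polygon regions, built from WKT.
cities = spark.createDataFrame(
    [("Berlin", "POINT (13.4 52.5)"), ("Paris", "POINT (2.35 48.85)")],
    ["name", "wkt"],
).selectExpr("name", "ST_GeomFromWKT(wkt) AS geom")

regions = spark.createDataFrame(
    [("box_a", "POLYGON ((0 45, 0 55, 10 55, 10 45, 0 45))"),
     ("box_b", "POLYGON ((10 45, 10 55, 20 55, 20 45, 10 45))")],
    ["region", "wkt"],
).selectExpr("region", "ST_GeomFromWKT(wkt) AS geom")

cities.createOrReplaceTempView("cities")
regions.createOrReplaceTempView("regions")

# Join on a spatial predicate: which city falls inside which region.
spark.sql("""
    SELECT r.region, c.name
    FROM regions r JOIN cities c
    ON ST_Contains(r.geom, c.geom)
""").show()
```

Because ST_Contains is recognized by Sedona's Spark SQL extension, a join like this can be planned as a distributed spatial join rather than a plain Cartesian product.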
Compile the source code
Please refer to the Sedona website.
Contact
- Feedback to improve Apache Sedona: Google Form
- Twitter: Sedona@Twitter
- Sedona JIRA: Bugs, Pull Requests, and other similar issues
- [email protected]: project development, general questions, or tutorials
Please visit the Apache Sedona website for detailed information.