Hail on Spark 3.2
Hey, I was wondering when Hail will be on Spark 3.2?
So we already support building with Spark 3.2 if you build your own jar. We just use 3.1 for our PyPI release because, last time I checked, that's what Google, AWS, and Azure have their respective Spark images set to.
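For anyone who wants to try that, the from-source build looks roughly like the sketch below. It's based on Hail's from-source install instructions, not on this thread, so the exact make target, variables, and Scala/Spark versions are assumptions to check against your Hail release.

```sh
# Rough sketch: build and install Hail from source against Spark 3.2.x.
# The make target and variables mirror Hail's "install on a Spark cluster"
# docs; treat the exact names and version numbers as assumptions to verify.
git clone https://github.com/hail-is/hail.git
cd hail/hail
make install-on-cluster HAIL_COMPILE_NATIVES=1 SCALA_VERSION=2.12.15 SPARK_VERSION=3.2.1
```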
got it, thanks
Databricks is on Spark 3.2, and people are asking about upgrades.
Please comment here once Dataproc and EMR are on Spark 3.2 and Hail has a PyPI release built against 3.2; we can then update the Databricks documentation for Hail.
Hail now supports Spark 3.3.0.
@danking Are there any releases of Hail that support Spark 3.2? It seems like Hail went from 3.1 straight to 3.3, no builds hosted on GitHub support 3.2, and because pyspark is pinned to 3.3+ we'd need to fork Hail to install it in any project that uses Spark 3.2.
@zyd14 There are no releases of Hail that support Spark 3.2.x. We very closely follow Google Dataproc's release cycle.
If you want Spark 3.2.x support, you'll need to fork, edit the code to handle changes to the APIs of dependencies, and recompile.
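Very roughly, that flow looks like the sketch below. The placeholder fork URL, the make variables, and where the pyspark version bound lives are all assumptions to verify against the Hail release you start from.

```sh
# Assumption-heavy sketch of the fork-and-rebuild flow for Spark 3.2.x.
git clone https://github.com/<your-fork>/hail.git   # <your-fork> is a placeholder
cd hail/hail
# 1. Relax the pyspark version bound in the Python package requirements
#    (its exact location varies by release).
# 2. Patch any code that relies on Spark APIs that changed around 3.2.
# 3. Rebuild and install against Spark 3.2.x:
make install SCALA_VERSION=2.12.15 SPARK_VERSION=3.2.1
```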