
Use SparkSubmit to initialize JVM (maybe?)

Open exyi opened this issue 2 years ago • 4 comments

PySpark seems to start the JVM using the spark-submit script: https://github.com/apache/spark/blob/master/python/pyspark/java_gateway.py#L63. That has some benefits; specifically, I'm looking for an easy way to add dependencies using the spark.jars.packages config.
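For context, the same thing done through the stock CLI would look something like this (the Maven coordinate and script name are just placeholders):

spark-submit --packages com.example:some-artifact:1.0 my_app.py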

However, I don't know how they call Java methods after that... I think Spark.jl could call the SparkSubmit.main method using jcall, which should lead to basically the same behavior, except that the JVM would remain under Julia's control.
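Roughly what I have in mind is something like this (an untested sketch; the classpath, jar path, and package coordinate are placeholders):

using JavaCall
JavaCall.addClassPath("/path/to/spark/jars/*")  # placeholder; Spark's jars must be visible to the JVM
JavaCall.init()

# Invoke SparkSubmit.main in-process, mirroring a spark-submit command line,
# so the JVM stays under Julia's control.
JSparkSubmit = @jimport org.apache.spark.deploy.SparkSubmit
args = ["--master", "local",
        "--packages", "com.example:some-artifact:1.0",  # placeholder coordinate
        "/path/to/app.jar"]                             # placeholder primary resource
jcall(JSparkSubmit, "main", Nothing, (Vector{JString},), args)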

Honestly, I'm quite confused about how spark-submit works; maybe I'm just missing something obvious. I thought it might be possible to execute a Julia script using spark-submit after the dependencies are handled, but that does not work either :/

exyi avatar Jul 18 '21 19:07 exyi

So are you looking for a way to add custom JARs? If so, we have an add_jar function for SparkContext, and there should be a similar way to add jars to a SparkSession (you can call any Java method using JavaCall.jcall()).
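For example (a sketch; the jar path is a placeholder, and the constructor keyword assumes the usual Spark.jl setup):

using Spark
Spark.init()
sc = SparkContext(master="local")
add_jar(sc, "/path/to/custom.jar")  # placeholder path to a local jar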

dfdx avatar Jul 18 '21 19:07 dfdx

add_jar does not really cut it; the package has many dependencies and I'd really like Spark/Maven to load them for me. I could not find a method similar to addJar that would add packages :/

exyi avatar Jul 18 '21 20:07 exyi

You can try something like:

config = Dict("spark.jars.packages" => "...")
spark = SparkSession(..., config=config)

This should be equivalent to setting the same config via spark-submit.
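Filled in, that might look like the following (the Maven coordinate is a placeholder, and the master and appname keywords are assumptions about the current SparkSession constructor):

using Spark
Spark.init()

# spark.jars.packages asks Spark/Maven to resolve the package and its
# transitive dependencies at session startup.
config = Dict("spark.jars.packages" => "com.example:some-artifact:1.0")
spark = SparkSession(master="local", appname="MyApp", config=config)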

dfdx avatar Jul 18 '21 20:07 dfdx

I looked at spark-submit a few years ago when I worked on this package, and it seemed too complicated -- I did not really understand how it worked. The way we load the JVM here seemed easier and more appropriate to me.

aviks avatar Aug 06 '21 22:08 aviks