sbt-spark-package
Packaging a multi-module project
Hi!
I have a multi-module project here which I'm trying to publish as a Spark Package. It has 'core' and 'spark' modules, which would be bundled together in a jar and published.
The README doesn't seem to contain instructions for this -- what changes would be needed to package and publish a multi-module project?
Thanks! Anurag
If `spark` depends on `core`, and `core` can be found in a repository (Maven, Bintray, GitHub), you may simply use `spark/spPublish` in the sbt console. Please set all related keys (`sparkVersion`, `sparkComponents`, `spDependencies`, ...) under `spark`.
If you need more information on project scoping, please check out the sbt docs.
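For concreteness, here is a minimal `build.sbt` sketch of that layout. The organization, module names, `spName`, and version numbers are assumptions for illustration, not taken from your project:

```scala
// build.sbt -- hypothetical multi-module layout; names and versions are illustrative

lazy val commonSettings = Seq(
  organization := "com.example",     // assumed organization
  version := "0.1.0",
  scalaVersion := "2.11.8"
)

// Plain library module; publish it somewhere resolvable (Maven, Bintray, ...)
lazy val core = (project in file("core"))
  .settings(commonSettings: _*)
  .settings(name := "my-package-core")

// Spark Package module; all sbt-spark-package keys are scoped to this project
lazy val spark = (project in file("spark"))
  .dependsOn(core)
  .settings(commonSettings: _*)
  .settings(
    name := "my-package",
    spName := "example/my-package",                // Spark Packages name: <org>/<repo>
    sparkVersion := "2.1.0",
    sparkComponents += "sql",                      // adds spark-sql as a provided dependency
    spDependencies += "databricks/spark-csv:1.5.0" // example Spark Package dependency
  )
```

With something like this in place, running `spark/spPublish` from the sbt console publishes the `spark` module, with `core` resolved as an ordinary library dependency from whatever repository you published it to.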