spark-sql-perf

sbt package failed with unresolved dependency

Open haojinIntel opened this issue 4 years ago • 5 comments

I've installed sbt 0.13.15 in my environment. Running `sbt package` fails with the following exceptions: [screenshot] Has anyone met a similar issue, and how can I fix this problem?

haojinIntel · May 11 '21 14:05

It seems that some dependency repositories have shut down, so you have to manage the dependencies for this jar manually.

We had to remove the sbt-spark-package plugin by erasing it from project/plugins.sbt, and include the Spark dependencies manually. We changed a few things in the following files:

In build.sbt you must change this: [screenshot]

And this in project/plugins.sbt: [screenshot]

That worked for us!

Edit: I forgot to mention that we had to remove the sbt-spark-package plugin.
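The screenshots above aren't preserved in this thread. A rough sketch of the kind of build.sbt change being described might look like the following — the Spark/Scala version numbers and the exact module list are assumptions, not taken from the screenshots; adjust them to your environment:

```scala
// build.sbt — sketch only; versions below are placeholder assumptions.
// With the sbt-spark-package plugin removed (delete its addSbtPlugin line
// from project/plugins.sbt), declare the Spark modules the plugin used to
// pull in as "provided" dependencies, since the cluster supplies them.
name := "spark-sql-perf"

scalaVersion := "2.11.12" // assumption: match the Scala version of your Spark build

val sparkVersion = "2.4.0" // assumption: use your cluster's Spark version

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % sparkVersion % "provided",
  "org.apache.spark" %% "spark-sql"   % sparkVersion % "provided",
  "org.apache.spark" %% "spark-mllib" % sparkVersion % "provided"
)
```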

eavilaes · May 11 '21 15:05

@haojinIntel I forgot to mention that you must remove the sbt-spark-package plugin; I've edited the previous message 😄

eavilaes · May 12 '21 06:05

@evanye Where could I download those jars, and where should I put them? Sorry, I'm a beginner with sbt.

pingsutw · Jun 06 '21 20:06

> @evanye Where could I download those jars, and where should I put them? Sorry, I'm a beginner with sbt.

You must build the jars as explained in https://github.com/databricks/spark-sql-perf#build

eavilaes · Jun 07 '21 07:06

For anyone stumbling across this issue: it can be fixed by changing, in project/plugins.sbt, the line `resolvers += "Spark Packages repo" at "https://dl.bintray.com/spark-packages/maven/"` to `resolvers += "Spark Packages repo" at "https://repos.spark-packages.org/"`, as already noted in PRs #204 and #206.
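Written out as the resolver entry in project/plugins.sbt, that one-line fix is:

```scala
// project/plugins.sbt
// Bintray was shut down, so point the Spark Packages resolver at its new host:
resolvers += "Spark Packages repo" at "https://repos.spark-packages.org/"
```

After this change, sbt can resolve the sbt-spark-package plugin again without removing it, since the repository contents moved to the new URL.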

AlessandroPomponio · Aug 05 '21 14:08