spark-indexedrdd
dependency com.ankurdave#part_2.10;0.1:
Hi All,
When I added the IndexedRDD dependency to my sbt configuration file and compiled my Scala project, it kept giving me this error:
[error] sbt.ResolveException: unresolved dependency: com.ankurdave#part_2.10;0.1: not found
[error] Total time: 6 s, completed Sep 9, 2015 11:01:37 PM
What does this mean? Any ideas? Thank you! Hong
IndexedRDD depends on PART, which is stored in a custom repository. I think an update to Spark Packages might have caused it to stop adding that repository automatically along with the IndexedRDD package.
For now, you should be able to work around this by adding this repository to your build.sbt:
resolvers += "Repo at github.com/ankurdave/maven-repo" at "https://github.com/ankurdave/maven-repo/raw/master"
Hi Ankur,
Thank you for your prompt response! Yes, that solved my problem. BTW, I should have taken a look at your build.sbt before asking the question; it has some clues there.
Thank you! Hong
Thanks for reporting this! I wouldn't have noticed there was a problem otherwise.
I just released version 0.3, which should fix this problem and remove the need for the workaround.
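With 0.3, the extra resolver from the workaround should no longer be needed; a minimal sketch of the sbt configuration, assuming the standard Spark Packages repository:

  resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"
  libraryDependencies += "amplab" % "spark-indexedrdd" % "0.3"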
Thanks, it works smoothly right now.
Oh, actually I didn't clear the cache properly after applying the workaround, so the problem is still there.
Huh~, I didn't get the compile error, though.
@hywUMD That's probably because you now have PART in your cache. If you delete ~/.ivy2/cache/com.ankurdave, you would observe the error again.
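For anyone trying to reproduce this from a clean state, clearing that cache entry is a one-liner:

  rm -rf ~/.ivy2/cache/com.ankurdave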
I still have an issue using this package:
/opt/spark-1.5.1-bin-hadoop2.6# /opt/spark-1.5.1-bin-hadoop2.6/bin/spark-shell --packages amplab:spark-indexedrdd:0.3
Ivy Default Cache set to: /root/.ivy2/cache
The jars for the packages stored in: /root/.ivy2/jars
:: loading settings :: url = jar:file:/opt/spark-1.5.1-bin-hadoop2.6/lib/spark-assembly-1.5.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
amplab#spark-indexedrdd added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found amplab#spark-indexedrdd;0.3 in spark-packages
downloading http://dl.bintray.com/spark-packages/maven/amplab/spark-indexedrdd/0.3/spark-indexedrdd-0.3.jar ...
[SUCCESSFUL ] amplab#spark-indexedrdd;0.3!spark-indexedrdd.jar (243ms)
:: resolution report :: resolve 1219ms :: artifacts dl 245ms
:: modules in use:
amplab#spark-indexedrdd;0.3 from spark-packages in [default]
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 2 | 1 | 1 | 0 || 1 | 1 |
---------------------------------------------------------------------
:: problems summary ::
:::: WARNINGS
module not found: com.ankurdave#part_2.10;0.1
==== local-m2-cache: tried
file:/root/.m2/repository/com/ankurdave/part_2.10/0.1/part_2.10-0.1.pom
-- artifact com.ankurdave#part_2.10;0.1!part_2.10.jar:
file:/root/.m2/repository/com/ankurdave/part_2.10/0.1/part_2.10-0.1.jar
==== local-ivy-cache: tried
/root/.ivy2/local/com.ankurdave/part_2.10/0.1/ivys/ivy.xml
==== central: tried
https://repo1.maven.org/maven2/com/ankurdave/part_2.10/0.1/part_2.10-0.1.pom
-- artifact com.ankurdave#part_2.10;0.1!part_2.10.jar:
https://repo1.maven.org/maven2/com/ankurdave/part_2.10/0.1/part_2.10-0.1.jar
==== spark-packages: tried
http://dl.bintray.com/spark-packages/maven/com/ankurdave/part_2.10/0.1/part_2.10-0.1.pom
-- artifact com.ankurdave#part_2.10;0.1!part_2.10.jar:
http://dl.bintray.com/spark-packages/maven/com/ankurdave/part_2.10/0.1/part_2.10-0.1.jar
::::::::::::::::::::::::::::::::::::::::::::::
:: UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: com.ankurdave#part_2.10;0.1: not found
::::::::::::::::::::::::::::::::::::::::::::::
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.ankurdave#part_2.10;0.1: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1009)
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
@pazooki You can manually solve this by adding the repository and the dependency to your .sbt file:

  resolvers += "Repo at github.com/ankurdave/maven-repo" at "https://github.com/ankurdave/maven-repo/raw/master"
  libraryDependencies += "com.ankurdave" %% "part" % "0.1"
@pazooki Another workaround if you're using spark-shell or spark-submit is to invoke it with --repositories https://raw.githubusercontent.com/ankurdave/maven-repo/master.
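Combined with the --packages flag from the log above, the full invocation would look something like this:

  ./bin/spark-shell --packages amplab:spark-indexedrdd:0.3 --repositories https://raw.githubusercontent.com/ankurdave/maven-repo/master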
I will try it. Thanks. Would there be a difference if I'm using pyspark?
Nope; ./pyspark --repositories https://raw.githubusercontent.com/ankurdave/maven-repo/master ... should work.
With that command (./pyspark --repositories https://raw.githubusercontent.com/ankurdave/maven-repo/master ...), it initializes with a scala> prompt.
I get a Maven compilation error when using the latest Spark, which is built for Scala 2.11. Please help:
part_2.10-0.1.jar of MyProject build path is cross-compiled with an incompatible version of Scala (2.10.0). Unknown Scala Version Problem
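The _2.10 suffix in part_2.10-0.1.jar means the artifact was compiled for Scala 2.10 and cannot be loaded into a Scala 2.11 project. One possible workaround, sketched below under the assumption that your project can stay on Scala 2.10 (the exact patch version and resolver URL are assumptions), is to pin the Scala version in build.sbt:

  scalaVersion := "2.10.6"  // must match the _2.10 binary suffix of part_2.10-0.1.jar
  resolvers += "Spark Packages Repo" at "http://dl.bintray.com/spark-packages/maven"  // assumed
  libraryDependencies += "amplab" % "spark-indexedrdd" % "0.4.0"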
This is still an issue in 0.4.0. It would be most helpful if this could be bundled and published as 0.4.1. Can this be done please? I am happy to help.