
"Modules were resolved with conflicting cross-version suffixes"

Open froocpu opened this issue 4 years ago • 9 comments

I get the following error when I try to import dependencies with this build.sbt:

(update) Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, com.chuusai:shapeless, org.apache.spark:spark-sketch, org.apache.spark:spark-kvstore, org.json4s:json4s-ast, org.apache.spark:spark-catalyst, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.apache.spark:spark-sql, org.scala-lang.modules:scala-xml, org.json4s:json4s-jackson, org.typelevel:macro-compat, com.fasterxml.jackson.module:jackson-module-scala, org.scalanlp:breeze-macros, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.typelevel:machinist, org.json4s:json4s-scalap, org.scala-lang.modules:scala-parser-combinators, org.scalanlp:breeze, org.apache.spark:spark-tags, org.apache.spark:spark-core, org.apache.spark:spark-network-common

scalaVersion := "2.12.12"
val sparkVersion = "3.0.1"
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "com.amazon.deequ" % "deequ" % "1.1.0_spark-3.0-scala-2.12"

However, it does compile properly when I provide the following configuration:

scalaVersion := "2.11.8"
val sparkVersion = "2.4.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "com.amazon.deequ" % "deequ" % "1.1.0_spark-3.0-scala-2.12"

I'm building with sbt 1.2.8 and I've tried importing these dependencies with other versions of sbt, too. The behaviour also persists whether I use the IntelliJ sbt plugin or run sbt directly from the terminal.

I've also tried invalidating caches and restarting the IDE.

IntelliJ Community 2019.2.4, macOS Mojave 10.14.6

froocpu avatar Jan 19 '21 10:01 froocpu

@froocpu I was able to solve this error by excluding some of Deequ's transitive libraries that are cross-compiled with Scala 2.11:

libraryDependencies += ("com.amazon.deequ" % "deequ" % "1.1.0_spark-3.0-scala-2.12")
        .exclude("org.scalanlp", "breeze_2.11")
        .exclude("com.chuusai", "shapeless_2.11")
        .exclude("org.apache.spark", "spark-core_2.11")
        .exclude("org.apache.spark", "spark-sql_2.11")

hedibejaoui avatar Jan 25 '21 11:01 hedibejaoui

Although that compiled fine, our tests that rely on RelativeRateOfChangeStrategy fail with a NoClassDefFoundError for breeze:

[info]   java.lang.NoClassDefFoundError: breeze/linalg/DenseVector$
[info]   at com.amazon.deequ.anomalydetection.BaseChangeStrategy.detect(BaseChangeStrategy.scala:90)
[info]   at com.amazon.deequ.anomalydetection.BaseChangeStrategy.detect$(BaseChangeStrategy.scala:80)
[info]   at com.amazon.deequ.anomalydetection.RelativeRateOfChangeStrategy.detect(RelativeRateOfChangeStrategy.scala:36)
[info]   at com.amazon.deequ.anomalydetection.AnomalyDetector.detectAnomaliesInHistory(AnomalyDetector.scala:98)
[info]   at com.amazon.deequ.anomalydetection.AnomalyDetector.isNewPointAnomalous(AnomalyDetector.scala:60)
[info]   at com.amazon.deequ.checks.Check$.isNewestPointNonAnomalous(Check.scala:1126)
[info]   at com.amazon.deequ.checks.Check.$anonfun$isNewestPointNonAnomalous$1(Check.scala:433)
[info]   at scala.runtime.java8.JFunction1$mcZD$sp.apply(JFunction1$mcZD$sp.java:23)
[info]   at com.amazon.deequ.constraints.AnalysisBasedConstraint.runAssertion(AnalysisBasedConstraint.scala:108)
[info]   at com.amazon.deequ.constraints.AnalysisBasedConstraint.pickValueAndAssert(AnalysisBasedConstraint.scala:74)
[

leopasta avatar Jan 26 '21 18:01 leopasta

@leopasta Try adding the breeze library explicitly as a test dependency.
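
For reference, a minimal sketch of what that could look like in build.sbt; the breeze version below is an assumption, not something confirmed in this thread, so pick whichever Scala 2.12 build matches your Deequ release:

// Assumed breeze version; puts breeze.linalg.DenseVector on the test classpath.
libraryDependencies += "org.scalanlp" %% "breeze" % "0.13.2" % Test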

hedibejaoui avatar Jan 26 '21 23:01 hedibejaoui

I ran into a similar issue when using Gradle. I just excluded the Spark dependencies when pulling in Deequ:

    compile(group: 'com.amazon.deequ', name: 'deequ', version: '1.1.0_spark-3.0-scala-2.12'){
        exclude group: 'org.apache.spark', module: 'spark-core_2.11'
        exclude group: 'org.apache.spark', module: 'spark-sql_2.11'
    }

nathan-bennett avatar Feb 03 '21 02:02 nathan-bennett

On a related note, does anyone know why the 1.1.0 release, which even has "scala-2.12" in its name, depends on Spark built for Scala 2.11 on Maven, as shown here: https://mvnrepository.com/artifact/com.amazon.deequ/deequ/1.1.0_spark-3.0-scala-2.12 ?

piotrm0 avatar Feb 25 '21 03:02 piotrm0

Closing due to inactivity - please reopen if issues remain with the latest version.

lange-labs avatar Jun 09 '21 11:06 lange-labs

This is still an issue for me as well. I am unable to compile my Spark 3.0 / Scala 2.12 app against Deequ 1.1.

apython1998 avatar Jun 22 '21 22:06 apython1998

Same issue today. The exclusions suggested above are not working for me. Please reopen.

fernanluyano avatar Aug 04 '21 19:08 fernanluyano

In my case (Scala 2.12.13 and Spark 3.1.1), I did not need to exclude anything; I simply upgraded the Deequ dependency to the latest version built for Scala 2.12 (a fuller sketch follows the snippet below):

scalaVersion := "2.12.13"

....

"com.amazon.deequ" % "deequ" % "1.2.2-spark-3.0",

Mehdi-Bendriss avatar Aug 05 '21 14:08 Mehdi-Bendriss