Dmitri Bourlatchkov

Results: 91 comments by Dmitri Bourlatchkov

Please note that the `NessieCatalog` implementation from Iceberg must be used with the jars from that exact Iceberg version. Mixing in client jars from a different version is not supported.

For example, here's how to run `spark-sql`:

```
$ bin/spark-sql \
  --packages \
    org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.14.0 \
  --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions \
  --conf spark.sql.catalog.nessie=org.apache.iceberg.spark.SparkCatalog \
  --conf spark.sql.catalog.nessie.warehouse=$PWD/data2 \
  --conf spark.sql.catalog.nessie.catalog-impl=org.apache.iceberg.nessie.NessieCatalog \
  --conf spark.sql.catalog.nessie.uri=http://localhost:19120/api/v1 \
  ...
```
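Once the shell is up, a quick smoke test could look like this (a minimal sketch; the `demo` namespace and `tbl` table are made-up names, and depending on your Nessie version you may need to create the namespace explicitly first):

```
spark-sql> -- switch to the catalog configured above
spark-sql> USE nessie;
spark-sql> -- hypothetical names, just to verify the catalog round-trips
spark-sql> CREATE TABLE demo.tbl (id BIGINT, data STRING) USING iceberg;
spark-sql> INSERT INTO demo.tbl VALUES (1, 'one');
spark-sql> SELECT * FROM demo.tbl;
```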

@asheeshgarg : I _suspect_ that in your environment there's a jar mismatch on the Iceberg class path. Generally, Iceberg should be used only with its own official artifacts because it...

@asheeshgarg :

> I have kept the jar in the spark/jar folder

AFAIK, nothing needs to be (manually) added to Spark's jar directory (see the check below). Spark is able to download extensions...
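If a jar was copied into `$SPARK_HOME/jars` earlier, it can shadow the version that `--packages` downloads. A quick way to check for leftovers (a minimal sketch, assuming `SPARK_HOME` points at the Spark installation):

```
# list any Iceberg/Nessie jars manually dropped into Spark's jar directory;
# when relying on --packages, this should ideally print nothing
$ ls "$SPARK_HOME/jars" | grep -iE 'iceberg|nessie'
```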

@asheeshgarg : Does this doc help? https://projectnessie.org/tools/iceberg/spark/

Interesting :thinking: Could you describe your "ideal" setup? Something like a pre-packaged Spark installation directory (tar?) that can work with Iceberg and Nessie?

What Spark shell do you use? Scala or SQL?

The `OIDC Server is not available` error is inconsequential. It basically means that OIDC is not configured. Please ignore it. It is tracked by #5327

@asheeshgarg : I was able to reproduce your problem :hourglass:

The basic problem is #5363. It can happen even with the `packages` config option.