alan_rodriguez
My current workaround for this issue was to call `commands.getstatusoutput('kadmin -p {0} -kt {1} -q "xst -k keys/{2}.keytab {2}"'.format(app.config['PRINCIPAL'], app.config['KEYTAB'], user))` and then read the keytab. Although I resolved it this way, I don't...
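(For anyone on Python 3, where the `commands` module no longer exists, a rough equivalent of that call with `subprocess` might look like the sketch below; the principal, admin keytab, and `keys/` output directory are placeholders rather than the exact values from my app config.)

```python
import subprocess

def export_user_keytab(principal, admin_keytab, user):
    """Run kadmin's "xst" query to export a per-user keytab, then read it back.

    Sketch only: principal/admin_keytab come from app config in my setup,
    and the keys/ directory must already exist and be writable.
    """
    query = "xst -k keys/{0}.keytab {0}".format(user)
    result = subprocess.run(
        ["kadmin", "-p", principal, "-kt", admin_keytab, "-q", query],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        raise RuntimeError("kadmin failed: {0}".format(result.stderr.strip()))
    with open("keys/{0}.keytab".format(user), "rb") as fh:
        return fh.read()
```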
Facing the same issue, waiting for updates.
> @michael1991 just to check, are you also using a composite key? Can you post the table configuration? @ad1happy2go please check below: #Updated at 2024-02-27T07:34:03.809265Z #Tue Feb 27 07:34:03 UTC 2024...
> @michael1991 the above one is `hoodie.properties` and @ad1happy2go is asking for the table properties you used during table creation. thanks Thanks for the reminder, I'm using Dataproc 2.1 with Spark...
> I hit the same error when I try to use record indexing:
>
> ```
> hoodie.metadata.record.index.enable=true
> hoodie.index.type=RECORD_INDEX
> ```
>
> Are there additional configs/jars that are...
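Not an official answer, but for context this is roughly how those two options sit in a PySpark writer on my side; the table name, key fields, and GCS path below are made-up placeholders, and the Hudi Spark bundle still has to be on the classpath for any of it to work:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hudi-record-index-sketch").getOrCreate()

# Placeholder frame standing in for the real input data.
df = spark.createDataFrame([("k1", 1, "click")], ["uuid", "ts", "event_type"])

hudi_options = {
    "hoodie.table.name": "events",                      # placeholder
    "hoodie.datasource.write.recordkey.field": "uuid",  # placeholder
    "hoodie.datasource.write.precombine.field": "ts",   # placeholder
    # The record-level index needs the metadata table enabled on the writer too.
    "hoodie.metadata.enable": "true",
    "hoodie.metadata.record.index.enable": "true",
    "hoodie.index.type": "RECORD_INDEX",
}

(df.write.format("hudi")
   .options(**hudi_options)
   .mode("append")
   .save("gs://my-bucket/hudi/events"))  # placeholder path
```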
> @michael1991 can you add the value that you pass for `spark.executor.extraClassPath` and `spark.driver.extraClassPath`? so that I can try at my end as well. Sure @maheshguptags, since I'm using GCP...
Hey @maheshguptags, I just got inspired by the GCP Dataproc doc here: https://cloud.google.com/dataproc/docs/concepts/components/hudi Actually, I don't know why we need to pass the jar in extraClassPath; I guess maybe some classes...
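To make that concrete, these are the two properties in play; the bundle path below is a placeholder for whatever the Dataproc Hudi component installs on the image (check `/usr/lib/hudi` on your cluster), and since `extraClassPath` has to be set before the JVMs start, it normally goes into `spark-defaults.conf` or the job's submit-time properties rather than into the job code:

```python
# Placeholder path; the actual bundle name/version depends on the Dataproc image.
HUDI_BUNDLE = "/usr/lib/hudi/hudi-spark-bundle.jar"

# Passed at submit time (e.g. as Dataproc job properties), not set inside the job.
DATAPROC_JOB_PROPERTIES = {
    "spark.driver.extraClassPath": HUDI_BUNDLE,
    "spark.executor.extraClassPath": HUDI_BUNDLE,
}
```

In my case these map onto the `--properties` flag when submitting the job with `gcloud dataproc jobs submit`.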
We have only one writer here, but multiple readers. So we have one cleaner within the writer, right?
> yeah, can you check the metadata file state under `.hoodie` dir? Oh, I found some empty files there. And if I enable metadata on the reader side, a warning is thrown as...
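For clarity, this is all I do on the reader side when I say "enable metadata"; the table path is a placeholder:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hudi-metadata-read-sketch").getOrCreate()

# Placeholder base path; the warnings appear as soon as this option is turned on.
df = (
    spark.read.format("hudi")
    .option("hoodie.metadata.enable", "true")
    .load("gs://my-bucket/hudi/events")
)
df.show()
```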
- Because we are ingesting event log data without a record key; before 0.14, we had to set a record key for the writer. Meanwhile, on GCP Dataproc, Hudi v0.12.3 is the default version,...
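For reference, the pre-0.14 workaround I mean is just synthesizing a key on the writer, along these lines; the field names, table name, and path are placeholders:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import expr

spark = SparkSession.builder.appName("hudi-synthetic-key-sketch").getOrCreate()

# Event-log style data with no natural record key (placeholder frame).
events = spark.createDataFrame([("click", 1700000000)], ["event_type", "ts"])

# Synthesize a record key, since Hudi <= 0.13 requires one on the writer.
events = events.withColumn("event_id", expr("uuid()"))

(events.write.format("hudi")
   .option("hoodie.table.name", "event_log")                      # placeholder
   .option("hoodie.datasource.write.recordkey.field", "event_id")
   .option("hoodie.datasource.write.precombine.field", "ts")
   .mode("append")
   .save("gs://my-bucket/hudi/event_log"))                        # placeholder
```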