nleena123
@pratyakshsharma Hi, what is a stacktrace? I am not aware of it; can you please help with more details?
Hi @pratyakshsharma Please find attached the complete error log details: [metrics_job_logs.txt](https://github.com/apache/hudi/files/8667036/metrics_job_logs.txt)
@nsivabalan Can you help us? This is happening on our production server.
```properties
#hoodie.datasource.write.keygenerator.class=org.apache.hudi.keygen.CustomKeyGenerator
hoodie.datasource.write.hive_styling_partioning=true
hoodie.datasource.write.partitionpath.urlencode=false
#hoodie.datasource.write.keygenerator.class=org.apache.hudi.keygen.ComplexKeyGenerator
hoodie.datasource.write.recordkey.field=interaction_id
hoodie.datasource.write.partitionpath.field=create_datetime:TIMESTAMP
hoodie.deltastreamer.keygen.timebased.timestamp.type=DATE_STRING
hoodie.deltastreamer.keygen.timebased.input.dateformat=yyyyMMdd HH:mm
hoodie.deltastreamer.keygen.timebased.output.dateformat=yyyy/MM
hoodie.deltastreamer.keygen.timebased.timezone=UTC-6:00
hoodie.datasource.write.keygenerator.class=org.apache.hudi.keygen.CustomKeyGenerator
#20211119 13:39
#hoodie.index.type=GLOBAL_BLOOM
#hoodie.bloom.index.update.partition.path=true
# schema provider configs
#hoodie.deltastreamer.schemaprovider.registry.url=https://azure-prod-schema-registry.extscloud.com/subjects/async.messaging.interaction.metrics-value/versions/3
hoodie.deltastreamer.schemaprovider.registry.url=https://azure-prod-schema-registry.extscloud.com/subjects/async.messaging.interaction.metrics-value/versions/latest
# Kafka Source
hoodie.deltastreamer.source.kafka.topic=async.messaging.interaction.metrics
#Kafka props
#hoodie.auto.commit=true
enable.auto.commit=true
...
```
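For what it's worth, the timestamp settings above map each record's `create_datetime` (parsed with `yyyyMMdd HH:mm`) to a `yyyy/MM` partition path. A minimal stdlib-only sketch of that transformation, assuming the configured formats (this is only an illustration, not Hudi's actual `TimestampBasedKeyGenerator`, and it omits the `UTC-6:00` timezone adjustment):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class PartitionPathSketch {
    // Mirrors the configured key-generator formats (illustration only):
    //   hoodie.deltastreamer.keygen.timebased.input.dateformat=yyyyMMdd HH:mm
    //   hoodie.deltastreamer.keygen.timebased.output.dateformat=yyyy/MM
    static final DateTimeFormatter IN  = DateTimeFormatter.ofPattern("yyyyMMdd HH:mm");
    static final DateTimeFormatter OUT = DateTimeFormatter.ofPattern("yyyy/MM");

    // Turn a raw create_datetime string into the partition path value.
    static String partitionPath(String createDatetime) {
        return LocalDateTime.parse(createDatetime, IN).format(OUT);
    }

    public static void main(String[] args) {
        System.out.println(partitionPath("20211119 13:39")); // prints 2021/11
    }
}
```

So a record with `create_datetime=20211119 13:39` would land under the `2021/11` partition.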
Yes. We are using the DFAvroKafkaSource.java file below to read the data, which extends AvroSource, and DF3HoodieDeltaStreamer.java, which extends HoodieDeltaStreamer. The pom.xml file is attached in [code.zip](https://github.com/apache/hudi/files/8685199/code.zip); in the folder please have...
Hi @nsivabalan The attached property file contains all the configs we used for this job. We passed the below arguments to the Databricks job (we are running the Hudi job through Azure...
@pratyakshsharma I have changed it to hoodie.datasource.write.hive_style_partitioning=true and re-ran the job, but I can still see the same issue. I have not made any changes after 20211211183554__commit__COMPLETED.
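As background on what that flag does: `hoodie.datasource.write.hive_style_partitioning` only changes the *shape* of the partition path, prefixing the value with the field name. A minimal sketch of the difference (an illustration under that assumption, not Hudi's implementation):

```java
public class HiveStyleSketch {
    // Illustrates what hive-style partitioning toggles: with it on,
    // the partition directory is "field=value" instead of just "value".
    static String partitionPath(String field, String value, boolean hiveStyle) {
        return hiveStyle ? field + "=" + value : value;
    }

    public static void main(String[] args) {
        System.out.println(partitionPath("create_datetime", "2021/11", false)); // 2021/11
        System.out.println(partitionPath("create_datetime", "2021/11", true));  // create_datetime=2021/11
    }
}
```

Note that a misspelled key (such as `hive_styling_partioning`) is silently ignored by the writer, which is why correcting the spelling matters here.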
We are adding a new field, CDC_TS, to the Kafka data through code that is present in the DF3HoodieDeltaStreamer.java program.

**Code :-**

```java
static GenericRecord getDataWithCDC_TS(GenericRecord record, String ts) {
    Schema base = record.getSchema();
    List...
```
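Since the snippet above is truncated, here is a stdlib-only sketch of the copy-and-extend pattern such a method typically follows. A plain `Map` stands in for Avro's `GenericRecord`/`Schema` (which the real code uses), so this is only an outline of the idea, not the actual implementation:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CdcTsSketch {
    // Copy-and-extend pattern: build a new record that contains every
    // existing field plus the appended CDC_TS value. The original
    // record is left untouched.
    static Map<String, Object> withCdcTs(Map<String, Object> record, String ts) {
        Map<String, Object> out = new LinkedHashMap<>(record); // copy existing fields
        out.put("CDC_TS", ts);                                 // append the new field
        return out;
    }
}
```

With Avro the same pattern requires rebuilding the `Schema` with one extra `Schema.Field` before copying values into a new `GenericRecord`, since Avro schemas are immutable.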
Hi @xushiyan, @pratyakshsharma, I have been seeing this issue for a very long time. Can you please give an update, or suggest resolution steps?
Hi team, can you please update me on the above issue?