
kafka-long key (with avro value) is mis-parsed as avro type

Open uqix opened this issue 2 years ago • 15 comments

[screenshot: AKHQ showing the message with the mis-parsed key]

uqix · Oct 03 '21 07:10

Can you try with the dev version please? There is a fix for primitive Avro types.

tchiotludo · Oct 03 '21 07:10

Sorry for the ambiguity: the message key type is the Kafka built-in long (not an Avro long), and the value type is Avro.

uqix · Oct 03 '21 08:10

OK, but it's still a primitive; please try with the dev branch.

tchiotludo · Oct 03 '21 09:10

Will do later, thanks

uqix · Oct 03 '21 09:10

It seems that the Kafka long key is still parsed as an Avro long:

[screenshot: AKHQ still showing the key parsed as an Avro long]

uqix · Oct 04 '21 03:10

Maybe the UI could let us select key/value type?

uqix · Oct 04 '21 03:10

Long doesn't exist for Kafka keys since keys are always stored as bytes. If AKHQ displays the message this way, it means that the first bytes of your key match the special Schema Registry format for messages carrying a schema id (in your case 0).

Maybe you can try with kafkacat or kafka-console-consumer, but I'm pretty sure the same error will be shown by those tools, which would mean an incorrectly stored message on Kafka.

tchiotludo · Oct 04 '21 04:10
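
A minimal sketch of why this confusion can happen (editor's illustration, not AKHQ code): the Confluent Schema Registry wire format starts with the magic byte 0x00 followed by a 4-byte schema id, and a small long key serialized by LongSerializer as 8 big-endian bytes also starts with 0x00, so its first bytes look like a registry frame with schema id 0. The key value 42 below is just an example.

import java.nio.ByteBuffer;

// Editor's sketch: a plain Kafka long key serialized as 8 big-endian bytes
// (00 00 00 00 00 00 00 2A for 42) looks, at first glance, like a Schema
// Registry frame, whose layout is: magic byte 0x00 + 4-byte schema id + payload.
public class KeyAmbiguitySketch {
    public static void main(String[] args) {
        byte[] keyBytes = ByteBuffer.allocate(8).putLong(42L).array();

        boolean looksLikeRegistryFrame = keyBytes[0] == 0x00;            // true for small longs
        int apparentSchemaId = ByteBuffer.wrap(keyBytes, 1, 4).getInt(); // 0 here, as in the report above

        System.out.println("looks like a registry frame: " + looksLikeRegistryFrame);
        System.out.println("apparent schema id: " + apparentSchemaId);
    }
}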

Long keys are consumed correctly by Conduktor:

[screenshot: Conduktor displaying the long keys correctly]

uqix · Oct 04 '21 05:10

Can you try with the other tools I mentioned? Conduktor is closed source and I can't figure out how it handles this. Or can you provide a way to reproduce the issue? I don't know how to produce this kind of message, and I need the exact message on my side to fix it.

tchiotludo · Oct 04 '21 06:10

Maybe you can try with kafkacat or kafka-console-consumer, but I'm pretty sure the same error will be shown by those tools, which would mean an incorrectly stored message on Kafka.

I made some tests with kafkacat. The keys are unreadable (e.g. ��s) when no deserializers are provided, but you can select the deserializer you need for the key (e.g. -s key=q for a signed 64-bit integer) to consume the key/value, and it works.

tstuber · Oct 11 '21 12:10
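
For reference, kafkacat's -s key=q option reads the raw 8-byte key as a signed 64-bit integer, which is what Kafka's LongSerializer writes. A rough Java equivalent using Kafka's own LongDeserializer might look like the sketch below (editor's illustration; the topic name and key bytes are made-up placeholders).

import java.nio.ByteBuffer;
import org.apache.kafka.common.serialization.LongDeserializer;

// Editor's sketch: decode raw key bytes as a long, like kafkacat's -s key=q.
public class LongKeySketch {
    public static void main(String[] args) {
        byte[] rawKey = ByteBuffer.allocate(8).putLong(42L).array(); // example key bytes

        LongDeserializer longDeserializer = new LongDeserializer();
        System.out.println(longDeserializer.deserialize("example-topic", rawKey)); // prints 42
    }
}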

It seems that you need to provide us a unit test with a message produced in the wrong format to get a fix here. I just can't figure out what you are doing to produce this kind of message.

tchiotludo · Oct 13 '21 18:10

@tchiotludo FYI, here are our Spring Boot Kafka properties:

spring.kafka:
  properties:
    schema.registry.url: http://schema-registry.kafka
    specific.avro.reader: true
  consumer:
    keyDeserializer: org.apache.kafka.common.serialization.LongDeserializer
    valueDeserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
  producer:
    keySerializer: org.apache.kafka.common.serialization.LongSerializer
    valueSerializer: io.confluent.kafka.serializers.KafkaAvroSerializer

uqix · Oct 14 '21 02:10
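
For reproduction, a minimal producer matching the configuration above might look like the sketch below (editor's illustration; the broker address, topic name, schema and key value are made-up placeholders, only the serializers mirror the config).

import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Editor's sketch: produce a record with a plain Kafka long key and an Avro value.
public class LongKeyAvroValueProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("key.serializer", "org.apache.kafka.common.serialization.LongSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://schema-registry.kafka");

        // Made-up record schema; any Avro schema registered in the registry would do.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Example\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}");
        GenericRecord value = new GenericData.Record(schema);
        value.put("name", "hello");

        try (KafkaProducer<Long, GenericRecord> producer = new KafkaProducer<>(props)) {
            // The key is stored as a raw 8-byte long; only the value goes through Schema Registry.
            producer.send(new ProducerRecord<>("example-topic", 42L, value));
        }
    }
}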

I've made an attempt: https://github.com/tchiotludo/akhq/commit/c46c5de7958c94672366f486a9756b9067da4e2d. This doesn't work, since the double deserializer matches first (same byte length) and displays a false double value.

I have no clue how to determine the serde for a topic automatically for these standard types; we would need to add some per-topic configuration in order to find the right serde (like Conduktor does, manually).

tchiotludo · Oct 24 '21 20:10
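
To illustrate the ambiguity described above (editor's sketch): a long key and a double key are both exactly 8 bytes, so probing deserializers by byte length cannot tell them apart, and the same bytes decode to 42 as a long but to a meaningless denormal as a double.

import java.nio.ByteBuffer;

// Editor's sketch: the same 8 bytes read as a long and as a double.
public class LongVsDoubleSketch {
    public static void main(String[] args) {
        byte[] bytes = ByteBuffer.allocate(8).putLong(42L).array(); // example key bytes

        long asLong = ByteBuffer.wrap(bytes).getLong();       // 42
        double asDouble = ByteBuffer.wrap(bytes).getDouble(); // ~2.1E-322, a bogus value

        System.out.println(asLong + " vs " + asDouble);
    }
}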

@tchiotludo Should this issue be solved in 0.24.0?

haroldlbrown · Jul 25 '23 09:07

@haroldlbrown no, it's not fixed for now; PRs are welcome.

tchiotludo · Jul 31 '23 20:07