25 comments by JQ Zhu

Sorry for the ambiguity: the message key type is the Kafka built-in long (not Avro long), and the value type is Avro.

It seems the Kafka long key is still parsed as an Avro long: ![image](https://user-images.githubusercontent.com/10142314/135787530-f53424bb-83d9-4ca3-b20b-c0b4959b9709.png)

Maybe the UI could let us select the key/value types?

Long keys are consumed correctly by Conduktor: ![image](https://user-images.githubusercontent.com/10142314/135798246-9ec8e20d-7aed-4db8-853f-ce702cd4d3b5.png)

@tchiotludo FYI, here are our Spring Boot Kafka properties:

```yaml
spring.kafka:
  properties:
    schema.registry.url: http://schema-registry.kafka
    specific.avro.reader: true
  consumer:
    keyDeserializer: org.apache.kafka.common.serialization.LongDeserializer
    valueDeserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
  producer:
    keySerializer: org.apache.kafka.common.serialization.LongSerializer
    valueSerializer: io.confluent.kafka.serializers.KafkaAvroSerializer
```
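For context, a minimal plain-Java producer with the same serializer pairing would look roughly like the sketch below; the broker address, topic name, schema, and class name are placeholders for illustration, not taken from our project:

```java
// A minimal sketch (not our production code) showing the same serializer pairing:
// Kafka built-in long keys via LongSerializer, Avro values via Confluent's KafkaAvroSerializer.
import java.util.Properties;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class LongKeyAvroValueProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");               // placeholder broker
        props.put("schema.registry.url", "http://schema-registry.kafka");
        props.put("key.serializer", "org.apache.kafka.common.serialization.LongSerializer");
        props.put("value.serializer", "io.confluent.kafka.serializers.KafkaAvroSerializer");

        // Placeholder Avro schema; the real schema lives in the schema registry.
        Schema schema = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Example\",\"fields\":[{\"name\":\"name\",\"type\":\"string\"}]}");
        GenericRecord value = new GenericData.Record(schema);
        value.put("name", "test");

        try (KafkaProducer<Long, GenericRecord> producer = new KafkaProducer<>(props)) {
            // The key is written as a plain 8-byte Kafka long, not an Avro-encoded long.
            producer.send(new ProducerRecord<>("example-topic", 42L, value)); // placeholder topic
        }
    }
}
```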

From the README:

> In case your project contains a lot of files you might want to disable file monitoring via `lsp-enable-file-watchers` (you may use dir-locals).

Is `dir-locals` for project root...
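For anyone else wondering: as far as I understand, a `.dir-locals.el` placed at the project root applies to every file below it, so a minimal sketch for disabling the watchers project-wide would be:

```elisp
;; .dir-locals.el at the project root; the nil key applies these
;; directory-local variables to all major modes under this directory.
((nil . ((lsp-enable-file-watchers . nil))))
```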

Tips on adding cloud_provider for a private k8s cluster are posted here: https://github.com/hyperledger/bevel/issues/1001#issuecomment-1125908098

Any plans for this? I thought `decorator.datasource.datasource-proxy.count-query=true` would expose those metrics to Actuator/Prometheus.