logstash-codec-avro
Confluent Schema Registry as a schema uri
We keep our Avro schemas in Confluent's Schema Registry. It would be great if we could point `schema_uri` at the registry's API, but the registry returns the Avro schema as a nested JSON object under the key `schema`:
HTTP/1.1 200 OK
Content-Type: application/vnd.schemaregistry.v1+json
{
"schema": "{\"type\": \"string\"}"
}
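Unwrapping that response takes two JSON parses: one for the registry envelope, and one for the schema string it contains. A minimal sketch (the helper name `schema_from_registry_body` is hypothetical, not part of the codec):

```ruby
require 'json'

# Hypothetical helper: the registry returns {"schema": "<escaped JSON>"},
# so the body must be parsed once to unwrap the envelope and once more
# to turn the embedded string into the actual Avro schema document.
def schema_from_registry_body(body)
  JSON.parse(JSON.parse(body).fetch('schema'))
end

schema_from_registry_body('{"schema": "{\"type\": \"record\", \"name\": \"x\", \"fields\": []}"}')
```

The result could then be handed to `Avro::Schema.parse` (or re-serialized) instead of the flat schema document the codec currently expects at `schema_uri`.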
Thoughts on what would be the best way to support this?
I agree, this is needed for us as well. It looks like there's a new codec that lets you point to the Schema Registry in the Logstash input (https://github.com/revpoint/logstash-codec-avro_schema_registry). But we need a way in the Logstash output to send to Kafka using Avro with a schema_uri pointing to our Confluent Schema Registry. Is this possible, or is anyone working on this already?
@suyograo shouldn't this be implemented in revpoint's repository? https://github.com/revpoint/logstash-codec-avro_schema_registry/blob/master/lib/logstash/codecs/avro_schema_registry.rb#L90
+1. The Confluent Platform seems to be becoming popular, and the revpoint repository is unfortunately outdated. Would you consider creating your own Avro codec with Schema Registry support?
There are two problems:

1. The Schema Registry API returns the Avro schema nested inside a `schema` object, while the codec expects the schema document not to be nested.
2. Every message inside the Confluent Platform must have the following structure:
   `< magic byte > < schema id (4 bytes) > < Avro blob >`
   Currently the codec prefixes the Avro blob with neither the magic byte nor the schema id.
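The framing in point 2 is straightforward to produce and consume: a zero magic byte, the schema id as a 4-byte big-endian unsigned integer, then the raw Avro payload. A sketch of what the codec would need (the function names are illustrative, not existing codec API):

```ruby
# Confluent wire format: <magic byte 0> <schema id, 4-byte big-endian> <Avro blob>.
MAGIC_BYTE = 0

# Prefix an Avro-encoded payload with the magic byte and schema id.
def frame_confluent(schema_id, avro_blob)
  [MAGIC_BYTE, schema_id].pack('CN') + avro_blob
end

# Split a framed message back into [schema_id, avro_blob],
# rejecting messages that do not start with the magic byte.
def unframe_confluent(bytes)
  magic, schema_id = bytes.unpack('CN')
  raise 'unknown magic byte' unless magic == MAGIC_BYTE
  [schema_id, bytes[5..-1]]
end
```

On encode the codec would look up (or register) the schema in the registry to get the id, then call something like `frame_confluent`; on decode it would strip the 5-byte header and fetch the schema for the extracted id before deserializing.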
It would be really great to have this codec available. Unfortunately I know nothing about Ruby :/
+1
+1
Anyone working on this? I have the same issue as @malonej7. Were you able to solve or work around it?
@CBR09 just took a first crack at it.
https://github.com/revpoint/logstash-codec-avro_schema_registry/pull/5
Any update on supporting Avro encoding when outputting to Kafka?
Hi team, is there any update on making this official?