azure-functions-kafka-extension
Should review serialisation
Currently we support built-in serialisation for Avro and Protobuf. Avro support relies on Confluent.Kafka; Protobuf relies on google.protobuf.
Having serialisation built-in has the following advantages:
- Performance when using a language worker: it removes the need to send raw byte[] to the worker and do the deserialisation there
- Simplicity: the user needs very little code to get it going
Disadvantages:
- Opinionated: we use specific libraries for serialisation, and currently there is no way to inject a different one. Pinning to specific library versions can cause problems when building functions that depend on different versions of the same library.
@jeffhollan what's your take on this? Do you see an issue with depending on specific libraries like Avro and/or Protobuf for developers trying to build functions? Should we somehow allow developers to bring their own serialiser?
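To make the "bring your own serialiser" idea concrete, here is one way such a hook could be shaped. This is a minimal sketch, not the extension's actual API; the `KafkaDeserializer` interface and all class names are hypothetical.

```typescript
// Hypothetical pluggable deserialiser contract; names are illustrative,
// not part of the real azure-functions-kafka-extension API.
interface KafkaDeserializer<T> {
  deserialize(topic: string, payload: Uint8Array): T;
}

// Built-in default: pass the bytes through untouched ("raw" mode).
class RawDeserializer implements KafkaDeserializer<Uint8Array> {
  deserialize(_topic: string, payload: Uint8Array): Uint8Array {
    return payload;
  }
}

// A user-supplied deserialiser, e.g. decoding UTF-8 JSON into a plain object,
// standing in for an Avro or Protobuf decoder the user would plug in.
class JsonDeserializer implements KafkaDeserializer<unknown> {
  deserialize(_topic: string, payload: Uint8Array): unknown {
    return JSON.parse(new TextDecoder().decode(payload));
  }
}

// The trigger would pick the deserialiser from configuration and apply it
// to each incoming message before handing the value to the function.
function decodeMessage<T>(d: KafkaDeserializer<T>, payload: Uint8Array): T {
  return d.deserialize("my-topic", payload);
}
```

With an interface like this, the extension could ship Raw/Avro/Protobuf implementations as defaults while letting users register their own, avoiding the library-version pinning problem described above.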
I'm personally fine with the behavior of having it built in. The biggest question I have is on schema. I don't know if Avro and Protobuf require it, but I know you usually read as a specific "type", and I don't know how a JavaScript user could provide their .proto metadata to enable it. But maybe it's like JSON and we can read as a generic "JObject-like" thing and it's not a big deal. Any thoughts on whether more is required than just a switch between "raw | protobuf | Avro"?
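For a non-.NET worker, that switch might surface as a binding property. A hypothetical function.json sketch (the `dataType` property and its values are illustrative, not the extension's confirmed configuration surface):

```json
{
  "bindings": [
    {
      "type": "kafkaTrigger",
      "direction": "in",
      "name": "event",
      "topic": "users",
      "brokerList": "BrokerList",
      "dataType": "protobuf"
    }
  ]
}
```

In the "raw" case the function would receive bytes; in the "protobuf" or "avro" cases the host would deserialise before dispatch, which is where the open schema/metadata question for JavaScript users comes in.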