srclient
Schema name used for Avro-serialized data
Hi,
I would like to use this client to communicate with our Apache Kafka (MSK) cluster, which works differently from the Confluent variant. The main difference is that there is no direct interaction with the schema server during the producing process; only the name of the schema that was used when producing a message is added/sent as metadata:
event-type: test.user_activities
schema-id: test/user_activities/v0.0.1/schema.avsc
content-type: application/x-avro-json
{ -- avro json serialized data according to schema-id --}
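In Go, metadata like this would typically travel as Kafka record headers. A minimal sketch of building those headers, assuming a generic `Header` key/value type (real clients such as segmentio/kafka-go or confluent-kafka-go define an equivalent); the header names come from the example above, everything else is illustrative:

```go
package main

import "fmt"

// Header mirrors a Kafka record header (key/value pair), as defined by
// common Go Kafka clients.
type Header struct {
	Key   string
	Value []byte
}

// schemaHeaders builds the metadata headers described above: the event type,
// the schema name used during producing, and the payload content type.
func schemaHeaders(eventType, schemaID string) []Header {
	return []Header{
		{Key: "event-type", Value: []byte(eventType)},
		{Key: "schema-id", Value: []byte(schemaID)},
		{Key: "content-type", Value: []byte("application/x-avro-json")},
	}
}

func main() {
	hs := schemaHeaders("test.user_activities", "test/user_activities/v0.0.1/schema.avsc")
	for _, h := range hs {
		fmt.Printf("%s: %s\n", h.Key, h.Value)
	}
}
```

The headers would then be attached to each produced record alongside the Avro-JSON payload.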
There is a Python package which can be used with this Kafka cluster, but I prefer to use Go. A snippet from the Python producer, to make it clearer:
data = {"message": "Hello world!"}
topic = "helloworld"
async def produce():
    for _ in range(5):
        # Serialize the data with APPLICATION_X_AVRO_BINARY
        metadata = await stream_engine.send(
            topic, value=data, event_type="hello_world.test", schema_id="example/hello_world/v0.0.1/schema.avsc"
        )
        print(f"Message sent: {metadata}")
        # Serialize the data with APPLICATION_X_AVRO_JSON
        metadata = await stream_engine.send(
            topic,
            value=data,
            event_type="hello_world.test",
            schema_id="example/hello_world/v0.0.1/schema.avsc",
            serialization_type=consts.APPLICATION_X_AVRO_JSON,
        )
        print(f"Message sent: {metadata}")
So, if someone reads this: would this be possible with this package? Or are there other Go packages that are more suitable? Or am I really stuck with Python ;-)
only the name of the schema that was used when producing a message is added/sent as metadata
From what I understand, it sounds like you should be using the CloudEvents SDK, which can optionally be used with Kafka, does not dictate which serialization format you use, and does not depend on a Schema Registry as this repo does.
Closing the issue as it doesn't seem related to this module.