confluent-kafka-python
Confluent's Kafka Python Client
Description =========== Schema validation is coupled with object validation in **JSONSerializer.__call__()**. Every time an object is serialized, the schema (`JSONSerializer._parsed_schema`) is validated alongside the object dict validation: https://github.com/confluentinc/confluent-kafka-python/blob/baf71ea0ed54c71948208bfc5c352f4ee57054dd/src/confluent_kafka/schema_registry/json_schema.py#L267...
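The decoupling the issue asks for can be sketched like this: validate the schema once at construction time, and validate only the object on each call. This is a minimal illustrative sketch, not the library's API; `MiniJSONSerializer`, `_check_schema`, and `_check_object` are hypothetical names (the real code validates with `jsonschema` against `_parsed_schema`).

```python
import json

class MiniJSONSerializer:
    def __init__(self, schema_str):
        self._parsed_schema = json.loads(schema_str)
        # Schema validation happens once, here, instead of on every __call__.
        self._check_schema(self._parsed_schema)

    @staticmethod
    def _check_schema(schema):
        # Stand-in for validating the schema document itself.
        if not isinstance(schema, dict) or "type" not in schema:
            raise ValueError("invalid schema")

    def _check_object(self, obj):
        # Stand-in for per-object validation against the parsed schema.
        if self._parsed_schema["type"] == "object" and not isinstance(obj, dict):
            raise ValueError("object does not match schema")

    def __call__(self, obj):
        self._check_object(obj)  # only the object is validated per call
        return json.dumps(obj).encode("utf-8")

ser = MiniJSONSerializer('{"type": "object"}')
payload = ser({"name": "alice"})
```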
I was wondering whether there is existing support, or plans to add support, for faster Python JSON libraries (e.g. [orjson](https://github.com/ijl/orjson), [rapidjson](https://github.com/python-rapidjson/python-rapidjson)) instead of the standard-library `json` module for serialization...
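One common way to make such a backend pluggable is to prefer a faster library when it is installed and fall back to the stdlib otherwise. The sketch below assumes nothing about the library's internals; `_dumps` and `_loads` are illustrative names, and `orjson` is treated as an optional dependency.

```python
# Prefer orjson if available; fall back to the standard library.
try:
    import orjson

    def _dumps(obj) -> bytes:
        return orjson.dumps(obj)

    def _loads(data: bytes):
        return orjson.loads(data)
except ImportError:
    import json

    def _dumps(obj) -> bytes:
        # Compact separators roughly match orjson's output format.
        return json.dumps(obj, separators=(",", ":")).encode("utf-8")

    def _loads(data: bytes):
        return json.loads(data)

encoded = _dumps({"a": 1})
decoded = _loads(encoded)
```

Either branch produces `bytes` and round-trips the same data, so calling code does not need to know which backend is active.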
Description =========== Not an issue but a possible enhancement request. Currently the protobuf and JSON deserializers do not interact with the schema registry (because they don't need to). However, it...
Description =========== https://github.com/confluentinc/confluent-kafka-python/blob/master/confluent_kafka/schema_registry/protobuf.py#L315 `_known_subjects` is never populated. The `ProtobufSerializer` has a cache for subjects it has already registered, but it is never populated in the code. As such, the serializer does a...
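The intended caching behavior can be sketched as follows: register a schema only the first time a subject is seen, and add the subject to the cache afterwards (the step the issue says is missing). `RegistryStub` and `CachingSerializer` are hypothetical stand-ins for the schema-registry client and the serializer, not the library's classes.

```python
class RegistryStub:
    """Counts registration calls so the caching effect is observable."""
    def __init__(self):
        self.register_calls = 0

    def register_schema(self, subject, schema):
        self.register_calls += 1
        return 1  # pretend schema id

class CachingSerializer:
    def __init__(self, registry):
        self._registry = registry
        self._known_subjects = set()  # populated on first use

    def serialize(self, subject, schema, payload: bytes) -> bytes:
        if subject not in self._known_subjects:
            self._registry.register_schema(subject, schema)
            self._known_subjects.add(subject)  # the missing step in the issue
        return payload

reg = RegistryStub()
ser = CachingSerializer(reg)
for _ in range(3):
    ser.serialize("topic-value", "{}", b"msg")
# register_schema is hit only once; later calls are served from the cache
```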
Description =========== If someone wants the schema ID of a message that a consumer is going to consume (and then, after some transformation, re-serialize), there should be a way to obtain that schema ID...
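The schema ID is already present in every message framed with the Confluent wire format: a magic byte (`0`) followed by a 4-byte big-endian schema ID, then the serialized data. A consumer can read the ID straight from the raw bytes; `extract_schema_id` below is an illustrative helper, not part of the library.

```python
import struct

def extract_schema_id(payload: bytes) -> int:
    """Read the schema ID from a Confluent-wire-format message."""
    if len(payload) < 5 or payload[0] != 0:
        raise ValueError("not in Confluent wire format")
    # Bytes 1-4 are the schema ID, big-endian unsigned int.
    return struct.unpack(">I", payload[1:5])[0]

# Build a fake message framed with schema ID 42.
message = struct.pack(">bI", 0, 42) + b"serialized-data"
schema_id = extract_schema_id(message)  # -> 42
```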
I want to integrate the Python Kafka protobuf deserializer with Apache Spark Streaming. Could you help me solve this problem, please?
Description =========== Looking into the source code for Protobuf, it seems it is not possible to deserialize a protobuf message without its corresponding static message type argument. When it comes...
[Package documentation](https://docs.confluent.io/current/clients/confluent-kafka-python/index.html#avroproducer-legacy) marks `AvroProducer` and `AvroConsumer` as legacy. But, [README.md](https://github.com/confluentinc/confluent-kafka-python/blob/624cdb8fe459b8910410f379cf2658c8d096436b/README.md#usage) still prominently advertises these legacy classes. This may be confusing for new users. Please update README.md.
Description =========== I'll try to explain using an example instead: 1. I have an Avro producer/consumer, the consumer uses `latest` as `auto.offset.reset`. The schema registry is using `backward` as compatibility...
Description =========== I'm using `AvroDeserializer` to deserialize records from a topic and I'm running into issues when confluent-kafka-python calls fastavro in the deserialization function:

```python
obj_dict = schemaless_reader(payload, writer_schema, self._reader_schema)
```
...