confluent-kafka-python
Confluent's Kafka Python Client
What ---- Checklist ------------------ - [ ] Contains customer facing changes? Including API/behavior changes - [ ] Did you add sufficient unit test and/or integration test coverage for this PR?...
Description =========== I have an intermittent problem while committing offsets. Here is the consumer code: ```python import time from confluent_kafka import Consumer, KafkaError from typing import Callable, Any, List from dags.pkg.proceed_data import...
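Since the snippet above is truncated, here is a minimal sketch of the manual-commit pattern the report appears to describe. The broker address, group id, and topic are placeholders, not values from the issue; it requires a running broker to execute.

```python
from confluent_kafka import Consumer, KafkaException

# Placeholder connection details -- adjust for your cluster.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "example-group",
    "enable.auto.commit": False,          # commit manually after processing
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["example-topic"])

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            raise KafkaException(msg.error())
        # ... process msg.value() here ...
        # A synchronous commit raises on failure immediately, which makes
        # intermittent commit errors much easier to pin down than the
        # default fire-and-forget asynchronous commit.
        consumer.commit(message=msg, asynchronous=False)
finally:
    consumer.close()
```

The synchronous commit trades throughput for a clear error signal, which is usually the right choice while debugging a report like this one.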
Description =========== After upgrading from 0.11.4 to 1.0.0, I noticed that `assignment()` has begun returning an empty list. Am I missing something? How to reproduce ================ Here is a simple script...
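One hedged explanation: `assignment()` reflects the partitions handed out by the consumer-group rebalance, and the rebalance only completes once the consumer polls. A sketch of that timing (broker and topic names are placeholders, and it needs a live broker to run):

```python
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",  # placeholder broker
    "group.id": "assignment-demo",
})

def on_assign(consumer, partitions):
    # Fires once the rebalance completes, i.e. only after poll() has run.
    print("assigned:", partitions)

consumer.subscribe(["example-topic"], on_assign=on_assign)
print(consumer.assignment())  # likely [] -- no rebalance has happened yet

# Drive the rebalance by polling; assignment() fills in afterwards.
while not consumer.assignment():
    consumer.poll(timeout=1.0)
print(consumer.assignment())  # now lists the assigned TopicPartitions
consumer.close()
```

If 0.11.4 happened to return a populated list at the same point in a script, that would have been an accident of timing rather than a guarantee.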
Hello team! **Problem** The official [example shows](https://github.com/confluentinc/confluent-kafka-python/blob/b30765ecacb0adf1789749e7aeda5377ec6021cf/examples/asyncio_avro_producer.py#L49) that `delivery_future` should resolve without calling `flush()`: ``` from confluent_kafka.aio import AIOProducer # Note: stable module, not experimental delivery_future = await producer.produce(args.topic, value=serialized_value) msg...
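A sketch mirroring the linked example's usage, for context. Whether the trailing `flush()` is required is exactly the open question here; the broker address and topic are placeholders, and the `confluent_kafka.aio` API is followed as quoted above rather than independently verified.

```python
import asyncio
from confluent_kafka.aio import AIOProducer  # asyncio producer, as in the linked example

async def main():
    producer = AIOProducer({"bootstrap.servers": "localhost:9092"})  # placeholder
    # Per the linked example, awaiting produce() should resolve once
    # delivery is confirmed -- the report observes that in practice it
    # does not resolve unless flush() is also called.
    msg = await producer.produce("example-topic", value=b"hello")
    print(msg.topic(), msg.offset())
    await producer.flush()  # the workaround under discussion

asyncio.run(main())
```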
What ---- FIPS 140-3 (the newest FIPS standard) requires a newer OpenSSL version, so this updates the documentation around that. I also went through the Schema Registry dependencies and believe they're still compliant. Probably need a +1 from clients...
I know we're using an old version, but we can't upgrade due to dropped CentOS support. Really weird behaviour. What I do is roughly: ```python consumer = make_consumer(enable_auto_commit=False) consumer.subscribe(my_topic) while True:...
https://github.com/confluentinc/confluent-kafka-python/blob/138517e8fdd1353146801fa72871d36a6ac5ae35/requirements/requirements-schemaregistry.txt#L3 With httpx==0.28.1 I get this error: `[SSL: SSLV3_ALERT_BAD_CERTIFICATE] sslv3 alert bad certificate`. There is no error after downgrading to 0.27.2.
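Until the root cause is found, a hard pin on the last working httpx release (the version numbers are taken from the report above) may serve as a workaround:

```shell
pip install 'httpx==0.27.2'
```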
When I set up a Schema Registry client with OAuth2, I must pass `bearer.auth.logical.cluster` and `bearer.auth.identity.pool.id`. However, I'm running Schema Registry self-hosted, so I don't know what those values are.
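For context, a sketch of the client configuration in question. Every value below is a placeholder; logical cluster and identity pool are Confluent Cloud concepts, and whether a self-hosted registry accepts arbitrary strings for them is exactly what this report asks.

```python
from confluent_kafka.schema_registry import SchemaRegistryClient

# Placeholder values throughout -- replace with your own endpoints/credentials.
client = SchemaRegistryClient({
    "url": "https://schema-registry.internal:8081",
    "bearer.auth.credentials.source": "OAUTHBEARER",
    "bearer.auth.issuer.endpoint.url": "https://idp.example.com/token",
    "bearer.auth.client.id": "my-client",
    "bearer.auth.client.secret": "my-secret",
    "bearer.auth.scope": "schema-registry",
    # The two fields at issue: required by the client, but Confluent
    # Cloud-specific, with no obvious self-hosted equivalent.
    "bearer.auth.logical.cluster": "lsrc-placeholder",
    "bearer.auth.identity.pool.id": "pool-placeholder",
})
```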
With the release of Python 3.14, free-threaded support (i.e., GIL-less Python) is now a sensible option. However, third-party packages that use C extensions, like confluent_kafka, need to have code...
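As a quick way to tell which build an interpreter is, CPython 3.13+ exposes `sys._is_gil_enabled()`; the helper name `gil_status` below is ours, not part of any library, and the probe's absence is treated as a standard (GIL) build:

```python
import sys

def gil_status() -> str:
    """Report whether this interpreter is running free-threaded (no GIL)."""
    checker = getattr(sys, "_is_gil_enabled", None)
    if checker is None:
        # Builds predating the free-threading work never expose the probe.
        return "gil"
    return "gil" if checker() else "free-threaded"

print(gil_status())
```

A C extension must also declare free-threaded compatibility (via `Py_mod_gil`) before a no-GIL interpreter will load it without re-enabling the GIL, which is the work this issue is about.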