
Commit of offsets threw an unexpected exception

Open · PedroEFLourenco opened this issue Jul 22 '19 · 2 comments

I am streaming data from Kafka to BigQuery and it goes fine for the majority of topics, but on a few of them I get this type of error:

ERROR WorkerSinkTask{id=connector_name-0} Offset commit failed, rewinding to last committed offsets (org.apache.kafka.connect.runtime.WorkerSinkTask:385)
ERROR WorkerSinkTask{id=connector_name-0} Commit of offsets threw an unexpected exception for sequence number 28: null (org.apache.kafka.connect.runtime.WorkerSinkTask:260)

Apart from that, the topic defined by offset.storage.topic is empty, while the topics defined for status and configs are populated.

We are on Kafka v2.0.0. This is the overall configuration for the connectors:

  {
    "connector.class": "com.wepay.kafka.connect.bigquery.BigQuerySinkConnector",
    "type.name": "default",
    "autoCreateTables": "true",
    "tasks.max": "1",
    "group.id": "group1-production",
    "schemaRegistryLocation": "<some_url>",
    "project": "<some_project>",
    "datasets": ".*=<some_dataset>",
    "internal.key.converter.schemas.enable": "false",
    "offset.storage.topic": "connect-offsets-production",
    "value.converter": "io.confluent.connect.avro.AvroConverter",
    "key.converter": "io.confluent.connect.avro.AvroConverter",
    "sanitizeTopics": "true",
    "config.storage.topic": "connect-configs-production",
    "status.storage.topic": "connect-statuses-production",
    "internal.key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schema.registry.url": "<another_url>",
    "task.class": "com.wepay.kafka.connect.bigquery.BigQuerySinkTask",
    "keyfile": "<key_file_location>",
    "internal.value.converter.schemas.enable": "false",
    "internal.value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "schemaRetriever": "com.wepay.kafka.connect.bigquery.schemaregistry.schemaretriever.SchemaRegistrySchemaRetriever",
    "key.converter.schema.registry.url": "<another_url>"
  }
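For anyone reproducing this setup, a config map like the one above is normally applied through the Kafka Connect REST API. A minimal sketch, assuming the worker's REST endpoint is at localhost:8083, the connector is named connector_name, and the JSON above is saved as connector-config.json (all three names are illustrative):

  # create or update the connector with the config map above
  curl -X PUT -H "Content-Type: application/json" \
       --data @connector-config.json \
       http://localhost:8083/connectors/connector_name/config

PUT /connectors/{name}/config creates the connector if it does not exist and updates its configuration otherwise.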

I have already tried the following options, without success:

  "bufferSize": "1000"
  "maxWriteSize": "100"
  "tableWriteWait": "3000"
  "offset.flush.timeout.ms": "20000"
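Worth noting that group.id, offset.storage.topic, config.storage.topic, status.storage.topic, the internal.*.converter settings and offset.flush.timeout.ms are worker-level properties of a distributed Connect cluster rather than connector properties, so setting them in a connector config normally has no effect. A minimal sketch of where they would live, with values taken from the config above; the file name and broker list are illustrative assumptions:

  # connect-distributed.properties -- worker-level settings
  bootstrap.servers=<broker_list>
  group.id=group1-production
  offset.storage.topic=connect-offsets-production
  config.storage.topic=connect-configs-production
  status.storage.topic=connect-statuses-production
  # how long the worker waits for an offset flush before giving up
  offset.flush.timeout.ms=20000
  key.converter=io.confluent.connect.avro.AvroConverter
  value.converter=io.confluent.connect.avro.AvroConverter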

PedroEFLourenco · Jul 22 '19 12:07

@PedroEFLourenco Were you able to find a solution for this?

atrbgithub · Jun 24 '20 08:06

In my case, I reverted to connector version 1.6.1 from 1.6.6 (latest as of now) and it fixed the issue.

skumarlabs · Dec 16 '20 16:12
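For anyone who wants to try the downgrade to 1.6.1 mentioned above, one way to pin the connector version is via the Confluent Hub client, assuming the plugin was installed from Confluent Hub under its usual component ID (verify the ID and install path for your environment):

  confluent-hub install wepay/kafka-connect-bigquery:1.6.1

Alternatively, replace the connector jars on the worker's plugin.path with the 1.6.1 release by hand and restart the workers.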