kafka-connect-bigquery
Timestamp conversion issue for BigQuery connector
I am using schemaless JSON with the BigQuery connector, and I store the event time as epoch milliseconds:
```json
{
  "metadata": {
    "eventTime": 1587677792225,
    ...
  }
}
```
I am getting the following error while sinking the data:

```
[row index 9]: invalid: Timestamp field value is out of range:1587677792225999872
```
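For context, the magnitude of the rejected value suggests a scale mismatch rather than a bad timestamp: the millisecond value from the payload appears to have been scaled by 10^6 (as if it were seconds) instead of 10^3, pushing it past BigQuery's maximum TIMESTAMP (9999-12-31). A quick sketch of the arithmetic (the exact cause of the mis-scaling is an assumption; the upper bound comes from BigQuery's documented TIMESTAMP range):

```python
# Epoch milliseconds taken from the payload above.
event_time_ms = 1_587_677_792_225

# BigQuery TIMESTAMP upper bound, 9999-12-31 23:59:59.999999 UTC,
# expressed as microseconds since the Unix epoch.
BQ_MAX_MICROS = 253_402_300_799_999_999

# Correct conversion: milliseconds -> microseconds (x 1_000).
as_micros = event_time_ms * 1_000
assert as_micros <= BQ_MAX_MICROS  # in range

# What the error suggests happened: the value was scaled as if it
# were seconds (x 1_000_000), overflowing the valid range -- note the
# rejected value 1587677792225999872 is roughly event_time_ms * 10^6.
misscaled = event_time_ms * 1_000_000
assert misscaled > BQ_MAX_MICROS  # out of range
```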
To work around this I tried adding a Cast transform:

```json
"transforms": "Cast",
"transforms.Cast.type": "org.apache.kafka.connect.transforms.Cast$Value",
"transforms.Cast.spec": "metadata.eventTime:int64"
```
I also tried the TimestampConverter SMT:

```json
"transforms": "TimestampConverter",
"transforms.TimestampConverter.field": "metadata.eventTime",
"transforms.TimestampConverter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
"transforms.TimestampConverter.target.type": "Timestamp"
```
But it seems Kafka Connect simply ignores nested fields.
Is this expected behaviour, and is there any alternative?
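For what it's worth, the stock Cast and TimestampConverter SMTs have historically operated only on top-level fields, which would explain why the nested `metadata.eventTime` is silently left untouched. One possible workaround (a sketch only, not verified against this connector) is to flatten the record first with the built-in Flatten SMT, so the nested value becomes a top-level field named `metadata.eventTime` that the converter can then target:

```json
"transforms": "Flatten,TimestampConverter",
"transforms.Flatten.type": "org.apache.kafka.connect.transforms.Flatten$Value",
"transforms.Flatten.delimiter": ".",
"transforms.TimestampConverter.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
"transforms.TimestampConverter.field": "metadata.eventTime",
"transforms.TimestampConverter.target.type": "Timestamp"
```

Note this changes the record shape in BigQuery (the struct is flattened into dotted column names), so it may or may not fit your table schema.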
This seems like more of an issue with the Kafka Connect framework than with KCBQ specifically. Maybe you should raise this question with the Confluent folks?
I'm having the same issue right now with managed Kafka on Confluent Cloud.
@manojgdhv - have you ever resolved this?