kafka-connect-elasticsearch
Kafka Connect Elasticsearch connector
I want to use the topic/partition/offset as the ES document ID while still capturing the Kafka message key for audit/search purposes. Unless I'm mistaken, there doesn't appear to be a...
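For reference, the connector's documented `key.ignore` setting already covers the ID part of this request: when set to `true`, document IDs are generated as `topic+partition+offset`. A minimal sketch of that configuration (preserving the key in the document body has no built-in setting and would likely need an SMT or custom transform; that part is an assumption):

```properties
# Sketch only. key.ignore=true makes the sink derive each document ID
# from topic+partition+offset instead of the Kafka record key.
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
topics=my-topic
key.ignore=true
# Note: with key.ignore=true the record key is dropped from ID generation;
# copying it into the document for audit/search would require a transform
# (no built-in option for this, as the issue above observes).
```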
What's the recommended way to convert a JSON String field (from a Debezium MySQL JSON field) into an ES structured object field? Here's my ES record:
```
"hits": [
  {
    "_index":...
```
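One approach (independent of the connector itself) is Elasticsearch's ingest `json` processor, which parses a string field containing JSON into a structured object at index time. A sketch, assuming a hypothetical field name `payload_json` holding the Debezium JSON string:

```json
PUT _ingest/pipeline/parse-json-field
{
  "description": "Parse a JSON string field into a structured object (sketch)",
  "processors": [
    {
      "json": {
        "field": "payload_json",
        "target_field": "payload"
      }
    }
  ]
}
```

The pipeline would then have to be applied to the target index, e.g. via the index's `default_pipeline` setting, since the sink connector does not expose a pipeline option.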
Hi Confluent Team, I have an issue with 'data.stream.type'. It was implemented as an enum, and it is also used for the index template. I would like to use a custom index template because...
Error from task
```
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:610)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:330)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:232)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:201)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:188)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:237)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.kafka.connect.errors.ConnectException: Bulk request failed: [{"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"array_index_out_of_bounds_exception","reason":"Index...
```
The Elasticsearch sink connector shows this error after OpenSearch was upgraded from 1.2.4 to 2.2.0:
```
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:561)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:322)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:224)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:192)
	at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:177)
	at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:227)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at ...
```
I want to use the key of the message in the topic as the id, but at the same time use the internal versioning of the elasticsearch. Now this is...
# Security Vulnerability Fix This pull request fixes a Temporary File Information Disclosure Vulnerability, which existed in this project. ## Preamble The system temporary directory is shared between all users...
Would it be possible to support [ingest pipelines](https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest.html), in case we want to add additional logic before indexing the document? Currently, I have to consume the messages from topic A...
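Until the connector supports pipelines directly, a possible workaround (an assumption about the setup, not a connector feature) is to attach a pipeline on the Elasticsearch side via the index-level `default_pipeline` setting, so every document the sink writes to that index passes through it. The index and pipeline names below are hypothetical:

```json
PUT my-index/_settings
{
  "index.default_pipeline": "my-enrichment-pipeline"
}
```

This avoids the topic-A-to-topic-B hop described above, at the cost of moving the transformation logic out of Kafka Connect.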
Hello, what is this error?
```
WorkerSinkTask{id=connector-1-3} Task threw an uncaught and unrecoverable exception
org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.
	at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:564)
	at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:325)
	at ...
```
I was looking at the way the logical types are handled in [io.confluent.connect.elasticsearch.DataConverter](https://github.com/confluentinc/kafka-connect-elasticsearch/blob/master/src/main/java/io/confluent/connect/elasticsearch/DataConverter.java) and found the following piece of code:
```java
case Decimal.LOGICAL_NAME:
    return ((BigDecimal) value).doubleValue();
```
Won't converting a...
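The precision concern the issue raises is real: `double` has 53 bits of mantissa, so a `BigDecimal` whose unscaled value exceeds 2^53 cannot round-trip through `doubleValue()`. A minimal, self-contained illustration (not connector code):

```java
import java.math.BigDecimal;

public class DecimalPrecisionDemo {
    public static void main(String[] args) {
        // 2^53 + 1 is the smallest positive integer a double cannot represent.
        BigDecimal exact = new BigDecimal("9007199254740993");

        // This is what DataConverter does for Decimal logical types.
        double lossy = exact.doubleValue();

        // The value silently rounds to 2^53; the trailing 1 is lost.
        System.out.println(exact);                                  // 9007199254740993
        System.out.println(new BigDecimal(lossy).toBigInteger());   // 9007199254740992
    }
}
```

This is why sinks sometimes map decimals to strings instead: a string field preserves the exact decimal representation at the cost of losing numeric range queries in Elasticsearch.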