
Kafka Connect Elasticsearch connector

Results 166 kafka-connect-elasticsearch issues

Hi, Recently I faced a version conflict issue when providing a key as the id to index documents to Elasticsearch. It is because the offset of the consumed record is...

Hi, Before we start, some useful information about our setup. **Versions** ``` debezium-connect: 1.0 elasticsearch-sink connector: 11.1.0 elasticsearch: 7.10.2 ``` Here is my sink connector configuration. ``` { "name":...
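For readers without the full (truncated) config above, a typical elasticsearch-sink connector configuration for a Debezium-fed setup looks roughly like the following. This is an illustrative sketch, not the reporter's actual configuration; the connector name, topic, and URL are assumptions.

```json
{
  "name": "elasticsearch-sink",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "connection.url": "http://localhost:9200",
    "topics": "dbserver1.inventory.customers",
    "key.ignore": "false",
    "schema.ignore": "true",
    "behavior.on.null.values": "delete"
  }
}
```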

Hi, I am reading in the code/docs that the `_version` of the document used for indexing/updates to Elasticsearch is set to the Kafka record offset. This will only work...
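The behavior described here corresponds to Elasticsearch's external versioning: a write is accepted only if the supplied version is strictly greater than the document's current version. A minimal sketch against a local cluster (the index, id, and payload are hypothetical, and a running Elasticsearch at localhost:9200 is assumed):

```shell
# Index a document with an external version (analogous to the connector
# using the Kafka record offset as the version).
curl -X PUT 'http://localhost:9200/my-index/_doc/my-key?version=42&version_type=external' \
     -H 'Content-Type: application/json' -d '{"field": "value"}'

# Re-sending the same document with an equal or lower version is rejected
# with a version_conflict_engine_exception, which is why the scheme only
# works if versions are monotonically increasing per key.
curl -X PUT 'http://localhost:9200/my-index/_doc/my-key?version=41&version_type=external' \
     -H 'Content-Type: application/json' -d '{"field": "old"}'
```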

bug

## Problem There's a compiled JAR package on [Confluent Hub](https://www.confluent.io/hub/confluentinc/kafka-connect-elasticsearch) without [guava.jar](https://mvnrepository.com/artifact/com.google.guava/guava), but the connector depends on [Guava](https://github.com/google/guava) - #366 and #409. ## Solution Removed the `test` scope for the Guava dependency in `pom.xml`....

Hi guys, The ES cloud version uses the following authentication header: https://www.elastic.co/guide/en/elasticsearch/reference/current/security-api-create-api-key.html ``` curl -H "Authorization: ApiKey VnVhQ2.......WFrdzl0dk5udw==" http://localhost:9200/xxxx ``` Any plans to support it?

Version: 11.1.7 Problematic code: [ElasticsearchSinkTask.put(Collection records)](https://github.com/confluentinc/kafka-connect-elasticsearch/blob/v11.1.7/src/main/java/io/confluent/connect/elasticsearch/ElasticsearchSinkTask.java#L113) kafka-connect-elasticsearch can be configured to ignore null values using `behavior.on.null.values=IGNORE`. The code in [ElasticsearchSinkTask.put(Collection records)](https://github.com/confluentinc/kafka-connect-elasticsearch/blob/v11.1.7/src/main/java/io/confluent/connect/elasticsearch/ElasticsearchSinkTask.java#L113) checks whether the record should be skipped (i.e....
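For context, `behavior.on.null.values` accepts `IGNORE`, `DELETE`, or `FAIL`; this issue concerns the `IGNORE` path, where null-valued (tombstone) records are skipped. A minimal config fragment showing the setting (the topic and URL are placeholders, other keys elided):

```json
{
  "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
  "connection.url": "http://localhost:9200",
  "topics": "my-topic",
  "behavior.on.null.values": "IGNORE"
}
```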

Version checked: 11.1.7 ES version: 7.16 Issue location: [ElasticsearchClient.mapping(String index)](https://github.com/confluentinc/kafka-connect-elasticsearch/blob/v11.1.7/src/main/java/io/confluent/connect/elasticsearch/ElasticsearchClient.java#L622) When using data streams, [ElasticsearchSinkTask.tryWriteRecord(SinkRecord sinkRecord, OffsetState offsetState)](https://github.com/confluentinc/kafka-connect-elasticsearch/blob/v11.1.7/src/main/java/io/confluent/connect/elasticsearch/ElasticsearchSinkTask.java#L293) will first check if there are any predefined mappings for the index...

Hi Team, when I try to create an Elasticsearch connector using the following command, echo '{ "name": "elastic-login-connector", "config": { "connector.class": "ElasticsearchSinkConnector", "connection.url": "http://localhost:9200", "type.name": "mysql-data", "topics": "mysql.login", "key.ignore": true }...
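One common cause of failures with commands like the one above is the unqualified `connector.class`; depending on the Connect version, the fully qualified class name may be required. An illustrative complete request follows; it is a sketch rather than a confirmed fix for this report, and the Connect REST endpoint on port 8083 is an assumption:

```shell
# Submit the connector config to the Kafka Connect REST API
# (assumed to be listening on localhost:8083).
echo '{
  "name": "elastic-login-connector",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "connection.url": "http://localhost:9200",
    "type.name": "mysql-data",
    "topics": "mysql.login",
    "key.ignore": "true"
  }
}' | curl -X POST -H 'Content-Type: application/json' --data @- http://localhost:8083/connectors
```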

See https://logging.apache.org/log4j/2.x/security.html. log4j 2.16.0 is still susceptible to Log4Shell (although to a lesser extent). The vulnerability is fixed in log4j 2.17.1. Can this be updated? Thank you

Related to https://github.com/confluentinc/kafka-connect-elasticsearch/issues/582 The JARs on https://packages.confluent.io/maven/io/confluent/kafka-connect-elasticsearch/11.1.3/ are only for `kafka-connect-elasticsearch` and do not include any of the necessary dependency JARs. Conversely, the ZIP on the [downloads page](https://www.confluent.io/hub/confluentinc/kafka-connect-elasticsearch?_ga=2.60570221.197358983.1636395323-362194616.1635361962) does contain...