Ewen Cheslack-Postava
@sunnn You probably just need to tie a couple of connectors together. For MySQL -> Kafka you could use [Confluent's JDBC connector](http://docs.confluent.io/3.0.0/connect/connect-jdbc/docs/index.html) which uses the standard JDBC interface to support...
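To make the suggestion concrete, a JDBC source connector config for pulling a MySQL table into Kafka might look like the sketch below. This is only an illustration: the connection URL, column name, and topic prefix are placeholder assumptions, not values from the thread.

```properties
# Hypothetical JDBC source connector config -- names, credentials,
# and the incrementing column are placeholders for illustration.
name=mysql-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1
connection.url=jdbc:mysql://localhost:3306/mydb?user=myuser&password=mypass
# Use an auto-incrementing primary key to detect new rows
mode=incrementing
incrementing.column.name=id
# Each table is written to a topic named <prefix><table>
topic.prefix=mysql-
```

From there, a sink connector (e.g. HDFS or Elasticsearch) on the other side completes the pipeline, which is what "tie a couple of connectors together" amounts to.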
It seems like there could be multiple issues at play here. The first is the one with `java.lang.NoSuchMethodError: org.apache.kafka.clients.consumer.KafkaConsumer.subscribe(Ljava/util/List;Lorg/apache/kafka/clients/consumer/ConsumerRebalanceListener;)` message. This looks like you might have mismatched versions of Connect...
@sunnn Yes, simply because of the timing, that repository uses a CP 2.0 version. I have not tested this, but you can probably just substitute the 2.0 version for 3.0...
@sunnn Great question! It is a third party connector (connectors are federated, so anyone can develop them), but it looks like it might currently be targeted at CP 2.0/Kafka 0.9.0....
@pronzato That definitely looks like you're exceeding the permitted size. Unfortunately addressing this is more complicated than just changing that one setting because the message is larger than the default...
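The reason it is more than one setting: the size limit has to be raised consistently on the broker, the producer, and the consumer, or one of them will still reject or fail to fetch the larger messages. A sketch of the relevant Kafka configs, with an arbitrary 5 MB value chosen purely for illustration:

```properties
# Broker (server.properties): per-message size cap
message.max.bytes=5242880
# Replica fetchers must also be able to read the larger messages
replica.fetch.max.bytes=5242880

# Producer: maximum size of a request
max.request.size=5242880

# Consumer: per-partition fetch buffer must fit the largest message
max.partition.fetch.bytes=5242880
```

Raising only one of these tends to just move the failure to a different component.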
@pronzato Glad that worked. I'm not sure we want to expose _every_ producer/consumer config. That makes more sense for something like REST where it's supposed to be a light wrapper...
ok to test
@SupermanScott Basic changes seem good, left a couple of minor comments and it looks like it needs a merge/rebase.
@criccomini Correct, not supported right now. It definitely complicates things quite a bit in the implementation. Content-Type becomes confusing as it is now mixed between Avro and something else (and...
Right, and given how routing/content negotiation works, and that deserialization is automatic via Jackson and tied to the message type, I'm not sure how this would work. Routing/content negotiation we could...