divolte-collector
Pushing records to ALL kafka sinks
I'm a little confused about the behavior of Kafka sinks. It seems that when an event is created, it is pushed to all sinks. In the case of Kafka, this means the event is pushed to all Kafka sinks, even though the event is only relevant to one of them. Am I missing something? Why isn't it smart enough to push the event only to the relevant Kafka sink?
Example:
```hocon
mappings {
  t1 = {
    sources = [browser]
    sinks = [t1]
    confluent_id = 1
  }
  t2 = {
    schema_file = "/opt/divolte/divolte-collector/conf/avro/t2.avsc"
    mapping_script_file = "/opt/divolte/divolte-collector/conf/mapping.groovy"
    sources = [browser]
    sinks = [t2]
    confluent_id = 2
  }
}
sinks {
  t1 {
    type = kafka
    mode = confluent
    topic = t1
  }
  t2 {
    type = kafka
    mode = confluent
    topic = t2
  }
}
```
t1 events will write to t1 and t2 sinks. t2 events will write to t1 and t2 sinks.
This is a bit of a head-scratcher. Indeed, events from `mapping.t2` should only end up on topic `t2`. Are you sure they're on `t1`?
A colleague mocked this up on a demonstration project and couldn't reproduce this: Divolte Shop, MR#45
> even though the event is only relevant to one sink

Can you clarify that? Your source for both mappings is the browser. Are you filtering the events somehow? If not, both sinks will receive the source events.
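One way to make each sink receive only its own events is to give each mapping its own browser source, so the two sets of pages signal to different endpoints. This is only a sketch based on the browser source's `prefix` setting; the source names `b1`/`b2` and the prefixes are illustrative, and the `schema_file`/`mapping_script_file` settings are elided:

```hocon
sources {
  // Two separate browser endpoints: pages for t1 signal to /t1/...,
  // pages for t2 signal to /t2/... (prefix values are illustrative).
  b1 = { type = browser, prefix = /t1 }
  b2 = { type = browser, prefix = /t2 }
}
mappings {
  // Each mapping now listens to a single source, so each Kafka sink
  // receives only the events arriving on that source's endpoint.
  t1 = { sources = [b1], sinks = [t1], confluent_id = 1 }
  t2 = { sources = [b2], sinks = [t2], confluent_id = 2 }
}
```

With a single shared `browser` source, both mappings process every incoming event, which is why both topics see all events.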
Hey, I'm running into the same issue. Has anyone found a solution for this?
We are also facing this issue; can you please provide some help here? We are integrating this, and in our testing we found that messages are being pushed to all linked Kafka sinks.
Hi @OneCricketeer, can you please let me know: I want to send a group of fields defined in the Avro schema to their particular Kafka sink, filtered by an event name that we pass as the first parameter of the Divolte signal call. Can we achieve this?
I've never used Divolte.