flink-siddhi
A CEP library to run Siddhi within Apache Flink™ streaming applications
Use this topic to discuss use cases built with `flink-siddhi`. You are very welcome to share your project or experience with it. Thanks very much!
Setting scalability aside: with Flink + Siddhi, Flink mainly handles data distribution while the actual computation is done by Siddhi. This process incurs the network overhead of distributing data (serialization and deserialization) plus the cost of converting Flink's data structures into Siddhi's. If we used only Siddhi, all of that could be skipped, so would performance be better? Since it would then be a single-machine program, the operational cost should also be lower. For example, with 5 machines of 64 GB each: running the application on Flink + Siddhi versus running 5 standalone Siddhi applications, one per machine. Leaving data correctness aside, which would perform better? Thanks!
I have a query like this: "from every e1=firewallStream[name == 'A'] -> e2=firewallStream [ name == 'B' ] within 40 seconds select 'AAAA' as ruleId group by e1[0].signatureId insert...
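For reference, a minimal sketch of wiring such a followed-by pattern into flink-siddhi, loosely following the fluent `SiddhiCEP` API shown in the project README. The `(name, signatureId)` tuple shape, the `fromElements` source, the output stream name `alertStream`, and the exact method names (`registerStream`, `cql`, `returnAsMap`) are assumptions here and may differ across flink-siddhi versions:

```java
// Package path may differ across flink-siddhi versions (e.g. org.apache.flink.contrib.siddhi).
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.siddhi.SiddhiCEP;

import java.util.Map;

public class FirewallPatternSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Hypothetical input: (name, signatureId) pairs standing in for firewall events.
        DataStream<Tuple2<String, Long>> firewallEvents =
            env.fromElements(Tuple2.of("A", 1L), Tuple2.of("B", 1L));

        SiddhiCEP cep = SiddhiCEP.getSiddhiEnvironment(env);
        cep.registerStream("firewallStream", firewallEvents, "name", "signatureId");

        // Pattern: an 'A' event followed by a 'B' event within 40 seconds,
        // emitting a constant ruleId plus the first event's signatureId.
        DataStream<Map<String, Object>> alerts = cep
            .from("firewallStream")
            .cql("from every e1=firewallStream[name == 'A'] -> e2=firewallStream[name == 'B'] "
                + "within 40 sec "
                + "select 'AAAA' as ruleId, e1.signatureId as signatureId "
                + "insert into alertStream")
            .returnAsMap("alertStream");

        alerts.print();
        env.execute("firewall-pattern-sketch");
    }
}
```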
I don't know how to choose between Siddhi and Flink CEP.
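One way to ground the choice is to write the same pattern both ways: with flink-siddhi the pattern is a SiddhiQL string (as in the sketch above), while with Flink CEP it is built with the typed Java `Pattern` API. Below is a sketch of the Flink CEP side of that comparison, reusing the hypothetical `(name, signatureId)` `Tuple2` events from above; class and field names other than Flink's own API are assumptions:

```java
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.windowing.time.Time;

public class FlinkCepComparisonSketch {

    /** Builds the "A followed by B within 40 seconds" pattern with Flink CEP. */
    public static PatternStream<Tuple2<String, Long>> aThenB(DataStream<Tuple2<String, Long>> events) {
        Pattern<Tuple2<String, Long>, ?> pattern = Pattern
            .<Tuple2<String, Long>>begin("first")
            .where(new SimpleCondition<Tuple2<String, Long>>() {
                @Override
                public boolean filter(Tuple2<String, Long> e) {
                    return "A".equals(e.f0); // f0 = event name
                }
            })
            .followedBy("second")
            .where(new SimpleCondition<Tuple2<String, Long>>() {
                @Override
                public boolean filter(Tuple2<String, Long> e) {
                    return "B".equals(e.f0);
                }
            })
            .within(Time.seconds(40));

        return CEP.pattern(events, pattern);
    }
}
```

Roughly, Siddhi keeps the pattern as a declarative query string (which flink-siddhi can even swap at runtime via its control stream), while Flink CEP keeps everything as compiled, typed Java code; which is better depends on whether you need to change rules without redeploying.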
Hi @haoch, with AbstractSiddhiOperator we already have snapshotState using the Flink checkpoint mechanism, so why do we still need to checkpointSiddhiRuntimeState or checkpointRecordQueueState when processing every element? The code is: @Override public void...
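For context, the question is contrasting per-element checkpointing work against the usual Flink operator pattern, where state is captured once per checkpoint inside snapshotState. The sketch below shows that generic pattern only; it is not flink-siddhi's actual operator code, and the class, field, and helper names (EngineOperatorSketch, serializeEngine) are illustrative assumptions:

```java
import org.apache.flink.api.common.state.ListState;
import org.apache.flink.api.common.state.ListStateDescriptor;
import org.apache.flink.runtime.state.StateInitializationContext;
import org.apache.flink.runtime.state.StateSnapshotContext;
import org.apache.flink.streaming.api.operators.AbstractStreamOperator;
import org.apache.flink.streaming.api.operators.OneInputStreamOperator;
import org.apache.flink.streaming.runtime.streamrecord.StreamRecord;

public class EngineOperatorSketch extends AbstractStreamOperator<String>
        implements OneInputStreamOperator<String, String> {

    // Raw snapshot bytes of an embedded engine's internal state.
    private transient ListState<byte[]> engineState;

    @Override
    public void initializeState(StateInitializationContext context) throws Exception {
        super.initializeState(context);
        engineState = context.getOperatorStateStore()
                .getListState(new ListStateDescriptor<>("engine-state", byte[].class));
        // On restore, the engine would be rebuilt here from the bytes in engineState.
    }

    @Override
    public void snapshotState(StateSnapshotContext context) throws Exception {
        super.snapshotState(context);
        // State is captured once per checkpoint, not on every element.
        engineState.clear();
        engineState.add(serializeEngine());
    }

    @Override
    public void processElement(StreamRecord<String> element) throws Exception {
        // Per-element work only: feed the element to the embedded engine / forward it.
        output.collect(element);
    }

    // Hypothetical helper standing in for "serialize the embedded engine's state".
    private byte[] serializeEngine() {
        return new byte[0];
    }
}
```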
Hi, the query below fails when run with the error: Error on sending events [Event{timestamp=1641985535389, data=[4, 4320918970, 2, resource, aws_S3, stackidentitytest.com, stackidentitytest.com, True, S3FullControlAccess, [], 2022-01-12 16:35:35], isExpired=false}] in the SiddhiApp...
I have a pattern query and I want to select all fields with some of our own custom mapping. Can we do this? Example: from every( e1=TempStream ) -> e2=TempStream[ e1.roomNo...
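In Siddhi pattern queries, mapping "all fields" generally means listing each attribute explicitly in the select clause with the aliases you want. A minimal SiddhiQL sketch (as a Java string ready to pass to `.cql(...)` as in the flink-siddhi sketch further above), where the output stream `TempDiffStream` and the alias names are made up for illustration and `TempStream(roomNo, temp)` follows the question:

```java
// Hypothetical output stream and alias names; pass this string to .cql(...).
String cql =
    "from every e1=TempStream -> e2=TempStream[e1.roomNo == roomNo and temp > e1.temp] "
    + "select e1.roomNo as roomNo, e1.temp as startTemp, e2.temp as endTemp "
    + "insert into TempDiffStream";
```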
#65 adding states for failure recovery
Hi @haoch, I am getting a NullPointerException when using a control stream and publishing data through Kafka to the input stream. When I debugged, I found that somehow the EventListener( SiddhiStreamOperator...
Hi, can someone please provide a sample ControlEvent JSON that can be pushed to, say, a Kafka topic? I'm not able to make sense of MetadataControlEvent.builder. Thanks, Calvin