KTS
Is an update planned for Kibana 5?
It is planned towards the end of the month - however, any feedback on importing these dashboards to Kibana 5 is welcome!
@pevma FWIW the patch below to the stock logstash filter (seems to) get most stuff working, as is, out of the box under ELK (last tested with 5.2.0):
--- logstash.conf.l4 2017-02-03 18:45:24.000000000 +0000
+++ logstash.conf 2017-02-03 18:46:16.000000000 +0000
@@ -11,10 +11,21 @@
     match => [ "timestamp", "ISO8601" ]
   }
   ruby {
-    code => "if event['event_type'] == 'fileinfo'; event['fileinfo']['type']=event['fileinfo']['magic'].to_s.split(',')[0]; end;"
+    code => "
+      if event.get('[event_type]') == 'fileinfo'
+        event.set('[fileinfo][type]', event.get('[fileinfo][magic]').to_s.split(',')[0])
+      end
+    "
   }
   ruby {
-    code => "if event['event_type'] == 'alert'; sp = event['alert']['signature'].to_s.split(' group '); if (sp.length == 2) and /\A\d+\z/.match(sp[1]); event['alert']['signature'] = sp[0] ;end; end;"
+    code => "
+      if event.get('[event_type]') == 'alert'
+        sp = event.get('[alert][signature]').to_s.split(' group ')
+        if (sp.length == 2) and /\A\d+\z/.match(sp[1])
+          event.set('[alert][signature]', sp[0])
+        end
+      end
+    "
   }
   metrics {
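For context, what breaks the stock filter under Logstash 5 is the new Event API: ruby-filter code can no longer read and write the event like a Ruby hash, and has to go through event.get/event.set with field-reference strings. A minimal before/after sketch (this runs inside a ruby { code => ... } block; the 'pdf' value is just illustrative):

# pre-5.0 the event could be indexed like a Ruby hash (what the stock filter did):
#   etype = event['event_type']
#   event['fileinfo']['type'] = 'pdf'
# 5.0+ only exposes the Event API, with field-reference strings:
etype = event.get('[event_type]')        # read a top-level field
event.set('[fileinfo][type]', 'pdf')     # write a nested field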
And thanks && congrats to the whole team for the great work!
@AntonioMeireles - Thanks a bunch for the feedback! Very useful!
What logstash config file do you use for these dashboards? I tried a modified version of the one from https://redmine.openinfosecfoundation.org/projects/suricata/wiki/_Logstash_Kibana_and_Suricata_JSON_output but Kibana (4.6) only seems to find the logstash-* index - I had to change all the visualizations to use that index instead of the other.
I want to upgrade to the ELK 5.2 stack - would anyone mind sharing a config file that uses the changes @AntonioMeireles mentioned?
@mliu1212 something along these lines ...
input {
  beats {
    port => 5044
    codec => json
  }
}

filter {
  if [type] == "SELKS" {
    date {
      match => [ "timestamp", "ISO8601" ]
    }
    ruby {
      code => "
        if event.get('[event_type]') == 'fileinfo'
          event.set('[fileinfo][type]', event.get('[fileinfo][magic]').to_s.split(',')[0])
        end
      "
    }
    ruby {
      code => "
        if event.get('[event_type]') == 'alert'
          sp = event.get('[alert][signature]').to_s.split(' group ')
          if (sp.length == 2) and /\A\d+\z/.match(sp[1])
            event.set('[alert][signature]', sp[0])
          end
        end
      "
    }
    metrics {
      meter => [ "eve_insert" ]
      add_tag => "metric"
      flush_interval => 30
    }
  }

  if [http] {
    useragent {
      source => "[http][http_user_agent]"
      target => "[http][user_agent]"
    }
  }

  if [src_ip] {
    if [src_ip] !~ ":" {
      mutate {
        add_field => [ "[src_ip4]", "%{src_ip}" ]
      }
    }
    geoip {
      source => "src_ip"
      target => "geoip"
      #database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
    }
    # fall back to dest_ip if the src_ip lookup yielded nothing
    if ![geoip][ip] {
      if [dest_ip] {
        geoip {
          source => "dest_ip"
          target => "geoip"
          #database => "/opt/logstash/vendor/geoip/GeoLiteCity.dat"
          add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
          add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
        }
        mutate {
          convert => [ "[geoip][coordinates]", "float" ]
        }
      }
    }
  }

  if [dest_ip] {
    if [dest_ip] !~ ":" {
      mutate {
        add_field => [ "[dest_ip4]", "%{dest_ip}" ]
      }
    }
  }
}

output {
  if [event_type] and [event_type] not in ['stats', 'engine'] {
    elasticsearch {
      hosts => ["elasticsearch"]
      index => "logstash-%{event_type}-%{+YYYY.MM.dd}"
      #template => "/etc/logstash/elasticsearch-template.json"
    }
  } else {
    elasticsearch {
      hosts => ["elasticsearch"]
      index => "logstash-%{+YYYY.MM.dd}"
    }
  }
  # stdout { codec => rubydebug }
  if "metric" in [tags] {
    stdout {
      codec => line {
        format => "EVE insert rate: %{[eve_insert][rate_1m]}"
      }
    }
  }
}
Thanks, I will try that out. Quick question - is your input file not coming from suricata's eve.json? I noticed you are using beats.
I'm ingesting plain eve.json through filebeat on the suricata host into a fattish ELK cluster on top of kubernetes ... (filebeat is way simpler than using logstash as a shipper on the sensor)
António
Ah, so your filebeat configuration has eve.json as the input and logstash as the output?
Yes
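In case it helps anyone reproducing this, a minimal filebeat.yml sketch for the 5.x series (the eve.json path and the logstash host are assumptions - adjust to your setup); document_type is what sets the [type] field the "SELKS" conditional in the filter above matches on:

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/suricata/eve.json    # assumed default suricata eve output path
  document_type: SELKS              # becomes the "type" field the filter matches on

output.logstash:
  hosts: ["logstash.example.com:5044"]    # assumed host; port matches the beats input above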
Any news?
There is work planned on that this week and next. @SboichakovDmitriy - willing to participate in some test feedback?
Yes, sure. You can send all the needed information to my email: [email protected] Thanks.
Perfect - thanks for the help!
@pevma - can you plz add me to the "beta" too? (email address is the one in the GitHub logs)
As soon as we have something in the test sequence I will let you know, guys. Thanks!
Hey @pevma, any updates/news for us? Maybe we can help with something? #WaitForKibana5 ;)
@SboichakovDmitriy @AntonioMeireles and anyone else willing to test and feedback - https://github.com/StamusNetworks/KTS5 :)
You can use the same logstash config but with the adjustment @AntonioMeireles mentioned above.
@pevma THANKS! Will take a look && report. BTW, my logstash config above can AFAICT be trimmed to ...
filter {
  if [type] == "SELKS" {
    date {
      match => [ "timestamp", "ISO8601" ]
    }
    ruby {
      code => "
        if event.get('[event_type]') == 'fileinfo'
          event.set('[fileinfo][type]', event.get('[fileinfo][magic]').to_s.split(',')[0])
        end
      "
    }
    ruby {
      code => "
        if event.get('[event_type]') == 'alert'
          sp = event.get('[alert][signature]').to_s.split(' group ')
          if (sp.length == 2) and /\A\d+\z/.match(sp[1])
            event.set('[alert][signature]', sp[0])
          end
        end
      "
    }
    metrics {
      meter => [ "eve_insert" ]
      add_tag => "metric"
      add_field => { "event_type" => "stats" }
      flush_interval => 30
    }
  }

  if [http] {
    useragent {
      source => "[http][http_user_agent]"
      target => "[http][user_agent]"
    }
  }

  if [src_ip] {
    if [src_ip] !~ ":" {
      mutate {
        add_field => [ "[src_ip4]", "%{src_ip}" ]
      }
    }
    geoip {
      source => "src_ip"
    }
    # fall back to dest_ip if the src_ip lookup yielded nothing
    if ![geoip][ip] {
      if [dest_ip] {
        geoip {
          source => "dest_ip"
        }
      }
    }
  }

  if [dest_ip] {
    if [dest_ip] !~ ":" {
      mutate {
        add_field => [ "[dest_ip4]", "%{dest_ip}" ]
      }
    }
  }
}
FYI - We did some further updates here - https://github.com/StamusNetworks/KTS5
We also have an upgrade procedure ready for testing for SELKS 3 to SELKS 4 upgrades - https://github.com/StamusNetworks/SELKS/wiki/SELKS-3.0-to-SELKS-4.0-upgrades---testing
@pevma Hi!
(sorry for the lag! && many, many thanks for your hard work) Some notes...
- kibana-5.3.1 doesn't seem to like spaces in dashboard names at all - it just can't find them. (5.3.0 is OK)
- there's a bug in the load.sh script: -H "Content-Type: application/json" needs to be added to every curl -X{PUT,POST}, otherwise one gets an HTTP/406 "Content-Type header [application/x-www-form-urlencoded] is not supported" type of error (see the example after these notes)
- there's another issue at the bottom of the load.sh script: it expects dashboards/config.json to be present (it was before) but meanwhile it went MIA. While on this, it probably makes sense to update the hardcoded 4.3.1 in that script to something more dynamic/evolving...

Also, load.sh should probably offer a way to set the indices prefix dynamically (it defaults to logstash-). Otherwise, things work as expected, the way that is expected :-) . To close the circle and make things perfect, the only thing missing would probably be some dashboards for netflow data.
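To make the header fix concrete, every PUT/POST in load.sh would look along these lines (the endpoint, object type and file name here are made-up placeholders, not the actual ones from the script):

curl -XPUT -H "Content-Type: application/json" \
  "http://localhost:9200/.kibana/dashboard/some-dashboard" \
  -d @dashboards/dashboard/some-dashboard.json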
@pevma also sed -i "s,\.raw,.keyword,g" dashboards/*/*.json
...
@AntonioMeireles - thanks for the input! Noted on the errors ... doing some fixing and will update the repo soon. I tried importing the dashboards through Scirius - had no issue with Kibana 5.4.0 with regard to spaces in the names. Wondering though if we should sub the spaces with a "-" or similar?
One problem with substituting "raw" with "keyword" is that we will lose backwards compatibility, I think.
@AntonioMeireles - addressed most of the stuff here - https://github.com/StamusNetworks/KTS5/commits/master - master updated.