dionaea
Credentials are not flattened as expected
ISSUE TYPE
- Bug Report
DIONAEA VERSION
Dionaea Version 0.5.1
Compiled on Linux/x86_64 at Sep 6 2016 02:39:10 with gcc 4.8.4
Started on c9de603aad08 running Linux/x86_64 release 4.4.0-38-generic
CONFIGURATION
- name: log_json
  # Uncomment next line to flatten object lists to work with ELK
  flat_data: true
  config:
    handlers:
      #- http://127.0.0.1:8080/
      - file:///data/dionaea/log/dionaea.json
OS / ENVIRONMENT
- Ubuntu 14.04.5
SUMMARY
The iHandler for JSON logging is set to flatten the output; however, credentials (at least for MySQL) are written as an array, while everything else is flattened.
STEPS TO REPRODUCE
Connect to MySQL port with some user / password.
EXPECTED RESULTS
Flattened credentials.
ACTUAL RESULTS
"credentials": [
  {
    "username": "root",
    "password": ""
  }
],
I went through some log files today in prep for T-Pot 16.10 and found that ftp is written as an array as well instead of being flattened:
"ftp": {
  "commands": [
    {
      "command": "GET",
      "arguments": [
        "/ HTTP/1.1"
      ]
    },
    {
      "command": "Connection:",
      "arguments": [
        "close"
      ]
    },
    {
      "command": "Host:",
      "arguments": [
        "9x.2xx.1xx.2xx:21"
      ]
    },
    {
      "command": "User-Agent:",
      "arguments": [
        "libwww-perl/6.13"
      ]
    }
  ]
},
Please let me know if you need more info.
@phibos Could you have a look at this issue please :bowtie:
Hi @t3chn0m4g3,
from my point of view the flattening works as discussed in #38.
Data not flattened:
{
  "credentials": [
    {"username": "user", "password": "pw"}
  ]
}
Data flattened:
{
  "credentials": {
    "password": ["pw"],
    "username": ["user"]
  }
}
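For reference, the flattening shown above turns a list of objects into a single object of lists, keyed by field name. A minimal sketch of that transformation in Python (an illustration only, not dionaea's actual implementation):

```python
def flatten(items):
    """Turn a list of dicts into one dict of lists, keyed by field name."""
    flat = {}
    for item in items:
        for key, value in item.items():
            flat.setdefault(key, []).append(value)
    return flat

# The non-flattened credentials example from above:
credentials = [{"username": "user", "password": "pw"}]
print(flatten(credentials))
# {'username': ['user'], 'password': ['pw']}
```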
Can you please provide an example of how you would expect the flattened output to look?
@phibos Thank you for your quick reply and taking this on.
I'm a little bit confused... your flattened example does show flattened output. However, the actual result I get (even with flattening enabled) is a non-flattened log entry, matching the unflattened example you posted:
"credentials": [
  {
    "username": "root",
    "password": ""
  }
],
You can find an example below for review...
The native dionaea.json log entry reads as follows:
{"ftp": {"commands": [{"command": "USER", "arguments": ["john"]}, {"command": "PASS", "arguments": ["doe"]}, {"command": "SYST", "arguments": []}, {"command": "EPRT", "arguments": ["|2|::1|58083|"]}, {"command": "EPRT", "arguments": ["|2|::1|52827|"]}, {"command": "EPRT", "arguments": ["|2|::1|53415|"]}, {"command": "EPRT", "arguments": ["|2|::1|42624|"]}, {"command": "QUIT", "arguments": []}]}, "credentials": [{"password": "doe", "username": "john"}], "dst_port": 21, "src_port": 52422, "dst_ip": "172.17.0.4", "connection": {"type": "accept", "transport": "tcp", "protocol": "ftpd"}, "src_hostname": "", "timestamp": "2016-10-21T09:38:13.804182", "src_ip": "172.17.0.1"}
After being imported to ES via Logstash you can see the formatted JSON:
{
  "_index": "logstash-2016.10.21",
  "_type": "Dionaea",
  "_id": "AVfmmyz4UaS6FRw1b92h",
  "_score": null,
  "_source": {
    "ftp": {
      "commands": [
        {
          "command": "USER",
          "arguments": [
            "john"
          ]
        },
        {
          "command": "PASS",
          "arguments": [
            "doe"
          ]
        },
        {
          "command": "SYST",
          "arguments": []
        },
        {
          "command": "EPRT",
          "arguments": [
            "|2|::1|58083|"
          ]
        },
        {
          "command": "EPRT",
          "arguments": [
            "|2|::1|52827|"
          ]
        },
        {
          "command": "EPRT",
          "arguments": [
            "|2|::1|53415|"
          ]
        },
        {
          "command": "EPRT",
          "arguments": [
            "|2|::1|42624|"
          ]
        },
        {
          "command": "QUIT",
          "arguments": []
        }
      ]
    },
    "credentials": [
      {
        "password": "doe",
        "username": "john"
      }
    ],
    "src_port": 52422,
    "connection": {
      "type": "accept",
      "transport": "tcp",
      "protocol": "ftpd"
    },
    "src_hostname": "",
    "timestamp": "2016-10-21T09:38:13.804182",
    "src_ip": "172.17.0.1",
    "@version": "1",
    "@timestamp": "2016-10-21T09:38:13.804Z",
    "path": "/data/dionaea/log/dionaea.json",
    "host": "e7d5a97b46d9",
    "type": "Dionaea",
    "dest_port": 21,
    "dest_ip": "172.17.0.4"
  },
  "fields": {
    "timestamp": [
      1477042693804
    ],
    "@timestamp": [
      1477042693804
    ]
  },
  "highlight": {
    "type.raw": [
      "@kibana-highlighted-field@Dionaea@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1477042693804
  ]
}
While, e.g., connection is flattened correctly and thus indexed by ELK, you can see that ftp/commands and credentials are not. In both cases my best guess is to format the flattened output the same way as the connection part:
"connection": {
  "type": "accept",
  "transport": "tcp",
  "protocol": "ftpd"
},
For a better overview please review the attached screenshot (entries that are not flattened are highlighted with a yellow exclamation mark):
Please let me know if I can be of any assistance.
Thanks again for your kind support!
I'm unable to reproduce the error you reported.
My /opt/dionaea/etc/dionaea/ihandlers-enabled/log_json.yaml looks like
- name: log_json
  config:
    # Uncomment next line to flatten object lists to work with ELK
    flat_data: true
    handlers:
      #- http://127.0.0.1:8080/
      - file:///opt/dionaea/var/dionaea/dionaea.json
And the result in /opt/dionaea/var/dionaea/dionaea.json looks like:
{
  "connection": {
    "protocol": "ftpd",
    "transport": "tcp",
    "type": "accept"
  },
  "credentials": {
    "password": [
      "test"
    ],
    "username": [
      "test"
    ]
  },
  "dst_ip": "192.168.x.y",
  "dst_port": 21,
  "ftp": {
    "commands": {
      "arguments": [
        "test",
        "test",
        "",
        ""
      ],
      "command": [
        "USER",
        "PASS",
        "SYST",
        "QUIT"
      ]
    }
  },
  "src_hostname": "",
  "src_ip": "192.168.x.y",
  "src_port": 36500,
  "timestamp": "2016-11-10T21:07:37.366618"
}
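Note that flattening ftp.commands like this produces parallel command/arguments lists, so the pairing between a command and its argument is preserved only by position. If a consumer needs the pairs back, they can be reconstructed by zipping the lists; a small sketch using the values from the log above (the variable names are illustrative):

```python
# Flattened ftp.commands as emitted in the log entry above.
flat_commands = {
    "command": ["USER", "PASS", "SYST", "QUIT"],
    "arguments": ["test", "test", "", ""],
}

# Re-pair each command with its argument by position.
pairs = list(zip(flat_commands["command"], flat_commands["arguments"]))
print(pairs)
# [('USER', 'test'), ('PASS', 'test'), ('SYST', ''), ('QUIT', '')]
```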
I see, the brackets [ and ] are the reason why ELK interprets it as an array.
ELK 5.3 will not index the credentials array at all. Can you have a look at this please?
14:29:40.137 [[main]>worker1] WARN logstash.outputs.elasticsearch - Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2017.04.10", :_type=>"Dionaea", :_routing=>nil}, 2017-04-10T14:29:39.062Z c25fd5a718f2 %{message}], :response=>{"index"=>{"_index"=>"logstash-2017.04.10", "_type"=>"Dionaea", "_id"=>"AVtYRSyd2-Fzduehg3tX", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [credentials]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:464"}}}}}
{
  "t-pot_hostname" => "lipstickenemy",
  "geoip" => {},
  "type" => "Dionaea",
  "tags" => [
    [0] "_geoip_lookup_failure"
  ],
  "src_port" => 0,
  "src_ip" => "",
  "path" => "/data/dionaea/log/dionaea.json",
  "@timestamp" => 2017-04-10T14:29:39.069Z,
  "dest_ip" => "172.17.0.8",
  "@version" => "1",
  "host" => "c25fd5a718f2",
  "connection" => {
    "protocol" => "ftpdatalisten",
    "transport" => "tcp",
    "type" => "listen"
  },
  "t-pot_ip" => "1.2.4.3",
  "src_hostname" => "",
  "dest_port" => 45282,
  "timestamp" => "2017-04-10T14:29:39.069810"
}
{
  "t-pot_hostname" => "lipstickenemy",
  "ftp" => {
    "commands" => [
      [0] {
        "arguments" => [
          [0] "anonymous"
        ],
        "command" => "USER"
      },
      [1] {
        "arguments" => [
          [0] "[email protected]"
        ],
        "command" => "PASS"
      },
      [2] {
        "arguments" => [],
        "command" => "SYST"
      },
      [3] {
        "arguments" => [],
        "command" => "PWD"
      },
      [4] {
        "arguments" => [
          [0] "I"
        ],
        "command" => "TYPE"
      },
      [5] {
        "arguments" => [
          [0] "/"
        ],
        "command" => "SIZE"
      },
      [6] {
        "arguments" => [
          [0] "/"
        ],
        "command" => "CWD"
      },
      [7] {
        "arguments" => [],
        "command" => "PASV"
      },
      [8] {
        "arguments" => [
          [0] "-l"
        ],
        "command" => "LIST"
      },
      [9] {
        "arguments" => [],
        "command" => "QUIT"
      }
    ]
  },
  "geoip" => {},
  "credentials" => [
    [0] {
      "password" => "[email protected]",
      "username" => "anonymous"
    }
  ],
  "type" => "Dionaea",
  "tags" => [
    [0] "_geoip_lookup_failure"
  ],
  "src_port" => 53308,
  "src_ip" => "172.20.254.130",
  "path" => "/data/dionaea/log/dionaea.json",
  "@timestamp" => 2017-04-10T14:29:39.062Z,
  "dest_ip" => "172.17.0.8",
  "@version" => "1",
  "host" => "c25fd5a718f2",
  "connection" => {
    "protocol" => "ftpd",
    "transport" => "tcp",
    "type" => "accept"
  },
  "t-pot_ip" => "1.2.4.3",
  "src_hostname" => "",
  "dest_port" => 21,
  "timestamp" => "2017-04-10T14:29:39.062577"
}
I'm sorry, but I wasn't able to reproduce the error. I have set up a 5.3.0 ELK stack and dionaea log_json with flat data, and all messages get indexed successfully without any issues. Are you doing some kind of post-processing to add additional values?
@phibos Thanks for testing. I retested with a very basic Logstash config, and ES unfortunately does not index the message with the array in place.
Just out of curiosity, did the username you tested with have an "@" in its name?
Here is my config (maybe you can share yours too):
# Input section
input {
  # Dionaea
  file {
    path => ["/data/dionaea/log/dionaea.json"]
    codec => json
    type => "Dionaea"
  }
}

# Filter Section
filter {
  # Dionaea
  if [type] == "Dionaea" {
    date {
      match => [ "timestamp", "ISO8601" ]
    }
  }
}

# Output section
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # Debug output
  stdout {
    codec => rubydebug
  }
}
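The mapper_parsing_exception above occurs because Elasticsearch cannot accept credentials as an array of objects once the field has been mapped differently. One possible workaround, independent of dionaea or Logstash, is to normalize the field to the flattened shape before shipping. A hedged sketch of such a pre-processing step in Python (the function name is hypothetical; the field names follow the log samples in this thread):

```python
import json

def normalize_credentials(event):
    """If 'credentials' is a list of objects, flatten it to a dict of lists."""
    creds = event.get("credentials")
    if isinstance(creds, list):
        flat = {}
        for entry in creds:
            for key, value in entry.items():
                flat.setdefault(key, []).append(value)
        event["credentials"] = flat
    return event

# A trimmed-down line from dionaea.json, as in the examples above.
line = '{"credentials": [{"password": "doe", "username": "john"}], "dst_port": 21}'
event = normalize_credentials(json.loads(line))
print(json.dumps(event["credentials"], sort_keys=True))
# {"password": ["doe"], "username": ["john"]}
```

Events whose credentials field is already flattened (a dict) pass through unchanged, so the same pipeline can handle both shapes.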