Add Microsoft DNS log ingest support to existing filebeat Microsoft module
Add support for Microsoft DNS logs ingested via filebeat from files written to disk by the Microsoft DNS server.
I will issue a pull request from a fork containing working code/config for this.
Filebeat module
- [ ] Test log files exist for the grok patterns
- [ ] Generated output for at least 1 log file exists
Pinging @elastic/siem (Team:SIEM)
Not seeing much movement on this. It will add a lot of value. Let us know if we can assist with making progress.
There is an Elastic Agent / Fleet integration for Microsoft DHCP log files. https://docs.elastic.co/en/integrations/microsoft_dhcp#logs
And there are plans to add event log collection to the integration to grab any DHCP server related event logs. https://github.com/elastic/integrations/issues/2756
Pinging @elastic/security-external-integrations (Team:Security-External Integrations)
@andrewkroh yes we use that one and it works great. The title of this page is DNS though and that is what we are commenting on. We hope to see some progress on a DNS integration soon.
My bad. I read DHCP when it said DNS. 🤦
It would be helpful to point to some documentation for the logs that we want collected (what's the format, where do they get pulled from, how do you enable logging, etc). And if anyone has samples please anonymize and attach them.
@threatangler-jp @andrewkroh - this is what we've been using as a plain Filebeat config. It was built quickly and is kinda hacky, but it gets us what we need in these situations.
Refer to:
- https://docs.microsoft.com/en-us/previous-versions/windows/it-pro/windows-server-2012-r2-and-2012/dn593669(v=ws.11)
- https://www.trustedsec.com/blog/tracing-dns-queries-on-your-windows-dns-server/
- https://docs.nxlog.co/userguide/integrate/dns-monitoring-windows.html#dns_windows_setup_debug
PowerShell-based configuration of Microsoft DNS to log to a file:
Set-DnsServerDiagnostics -Queries $True -Answers $True -Notifications $True -Update $True -QuestionTransactions $True -UnmatchedResponse $True -SendPackets $True -ReceivePackets $True -TcpPackets $True -UdpPackets $True -FullPackets $True -EventLogLevel 4 -EnableLoggingToFile $True -LogFilePath c:\Windows\System32\dns\dns.log -MaxMBFileSize 8388608 -SaveLogsToPersistentStorage $True -WriteThrough $True
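A quick way to confirm those settings took effect is the matching query cmdlet from the same DnsServer module (not part of the original snippet, just a sanity check):
# Show the current debug logging settings on the DNS server
Get-DnsServerDiagnostics
Filebeat configuration: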
# ingest of Microsoft DNS server logs
filebeat.inputs:
  - type: log
    paths:
      - 'c:\Windows\System32\dns\*.log'
      - 'c:\logs\dns\*.log'
    exclude_files: ['.gz$','.zip$']
    multiline:
      pattern: '^[0-9]+/'
      negate: true
      match: after
      flush_pattern: '\n\n'
    tags: ['forwarded']
    fields_under_root: true
    fields:
      event:
        module: 'microsoft'
        dataset: 'microsoft.dns'
      network:
        protocol: 'dns'
    processors:
      # parse the event to a basic level so we can use fields to filter further and perform additional parsing on ingest
      - script:
          lang: javascript
          source: >
            function process(event) {
              var matches = event.Get("message").match(/(\d+)\/(\d+)\/(\d+)\s+(\d+):(\d+):(\d+)\s+(AM|PM)\s+(\w+)\s+PACKET\s+(\w+)\s+(UDP|TCP)\s+(Rcv|Snd)\s+([0-9a-fA-F\.:]+)\s+(\w+)\s+([\s\w]+)+\s+\[([\w\s]+)\]\s+(\w+)\s+(\([^\s]+\))/);
              if (matches && matches.length === 18) {
                // convert the 12-hour clock to 24-hour time (handles 12 AM and 12 PM)
                var hour = Number(matches[4]) % 12;
                if (matches[7] === "PM") {
                  hour = hour + 12;
                }
                matches[4] = hour;
                event.Put("fields.dns.datetime", matches[3] + "/" + String("00" + matches[2]).slice(-2) + "/" + String("00" + matches[1]).slice(-2) + " " + String("00" + matches[4]).slice(-2) + ":" + String("00" + matches[5]).slice(-2) + ":" + String("00" + matches[6]).slice(-2));
                event.Put("process.thread.id", parseInt(matches[8], 16));
                event.Put("span.id", matches[9]);
                event.Put("network.transport", matches[10].toLowerCase());
                event.Put("host.ip", matches[12]);
                event.Put("event.id", matches[13]);
                event.Put("dns.id", matches[13]);
                event.Put("fields.dns.opcode", matches[14]);
                event.Put("fields.dns.qflags", matches[15]);
                event.Put("dns.question.type", matches[16]);
                event.Put("dns.question.name", matches[17].replace(/\([0-9]+\)/g, '.').replace(/^\.|\.$/g, ''));
                switch (matches[14]) {
                  case "":
                  case "Q":
                    event.Put("dns.op_code", "QUERY");
                    break;
                  case "R Q":
                    event.Put("dns.op_code", "RESPONSE");
                    break;
                  case "N":
                    event.Put("dns.op_code", "NOTIFY");
                    break;
                  case "U":
                    event.Put("dns.op_code", "UPDATE");
                    break;
                  default:
                    event.Put("dns.op_code", "UNKNOWN");
                }
                if (matches[11].toLowerCase() === "rcv") {
                  event.Put("source.ip", matches[12]);
                  event.Put("network.direction", "ingress");
                  switch (event.Get("dns.op_code")) {
                    case "QUERY":
                      event.Put("dns.type", "query");
                      event.Put("client.ip", matches[12]);
                      break;
                    case "RESPONSE":
                      event.Put("dns.type", "answer");
                      event.Put("server.ip", matches[12]);
                      break;
                    default:
                      event.Put("dns.type", "other");
                  }
                } else if (matches[11].toLowerCase() === "snd") {
                  event.Put("destination.ip", matches[12]);
                  event.Put("network.direction", "egress");
                  switch (event.Get("dns.op_code")) {
                    case "QUERY":
                      event.Put("dns.type", "query");
                      event.Put("server.ip", matches[12]);
                      break;
                    case "RESPONSE":
                      event.Put("dns.type", "answer");
                      event.Put("client.ip", matches[12]);
                      break;
                    default:
                      event.Put("dns.type", "other");
                  }
                }
                matches = event.Get("fields.dns.qflags").match(/(\w+)\s+([\s\w]+)\s+(\w+)/);
                if (matches && matches.length === 4) {
                  event.Put("fields.dns.header.flags.hex", matches[1]);
                  event.Put("dns.response_code", matches[3]);
                  var header_flags = [];
                  if (matches[2].indexOf("A") > -1) { header_flags.push("AA"); }
                  if (matches[2].indexOf("T") > -1) { header_flags.push("TC"); }
                  if (matches[2].indexOf("D") > -1) { header_flags.push("RD"); }
                  if (matches[2].indexOf("R") > -1) { header_flags.push("RA"); }
                  event.Put("dns.header_flags", header_flags);
                }
              } else {
                event.Tag("dns-regex-fail");
              }
            }
      # parse domain name
      - registered_domain:
          field: dns.question.name
          target_field: dns.question.registered_domain
          target_etld_field: dns.question.top_level_domain
          target_subdomain_field: dns.question.subdomain
          ignore_missing: true
          ignore_failure: true
      # fix the timestamp so it reflects when the log says the event happened rather than when the line was read
      # NOTE: adjust the timezone here if the server does not log in Queensland (Brisbane) time
      - timestamp:
          field: fields.dns.datetime
          timezone: 'Australia/Brisbane'
          layouts:
            - '2006/01/02 15:04:05'
          test:
            - '2021/12/07 16:33:51'
          ignore_missing: true
          ignore_failure: true
      # garbage removal
      - drop_fields:
          fields: ['fields.dns.datetime', 'fields.dns.header.flags.hex', 'fields.dns.opcode', 'fields.dns.qflags']
    pipeline: 'filebeat-microsoft-dns-pipeline'
    publisher_pipeline.disable_host: true
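For anyone adapting the script processor, here is a minimal standalone sketch for sanity-checking the same regex outside Filebeat, runnable with plain Node.js. The log line is illustrative only (constructed to match the pattern, not copied from a real server), and the file name is arbitrary:
// check_dns_regex.js -- run with: node check_dns_regex.js
// Illustrative log line, constructed to match the pattern used above
var line = '7/12/2021 4:33:51 PM 0E08 PACKET 000000E9BE0A0010 UDP Rcv 192.168.1.50 4ebb Q [0001 D NOERROR] A (3)www(7)example(3)com(0)';

// Same pattern as in the script processor above
var pattern = /(\d+)\/(\d+)\/(\d+)\s+(\d+):(\d+):(\d+)\s+(AM|PM)\s+(\w+)\s+PACKET\s+(\w+)\s+(UDP|TCP)\s+(Rcv|Snd)\s+([0-9a-fA-F\.:]+)\s+(\w+)\s+([\s\w]+)+\s+\[([\w\s]+)\]\s+(\w+)\s+(\([^\s]+\))/;

var matches = line.match(pattern);
if (matches && matches.length === 18) {
  console.log('transport:     ', matches[10].toLowerCase());  // "udp"
  console.log('direction:     ', matches[11]);                // "Rcv"
  console.log('remote ip:     ', matches[12]);                // "192.168.1.50"
  console.log('dns.id:        ', matches[13]);                // "4ebb"
  console.log('opcode field:  ', matches[14]);                // "Q"
  console.log('flags:         ', matches[15]);                // "0001 D NOERROR"
  console.log('question type: ', matches[16]);                // "A"
  console.log('question name: ', matches[17].replace(/\([0-9]+\)/g, '.').replace(/^\.|\.$/g, ''));  // "www.example.com"
} else {
  console.log('no match -- this line would be tagged dns-regex-fail');
}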
Ingest pipeline:
{
  "description": "Pipeline to process Microsoft DNS logs",
  "version": 1,
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "Msg length = %{DATA} \\(%{DATA:dns.length:int}\\)"
        ],
        "ignore_missing": true,
        "ignore_failure": true,
        "description": "Extract total packet length in bytes"
      }
    },
    {
      "set": {
        "field": "dns.question.class",
        "value": "IN",
        "override": false,
        "ignore_failure": true,
        "description": "Set question class if not already set"
      }
    },
    {
      "geoip": {
        "field": "source.ip",
        "target_field": "source.geo",
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "field": "destination.ip",
        "target_field": "destination.geo",
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "database_file": "GeoLite2-ASN.mmdb",
        "field": "source.ip",
        "target_field": "source.as",
        "properties": [
          "asn",
          "organization_name"
        ],
        "ignore_missing": true
      }
    },
    {
      "geoip": {
        "database_file": "GeoLite2-ASN.mmdb",
        "field": "destination.ip",
        "target_field": "destination.as",
        "properties": [
          "asn",
          "organization_name"
        ],
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "source.as.asn",
        "target_field": "source.as.number",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "source.as.organization_name",
        "target_field": "source.as.organization.name",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "destination.as.asn",
        "target_field": "destination.as.number",
        "ignore_missing": true
      }
    },
    {
      "rename": {
        "field": "destination.as.organization_name",
        "target_field": "destination.as.organization.name",
        "ignore_missing": true
      }
    },
    {
      "append": {
        "field": "related.hosts",
        "value": "{{host.name}}",
        "allow_duplicates": false,
        "if": "ctx.host?.name != null && ctx.host?.name != ''"
      }
    }
  ],
  "on_failure": [
    {
      "append": {
        "field": "error.message",
        "value": "{{ _ingest.on_failure_message }}"
      }
    }
  ]
}
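To tie the two pieces together: the Filebeat input above references this pipeline by name (pipeline: 'filebeat-microsoft-dns-pipeline'), so the JSON needs to be loaded into Elasticsearch under that name before events arrive. A rough sketch from PowerShell, assuming the JSON above has been saved as dns-pipeline.json and Elasticsearch is reachable at https://localhost:9200 (adjust the URL and credentials for your cluster):
# Create or replace the ingest pipeline that the Filebeat input references
Invoke-RestMethod -Method Put -Uri 'https://localhost:9200/_ingest/pipeline/filebeat-microsoft-dns-pipeline' -ContentType 'application/json' -InFile 'dns-pipeline.json' -Credential (Get-Credential)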
Hi @colin-stubbs, is the configuration that you posted for Logstash, or is it possible to use it as an Elasticsearch ingest pipeline?
I created an ingest pipeline for DNS logs from a Windows Server 2019, but I get "dns-regex-fail" and only the "message" field, without the other fields.
Can you help me get this going?
We tried to use Packetbeat, but it's not possible to run Packetbeat and Windows Defender for Identity simultaneously.
Thank you.
Hi @gittihub123, it's an Elasticsearch ingest pipeline, https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest.html
Hi @colin-stubbs, the pipeline is not working in our environment. I will upload my configuration and error message in a bit. Thank you.
Hi @colin-stubbs, we had some issues with Packetbeat on our Windows server, but that is now solved. It was a misconfiguration: we were sniffing the wrong interface, which caused Windows Defender for Identity and Packetbeat to crash simultaneously. However, the ingest pipeline didn't work as expected. If you are interested in input from me, please PM me and we can update the pipeline together.
Thanks.
Hi Colin,
Did you actually make a pull request? I need this too and would use your pull request if I could; otherwise I could take the stuff from your comment in this thread.
Glen
Any updates on this?
Hi! We just realized that we haven't looked into this issue in a while. We're sorry!
We're labeling this issue as Stale to make it hit our filters and make sure we get back to it as soon as possible. In the meantime, it'd be extremely helpful if you could take a look at it as well and confirm its relevance. A simple comment with a nice emoji will be enough :+1:.
Thank you for your contribution!
Hi, yes, this is still highly wanted.
/G
Guys, is there any Agent or Filebeat configuration that can parse and ship DNS debug logs from a text file on a Windows server? If not, how are other people analyzing their Windows DNS server query logs?
Pinging @elastic/sec-windows-platform (Team:Security-Windows Platform)
We are a new paid elastic customer and are surprised there is not a drop-in integration available. Please get this out!
Also wanting an out-of-the-box integration to pull DNS logs from a Windows DNS Server. The easiest alternative I have seen is to use the "Network Packet Capture" (Packetbeat) integration in a policy specifically for your Windows DNS Servers and turn on "Capture DNS Traffic" for port 53 (or whatever you are using); this does a decent job. But I have issues with running Packetbeat and would prefer to just read from the DNS log file on disk.
We logged this request 3 years back and logged a P1 case with Elastic support due to the business impact of missing the DNS logs - Case #01597447. The only workaround right now was to build our own custom logs integration with the multiline parser. It is unbelievable that this basic feature is missing in Elastic.
My current solution is to use the Network Packet Capture integration with just DNS turned on. It parses everything out properly and has pretty much all the data we need/want. The main downfall is that it has to use npcap, which on some systems has caused conflicts with other applications that also need npcap, so I'm only using it on DNS servers. For other systems, I'm using the Defend integration to capture DNS, but it only collects DNS activity for the system it's running on, not the DNS Server logs. Hope that makes sense.
Hi, this works, but if the logs are in a text file generated from the DNS debug logs of a Microsoft DNS server, this solution can't be used. Otherwise this workaround is perfect.
If you go for the Custom Logs integration with a log path like C:\Windows\System32\dns\dns.log, a custom multiline parser with the right regex in the integration will give you the logs:
multiline:
  type: pattern
  pattern: '^\d{1,2}/\d{2}/\d{4}'
  negate: true
  match: after
If we ever deal with US customers it might need to be ^\d{1,2}/\d{1,2}/\d{4} to account for month/day/year
Closing this issue as we now have a Windows DNS integration which supports both Analytical & Audit events. https://docs.elastic.co/integrations/microsoft_dnsserver
Thanks a lot @jamiehynds
Full credit to @chemamartinez who built the ETW input and the integration itself. Look forward to hearing how you get on with the integration @kalramani