out_azure_logs_ingestion: make the stream_name explicit
While the stream name for a Custom Log table DCR created through the Azure Portal GUI is predictable (e.g., if the Custom Log table is Example_CL, the stream name will be Custom-Example_CL), this is not always the case.
That naming is purely a convention of the Portal GUI: using the API or ARM templates, it's possible to create DCRs with completely arbitrary stream names, which is necessary if you're sending data to built-in tables (e.g., Syslog or CommonSecurityLog) and/or have multiple transformation streams defined in the same DCR.
This PR adds a stream_name parameter so that the stream name can be set explicitly.
One thing I'm not sure about (and would appreciate feedback on) is whether stream_name should be optional and fall back to the old behavior. While that would be nice for config backwards compatibility, the effect of getting it wrong is that logs won't end up in the right place, and other applications/SDKs require the stream name to be set explicitly.
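For discussion, here is a minimal standalone sketch (not the plugin's actual code; derive_stream_name and its signature are hypothetical) of what that optional fallback could look like: an unset stream_name would be derived from table_name using the Portal convention, while an explicit value is used as-is.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper: return a newly allocated stream name.
 * If stream_name is set, use it verbatim; otherwise assume the
 * Portal GUI convention of "Custom-<table_name>". */
static char *derive_stream_name(const char *stream_name, const char *table_name)
{
    size_t len;
    char *out;

    if (stream_name != NULL && stream_name[0] != '\0') {
        /* explicit value wins */
        len = strlen(stream_name) + 1;
        out = malloc(len);
        if (out != NULL) {
            memcpy(out, stream_name, len);
        }
        return out;
    }

    /* old behavior: derive the Portal-generated "Custom-" prefix */
    len = strlen("Custom-") + strlen(table_name) + 1;
    out = malloc(len);
    if (out != NULL) {
        snprintf(out, len, "Custom-%s", table_name);
    }
    return out;
}

int main(void)
{
    char *implicit = derive_stream_name(NULL, "Example_CL");
    char *explicit_name = derive_stream_name("Custom-SyslogStream", "Syslog");

    /* prints: Custom-Example_CL then Custom-SyslogStream */
    printf("%s\n%s\n", implicit, explicit_name);

    free(implicit);
    free(explicit_name);
    return 0;
}
```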
Enter [N/A] in the box, if an item is not applicable to your change.
Testing
Before we can approve your change, please submit the following in a comment:
- [x] Example configuration file for the change
- [x] Debug log output from testing the change
- [x] Attached Valgrind output that shows no leaks or memory corruption was found
If this is a change to packaging of containers or native binaries then please confirm it works for all targets.
- [N/A] Run local packaging test showing all targets (including any new ones) build.
- [N/A] Set ok-package-test label to test for all targets (requires maintainer to do).
Documentation
- [x] Documentation required for this feature: https://github.com/fluent/fluent-bit-docs/pull/1305
Backporting
- [N/A] Backport to latest stable release.
Fluent Bit is licensed under Apache 2.0, by submitting this pull request I understand that this code will be released under the terms of that license.
Example config (you can use the instructions at https://learn.microsoft.com/en-us/azure/sentinel/connect-logstash-data-connection-rules#example-dcr-that-ingests-data-into-the-syslog-table to create an appropriate DCR):
[INPUT]
    Name   dummy
    Dummy  {"Computer": "dummy", "Message":"This is a test."}

[OUTPUT]
    Name    stdout
    Format  json_lines
    Match   *

[OUTPUT]
    Name           azure_logs_ingestion
    Match          *
    client_id      00000000-0000-0000-0000-000000000000
    client_secret  00000~0000000000000.0000000000000000-000
    tenant_id      00000000-0000-0000-0000-000000000000
    dce_url        https://example-xxxx.westus2-1.ingest.monitor.azure.com
    dcr_id         dcr-00000000000000000000000000000000
    table_name     Syslog
    stream_name    Custom-SyslogStream
    time_generated true
    time_key       TimeGenerated
    Compress       true
Valgrind output:
==41351==
==41351== HEAP SUMMARY:
==41351== in use at exit: 1,524 bytes in 4 blocks
==41351== total heap usage: 26,563 allocs, 26,559 frees, 21,018,260 bytes allocated
==41351==
==41351== LEAK SUMMARY:
==41351== definitely lost: 0 bytes in 0 blocks
==41351== indirectly lost: 0 bytes in 0 blocks
==41351== possibly lost: 0 bytes in 0 blocks
==41351== still reachable: 1,524 bytes in 4 blocks
==41351== suppressed: 0 bytes in 0 blocks
==41351== Reachable blocks (those to which a pointer was found) are not shown.
==41351== To see them, rerun with: --leak-check=full --show-leak-kinds=all
==41351==
==41351== Use --track-origins=yes to see where uninitialised values come from
==41351== For lists of detected and suppressed errors, rerun with: -s
==41351== ERROR SUMMARY: 468226 errors from 1000 contexts (suppressed: 0 from 0)
Just checking, was there anything else I need to do?
@jlaundry @edsiper and @patrick-stephens I originally developed this plugin. Will test it out today and post any change request if necessary.
@kforeverisback just checking, did you manage to take a look?