[Spike] Trigger for messaging systems
We need to research how we can add a trigger for messaging systems (message queues / streams) to Tracetest (Kafka, RabbitMQ, etc.) and plan how to include this feature in Tracetest.
Some common open-source message systems:
- Apache Kafka
- RabbitMQ
- Apache ActiveMQ
Some common proprietary message systems:
- Amazon Kinesis
- Amazon SQS
- Google Cloud Pub/Sub
Protocols:
- AMQP
- MQTT
- Kafka's own wire protocol
Conventions for Message systems: https://opentelemetry.io/docs/specs/otel/trace/semantic_conventions/messaging/
Team, giving a heads up about this issue:
I believe we can focus on implementing the trigger using the AMQP protocol first, and MQTT later on, since most of the open-source message brokers support them. There are discussions on Trace-Context standardization for both protocols (here and here). I'm building a PoC of that using Kafka on the poc/quick-start-go-and-kafka branch to validate this hypothesis and see if we can go down this path.
If that fails, I think Plan B could be to implement direct messaging to the following platforms (in order of priority):
@kdhamric @olha23, as soon as I get an answer, I will share the fields required to configure the trigger, so we can work on the design of the CLI and FE.
Team, for the first version, we'll focus on two messaging systems:
- Kafka (which has its own protocol), and
- AMQP-based systems, like RabbitMQ and ActiveMQ.
Kafka Specs: https://github.com/kubeshop/tracetest/issues/2983#issuecomment-1657181053
Kafka Proof-of-Concept: https://github.com/kubeshop/tracetest/pull/3004
Kafka
To use Kafka, the trigger will need the following data:
Kafka connection:
| Field | Description |
|---|---|
| Broker URLs | Array of broker host addresses to which we can send messages |
| Topic | Name that categorizes a message. Usually, messages are consumed by topics. |
| Authentication | Can be "None" or "Plain" |
A "Plain" authentication has a plain user and password.
Message fields (based on Kafka Record format):
| Field | Description |
|---|---|
| Headers | Set of key/value pairs carrying metadata about the message. OTel uses these headers to propagate context. |
| Key | Value that identifies a message |
| Value | Content of a message |
Some examples of UI to publish messages to Kafka can be seen here:
- https://www.confluent.io/blog/create-kafka-messages-from-within-control-center-for-better-kafka-management/
- https://towardsdatascience.com/overview-of-ui-tools-for-monitoring-and-management-of-apache-kafka-clusters-8c383f897e80
- https://dzone.com/articles/building-a-simple-kafka-client-for-the-web-and-des
A Test trigger YAML for Kafka could be something like this:

```yaml
type: Test
spec:
  id: jFpHiL34R
  name: Test Kafka Message Publishing
  trigger:
    type: kafka
    kafka:
      brokerUrls:
        - kafka-1:9092
        - kafka-2:9092
      topic: ExampleMessage
      headers:
        - key: my-header
          value: my-value
      messageKey: "message-key"
      messageValue: "{\n \"hello\": \"kafka!\"\n}"
  specs:
    - selector: span[tracetest.span.type="messaging" name="messaging receive" messaging.system="kafka" messaging.operation="receive"]
      name: It processed a message from Kafka
      assertions:
        - attr:tracetest.selected_spans.count = 1
```
@danielbdias can you please check the mockups and leave a comment if something is wrong? https://www.figma.com/file/nXy4eBvpiQ3P4Jer0ogJOw/v0.11-release?type=design&node-id=859%3A3988&mode=design&t=90tw4FnqB8asvk7n-1
Folks, about Kafka authentication: looking deeper, I've found the documentation for SASL on Kafka:
- https://docs.confluent.io/platform/current/kafka/overview-authentication-methods.html
- https://developer.ibm.com/tutorials/kafka-authn-authz/
In this first version, I plan to support only "Plain / SCRAM" authentication, which uses just a user/password. In the future, we can add more options.
Sounds good, Daniel. As we get users with other auth needs, we can add them.