a-kafka-story
Kafka ecosystem ... but step by step!
Please check out these awesome references:
- http://developer.confluent.io/
- https://kafka-tutorials.confluent.io/
And if you want to learn another way, just follow these steps.
Make Docker and Maven do their thing once and for all by running ./fetch.sh
Then jump into the Kafka story!
- One ZooKeeper, one Kafka broker
- One ZooKeeper, many Kafka brokers
- Java consumer, Java producer
- Let's add data with Telegraf
- Let's set up better defaults
- Enter Kafka Streams
- Capture JMX metrics
- Grafana
- Kafka Connect
- Kafka Connect and Schema Registry
- Change Data Capture
- Change Data Capture and Schema Registry
- Change Data Capture and Schema Registry and export to S3
- KSQL
- KSQL server and UI
- Change Data Capture, Schema Registry and KSQL
- Change Data Capture, JSON, KSQL and join
- Random producer and complex joins
- Sync random producer and MySQL, capture CDC diff and push it to Telegraf
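To give a feel for the first step, a single-ZooKeeper, single-broker setup can be sketched as a minimal Docker Compose file. This is only an illustration; the image tag, ports, and environment values here are assumptions, and the actual compose files live in each step's directory:

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.4.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.4.0
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_BROKER_ID: 1
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      # Advertise localhost so clients on the host machine can connect
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
      # A single broker cannot satisfy the default replication factor of 3
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
```

With this running via `docker-compose up -d`, a producer or consumer on the host can reach the broker at `localhost:9092`.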
Don't like Docker? Please download the Confluent Platform here: https://www.confluent.io/download/
Also, please take a look at:
- https://github.com/confluentinc/cp-demo
- https://github.com/confluentinc/demo-scene
- https://github.com/confluentinc/examples
- https://github.com/confluentinc/kafka-streams-examples
- https://www.confluent.io/stream-processing-cookbook/