kafka-application4s

Kafka App for Scala


This module is the companion source code for the blog post Getting Started with Scala and Apache Kafka. It shows how to use the basic Kafka clients in a Scala application. Originally inspired by the first Scala example, it goes further by showing multiple ways to produce, to consume and to configure the clients.

  1. Try it
  2. Produce
  3. Consume
  4. Read More

Try it

git clone https://github.com/DivLoic/kafka-application4s.git
cd kafka-application4s
sbt compile

Local

You first need to run Kafka and the Schema Registry. Any recent installation of Apache Kafka or the Confluent Platform will do. Several installation methods are listed on the CP Download Page.

e.g. the Confluent CLI on macOS

curl -sL https://cnfl.io/cli | sh -s -- latest -b /usr/local/bin
export CONFLUENT_HOME=...
export PATH=$PATH:$CONFLUENT_HOME
confluent local services schema-registry start

Cloud

The module also works with a cluster hosted on Confluent Cloud. The consumer.conf and producer.conf files contain the cloud-related configuration, commented out. You will need to either edit these files or define the following environment variables:

export BOOTSTRAP_SERVERS="...:9092"
export CLUSTER_API_KEY="..."
export CLUSTER_API_SECRET="..."
export SCHEMA_REGISTRY_URL="https://..."
export SR_API_KEY="..."
export SR_API_SECRET="..."

For more on Confluent Cloud login, see the documentation.
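
As a rough sketch of what these variables translate to, here is how they typically map onto the standard Confluent Cloud client properties (SASL_SSL with PLAIN, plus Schema Registry basic auth). This is illustrative only; the actual wiring in this project lives in consumer.conf and producer.conf, and the CloudProps name is made up:

import java.util.Properties

// Minimal sketch: build client properties from the environment variables above.
// sys.env(...) throws if a variable is missing, which is fine for a sketch.
object CloudProps {
  def fromEnv(): Properties = {
    val props = new Properties()
    props.put("bootstrap.servers", sys.env("BOOTSTRAP_SERVERS"))
    props.put("security.protocol", "SASL_SSL")
    props.put("sasl.mechanism", "PLAIN")
    props.put(
      "sasl.jaas.config",
      "org.apache.kafka.common.security.plain.PlainLoginModule required " +
        s"""username="${sys.env("CLUSTER_API_KEY")}" password="${sys.env("CLUSTER_API_SECRET")}";"""
    )
    // Settings picked up by the Schema Registry aware (de)serializers.
    props.put("schema.registry.url", sys.env("SCHEMA_REGISTRY_URL"))
    props.put("basic.auth.credentials.source", "USER_INFO")
    props.put("basic.auth.user.info", s"${sys.env("SR_API_KEY")}:${sys.env("SR_API_SECRET")}")
    props
  }
}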

Produce

Run:

sbt produce "-Djline.terminal=none" --error  

[asciicast: recording of the produce step]
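
Under the hood, producing boils down to the plain kafka-clients API. Here is a minimal, self-contained sketch of that idea; the topic name, keys and values are made up, localhost:9092 is assumed, and the real app produces Avro records registered in the Schema Registry:

import java.util.Properties
import org.apache.kafka.clients.producer.{KafkaProducer, ProducerConfig, ProducerRecord}
import org.apache.kafka.common.serialization.StringSerializer

object ProducerSketch extends App {
  val props = new Properties()
  props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
  props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, classOf[StringSerializer].getName)
  props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, classOf[StringSerializer].getName)

  val producer = new KafkaProducer[String, String](props)

  // Send a handful of records to a hypothetical topic.
  (1 to 5).foreach { i =>
    producer.send(new ProducerRecord[String, String]("demo-topic", s"key-$i", s"value-$i"))
  }

  producer.flush()
  producer.close()
}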

Consume

Run:

sbt consume "-Djline.terminal=none" --error  

[asciicast: recording of the consume step]
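
Likewise, consuming relies on a standard KafkaConsumer poll loop. A minimal sketch, assuming the same hypothetical demo-topic, a made-up group id, and a Scala 2.13 build (for scala.jdk.CollectionConverters):

import java.time.Duration
import java.util.Properties
import org.apache.kafka.clients.consumer.{ConsumerConfig, KafkaConsumer}
import org.apache.kafka.common.serialization.StringDeserializer
import scala.jdk.CollectionConverters._

object ConsumerSketch extends App {
  val props = new Properties()
  props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092")
  props.put(ConsumerConfig.GROUP_ID_CONFIG, "demo-group")
  props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")
  props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, classOf[StringDeserializer].getName)
  props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, classOf[StringDeserializer].getName)

  val consumer = new KafkaConsumer[String, String](props)
  consumer.subscribe(List("demo-topic").asJava)

  // Poll a few batches and print whatever comes back.
  (1 to 10).foreach { _ =>
    consumer.poll(Duration.ofMillis(500)).asScala.foreach { record =>
      println(s"${record.key} -> ${record.value} (partition ${record.partition}, offset ${record.offset})")
    }
  }

  consumer.close()
}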

Read more