
Convert envProps to loading all props from properties file

Open ybyzek opened this issue 4 years ago • 2 comments

Summary: Allow KTs to be more portable between Docker and non-Docker environments.

For example, convert this line:

props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, envProps.getProperty("bootstrap.servers"));

to the following (no explicit setting of BOOTSTRAP_SERVERS_CONFIG is required):

final Properties props = KafkaProducerApplication.loadProperties(args[0]);
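A minimal sketch of what such a loadProperties helper could look like (the class name PropertiesLoader and the demo file contents below are illustrative assumptions, not the tutorials' actual code):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class PropertiesLoader {

    // Load every entry from the given file into a Properties object, so
    // keys like bootstrap.servers no longer need to be copied one by one
    // into the client configuration.
    public static Properties loadProperties(String fileName) throws IOException {
        Properties props = new Properties();
        try (FileInputStream input = new FileInputStream(fileName)) {
            props.load(input);
        }
        return props;
    }

    public static void main(String[] args) throws IOException {
        // Create a demo config file so this example is self-contained.
        Path config = Files.createTempFile("app", ".properties");
        Files.writeString(config, "bootstrap.servers=localhost:9092\nacks=all\n");

        Properties props = loadProperties(config.toString());
        // A Kafka client constructed from props would pick up
        // bootstrap.servers directly from the file; no explicit
        // props.put(BOOTSTRAP_SERVERS_CONFIG, ...) is needed.
        System.out.println(props.getProperty("bootstrap.servers"));
    }
}
```

Because the whole file is loaded verbatim, switching between Docker and non-Docker environments only requires swapping the properties file, not editing code.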

Target KTs to change

grep -ir put * | grep -i getProperty | grep -v input | awk -F: '{print $1;}' | uniq

aggregating-average/kstreams/code/src/main/java/io/confluent/developer/RunningAverage.java
aggregating-count/kstreams/code/src/test/java/io/confluent/developer/AggregatingCountTest.java
aggregating-count/kstreams/code/src/main/java/io/confluent/developer/AggregatingCount.java
aggregating-minmax/kstreams/code/src/test/java/io/confluent/developer/AggregatingMinMaxTest.java
aggregating-minmax/kstreams/code/src/main/java/io/confluent/developer/AggregatingMinMax.java
aggregating-sum/kstreams/code/src/test/java/io/confluent/developer/AggregatingSumTest.java
aggregating-sum/kstreams/code/src/main/java/io/confluent/developer/AggregatingSum.java
cogrouping-streams/kstreams/code/src/test/java/io/confluent/developer/CogroupingStreamsTest.java
cogrouping-streams/kstreams/code/src/main/java/io/confluent/developer/CogroupingStreams.java
connect-add-key-to-source/kstreams/code/src/test/java/io/confluent/developer/connect/jdbc/specificavro/StreamsIngestTest.java
connect-add-key-to-source/kstreams/code/src/main/java/io/confluent/developer/connect/jdbc/specificavro/StreamsIngest.java
dynamic-output-topic/kstreams/code/src/test/java/io/confluent/developer/DynamicOutputTopicTest.java
dynamic-output-topic/kstreams/code/src/main/java/io/confluent/developer/DynamicOutputTopic.java
filtering/kstreams/code/src/test/java/io/confluent/developer/FilterEventsTest.java
filtering/kstreams/code/src/main/java/io/confluent/developer/FilterEvents.java
finding-distinct/kstreams/code/src/test/java/io/confluent/developer/FindDistinctEventsTest.java
finding-distinct/kstreams/code/src/main/java/io/confluent/developer/FindDistinctEvents.java
fk-joins/kstreams/code/src/test/java/io/confluent/developer/FkJoinTableToTableTest.java
fk-joins/kstreams/code/src/main/java/io/confluent/developer/FkJoinTableToTable.java
joining-stream-table/kstreams/code/src/test/java/io/confluent/developer/JoinStreamToTableTest.java
joining-stream-table/kstreams/code/src/main/java/io/confluent/developer/JoinStreamToTable.java
kafka-producer-application/kafka/code/src/test/java/io/confluent/developer/KafkaProducerApplicationTest.java
kafka-producer-application/kafka/code/src/main/java/io/confluent/developer/KafkaProducerApplication.java
kafka-producer-application-callback/kafka/code/src/test/java/io/confluent/developer/KafkaProducerCallbackApplicationTest.java
kafka-producer-application-callback/kafka/code/src/main/java/io/confluent/developer/KafkaProducerCallbackApplication.java
kafka-streams-schedule-operations/kstreams/code/src/test/java/io/confluent/developer/KafkaStreamsPunctuationTest.java
kafka-streams-schedule-operations/kstreams/code/src/main/java/io/confluent/developer/KafkaStreamsPunctuation.java
merging/kstreams/code/src/test/java/io/confluent/developer/MergeStreamsTest.java
merging/kstreams/code/src/main/java/io/confluent/developer/MergeStreams.java
message-ordering/kafka/code/src/test/java/io/confluent/developer/KafkaProducerApplicationTest.java
message-ordering/kafka/code/src/main/java/io/confluent/developer/KafkaProducerApplication.java
naming-changelog-repartition-topics/kstreams/code/src/main/java/io/confluent/developer/NamingChangelogAndRepartitionTopics.java
serialization/kstreams/code/src/test/java/io/confluent/developer/serialization/SerializationTutorialTest.java
serialization/kstreams/code/src/main/java/io/confluent/developer/serialization/SerializationTutorial.java
serialization/kstreams/markup/dev/make-topology.adoc
splitting/kstreams/code/src/test/java/io/confluent/developer/SplitStreamTest.java
splitting/kstreams/code/src/main/java/io/confluent/developer/SplitStream.java
streams-to-table/kstreams/code/src/test/java/io/confluent/developer/StreamsToTableTest.java
streams-to-table/kstreams/code/src/main/java/io/confluent/developer/StreamsToTable.java
time-concepts/ksql/code/src/main/java/io/confluent/developer/KafkaProducerDevice.java
transforming/kafka/code/src/test/java/io/confluent/developer/TransformEventsTest.java
transforming/kafka/code/src/main/java/io/confluent/developer/TransformEvents.java
transforming/kstreams/code/src/test/java/io/confluent/developer/TransformStreamTest.java
transforming/kstreams/code/src/main/java/io/confluent/developer/TransformStream.java
tumbling-windows/kstreams/code/src/test/java/io/confluent/developer/TumblingWindowTest.java
tumbling-windows/kstreams/code/src/main/java/io/confluent/developer/TumblingWindow.java

ybyzek avatar Dec 09 '20 18:12 ybyzek

@bbejeck I have a hypothesis as to why you implemented some of these KTs with envProps (as a middleman) instead of going directly to the Properties object used to construct the Kafka client.

In some instances, there are additional properties the code needs to extract from the input file. Here is an example:

      topics.add(new NewTopic(
          envProps.getProperty("output.topic.name"),
          Integer.parseInt(envProps.getProperty("output.topic.partitions")),
          Short.parseShort(envProps.getProperty("output.topic.replication.factor"))));

Possible solution: maybe these should be two separate input files? One file for connection info (which goes into the Properties parameter passed to the client) and one for other settings (which are used elsewhere in the code).
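The two-file approach could be sketched as follows (the class name SplitConfigExample, the file names, and the demo property values are assumptions for illustration; the real tutorials would ship their own properties files):

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

public class SplitConfigExample {

    // Same generic loader used for both files.
    static Properties loadProperties(String fileName) throws IOException {
        Properties props = new Properties();
        try (FileInputStream input = new FileInputStream(fileName)) {
            props.load(input);
        }
        return props;
    }

    public static void main(String[] args) throws IOException {
        // Demo files so this example is self-contained.
        Path connection = Files.createTempFile("connection", ".properties");
        Files.writeString(connection, "bootstrap.servers=localhost:9092\n");
        Path app = Files.createTempFile("app", ".properties");
        Files.writeString(app,
            "output.topic.name=output-topic\n"
          + "output.topic.partitions=1\n"
          + "output.topic.replication.factor=1\n");

        // File 1: connection info, handed straight to the Kafka client
        // with no per-key copying.
        Properties clientProps = loadProperties(connection.toString());

        // File 2: application-level settings used elsewhere in the code,
        // e.g. for topic creation via NewTopic.
        Properties appProps = loadProperties(app.toString());
        String topicName = appProps.getProperty("output.topic.name");
        int partitions = Integer.parseInt(appProps.getProperty("output.topic.partitions"));
        short replication = Short.parseShort(appProps.getProperty("output.topic.replication.factor"));

        System.out.println(clientProps.getProperty("bootstrap.servers"));
        System.out.println(topicName + " " + partitions + " " + replication);
    }
}
```

Keeping connection settings in their own file means the same code can point at a local broker or Confluent Cloud by swapping one file, while the application settings file stays untouched.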

ybyzek avatar Dec 09 '20 23:12 ybyzek

Note: this is being done one KT at a time during the blitz on creating CCloud KTs.

ybyzek avatar May 19 '21 10:05 ybyzek