Ben Fradet

Results 114 comments of Ben Fradet

True, forgot about the implicits and Java. You might want to try:
```java
import static com.github.benfradet.spark.kafka010.write.dStreamToKafkaWriter;

KafkaWriter w = dStreamToKafkaWriter(javaDStream.dStream());
w.writeToKafka(...);
```

Have you tried having `AbstractFunction1` extend `Serializable`:
```java
// An anonymous class can't both extend AbstractFunction1 and add an
// interface, so introduce a small serializable subclass first
abstract class SerializableFunction1<T, R>
    extends AbstractFunction1<T, R> implements Serializable {}

Function1<String, ProducerRecord<String, String>> f =
    new SerializableFunction1<String, ProducerRecord<String, String>>() {
      @Override
      public ProducerRecord<String, String> apply(final String s) {
        return new ProducerRecord<>("my-topic", s);
      }
    };
```
...

Great stuff 👍 Feel free to open a PR adding a section in the readme with your code

What do you mean? The producer record is created in the function `f`, so every time a message is sent a new producer record is created.

Yeah, you wrote:

> Does it take care that __ProducerRecord__ is created only once and used across all executors?

That's what confused me. To answer your question, only one __Producer__ is...
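To illustrate the distinction, here is a minimal, self-contained sketch (the `Producer` stand-in and cache are hypothetical, for illustration only, not the library's actual implementation): one producer is cached per configuration per JVM (i.e. per executor), while the mapping function builds a fresh record for every message.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class ProducerCacheSketch {
    // Stand-in for KafkaProducer; hypothetical, for illustration only.
    static class Producer {}

    // One Producer per config per JVM (i.e. per executor), created lazily.
    private static final Map<Map<String, Object>, Producer> cache = new HashMap<>();

    static Producer getOrCreate(Map<String, Object> config) {
        return cache.computeIfAbsent(config, c -> new Producer());
    }

    public static void main(String[] args) {
        Map<String, Object> config = new HashMap<>();
        Producer p1 = getOrCreate(config);
        Producer p2 = getOrCreate(config);
        // The same Producer instance is reused across messages...
        System.out.println(p1 == p2); // true

        // ...but the mapping function builds a fresh record per message.
        Function<String, String[]> f = s -> new String[] {"my-topic", s};
        System.out.println(f.apply("a") == f.apply("a")); // false
    }
}
```

The same idea applies per executor in Spark: each executor JVM lazily creates and caches its own producer, while `f` runs once per message.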

You need to turn your Java map into a Scala one:
```java
import scala.collection.JavaConverters;

// ...
kafkaWriter.writeToKafka(
    JavaConverters.mapAsScalaMapConverter(producerConfig).asScala(),
    f,
    Option.empty());
```

I haven't changed anything in my workflow :fearful:

Hello, it's not supported at the moment, no. I believe it is a fairly recent endpoint. A PR would definitely be welcome though :)

The issue is that this will be a bit noisy for watchers of the repo; however, I think we can add some unit tests?