Ben Fradet
> Is there a particular reason you haven't been mocking dependencies out of your unit tests? That's a fallacy I'm aware of but haven't had the time to fix. Ideally,...
seems like a sensible idea :+1:
Hello, this is just at the idea stage. I'm not familiar enough with Python and PySpark to do it myself.
hello, no this isn't supported at all at the moment
hey 0.5.0 is 2.12 compatible: https://search.maven.org/artifact/com.github.benfradet/spark-kafka-writer_2.12/0.5.0/jar
ah yes I might have published 2.12 only, I'll see what I can do for 2.11
This is weird, because we stop the session for each test without issues: https://github.com/BenFradet/spark-kafka-writer/blob/master/src/test/scala/com/github/benfradet/spark/kafka/writer/SKRSpec.scala#L68-L77
ah, it might well be :thinking:
It should be possible by instantiating your own `Function1` as an anonymous subclass of `AbstractFunction1` (from `scala.runtime`), something like this (topic name and types are illustrative):

```java
Function1<String, ProducerRecord<String, String>> f =
    new AbstractFunction1<String, ProducerRecord<String, String>>() {
      @Override
      public ProducerRecord<String, String> apply(String s) {
        return new ProducerRecord<>("my-topic", s);
      }
    };
```
My bad, I didn't take into account the fact that you had a `JavaDStream`. Can't you call [`dstream`](http://spark.apache.org/docs/latest/api/java/org/apache/spark/streaming/api/java/JavaDStream.html#dstream()) on your `JavaDStream` and then `writeToKafka`?
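To make the interop pattern from the replies above concrete, here is a minimal, self-contained sketch. The `Function1`/`AbstractFunction1` types below are simplified stand-ins for the real `scala.Function1` and `scala.runtime.AbstractFunction1` (so this compiles without Scala or Spark on the classpath), and `transform` stands in for the record-building function you would pass to `writeToKafka`:

```java
// Stand-in for scala.Function1 (NOT the real scala-library interface).
interface Function1<T, R> {
    R apply(T t);
}

// Stand-in for scala.runtime.AbstractFunction1, which Java code typically
// subclasses anonymously to satisfy a Scala API expecting a Function1.
abstract class AbstractFunction1<T, R> implements Function1<T, R> {}

public class InteropSketch {
    // In real code this Function1 would build an
    // org.apache.kafka.clients.producer.ProducerRecord from each message.
    static String transform(String message) {
        Function1<String, String> toRecord =
            new AbstractFunction1<String, String>() {
                @Override
                public String apply(String s) {
                    return "record(" + s + ")";
                }
            };
        return toRecord.apply(message);
    }

    public static void main(String[] args) {
        System.out.println(transform("hello"));
    }
}
```

The same anonymous-subclass shape works against the real Scala types; the only difference is the import of `scala.runtime.AbstractFunction1` and a `ProducerRecord` return type.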