zipkin-sparkstreaming
Document and test how to add an ad-hoc adjuster
Adding an adjuster should be as simple as the following:

- Add `zipkin-sparkstreaming-job` to your classpath (the laziest option is the all jar).
- Create a Java source file that:
  - extends `zipkin.sparkstreaming.Adjuster`
  - has the annotation `org.springframework.context.annotation.Configuration`
- Create a resource file `META-INF/spring.factories` that contains:

      org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
      your.package.FooAdjuster

- Put your compiled class and `META-INF/spring.factories` into a jar, and place that jar on the classpath of the spark job.
In Maven, the following project structure would accomplish this:

    src/main/java/your/package/FooAdjuster.java
    src/main/resources/META-INF/spring.factories
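For illustration, a minimal adjuster class could look like the sketch below. This is a hedged outline: the exact hook methods exposed by `zipkin.sparkstreaming.Adjuster` should be checked against the class itself, so only the wiring (package, superclass, annotation) is shown. Note also that `your.package` is a placeholder, not a legal Java package name, since `package` is a reserved word.

```java
package your.pkg; // placeholder package; "your.package" from the text is not
                  // legal Java, as "package" is a reserved word

import org.springframework.context.annotation.Configuration;

import zipkin.sparkstreaming.Adjuster;

// @Configuration plus the spring.factories entry is what lets Spring Boot
// discover this class through auto-configuration.
@Configuration
public class FooAdjuster extends Adjuster {
  // Override the adjustment hook(s) declared by Adjuster here; consult the
  // superclass for the exact method signatures rather than guessing them.
}
```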
Why this should work:

The `spring.factories` file allows Spring Boot to pick up and load the class you made. This is called auto-configuration: like the Java `ServiceLoader`, but better.
`ZipkinSparkStreamingConfiguration` looks for any beans of type `Adjuster` and collects them. These are applied in the spark job. By default, only the all-jar goes across the cluster. However, any jar which contains an adjuster is also sent (implicitly via `SparkConf.sparkJars`). This can also be manually controlled via `zipkin.sparkstreaming.spark-jars`.
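When the implicit behavior isn't wanted, the jar list can be pinned explicitly. A sketch of what that might look like, assuming the property accepts a path (the exact value format, such as whether multiple jars are comma-separated, should be checked against the job's configuration docs):

```properties
# zipkin-sparkstreaming-job configuration, e.g. in application.properties
# or passed as -Dzipkin.sparkstreaming.spark-jars=... on the command line.
# The jar path below is a hypothetical example.
zipkin.sparkstreaming.spark-jars=/path/to/foo-adjuster.jar
```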
So the key here is that we shouldn't need a custom build to use an ad-hoc adjuster. In other words, users shouldn't need to re-shade zipkin-dependencies-job or invalidate docker layers.
I think @naoman is going to give this a try in https://github.com/openzipkin/zipkin-sparkstreaming-example, then we could probably link to that from the main readme.
The code is checked in. I've also added a link in the main readme.