Ran Tao
ForEachDStream

```scala
override def generateJob(time: Time): Option[Job] = {
  parent.getOrCompute(time) match {
    case Some(rdd) =>
      val jobFunc = () => createRDDWithLocalProperties(time, displayInnerRDDOps) {
        foreachFunc(rdd, time)
      }
      Some(new Job(time, jobFunc))
    case...
```
Does a Spark Streaming job involve the DAGScheduler? I don't see where the job's stages are split; the job seems to be handed straight to a thread pool for execution.
@klion26 Yep, the action ultimately triggers SparkContext's runJob method. I just checked the source code :smile:
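The exchange above can be sketched in a few lines. This is a minimal, self-contained illustration and NOT actual Spark code: the names `Time`, `Job`, and `fakeRunJob` are hypothetical stand-ins. The point it shows is the one made in the thread: the streaming-level Job is just a closure handed to a thread pool, and the DAGScheduler only comes into play when an action inside that closure calls SparkContext.runJob.

```scala
import java.util.concurrent.{Executors, TimeUnit}

object StreamingJobSketch {
  // Stand-in for Spark's batch Time.
  case class Time(ms: Long)

  // Streaming "Job": no stages of its own, just a function to run at a batch time.
  final class Job(val time: Time, body: () => Unit) {
    def run(): Unit = body()
  }

  // Stand-in for SparkContext.runJob, where stage planning would actually happen.
  def fakeRunJob(desc: String): Unit =
    println(s"runJob($desc) -> DAGScheduler splits stages here")

  def main(args: Array[String]): Unit = {
    // Analogous to the JobScheduler's jobExecutor thread pool.
    val pool = Executors.newFixedThreadPool(1)
    val job = new Job(Time(1000L), () => fakeRunJob("action inside foreachFunc"))
    pool.submit(new Runnable { def run(): Unit = job.run() })
    pool.shutdown()
    pool.awaitTermination(5, TimeUnit.SECONDS)
  }
}
```

So stage splitting is not visible in ForEachDStream.generateJob itself; it happens later, inside the closure, once an RDD action reaches SparkContext.runJob.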
@MartijnVisser @XComp Hi guys, PTAL. Thanks.
@MartijnVisser Hi Martijn, can you help review it?
> @chucheng92 Looking at the dependency tree, there's still commons-collections coming in via Flink itself. If we want to remove the dependency, we should also make sure that it's not...
Looks good to me in general. Just one question: does this exception have to be thrown at runtime? I noticed that the exception is thrown directly in SqlFunctions.
@caicancai Could you improve the current commit message? It differs from the Jira and PR titles.
@leonardBang @PatrickRen Sorry to bother you, could one of you take a look, please?