Jim Kleckner

Results 52 comments of Jim Kleckner

@deepujain, if you are using YARN, bring up the `NEW, NEW_SAVING, SUBMITTED, ACCEPTED, RUNNING Applications` page by browsing to the master node on port `9026` (the port AWS EMR uses, though it can vary), as in `http://127.0.0.1:9026/cluster`...
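For completeness, the same listing is available from the ResourceManager REST API; a minimal sketch, assuming the API is exposed on the same host and port (both illustrative here):

```scala
import scala.io.Source

object ListYarnApps {
  def main(args: Array[String]): Unit = {
    // Standard YARN ResourceManager REST endpoint for listing applications.
    // Host and port (9026, the EMR default mentioned above) vary by cluster.
    val url = "http://127.0.0.1:9026/ws/v1/cluster/apps?states=RUNNING"
    val json = Source.fromURL(url).mkString  // raw JSON listing of running apps
    println(json)
  }
}
```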

@hokiegeek2 glad you found it. Recently I found that Spark jobs could hang because exceptions didn't propagate up to an exit, so I added this snippet. Now the testing process doesn't...
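The snippet is cut off here; a minimal sketch of the idea, assuming a plain `main`-style driver (the `runJob` helper is a hypothetical placeholder, not the actual code):

```scala
import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("example").getOrCreate()
    var exitCode = 0
    try {
      runJob(spark)            // the real job logic goes here
    } catch {
      case t: Throwable =>
        t.printStackTrace()
        exitCode = 1           // record the failure instead of hanging
    } finally {
      spark.stop()             // always release the SparkSession
    }
    sys.exit(exitCode)         // force the driver JVM to actually exit
  }

  def runJob(spark: SparkSession): Unit = {
    // hypothetical placeholder for the actual work
  }
}
```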

> `.settings( ... ).disablePlugins(AssemblyPlugin)`

OMG, finally found this simple and important fix.
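For context, a minimal `build.sbt` sketch of where that call goes, assuming sbt-assembly is already on the plugin classpath (project names are illustrative):

```scala
// Subprojects that should not produce a fat jar opt out of sbt-assembly.
lazy val core = (project in file("core"))
  .settings(
    name := "core"
  )
  .disablePlugins(AssemblyPlugin)

lazy val app = (project in file("app"))
  .dependsOn(core)
  .settings(
    name := "app"
    // assembly settings for the deployable project would go here
  )
```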

Looks like fixing this bug requires a substantial investment in learning scalaz. Can anyone familiar with it see how to apply the `eclipseOutput` setting to the `kind=output` directory path?
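For reference, a minimal sketch of the setting in question, assuming a recent sbteclipse where `EclipseKeys` is auto-imported into `build.sbt` (the directory name is illustrative):

```scala
// eclipseOutput controls the directory sbteclipse writes into the
// .classpath entry with kind="output"; None means use sbt's classDirectory.
EclipseKeys.eclipseOutput := Some("target/eclipse-classes")
```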

+2 for supporting eclipse on sbt. Circular link back to one recipe for accomplishing this: https://groups.google.com/d/msg/simple-build-tool/zA27U9AoNSU/dikm32YWKQUJ

Nice! Note that if you are using Paul Phillips' script, it would go into `~/.sbt/0.11.3/plugins`, for example.
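For example, a minimal sketch of a global plugin file at that location (the plugin and version shown are illustrative):

```scala
// ~/.sbt/0.11.3/plugins/plugins.sbt: a plugin declared here is loaded
// globally for every project built with sbt 0.11.3.
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.1.0")
```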

I'm testing with `docker-desktop` running `K8s Rev: v1.19.3` on my Mac. From a clean k8s state in Docker, run: `helm install spark-operator .../github.com/GoogleCloudPlatform/spark-on-k8s-operator/charts/spark-operator-chart --namespace spark-operator --set sparkJobNamespace=spark --set rbac.create=true --set...

That was what I had for my original spark app spec that worked with the old chart:

```
spec:
  ...
  executor:
    instances: 5  # Number of pods
    ...
    labels:
      version: ...
```

Creating a link to related multi-version issue #610