
[SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side

Open · huangxiaopingRD opened this pull request on Jan 23 '24 · 3 comments

What changes were proposed in this pull request?

Don't set the executor id to "driver" when the SparkContext is created on the executor side.
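
A minimal sketch of the shape of the proposed guard (illustrative only; the real change lives inside SparkContext's initialization, and the exact code may differ):

```scala
import org.apache.spark.{SparkConf, SparkContext, TaskContext}

// Sketch of the proposed fix: only tag the context as the driver when
// it actually runs on the driver side. TaskContext.get() returns null
// outside of a running task, so a non-null value means this code is
// executing on an executor.
def maybeTagAsDriver(conf: SparkConf): Unit = {
  if (TaskContext.get() == null) {
    conf.set("spark.executor.id", SparkContext.DRIVER_IDENTIFIER) // "driver"
  }
  // Otherwise, keep the executor id that the launcher already set.
}
```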

Why are the changes needed?

Fix a bug: when a SparkContext is created on the executor side, its initialization unconditionally overwrites spark.executor.id with "driver", discarding the executor id that was set in advance.
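
To make the symptom concrete, a hedged sketch (the "42" value below is pretend launcher state for illustration; real executor-side contexts additionally require spark.executor.allowSparkContext=true, which is discouraged):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustration of the symptom: the SparkContext constructor overwrites
// whatever spark.executor.id was set beforehand.
val conf = new SparkConf()
  .setMaster("local[1]")
  .setAppName("demo")
  .set("spark.executor.id", "42") // pretend an executor launcher set this

val sc = new SparkContext(conf)
// Before the fix, the constructor rewrites the id unconditionally:
println(sc.getConf.get("spark.executor.id")) // prints "driver", not "42"
sc.stop()
```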

Does this PR introduce any user-facing change?

No

How was this patch tested?

Was this patch authored or co-authored using generative AI tooling?

No

huangxiaopingRD · Jan 23 '24

May I ask about your use case, @huangxiaopingRD? It would be great if you could add that to the PR description, because spark.executor.allowSparkContext has been discouraged by the community since Apache Spark 3.1.
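
For background, executor-side SparkContext creation is gated by that config; a rough sketch of the check (assumed shape, not the exact Spark source):

```scala
import org.apache.spark.{SparkConf, SparkException, TaskContext}

// Assumed shape of the guard: unless the escape hatch is enabled,
// creating a SparkContext inside a running task is rejected.
def assertOnDriver(conf: SparkConf): Unit = {
  val allowed = conf.getBoolean("spark.executor.allowSparkContext", false)
  if (!allowed && TaskContext.get() != null) {
    throw new SparkException(
      "SparkContext should only be created and accessed on the driver.")
  }
}
```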

Thank you for your review, @dongjoon-hyun. This change just makes the code more reasonable; there is no special use case.

huangxiaopingRD · Jan 24 '24

Although I am surprised that we have such an ability, the code change here looks incorrect to me, as the RpcEnv in a sub-SparkContext relies on this property.

yaooqinn · Jan 24 '24

> Although I am surprised that we have such an ability, the code change here looks incorrect to me, as the RpcEnv in a sub-SparkContext relies on this property.

You are right, but usually the EXECUTOR_ID that NettyRpcEnv obtains from SparkConf has already been set correctly in advance. Details can be seen in https://github.com/apache/spark/pull/23560/files

huangxiaopingRD · Jan 24 '24
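
To summarize the dependency discussed above, a simplified sketch (not the actual NettyRpcEnv source) of how the RPC layer derives its role from spark.executor.id, which is why that value must already be correct before an RpcEnv is created:

```scala
import org.apache.spark.SparkConf

// Simplified: the RPC layer reads spark.executor.id from SparkConf to
// learn whether it is serving the driver or an executor, so the value
// has to be set correctly before the RpcEnv is constructed.
def rpcRole(conf: SparkConf): Option[String] =
  conf.getOption("spark.executor.id").map {
    case "driver" => "driver"
    case _        => "executor"
  }
```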

We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable. If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!

github-actions[bot] · May 04 '24