[SPARK-46813][CORE] Don't set the executor id to "driver" when SparkContext is created by the executor side
What changes were proposed in this pull request?
Don't set the executor id to "driver" when the SparkContext is created on the executor side.
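The proposed change can be sketched as a simple conditional guard. The following is a hypothetical, simplified model of the SparkContext initialization logic, using a plain map in place of SparkConf; the method name initExecutorId and the onExecutorSide flag are illustrative and not Spark's actual API:

```java
import java.util.HashMap;
import java.util.Map;

public class ExecutorIdGuard {
    static final String EXECUTOR_ID_KEY = "spark.executor.id";
    static final String DRIVER_IDENTIFIER = "driver";

    // Hypothetical sketch: only assign the "driver" executor id when the
    // SparkContext is created on the driver. If the context is created on
    // the executor side, the executor's real id (already placed in the conf
    // by the executor launch path) must be preserved, not overwritten.
    static void initExecutorId(Map<String, String> conf, boolean onExecutorSide) {
        if (!onExecutorSide) {
            conf.put(EXECUTOR_ID_KEY, DRIVER_IDENTIFIER);
        }
        // else: leave the pre-set executor id untouched
    }

    public static void main(String[] args) {
        // Driver-side context: the id becomes "driver".
        Map<String, String> driverConf = new HashMap<>();
        initExecutorId(driverConf, false);
        System.out.println(driverConf.get(EXECUTOR_ID_KEY)); // driver

        // Executor-side context: the real id "3", set earlier, is preserved.
        Map<String, String> executorConf = new HashMap<>();
        executorConf.put(EXECUTOR_ID_KEY, "3");
        initExecutorId(executorConf, true);
        System.out.println(executorConf.get(EXECUTOR_ID_KEY)); // 3
    }
}
```

Before this change, the executor-side branch would also stamp "driver" over the executor's real id, which is the bug being fixed.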
Why are the changes needed?
Fix a bug: a SparkContext created on the executor side would overwrite the executor's real id with "driver".
Does this PR introduce any user-facing change?
No
How was this patch tested?
Was this patch authored or co-authored using generative AI tooling?
No
May I ask your use case, @huangxiaopingRD? It would be great if you could put that into the PR description, because spark.executor.allowSparkContext is not recommended in the community since Apache Spark 3.1.
Thank you for your review, @dongjoon-hyun. The change just makes the code more reasonable; there is no special use case.
Although I am surprised that we have such an ability, the code change here looks incorrect to me, as the RpcEnv in a sub-SparkContext relies on this property.
You are right. But usually, the EXECUTOR_ID that NettyRpcEnv obtains from SparkConf has already been correctly set in advance. Details can be seen in https://github.com/apache/spark/pull/23560/files
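The point being made above can be sketched as follows: the RpcEnv reads the executor id from the conf at construction time, so the executor launch path must have stored the real id first. This is a hypothetical model, not Spark's actual NettyRpcEnv code; resolveExecutorId is an illustrative name:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

public class RpcEnvSketch {
    // Hypothetical sketch of the dependency discussed above: an RpcEnv-like
    // component resolves the executor id from the conf, so the id must be in
    // place before the RpcEnv is created.
    static String resolveExecutorId(Map<String, String> conf) {
        return Optional.ofNullable(conf.get("spark.executor.id"))
                .orElseThrow(() -> new IllegalStateException(
                        "spark.executor.id must be set before the RpcEnv is created"));
    }

    public static void main(String[] args) {
        Map<String, String> conf = new HashMap<>();
        // The executor launch path sets the real id before any RpcEnv exists,
        // so the guard in the PR does not break this lookup.
        conf.put("spark.executor.id", "7");
        System.out.println(resolveExecutorId(conf)); // 7
    }
}
```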
We're closing this PR because it hasn't been updated in a while. This isn't a judgement on the merit of the PR in any way. It's just a way of keeping the PR queue manageable. If you'd like to revive this PR, please reopen it and ask a committer to remove the Stale tag!