[SPARK-52381][Core] JsonProtocol: Only accept subclasses of SparkListenerEvent
What changes were proposed in this pull request?
Tidy up JsonProtocol so that it only parses JSON for classes that are subclasses of SparkListenerEvent. See https://issues.apache.org/jira/browse/SPARK-52381
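For illustration only, a minimal sketch of the idea (not the actual diff; the helper name and error handling here are assumptions): before asking Jackson to instantiate the class named in the event JSON, check that it extends SparkListenerEvent and reject anything else.

```scala
import org.apache.spark.scheduler.SparkListenerEvent

// Hypothetical helper, not the real change: resolve the class named in the
// JSON "Event" field and accept it only if it is a SparkListenerEvent subclass.
def resolveEventClass(className: String): Class[_ <: SparkListenerEvent] = {
  val clazz = Class.forName(className)
  require(classOf[SparkListenerEvent].isAssignableFrom(clazz),
    s"$className is not a subclass of SparkListenerEvent")
  clazz.asSubclass(classOf[SparkListenerEvent])
}
```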
Why are the changes needed?
Tidier code; see the mailing list discussion at https://lists.apache.org/thread/9zwkdo85wcdfppgqvbhjly8wdgf595yp
Does this PR introduce any user-facing change?
No
How was this patch tested?
Unit test
Was this patch authored or co-authored using generative AI tooling?
No
Test failure looks related, otherwise LGTM
[info] - SPARK-47148: AQE should avoid to submit shuffle job on cancellation *** FAILED *** (6 seconds, 95 milliseconds)
[info] scala.`package`.Seq.apply[org.apache.spark.SparkException](error).++[Throwable](scala.Option.apply[Throwable](error.getCause())).++[Throwable](scala.Predef.wrapRefArray[Throwable](error.getSuppressed())).exists(((e: Throwable) => e.getMessage().!=(null).&&(e.getMessage().contains("coalesce test error")))) was false (AdaptiveQueryExecSuite.scala:938)
cc @LuciferYang
It looks like the discussion thread is not accessible? I am trying to understand the rationale for the change - is it to prevent a class cast exception?
@pan3793 that test seems to have passed on the last build for this PR. I have modified the test in this PR so that ScalaTest can produce more informative failure messages for the assertion, but the slow SQL tests passed, so the change didn't end up helping to diagnose anything.
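As an aside, a hedged sketch of the kind of restructuring meant here (a hypothetical test, not the actual AdaptiveQueryExecSuite change): collecting the messages first and passing a clue to `assert` lets ScalaTest report what was actually observed instead of a bare `... was false`.

```scala
import org.scalatest.funsuite.AnyFunSuite

class InformativeAssertionSuite extends AnyFunSuite {
  test("failure message names the observed exception messages") {
    val error = new RuntimeException("coalesce test error", new RuntimeException("inner"))
    // Gather the error itself, its cause and any suppressed exceptions.
    val all = Seq[Throwable](error) ++ Option(error.getCause) ++ error.getSuppressed
    val messages = all.flatMap(e => Option(e.getMessage))
    // The clue is printed on failure, so the report shows which messages were seen.
    assert(messages.exists(_.contains("coalesce test error")), s"messages were: $messages")
  }
}
```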
Friendly ping @mridulm, what do you think about this PR?
IMO this is not a security issue - as discussed in the thread, users need too much access to begin with in order to leverage the "exploit" (please do let me know if I am missing some nuance here where this can be a problem!). Having said that, the change itself looks fine and does not introduce any behavior change.
I am not strongly in favor of, or against the change @LuciferYang - will let you drive it.
@pjfanning Is this PR the final solution for the issue in https://lists.apache.org/thread/9zwkdo85wcdfppgqvbhjly8wdgf595yp? If not, please remove the relevant parts from the PR description.
also cc @xuanyuanking
@LuciferYang this is a fix for the issue described in https://lists.apache.org/thread/9zwkdo85wcdfppgqvbhjly8wdgf595yp - I don't think any other changes would be needed other than what is here. I am one of the maintainers of the core Jackson libraries and have a pretty good idea of how they work.
Merged into master for Apache Spark 4.1.0. Thanks @pjfanning @pan3793
Since directly merging this change into branch-4.0 or branch-3.5 will result in compilation failures, if it needs to be backported to those older branches, please submit a corresponding PR. @pjfanning Thanks ~
Thanks @LuciferYang - I created #51312 for branch 4.0.