[SPARK-48737] Perf improvement during analysis - Create exception only when it is necessary for default expression resolving
What changes were proposed in this pull request?
When a column default value is resolved in resolveColumnDefaultInAssignmentValue, the exception that should be thrown when the default value is not resolvable is currently created unconditionally, on both the success and failure paths. Exception creation can be expensive (building the message requires reading the error-classes file), so we should avoid it and create the exception only when it is actually needed.
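A minimal sketch of the pattern, with hypothetical names (the counter stands in for the cost of building the error message, which in Spark involves reading the error-classes file; the real method and error classes are not reproduced here):

```scala
object LazyErrorSketch {
  // Stand-in for the expensive part of exception creation
  // (in Spark: loading the message template from the error-classes file).
  var messageBuilds = 0

  def expensiveMessage(): String = {
    messageBuilds += 1
    "default value is not resolvable"
  }

  // Before: the exception (and its message) is built on every call,
  // even when resolution succeeds and it is never thrown.
  def resolveEager(resolved: Boolean): Unit = {
    val err = new IllegalStateException(expensiveMessage())
    if (!resolved) throw err
  }

  // After: the exception is built only on the failure path.
  def resolveLazy(resolved: Boolean): Unit = {
    if (!resolved) throw new IllegalStateException(expensiveMessage())
  }
}
```

With resolveEager, a successful resolution still pays for one message build; with resolveLazy, the success path pays nothing.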
Why are the changes needed?
Performance improvement during analysis (fewer file reads).
Does this PR introduce any user-facing change?
No
How was this patch tested?
No tests yet
Was this patch authored or co-authored using generative AI tooling?
No
@cloud-fan Can you review it? Is there a way to write a test for this? I don't know how to track exception creation.
thanks, merging to master!