Himadri Pal
Here is a revised PR #9803 for this issue for Spark 3.5.
Thanks for reviewing, @RussellSpitzer. Fixed all the review comments.
@JanKaul WDYT? I think this PR is ready for review; I can add update and delete in a separate PR.
I was trying to take a look at this one. I added this test in CometCastSuite:

```scala
test("cast between decimals with different precision and scale") { val...
```
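For context, here is a minimal, self-contained sketch of the kind of decimal-to-decimal cast such a test exercises, using plain Spark rather than the CometCastSuite helpers. The column names and the precisions/scales are illustrative assumptions, not taken from the actual test:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.DecimalType

object DecimalCastSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("decimal-cast-sketch")
      .getOrCreate()
    import spark.implicits._

    // Start from DECIMAL(10, 2) values built out of string literals.
    val df = Seq("1.23", "-10.00", "9999999.99").toDF("s")
      .withColumn("d", col("s").cast(DecimalType(10, 2)))

    // Cast between decimals with a different precision and scale.
    val converted = df.withColumn("converted", col("d").cast(DecimalType(38, 10)))
    converted.show(truncate = false)

    spark.stop()
  }
}
```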
Thank you @andygrove for the guidance and the tip. I'll explore `spark.comet.explainFallback.enabled=true` as well.
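For reference, a sketch of how that flag can be set when building a session. The plugin and enablement settings shown alongside it are the usual Comet setup and are assumptions about the environment, not taken from this thread:

```scala
import org.apache.spark.sql.SparkSession

// Enable Comet and ask it to explain why operators fall back to Spark.
// The plugin/enablement settings are typical Comet configuration and are
// assumptions here, not something stated in the thread.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("comet-fallback-explain")
  .config("spark.plugins", "org.apache.spark.CometPlugin")
  .config("spark.comet.enabled", "true")
  .config("spark.comet.explainFallback.enabled", "true")
  .getOrCreate()
```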
This [arrow PR](https://github.com/apache/arrow-rs/pull/6836) will fix this issue completely. Waiting for the [arrow release](https://github.com/apache/arrow-rs/issues/6342) and then a DataFusion release later.
I was looking at this issue, and now the result is the same with Comet enabled and with Comet disabled:

```sql
+-----+-----------+
|    n|  converted|
+-----+-----------+
|-10.0|     -10.00|
| +1.0|       1.00|
...
```
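A hedged guess at the kind of query that produces output like that; the view name, column name, and target decimal type are assumptions for illustration, not the actual repro:

```scala
import org.apache.spark.sql.SparkSession

object SignedStringToDecimalSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("signed-string-to-decimal")
      .getOrCreate()
    import spark.implicits._

    // Strings carrying explicit '+'/'-' signs, cast to a decimal with scale 2.
    Seq("-10.0", "+1.0").toDF("n").createOrReplaceTempView("vals")
    spark.sql("SELECT n, CAST(n AS DECIMAL(10, 2)) AS converted FROM vals").show()

    spark.stop()
  }
}
```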
take, this [arrow PR](https://github.com/apache/arrow-rs/pull/6905#pullrequestreview-2523334113) should fix this one.
@szehon-ho please take a look.
> > With this fix, an incompatible number of buckets does not return 1 as the GCD, hence the buckets do not reduce to 1 when used with an incompatible number of...
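For readers skimming the thread, a toy sketch of the bucket-count arithmetic being discussed; this is plain GCD behaviour, not the actual Iceberg partition-reducer code:

```scala
object BucketGcdSketch {
  @annotation.tailrec
  def gcd(a: Int, b: Int): Int = if (b == 0) a else gcd(b, a % b)

  def main(args: Array[String]): Unit = {
    // Compatible bucket counts: one divides the other, so the GCD stays a
    // useful common bucket count (8 and 4 reduce to 4).
    println(gcd(8, 4)) // 4

    // Incompatible bucket counts: the GCD collapses to 1, i.e. everything
    // would end up in a single bucket (7 and 5 reduce to 1).
    println(gcd(7, 5)) // 1
  }
}
```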