Sean Owen
Wait, this was merged to master? Let's not merge to 3.5, no, because I don't think this actually fixes the problem (see the thread here - I didn't grok that...
It looks like there are many not-quite-the-same changes related to this JIRA. The change I'm concerned about is in https://github.com/apache/spark/pull/43494 - the whole approach of trying to do floating-point math with a...
Ok, so then there are multiple related but different changes going on here. I'm only questioning the floating-point handling part, which doesn't really work. Ideally that would...
I'm confused too; didn't we have a long conversation about this? The essence of your fix is this: https://github.com/apache/spark/pull/44690#discussion_r1477679566 but it simply doesn't work in some cases: https://github.com/apache/spark/pull/44690#discussion_r1479110884
So, this _doesn't_ work:
```scala
scala> val ONE_ENTIRE_RESOURCE: Long = 10000000000000000L
     | val taskAmount = 1.0 / 11.0
     | var total: Double = ONE_ENTIRE_RESOURCE
     | for (_ <- 1 to 11) { total -= taskAmount * ONE_ENTIRE_RESOURCE }
     | total  // not 0.0: the per-task rounding error accumulates
```
float -> integer conversion in the JVM always truncates, so yes, for positive numbers you are rounding down by doing this. I think my point is, the fix actually has...
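To make that concrete, here's a minimal sketch of the JVM cast behavior (the literals are just illustrations, not values from the PR):
```scala
// A double -> long cast on the JVM truncates toward zero (JLS 5.1.3).
// For positive values that is a round-down, but it is not a floor in general.
val a = 0.9999999999.toLong  // 0: fractional part dropped
val b = 1.0000000001.toLong  // 1: still rounds down for positives
val c = (-0.9).toLong        // 0: truncation toward zero, not floor(-0.9) = -1
```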
Well, long * double is always carried out in double precision. Casting it to long doesn't somehow make the math exact. You will always have some truncation when casting to...
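A minimal sketch of that point, reusing the 10^16-slots / 11-tasks numbers from the example above:
```scala
val ONE_ENTIRE_RESOURCE: Long = 10000000000000000L
val taskAmount = 1.0 / 11.0

// Long * Double widens the Long to a Double first, so the product is a
// Double and already carries double-precision rounding error.
val perTask: Double = ONE_ENTIRE_RESOURCE * taskAmount  // ≈ 9.0909...E14

// Casting back to Long just truncates; 11 truncated slices no longer
// add up to the whole resource.
println(perTask.toLong * 11 == ONE_ENTIRE_RESOURCE)  // false
```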
This is how your current change works, though: by rounding down by (more than) 1 ulp. Yes, error accumulates, and yes, you still end up with the 'wrong' total resource...
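For contrast, a sketch of the kind of integer-only accounting that avoids the problem entirely (hypothetical, not what the PR does; `slotsPerTask` and `remainder` are invented names):
```scala
val ONE_ENTIRE_RESOURCE: Long = 10000000000000000L
val numTasks = 11

// Do the division once in integer arithmetic; each task gets an exact
// Long number of slots and the leftover is handled explicitly.
val slotsPerTask: Long = ONE_ENTIRE_RESOURCE / numTasks  // 909090909090909
val remainder: Long = ONE_ENTIRE_RESOURCE % numTasks     // 1

var total: Long = ONE_ENTIRE_RESOURCE
for (_ <- 1 to numTasks) total -= slotsPerTask
total -= remainder
println(total)  // 0 exactly: no floating-point error to accumulate
```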
I'm guessing you have an old version of pytorch or transformers. Please install the requirements as shown in this repo's requirements.txt.
What version of torch and transformers do you have, just to be sure? It could be that we need to pin torch.