Change default interpretation of real literals from DOUBLE to DECIMAL
In principle, this should fix #3949.
This breaks a large number of other tests because DECIMAL-to-string formatting differs slightly from DOUBLE-to-string formatting, leading to failures where tests expect `1.000000` but get `1.0`. This is probably not the only cause of the breakage, but it is the most prevalent one.
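For illustration, here is a minimal standalone C++ sketch of why the two formats diverge (this is not kuzu's actual formatting code): `%f` pads a DOUBLE to six fractional digits, while a DECIMAL's stored scale dictates exactly how many fractional digits get printed.

```cpp
#include <cmath>
#include <cstdint>
#include <cstdio>

int main() {
    // DOUBLE-to-string: "%f" always pads to six fractional digits.
    std::printf("%f\n", 1.0);  // prints "1.000000"

    // DECIMAL-to-string: the type's scale dictates the digit count.
    // A hypothetical DECIMAL(2, 1) storing the unscaled integer 10:
    int64_t unscaled = 10;
    int scale = 1;
    auto divisor = static_cast<int64_t>(std::pow(10, scale));
    std::printf("%lld.%0*lld\n", (long long)(unscaled / divisor), scale,
                (long long)(unscaled % divisor));  // prints "1.0"
    return 0;
}
```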
Uses `inferMinimalType` from the csv-sniffing branch to determine whether the string should be interpreted as a DECIMAL or a DOUBLE.
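As a rough sketch of what such minimal-type inference could look like (the real `inferMinimalType` lives on the csv-sniffing branch; the signature, the 38-digit precision cap, and the exponent handling below are assumptions for illustration only):

```cpp
#include <cctype>
#include <optional>
#include <string>

// Hypothetical result: precision/scale for a DECIMAL; nullopt means the
// literal does not fit and the caller should fall back to DOUBLE.
struct DecimalType {
    int precision;  // total significant digits
    int scale;      // digits after the decimal point
};

std::optional<DecimalType> inferMinimalType(const std::string& literal) {
    constexpr int kMaxPrecision = 38;  // common DECIMAL limit; assumed here
    if (literal.empty()) return std::nullopt;
    int digits = 0, scale = 0;
    bool seenPoint = false;
    size_t i = (literal[0] == '+' || literal[0] == '-') ? 1 : 0;
    for (; i < literal.size(); ++i) {
        char c = literal[i];
        if (c == '.') {
            if (seenPoint) return std::nullopt;  // malformed literal
            seenPoint = true;
        } else if (std::isdigit(static_cast<unsigned char>(c))) {
            ++digits;
            if (seenPoint) ++scale;
        } else {
            // Exponents ('e'/'E') or other characters: not representable
            // as a plain DECIMAL in this sketch, so fall back to DOUBLE.
            return std::nullopt;
        }
    }
    if (digits == 0 || digits > kMaxPrecision) return std::nullopt;
    return DecimalType{digits, scale};
}
```

Under these assumptions, a literal like `1.0` would be typed as DECIMAL(2, 1), while `1e10` or an overly long literal would keep the old DOUBLE behavior.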
Contributor agreement
- [x] I have read and agree to the Contributor Agreement.