Precision not respected for decimals in struct
Hi! I am using the databricks-sql-go library in my application and seeing some odd behaviour when querying decimal columns. I use CAST(... AS DECIMAL(5, 2)) to get my desired precision, and it works for top-level DECIMAL columns, but when I apply the same cast to values inside structs, the precision is not applied.
For example, with this SQL:
```sql
SELECT
  '1' AS id,
  19.99 AS test_decimal,
  named_struct('col1', 'Field1', 'col2', 19.99) AS arr,
  CAST(19.99 AS DECIMAL(5, 2)) AS test_decimal2,
  named_struct('col1', 'Field1', 'col2', CAST(19.99 AS DECIMAL(5, 2))) AS arr2;
```
Here's a look at the SQL row in debug mode. Note that the decimal value in both structs is 19.990000000000002.
With this SQL I am querying the decimal value 19.99 in several different ways. I expected CAST(19.99 AS DECIMAL(5, 2)) to be respected, but instead I still get 19.990000000000002 for the struct fields.
Is there a better way to do this? Is this an issue with the driver, and are there any plans to fix it on the driver side?
Interesting. I was able to reproduce this with the following code:
```go
package main

import (
	"context"
	"database/sql"
	"fmt"

	_ "github.com/databricks/databricks-sql-go"
)

func main() {
	db, err := sql.Open("databricks", "token:*****@*****:443/sql/1.0/warehouses/*****")
	if err != nil {
		panic(err)
	}
	defer db.Close()

	row := db.QueryRowContext(context.Background(), `SELECT '1' AS id,
		19.99 as test_decimal,
		named_struct('col1', 'Field1', 'col2', 19.99) as arr,
		CAST(19.99 AS DECIMAL(5, 2)) as test_decimal2,
		named_struct('col1', 'Field1', 'col2', CAST(19.99 AS DECIMAL(5, 2))) as arr2;`)

	var id string
	var testDecimal float64
	var arr any
	var testDecimal2 float64
	var arr2 any

	err = row.Scan(&id, &testDecimal, &arr, &testDecimal2, &arr2)
	if err != nil {
		panic(err)
	}

	// Set a breakpoint on the next line to inspect variables in your debugger.
	fmt.Println(id, testDecimal, arr, testDecimal2, arr2)
}
```