databricks-sql-go
Fix type inference for int64/uint64 (BIGINT) and float64 (DOUBLE)
Summary
Fixes two type inference bugs for numeric parameters:
- `int64`/`uint64`: were incorrectly mapped to `SqlInteger` instead of `SqlBigInt` (fixes #250)
- `float64`: was incorrectly mapped to `SqlFloat` instead of `SqlDouble` (fixes #314)
Problems Fixed
1. int64/uint64 → BIGINT
When inserting int64/uint64 values into BIGINT columns, the driver was sending them with type INTEGER instead of BIGINT, causing the server to reject large values with error:
[INVALID_PARAMETER_MARKER_VALUE.INVALID_VALUE_FOR_DATA_TYPE] An invalid parameter mapping was provided:
the value '1311768467463790320' for parameter 'null' cannot be cast to INT because it is malformed.
Additionally, int64 was formatted with `strconv.Itoa(int(value))`, which truncates values outside the int32 range on platforms where `int` is 32 bits wide.
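The truncation can be reproduced in isolation. This is an illustrative sketch, not the driver's code; it uses an explicit `int32` cast to simulate a platform where `int` is 32 bits:

```go
package main

import (
	"fmt"
	"strconv"
)

// buggyFormat mirrors the old behavior: converting through int truncates
// int64 values on platforms where int is 32 bits (simulated via int32 here).
func buggyFormat(v int64) string {
	return strconv.Itoa(int(int32(v)))
}

// fixedFormat mirrors the fix: strconv.FormatInt handles the full int64 range.
func fixedFormat(v int64) string {
	return strconv.FormatInt(v, 10)
}

func main() {
	v := int64(1311768467463790320) // the value from the error message above
	fmt.Println(buggyFormat(v))     // mangled: only the low 32 bits survive
	fmt.Println(fixedFormat(v))     // "1311768467463790320"
}
```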
2. float64 → DOUBLE
When inserting float64 values into DOUBLE columns, the driver was sending them with type FLOAT (32-bit) instead of DOUBLE (64-bit), causing:
- Precision loss for high-precision float64 values
- Potential overflow for values beyond float32 range (~3.4e38)
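Both failure modes can be demonstrated by narrowing a float64 to float32 and widening it back, which is effectively what sending the parameter as a 32-bit FLOAT does (a sketch, not the driver's code):

```go
package main

import "fmt"

// asFloat32 simulates sending a float64 as a 32-bit FLOAT parameter:
// the value is narrowed to float32 and then widened back to float64.
func asFloat32(v float64) float64 {
	return float64(float32(v))
}

func main() {
	precise := 3.141592653589793
	fmt.Println(precise == asFloat32(precise)) // false: low-order digits are lost

	// Values beyond the float32 range (~3.4e38) overflow to +Inf.
	fmt.Println(asFloat32(1e200))
}
```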
3. Panic with explicit Parameter type
When using `Parameter{Type: SqlBigInt, Value: int64(...)}` with a non-string value, the driver panicked in `convertNamedValuesToSparkParams` due to an unsafe type assertion.
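The shape of the fix is the comma-ok assertion form. The helper below is a hypothetical sketch (`renderValue` is not a function in the driver): `value.(string)` panics for non-string values, while the comma-ok form allows a formatting fallback:

```go
package main

import "fmt"

// renderValue is a hypothetical sketch of the safe-assertion pattern:
// the bare assertion value.(string) would panic on non-string input,
// but the comma-ok form lets us fall back to generic formatting.
func renderValue(value any) string {
	if s, ok := value.(string); ok {
		return s
	}
	// Fallback for non-string values (e.g. an int64 passed with SqlBigInt).
	return fmt.Sprintf("%v", value)
}

func main() {
	fmt.Println(renderValue("hello"))        // hello
	fmt.Println(renderValue(int64(1 << 40))) // 1099511627776
}
```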
Changes
- `parameters.go`:
  - `int64` now uses `strconv.FormatInt()` and maps to `SqlBigInt`
  - `uint64` now maps to `SqlBigInt`
  - `float64` now maps to `SqlDouble` instead of `SqlFloat`
  - Added a safe type assertion with a fallback in `convertNamedValuesToSparkParams`
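The corrected mapping can be summarized in a small type switch. This is an illustrative sketch, not the driver's implementation; `sqlType` and the constants stand in for the driver's identifiers of the same names:

```go
package main

import "fmt"

// sqlType stands in for the driver's SQL type enum; this mapping is an
// illustrative sketch of the fix, not the driver's actual code.
type sqlType string

const (
	SqlInteger sqlType = "INTEGER"
	SqlBigInt  sqlType = "BIGINT"
	SqlFloat   sqlType = "FLOAT"
	SqlDouble  sqlType = "DOUBLE"
)

// inferType shows the corrected inference: 64-bit numeric Go types
// map to the 64-bit SQL types.
func inferType(v any) sqlType {
	switch v.(type) {
	case int, int32:
		return SqlInteger
	case int64, uint64:
		return SqlBigInt // was SqlInteger before the fix
	case float32:
		return SqlFloat
	case float64:
		return SqlDouble // was SqlFloat before the fix
	default:
		return ""
	}
}

func main() {
	fmt.Println(inferType(int64(1)), inferType(uint64(1)), inferType(1.5))
	// BIGINT BIGINT DOUBLE
}
```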
Test plan
- [x] Added unit tests for int64/uint64 type inference (`TestParameter_BigInt`)
- [x] Added unit tests for float64/float32 type inference (`TestParameter_Float`)
- [x] Verified large int64 values are correctly inserted and retrieved from BIGINT columns
- [x] Verified float64 values with high precision are correctly inserted and retrieved
- [x] All existing parameter tests pass
🤖 Generated with Claude Code