Numeric value out of range (0) with Decimal types in TVPs
Environment
- Python: 3.8.10 (64-bit)
- pyodbc: 4.0.32
- OS: Windows 10 Enterprise
- DB: Microsoft SQL Server 2014
- driver: ODBC Driver 17 for SQL Server (version 2017.174.01.01)
Note: I was able to reproduce this with the same code in the following environment as well:
- Python: 3.8.12
- pyodbc: 4.0.32
- OS: Red Hat 9.3.1-2
- DB: Microsoft SQL Server 2014
- driver: ODBC Driver 17 for SQL Server
Issue
This looks similar to https://github.com/mkleehammer/pyodbc/issues/596 in that pyodbc seems to deduce the type of the TVP from the first row, but it also appears to do this with the precision of Decimal values. That is to say, if the first row uses Decimal('4.000'), then Decimal('25.000') in the second row (in the same column) is "too big."
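For illustration only (this is not pyodbc's code), the two values differ in the number of digits before the decimal point, which is presumably why a layout deduced from the first row cannot hold the second:

from decimal import Decimal

# Both values have 3 fractional digits, but '25.000' needs one more integer digit,
# so a DECIMAL layout sized for '4.000' is too narrow for '25.000'.
print(Decimal('4.000').as_tuple())   # DecimalTuple(sign=0, digits=(4, 0, 0, 0), exponent=-3)
print(Decimal('25.000').as_tuple())  # DecimalTuple(sign=0, digits=(2, 5, 0, 0, 0), exponent=-3)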
Reproduction
from decimal import Decimal
import pyodbc
CONN_STRING = "DRIVER={ODBC Driver 17 for SQL Server};DATABASE=somedatabase;Server=someserver;UID=someuser;PWD=somepassword"
conn = pyodbc.connect(CONN_STRING)
cursor = conn.cursor()
# cursor.execute('CREATE TYPE dbo.DecimalTestingType AS TABLE(val DECIMAL(20,4))')
# cursor.execute('CREATE PROCEDURE dbo.DecimalTestProcedure @Tvp dbo.DecimalTestingType READONLY AS SELECT * FROM @Tvp')
data = [
    [Decimal('4.0000')],
    [Decimal('25.0000')],
]
"""
# Note: Using the following instead succeeds
data = [
    [Decimal('25.0000')],
    [Decimal('4.0000')],
]
"""
# Throws "pyodbc.DataError: ('22003', '[22003] [Microsoft][ODBC Driver 17 for SQL Server]Numeric value out of range (0) (SQLParamData)')"
cursor.execute('EXECUTE dbo.DecimalTestProcedure ?', [data])
results = cursor.fetchall()
print(results)
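For what it's worth, a client-side workaround suggested by the note above is to reorder the rows so the value with the most integer digits is bound first. This is only a sketch, and only reliable when every value has the same number of decimal places, as in this repro:

# Hypothetical workaround sketch, not an official fix: put the row whose Decimal
# has the largest integer part first so the layout deduced from row 1 fits all rows.
data = [
    [Decimal('4.0000')],
    [Decimal('25.0000')],
]
data.sort(key=lambda row: row[0].adjusted(), reverse=True)  # Decimal.adjusted() = position of the most significant digit
cursor.execute('EXECUTE dbo.DecimalTestProcedure ?', [data])
print(cursor.fetchall())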
Thanks for the issue report and the repro code - we'll take a look
Is there any solution for this issue?
It looks like it chooses the length for SQLBindParameter based on the first item; I'm trying to figure out a fix.
Given the way it's designed, it isn't obvious how to fix this without breaking something else. There is some common code that detects type and width, but it's used by everything, not just TVPs, so I'm not sure how to fix it in a way that doesn't affect normal functionality.
Is there any solution for this yet? Thanks!
Any update on this? I'm trying to write a pandas dataframe to a sql database and am getting the same error.
The right way would probably be to rewrite the array parameter code to scan the parameters before performing the actual bind. It could make a best guess on the first row, then keep updating until it finds a row that needs different binding. (Which should be unlikely.)
Not only would it fix this issue, it would allow us to determine the correct type for None parameters.
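As a rough illustration of that idea (a Python sketch rather than pyodbc's actual C++ binding code; widest_decimal_layout is a hypothetical helper), scanning the whole column before binding would let the driver pick a DECIMAL precision/scale wide enough for every row:

from decimal import Decimal

def widest_decimal_layout(rows, col):
    # Hypothetical helper: return the (precision, scale) needed to hold every
    # Decimal in the given column, instead of sizing the bind from row 1 only.
    max_int_digits, max_scale = 1, 0
    for row in rows:
        value = row[col]
        if value is None:
            continue
        sign, digits, exponent = value.as_tuple()
        scale = -exponent if exponent < 0 else 0
        max_int_digits = max(max_int_digits, len(digits) - scale)
        max_scale = max(max_scale, scale)
    return max_int_digits + max_scale, max_scale

rows = [(Decimal('4.0000'),), (Decimal('25.0000'),)]
print(widest_decimal_layout(rows, 0))  # (6, 4): DECIMAL(6, 4) fits both 4.0000 and 25.0000

The real fix would live in pyodbc's parameter-binding internals, but the scan itself would follow this shape.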
There is no fix for this yet. I'll look at this after 5.0 is released, which should not be long.
I am also experiencing this issue. Do you have any updates on when a solution will be implemented?
Any update on this?
Make sure the data is loaded as integers, not as floats or other types; you can check this in debug mode. To fix it, try converting the data to integers with df = df.applymap(lambda x: int(x) if isinstance(x, float) else x), which converts float values to int.
Workaround for SQL Server 2016+:
With the user-defined table type and the stored procedure defined in the original post, this code fails:
tvp_data = [(Decimal("4.0000"),), (Decimal("25.0000"),)]
sql = "EXECUTE dbo.DecimalTestProcedure ?"
rows = crsr.execute(sql, [tvp_data]).fetchall()
# pyodbc.DataError: ('22003', '[22003] [Microsoft][ODBC Driver 17 for SQL Server]Numeric value out of range (0) (SQLParamData)')
but this code succeeds:
import json

tvp_data = [(Decimal("4.0000"),), (Decimal("25.0000"),)]
sql = """\
SET NOCOUNT ON;
DECLARE @tvp dbo.DecimalTestingType;
INSERT INTO @tvp (val)
SELECT val FROM OPENJSON(?)
WITH (
    val decimal(20, 4) '$.val'
);
EXEC dbo.DecimalTestProcedure @tvp
"""
tvp_json = [dict(zip(["val"], row)) for row in tvp_data]
rows = crsr.execute(sql, json.dumps(tvp_json, default=str)).fetchall()
print(rows)
# [(Decimal('4.0000'),), (Decimal('25.0000'),)]