Please make the initial allocation size in ReadVarColumn configurable
... as the comment suggests: https://github.com/mkleehammer/pyodbc/blob/a4b0b75dc88c910ed69561038bc21fe0ce7da00e/src/getdata.cpp#L95
Ideally a configured value of 0 (or -1) would mean: use columnSize * cbElement.
This would help to work around a bug in the Oracle BI ODBC driver's SQLGetData implementation.
The driver expects to receive cbAllocated instead of cbAvailable as the 5th parameter of the SQLGetData call. Even worse, it keeps returning SQL_SUCCESS_WITH_INFO with a single NUL byte, even when it has no more data, until the 5th parameter is at least as large as cbData.
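For context, here is a minimal sketch of a spec-conforming SQLGetData retrieval loop. This is not pyodbc's actual ReadVarColumn: buffer growth and error reporting are omitted and all names are illustrative. Per the ODBC documentation, the 5th parameter describes the space available at the pointer passed as the 4th parameter, which is why pyodbc passes cbAvailable there; the comments mark where the Oracle BI driver deviates.

```cpp
#include <sql.h>
#include <sqlext.h>
#include <vector>

// Simplified sketch of a spec-conforming SQLGetData loop -- not pyodbc's
// actual ReadVarColumn. Buffer growth and error reporting are omitted.
std::vector<char> ReadVarChar(SQLHSTMT hstmt, SQLUSMALLINT col)
{
    std::vector<char> result;
    std::vector<char> chunk(4096);   // fixed-size chunk; pyodbc grows its buffer instead
    SQLLEN cbData = 0;               // bytes still to come, as reported by the driver

    for (;;)
    {
        // 5th parameter: the space available in *this* buffer (cbAvailable).
        // The buggy Oracle BI driver instead treats it as the total number of
        // bytes allocated so far (cbAllocated) and keeps answering
        // SQL_SUCCESS_WITH_INFO with a single NUL byte until that value is at
        // least cbData -- so a loop like this never terminates against it.
        SQLRETURN ret = SQLGetData(hstmt, col, SQL_C_CHAR,
                                   chunk.data(), (SQLLEN)chunk.size(), &cbData);
        if (!SQL_SUCCEEDED(ret))
            return result;           // real code would raise an error here
        if (cbData == SQL_NULL_DATA)
            return result;           // NULL value: nothing to copy (simplified)

        // Copy what the driver wrote, excluding the terminating NUL.
        SQLLEN copied = (cbData == SQL_NO_TOTAL || cbData >= (SQLLEN)chunk.size())
                            ? (SQLLEN)chunk.size() - 1
                            : cbData;
        result.insert(result.end(), chunk.begin(), chunk.begin() + copied);

        if (ret == SQL_SUCCESS)      // everything fit, we are done
            return result;
        // SQL_SUCCESS_WITH_INFO: data was truncated; a well-behaved driver
        // returns the remainder on the next call.
    }
}
```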
The result is that the buffer keeps growing until all RAM has been allocated, at which point pyodbc segfaults.
The only sane way to work around this without breaking compatibility is to ensure that SQL_SUCCESS_WITH_INFO is never returned, which means making the initial buffer as large as the maximum column size (see the sketch below).
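As a rough illustration of the request, here is how such a knob might be consulted when sizing the initial buffer. Nothing here exists in pyodbc today: `initial_alloc_setting` and `InitialAllocation` are hypothetical names, and a real implementation would live inside ReadVarColumn in src/getdata.cpp.

```cpp
#include <sql.h>
#include <sqlext.h>

// Hypothetical, user-configurable setting; 0 (or -1) would mean "size the
// initial buffer from the column metadata". This does not exist in pyodbc.
extern long initial_alloc_setting;

// Sketch of how ReadVarColumn-style code could pick its initial allocation.
static SQLLEN InitialAllocation(SQLHSTMT hstmt, SQLUSMALLINT col, SQLLEN cbElement)
{
    if (initial_alloc_setting > 0)
        return (SQLLEN)initial_alloc_setting;       // explicit size in bytes

    // 0 / -1: use columnSize * cbElement so the whole value fits into the
    // first SQLGetData call and the driver can return SQL_SUCCESS right away,
    // never entering the SQL_SUCCESS_WITH_INFO growth loop.
    SQLULEN columnSize = 0;
    SQLSMALLINT dataType = 0, decimalDigits = 0, nullable = 0;
    SQLRETURN ret = SQLDescribeCol(hstmt, col, nullptr, 0, nullptr,
                                   &dataType, &columnSize, &decimalDigits, &nullable);
    if (!SQL_SUCCEEDED(ret) || columnSize == 0)
        return 4096;                                // fall back to a fixed default

    return (SQLLEN)columnSize * cbElement + cbElement;   // + room for the terminator
}
```

Whether the setting lives on the module, the connection, or the cursor is an open design question; the point is only that a configured 0 (or -1) maps to columnSize * cbElement.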
I don't have an Oracle install to test against. Would a flag that caused pyodbc to pass cbAllocated actually fix the issue? That is, would the driver return SQL_SUCCESS if it thought it had been given a large enough buffer?
Also, is there a bug report open somewhere against the driver that we can link to and follow?