
ODBC SQL type -151 is not yet supported

Open lenovopast17 opened this issue 7 years ago • 17 comments

Hi,

I tried connecting using:

```python
for i, table in enumerate(db_name):
    mssql_metadata.append(pd.read_sql_query("SELECT * FROM " + table, cnxn))
    mssql_metadata[i].to_csv(path + table + ".csv")
```

but it shows this error:


```
---------------------------------------------------------------------------
ProgrammingError                          Traceback (most recent call last)
<ipython-input> in <module>()
      6 cursor = cnxn.cursor()
      7 for i, table in enumerate(db_name):
----> 8     mssql_metadata.append(pd.read_sql_query("SELECT * FROM " + table, cnxn))
      9     mssql_metadata[i].to_csv(path + table + ".csv")

~/anaconda3/lib/python3.6/site-packages/pandas/io/sql.py in read_sql_query(sql, con, index_col, coerce_float, params, parse_dates, chunksize)
    330     return pandas_sql.read_query(
    331         sql, index_col=index_col, params=params, coerce_float=coerce_float,
--> 332         parse_dates=parse_dates, chunksize=chunksize)
    333
    334

~/anaconda3/lib/python3.6/site-packages/pandas/io/sql.py in read_query(self, sql, index_col, coerce_float, params, parse_dates, chunksize)
   1451                                 parse_dates=parse_dates)
   1452         else:
-> 1453             data = self._fetchall_as_list(cursor)
   1454             cursor.close()
   1455

~/anaconda3/lib/python3.6/site-packages/pandas/io/sql.py in _fetchall_as_list(self, cur)
   1460
   1461     def _fetchall_as_list(self, cur):
-> 1462         result = cur.fetchall()
   1463         if not isinstance(result, list):
   1464             result = list(result)

ProgrammingError: ('ODBC SQL type -151 is not yet supported. column-index=18 type=-151', 'HY106')
```

lenovopast17 avatar Jun 07 '18 09:06 lenovopast17

-151 is SQL_SS_UDT, is that the column type of your table?

Also, please provide more information about your environment - client OS, client driver, server, etc.

v-chojas avatar Jun 07 '18 14:06 v-chojas

I see -155 when the column type is DATETIMEOFFSET. The non-tz version DATETIME (93) works, but is (for me) horribly wrong to use without TZ.

Microsoft asserts that DATETIME has utility in a multi-country installation:

To reflect an abstract date and time that is not tied to a specific time and place (for example, most stores in an international chain open on weekdays at 9:00 A.M.).

I can't see this as anything other than a one-off justification for a DBA's myopic architecture. (Frankly, I don't know of any chain where every store in the chain opens at exactly the same time every day. If there are examples where this is true, though, I suggest that even then it's pigeonholing and a horrible default.)

For me: win10, "ODBC Driver 13 for SQL Server", python-3.6.5, pyodbc-4.0.23, SQL Server provided by docker's mssql-server-linux:latest, currently showing:

1> select @@VERSION
2> GO
Microsoft SQL Server 2017 (RTM-CU9-GDR) (KB4293805) - 14.0.3035.2 (X64) 
	Jul  6 2018 18:24:36 
	Copyright (C) 2017 Microsoft Corporation
	Developer Edition (64-bit) on Linux (Ubuntu 16.04.5 LTS)

r2evans avatar Aug 16 '18 19:08 r2evans

Unrelated to solving the issue: I exclusively use datetimes without time zones in almost all of my applications, because we use UTC timestamps and I have to store the offset elsewhere anyway (such as the location of an office or a device we're tracking). Browsers can always display the values in the user's time zone. If I need the original zone, I end up doing a ton of work that database time zones never help with anyway, since they don't carry the TZ rules that change over time (when DST starts, etc.).

As for reading these in, we could certainly read them into naive datetimes in UTC. Would that be helpful?

There is no standard tzinfo implementation, but we could provide one; it would only carry the hour offset, though. I doubt that would work for most people: if they are using tzinfo's in their applications, they probably have a specific library they want to use. We'd then need a way to configure a hook / factory for them.

You can use Connection.add_output_converter to convert them yourself, but that might not be easy.
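For anyone who wants to try the `add_output_converter` route for DATETIMEOFFSET in the meantime, here is a minimal sketch. The struct layout is an assumption based on the driver's `SQL_SS_TIMESTAMPOFFSET_STRUCT` (little-endian int16 fields plus a uint32 nanosecond fraction), and `handle_datetimeoffset` is a hypothetical name:

```python
import struct
import datetime

def handle_datetimeoffset(dto_value):
    # Assumed layout (SQL_SS_TIMESTAMPOFFSET_STRUCT, little-endian):
    # year, month, day, hour, minute, second as int16,
    # fraction as uint32 nanoseconds, then tz hour/minute offsets as int16.
    tup = struct.unpack("<6hI2h", dto_value)
    tz = datetime.timezone(datetime.timedelta(hours=tup[7], minutes=tup[8]))
    return datetime.datetime(tup[0], tup[1], tup[2], tup[3], tup[4], tup[5],
                             tup[6] // 1000,  # nanoseconds -> microseconds
                             tz)

# Registration would then look like:
# cnxn.add_output_converter(-155, handle_datetimeoffset)
```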

Any suggestions for what datatype to receive?

mkleehammer avatar Aug 19 '18 19:08 mkleehammer

Unfortunately, time has shown that the programmers (in a company not under my control) responsible for the (disparate) data streams are inconsistent with their handling of timezones. The only thing I can trust is that their implementation is spur-of-the-moment and their testing insufficient. I understand the temptation to always go with UTC, but it ain't gonna happen in my case. I believe postgres (for instance) always stores the TIMESTAMPTZ in UTC, which is fine, but I feel the act of requiring them to include the zone as they "know it" at the time is the best way to improve data reliability.

So there are actually two issues with this:

  1. Accepting data of this type in a query's return value. I don't know in what struct ODBC returns -151 or -155, so ...
  2. Uploading/inserting data into a column of this type. Should be easier, since all DBMS's I have dealt with accept string representations (with no apparent loss of precision AFAICT).
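For point 2, a quick sketch of the string form that works for inserting: Python's `isoformat` already produces the `YYYY-MM-DD hh:mm:ss±hh:mm` literal that DATETIMEOFFSET columns accept. The table and column names in the commented insert are purely illustrative:

```python
import datetime

# Build an offset-aware datetime and render it as a string literal.
tz = datetime.timezone(datetime.timedelta(hours=-5))
dt = datetime.datetime(2018, 8, 20, 9, 0, 0, tzinfo=tz)
literal = dt.isoformat(sep=" ")
print(literal)  # 2018-08-20 09:00:00-05:00

# Hypothetical parameterized insert ('events' / 'ts' are made-up names):
# cursor.execute("INSERT INTO events (ts) VALUES (?)", literal)
```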

r2evans avatar Aug 20 '18 00:08 r2evans

If you want to discuss regarding -155 (SQL_SS_TIMESTAMPOFFSET), the correct issue is https://github.com/mkleehammer/pyodbc/issues/134 . This one is -151 (SQL_SS_UDT).

v-chojas avatar Aug 23 '18 16:08 v-chojas

Oh crap, I must have mis-linked and started typing without verifying, sorry! Should I re-type my comments there?

r2evans avatar Aug 23 '18 22:08 r2evans

For people like me who are looking to just make the damn query work so one wayward column isn't holding the whole thing hostage, here's a fix:

```python
def HandleHierarchyId(v):
    return str(v)

conn = pyodbc.connect(connection_string)
conn.add_output_converter(-151, HandleHierarchyId)
```

From https://stackoverflow.com/a/58565621/4355695, due credits to user https://stackoverflow.com/users/2842348/adrien who posted it there.
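One caveat with the converter above: the value arrives as raw bytes, so `str(v)` produces Python's `b'...'` repr rather than readable text. If that matters, a hedged alternative (the function name is made up) is to emit a hex string instead:

```python
def handle_hierarchyid_hex(v):
    # v is the raw bytes of the UDT column; str(v) would give "b'\\x58...'"
    # style output, while .hex() gives a plain hex string.
    return v.hex() if v is not None else None

print(handle_hierarchyid_hex(b"\x58"))  # 58
```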

answerquest avatar Mar 25 '20 12:03 answerquest

Thanks for providing this! Very annoying when you don't even want the offending column in your output....

stevenhurwitt avatar Aug 11 '20 16:08 stevenhurwitt

-151 can also be a geography type field, i.e. spatial. I had to just select around it in the case I'm working with.

ghost avatar Jan 07 '22 05:01 ghost

> -151 is a geography type field, i.e. spatial. I had to just select around the case I'm working with.

I'm also faced with this issue while working with geography types. Converting the geography to WKT in the query solved the problem:

```sql
SELECT column_name.STAsText() FROM table_name
```

garikdza avatar Nov 16 '22 12:11 garikdza

@garikdza in order to use `STAsText()`, is it necessary to set something up first? When trying that, I'm getting: [FreeTDS][SQL Server]Cannot find either column "column_name" or the user-defined function or aggregate "column_name. STAsText", or the name is ambiguous.

ghost avatar Nov 18 '22 20:11 ghost

@nasked11 you have to replace the word "column_name" with your database table's column name which is of the geometry / geography type.

answerquest avatar Nov 19 '22 17:11 answerquest

Hi all, this is what I have done:

```python
from binascii import hexlify

def _handle_geometry(geometry_value):
    return f"0x{hexlify(geometry_value).decode().upper()}"
```

and then on the connection:

```python
conn.add_output_converter(-151, _handle_geometry)
```

Benks-1 avatar Feb 10 '23 14:02 Benks-1