snowflake-connector-python
SNOW-715862: fetch_pandas_all for large timestamps raises an out-of-bounds timestamp error
Please answer these questions before submitting your issue. Thanks!
- What version of Python are you using? Python 3.9
- What operating system and processor architecture are you using? Ubuntu
- What are the component versions in the environment (`pip freeze`)?
  snowflake-connector-python==2.8.3
- What did you do?

  ```python
  cur.execute("select TO_TIMESTAMP('9999-01-01T00:00:59Z') as d1")
  df = cur.fetch_pandas_all()
  ```

  This throws the following exception:

  ```
  File "/usr/local/lib/python3.9/site-packages/pyarrow/pandas_compat.py", line 1153, in _table_to_blocks
    result = pa.lib.table_to_blocks(options, block_table, categories,
  File "pyarrow/table.pxi", line 2602, in pyarrow.lib.table_to_blocks
  File "pyarrow/error.pxi", line 100, in pyarrow.lib.check_status
  pyarrow.lib.ArrowInvalid: Casting from timestamp[us] to timestamp[ns] would result in out of bounds timestamp: 253402300799000000
  ```
- What did you expect to see? fetch_pandas_all() should allow all valid timestamps to be loaded; for timestamps that are out of bounds for pandas, provide a way to return NaT or the maximum Timestamp value instead of raising.
- Can you set logging to DEBUG and collect the logs?

  ```python
  import logging

  for logger_name in ('snowflake.connector',):
      logger = logging.getLogger(logger_name)
      logger.setLevel(logging.DEBUG)
      ch = logging.StreamHandler()
      ch.setLevel(logging.DEBUG)
      ch.setFormatter(logging.Formatter('%(asctime)s - %(threadName)s %(filename)s:%(lineno)d - %(funcName)s() - %(levelname)s - %(message)s'))
      logger.addHandler(ch)
  ```