Results: 367 comments of QP Hou

@chrisfw I am able to query the file you uploaded as well. Do I need to re-export it in duckdb using their default compression to reproduce the problem?

@chrisfw could you confirm whether the `files_snappy.zip` file you uploaded previously was exported from duckdb or through odbc2parquet? I don't have odbc2parquet set up locally, so I was using your uploaded...

Thanks @robdrysdale for the bug report. I have been busy working on relation support in datafusion lately; will dig into it this weekend.

@robdrysdale could you show me the code you used to generate the parquet file from the pandas dataframe? The Date column is indeed saved as `us` in parquet, but when I...
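
For reference, a generation path like the following would produce a ns-precision column on the pandas side (a minimal sketch; the file and column names are placeholders, and pyarrow is assumed as the parquet engine):

```python
import pandas as pd

# Hypothetical reproduction: pandas datetimes are ns-precision in memory,
# so the writer has to decide what unit to store in the parquet file.
df = pd.DataFrame({"Date": pd.to_datetime(["2021-01-01", "2021-01-02"])})
df.to_parquet("dates.parquet")  # assumes the pyarrow engine
```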

OK, I double-checked the schema in the parquet file; it is stored as `us`. So either arrow or datafusion is reading the column with the incorrect type. I...
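
For anyone following along, this is the kind of check I mean, sketched with pyarrow (`dates.parquet` is the placeholder file from above). The parquet-level schema and the arrow schema reconstructed from the embedded metadata can disagree:

```python
import pyarrow.parquet as pq

pf = pq.ParquetFile("dates.parquet")
print(pf.schema)        # parquet-level schema: should show the `us` unit
print(pf.schema_arrow)  # arrow schema rebuilt from metadata: may say `ns`
```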

I can confirm this is due to the column type set in the parquet metadata, more specifically the value of the `ARROW:schema` key. In the arrow parquet reader, the schema provided through the metadata key value...
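
To see the override concretely: the `ARROW:schema` value is a base64-encoded arrow IPC schema message stored in the file's key-value metadata, and readers that honor it will restore the original arrow type rather than the parquet-level one. A pyarrow sketch to decode it (the file name is a placeholder):

```python
import base64

import pyarrow as pa
import pyarrow.parquet as pq

# Key-value metadata stored in the parquet footer.
meta = pq.read_metadata("dates.parquet").metadata
ipc_bytes = base64.b64decode(meta[b"ARROW:schema"])

# Deserialize the embedded arrow schema; this is where the original
# `ns` type comes back from, regardless of the parquet column type.
print(pa.ipc.read_schema(pa.BufferReader(ipc_bytes)))
```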

Ha, that makes sense: nanosecond timestamps were only added in parquet 2.0. Anyway, I will follow up with an upstream issue and link it back here.
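
For context, pyarrow gates this on the target format version when writing. A sketch of the difference, assuming `pq.write_table`'s `version` flag (flag values and the coercion behavior are per pyarrow's documentation, not this thread):

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Whole-second values so coercing ns -> us loses nothing.
ts = pa.array([1_600_000_000_000_000_000], type=pa.timestamp("ns"))
table = pa.table({"Date": ts})

# Older format versions have no ns unit, so pyarrow coerces ns to us...
pq.write_table(table, "v1.parquet", version="1.0")
# ...while newer versions support ns and keep the column as-is.
pq.write_table(table, "v26.parquet", version="2.6")

print(pq.ParquetFile("v1.parquet").schema)   # microsecond timestamp
print(pq.ParquetFile("v26.parquet").schema)  # nanosecond timestamp
```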

Yeah, for now you will need to manually cast from string to timestamp to make the comparison work. We could add automatic type casting in datafusion, or maybe it's already...
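
A sketch of the manual cast, assuming the DataFusion Python bindings (`datafusion` on PyPI; the table and file names are placeholders):

```python
from datafusion import SessionContext

ctx = SessionContext()
ctx.register_parquet("events", "dates.parquet")  # placeholder names

# Cast the string literal explicitly so both sides of the comparison
# share the same timestamp type; "Date" is quoted to preserve its case.
df = ctx.sql("""
    SELECT * FROM events
    WHERE "Date" >= CAST('2021-01-01' AS TIMESTAMP)
""")
print(df.collect())
```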

That sounds good. I also used the s3 storage backend in delta-rs as the inspiration for roapi's s3 io module.

Streaming updates are definitely something I've wanted to add to roapi as well :) @jbremmer, how are you deploying roapi with the corresponding files today?