Data Types Issue
An issue occurs when creating a Hyper file from a Spark DataFrame that contains decimal and timestamp columns. The intermediate Parquet file produces the following data type errors:
Column 'xxx' in Parquet file '/tmp/hyperleaup/hyper_files/testtesttest.snappy.parquet' cannot be scanned by Hyper due to its data type.: The Parquet data type (physical type 'FIXED_LEN_BYTE_ARRAY(11)', logical type '{"DECIMAL": {"scale": 2, "precision": 25}}', converted type 'DECIMAL') of the column is either not supported by Hyper or it is inconsistent and the file is possibly corrupt.
Type 'timestamp' is incompatible to type 'timestamptz' of column 'xxxx' in Parquet file '/tmp/hyperleaup/hyper_files/testtesttest.snappy.parquet'.
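A possible client-side workaround is to coerce the problematic columns in the Spark DataFrame before handing it to hyperleaup. The sketch below is only illustrative and not part of hyperleaup: the helper name `coerce_for_hyper` and the `decimal_cols` / `timestamp_cols` parameters are assumptions, and the casts change the stored types.

```python
# Hypothetical pre-processing sketch: cast the offending columns to types
# that Hyper can scan before the DataFrame is written to Parquet.
from pyspark.sql import functions as F
from pyspark.sql.types import DecimalType


def coerce_for_hyper(df, decimal_cols, timestamp_cols):
    """Return a copy of `df` with decimal precision reduced and timestamps
    rendered as strings, avoiding the two errors above."""
    for c in decimal_cols:
        # DECIMAL(25, 2) is written to Parquet as FIXED_LEN_BYTE_ARRAY, which
        # Hyper rejects; precision <= 18 fits an INT64-backed decimal instead.
        df = df.withColumn(c, F.col(c).cast(DecimalType(18, 2)))
    for c in timestamp_cols:
        # Formatting as text sidesteps the timestamp vs. timestamptz mismatch,
        # at the cost of storing the column as a string in the Hyper file.
        df = df.withColumn(c, F.date_format(F.col(c), "yyyy-MM-dd HH:mm:ss"))
    return df
```

Note that casting DECIMAL(25, 2) down to DECIMAL(18, 2) overflows to null for values that do not fit the narrower precision, so this only helps when the data is known to fit.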
Thanks for reporting this issue @wright-h. PR #34 will address these data type issues.