read_parquet super slow
My computer has 64 GB of RAM and the Parquet files are not large at all. However, they take a surprisingly long time to read, and I don't know what the cause could be. If you need more info, please feel free to let me know.

Hi @PursuitOfDataScience. Could you give us a little more information on how you are reading the Parquet files? Are you reading them through R, PyArrow, or C++? Maybe you can share a snippet of the read. Thanks!
Just using arrow::read_parquet() in R, nothing fancy. Is there anything else I need to provide?
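Roughly, the read looks like this (the file path here is just a placeholder, and the system.time() wrapper only illustrates how I'm measuring it):

```r
library(arrow)

# Time a plain read of one of the affected files
system.time(
  df <- read_parquet("data/example.parquet")
)
```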
Thanks! What version of Arrow are you using?
arrow_8.0.0
How were the files produced? Which compression and encoding do they use? How many columns are there?
They were SAS datasets. I used haven::read_sas() to load them into memory and then arrow::write_parquet() to write them to disk for future use.
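For reference, the conversion was along these lines (file names are placeholders):

```r
library(haven)
library(arrow)

# Load the SAS dataset into memory, then persist it as Parquet
df <- read_sas("data/example.sas7bdat")
write_parquet(df, "data/example.parquet")
```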
Thanks! Can you answer the other questions as well: which compression and encoding do the files use, and how many columns are there?
I'm not sure about the compression and encoding, but there are 80 columns or so.
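In case it helps, here's one way I could check the column count and schema from R (the path is a placeholder). I believe write_parquet() defaults to snappy compression when support is built in, but I haven't confirmed that for these files:

```r
library(arrow)

# Read into an Arrow Table (not a data.frame) to inspect structure
tbl <- read_parquet("data/example.parquet", as_data_frame = FALSE)

ncol(tbl)    # number of columns
tbl$schema   # column names and Arrow types
```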