
[QUESTION] How to properly encode CSV files for easy readability (for BigQuery external tables)?

Open rhaycock opened this issue 3 years ago • 0 comments

I am trying to write files in CSV format to a GCS bucket. With the encoding set to Base64, the CSV files are written as plaintext Base64, and when I create an external table in BigQuery over these files, the data also appears as Base64 in the query results.

When I set the encoding to "none", querying the external table instead fails with an ASCII error (Bad character (ASCII 0) encountered.).
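A small sketch of why both behaviors occur (the byte string below is hand-made to illustrate a binary-serialized value, not real Avro output): with encoding "none" the raw bytes, including NUL, land in the CSV file, which BigQuery's CSV reader rejects; with "base64" the bytes become ASCII-safe, but the CSV cell then contains the Base64 text, which is exactly what shows up in query results.

```python
import base64

# Illustrative binary-serialized record value containing a NUL byte,
# as a converter like Avro's would typically produce.
binary_value = b"\x00\x04\x1arow,data"

# Encoding "none": the NUL byte goes straight into the CSV file,
# triggering BigQuery's "Bad character (ASCII 0)" error.
assert b"\x00" in binary_value

# Encoding "base64": the bytes survive as ASCII, but the CSV cell now
# holds Base64 text rather than the readable value.
encoded = base64.b64encode(binary_value).decode("ascii")
assert "\x00" not in encoded

# The original bytes are recoverable, but only by decoding outside BigQuery.
assert base64.b64decode(encoded) == binary_value
```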

What is the correct way to go about this? I thought it would work if I could use the Avro converter for the key.converter property, but that does not seem to be possible.
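For reference, one configuration sketch sometimes used to get human-readable CSV from this connector: convert the record value to a plain string before it reaches the CSV writer, so the "none" encoding never sees binary bytes. The property names are from the Aiven GCS sink connector's documented options; the topic and bucket names are placeholders, and this is an unverified sketch, not a confirmed fix for this setup.

```properties
connector.class=io.aiven.kafka.connect.gcs.GcsSinkConnector
tasks.max=1
# Placeholder topic and bucket names
topics=my-topic
gcs.bucket.name=my-bucket
format.output.type=csv
format.output.fields=key,value
# "none" works only if the value is already text
format.output.fields.value.encoding=none
# StringConverter keeps key and value as text, avoiding NUL bytes in the CSV
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
```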

rhaycock avatar Jun 17 '22 23:06 rhaycock