gcs-connector-for-apache-kafka
[QUESTION] How to properly encode CSV files for easy readability (for BigQuery external tables)?
I am trying to create files in CSV format for storage in a GCS bucket. With the encoding set to Base64, the CSV files contain the Base64-encoded values, and when I create an external table in BigQuery over those files, the data also comes back as Base64 in the query results.
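For context, this is roughly how I am defining the external table (the dataset, table, and bucket names here are placeholders, not my real ones):

```sql
-- Sketch of the external table definition over the connector's output files.
CREATE EXTERNAL TABLE my_dataset.kafka_events
OPTIONS (
  format = 'CSV',
  -- Wildcard matching the objects the sink connector writes.
  uris = ['gs://my-bucket/topics/my-topic/*']
);
```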
When I set the encoding to "none", querying the external table fails with an ASCII error (Bad character (ASCII 0) encountered.).
What is the correct way to go about this? I figured it would work if I could use the Avro converter for the key.converter property, but that does not appear to be supported.
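For reference, a minimal sketch of the relevant sink properties I am using (property names taken from the connector's README; topic and bucket values are placeholders):

```properties
connector.class=io.aiven.kafka.connect.gcs.GcsSinkConnector
topics=my-topic
gcs.bucket.name=my-bucket

# CSV output with key and value columns.
format.output.type=csv
format.output.fields=key,value

# Switching this between base64 and none is what produces
# the two behaviors described above.
format.output.fields.value.encoding=base64
```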