matanper
Solved it, thanks 🙏
When running on a sample it works (I did `create table temp as select * from source_table limit 1000` and it synced fine), but when running on the full data (15M...
this line cuts off at 3.7M rows while the table has 15M rows, so I suspect it's just an OOM problem (it doesn't have enough memory to load the whole parts?)...
I didn't find the problem; the CSVs look fine to me. However, I changed the export to use Parquet and it worked https://github.com/slingdata-io/sling-cli/pull/309/files
Now I get the error `pq: invalid input syntax for type timestamp with time zone: "2024-05-01 00:43:11.319289 +0000 UTC"`. I see that `datetime_format` in the target options only applies to file targets.
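For context, that value looks like Go's default `time.Time` string form, which appends both a numeric offset and a zone name (`+0000 UTC`); Postgres' `timestamptz` parser rejects the trailing zone name. A minimal sketch (Python just to illustrate, not part of sling) showing the value parses once the redundant suffix is stripped:

```python
from datetime import datetime

# Go's default time.Time formatting appends "+0000 UTC"; Postgres
# rejects a zone name after a numeric offset.
raw = "2024-05-01 00:43:11.319289 +0000 UTC"

# Stripping the redundant " UTC" suffix leaves a form that parses
# cleanly (illustration only -- the real fix belongs in the export
# or load step, e.g. formatting the timestamp before COPY).
cleaned = raw.removesuffix(" UTC")
parsed = datetime.strptime(cleaned, "%Y-%m-%d %H:%M:%S.%f %z")
print(parsed.isoformat())  # 2024-05-01T00:43:11.319289+00:00
```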
Redshift does export them as `timestamp[ns]` type (example value `2024-02-15 02:34:12.862000`)
About the CSV: I have a column with mixed text and numeric values, and I had to manually set its type to string. However, it still only works partially, when I...
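To illustrate why that column trips up type inference (the values here are made up, not the real data), a tiny sketch of how a naive CSV sniffer would flag some rows as numeric and others not, which is why pinning the type to string is the safe choice:

```python
# hypothetical values for the mixed column -- not the real data
values = ["1024", "3.14", "SKU-552", "n/a"]

def inferred_as_number(v: str) -> bool:
    """Return True if a naive CSV type sniffer would call this numeric."""
    try:
        float(v)
        return True
    except ValueError:
        return False

flags = [inferred_as_number(v) for v in values]
print(flags)  # [True, True, False, False]
```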
Another error on another table:
```
~ could not copy data --- database_postgres.go:205 BulkImportStream
~ could not execute statement --- database_postgres.go:190 func2
pq: invalid input syntax for type...
```
I found out that escaping the null option in Redshift UNLOAD (`NULL '\\N'`) solved the null problems.