Fritz Larco

128 comments by Fritz Larco

The version should be `Version 1.2.16.dev (2024-08-25)`, so you'll have to download it again. Also, remove `single: true`:

```yaml
source: MY_SFTP
target: SNOWFLAKE

defaults:
  mode: truncate

streams:
  "myfolder/SITE_????-??-??.csv":
    object: 'myschema.site'
    single: false

env: ...
```
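
Once you have the new build, something like this should confirm the version and kick off the replication (the config filename here is just an example):

```bash
# Confirm the downloaded binary is the new dev build
sling --version

# Run the replication config (filename is an example)
sling run -r sftp_to_snowflake.yaml
```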

Just pushed this [commit](https://github.com/slingdata-io/sling-cli/pull/303/commits/9ea426decf7514acd657cbbbce96ac5e19620925). Could you try to build the binary on branch `v1.2.11` and test?
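
In case it helps, a rough sketch of building it from source on that branch (assumes a Go toolchain is installed; the `./cmd/sling` package path is my assumption and may differ):

```bash
# Clone the repo and check out the branch
git clone https://github.com/slingdata-io/sling-cli.git
cd sling-cli
git checkout v1.2.11

# Build the binary and verify it (package path is an assumption)
go build -o sling ./cmd/sling
./sling --version
```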

Actually, since you're using a Mac, I just compiled it and uploaded it here: https://f.slingdata.io/sling-mac-20240524.zip
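
Something along these lines should get it unpacked and runnable (I'm assuming the unzipped binary is named `sling`):

```bash
# Download and unpack the test build
curl -LO https://f.slingdata.io/sling-mac-20240524.zip
unzip sling-mac-20240524.zip

# macOS may quarantine downloaded binaries; clear the flag and make it executable
xattr -d com.apple.quarantine ./sling 2>/dev/null || true
chmod +x ./sling
./sling --version
```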

Interesting 🤔, that's strange. Redshift must be outputting different column lengths at some point? Unless it's not quoting fields with newlines or escaping them properly? What you could do is...
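
One quick sanity check along those lines (file name is hypothetical): count delimiters per physical line to see if the field count varies. It's a naive check that ignores quoting, so treat it as a hint, not proof.

```bash
# Comma count per physical line; a spread of values suggests unquoted
# delimiters or embedded newlines (naive: does not parse quoted fields)
awk -F',' '{print NF}' redshift_export_000.csv | sort -n | uniq -c

# Eyeball the first few rows for quoting/escaping issues
head -n 5 redshift_export_000.csv
```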

If it were OOM, the process would just be killed, right? We wouldn't even see the error. You could also monitor the memory usage to confirm. Another idea is to try...
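
For example, to check for the OOM killer and watch memory while the job runs (Linux-flavored commands; the replication filename is just a placeholder):

```bash
# On Linux, an OOM kill leaves a trace in the kernel log
sudo dmesg -T | grep -iE 'out of memory|killed process'

# Rough memory monitor: sample the sling process's RSS while it runs
sling run -r replication.yaml &
PID=$!
while kill -0 "$PID" 2>/dev/null; do
  ps -o rss= -p "$PID"   # resident memory in KB
  sleep 5
done
```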

Interesting, I'm a bit perplexed about those CSVs not working... For this timestamp error, I'm wondering if Redshift writes the timestamp as a string when exporting to Parquet. Same exercise, could...
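
One way to check that, if you have the DuckDB CLI handy (the file name here is an example), is to inspect the Parquet schema and see whether the column comes back as VARCHAR instead of TIMESTAMP:

```bash
# Inspect the Parquet file's column types
duckdb -c "DESCRIBE SELECT * FROM 'redshift_export_000.parquet'"
```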

Ok, that should work then, I'll take a closer look.

> It fails on double column which has null (also good reason to use parquet)

In the file `core/dbio/templates/redshift.yaml`, under `copy_to_s3`, can you remove the `NULL '\N'` part? I recently added...

Yeah, I have to spend some time on this. I'm quite busy with everything at the moment, so I haven't been able to do much. I just pushed a commit...

Do you have a dataset that's erroring that you could share with me so I can test with it? I tested with a 10M-record (~2GB) file that I have...