matanper
About the `wrong number of fields` error: I've found that it only happens on large files when they're processed in parallel. I tried adding env `CONCURRENCY_LIMIT=1` but it didn't...
same error on latest v1.2.11
Sent you an email
hey @flarco I still get this error for large datasets. It seems to be related to parallel reading of the CSV, is there a way to force running...
I changed it and I can see it's being used:

```
2024-06-30 10:47:19 DBG using target options: {"concurrency":1,"datetime_format":"auto","file_max_rows":0,"max_decimals":-1,"use_bulk":true,"add_new_columns":true,"adjust_column_type":false,"column_casing":"source"}
```

but when merging the CSV readers it still uses concurrency=10, and fails:...
I added `SLING_MERGE_READERS=false` and it seems to work now
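For anyone hitting the same thing, this is roughly how the workaround looks in practice. The `SLING_MERGE_READERS` env var comes from this thread; the `sling run -r replication.yaml` invocation and the config filename are just illustrative assumptions, adjust to your own setup:

```shell
# Workaround from this thread: disable merged (parallel) CSV readers
# before invoking sling. The run command below is a hypothetical example.
export SLING_MERGE_READERS=false

# Guarded so the snippet is safe to paste even where sling isn't installed.
if command -v sling >/dev/null 2>&1; then
  sling run -r replication.yaml
fi
```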
Interesting idea 😃 In my opinion it might be better to define the data restrictions in the constructor, since deepchecks assumes conditions have only a tabular display, while the check...
same here, I started caching the pnpm dest dir:

```yaml
- name: Restore pnpm installation
  id: pnpm-cache
  uses: actions/cache@v4
  with:
    path: ~/.cache/setup-pnpm
    key: pnpm-bin-${{ runner.os }}-${{ runner.arch }}-v${{ inputs.pnpm-version }}...
```