CsvHelper
Add max buffer sizes and a flag controlling whether bad fields are processed
Currently, buffers and field buffers can grow without limit, which leads to large array allocations and high memory consumption, and ultimately to an OutOfMemoryException. This PR introduces properties to cap the maximum sizes of those buffers. It also adds a ProcessBadDataFields flag that skips processing of 'bad fields'. Its default is true, so current behaviour is unchanged; when set to false, the bad field is set to an empty char array, since with that setting we are not interested in processing bad fields. In some cases this can improve performance and memory consumption. WARNING: limiting the maximum processed field buffer size can truncate field values (if the buffer is too small to hold the field). However, it works well with ShouldSkipRecord configured to skip records containing too-long fields.
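A minimal sketch of the usage described above. The buffer-cap property name here is hypothetical (the PR does not state the final names); ProcessBadDataFields and ShouldSkipRecord are taken from the description, with ShouldSkipRecord being CsvHelper's existing per-record skip callback:

```csharp
using System.Globalization;
using System.Linq;
using CsvHelper.Configuration;

var config = new CsvConfiguration(CultureInfo.InvariantCulture)
{
    // Hypothetical name for one of the buffer caps introduced by this PR:
    // MaxProcessFieldBufferSize = 4096,

    // Skip bad-field processing entirely (the default, true, keeps current behaviour):
    // ProcessBadDataFields = false,

    // Pair the cap with ShouldSkipRecord so records containing over-long
    // fields are skipped instead of being silently truncated:
    ShouldSkipRecord = args =>
        args.Row.Parser.Record?.Any(field => field.Length > 4096) == true,
};
```

With this combination, a capped buffer never has to hold a field longer than the skip threshold, so truncation is avoided for the records that are actually kept.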
A similar feature was added in #1278 (version 30) as CsvConfiguration.MaxFieldSize. Is this different?
That is completely different: this PR lets you skip too-long rows/fields, whereas MaxFieldSize, if set, throws and aborts the whole parse.
BTW: tomorrow I will push improvements to this PR's branch.
I've just added the mentioned improvements.
@nicemanman thanks for the review, but you mostly reviewed things that are not related to this pull request. Please open a new pull request for the changes you propose that do not relate to this branch.
Okay, I will, thanks