Concurrent/Parallel Bug
We've been successfully using this library for some quick CSV parsing, and I think some of the techniques you use are neat. We hit a strange issue today, though, and I'm concerned it's a concurrency/multithreading bug.
The library asks for a parallel stream by default. While parsing a massive CSV file (1 GB) on a 32-thread machine, we got corrupted byte arrays and entirely invalid data. I spent a little time investigating, and it seems there is no locking or other thread synchronization around any of the byte slicing. If a parallel stream splits the job across multiple threads, I'd expect methods like "nextBareSlice()" to be safe for concurrent access; otherwise you'd get strange results like the ones we saw. A rough sketch of the kind of race I have in mind is below.
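To be clear, this is not your code, just a minimal, self-contained sketch of the pattern I suspect: a shared, unsynchronized cursor over a byte buffer being advanced from a parallel stream. The names (UnsafeSlicer, nextSlice) are made up for illustration and aren't the library's actual internals.

```java
import java.nio.charset.StandardCharsets;
import java.util.stream.IntStream;

public class SliceRaceDemo {

    // Hypothetical slicer with shared mutable state and no synchronization,
    // roughly the shape I suspect is behind the corruption we saw.
    static class UnsafeSlicer {
        private final byte[] data;
        private int pos = 0; // shared cursor, no locking, no volatile

        UnsafeSlicer(byte[] data) {
            this.data = data;
        }

        // Hands back the next fixed-width "slice". With concurrent callers,
        // two threads can observe the same pos, so the read-modify-write of
        // the cursor is lost and callers get duplicate slices while other
        // records are silently skipped.
        byte[] nextSlice(int width) {
            int start = pos;          // read
            pos = start + width;      // non-atomic update
            byte[] out = new byte[width];
            System.arraycopy(data, Math.min(start, data.length - width), out, 0, width);
            return out;
        }
    }

    public static void main(String[] args) {
        // Build 10,000 fixed-width 8-byte records: "rec0000.", "rec0001.", ...
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 10_000; i++) sb.append(String.format("rec%04d.", i));
        UnsafeSlicer slicer = new UnsafeSlicer(sb.toString().getBytes(StandardCharsets.US_ASCII));

        long distinct = IntStream.range(0, 10_000)
                .parallel()           // same shape as a parallel stream over the file
                .mapToObj(i -> new String(slicer.nextSlice(8), StandardCharsets.US_ASCII))
                .distinct()
                .count();

        // Run sequentially this prints 10000; run in parallel it usually
        // prints less, because racing threads return duplicate slices and
        // drop records.
        System.out.println("distinct slices: " + distinct);
    }
}
```

In our case the symptoms were worse than duplicates (actual garbage bytes), but I assume that's the same root cause showing up through whatever buffer reuse the real slicing does.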
Thanks!