How to parse a large file
I want to parse a large file, about 2 GB. I wonder if there is a multithreaded way to accelerate the parsing work.
There isn't anything built into combine that would let you parallelize the work. You will need to chunk up the file on your own to get that. Of course, you should profile first to see whether parallelizing would actually help.
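For concreteness, here is a minimal sketch of that chunk-it-yourself approach using standard-library scoped threads. It assumes a line-oriented format and a file that fits in memory; `big.log` and `parse_chunk` are placeholders for your file and your real per-record parser:

```rust
use std::fs;
use std::thread;

// Hypothetical per-chunk "parser": here it just counts non-empty lines.
// Swap in your real record parser.
fn parse_chunk(chunk: &str) -> usize {
    chunk.lines().filter(|line| !line.is_empty()).count()
}

fn main() -> std::io::Result<()> {
    // Assumes the whole file fits in memory and is valid UTF-8;
    // "big.log" is a placeholder path.
    let data = fs::read_to_string("big.log")?;
    let bytes = data.as_bytes();
    let n_threads = 8;

    // Split into roughly equal chunks, pushing each boundary forward to
    // the next newline so no record is cut in half. A '\n' byte is always
    // a valid UTF-8 boundary, so the slices below stay well-formed.
    let mut chunks: Vec<&str> = Vec::new();
    let approx = data.len() / n_threads + 1;
    let mut start = 0;
    while start < data.len() {
        let mut end = (start + approx).min(data.len());
        while end < data.len() && bytes[end - 1] != b'\n' {
            end += 1;
        }
        chunks.push(&data[start..end]);
        start = end;
    }

    // One scoped thread per chunk; partial results are combined at the end.
    let total: usize = thread::scope(|s| {
        let handles: Vec<_> = chunks
            .iter()
            .map(|&chunk| s.spawn(move || parse_chunk(chunk)))
            .collect();
        handles.into_iter().map(|h| h.join().unwrap()).sum()
    });

    println!("parsed {total} records");
    Ok(())
}
```

Snapping each chunk boundary to a newline keeps records intact; the manual thread handling could equally be replaced with a rayon parallel iterator over the chunks.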
Depending on the format, monoidal parsing could be used to parallelize your parser (though the technique is fairly limited in which formats it can handle).
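To give a feel for the idea in Rust, here is a sketch of the classic balanced-parentheses example (the names `Balance` and `scan` are just for illustration). Each chunk is scanned independently into a small summary, and the summaries are combined with an associative operation, so the per-chunk scans can run in parallel and the results are folded in order afterwards:

```rust
use std::thread;

/// Summary of one chunk: `close` counts unmatched ')' at the left edge,
/// `open` counts unmatched '(' at the right edge. Combining summaries is
/// associative, so chunks can be scanned independently and in any grouping.
#[derive(Clone, Copy, Debug, PartialEq)]
struct Balance {
    close: usize,
    open: usize,
}

impl Balance {
    const IDENTITY: Balance = Balance { close: 0, open: 0 };

    // The monoid operation: the left chunk's unmatched '(' cancel
    // against the right chunk's unmatched ')'.
    fn combine(self, rhs: Balance) -> Balance {
        let matched = self.open.min(rhs.close);
        Balance {
            close: self.close + (rhs.close - matched),
            open: rhs.open + (self.open - matched),
        }
    }
}

// Scan one chunk sequentially into its summary.
fn scan(chunk: &str) -> Balance {
    let mut b = Balance::IDENTITY;
    for c in chunk.chars() {
        match c {
            '(' => b.open += 1,
            ')' if b.open > 0 => b.open -= 1,
            ')' => b.close += 1,
            _ => {}
        }
    }
    b
}

fn main() {
    let input = "((a)(b))((c))";
    // Split into arbitrary chunks; boundaries need not respect the structure.
    let chunks: Vec<&str> = vec![&input[..5], &input[5..9], &input[9..]];

    // Scan each chunk on its own thread, then fold the summaries in order.
    let total = thread::scope(|s| {
        let handles: Vec<_> = chunks
            .iter()
            .map(|&c| s.spawn(move || scan(c)))
            .collect();
        handles
            .into_iter()
            .map(|h| h.join().unwrap())
            .fold(Balance::IDENTITY, Balance::combine)
    });

    // Balanced iff nothing is left unmatched on either side.
    println!("balanced: {}", total == Balance::IDENTITY);
}
```

The same shape works for any format whose per-chunk summary forms a monoid; whether your format admits such a summary is exactly the limitation mentioned above.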
Can the async method be used to build dependency relations? And is there any example of monoidal parsing in Rust? I have only found Haskell versions. Thank you for your help.
Can the async method be used to build dependency relations?
What do you mean?
Is there any example of monoidal parsing in Rust? I have only found Haskell versions.
Afraid not, I only know of Haskell examples. If you know enough Haskell to understand them, they should be easy to port to Rust; otherwise I wouldn't go with that approach.
Ok, thank you. I will try to optimize it first.