cereal
Question about memory usage when serializing a huge data structure
Thanks for this excellent library. One question:
We may have a huge data structure, at the terabyte scale. We use cereal to serialize the data structure into one big std::string, then compress that string and save it to disk.
Would this cause a memory usage problem? I mean, the data structure itself already occupies terabytes, and once serialized, the string occupies terabytes again.
Would it be possible to provide a cursor type for the data structure, along these lines (pseudocode, `cereal_serialize` is a hypothetical API):

```cpp
auto data_cursor = cereal_serialize(t);

char buf[1 << 20];  // 1 MiB staging buffer
size_t len = 0;
while ((len = data_cursor.read(buf, sizeof(buf))) > 0) {
    compress(buf, len);             // compress this chunk
    append_file(output_file, buf);  // then append it to the output file
}
```
The cursor itself would hold no data; it would just read from the T and track the read position. The user polls the cursor for data through a fixed 1 MiB buffer.
Does cereal already have this functionality? If not, could this be a feature request?
Thanks, etorth