Heap Dumps larger than 2GB fail with "Unexpected top-level record type" exception
First of all, thanks for providing this great tool. It makes a great base for custom HPROF parsing. I have run into one small issue, though. In HprofParser.parseRecord(), if the HPROF file is larger than 2GB (from a 64-bit JVM, for example), calling
int bytesLeft = in.readInt();
to get the size of the heap dump record (tag 0xc) results in a negative number, since readInt() returns a signed 32-bit value and any length above 2^31 - 1 bytes (about 2GB) sets the sign bit. This throws off the rest of the parsing and causes an "Unexpected top-level record type" exception (for me at least). I modified the code and changed the above to
long bytesLeft = in.readInt() & 0xFFFFFFFFL;
I also modified a few other spots in the code, and it's now working for me with heap dumps larger than 2GB.
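For reference, here is a minimal sketch of the signed-overflow problem and the unsigned-widening fix, assuming a java.io.DataInput reader. The class and method names are illustrative, not the actual HprofParser code:

import java.io.DataInput;
import java.io.IOException;

// Illustrative sketch: why a signed 32-bit read breaks for record
// lengths of 2GB or more, and how masking into a long fixes it.
class UnsignedLengthSketch {

    // Broken for large dumps: a 3GB length (0xC0000000) is returned
    // as -1073741824 because readInt() treats the top bit as a sign bit.
    static int readLengthSigned(DataInput in) throws IOException {
        return in.readInt();
    }

    // Fixed: the mask widens all 32 bits into a non-negative long,
    // so 0xC0000000 becomes 3221225472.
    static long readLengthUnsigned(DataInput in) throws IOException {
        return in.readInt() & 0xFFFFFFFFL;
    }
}

Any counters decremented against this length (bytesLeft and friends) need to become longs as well, or they overflow the same way.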
Again, thanks for the great tool. If you could incorporate this change, along with any other changes you think would be required to support 64-bit HPROF files, I think it would greatly improve the usability of this tool.
Thanks for reporting this! If you send me a pull request with your change, I'm happy to merge it. I never did test with very large heap dumps.
I tried following the steps outlined here to create a pull request:
https://help.github.com/articles/creating-a-pull-request
but I am unable to push the changes; I get "Authentication failed".
Sorry, the exact error is:
error: The requested URL returned error: 403 Forbidden while accessing https://github.com/eaftan/hprof-parser.git/info/refs