Huge visualization files
When testing changes to etcd robustness testing, we managed to generate a 1.5 GB visualization file.
The operation history is only about 3k operations, which is pretty short; however, due to the high request failure rate (50%), the linearization check times out after 5 minutes. I expect most of those 1.5 GB come from the huge number of partial linearizations, which makes sense, as there can be an exponential number of them.
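For context, a minimal sketch of how the file gets produced, using porcupine's verbose-check API with a toy register model (etcd's actual KV model and history construction are omitted; the history here is left empty just to make the sketch self-contained):

```go
package main

import (
	"log"
	"time"

	"github.com/anishathalye/porcupine"
)

// registerInput is a toy op type standing in for etcd's real model.
type registerInput struct {
	write bool
	value int
}

func main() {
	// Toy read/write register model; etcd's robustness tests use a much
	// richer model.
	model := porcupine.Model{
		Init: func() interface{} { return 0 },
		Step: func(state, input, output interface{}) (bool, interface{}) {
			in := input.(registerInput)
			if in.write {
				return true, in.value
			}
			return output.(int) == state.(int), state
		},
	}

	var history []porcupine.Operation // ~3k operations in the real run

	// The verbose check records partial linearizations for visualization;
	// with a 50% failure rate it times out after 5 minutes, leaving a
	// potentially exponential number of them in the info struct.
	result, info := porcupine.CheckOperationsVerbose(model, history, 5*time.Minute)
	log.Printf("check result: %v", result)

	// Every recorded partial linearization is serialized into the HTML,
	// which is how history.html grows to 1.5 GB.
	if err := porcupine.VisualizePath(model, info, "history.html"); err != nil {
		log.Fatal(err)
	}
}
```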
The file is too big to be loaded by a browser. Could we maybe limit the number of partial linearizations that get serialized, to ensure that the file size doesn't explode and we can still open it? A rough sketch of what I mean is below.
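To make the proposal concrete, here is a hypothetical sketch of the kind of cap I have in mind; the names `maxPartial` and `truncatePartials` are made up, and porcupine's internal representation of partial linearizations may well differ:

```go
// Hypothetical cap on how many partial linearizations per partition get
// serialized into the visualization; not actual porcupine code.
const maxPartial = 1000

// truncatePartials keeps at most max partial linearizations, dropping the
// rest so the generated HTML stays a bounded size.
func truncatePartials(partials [][]int, max int) [][]int {
	if len(partials) <= max {
		return partials
	}
	return partials[:max]
}
```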
If you want to check out the file, see https://github.com/etcd-io/etcd/actions/runs/9178286126?pr=17833, download one of the artifacts, and look for the "history.html" files in the archive.