
Add benchmarks for binary size / RAM use

Ivorforce opened this issue 1 year ago • 2 comments

Hiya, I've used this benchmarking suite a bunch of times in the past week or so. I gotta say, great project, it's very useful! 😁

I also like the graphs; I think it's a great tool to observe if there is an unexpected change in staging somewhere. Very valuable!

I've just been thinking, another type of metric that may be interesting to keep track of is the size of Godot. I'm particularly thinking of 3 categories:

  • Size of final binary on disk (ideally per platform) — to track our general progress and platform parity.
  • Size of individual modules — to track which of them are large and could be optimized.
  • Size of Godot in RAM — to track if there are unnecessarily large allocations sitting around.

For example, I've recently found ucaps.h, which has a large database of char32_t <-> char32_t mappings. I think it's about 5 KB. That's not too much, but who knows if there are any others that could be optimized?

Implementation

Size of final binary on disk

It's a little awkward, but this could probably best be tracked by visiting the tested hash's GitHub Actions run (example), downloading each release binary, unzipping, and checking the size on disk.

A much faster proxy would be to just use the artifact Size listed on the Actions page.
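Something along these lines could work for the download approach, using the GitHub CLI (rough, untested sketch; the workflow and artifact names below are placeholders, not necessarily what Godot's CI actually uses):

```bash
#!/usr/bin/env bash
# Rough sketch: fetch a CI build artifact for a given commit and report its size on disk.
# Requires an authenticated `gh` CLI; workflow/artifact names are placeholders.
set -euo pipefail

COMMIT_SHA="$1"

# Find the CI run that built this commit (workflow name is a placeholder).
RUN_ID="$(gh run list --repo godotengine/godot --commit "$COMMIT_SHA" \
    --workflow "Linux Builds" --json databaseId --jq '.[0].databaseId')"

# Download the artifact; gh extracts the zip for us (artifact name is a placeholder).
gh run download "$RUN_ID" --repo godotengine/godot --name "linux-editor" --dir ./artifact

# Report the size of each extracted file in bytes.
du -b ./artifact/*
```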

Size of individual modules

Unfortunately, this requires a full build of the engine. That would definitely complicate the benchmark, so it would probably have to be offloaded: pass something like --godot-build-folder, run scons in it, and glob the resulting object files to check their sizes. It's not a perfect proxy for each module's contribution to the final binary size, but it's something.

(Though perhaps there is a way to ask LLVM tooling which objects end up inside the final binary, at least for debug binaries that still have their symbols? I might investigate.)
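As a rough, untested sketch of the object-file approach (the build flags and directory layout are assumptions on my part):

```bash
#!/usr/bin/env bash
# Rough sketch: after a full editor build, sum object file sizes under each module
# directory as a coarse proxy for that module's contribution to the final binary.
set -euo pipefail

GODOT_BUILD_FOLDER="${1:-.}"   # would come from something like --godot-build-folder
cd "$GODOT_BUILD_FOLDER"
scons platform=linuxbsd target=editor "-j$(nproc)"

for module in modules/*/; do
    bytes="$(find "$module" -name '*.o' -printf '%s\n' | awk '{ s += $1 } END { print s + 0 }')"
    printf '%12d  %s\n' "$bytes" "$module"
done | sort -rn
```

For attributing sizes inside the final binary itself, something like bloaty or `nm --size-sort` on an unstripped build might also do the job.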

Size of Godot in RAM

I think this would simply involve asking the OS for the application's RAM use, though I have no idea how stable this metric would be.

It could also be interesting to run specialized size benchmarks, like adding 5000 nodes and checking the RAM use at that point, before deallocating. Though that would be a tad more effort.
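For the OS-level measurement, something like this might already be enough on Linux (untested sketch; binary and project paths are placeholders):

```bash
#!/usr/bin/env bash
# Rough sketch: launch Godot headless, let it settle, then sample its resident
# set size (RSS) from the OS. Binary and project paths are placeholders.
set -euo pipefail

./bin/godot.linuxbsd.editor.x86_64 --headless --path ./benchmark_project &
GODOT_PID=$!

sleep 10   # crude settling period; a real harness would sample at well-defined points
rss_kb="$(ps -o rss= -p "$GODOT_PID")"   # resident set size in KiB, as reported by the kernel
echo "RSS: ${rss_kb} KiB"

kill "$GODOT_PID"
```

From inside the engine, `Performance.get_monitor(Performance.MEMORY_STATIC)` exposes the allocations Godot tracks itself, which might be more stable than OS-level RSS (though it misses untracked allocations) and would suit the "add 5000 nodes, then measure" style of benchmark.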

Final Thoughts

So yeah, unfortunately none of these are trivial to implement, though all are realistic (and, I believe, interesting).

What do you think?

Ivorforce · Dec 07 '24 20:12

  • Related to https://github.com/godotengine/godot-proposals/issues/9705.

There are already memory utilization and binary size benchmarks in place, but there are no graphs for them on the homepage yet. Adding graphs for them is welcome and is probably not too difficult.

See all the extra data appended to each JSON here:

https://github.com/godotengine/godot-benchmarks/blob/a8ee8585b55055d9e7e3e2ca138c3cb2e025e102/run-benchmarks.sh#L222-L249

Binary size comparisons would be best to have on the official CI directly, so we don't have to wait for PRs to be merged to see the binary size impact (see https://github.com/godotengine/godot-proposals/issues/9705). However, I couldn't figure out a way to reliably get a path to the artifact of the master branch run on the commit the PR is based on. If you could look into this, it would be very appreciated 🙂

Calinou · Dec 10 '24 15:12

Thank you for the information! That is quite a lot more than I expected to see.

Regarding the proposal, I have an idea that might work (https://github.com/godotengine/godot/pull/100248). Let's see how far I get :)

Ivorforce · Dec 10 '24 15:12