Handle cases where passing an overly long size could cause UB due to integer overflow
This should never happen in practice. If you end up in a situation where the size extends beyond the address space, you already have a serious problem elsewhere in your code. However, to make the API safer, we'll add a check that prevents undefined behavior from integer overflow during pointer arithmetic.
I don't see the benefit here. The chances that this check will hit are minuscule. The address space is enormous, so there are plenty of sizes that will be too large but not detected by this. It adds a lot of complexity for no real gain. What triggered this change?
The reasoning here is that this could protect in an exploit chain where an attacker manages to control the size value due to an upstream bug. In that case, protozero would defend against a maliciously manipulated value by returning a zero-size reader. I attempted to keep the overhead as small as possible by using `__builtin_add_overflow`, which should be available in most circumstances, and the GCC documentation says: "The compiler will attempt to use hardware instructions to implement these built-in functions where possible, like conditional jump on overflow after addition, conditional jump on carry etc."
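To illustrate the idea, here is a minimal sketch of such a check (not protozero's actual code; the helper name `size_fits` is hypothetical). It verifies that `data + size` can be computed without wrapping around the address space, using `__builtin_add_overflow` on the pointer reinterpreted as an integer:

```cpp
#include <cstddef>
#include <cstdint>

// Hypothetical sketch: return true if [data, data + size) can be
// represented without the end pointer wrapping past the address space.
// __builtin_add_overflow returns true when the addition overflows,
// so we negate its result.
bool size_fits(const char* data, std::size_t size) {
    std::uintptr_t end = 0;
    return !__builtin_add_overflow(
        reinterpret_cast<std::uintptr_t>(data), size, &end);
}
```

A caller could then fall back to a zero-size reader whenever `size_fits` returns false, instead of performing the out-of-range pointer arithmetic.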