libpng
Make it possible to skip chunks entirely
This came up in this Chromium bug: https://bugs.chromium.org/p/chromium/issues/detail?id=827754. It has an image with a very large iTXt chunk. Since commit 347538efbdc21b8df684ebd92d37400b3ce85d55 (https://github.com/glennrp/libpng/commit/347538efbdc21b8df684ebd92d37400b3ce85d55), libpng checks the length of iTXt (and every other chunk) against PNG_USER_CHUNK_MALLOC_MAX. The chunk is too big, so the image fails to decode.
This may be the right behavior - it prevented libpng from allocating an amount of memory that Chromium had declared too much. But Chromium was going to ignore the chunk anyway; it does so by not defining PNG_READ_iTXt_SUPPORTED. (Firefox appears to have the same bug: https://bugzilla.mozilla.org/show_bug.cgi?id=1452735. It disables iTXt differently, using png_set_keep_unknown_chunks; see the sketch below the image.) In either case, libpng checks the length against the limit and then either fails or allocates enough memory for the chunk. I'm wondering if it might be possible to skip the chunk entirely. libpng would then not need to check the length at all, because it would never allocate the memory - the client is ignoring the chunk anyway.
Attached is the image in question: [image: partner delivered fail] https://user-images.githubusercontent.com/13983895/38387682-a5dedd8a-38e6-11e8-9db2-658a1d003a59.png
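For reference, a minimal sketch of the Firefox-style approach mentioned above, assuming a libpng build with PNG_HANDLE_AS_UNKNOWN_SUPPORTED (needed so that a chunk libpng natively recognizes, like iTXt, can be routed through the unknown-chunk handling); the helper name ignore_itxt is hypothetical:

```c
#include <png.h>

/* Ask libpng to discard iTXt rather than store it. Note that, as
 * described above, libpng still reads and length-checks the chunk
 * data before discarding it. */
static void
ignore_itxt(png_structp png_ptr)
{
   /* Each entry in the list is a 5-byte chunk name: 4 characters
    * plus a terminating '\0', which the string literal supplies.
    * PNG_HANDLE_CHUNK_NEVER means "discard on read". */
   static const png_byte chunks_to_skip[] = "iTXt";

   png_set_keep_unknown_chunks(png_ptr, PNG_HANDLE_CHUNK_NEVER,
       chunks_to_skip, 1);
}
```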
Try changing png_error to png_benign_error in line 3186 of pngrutil.c
Glenn
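For context, a rough sketch of that suggested change; the code below paraphrases libpng's chunk-length check rather than quoting pngrutil.c, and the exact line number varies between versions:

```c
/* In libpng's chunk-length check (paraphrased): when the length
 * exceeds the configured limit, report a benign error instead of a
 * fatal one, so that decoding can continue. */
if (length > limit)
{
   /* was: png_error(png_ptr, "chunk data is too large"); */
   png_benign_error(png_ptr, "chunk data is too large");
}
```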
Yes, that allows us to view the image. It does result in libpng allocating more than PNG_USER_CHUNK_MALLOC_MAX, but I suppose that's similar to an earlier version of 1.6 without png_check_chunk_length (except that now we also get a warning).
Since check_chunk_name and check_chunk_length are always called together, these functions should be combined into a single check_chunk_header. That should allow us to issue a png_error for image data and a png_benign_error for metadata, as sketched below.
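A minimal sketch of what such a combined check might look like; png_check_chunk_header, its signature, and the criticality test are hypothetical, not libpng's actual code:

```c
/* Hypothetical combined check: fail hard on an oversized critical
 * chunk (image data), but raise only a benign error for an
 * oversized ancillary chunk (metadata) such as iTXt. */
static void
png_check_chunk_header(png_structrp png_ptr, png_uint_32 chunk_name,
    png_uint_32 length, png_alloc_size_t limit)
{
   /* Bit 5 of a chunk name's first byte (bit 29 of the 32-bit
    * name) is the "ancillary" flag; 0 means critical, as in IHDR,
    * PLTE, IDAT, and IEND. */
   int critical = ((chunk_name >> 29) & 1) == 0;

   if (length > limit)
   {
      if (critical)
         png_error(png_ptr, "chunk data is too large");

      else
         png_benign_error(png_ptr, "chunk data is too large");
   }
}
```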
I don't think a chunk that is longer than the specified limit is a benign error, and I don't see how such a chunk actually gets handled. The chunk isn't ignored or skipped; the change just disables the limit checking entirely (and limit checking is on by default).
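(For completeness: the limit under discussion is adjustable at run time. A client that actually wants to keep very large chunks can raise it with png_set_chunk_malloc_max() instead of having the check downgraded; the 64 MiB figure below is only an example.)

```c
#include <png.h>

/* Raise the per-chunk allocation cap for this read struct; passing
 * 0 removes the user limit, leaving only the PNG spec's 2^31-1
 * cap on chunk lengths. */
static void
allow_large_chunks(png_structp png_ptr)
{
#ifdef PNG_SET_USER_LIMITS_SUPPORTED
   png_set_chunk_malloc_max(png_ptr, 64 * 1024 * 1024); /* 64 MiB */
#endif
}
```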