Add support for image filters (nearest neighbor vs linear) to image rendering
The ImageViewer is a very useful way to let users pan and zoom an image. Because the ImageViewer supports zooming far in on an image, we have to consider how the image is rendered when highly zoomed in. The default is bilinear interpolation, which is fine for many use cases, but when viewing something like pixel art (such as in iced's pokedex example), nearest neighbor may be preferred. This pull request adds support for setting an Image or an ImageViewer to nearest-neighbor rendering mode.
New files in this PR:

- `image_filter.rs` in the `iced_native` crate, with the following contents (see the sketch after this list):
  - The new enum `ImageFilter`, with `NearestNeighbor` and `Linear` variants
  - The new struct `FilterOptions`, with `mag_filter` and `min_filter` members
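To make the shape of the new module concrete, here is a minimal sketch of what `image_filter.rs` could contain. The type, variant, and field names come from the list above; the derives (including the `Eq`/`Hash` ones that would let `FilterOptions` key a `HashMap` later) are assumptions for illustration, not the PR's exact code:

```rust
// Sketch of `image_filter.rs` in `iced_native` (derives are assumptions).

/// How texels are sampled when an image is scaled.
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub enum ImageFilter {
    /// Sample the single nearest texel (crisp pixels, good for pixel art).
    NearestNeighbor,
    /// Blend neighboring texels (smooth scaling).
    Linear,
}

/// Filters to use when an image is scaled up (mag) or down (min).
#[derive(Debug, Clone, Copy, PartialEq, Eq, Hash)]
pub struct FilterOptions {
    /// Filter used when the image is drawn larger than its native size.
    pub mag_filter: ImageFilter,
    /// Filter used when the image is drawn smaller than its native size.
    pub min_filter: ImageFilter,
}
```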
Changes to existing files in this PR:
Several files were changed to weave the new `FilterOptions` struct down into `iced_wgpu`'s image renderer. Rather than hardcoding a texture sampler with nearest/nearest, the renderer now lazily constructs and caches samplers in a `HashMap`, keyed by the selected filters.
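For reference, the caching could look roughly like the sketch below. `SamplerCache`, `get_or_create`, and the module path `iced_native::image_filter` are illustrative names rather than the actual code, and it assumes a `wgpu` version where `SamplerDescriptor` implements `Default`:

```rust
// Hypothetical sketch of the sampler cache inside iced_wgpu's image pipeline.
use std::collections::HashMap;

use iced_native::image_filter::{FilterOptions, ImageFilter};

/// Translate the renderer-agnostic filter into wgpu's filter mode.
fn to_wgpu(filter: ImageFilter) -> wgpu::FilterMode {
    match filter {
        ImageFilter::NearestNeighbor => wgpu::FilterMode::Nearest,
        ImageFilter::Linear => wgpu::FilterMode::Linear,
    }
}

/// Samplers created so far, keyed by the requested filter combination.
/// (Relies on `FilterOptions` deriving `Eq` + `Hash`.)
#[derive(Default)]
struct SamplerCache {
    samplers: HashMap<FilterOptions, wgpu::Sampler>,
}

impl SamplerCache {
    /// Return the sampler for the given filters, creating it on first use.
    fn get_or_create(
        &mut self,
        device: &wgpu::Device,
        options: FilterOptions,
    ) -> &wgpu::Sampler {
        self.samplers.entry(options).or_insert_with(|| {
            device.create_sampler(&wgpu::SamplerDescriptor {
                label: Some("iced_wgpu::image sampler"),
                address_mode_u: wgpu::AddressMode::ClampToEdge,
                address_mode_v: wgpu::AddressMode::ClampToEdge,
                address_mode_w: wgpu::AddressMode::ClampToEdge,
                mag_filter: to_wgpu(options.mag_filter),
                min_filter: to_wgpu(options.min_filter),
                ..Default::default()
            })
        })
    }
}
```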
Finally, the user-facing `Image` and `ImageViewer` types were extended with builder-pattern methods that specify the min and mag filters.
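As a usage example, opting into crisp pixel-art rendering would look something like this; the builder method names and import paths here are assumptions based on the description above, not the final API:

```rust
// Hypothetical usage; `mag_filter`/`min_filter` as builder methods and the
// import paths are assumptions and may differ from the actual PR.
use iced::widget::image::{self, Image};
use iced_native::image_filter::ImageFilter;

/// Build an `Image` that stays crisp when zoomed far in.
fn pixel_art_sprite() -> Image {
    Image::new(image::Handle::from_path("resources/sprite.png"))
        .mag_filter(ImageFilter::NearestNeighbor)
        .min_filter(ImageFilter::NearestNeighbor)
}
```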
Before this gets merged, it will have to be ported to the other renderers - my goal in submitting this now is to gauge interest before doing the work of adding all this to another renderer. Let me know what you think and I can get started on the other one.