Detect color type of images containing pixels of signed int values
Thank you for your efforts in building this already awesome library.
I think I have a special case that isn't yet covered by the image crate, and I'm not sure what I can do about it.
Overall, I'm trying to load images with mixed pixel channel types, where pixel values can be either unsigned or signed integers. The crate nicely supports unsigned integer pixel types out of the box via the ColorType enum, but lacks support for signed integer color types. Signed values do seem to be somewhat supported already, since 32-bit floats are present in ColorType.
How could I detect both signed and unsigned integer color types, as in some other libraries (e.g. OpenCV)?
What file format are your images stored in? While this crate makes simplifying assumptions to abstract over all the formats it supports, the backend decoding crates generally expose more features from the underlying format. For instance, the DecodingResult returned by the tiff crate includes variants for signed integers.
You're right, our signed images are typically TIFFs.
The main advantage of using image over format-specific crates would be autodetecting the format of uploaded image bytes without further metainfo, followed by format-independent postprocessing and possibly conversion.