Handle RGB(A) data with get_fdata()
I am working with clinical data that includes a wide range of image types. When I encounter an image with RGB data, for example a screenshot, nibabel fails to load the image data. I am able to view the image in MRIcron. Steps to reproduce:
import nibabel as nib
rgbimage = nib.load(rgbimagefilename)
print(rgbimage.get_data_dtype())
# [('R', 'u1'), ('G', 'u1'), ('B', 'u1')]
imagedata = rgbimage.get_fdata()
Traceback (most recent call last):
File "/home/me/.local/lib/python3.8/site-packages/IPython/core/interactiveshell.py", line 3417, in run_code
exec(code_obj, self.user_global_ns, self.user_ns)
File "<ipython-input-119-57f46d1e6598>", line 1, in <module>
imagedata = rgbimage.get_fdata()
File "/home/me/.local/lib/python3.8/site-packages/nibabel/dataobj_images.py", line 355, in get_fdata
data = np.asanyarray(self._dataobj, dtype=dtype)
File "/home/me/.local/lib/python3.8/site-packages/numpy/core/_asarray.py", line 136, in asanyarray
return array(a, dtype, copy=False, order=order, subok=True)
I was expecting more to the traceback there.
The problem is that you're asking for floating point data (with get_fdata) from an image whose dtype is the structured RGB type, and we hadn't considered that.
For the immediate case, you could use:
data = np.array(rgbimage.dataobj)
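For anyone hitting this, that call keeps the on-disk structured RGB dtype rather than forcing a cast to float, and the individual channels can then be pulled out by field name. A minimal sketch, assuming rgbimage is the image loaded above:

import numpy as np
data = np.array(rgbimage.dataobj)  # structured array, dtype [('R', 'u1'), ('G', 'u1'), ('B', 'u1')]
red = data['R']                    # plain (X, Y, Z) uint8 array for a single channel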
For us - we have to consider what get_fdata should mean for an RGB image - and what the scale factors should mean in this case. Most obvious would be to scale each channel to floats, I suppose. What do others think?
That was the complete traceback (except my username changed to 'me'). For my current purpose, I just need to test that nibabel can read data from the file, so this works fine. Nice to know you're still out there, Dr Brett, being a force for good in the world. [Remember working on DTI spiking artefacts for Lolly?]
For us - we have to consider what get_fdata should mean for an RGB image - and what the scale factors should mean in this case. Most obvious would be to scale each channel to floats, I suppose. What do others think?
From the little experience I have with RGB(A) images, I would probably want either a tuple of 8-bit ints or floats between 0 and 1. If memory serves, all operations I coded were basically wrapped in a (uchar -> percent luminance -> uchar) sequence, except when the math was provably identical on integers. I believe matplotlib expects users to provide color data as floating-point values between 0 and 1, so it might be most intuitive to apply that scaling by default.
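A minimal sketch of what that default could look like (a proposal, not current nibabel behaviour), assuming the structured uint8 dtype shown above and numpy >= 1.16 for structured_to_unstructured:

import numpy as np
from numpy.lib.recfunctions import structured_to_unstructured

def rgb_to_float01(structured_rgb):
    # Expand the R/G/B(/A) fields into a trailing channel axis, then rescale
    # the uint8 values to floats in [0, 1], matplotlib-style.
    channels = structured_to_unstructured(structured_rgb)
    return channels.astype(np.float64) / 255.0

Applied to np.array(rgbimage.dataobj), a 2D slice of the result could be passed straight to matplotlib's imshow as color data.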
As far as scale factors go, I think they must be even more rare than RGB(A) images, since np.array(dataobj) would fail for that dtype:
import numpy as np
import nibabel as nb
img = nb.Nifti1Image(
    np.ones((5, 5, 5), dtype=nb.nifti1.data_type_codes.dtype['RGB']),
    np.eye(4))
img.header.set_slope_inter(5, 5)
new_img = nb.Nifti1Image.from_bytes(img.to_bytes())  # round-trip to get a scaling dataobj
np.array(new_img.dataobj)
Yields:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-44-c72193a50650> in <module>
----> 1 np.array(new_img.dataobj)
~/anaconda3/lib/python3.7/site-packages/nibabel/arrayproxy.py in __array__(self, dtype)
391 Scaled image data with type `dtype`.
392 """
--> 393 arr = self._get_scaled(dtype=dtype, slicer=())
394 if dtype is not None:
395 arr = arr.astype(dtype, copy=False)
~/anaconda3/lib/python3.7/site-packages/nibabel/arrayproxy.py in _get_scaled(self, dtype, slicer)
358 scl_inter = scl_inter.astype(use_dtype)
359 # Read array and upcast as necessary for big slopes, intercepts
--> 360 scaled = apply_read_scaling(self._get_unscaled(slicer=slicer), scl_slope, scl_inter)
361 if dtype is not None:
362 scaled = scaled.astype(np.promote_types(scaled.dtype, dtype), copy=False)
~/anaconda3/lib/python3.7/site-packages/nibabel/volumeutils.py in apply_read_scaling(arr, slope, inter)
962 inter = inter.astype(ftype)
963 if slope != 1.0:
--> 964 arr = arr * slope
965 if inter != 0.0:
966 arr = arr + inter
TypeError: invalid type promotion
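The failure comes from arr * slope in apply_read_scaling: numpy has no promotion rule between the structured RGB dtype and a floating point slope. A stripped-down reproduction outside nibabel (the exact TypeError message varies across numpy versions):

import numpy as np
rgb = np.zeros((5, 5, 5), dtype=[('R', 'u1'), ('G', 'u1'), ('B', 'u1')])
rgb * 5.0  # raises TypeError: no common type between a structured dtype and a float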
It's not really clear to me that we should accept scale factors. I don't know whether we'd interpret them as an alternative to scaling to 0-1 or as a second scaling. It might be worth creating a few test images and seeing what other tools make of them.
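For anyone who wants to try that, a rough sketch of writing a small RGB test volume to disk; the filename and the per-axis gradients are arbitrary, just enough to make the channels distinguishable in a viewer:

import numpy as np
import nibabel as nb

rgb_dtype = np.dtype([('R', 'u1'), ('G', 'u1'), ('B', 'u1')])
data = np.zeros((16, 16, 16), dtype=rgb_dtype)
ramp = (np.arange(16) * 16).astype('u1')
data['R'] = ramp[:, None, None]  # red varies along the first axis
data['G'] = ramp[None, :, None]  # green along the second
data['B'] = ramp[None, None, :]  # blue along the third
img = nb.Nifti1Image(data, np.eye(4))
# Scale factors could be set on img.header here (as in the snippet above)
# before saving, to see how other viewers treat them.
nb.save(img, 'rgb_test.nii.gz')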