heudiconv
nibabel can't read my data and heudiconv doesn't explain why
Summary
When I run
docker run -v $PWD:/base nipy/heudiconv \
-d "/base/Data/{subject}-{session}/Hopkins/dicom/*/*.dcm" \
-o /base/Nifti/ -f convertall -s 01 -ss 001 -c none --overwrite
I get the error `nibabel.nicom.dicomwrappers.WrapperError: File contains more than one StackID. Cannot handle multi-stack files`. Would anyone know what a StackID even is? I can't find much about it in the heudiconv docs or elsewhere online. The stack trace indicates the problem occurs on line 105 in the heudiconv source (dicoms.py), where a try/except block catches only KeyError. So, I tried catching all errors by using sed to remove KeyError:
docker run -v $PWD:/base --entrypoint="" nipy/heudiconv bash -c \
"sed -i 's/ KeyError//g' /src/heudiconv/heudiconv/dicoms.py && heudiconv -d '/base/Data/{subject}-{session}/Hopkins/dicom/*/*.dcm' -o /base/Nifti/ -f convertall -s 01 -ss 001 -c none --overwrite"
which does bypass the error in that particular spot but moves the problem downstream so it simply fails a bit later in the run. I’ve never seen this issue with other datasets I’ve converted, so I assume the issue is actually with the data, not the code, but I have no idea what it could be. Does anyone else? Ultimately I hope to submit a PR that catches the nibabel exception and raises our own exception with more of an explanation.
Platform details:
Choose one:
- [ ] Local environment
- [x] Container: nipy/heudiconv (latest tag as of 5/22/20)
- Heudiconv version: 0.8.0
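For the PR I mention above, the fix I have in mind would look roughly like this hedged sketch (the names `ConversionError`, `wrap_dicom_load`, and `load_dicom` are hypothetical stand-ins, not actual heudiconv or nibabel API): catch the nibabel exception and re-raise something more explanatory.

```python
# Hypothetical sketch: wrap the nibabel load call and re-raise multi-stack
# failures with a hint about enhanced multi-frame DICOMs. `load_dicom` stands
# in for whatever nibabel call heudiconv makes around dicoms.py line 105.

class ConversionError(Exception):
    """Hypothetical heudiconv-side exception with a more helpful message."""

def wrap_dicom_load(load_dicom, path):
    try:
        return load_dicom(path)
    except Exception as exc:  # nibabel's WrapperError is an Exception subclass
        if "StackID" in str(exc):
            raise ConversionError(
                f"{path} looks like a multi-stack (enhanced) DICOM, which "
                "nibabel cannot read; consider converting it to single-frame "
                "DICOMs first (e.g. with dcm4che's emf2sf)."
            ) from exc
        raise
```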
I am away from my laptop ATM, but does it work if you try with --minmeta (check --help for the correct spelling)? Any chance you could share a sample of such dicoms?
I tried again with the --minmeta option (the help confirms that spelling), but got the same multi-stack files error. I will ask again, but unfortunately I almost certainly cannot share any data.
Interestingly, my dicom directories contain files prefixed with MRe, PSg, or RAW. Do these prefixes mean anything to you? When I convert just the MRe files, I get the multi-stack files error. When I convert just the PSg-prefixed files,
docker run -v $PWD:/base nipy/heudiconv -d \
"/base/Data/{subject}-{session}/Hopkins/dicom/*/PSg.*.dcm" \
-o /base/Nifti/ -f convertall -s 01 -ss 001 -c none --overwrite --minmeta
I instead get an error about conflicting study identifiers, though this is an error I've actually seen before and believe I know how to solve. When I convert just the RAW-prefixed files,
docker run -v $PWD:/base nipy/heudiconv -d \
"/base/Data/{subject}-{session}/Hopkins/dicom/*/RAW.*.dcm" \
-o /base/Nifti/ -f convertall -s 01 -ss 001 -c none --overwrite --minmeta
it actually works, but then the generated Nifti/.heudiconv/01/info/dicominfo_ses-001.tsv file is empty (just a header).
@Terf nibabel's MultiframeWrapper implementation currently doesn't support multiple DICOM stacks within the same multiframe image https://github.com/nipy/nibabel/blob/b2a88b816eb74074d9c47e17e3ef68e29c0cdc04/nibabel/nicom/dicomwrappers.py#L498-L500
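To make the limitation concrete: an enhanced multi-frame DICOM carries per-frame metadata, and nibabel refuses the file when the frames hold more than one distinct StackID. A rough pure-Python approximation of that check (frames are modeled as plain dicts so no DICOM library is needed; in a real file the values live under PerFrameFunctionalGroupsSequence → FrameContentSequence):

```python
def distinct_stack_ids(frames):
    """Return the set of StackID values across per-frame functional groups.

    `frames` stands in for the items of PerFrameFunctionalGroupsSequence;
    each is a dict whose FrameContentSequence-like entry holds a StackID.
    """
    return {f["FrameContentSequence"][0]["StackID"] for f in frames}

def is_multi_stack(frames):
    # This is the condition under which nibabel's MultiframeWrapper
    # raises WrapperError instead of assembling the volume.
    return len(distinct_stack_ids(frames)) > 1
```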
If you haven't already, I would try converting the DICOMs with dcm2niix directly.
Is there any way I can convert my data so nibabel will read it and I can use heudiconv? Or is there a way to get heudiconv to skip over files it isn't able to read and convert the rest?
When I use dcm2niix it seems to work (it generates niftis) but at the end prints, several times:
Unsupported transfer syntax '1.2.840.10008.1.2.1.99' (see www.nitrc.org/plugins/mwiki/index.php/dcm2nii:MainPage)
Does this mean some of the DICOMs could not actually be converted? nitrc.org lists that exact transfer syntax and links to a tool (dcmconv) that can apparently convert DICOM file encoding. If I were to get rid of the "Unsupported transfer syntax" dcm2niix warnings, would that solve the multi-stack DICOM issue, or are those separate issues?
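For reference, 1.2.840.10008.1.2.1.99 is the Deflated Explicit VR Little Endian transfer syntax, meaning the dataset after the file meta group is zlib-deflated, which is exactly the kind of encoding dcmconv can rewrite. A tiny illustrative lookup table (the UID names come from the DICOM standard; this is not a complete registry):

```python
# A few DICOM transfer syntax UIDs and their standard names, for decoding
# messages like the dcm2niix warning above. Illustrative subset only.
TRANSFER_SYNTAXES = {
    "1.2.840.10008.1.2": "Implicit VR Little Endian",
    "1.2.840.10008.1.2.1": "Explicit VR Little Endian",
    "1.2.840.10008.1.2.1.99": "Deflated Explicit VR Little Endian",
    "1.2.840.10008.1.2.4.70": "JPEG Lossless (Process 14, SV1)",
}

def describe_transfer_syntax(uid):
    """Return a human-readable name for a transfer syntax UID, if known."""
    return TRANSFER_SYNTAXES.get(uid, "not in this table: " + uid)
```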
Unfortunately they're separate. BTW, what scanner/sequences are those?
I will check on Monday to see what more I'm allowed to share about my data, but in short these particular DICOMs are from a Philips scanner. However, my dataset as a whole contains scans from multiple sites that each use a different scanner -- a focus of my lab's research is actually to quantify differences between sites/scanners. Is it possible to combine all my data into a single BIDS dataset, or would I be better off creating a distinct BIDS dataset for every site?
As I remember it - dcmstack can read multiframe DICOMs at least - will it read your DICOMs?
@matthew-brett dcmstack does not seem to be able to read the entire directory of DICOMs.
tim@timbotron ~ % docker run -v /Volumes/2tb/takim/mscamras/Data/01-001/Hopkins/dicom/xxxxx:/data -ti --entrypoint="" nipy/heudiconv bash
root@153b6ab3b348:/# git clone https://github.com/moloney/dcmstack.git
[abbreviated output]
root@153b6ab3b348:/# cd dcmstack
root@153b6ab3b348:/dcmstack# python setup.py install
[abbreviated output]
root@153b6ab3b348:/dcmstack# python
Python 3.6.10 |Anaconda, Inc.| (default, Mar 23 2020, 23:13:11)
[GCC 7.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import dcmstack
>>> from glob import glob
>>> src_dcms = glob("/data/*.dcm")
>>> stacks = dcmstack.parse_and_stack(src_dcms)
/opt/miniconda-latest/lib/python3.6/site-packages/dcmstack/dcmstack.py:1122: UserWarning: Skipping non-image data set: /data/PSg.xxxx.dcm
warnings.warn("Skipping non-image data set: %s" % dcm_path)
[the same warning repeats for each remaining PSg file]
/opt/miniconda-latest/lib/python3.6/site-packages/dcmstack/dcmstack.py:1122: UserWarning: Skipping non-image data set: /data/RAW.xxxx.dcm
warnings.warn("Skipping non-image data set: %s" % dcm_path)
[the same warning repeats for each remaining RAW file]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/miniconda-latest/lib/python3.6/site-packages/dcmstack/dcmstack.py", line 1226, in parse_and_stack
results[key] = stack_group(group, warn_on_except, **stack_args)
File "/opt/miniconda-latest/lib/python3.6/site-packages/dcmstack/dcmstack.py", line 1180, in stack_group
result.add_dcm(dcm, meta)
File "/opt/miniconda-latest/lib/python3.6/site-packages/dcmstack/dcmstack.py", line 605, in add_dcm
nii_wrp = NiftiWrapper.from_dicom_wrapper(dw, meta)
File "/opt/miniconda-latest/lib/python3.6/site-packages/dcmstack/dcmmeta.py", line 1561, in from_dicom_wrapper
result = klass(nii_img, make_empty=True)
File "/opt/miniconda-latest/lib/python3.6/site-packages/dcmstack/dcmmeta.py", line 1280, in __init__
self.meta_ext.check_valid()
File "/opt/miniconda-latest/lib/python3.6/site-packages/dcmstack/dcmmeta.py", line 307, in check_valid
'classification %s' % classes[0])
dcmstack.dcmmeta.InvalidExtensionError: The extension is not valid: Missing required base classification time
@yarikoptic @matthew-brett @mgxd It seems Philips scanners have a variety of output options, one of which is a sort of "enhanced" dicom that causes this issue. My enhanced dicoms were additionally compressed (hence the unsupported transfer syntax), so I first used dcmconv to decompress and then used the emf2sf tool from dcm4che to convert to standard dicoms.
apt-get update && apt-get install dcmtk
# only the MRe-prefixed dicoms are necessary, the PSg and RAW-prefixed files are Philips-specific files that are kept to recreate the scanning environment
dcmconv /base/Data/01-001/Hopkins/dicom/std_redacted/MRe.redacted.dcm ./out.dcm
docker run -ti -v $PWD:/base --entrypoint="" dcm4che/dcm4che-tools \
emf2sf --out-file test.dcm /base/out.dcm
I can then run heudiconv on these converted dicoms.
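Since the commands above convert one file at a time, a batching helper could build the same two-step pipeline per MRe file. This is a hypothetical sketch of mine (the function and output naming are not part of heudiconv, dcmtk, or dcm4che; it only constructs the dcmconv/emf2sf invocations shown above without running them):

```python
from pathlib import Path

def conversion_commands(src, workdir):
    """Build the decompress + split commands for one enhanced DICOM.

    Mirrors the manual workflow above: dcmconv (dcmtk) to decompress the
    deflated transfer syntax, then emf2sf (dcm4che) to split the enhanced
    multi-frame file into standard single-frame DICOMs.
    """
    src = Path(src)
    decompressed = Path(workdir) / (src.stem + "_decompressed.dcm")
    singleframe = Path(workdir) / (src.stem + "_sf.dcm")
    return [
        ["dcmconv", str(src), str(decompressed)],
        ["emf2sf", "--out-file", str(singleframe), str(decompressed)],
    ]
```

Each returned list could then be handed to `subprocess.run` in a loop over `glob("**/MRe.*.dcm")`.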
So this issue is data-specific and somewhat separate from heudiconv's functionality; however, it would be nice if the nibabel multi-frame exception could be caught and a more explanatory exception raised, describing how to convert enhanced dicoms to standard ones. Also, why does heudiconv not report the same unsupported transfer syntax warning that dcm2niix does?
At the very least, I would like to submit a PR that makes the unsupported transfer syntax more explicit and gives a more verbose explanation of the multi-frame/enhanced dicom issue. Depending on how big of a priority container size is for you, I think it would be very useful to include at least dcmconv and perhaps the larger dcm4che library in the heudiconv container so the solution can be fully automated. If you're on board with that idea, I'd love to submit a PR.
My PI says I can't share these particular data, but if it would help test an automated solution, I have other scans of an inanimate object (a water bottle actually) I can publicly share which came from the same scanner so will have the same issues.
I love bottles!