heudiconv
enh: add support for PAR/REC files
Original thread on Neurostars.
Since dcm2niix already supports this, we would need to add either a PAR/REC flag or try/except blocks to avoid DICOM operations.
EDIT: perhaps we can use nibabel's parrec.
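For context, nibabel's parrec module can already read these files. A minimal sketch (the filename is a placeholder, and it assumes the matching .REC sits next to the .PAR):

```python
import nibabel as nib

img = nib.load('example.PAR')   # nibabel pairs the .PAR with its .REC automatically
hdr = img.header                # a nibabel.parrec.PARRECHeader

print(sorted(hdr.general_info))      # scan-level fields parsed from the PAR header
print(img.shape, hdr.get_zooms())    # image dimensions and voxel/temporal sizes
```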
Still just trying to understand how this code fits together. Does the following sound about right? group_dicoms_into_seqinfos is around where some modifications would be needed. The PAR files are getting passed correctly to that function, but the call to dcm.read_file isn't set up for the PAR/REC specification. For DICOM files, read_file returns a dicom.dataset.FileDataset with a bunch of header information for the image, which is used for building up seqinfo. When applied to a PAR file, read_file just returns a FileDataset with a single line of private tag data, and seqinfo ends up empty when group_dicoms_into_seqinfos returns.
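For what it's worth, one hypothetical way to sidestep that pydicom call for PAR files would be a small dispatcher that routes on file extension (this is only a sketch, not heudiconv code; the helper name is made up):

```python
import os

import nibabel as nib
import pydicom


def load_scan_header(fname):
    """Hypothetical dispatcher: send PAR files to nibabel and everything else
    down the existing pydicom path."""
    if os.path.splitext(fname)[1].lower() == '.par':
        # A PARRECHeader exposes general_info / image_defs rather than DICOM tags
        return nib.load(fname).header
    # stop_before_pixels skips pixel data, since only header fields are needed here
    return pydicom.dcmread(fname, stop_before_pixels=True)
```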
@mgxd, you're suggesting to skip the loading by pydicom (and conversion into a Wrapper) for PAR/REC files. Instead, when handling PAR/REC files, build seqinfo in some separate way (probably relying on nibabel.parrec).
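As a rough illustration of relying on nibabel.parrec, something like the sketch below could pull a few seqinfo-style fields straight from a PAR header. The function name, the plain dict, and the chosen fields are all illustrative; this is not heudiconv's actual SeqInfo:

```python
import nibabel as nib


def parrec_series_info(par_fname):
    """Illustrative only: extract a few seqinfo-like attributes from a PAR header
    with nibabel instead of going through pydicom wrappers."""
    hdr = nib.load(par_fname).header    # nibabel.parrec.PARRECHeader
    return {
        'protocol_name': hdr.general_info['protocol_name'],
        'dims': hdr.get_data_shape(),   # (x, y, slices[, dynamics])
        'zooms': hdr.get_zooms(),       # spatial (and possibly temporal) sizes
    }
```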
@psadil yes, group_dicoms_into_seqinfos will need to be altered (or avoided within the pipeline altogether) when PAR/REC files are passed - maybe by adding a new command-line argument --parrec that diverges from the traditional DICOM route here and instead calls a new method specialized for handling these files. WDYT?
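A minimal standalone sketch of what that flag could look like (argparse only, not heudiconv's actual parser):

```python
from argparse import ArgumentParser

# Standalone sketch of the proposed flag; heudiconv's real parser is more involved.
parser = ArgumentParser(prog='heudiconv-sketch')
parser.add_argument('--parrec', action='store_true',
                    help='treat input files as Philips PAR/REC rather than DICOM')

args = parser.parse_args(['--parrec'])
if args.parrec:
    print('would branch into a PAR/REC-specific seqinfo builder here')
else:
    print('would follow the existing DICOM route')
```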
Sounds good, I'll go with the --parrec argument.
Just as a heads up, I won't be fast at this (end-of-semester season + slow Python coder), but so long as this doesn't need to be finished immediately (read: within the next couple of weeks) I'd be happy to work on it.
Finally have a bit of time to sit and work on this. Looks like the simplest workaround would be to create a temporary .nii file with nibabel's parrec2nii, load that as a dicom.dataset, and then delete the .nii. How does that sound?
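Roughly, the conversion step could look like this (a sketch assuming nibabel is installed and example.PAR/example.REC exist; parrec2nii is nibabel's command-line converter, and nibabel can do the same conversion in-process):

```python
import os
import tempfile

import nibabel as nib

par_img = nib.load('example.PAR')            # placeholder filename

tmpdir = tempfile.mkdtemp()
tmp_nii = os.path.join(tmpdir, 'example.nii')
nib.save(par_img, tmp_nii)                   # nibabel converts PAR/REC to NIfTI on save

# ... hand tmp_nii to the downstream step here ...

os.remove(tmp_nii)
os.rmdir(tmpdir)
```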
Also, although pip2 install heudiconv worked, I was having trouble running make install on a newly cloned directory. make install just produced:
mkdir -p /usr/local/share/heudiconv/heuristics
mkdir -p /usr/local/share/doc/heudiconv/examples/heuristics
mkdir -p /usr/local/bin
install bin/heudiconv /usr/local/bin
install: cannot stat 'bin/heudiconv': No such file or directory
Makefile:6: recipe for target 'install' failed
make: *** [install] Error 1
Is there something else I should have installed first?
Forget about make install - it is outdated; we or someone needs to fix it up.
Hi! Is someone still working on this? We have a large project and we are very eager for a solution, as a partner has data in PAR/REC format (the rest is in NIfTI).
Handling this ended up being fairly far above my head. Sorry to have snagged it and then silently thrown up my hands! I'm not aware of anyone else working on it.
I get that! This is far above my skill level also, was just hoping!
Would also love to have PAR/REC supported in heudiconv; we have a lot of datasets in this format. Am I right that no one is actively working on this anymore? I would be happy to contribute, but I'm not very experienced with Python. @mgxd or @psadil, do you have an idea where to continue?
PS: For anyone looking for a PAR/REC to BIDS converter, we are using bidsify (https://github.com/NILAB-UvA/bidsify) now. But it would still be nice to have this feature in heudiconv.
@eduardklap I don't believe anyone is currently working on this issue.
The way I see this implementation going is by adding a function that mimics https://github.com/nipy/heudiconv/blob/6b30c75333d3ee75aca8ae07026b8d034d765adb/heudiconv/dicoms.py#L13 (groups scans into series, extracts series information, concatenates all series information) - this will require a bit of work. If this is something you'd like to work towards implementing, feel free to open a draft pull request and ping us whenever you get stuck!
In the meantime, thank you for sharing the link.
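For anyone picking this up, a very rough skeleton of such a function might look like the following. The name, the plain dicts, and the grouping key are all hypothetical; the real version would need to populate heudiconv's full SeqInfo tuple:

```python
from collections import OrderedDict

import nibabel as nib


def group_parrecs_into_seqinfos(par_files):
    """Hypothetical counterpart to group_dicoms_into_seqinfos for PAR/REC input:
    group files into series and record a few series-level attributes."""
    seqinfos = OrderedDict()
    for fname in sorted(par_files):
        hdr = nib.load(fname).header
        series_id = hdr.general_info['protocol_name']   # crude grouping key
        seqinfos.setdefault(series_id, []).append({
            'example_file': fname,
            'dims': hdr.get_data_shape(),
        })
    return seqinfos
```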
Good to know, thanks! I will have a look in the coming weeks and see what I can do.