Calculate Euler numbers and place them in the sidecar json of surfaces.
This issue entails:
- Sending a PR to nipype with an interface for `mris_euler_number` (we could have it first in niworkflows if that eases the process).
- Adding a node to smriprep.
Ref: http://freesurfer.net/fswiki/mris_euler_number
@oesteban what did you mean by

> sidecar json of surfaces
That we could write out the euler number as part of the json corresponding to the surf.gii outputs.
Do those JSONs currently exist? I'm not seeing them in my fmriprep/ outputs...
Nope, but adding the metadata to the DataSink (perhaps you need to declare the JSON key in the interface initialization) will automatically generate the sidecars.
Example:
https://github.com/poldracklab/smriprep/blob/8535ce9e568a603ddfee2080269d4403ca1fec8a/smriprep/workflows/outputs.py#L201
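For illustration only, here is roughly what writing such a sidecar by hand could look like. The function name `write_euler_sidecar` and the metadata keys `EulerNumber`/`SurfaceHoles` are hypothetical (not established BIDS terms); in practice the DerivativesDataSink machinery linked above would generate the file:

```python
import json
from pathlib import Path

def write_euler_sidecar(surf_path, euler, holes):
    """Write a sidecar JSON next to a .surf.gii file.

    Illustrative sketch only: in sMRIPrep this would go through the
    DerivativesDataSink shown in the linked example, and the metadata
    keys below are hypothetical, not established BIDS terms.
    """
    surf_path = Path(surf_path)
    name = surf_path.name
    # Replace the full ".surf.gii" extension, not just ".gii"
    if name.endswith(".surf.gii"):
        name = name[: -len(".surf.gii")] + ".json"
    else:
        name = surf_path.stem + ".json"
    sidecar = surf_path.with_name(name)
    sidecar.write_text(
        json.dumps({"EulerNumber": euler, "SurfaceHoles": holes}, indent=2)
    )
    return sidecar
```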
This is the paper where the Euler number is proposed as a quality metric - https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5856621/
Thanks @oesteban. Reading through the paper, the exact approach is unclear, but the following passage makes it seem likely that the Euler number was parsed from existing FreeSurfer outputs rather than recalculated:

> However, given the widespread popularity of the FreeSurfer platform, it is also quite likely that many investigators have already calculated the Euler number for much of their data, allowing for immediate use in ongoing studies.
The information is definitely there to be parsed:
$ grep -A 2 "Computing euler" $DATASET/freesurfer/sub-02/scripts/recon-all.log
Computing euler number
orig.nofix lheno = -16, rheno = -10
orig.nofix lhholes = 9, rhholes = 6
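Those log lines are straightforward to parse with a small helper. This is a sketch based only on the sample above: the regexes and the function name `parse_recon_all_log` are my assumptions, not part of any recon-all or nipype API:

```python
import re

# Matches the "Computing euler number" block as it appears in the
# recon-all.log sample above; format assumptions based on that sample.
EULER_RE = re.compile(r"orig\.nofix lheno\s*=\s*(-?\d+),\s*rheno\s*=\s*(-?\d+)")
HOLES_RE = re.compile(r"orig\.nofix lhholes\s*=\s*(-?\d+),\s*rhholes\s*=\s*(-?\d+)")

def parse_recon_all_log(text):
    """Return per-hemisphere Euler numbers and hole counts, or None."""
    euler = EULER_RE.search(text)
    holes = HOLES_RE.search(text)
    if not (euler and holes):
        return None
    return {
        "lh_euler": int(euler.group(1)),
        "rh_euler": int(euler.group(2)),
        "lh_holes": int(holes.group(1)),
        "rh_holes": int(holes.group(2)),
    }
```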
Also, to belabor the point:
$ mris_euler_number $DATASET/freesurfer/sub-02/surf/lh.orig.nofix
euler # = v-e+f = 2g-2: 153126 - 459426 + 306284 = -16 --> 9 holes
F =2V-4: 306284 != 306252-4 (-36)
2E=3F: 918852 = 918852 (0)
total defect index = 18
$ mris_euler_number $DATASET/freesurfer/sub-02/surf/lh.orig
euler # = v-e+f = 2g-2: 151958 - 455868 + 303912 = 2 --> 0 holes
F =2V-4: 303912 = 303916-4 (0)
2E=3F: 911736 = 911736 (0)
total defect index = 0
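The stdout of `mris_euler_number` can be parsed the same way. This sketch (function name and regex are assumptions based only on the output shown above) recovers the Euler number and hole count from the first line, and sanity-checks that the printed value equals v - e + f and that holes = (2 - euler) / 2:

```python
import re

# Matches the first output line of mris_euler_number as shown above, e.g.
# "euler # = v-e+f = 2g-2: 153126 - 459426 + 306284 = -16 --> 9 holes"
LINE_RE = re.compile(
    r"euler # = v-e\+f = 2g-2: (\d+) - (\d+) \+ (\d+) = (-?\d+) --> (\d+) holes"
)

def parse_euler_output(text):
    """Return the Euler number and hole count, or None if not found."""
    m = LINE_RE.search(text)
    if m is None:
        return None
    v, e, f, euler, holes = (int(g) for g in m.groups())
    # Consistency checks: euler characteristic is v - e + f, and for a
    # closed surface euler = 2 - 2*holes, so holes = (2 - euler) / 2.
    assert euler == v - e + f
    assert holes == (2 - euler) // 2
    return {"euler": euler, "holes": holes}
```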
There is a lot of room for quality-indicative information in `?h.orig.nofix`. After topological correction, only the most catastrophically bad data could avoid being corrected to 0 defects, so keeping only the corrected surface amounts to pretty lossy compression, going from "how bad?" to "broken [y/n]".
I see @sattertt is the corresponding author for Rosen, Roalf, et al., 2019, so possibly he could confirm the approach used (or flag down somebody who can).