Utilising s2v parameters in eddy_config.json / Add slice groups eddy
Hi,
I am trying to apply slice-to-volume (s2v) correction using eddy. However, the required s2v attributes are rejected when passed via the eddy_params.json file.
My eddy_params.json file:
{ "flm": "linear", "slm": "linear", "fep": false, "interp": "spline", "nvoxhp": 1000, "fudge_factor": 10, "dont_sep_offs_move": false, "dont_peas": false, "niter": 5, "method": "jac", "repol": true, "num_threads": 1, "is_shelled": true, "use_cuda": true, "cnr_maps": true, "residuals": false, "output_type": "NIFTI_GZ", "args": "", "mporder": 20, "s2v_niter": 10, "s2v_lambda": 5, "s2v_interp": "trilinear", "slspec": "data/slspec.txt" }
and I am calling qsiprep using:
singularity run --cleanenv --bind data:/data qsiprep-0.11.0.sif \
    --participant-label sub-003 \
    --use-syn-sdc \
    --force-syn \
    --output-resolution 2 \
    --skip_bids_validation \
    --hmc_model eddy \
    --eddy-config data/eddy_params.json \
    --fs-license-file data/license.txt \
    -w data/qsiprep_work \
    data/rawdata \
    data/derivatives \
    participant
The error is:
Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.7/multiprocessing/process.py", line 297, in _bootstrap
    self.run()
  File "/usr/local/miniconda/lib/python3.7/multiprocessing/process.py", line 99, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/local/miniconda/lib/python3.7/site-packages/qsiprep/cli/run.py", line 908, in build_qsiprep_workflow
    force_syn=opts.force_syn
  File "/usr/local/miniconda/lib/python3.7/site-packages/qsiprep/workflows/base.py", line 261, in init_qsiprep_wf
    force_syn=force_syn)
  File "/usr/local/miniconda/lib/python3.7/site-packages/qsiprep/workflows/base.py", line 692, in init_single_subject_wf
    source_file=source_file
  File "/usr/local/miniconda/lib/python3.7/site-packages/qsiprep/workflows/dwi/base.py", line 372, in init_dwi_preproc_wf
    name="hmc_sdc_wf")
  File "/usr/local/miniconda/lib/python3.7/site-packages/qsiprep/workflows/dwi/fsl.py", line 128, in init_fsl_hmc_wf
    eddy = pe.Node(ExtendedEddy(**eddy_args), name="eddy")
  File "/usr/local/miniconda/lib/python3.7/site-packages/qsiprep/interfaces/eddy.py", line 135, in __init__
    super(ExtendedEddy, self).__init__(**inputs)
  File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/interfaces/fsl/epi.py", line 963, in __init__
    super(Eddy, self).__init__(**inputs)
  File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/interfaces/fsl/base.py", line 174, in __init__
    super(FSLCommand, self).__init__(**inputs)
  File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/interfaces/base/core.py", line 673, in __init__
    super(CommandLine, self).__init__(**inputs)
  File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/interfaces/base/core.py", line 183, in __init__
    self.inputs = self.input_spec(**inputs)
  File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/interfaces/base/specs.py", line 66, in __init__
    super(BaseTraitedSpec, self).__init__(**kwargs)
traits.trait_errors.TraitError: Cannot set the undefined 's2v_niter' attribute of a 'EddyInputSpec' object.
Thanks!
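One way to check, before launching a run, which keys a given qsiprep build will accept in the eddy config is to inspect the interface's input spec inside the container. A minimal sketch in Python (the import path is the one shown in the traceback; trait_names() also lists a few built-in traits):

# Sketch: list every input the installed ExtendedEddy interface defines,
# so a key such as 's2v_niter' can be checked before adding it to
# eddy_params.json. Run inside the qsiprep container's Python.
from qsiprep.interfaces.eddy import ExtendedEddy

spec = ExtendedEddy.input_spec()
print(sorted(spec.trait_names()))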
@sidchop thanks for pointing this out! We're working on better supporting s2v and other GPU-based eddy features #181 and will add a fix for this in that PR
That's awesome, thanks! In the meantime, is there a way to turn eddy correction "off", so that pre-eddied data (processed with custom parameters not available in qsiprep) can be entered into the pipeline?
Hi, along the same lines, I would like to use the multiband group options within eddy, but I keep failing:
I use qsiprep v0.14.3 on an HPC cluster with Singularity.
Here is the eddy_params.json I used:
{
"flm": "quadratic",
"slm": "linear",
"fep": true,
"interp": "spline",
"nvoxhp": 1000,
"fudge_factor": 10,
"dont_sep_offs_move": false,
"dont_peas": true,
"niter": 5,
"method": "jac",
"repol": true,
"num_threads": 1,
"is_shelled": true,
"use_cuda": true,
"cnr_maps": true,
"residuals": true,
"output_type": "NIFTI_GZ",
"args": "", "estimate_move_by_susceptibility": true, "mporder": 4, "multiband_factor":2, "outlier_type" : "gw", "multiband_offset":0
}
Here is the relevant error message I got:
[Node] Error on "qsiprep_wf.single_subject_Hero_wf.dwi_preproc_ses_J0_acq_1mm_wf.hmc_sdc_wf.eddy" (/work/temp_data_ProtIsPrim/qsiprep_wf/single_subject_Hero_wf/dwi_preproc_ses_J0_acq_1mm_wf/hmc_sdc_wf/eddy)
211205-00:40:19,322 nipype.workflow ERROR:
Node eddy failed to run on host gpu008.cluster.
211205-00:40:19,331 nipype.workflow ERROR:
Saving crash info to /work/ProtIsPrim/derivatives/qsiprep/qsiprep/sub-Hero/log/20211205-003817_a4310145-2cf0-4c6d-983e-7eaa493b6d49/crash-20211205-004019-jsein-eddy-6336f3cc-f48e-40db-9a07-058a76cb8650.txt
Traceback (most recent call last):
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
result["result"] = node.run(updatehash=updatehash)
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 516, in run
result = self._run_interface(execute=True)
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 635, in _run_interface
return self._run_command(execute)
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/pipeline/engine/nodes.py", line 723, in _run_command
cmd = self._interface.cmdline
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/interfaces/base/core.py", line 732, in cmdline
allargs = [self._cmd_prefix + self.cmd] + self._parse_inputs()
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/interfaces/base/core.py", line 988, in _parse_inputs
arg = self._format_arg(name, spec, value)
File "/usr/local/miniconda/lib/python3.7/site-packages/qsiprep/interfaces/eddy.py", line 231, in _format_arg
return super(ExtendedEddy, self)._format_arg(name, spec, value)
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/interfaces/fsl/epi.py", line 1006, in _format_arg
return super(Eddy, self)._format_arg(name, spec, value)
File "/usr/local/miniconda/lib/python3.7/site-packages/nipype/interfaces/base/core.py", line 862, in _format_arg
return argstr % value
TypeError: not all arguments converted during string formatting
211205-00:40:29,329 nipype.workflow ERROR:
could not run node: qsiprep_wf.single_subject_Hero_wf.dwi_preproc_ses_J0_acq_1mm_wf.hmc_sdc_wf.eddy
If I remove
"outlier_type": "gw", "multiband_offset": 0
from the eddy_params.json, the execution goes through. What am I missing here?
Thank you for your help!
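The TypeError at the bottom of the traceback comes from nipype substituting the option value into its argstr format string. A minimal sketch of that failure mode in plain Python, on the assumption (inferred from the traceback, not confirmed in the qsiprep source) that the argstr for these traits was missing its placeholder in this version:

# Hypothetical argstr missing its "%s" placeholder: applying "%" to a
# format string with no conversion specifier raises exactly this error.
argstr, value = "--ol_type", "gw"
try:
    print(argstr % value)
except TypeError as exc:
    print(exc)  # "not all arguments converted during string formatting"

# With a placeholder, the same substitution works as intended.
print("--ol_type=%s" % value)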
Hi,
I am also trying slice-to-volume correction, and it seems my slspec.txt file is not being found for use in the s2v correction.
I am using this command (in Docker) to run QSIprep v0.15.3:
/usr/local/pipelines/qsiprep -i /MRI_DATA/raw_data -o /MRI_DATA/derivatives -w /MRI_DATA/dtitesting -l /MRI_DATA/dtitesting/derivatives/qsilog.log -a "--output-resolution 1.71 --separate-all-dwis --eddy-config /workdir/eddy_params.json --unringing-method none --denoise-method dwidenoise --participant_label sub-test01"
My eddy_params.json file looks like this:
{
"flm": "linear",
"slm": "linear",
"fep": false,
"interp": "spline",
"nvoxhp": 1000,
"fudge_factor": 10,
"dont_sep_offs_move": false,
"dont_peas": false,
"niter": 5,
"method": "jac",
"repol": true,
"num_threads": 1,
"is_shelled": true,
"use_cuda": true,
"mporder": 6,
"multiband_factor": 3,
"cnr_maps": true,
"residuals": false,
"output_type": "NIFTI_GZ",
"args": "--slspec=/workdir/slspec.txt"
}
I have tried a number of different options for calling the slspec.txt file from within eddy_params.json, and have made sure the /workdir drive is correctly mounted in the Docker script (other files on the same drive, e.g. eddy_params.json, can be accessed within the qsiprep call).
The command does work if I exclude the args entry ("args": "--slspec=/workdir/slspec.txt") from eddy_params.json. However, other than the HTML boilerplate stating that s2v correction was used, I do not see any other output confirming it (i.e., I am not sure s2v was actually performed). Is there a default slice order that qsiprep uses when the slspec file is not found or specified? In my case there is no slice timing or order information in the dwi JSON.
Thank you very much!
Caitlin
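For what it's worth, eddy's s2v needs slice timing/grouping information from somewhere; when a sidecar does contain a SliceTiming field, an slspec.txt can be derived from it. A minimal sketch, assuming a hypothetical sidecar filename and BIDS-style SliceTiming values (both placeholders, not qsiprep outputs):

import json
import numpy as np

# Read BIDS-style slice timing from a (hypothetical) DWI sidecar.
with open("sub-test01_dwi.json") as f:
    timing = np.array(json.load(f)["SliceTiming"])

# Slices acquired at the same instant form one multiband group; write one
# row per excitation, in temporal order (np.unique sorts the unique times).
with open("slspec.txt", "w") as f:
    for t in np.unique(timing):
        row = np.flatnonzero(timing == t)
        f.write(" ".join(str(i) for i in row) + "\n")
# e.g. for 60 slices at MB=3, rows look like: "0 20 40", "10 30 50", ...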
@CaitlinLloyd here is what worked for me for slice-to-volume correction when calling qsiprep within Singularity:
singularity run --cleanenv -B /scratch/jsein/BIDS:/work \
--nv /scratch/jsein/my_images/qsiprep-0.16.0RC3.sif /work/$study \
/work/$study/derivatives participant --participant_label $sub \
-w /work/temp_data_${study} --output-resolution 1.2 --fs-license-file /work/freesurfer/license.txt \
--eddy-config /work/$study/derivatives/eddy_params.json \
--b0-threshold 50 --unringing-method mrdegibbs --denoise-method dwidenoise \
--output-space T1w template --template MNI152NLin2009cAsym --distortion-group-merge average
and this is the content of eddy_params.json
{
"flm": "quadratic",
"slm": "linear",
"fep": false,
"interp": "spline",
"nvoxhp": 1000,
"fudge_factor": 10,
"dont_sep_offs_move": false,
"dont_peas": false,
"niter": 5,
"method": "jac",
"repol": true,
"num_threads": 1,
"is_shelled": true,
"use_cuda": true,
"cnr_maps": true,
"residuals": true,
"output_type": "NIFTI_GZ",
"estimate_move_by_susceptibility": true,
"mporder": 8,
"slice_order": "/work/PREDYS/derivatives/slspec_PREDYS.txt",
"args": "--ol_nstd=5"
}
So it looks like the critical argument is "slice_order", where you can input your slspec.txt file.
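As a quick sanity check before a long run, one can verify that the slspec file covers every slice index exactly once; a small sketch (the path is a placeholder):

# Each slice index should appear exactly once across all rows of slspec.
rows = [line.split() for line in open("slspec.txt") if line.strip()]
indices = sorted(int(i) for row in rows for i in row)
assert indices == list(range(len(indices))), "slspec must list each slice exactly once"
print(len(rows), "excitation groups,", len(rows[0]), "slice(s) per group")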
Thank you so much for this. Unfortunately it is not working for me, even when I use the slice_order argument. The node input that is printed looks like this (which seems okay?):
args =
And then I get an error message, but I am not quite clear on what exactly is failing:
Traceback (most recent call last):
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 524, in run
    result = self._run_interface(execute=True)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 642, in _run_interface
    return self._run_command(execute)
  File "/usr/local/miniconda/lib/python3.8/site-packages/nipype/pipeline/engine/nodes.py", line 750, in _run_command
    raise NodeExecutionError(
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node eddy.
It looks like in_bval, in_bvec, and in_file are not populated in your case. Did topup execute correctly?
Here is the content of qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/eddy/_report/report.rst:
Node: single_subject_pilote2_wf (dwi_preproc_wf (hmc_sdc_wf (eddy (eddy)
========================================================================
Hierarchy : qsiprep_wf.single_subject_pilote2_wf.dwi_preproc_wf.hmc_sdc_wf.eddy
Exec ID : eddy
Original Inputs
---------------
* args : --ol_nstd=5
* cnr_maps : True
* dont_peas : False
* dont_sep_offs_move : False
* environ : {'FSLOUTPUTTYPE': 'NIFTI_GZ', 'OMP_NUM_THREADS': '8'}
* estimate_move_by_susceptibility : True
* fep : False
* field : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/topup/fieldmap_HZ.nii.gz
* field_mat : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/topup_to_eddy_reg/topup_reg_image_flirt.mat
* flm : quadratic
* fudge_factor : 10.0
* fwhm : <undefined>
* in_acqp : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/gather_inputs/eddy_acqp.txt
* in_bval : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/pre_hmc_wf/rpe_concat/merge__merged.bval
* in_bvec : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/pre_hmc_wf/rpe_concat/merge__merged.bvec
* in_file : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/pre_hmc_wf/rpe_concat/merge__merged.nii.gz
* in_index : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/gather_inputs/eddy_index.txt
* in_mask : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/pre_eddy_b0_ref_wf/enhance_and_mask_b0/topup_imain_corrected_avg_mask.nii.gz
* in_topup_fieldcoef : <undefined>
* in_topup_movpar : <undefined>
* initrand : <undefined>
* interp : spline
* is_shelled : True
* json : <undefined>
* mbs_ksp : <undefined>
* mbs_lambda : <undefined>
* mbs_niter : <undefined>
* method : jac
* mporder : 8
* multiband_factor : <undefined>
* multiband_offset : <undefined>
* niter : 5
* num_threads : 8
* nvoxhp : 1000
* out_base : eddy_corrected
* outlier_nstd : <undefined>
* outlier_nvox : <undefined>
* outlier_pos : <undefined>
* outlier_sqr : <undefined>
* outlier_type : <undefined>
* output_type : NIFTI_GZ
* repol : True
* residuals : True
* session : <undefined>
* slice2vol_interp : <undefined>
* slice2vol_lambda : <undefined>
* slice2vol_niter : <undefined>
* slice_order : /work/PREDYS/derivatives/slspec_PREDYS.txt
* slm : linear
* use_cuda : True
Execution Inputs
----------------
* args : --ol_nstd=5
* cnr_maps : True
* dont_peas : False
* dont_sep_offs_move : False
* environ : {'FSLOUTPUTTYPE': 'NIFTI_GZ', 'OMP_NUM_THREADS': '8'}
* estimate_move_by_susceptibility : True
* fep : False
* field : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/topup/fieldmap_HZ.nii.gz
* field_mat : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/topup_to_eddy_reg/topup_reg_image_flirt.mat
* flm : quadratic
* fudge_factor : 10.0
* fwhm : <undefined>
* in_acqp : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/gather_inputs/eddy_acqp.txt
* in_bval : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/pre_hmc_wf/rpe_concat/merge__merged.bval
* in_bvec : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/pre_hmc_wf/rpe_concat/merge__merged.bvec
* in_file : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/pre_hmc_wf/rpe_concat/merge__merged.nii.gz
* in_index : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/gather_inputs/eddy_index.txt
* in_mask : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/pre_eddy_b0_ref_wf/enhance_and_mask_b0/topup_imain_corrected_avg_mask.nii.gz
* in_topup_fieldcoef : <undefined>
* in_topup_movpar : <undefined>
* initrand : <undefined>
* interp : spline
* is_shelled : True
* json : <undefined>
* mbs_ksp : <undefined>
* mbs_lambda : <undefined>
* mbs_niter : <undefined>
* method : jac
* mporder : 8
* multiband_factor : <undefined>
* multiband_offset : <undefined>
* niter : 5
* num_threads : 8
* nvoxhp : 1000
* out_base : eddy_corrected
* outlier_nstd : <undefined>
* outlier_nvox : <undefined>
* outlier_pos : <undefined>
* outlier_sqr : <undefined>
* outlier_type : <undefined>
* output_type : NIFTI_GZ
* repol : True
* residuals : True
* session : <undefined>
* slice2vol_interp : <undefined>
* slice2vol_lambda : <undefined>
* slice2vol_niter : <undefined>
* slice_order : /work/PREDYS/derivatives/slspec_PREDYS.txt
* slm : linear
* use_cuda : True
Execution Outputs
-----------------
* out_cnr_maps : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/eddy/eddy_corrected.eddy_cnr_maps.nii.gz
* out_corrected : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/eddy/eddy_corrected.nii.gz
* out_movement_over_time : <undefined>
* out_movement_rms : <undefined>
* out_outlier_free : <undefined>
* out_outlier_map : <undefined>
* out_outlier_n_sqr_stdev_map : <undefined>
* out_outlier_n_stdev_map : <undefined>
* out_outlier_report : <undefined>
* out_parameter : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/eddy/eddy_corrected.eddy_parameters
* out_residuals : <undefined>
* out_restricted_movement_rms : <undefined>
* out_rotated_bvecs : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/eddy/eddy_corrected.eddy_rotated_bvecs
* out_shell_alignment_parameters : <undefined>
* out_shell_pe_translation_parameters : <undefined>
* outlier_free_data : <undefined>
* outlier_map : <undefined>
* outlier_n_sqr_stdev_map : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/eddy/eddy_corrected.eddy_outlier_n_sqr_stdev_map
* outlier_n_stdev_map : <undefined>
* shell_PE_translation_parameters : <undefined>
Runtime info
------------
* cmdline : eddy_cuda --ol_nstd=5 --cnr_maps --estimate_move_by_susceptibility --field=/work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/topup/fieldmap_HZ --field_mat=/work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/topup_to_eddy_reg/topup_reg_image_flirt.mat --flm=quadratic --ff=10.0 --acqp=/work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/gather_inputs/eddy_acqp.txt --bvals=/work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/pre_hmc_wf/rpe_concat/merge__merged.bval --bvecs=/work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/pre_hmc_wf/rpe_concat/merge__merged.bvec --imain=/work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/pre_hmc_wf/rpe_concat/merge__merged.nii.gz --index=/work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/gather_inputs/eddy_index.txt --mask=/work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/pre_eddy_b0_ref_wf/enhance_and_mask_b0/topup_imain_corrected_avg_mask.nii.gz --interp=spline --data_is_shelled --resamp=jac --mporder=8 --niter=5 --nvoxhp=1000 --out=/work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/eddy/eddy_corrected --repol --residuals --slspec=/work/PREDYS/derivatives/slspec_PREDYS.txt --slm=linear
* duration : 7069.33369
* hostname : gpu004.cluster
* prev_wd : /home/jsein
* working_dir : /work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/dwi_preproc_wf/hmc_sdc_wf/eddy
I hope it helps! You can check the command line for eddy to see what was really passed to the eddy command: see cmdline above under Runtime info.
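If you prefer to pull that line out programmatically rather than opening the file, a small sketch (the path is the report location quoted above):

# Print the eddy command actually executed, as recorded in report.rst.
report = ("/work/temp_data_PREDYS/qsiprep_wf/single_subject_pilote2_wf/"
          "dwi_preproc_wf/hmc_sdc_wf/eddy/_report/report.rst")
for line in open(report):
    if line.lstrip().startswith("* cmdline"):
        print(line.strip())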
Great spot, thanks! I think part of the problem may have been running with the --dwi_only flag, but it is strange that QSIprep is not picking up the path to the bval/bvec files. Is it correct that the acqparams file is created by QSIprep? Topup works fine when I do not try to run s2v.
Thank you again!
Yes, the acqparams file is created by QSIPREP. Apparently there is a bug when using the --dwi_only flag, but I think that only affects the QSIRECON part (https://github.com/PennLINC/qsiprep/issues/432). I was able to run qsiprep with the --dwi_only flag and get the s2v correction, if I recall correctly.