
midthickness node Crashing

Mahdieh-dst opened this issue 2 years ago • 8 comments

Describe the bug sMRIPrep crashed at the smriprep_wf.single_subject_DEFAULT_wf.anat_preproc_wf.surface_recon_wf.gifti_surface_wf.midthickness node.

What version of sMRIPrep are you running? 0.9.1

Exact command line executed

command_midthickness0:

mris_expand -pial /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0/lh.pial -thickness -thickness_name /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0/lh.thickness /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0/lh.smoothwm 0.5 midthickness

command_midthickness1:

mris_expand -pial /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness1/rh.pial -thickness -thickness_name /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness1/rh.thickness /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness1/rh.smoothwm 0.5 midthickness

Are you positive that the input dataset is BIDS-compliant? Yes

sMRIPrep feedback information Please attach the full log written to the standard output and the crashfile(s), if generated.

The errors showing up in the HTML report are:

Node Name: _midthickness0
File: /work/bids-app-output/smriprep/sub-DEFAULT/log/20220715-004154_effd4881-5a95-48f2-89f5-7b18c0463602/crash-20220715-050808-root-_midthickness0-9a195751-b496-4d61-8615-8808e8c71839.txt
Working Directory: /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0
Inputs:
• args:
• distance: 0.5
• dt:
• environ: {'SUBJECTS_DIR': '/opt/freesurfer/subjects'}
• graymid:
• in_file: /work/bids-app-output/freesurfer/sub-DEFAULT/surf/lh.smoothwm
• nsurfaces:
• out_name: midthickness
• pial:
• smooth_averages:
• sphere: sphere
• spring:
• subjects_dir: /opt/freesurfer/subjects
• thickness: True
• thickness_name:
• write_iterations:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 524, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 642, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 750, in _run_command
    raise NodeExecutionError(
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node _midthickness0.

RuntimeError: subprocess exited with code 134.

Node Name: _midthickness1
File: /work/bids-app-output/smriprep/sub-DEFAULT/log/20220715-004154_effd4881-5a95-48f2-89f5-7b18c0463602/crash-20220715-050810-root-_midthickness1-c7f57842-5855-4fe4-97bc-bb97b1c9fbf1.txt
Working Directory: /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness1
Inputs:
• args:
• distance: 0.5
• dt:
• environ: {'SUBJECTS_DIR': '/opt/freesurfer/subjects'}
• graymid:
• in_file: /work/bids-app-output/freesurfer/sub-DEFAULT/surf/rh.smoothwm
• nsurfaces:
• out_name: midthickness
• pial:
• smooth_averages:
• sphere: sphere
• spring:
• subjects_dir: /opt/freesurfer/subjects
• thickness: True
• thickness_name:
• write_iterations:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/plugins/multiproc.py", line 67, in run_node
    result["result"] = node.run(updatehash=updatehash)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 524, in run
    result = self._run_interface(execute=True)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 642, in _run_interface
    return self._run_command(execute)
  File "/opt/conda/lib/python3.9/site-packages/nipype/pipeline/engine/nodes.py", line 750, in _run_command
    raise NodeExecutionError(
nipype.pipeline.engine.nodes.NodeExecutionError: Exception raised while executing Node _midthickness1.

RuntimeError: subprocess exited with code 134.

Mahdieh-dst avatar Jul 18 '22 04:07 Mahdieh-dst

Exit code 134 is likely a memory error. Does re-running resolve the issue? And how much memory do you have available?
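For reference, the 134 comes from the shell's convention of reporting a process killed by a signal as 128 + the signal number; SIGABRT is 6, so 128 + 6 = 134. A minimal sketch in bash:

# A process terminated by SIGABRT (signal 6) is reported as 128 + 6 = 134
bash -c 'kill -ABRT $$'
echo "exit status: $?"   # prints 134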

effigies avatar Jul 18 '22 19:07 effigies

My laptop has 16GB of memory. I increased the memory allocated to Docker to 10GB and re-ran sMRIPrep. However, it failed again at the same node (the midthickness node)!

Mahdieh-dst avatar Jul 19 '22 13:07 Mahdieh-dst

Okay, let's debug, then. I assume you're running with smriprep-docker. Can you take your existing command and add --shell? From there you can copy-paste your mris_expand command and get the full output.
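For example (a sketch only; the paths, participant label, and other options below are placeholders, so substitute whatever your existing command already uses):

# Sketch: re-run the existing smriprep-docker command with --shell appended.
# All paths and the participant label are placeholders.
smriprep-docker /data/bids /data/derivatives participant \
    --participant-label DEFAULT \
    --fs-license-file /path/to/fs_license.txt \
    --shell
# Inside the container shell, copy-paste the failing mris_expand command
# from the crash file to capture its full output.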

effigies avatar Jul 19 '22 14:07 effigies

Thanks for your response! I get this:

reading pial surface from /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0/lh.pial
using distance as a % of thickness
using thickness file /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0/lh.thickness
expanding surface /work-local/pipeline-cache/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0/lh.smoothwm by 50.0% of thickness and writing it to midthickness
reading thickness...
*** buffer overflow detected ***: terminated
Aborted

Mahdieh-dst avatar Jul 20 '22 08:07 Mahdieh-dst

Ah, that's a different problem. I think this is a limit on path lengths in FreeSurfer, and you're hitting it because of how deep your working directory is. If you're unable to mount your working directory to /work or similar, I would just run this manually outside of sMRIPrep for now:

SURF_DIR=$SUBJECTS_DIR/sub-DEFAULT/surf
mris_expand -pial $SURF_DIR/lh.pial -thickness $SURF_DIR/lh.smoothwm 0.5 $SURF_DIR/lh.midthickness
mris_expand -pial $SURF_DIR/rh.pial -thickness $SURF_DIR/rh.smoothwm 0.5 $SURF_DIR/rh.midthickness
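(If running this outside the container, SUBJECTS_DIR needs to point at the FreeSurfer derivatives that sMRIPrep produced; a minimal setup sketch, with an illustrative path:)

# Illustrative path -- point SUBJECTS_DIR at your own FreeSurfer derivatives directory,
# then run the three commands above unchanged.
export SUBJECTS_DIR=/work/bids-app-output/freesurfer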

effigies avatar Jul 20 '22 13:07 effigies

OK, I modified the working directory path, so I no longer get the buffer overflow error! However, I am now getting a new error:

reading pial surface from /wl/pydra/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0/lh.pial
using distance as a % of thickness
using thickness file /wl/pydra/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0/lh.thickness
expanding surface /wl/pydra/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0/lh.smoothwm by 50.0% of thickness and writing it to midthickness
error: No such file or directory
error: MRISread(/wl/pydra/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow/_midthickness0/lh.smoothwm): could not open file
error: No such file or directory
error: mris_expand: MRISread(/wl/pydra/ShellCommandTask_ebe201396c7e952eaebfc597f6ca8d893314ecd4bebc83688d7e5fc7007bbf56/work/smriprep_wf/single_subject_DEFAULT_wf/anat_preproc_wf/surface_recon_wf/gifti_surface_wf/midthickness/mapflow//lh.smoothwm) failed

When I check inside the _midthickness0 directory, I can see that the lh.smoothwm file does exist and is 5.5M in size, so I am not sure why I get this "No such file or directory" error. I appreciate your help on this.

Mahdieh-dst avatar Jul 22 '22 03:07 Mahdieh-dst

Apologies, I missed this update.

Do you have FreeSurfer 7 installed outside this container? If so, it would be useful to verify whether you can run the mris_expand command in your outside environment. That will distinguish between a failure of FreeSurfer to handle the file (for some reason) and a failure on our part to include some critical component of FreeSurfer.
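A sketch of what that check might look like, assuming a standard local FreeSurfer 7 install (the install path is a placeholder):

# Placeholder install location -- adjust to wherever FreeSurfer 7 lives locally.
export FREESURFER_HOME=/usr/local/freesurfer
source "$FREESURFER_HOME/SetUpFreeSurfer.sh"
# Then paste the exact mris_expand command from the crash file and compare its output
# with what you see inside the container.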

effigies avatar Jul 27 '22 15:07 effigies

Thanks for your comments. I installed FreeSurfer 7 (7.2.0) outside the container. mris_expand ran successfully on the local machine! I am not sure why it fails inside the container!

Mahdieh-dst avatar Aug 08 '22 03:08 Mahdieh-dst

I just experienced the same issue. In my case it was because the input filenames were too long (over 200 characters). Copying the files to a directory with a shorter path solved my problem.
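In case it is useful to others, a quick sketch (POSIX shell; the working-directory path is a placeholder) for spotting over-long surface file paths before running:

# Print the longest .pial/.smoothwm paths under the working directory;
# anything much over ~200 characters appears to be at risk of this buffer overflow.
find /path/to/workdir \( -name '*.pial' -o -name '*.smoothwm' \) -print \
  | awk '{ print length($0), $0 }' \
  | sort -rn | head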

feilong avatar Jan 04 '23 17:01 feilong

There's only so much we can do about long filenames. Apologies.

effigies avatar Nov 20 '23 01:11 effigies