
totalsegmentator changes global torch settings without restoring original state

Kenneth-Schroeder opened this issue 1 year ago • 2 comments

Hey all, I noticed that global torch settings are changed when totalsegmentator.python_api.totalsegmentator is called. They are not restored once the function finishes, which can lead to significant performance degradation in subsequent torch operations.

I observed the following settings being changed:

  "torch_settings.num_threads": { # torch.set_num_threads(...)
    "old": 8,
    "new": 1
  },
  "cudnn_settings.benchmark": { # torch.backends.cudnn.benchmark = ...
    "old": false,
    "new": true
  }

Please make sure that totalsegmentator has no such side effects: either restore the global state after the call, or isolate the work in its own process.
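As a caller-side stopgap, the affected globals could be snapshotted and restored around the call. The sketch below is a hypothetical helper (not part of TotalSegmentator) built on generic getter/setter pairs, so it works for any global setting; the commented lines show how the two torch settings from this issue would be wired up, assuming torch is installed.

```python
from contextlib import contextmanager

@contextmanager
def restore_globals(pairs):
    """Snapshot (getter, setter) pairs on entry and restore them on exit.

    `pairs` is a list of (getter, setter) callables. Restoration runs in a
    `finally` block, so it also happens if the wrapped call raises.
    """
    saved = [getter() for getter, _ in pairs]
    try:
        yield
    finally:
        for (_, setter), value in zip(pairs, saved):
            setter(value)

# With torch installed, the two settings reported above could be guarded like:
#
# import torch
# pairs = [
#     (torch.get_num_threads, torch.set_num_threads),
#     (lambda: torch.backends.cudnn.benchmark,
#      lambda v: setattr(torch.backends.cudnn, "benchmark", v)),
# ]
# with restore_globals(pairs):
#     totalsegmentator(input_path, output_path)
```

This keeps the workaround in-process, unlike the subprocess approach, but it only covers the settings you explicitly list.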

Kenneth-Schroeder avatar Nov 07 '24 08:11 Kenneth-Schroeder

This is happening somewhere in the nnunet package. I will investigate, but it might take some time. A quick workaround is to call TotalSegmentator from within Python as a shell command via subprocess.call. Not very elegant, but it works.
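The subprocess workaround could look like the sketch below: running the CLI in a child process means any torch global-state changes die with that process and never touch the caller's interpreter. The command-line flags and paths are illustrative, assuming the TotalSegmentator CLI is on PATH.

```python
import subprocess

def run_totalsegmentator(input_path, output_path):
    """Run the TotalSegmentator CLI in a child process.

    Global torch settings mutated by the tool (num_threads, cudnn.benchmark)
    stay confined to the child; the parent interpreter is unaffected.
    subprocess.call blocks until the child exits and returns its exit code.
    """
    return subprocess.call(["TotalSegmentator", "-i", input_path, "-o", output_path])

# Example (paths are placeholders):
# rc = run_totalsegmentator("ct.nii.gz", "segmentations/")
# if rc != 0:
#     raise RuntimeError(f"TotalSegmentator failed with exit code {rc}")
```

The trade-off is the overhead of a fresh process (and re-importing torch) per call, which matters if you segment many volumes in a loop.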

wasserth avatar Nov 25 '24 19:11 wasserth

I committed a fix here: https://github.com/wasserth/TotalSegmentator/commit/1de45113756dad7c8d2d489fbc98a14d2e4c9f9a

This is not in master yet. I will merge the branch when a few other features are ready.

wasserth avatar Nov 27 '24 09:11 wasserth