
SIRF torch should be part of SIRF

danieldeidda opened this issue 10 months ago · 5 comments

Currently, SIRF torch lives in SIRF-Exercises; however, it would make more sense for it to be part of SIRF itself.

danieldeidda avatar Feb 03 '25 16:02 danieldeidda

I suppose you are talking about https://github.com/SyneRBI/SIRF-Exercises/blob/master/notebooks/Deep_Learning_PET/sirf_torch.py

Integrating SIRF with DL libraries is certainly in the plan for this funding round of CCP SyneRBI.

paskino avatar Feb 11 '25 08:02 paskino

Hi both, I am looking at this at the moment. Would it make more sense to put it in SIRF or SIRF-contribs? (Also @KrisThielemans)

Imraj-Singh avatar Feb 14 '25 13:02 Imraj-Singh

@KrisThielemans and I had a chat about this a few weeks ago, and we thought it would make sense to make it part of SIRF. @paskino yes, that's the one.

danieldeidda avatar Feb 14 '25 14:02 danieldeidda

I think SIRF indeed, as it will become more and more essential. However, it needs a bit of a plan for how to do this, and what to put where.

Also, ideally we do this such that it works for any operator in SIRF (e.g. the MR acquisition model, Reg.resampler), so we should avoid variable names like "sinogram" or even "image", at least in the main framework. With any luck, we can set it up for a CIL operator as well (which then starts to look like it should be part of CIL, which might make sense, but has some drawbacks). Indeed, a lot of this is going to be the same for CIL/SIRF/STIR, so it's not obvious to me where to put it.
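To illustrate the operator-agnostic design being proposed: any linear operator exposing a forward and an adjoint can be wrapped for autodiff, since for a linear map the vector-Jacobian product (the backward pass of a `torch.autograd.Function`) is just the adjoint applied to the upstream gradient. The sketch below is hypothetical (the `MatrixOperator` class and `vjp` helper are illustrative stand-ins, not SIRF API) and uses NumPy so it stays self-contained; the adjoint (dot-product) test at the end is the standard sanity check such a wrapper should pass.

```python
import numpy as np

# Hypothetical sketch of the operator-agnostic pattern under discussion:
# any linear operator exposing forward() and adjoint() can be wrapped,
# with no modality-specific names like "sinogram" in the framework itself.
# In a torch.autograd.Function, forward() would map x -> A x and
# backward() would map an upstream gradient g -> A^T g.

class MatrixOperator:
    """Illustrative stand-in for a SIRF/CIL operator (acquisition model,
    resampler, ...): anything with a forward and an adjoint."""
    def __init__(self, A):
        self.A = np.asarray(A)

    def forward(self, x):
        return self.A @ x

    def adjoint(self, y):
        return self.A.T @ y

def vjp(op, grad_out):
    """Backward pass of the wrapper: for a linear operator, the
    vector-Jacobian product is simply the adjoint."""
    return op.adjoint(grad_out)

# Adjoint (dot-product) test: <A x, y> must equal <x, A^T y>.
rng = np.random.default_rng(0)
op = MatrixOperator(rng.normal(size=(5, 3)))
x, y = rng.normal(size=3), rng.normal(size=5)
assert np.isclose(np.dot(op.forward(x), y), np.dot(x, vjp(op, y)))
```

Because the wrapper only touches `forward`/`adjoint`, the same code would serve a PET acquisition model, an MR acquisition model, a resampler, or a CIL operator without renaming anything.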

(Note, there's also the Hessian (i.e. gradient of the gradient operator), as implemented by @gschramm for his exercises after our discussions.)

@Imraj-Singh , feel like making some proposal for discussion?

KrisThielemans avatar Feb 14 '25 14:02 KrisThielemans

So I had a think, and we could add it to the common functionality, i.e. a SIRF_torch.py file under "SIRF/src/common/". See #1305.

This would be easier (and, I think, make more sense) than integrating it deeper into the CMake build as an option that, when toggled, requires torch as a dependency.
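One common way to keep torch a soft runtime dependency rather than a build-time CMake option is a guarded import: the module imports cleanly without torch, and only raises when the torch bridge is actually used. A minimal sketch (the flag and helper names here are hypothetical, not an existing SIRF API):

```python
# Hypothetical sketch: keep torch a soft runtime dependency so that a
# plain SIRF install is unaffected and no CMake toggle is needed.
try:
    import torch
    HAVE_TORCH = True
except ImportError:
    HAVE_TORCH = False

def _require_torch():
    """Raise a clear error only when the torch bridge is actually used."""
    if not HAVE_TORCH:
        raise ImportError(
            "PyTorch is required for the SIRF torch bridge; "
            "install it, e.g. with 'pip install torch'."
        )
```

With this pattern the file can live in the common Python sources unconditionally, and users without torch simply never trigger the error.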

Yes, we will add the Hessian so we can backpropagate through the gradient (I think that's correct... I will double-check).
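For intuition on why the Hessian is what's needed: if a network layer outputs the gradient of an objective, its backward pass is the Jacobian of that gradient, which is the Hessian (applied to a vector). A small self-contained NumPy check, using the least-squares objective f(x) = ½‖Ax − b‖² as a stand-in (the function names are illustrative, not SIRF API): its gradient is Aᵀ(Ax − b) and its Hessian is the constant AᵀA.

```python
import numpy as np

# Hypothetical sketch: backpropagating through a gradient step needs the
# Hessian. For f(x) = 0.5 * ||A x - b||^2 the gradient is A^T (A x - b),
# whose Jacobian is the (constant) Hessian H = A^T A; so the backward
# pass of a "gradient layer" is the Hessian-vector product v -> H v.

def grad_f(A, b, x):
    return A.T @ (A @ x - b)

def hvp(A, v):
    """Hessian-vector product for the least-squares objective."""
    return A.T @ (A @ v)

# Finite-difference check that hvp() really is the derivative of grad_f.
rng = np.random.default_rng(1)
A = rng.normal(size=(4, 3))
b = rng.normal(size=4)
x = rng.normal(size=3)
v = rng.normal(size=3)
eps = 1e-6
fd = (grad_f(A, b, x + eps * v) - grad_f(A, b, x - eps * v)) / (2 * eps)
assert np.allclose(fd, hvp(A, v), atol=1e-5)
```

For a Poisson log-likelihood the Hessian is no longer constant, but the same structure applies: backprop through the gradient reduces to a Hessian-vector product at the current estimate.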

Perhaps we can move the discussion to #1305?

Imraj-Singh avatar Feb 15 '25 13:02 Imraj-Singh