SimpleElastix
Gradient based optimizer: how to control learning rate
This is more of a question than an issue, but since I cannot find a proper place to post this, any help would be appreciated :) In vanilla SimpleITK, we have control over optimization hyperparameters like the learning rate. This can be done with something like
import SimpleITK as sitk

registration_method = sitk.ImageRegistrationMethod()
registration_method.SetOptimizerAsGradientDescent(
    learningRate=0.1,
    numberOfIterations=500,
    convergenceMinimumValue=1e-6,
    convergenceWindowSize=20,
)
Question: is there a similar way to control the learning rate in SimpleElastix? I know that we can control certain optimization parameters by setting the parameter map, something like
parameterMapVector = sitk.GetDefaultParameterMap('rigid')
parameterMapVector['MaximumNumberOfIterations'] = ['5000']
parameterMapVector['NumberOfResolutions'] = ['4']
But I was not able to find any information regarding the learning rate.
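For context, elastix exposes its step size through parameter-map entries rather than a single learning-rate argument: its StandardGradientDescent optimizer uses a decaying gain a_k = SP_a / (SP_A + k + 1)^SP_alpha, and the default AdaptiveStochasticGradientDescent optimizer also accepts a MaximumStepLength cap (per the elastix manual; verify against your version). A small sketch of how that gain schedule behaves, with purely illustrative values:

```python
# Illustrative sketch of elastix's decaying gain (learning-rate) schedule.
# StandardGradientDescent uses a_k = SP_a / (SP_A + k + 1)^SP_alpha;
# the values below are made up for illustration, not recommendations.
SP_a, SP_A, SP_alpha = 1000.0, 50.0, 0.602

def gain(k):
    """Step size (gain) used at iteration k."""
    return SP_a / (SP_A + k + 1) ** SP_alpha

# Larger SP_a means larger steps throughout; larger SP_alpha means
# faster decay. The gain shrinks monotonically over the run:
print(gain(0))    # gain at the first iteration
print(gain(499))  # much smaller gain late in a 500-iteration run
```

In a parameter map these would be set as string lists, e.g. `parameterMap['SP_a'] = ['1000.0']`.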
Adding some more information: the reason I would like direct control over the learning rate is that the problem I've been working on requires an extremely accurate estimate of the transformation parameters. The default behavior of the multi-resolution optimizer works okay, but not to the level of accuracy I'm hoping for (which is fine, since I understand the defaults are designed to work on most image registration applications). So this question could alternatively be: is there a way to use the previously evaluated parameters as the initial condition and then perform a more accurate estimation of the transformation parameters (e.g. by setting the learning rate or maximum step size)?
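One possible shape for this is a two-stage run: a coarse registration whose transform parameter file seeds a second, tighter run (elastix's "-t0" initial-transform mechanism, exposed in SimpleElastix as ElastixImageFilter.SetInitialTransformParameterFileName as I understand it; verify against your installed version). A hedged sketch of the second-stage overrides, with illustrative values:

```python
# Hypothetical second-stage overrides for a refinement run seeded by a
# previous result. The keys are standard elastix parameter names; the
# values are illustrative assumptions, not tuned recommendations.
fine_stage_overrides = {
    # Allow many more iterations for the refinement stage.
    "MaximumNumberOfIterations": ["5000"],
    # Single resolution: the coarse stage already handled large motion.
    "NumberOfResolutions": ["1"],
    # Cap the per-iteration step (physical units) to small moves.
    "MaximumStepLength": ["0.1"],
}

# With SimpleElastix this could be wired up roughly as follows
# (commented out because it needs images and a SimpleElastix build):
#
#   pm = sitk.GetDefaultParameterMap("rigid")
#   for key, value in fine_stage_overrides.items():
#       pm[key] = value
#   elastix = sitk.ElastixImageFilter()
#   elastix.SetFixedImage(fixed)
#   elastix.SetMovingImage(moving)
#   elastix.SetParameterMap(pm)
#   # Seed from the coarse stage's result (elastix's "-t0" mechanism):
#   elastix.SetInitialTransformParameterFileName("coarse_transform.txt")
#   result = elastix.Execute()
print(sorted(fine_stage_overrides))
```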
I have solved this problem. See this example: https://github.com/SuperElastix/SimpleElastix/blob/master/Examples/ImageRegistrationMethodBSpline3/ImageRegistrationMethodBSpline3.py