hnn-core
throw error for negative synaptic weights
Prior to #77, the only checks for positive synaptic weights lived in `_extract_drive_specs_from_hnn_params()`, and they have since been removed. We should add a function to drives.py that can be called by `Network.add_XXX_drive()` to ensure that users, param files, and optimization routines alike can't create a drive with a negative weight.
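A minimal sketch of what such a check could look like, assuming a hypothetical helper name (`_check_drive_weights`) and dict-valued weights keyed by cell type; the actual signature in drives.py may differ:

```python
def _check_drive_weights(weights_ampa, weights_nmda):
    """Raise ValueError if any synaptic weight is negative."""
    for receptor, weights in (('ampa', weights_ampa),
                              ('nmda', weights_nmda)):
        if weights is None:
            continue
        for cell_type, weight in weights.items():
            if weight < 0:
                raise ValueError(
                    f'Synaptic weight for {cell_type} ({receptor}) must '
                    f'be non-negative, got {weight}')
```

Each `Network.add_XXX_drive()` method could then call this once before constructing the drive, so that param files and optimization routines hit the same guard as interactive users.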
So, I spent some time looking at the optimization in #77 and trying to figure out whether I could use another algorithm. I tried Nelder-Mead, which is gradient-free, and L-BFGS-B, which can approximate the gradient; both respect the constraints. However, both seem to be slow on our example. Performance depends on the gradients and the condition number of the problem. I don't think I want to go down the path of trying all the solvers to figure out which is best ... at least for now.
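For reference, a toy illustration (not our actual objective) of how the two approaches express non-negativity in SciPy: L-BFGS-B takes box bounds that hold at every iterate, while COBYLA takes inequality constraints that are only guaranteed at convergence:

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for the dipole-fit error; the real objective would
# simulate a dipole for each candidate weight vector.
def objective(weights):
    return np.sum((weights - 0.5) ** 2)

x0 = np.array([1.0, 1.0])

# L-BFGS-B enforces non-negativity as box bounds on every iterate.
res_bounded = minimize(objective, x0, method='L-BFGS-B',
                       bounds=[(0, None)] * len(x0))

# COBYLA treats it as an inequality constraint (fun(w) >= 0), which
# holds at the solution but can be violated at intermediate iterates.
res_cobyla = minimize(objective, x0, method='COBYLA',
                      constraints=[{'type': 'ineq', 'fun': lambda w: w}])

print(res_bounded.x, res_cobyla.x)
```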
What I suggest is an added parameter to `simulate_dipole` called `truncate_negative_weights`: if it is True, the weights are truncated to 0. It is False by default and raises an error, but when we call `simulate_dipole` from within the optimization algorithm, it uses `truncate_negative_weights=True`. Note that COBYLA will respect the constraints in the end (although I do see values like -1e-14, but we could just truncate those in `optimize_evoked`); it's just that in the intermediate iterations it can have negative values.
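A rough sketch of the proposed behavior; `truncate_negative_weights` is the suggested (not yet implemented) parameter name, and the helper below is hypothetical:

```python
import numpy as np

def _handle_negative_weights(weights, truncate_negative_weights=False):
    """Clip negative weights to zero, or raise by default."""
    weights = np.asarray(weights, dtype=float)
    if np.any(weights < 0):
        if not truncate_negative_weights:
            # Default user-facing behavior: fail loudly.
            raise ValueError('Synaptic weights must be non-negative.')
        # Inside the optimizer: clip stray negatives (e.g. -1e-14) to 0.
        weights = np.clip(weights, 0., None)
    return weights
```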
My only concern with placing `truncate_negative_weights` in `simulate_dipole` is that it undoes a lot of the careful siloing we've been trying to create in the `add_XXX_drive()` API. My preference would be to keep all drive-related specifications and parameter handling/checks out of `simulate_dipole` whenever possible. See my latest suggestion in the optimization PR for a possible alternative solution.
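For illustration only (this is not necessarily the alternative referenced in the PR): the clipping could live entirely in the optimization routine, so `simulate_dipole` never sees a negative weight. The wrapper name below is hypothetical:

```python
import numpy as np

def _clipped_objective(opt_params, objective, *args):
    """Evaluate the fit error with stray negatives truncated to zero,
    keeping all weight handling inside the optimization routine."""
    opt_params = np.clip(opt_params, 0., None)  # e.g. -1e-14 -> 0.
    return objective(opt_params, *args)
```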
fair enough!
Noting that we need to check whether this is still relevant with the new optimization code.