skorch
training: fixed EarlyStopping._calc_new_threshold to account for negative scores
This fix is meant to account for training models whose scores can be negative, as we see with cost functions based on the log likelihood. The fix ensures that `abs_threshold_change` is always positive, guaranteeing the intended monotonicity of the threshold update.
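A standalone sketch of the corrected logic (a simplified approximation of skorch's `EarlyStopping._calc_new_threshold`; the real method is bound to the callback and reads its attributes, so the signature here is illustrative only):

```python
def calc_new_threshold(score, threshold, threshold_mode, lower_is_better):
    """Compute the new dynamic threshold from the latest score.

    Simplified stand-in for EarlyStopping._calc_new_threshold.
    """
    if threshold_mode == 'rel':
        # abs() is the fix: without it, a negative score flips the sign
        # of the change and moves the threshold in the wrong direction.
        abs_threshold_change = abs(threshold * score)
    else:
        abs_threshold_change = threshold
    if lower_is_better:
        return score - abs_threshold_change
    return score + abs_threshold_change
```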
Thanks for creating this PR. Just to ensure that I got things right: Let's assume we have a negative score, say, -10, and a threshold of 0.1. Then, with the current code, we would have:
`abs_threshold_change = self.threshold * score = -1`

Then, for `lower_is_better = True`, we would have `new_threshold = score - abs_threshold_change = score - (-1) = score + 1`. However, we want `score - 1` (and vice versa for `lower_is_better = False`). Is my understanding correct?
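A quick numeric check of the scenario above (score = -10, threshold = 0.1, `lower_is_better = True`); variable names are just for illustration:

```python
score, threshold = -10, 0.1

# Buggy computation: the sign of the change follows the sign of the score.
buggy_change = threshold * score            # -1.0
buggy_threshold = score - buggy_change      # -9.0, i.e. score + 1 (wrong)

# Intended computation: the threshold should tighten downward.
fixed_change = abs(threshold * score)       # 1.0
fixed_threshold = score - fixed_change      # -11.0, i.e. score - 1
```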
Ideally, we would have a unit test for this. It would need to be a bit tricky. Maybe we can create a custom scoring function that just returns scores from a predefined list of negative values, and set the monitored value to that scoring function. WDYT?
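A minimal stand-in sketch of what such a test could exercise. A real skorch test would feed the predefined scores through a scoring callback monitored by `EarlyStopping`; here the bookkeeping loop and the name `epochs_until_stop` are invented for illustration, not skorch API:

```python
import math

# Predefined negative scores: one real improvement, then stagnation.
scores = [-10.0, -10.5, -10.6, -10.7, -10.8]

def epochs_until_stop(scores, threshold=0.1, patience=2, use_abs=True):
    """Simulate early-stopping bookkeeping for lower_is_better=True.

    Returns the 1-based epoch at which training would stop, or None
    if the patience budget is never exhausted.
    """
    dynamic_threshold = math.inf
    misses = 0
    for epoch, score in enumerate(scores, start=1):
        if score < dynamic_threshold:
            misses = 0
            change = threshold * score
            if use_abs:
                change = abs(change)  # the proposed fix
            dynamic_threshold = score - change
        else:
            misses += 1
            if misses >= patience:
                return epoch
    return None

# With the fix, stagnation is detected; with the buggy sign, the
# threshold drifts upward and early stopping never triggers.
```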
Yes, you described the issue exactly as I see it.
Regarding the test: I am not as familiar with the code, but I think I follow what you mean, and I think it should work.
> Yes, you described the issue exactly as I see it.
Great, thanks for confirming.
> Regarding the test: I am not as familiar with the code, but I think I follow what you mean and I think it should work.
Would you be interested in giving it a try? The tests reside here: https://github.com/skorch-dev/skorch/blob/9252477fd0de02d80e08b54203ca19fbe318c94e/skorch/tests/callbacks/test_training.py#L505
If not, it's fine too, just let me know.
Sure, I will give it a go.
@MaxBalmus gentle ping :)
Hi @BenjaminBossan - apologies for disappearing. Unfortunately, I couldn't find time to write the test, and with my current workload I find it a bit difficult to do it.