Agustinus Kristiadi
@runame @aleximmer ready to review!
The current idea is to add an arg in `BaseLaplace`: `logit_class_idx: int = -1`. Then, whenever Laplace flattens logits, it will use that index to locate the class dimension. Test cases for conv last...
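To illustrate, here is a minimal sketch of how a `logit_class_idx` arg could guide logit flattening; the function name and exact behavior are my assumption, not necessarily what the PR implements:

```python
import torch

def flatten_logits(logits: torch.Tensor, logit_class_idx: int = -1) -> torch.Tensor:
    """Hypothetical helper: move the class dim to the end, then flatten to (N, C).

    This is only a sketch of the proposed behavior, not the actual
    Laplace implementation.
    """
    n_classes = logits.shape[logit_class_idx]
    # Move the class dimension to the last axis, then collapse all
    # remaining axes into a single "batch" axis.
    logits = torch.movedim(logits, logit_class_idx, -1)
    return logits.reshape(-1, n_classes)

# E.g. conv outputs with a channels-first class dim: (batch, C, H, W)
out = torch.randn(2, 10, 4, 4)
flat = flatten_logits(out, logit_class_idx=1)
print(flat.shape)  # torch.Size([32, 10])
```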
This seems to be a larger endeavor than expected: not only `*Laplace` classes but also all util functions and backends need to be checked. Marking it as a draft for...
Should be good now!
@aleximmer could you please double check and merge?
Thanks @aleximmer, that's a good point. I believe that can also be done even if `LLLaplace` is just a wrapper for the gradient-switch-off above. We still know how to get...
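For reference, the "gradient-switch-off" idea can be sketched as freezing every parameter except the last layer's, so a full-network Laplace effectively reduces to a last-layer one; the model below is illustrative, not from the library:

```python
import torch
from torch import nn

# Illustrative model; any nn.Module with an identifiable last layer works.
model = nn.Sequential(nn.Linear(20, 50), nn.ReLU(), nn.Linear(50, 10))

# Switch gradients off everywhere...
for p in model.parameters():
    p.requires_grad_(False)

# ...then back on for the last layer only.
last_layer = list(model.children())[-1]
for p in last_layer.parameters():
    p.requires_grad_(True)

n_trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(n_trainable)  # 50 * 10 weights + 10 biases = 510
```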
Supporting evidence for refactoring `LLLaplace`: #265
Also check this: https://github.com/aleximmer/Laplace/pull/144/files/824a38e0acf832ca6af4ca86c8b558964796d69d#r1508846116
I think this is because: https://github.com/pytorch/functorch/issues/1058
Users will be warned for now via #202. Ultimately, this shall be solved by #203.