Hyperparameter optimization
KRG(use_het_noise=False, eval_noise=True, hyper_opt="TNC") crashes.
The gradient-based optimizer is poorly suited to noise evaluation (homoscedastic or heteroscedastic).
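For reference, a minimal sketch of the failing configuration on toy data (the data and seed below are illustrative, not the original reproduction case):

```python
import numpy as np
from smt.surrogate_models import KRG

# Illustrative noisy 1D data (not the original test case)
rng = np.random.default_rng(0)
xt = np.linspace(0.0, 4.0, 30).reshape(-1, 1)
yt = np.sin(xt) + 0.1 * rng.standard_normal(xt.shape)

# Reported failing combination: noise estimation with the gradient-based TNC optimizer
sm = KRG(use_het_noise=False, eval_noise=True, hyper_opt="TNC")
sm.set_training_values(xt, yt)
sm.train()  # reported to crash with recent SMT versions
```

Replacing hyper_opt="TNC" with hyper_opt="Cobyla" in the same snippet is presumably the safe configuration, which is what the question below is about.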
@Paul-Saves, @anfelopera, is it a general recommendation to prefer the Cobyla optimizer when evaluating noise, or should we warn about, or even prevent, the combination eval_noise=True and hyper_opt="TNC"?
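To make the "warn or even prevent" option concrete, here is a hypothetical guard; check_noise_optimizer_options is not an SMT function, just a sketch of what such a check could look like:

```python
import warnings

def check_noise_optimizer_options(eval_noise: bool, hyper_opt: str, strict: bool = False) -> None:
    """Warn about (or reject) noise estimation combined with the gradient-based TNC optimizer."""
    if eval_noise and hyper_opt == "TNC":
        msg = (
            "eval_noise=True with hyper_opt='TNC' is not supported: the gradient "
            "w.r.t. the noise variance is not implemented yet. Prefer hyper_opt='Cobyla'."
        )
        if strict:
            raise ValueError(msg)
        warnings.warn(msg)

# Warn only (strict=False); pass strict=True to forbid the combination outright.
check_noise_optimizer_options(eval_noise=True, hyper_opt="TNC")
```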
I think that there is a problem in the code and it should not happen. I am not using TNC and we have not tested it with noise @anfelopera. However, the error was found by repriem, who uses TNC and can no longer use it since Andres refactored the noise evaluation.
With the latest version of SMT, it just crashes and gives no error message :(
Is there an error message or just a bad prediction? If the latter, it is expected, because we have not computed the gradient w.r.t. the noise variance yet... I can code this gradient for the homoscedastic case, but it has to wait a couple of weeks (a lot of duties right now). Is that ok for you? For the heteroscedastic case, we do not estimate the noise variance via maximum likelihood, so it is not a problem if eval_noise=False and hyper_opt="TNC" (in theory). In the meantime, we can prevent TNC from being used with eval_noise=True.
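For reference, the missing derivative in the homoscedastic case is the standard GP identity: with K = K_f + sigma_n^2 I and dK/dsigma_n^2 = I, we get d logL/d sigma_n^2 = 0.5 * alpha^T alpha - 0.5 * tr(K^{-1}), where alpha = K^{-1} y. A minimal numpy sketch for a zero-mean GP, ignoring SMT's trend terms and reduced-likelihood parameterization (loglik_grad_noise_variance is just an illustrative helper):

```python
import numpy as np

def loglik_grad_noise_variance(K_f: np.ndarray, y: np.ndarray, noise_var: float) -> float:
    """d logL / d sigma_n^2 for a zero-mean GP with K = K_f + sigma_n^2 * I.

    Since dK/dsigma_n^2 = I, the standard identity reduces to
    0.5 * y^T K^-2 y - 0.5 * trace(K^-1).
    """
    n = y.shape[0]
    K = K_f + noise_var * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))          # alpha = K^{-1} y
    K_inv = np.linalg.solve(L.T, np.linalg.solve(L, np.eye(n)))  # needed for the trace term
    return 0.5 * float(alpha @ alpha) - 0.5 * float(np.trace(K_inv))
```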
@anfelopera No rush on my side. It's enough to know that we will address the issue. I linked this to SMT 1.0, but 1.0.1 will be fine as well. :wink:
I tracked it in the code. Right now, most PLS-based models, noisy models, and SGP are struggling with derivative-based hyperparameter optimization. @anfelopera @hvalayer @NatOnera
I am closing this PR; it is 3 years old. This is not an issue, but the gradient w.r.t. the noise could be implemented as a perspective, as mentioned by Andrés.
Closing