
NaNs in MLL output

Open — thegnarwhals opened this issue 6 years ago • 1 comment

Hello,

Following the example in the documentation, I am trying to use GPflowOpt to do a parameter search over a 9-dimensional parameter space. The objective function depends on both the accuracy and the latency of the algorithm I am tuning. It is therefore somewhat non-deterministic: if I evaluate the objective at the same point in parameter space twice, I get different results, because the latency depends on the temperature of my laptop, etc. The search seems to be working, but I am getting stdout output like:

iter #  0 - MLL [-7.3] - fmin [15.4]
iter #  1 - MLL [-6.5] - fmin [9.64]
iter #  2 - MLL [-13.5] - fmin [9.64]
iter #  3 - MLL [-7.44] - fmin [9.64]
iter #  4 - MLL [nan] - fmin [9.64]
iter #  5 - MLL [nan] - fmin [9.64]
iter #  6 - MLL [nan] - fmin [9.64]
iter #  7 - MLL [nan] - fmin [9.64]

The NaNs are worrying. Do you have any idea what is causing them? I have checked the output of each evaluation of the objective function and they all seem sensible — no NaNs there!
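For what it's worth, one common way a noisy objective produces NaN MLLs even when every function value is finite: if the optimizer re-evaluates (nearly) the same point and gets different outputs, the GP kernel matrix acquires (nearly) duplicate rows and its Cholesky factorization fails unless the likelihood has enough noise variance. A minimal numpy sketch (not GPflowOpt code — `gp_mll` and all parameter values here are illustrative assumptions) that reproduces the effect:

```python
import numpy as np

def gp_mll(X, y, lengthscale=1.0, signal_var=1.0, noise_var=0.0):
    """Log marginal likelihood of a zero-mean GP with an RBF kernel.

    Returns NaN when the kernel matrix is not positive definite,
    mimicking the NaN MLL seen in the optimizer's progress output.
    """
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = signal_var * np.exp(-0.5 * sq / lengthscale ** 2)
    K += noise_var * np.eye(len(X))
    try:
        L = np.linalg.cholesky(K)
    except np.linalg.LinAlgError:
        return np.nan  # duplicate rows make K singular -> Cholesky fails
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return float(-0.5 * y @ alpha
                 - np.log(np.diag(L)).sum()
                 - 0.5 * len(X) * np.log(2 * np.pi))

# The same input evaluated twice with different (noisy) outputs:
X = np.array([[0.0], [0.0], [1.0]])
y = np.array([15.4, 9.64, 12.0])

bad = gp_mll(X, y, noise_var=0.0)   # singular K: MLL is NaN
good = gp_mll(X, y, noise_var=0.1)  # noise term keeps K well conditioned
```

If this is what is happening, giving the model a likelihood with a non-trivial noise variance (or a lower bound on it) typically keeps the training numerically stable.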

thegnarwhals — Dec 12 '18 17:12

I am a new user of GPflowOpt and am getting the same output. My guess is that the GP model training is failing. Per the source code, the number listed in the brackets is the log likelihood. I notice that after the NaNs appear, the same point is always proposed in subsequent iterations.

adowling2 — Jul 24 '19 10:07