
loss function nan when using AbsSquared activation

Open twhughes opened this issue 5 years ago • 1 comment

Running this model:

model_linear = neu.Sequential([
    neu.ClementsLayer(N),                          # N x N unitary (Clements mesh)
    neu.Activation(neu.AbsSquared(N)),             # f(z) = |z|^2, i.e. detected intensity
    neu.DropMask(N, keep_ports=range(N_classes))   # keep the first N_classes output ports
])

losses = neu.InSituAdam(model_linear, neu.CategoricalCrossEntropy, step_size=step_size).fit(
    x_train_flattened, y_train_onehot, epochs=n_epochs, batch_size=batch_size
)

gives the warning:

../neuroptica/neuroptica/losses.py:45: RuntimeWarning: invalid value encountered in true_divide
  X_softmax = np.exp(X) / np.sum(np.exp(X), axis=0)

The loss function is then NaN.
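For context, one way a naive softmax like the one in losses.py produces this exact warning is through non-finite intermediates: np.exp overflows to inf for large inputs (and AbsSquared outputs |z|**2, which can grow large), and inf / inf is NaN. A minimal sketch with made-up values, not neuroptica internals:

import numpy as np

# Hypothetical large activations: np.exp overflows to inf, and inf / inf
# raises "RuntimeWarning: invalid value encountered in true_divide".
X = np.array([[1000.0], [2.0], [3.0]])
X_softmax = np.exp(X) / np.sum(np.exp(X), axis=0)
print(X_softmax)  # [[nan], [0.], [0.]] -> CategoricalCrossEntropy is NaN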

When changing AbsSquared to Abs, it works fine.

twhughes · Dec 07 '18 02:12

Noting from our discussion earlier: this appears to come from the polar form of the derivative, which divides by r = |z| and is therefore undefined whenever r = 0.

ianwilliamson · Dec 17 '18 21:12
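A minimal sketch of that failure mode, assuming (per the comment above) that the backward pass evaluates the derivative of |z|**2 in polar form, z = r * e^(i*phi); the code below is illustrative, not neuroptica's actual implementation:

import numpy as np

# The polar-form chain rule carries a phase factor z / |z| = e^(i*phi),
# which is 0 / 0 = NaN at z = 0 even though the analytic derivative of
# |z|**2 is perfectly finite there. The division raises the same
# "invalid value encountered in true_divide" RuntimeWarning as above.
z = np.array([0.0 + 0.0j, 1.0 + 1.0j])  # one input with r = |z| = 0
phase = z / np.abs(z)                    # nan+nanj for the z = 0 entry
d_abs_squared = 2 * np.abs(z) * phase    # 0 * NaN is still NaN
print(d_abs_squared)                     # [nan+nanj  2.+2.j]

A single NaN gradient entry then propagates through the InSituAdam update into the weights, after which the loss is NaN. Writing the derivative in Cartesian/Wirtinger form avoids the singularity: d|z|**2/dz* = z, which is exactly 0 at z = 0, so no division by r is needed.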