loss function nan when using AbsSquared activation
Running this model:
```python
import neuroptica as neu

# N, N_classes, step_size, etc. are defined elsewhere
model_linear = neu.Sequential([
    neu.ClementsLayer(N),
    neu.Activation(neu.AbsSquared(N)),
    neu.DropMask(N, keep_ports=range(N_classes))
])

losses = neu.InSituAdam(model_linear, neu.CategoricalCrossEntropy, step_size=step_size).fit(
    x_train_flattened, y_train_onehot, epochs=n_epochs, batch_size=batch_size)
```
gives the warning:
```
../neuroptica/neuroptica/losses.py:45: RuntimeWarning: invalid value encountered in true_divide
  X_softmax = np.exp(X) / np.sum(np.exp(X), axis=0)
```
and the loss function is NaN.
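As an aside, one way this particular RuntimeWarning can arise is overflow in the unnormalized softmax: `AbsSquared` squares the output magnitudes, so large logits make `np.exp` overflow to inf, and inf/inf yields NaN. A minimal sketch of that failure and of the standard max-subtraction stabilization (the values are made up for illustration; this is not neuroptica's code):

```python
import numpy as np

# Hypothetical large |z|^2 logits; np.exp overflows above ~709.
X = np.array([[800.0], [900.0]])
unstable = np.exp(X) / np.sum(np.exp(X), axis=0)   # inf/inf -> nan, RuntimeWarning

# Standard stabilization: subtract the per-column max before exponentiating.
# Mathematically identical, but the exponents are now <= 0 and cannot overflow.
X_shift = X - np.max(X, axis=0, keepdims=True)
stable = np.exp(X_shift) / np.sum(np.exp(X_shift), axis=0)
print(stable)   # ~[[0.], [1.]]
```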
When changing the activation from `AbsSquared` to `Abs`, it works fine.
Noting from our discussion earlier: this appears to come from the polar form of the derivative whenever r = 0.
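An illustrative NumPy sketch of that failure mode (not neuroptica's actual backward-pass code): writing the derivative of f = r^2 through the polar chain rule as (df/dr)(dr/dz) = 2r * conj(z)/(2r) evaluates 0 * (0/0) = NaN at r = 0, even though the analytic derivative conj(z) is finite there:

```python
import numpy as np

z = np.array([0.0 + 0.0j, 1.0 + 1.0j])
r = np.abs(z)

# Polar chain rule for f = r^2: (df/dr) * (dr/dz) = 2r * conj(z) / (2r).
# At r = 0 this computes 0 * (0/0) = nan, although the limit is finite.
polar = 2 * r * np.conj(z) / (2 * r)   # -> [nan+nanj, 1.-1.j], RuntimeWarning

# Direct form of the same derivative: conj(z) has no singularity at the origin.
direct = np.conj(z)                    # -> [0.+0.j, 1.-1.j]

print(polar, direct)
```

Guarding that division (e.g. with `np.where(r == 0, ...)` or a small epsilon in the denominator), or using the non-polar form of the derivative, would keep the NaN from ever reaching the softmax in losses.py.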