Implementing Cauchy-Schwarz divergence and negative log-likelihood as custom loss functions in Keras
I am training a VGG-16 model on a multi-class classification task with TensorFlow 2.4 and Keras 2.4.0. The y_true labels are one-hot encoded. I train the model with two custom loss functions, used individually. First, I created a custom Cauchy-Schwarz divergence loss function, as shown below:
from math import sqrt
from math import log
from scipy.stats import gaussian_kde
from scipy import special

def cs_divergence(p1, p2):
    """p1 (numpy array): first pdf, p2 (numpy array): second pdf.
    Returns float: CS divergence."""
    r = range(0, p1.shape[0])
    p1_kernel = gaussian_kde(p1)
    p2_kernel = gaussian_kde(p2)
    p1_computed = p1_kernel(r)
    p2_computed = p2_kernel(r)
    numerator = sum(p1_computed * p2_computed)
    denominator = sqrt(sum(p1_computed ** 2) * sum(p2_computed ** 2))
    return -log(numerator / denominator)
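For reference, a Keras loss has to be built from TensorFlow ops, and scipy's gaussian_kde only accepts concrete NumPy arrays, not the symbolic tensors Keras passes in. Below is a minimal sketch of a tensor-friendly variant that computes the same -log(sum(p1*p2) / sqrt(sum(p1^2) * sum(p2^2))) expression directly on the predicted class-probability vectors, skipping the KDE step; the name cs_divergence_tf is hypothetical and the epsilon guard is an assumption to avoid log(0):

import tensorflow as tf

def cs_divergence_tf(y_true, y_pred):
    # Cauchy-Schwarz divergence computed per sample with TF ops,
    # so it can run on the symbolic tensors seen inside model.fit.
    y_true = tf.cast(y_true, y_pred.dtype)
    numerator = tf.reduce_sum(y_true * y_pred, axis=-1)
    denominator = tf.sqrt(tf.reduce_sum(tf.square(y_true), axis=-1) *
                          tf.reduce_sum(tf.square(y_pred), axis=-1))
    eps = tf.keras.backend.epsilon()
    return -tf.math.log(numerator / (denominator + eps) + eps)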
Then, I used a negative log likelihood custom loss function as shown below:
def nll(y_true, y_pred):
    loss = -special.xlogy(y_true, y_pred) - special.xlogy(1 - y_true, 1 - y_pred)
    return loss
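Note that scipy.special.xlogy cannot operate on symbolic tensors, which is what triggers the NotImplementedError reported below; TensorFlow provides an equivalent op, tf.math.xlogy. A minimal sketch of a tensor-based version (the mean over the class axis is an assumption about how the per-class terms should be reduced):

import tensorflow as tf

def nll_tf(y_true, y_pred):
    # Same elementwise negative log-likelihood, expressed with TF ops
    # so it accepts the symbolic tensors Keras passes to loss functions.
    y_true = tf.cast(y_true, y_pred.dtype)
    loss = -tf.math.xlogy(y_true, y_pred) - tf.math.xlogy(1.0 - y_true, 1.0 - y_pred)
    return tf.reduce_mean(loss, axis=-1)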
And compiled the model as shown below when training it individually with each of these losses:
sgd = SGD(lr=0.0001, decay=1e-6, momentum=0.9, nesterov=True)
model_vgg16.compile(optimizer=sgd,
                    loss=[cs_divergence],
                    metrics=['accuracy'])
and
sgd = SGD(lr=0.0001, decay=1e-6, momentum=0.9, nesterov=True)
model_vgg16.compile(optimizer=sgd,
                    loss=[nll],
                    metrics=['accuracy'])
I got the following errors when training the model with these loss functions. With cs_divergence, I got:
TypeError: 'NoneType' object cannot be interpreted as an integer
With nll custom loss, I got the following error:
NotImplementedError: Cannot convert a symbolic Tensor (IteratorGetNext:1) to a numpy array. This error may indicate that you're trying to pass a Tensor to a NumPy call, which is not supported
I downgraded NumPy to 1.19.5 as discussed in https://stackoverflow.com/questions/58479556/notimplementederror-cannot-convert-a-symbolic-tensor-2nd-target0-to-a-numpy but it didn't help.
Could you share a colab notebook with us with demo inputs? That would help us use the functions and also debug. TIA!
@ariG23498 PFA the colab notebook per your suggestions. Looking forward to your response. Many thanks!
Hey @sivaramakrishnan-rajaraman, thanks for the colab; unfortunately I could not load the data and was not able to run the code. I saw that you are using scipy and numpy code in the loss functions. Could tf.function be a solution to the issue?
@ariG23498 Not sure how to use it, however. It would be great if you can help with the complete code. For a sample dataset, you can download from https://www.tensorflow.org/datasets/catalog/malaria and split it across train and test sets.
@sivaramakrishnan-rajaraman Could you try again after changing loss=[cs_divergence] to loss=cs_divergence? Also, please go through the documentation on creating custom losses. Thanks!
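A sketch of the suggested change, keeping the rest of the original compile call unchanged (passing the loss callable directly rather than wrapping it in a one-element list):

# Per the suggestion above: pass the loss callable itself, not a list.
sgd = SGD(lr=0.0001, decay=1e-6, momentum=0.9, nesterov=True)
model_vgg16.compile(optimizer=sgd,
                    loss=cs_divergence,
                    metrics=['accuracy'])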
This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.
Closing as stale. Please reopen if you'd like to work on this further.