
lovasz_hinge as loss function Error

Open AmericaBG opened this issue 6 years ago • 1 comment

Hi!! Firstly, thank you very much for sharing your code. It's great work!

I'm trying to train my model with lovasz_hinge as loss function:

model.compile(optimizer =opt,loss= [lovasz_hinge], metrics = [matthews_correlation])

But I get the following error:

File "C:\Users\Usuario\Anaconda3\envs\env_gpu\lib\site-packages\keras\optimizers.py", line 91, in get_gradients raise ValueError('An operation has None for gradient. '

ValueError: An operation has None for gradient. Please make sure that all of your ops have a gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax, K.round, K.eval

Do you know what the problem is?

Thank you very much in advance!

AmericaBG avatar Oct 31 '19 11:10 AmericaBG

According to this comment, simply swapping the positions of the labels and logits arguments should help:

def lovasz_hinge(labels, logits, per_image=True, ignore=None):
    # Keras calls the loss as loss(y_true, y_pred), so labels must come first.
    # The body of the function does not need to change.
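
Alternatively, here is a minimal sketch (assuming the original TensorFlow implementation in lovasz_losses_tf.py, whose signature takes logits before labels) that keeps the library untouched and instead wraps it so Keras's (y_true, y_pred) argument order is mapped onto (logits, labels). The wrapper name keras_lovasz_hinge is just an illustrative choice:

from lovasz_losses_tf import lovasz_hinge  # assumed original order: (logits, labels, ...)

def keras_lovasz_hinge(y_true, y_pred):
    # Keras passes (y_true, y_pred); the wrapped function expects (logits, labels),
    # so the two arguments are swapped here before forwarding the call.
    return lovasz_hinge(y_pred, y_true, per_image=True, ignore=None)

model.compile(optimizer=opt, loss=keras_lovasz_hinge, metrics=[matthews_correlation])

Either approach ensures the loss is computed on the logits (which carry gradients) rather than on the integer labels, which is what triggers the "None for gradient" error.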

Frost-Lee avatar Dec 23 '19 19:12 Frost-Lee