keras-yolo2

why do we pass ground truth labels as input to the model

Open evilc3 opened this issue 6 years ago • 6 comments

The comments in the code say that it's a hack or something? So is it included because of some issue with Keras, or does the actual YOLO network need it? If it needs it, then why?

evilc3 avatar Oct 17 '18 14:10 evilc3

What are you referring to? Is it these lines:

    input_image = Input(shape=(self.input_size, self.input_size, 3))
    self.true_boxes = Input(shape=(1, 1, 1, max_box_per_image, 4))

Those self.true_boxes are needed during training to compute the loss. There is also code in frontend.py, in the predict function, like this:

    dummy_array = np.zeros((1, 1, 1, 1, self.max_box_per_image, 4))

That is sort of a hack. It's because the network requires all of its inputs even when they aren't actually used during inference (prediction). I'm not too familiar with it, but I believe the way to avoid this would be to rebuild the network for inference, adjusted so that it no longer has this input or the training loss. That is possible, but a lot of work.
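For concreteness, this is roughly what the hack looks like at prediction time (a minimal sketch only; the variable names follow the snippets above, and the exact call in frontend.py may differ slightly):

    import numpy as np

    # Illustrative values only
    max_box_per_image = 10
    input_size = 416

    # The model was built with two inputs, (input_image, true_boxes), so
    # predict() has to be given both -- the true_boxes slot is simply
    # filled with zeros because it has no effect at inference time.
    dummy_array = np.zeros((1, 1, 1, 1, max_box_per_image, 4))

    # image: a preprocessed batch of shape (1, input_size, input_size, 3)
    # model: the trained two-input Keras model
    # netout = model.predict([image, dummy_array])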

robertlugg avatar Dec 14 '18 21:12 robertlugg

Actually, it is quite easy to generate the inference model; take a look here.
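Roughly, the idea is the following (a sketch only, not the exact code from that branch; the layer name is an assumption, check model.summary() for the real one): take the detection tensor from the layer just before the pass-through Lambda that consumes true_boxes, and wrap it in a new single-input Model that shares the trained weights.

    from keras.models import Model

    def make_inference_model(train_model, detection_layer_name):
        # train_model is the trained two-input model:
        #   Model([input_image, true_boxes], output)
        # detection_layer_name is whatever layer produces the raw YOLO
        # output (the layer right before the Lambda that only exists to
        # pull true_boxes into the graph).
        image_input = train_model.inputs[0]
        detections = train_model.get_layer(detection_layer_name).output
        return Model(inputs=image_input, outputs=detections)

    # Usage (the layer name here is a guess):
    # infer_model = make_inference_model(yolo.model, 'reshape_1')
    # netout = infer_model.predict(preprocessed_image)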

rodrigo2019 avatar Dec 16 '18 16:12 rodrigo2019

Ah, thanks rodrigo. Perhaps we should all move to your repository!

robertlugg avatar Jan 02 '19 22:01 robertlugg

I think that merging it into this repository would be a better idea.

rodrigo2019 avatar Jan 03 '19 01:01 rodrigo2019

@rodrigo2019 I don't understand the need for self.true_boxes. If we look at the BatchGenerator's __getitem__ method, y_batch is filled with the ground truth annotations, whereas b_batch is overwritten with the last 10 annotations found in the image. It doesn't make sense to me; I believe we should consider all the ground truth boxes in the loss function, which is calculated in the first half of the custom loss function. But for some reason, the author replaces those variables with values calculated from self.true_boxes. Does anyone know the reason behind it?
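For reference, the shapes involved look roughly like this (a sketch only; the batch size, grid size, anchor and class counts are just illustrative, and the b_batch shape follows the Input definition quoted earlier):

    import numpy as np

    # Illustrative numbers only
    batch_size, grid_h, grid_w, nb_anchors, nb_classes = 16, 13, 13, 5, 20
    max_box_per_image = 10

    # y_batch: the desired network output -- one slot per grid cell and
    # anchor, holding x, y, w, h, objectness and the one-hot class.
    y_batch = np.zeros((batch_size, grid_h, grid_w, nb_anchors, 4 + 1 + nb_classes))

    # b_batch: the raw ground-truth boxes fed back in through
    # self.true_boxes; only max_box_per_image slots exist per image, so
    # extra boxes wrap around and overwrite earlier ones.
    b_batch = np.zeros((batch_size, 1, 1, 1, max_box_per_image, 4))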

karthee320 avatar Feb 18 '19 06:02 karthee320

Maybe for him it was a better way to process the data. I have a branch in which I refactored the loss function; most of the time I get a better validation mAP with the refactored formula, but sometimes the old formula written by @experiencor gets better results, so I haven't merged it into master because it needs more testing.

Btw, the refactored formula wasn't written by me; someone showed it to me and I adapted the code to this repo.

rodrigo2019 avatar Feb 20 '19 19:02 rodrigo2019