Christos Kyrkou
If you look further down, the final object confidence loss is the following: loss_conf = tf.reduce_sum(tf.square(true_box_conf-pred_box_conf) * conf_mask) / (nb_conf_box + 1e-6) / 2. which in turn can be view...
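A minimal NumPy sketch of that same computation, outside of TensorFlow (the helper name `conf_loss` and the toy inputs are my own, just to illustrate the masking and normalisation):

```python
import numpy as np

def conf_loss(true_box_conf, pred_box_conf, conf_mask, nb_conf_box):
    # Sum of squared confidence errors, restricted to the boxes selected
    # by conf_mask, normalised by the (smoothed) box count and halved --
    # mirroring the tf.reduce_sum expression quoted above.
    sq_err = np.square(true_box_conf - pred_box_conf) * conf_mask
    return np.sum(sq_err) / (nb_conf_box + 1e-6) / 2.0

# Toy example: two boxes, the second masked out.
true_c = np.array([1.0, 0.0])
pred_c = np.array([0.5, 0.8])
mask   = np.array([1.0, 0.0])
loss = conf_loss(true_c, pred_c, mask, nb_conf_box=1)
# loss is approximately 0.125: (1.0 - 0.5)^2 / 1 / 2
```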
Perhaps you can try using upsampling and then apply normal conv2d/adder2d similar to what is discussed here https://distill.pub/2016/deconv-checkerboard/
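To make the "upsample then convolve" idea concrete, here is a toy single-channel NumPy sketch (both helpers are hypothetical, just to show the two-step pattern; in practice you would use your framework's upsampling and conv layers):

```python
import numpy as np

def upsample_nearest(x, factor=2):
    # Nearest-neighbour upsampling: repeat each pixel along both axes.
    return x.repeat(factor, axis=0).repeat(factor, axis=1)

def conv2d_same(x, k):
    # Naive 'same'-padded 2D convolution (single channel), for illustration only.
    kh, kw = k.shape
    xp = np.pad(x, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

x = np.arange(4.0).reshape(2, 2)
# Upsample 2x2 -> 4x4, then smooth with a uniform 3x3 kernel.
y = conv2d_same(upsample_nearest(x), np.ones((3, 3)) / 9.0)
```

Because every output pixel is covered by the kernel the same number of times, this avoids the uneven-overlap pattern that transposed convolutions can produce.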
I have the same issue. It progressively gets slower for some reason.
The way I interpret this is that all candidate boxes are over the threshold so the evaluation takes forever. This might happen because of a very low threshold or the...
The No obj accuracy is still very low. You need to change CONF_THRESHOLD for that. In the original config it is set to 0.05. I used CONF_THRESHOLD = 0.4. You...
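A quick sketch of why the threshold matters, assuming a simple score-based filter before NMS (the function `filter_boxes` and the toy detections are my own, not the repo's code):

```python
import numpy as np

CONF_THRESHOLD = 0.4  # raised from the repo default of 0.05

def filter_boxes(boxes, scores, conf_threshold=CONF_THRESHOLD):
    # Keep only detections whose objectness score clears the threshold.
    keep = scores >= conf_threshold
    return boxes[keep], scores[keep]

boxes  = np.array([[0, 0, 10, 10], [5, 5, 20, 20], [2, 2, 8, 8]])
scores = np.array([0.9, 0.3, 0.55])
kept_boxes, kept_scores = filter_boxes(boxes, scores)
# With a 0.05 threshold nearly every candidate box from an untrained
# network survives, which is what makes evaluation so slow early on.
```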
@guruprasaad123 Good to hear! Did you manage to reproduce the accuracies reported in the repo for pascal_voc?
Thanks. I tried running it for 100 epochs, reaching up to 46 mAP. I was wondering if running for more epochs would increase performance. I noticed that the parameters in the...
Did you eventually manage to get the reported 70%+ mAP?
> I solved this issue by removing evaluation at initial time. > it seemed that there are so many predicted box, so it takes long time at initial time. >...
Hi, yes, it seems that there is indeed an error there. The variable factor is used for scaling, so it should only be applied once at the end. I corrected...