DBTNet
About the loss function in 'Semantic Grouping Layer'.
I don't understand the code in 'dbt.py', lines 216 to 221. Where do the divisors 128 and 512 come from? Could you please explain?
I am also confused about these numbers. Could you explain them further? Thanks! @Jianlong-Fu
@Kurumi233 @wangyirui 128 is the batch size (I have since replaced this constant with self.batch_size); dividing by it normalizes the loss over the batch. 512 is the number of channels in the network's last stage. Because I originally tuned the loss weight for the last stage, I added a channels/512 factor to normalize the loss scale for the other stages.
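A minimal sketch of one plausible reading of this normalization (the function name and signature are hypothetical, not the actual code in dbt.py): the raw grouping loss is divided by the batch size, and scaled by channels/512 so that stages with fewer channels than the last (512-channel) stage are weighted proportionally.

```python
def normalize_grouping_loss(raw_loss, batch_size, channels, base_channels=512):
    """Hypothetical sketch of the normalization described above.

    - raw_loss: unnormalized semantic-grouping loss for one stage
    - batch_size: divides the loss to average over the batch (the 128 constant)
    - channels: channel count of the current stage
    - base_channels: channel count of the last stage (512), used as the
      reference scale since the loss weight was tuned on that stage
    """
    # Both 128 (batch_size) and 512 (base_channels) appear as divisors,
    # matching the constants asked about in the question.
    return raw_loss / batch_size * (channels / base_channels)
```

For example, a 256-channel intermediate stage would be weighted at half the scale of the 512-channel last stage.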