WS_DAN
PyTorch version code is available. Thanks, wvinzh!
I ran the TF code and got 89+% accuracy. I think my implementation is almost the same as your TF version, so are there any details that you didn't mention in the paper?
PyTorch version results:

Dataset | Accuracy (%) | Accuracy after refinement (%) |
---|---|---|
CUB-200-2011 | 87.401 | 87.487 |
Stanford Cars | 92.837 | 93.595 |
FGVC-Aircraft | 89.319 | 89.769 |
All details are shown in this code. You can check the improvement from each module against the table in the paper. In my experiments, the attention regularization (center loss) and the feature scale are quite important, and their behavior might differ between TensorFlow and PyTorch; see the sketch below.
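For reference, here is a minimal sketch of the attention pooling and feature-scale step being discussed, assuming `(B, C, H, W)` backbone features and `(B, M, H, W)` attention maps. The function name `bilinear_attention_pool` and the scale value 100 are illustrative assumptions, not the repo's exact code.

```python
import torch
import torch.nn.functional as F

def bilinear_attention_pool(features, attentions, scale=100.0):
    """Attention-weighted pooling followed by L2 normalization and a fixed scale.

    features:   (B, C, H, W) backbone feature maps
    attentions: (B, M, H, W) attention maps
    returns:    (B, M * C) pooled feature matrix
    """
    B, C, H, W = features.shape
    M = attentions.shape[1]
    # For each attention map, average every feature channel weighted by that attention.
    pooled = torch.einsum('bmhw,bchw->bmc', attentions, features) / (H * W)
    pooled = pooled.reshape(B, M * C)
    # Normalize, then multiply by a constant feature scale (100 is an assumed value).
    return F.normalize(pooled, dim=-1) * scale
```

The scale constant interacts with the center-loss weight, so when results differ between frameworks it is worth checking both together.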
The line embeddings = end_points_1['embeddings'] raises KeyError: 'embeddings' for me. Which layer's feature map is being taken here? Thanks!
@tau-yihouxiang Thanks a lot! I found I had ignored normalization when calculating the center loss, and now I get 89.2% accuracy.
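For anyone hitting the same issue, here is a minimal sketch of a center-loss-style attention regularizer that L2-normalizes the features before comparing them to the class centers. The class name `CenterLoss`, the momentum value, and the EMA center update are illustrative assumptions, not the repo's exact implementation.

```python
import torch
import torch.nn.functional as F
from torch import nn

class CenterLoss(nn.Module):
    """Center-loss-style regularizer that L2-normalizes features before the update.

    Hypothetical sketch: one center per class, updated by an exponential
    moving average instead of the optimizer.
    """

    def __init__(self, num_classes, feat_dim, momentum=0.05):
        super().__init__()
        self.momentum = momentum
        self.register_buffer('centers', torch.zeros(num_classes, feat_dim))

    def forward(self, features, labels):
        # Normalize first -- this is the step that was missing in the run above.
        features = F.normalize(features, dim=-1)
        centers_batch = self.centers[labels]                  # (B, feat_dim)
        loss = ((features - centers_batch) ** 2).sum(dim=-1).mean()
        # EMA update of the centers for the classes present in this batch.
        with torch.no_grad():
            delta = self.momentum * (features - centers_batch)
            self.centers.index_add_(0, labels, delta)
        return loss
```

Usage would be something like loss = ce_loss + weight * center_loss(pooled_features, labels), with the weight tuned per dataset.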
@wvinzh Congratulations!
PyTorch version code is available: WS_DAN_PyTorch
@wvinzh For Stanford Dogs, you can try Mixed_7c instead of Mixed_6e, as shown in train_sample_dog.sh.
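Here is a minimal sketch of switching the feature layer in a torchvision Inception v3 backbone with a forward hook. The helper `backbone_with_feature_layer` and the `weights` argument are assumptions for illustration (older torchvision uses `pretrained=True`), but the module names `Mixed_6e` and `Mixed_7c` match torchvision's Inception v3.

```python
import torch
import torchvision

def backbone_with_feature_layer(layer_name='Mixed_7c'):
    """Return an Inception v3 backbone and a dict that captures one layer's output.

    'Mixed_6e' is the usual choice; 'Mixed_7c' is the deeper layer suggested
    above for Stanford Dogs.
    """
    # The weights argument follows recent torchvision; adjust for your version.
    model = torchvision.models.inception_v3(weights='IMAGENET1K_V1')
    captured = {}

    def hook(module, inputs, output):
        captured['features'] = output  # feature map of the chosen layer

    getattr(model, layer_name).register_forward_hook(hook)
    return model, captured

# Example: grab Mixed_7c features for a 299x299 input.
model, captured = backbone_with_feature_layer('Mixed_7c')
model.eval()
with torch.no_grad():
    model(torch.randn(1, 3, 299, 299))
print(captured['features'].shape)  # Mixed_7c -> (1, 2048, 8, 8); Mixed_6e -> (1, 768, 17, 17)
```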