cvcZSL
The performance is lower than the results in your paper
Hi,
I tried to reproduce the results on AwA1 without any modification to your code or data. However, I got best_ep: 829, zsl: 0.6889, gzsl: seen=0.7759, unseen=0.5984, h=0.6757 with AwA1 inductive training. Do you have any suggestions for reproducing the results in your paper?
We tested several times on our machine that the results could be reproduced before uploading, so there should be no problem with the code. One possible cause of the 2% gap might be a difference in environment.
How about the random seed? Did you use a fixed seed or random initialization, e.g. the np.random functions in data_loader?
That could be a reason, but I remember the algorithm being quite stable.
Thanks for your reply. Maybe I used different versions of PyTorch or CUDA.
(1) Would you share the random seed used for training? (2) What environment did you use? I tried PyTorch 0.4.1 with Python 3.5, and nothing changed. The data_loader may be the key to solving this problem, but I have no idea how to fix it.
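In case it helps others debugging the same gap: a minimal sketch of fixing all the common RNG sources before training, so that unseeded calls like the np.random ones in data_loader become deterministic. The function name `seed_everything` and the seed value are my own choices, not part of the original repo; the torch lines are guarded so the snippet runs even where PyTorch is not installed.

```python
import random
import numpy as np

def seed_everything(seed=42):
    """Fix Python, NumPy, and (if available) PyTorch RNGs for reproducibility."""
    random.seed(seed)
    np.random.seed(seed)
    try:
        import torch
        torch.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
        # cuDNN autotunes kernels per run, which breaks bit-exact repeats;
        # disabling it trades some speed for determinism.
        torch.backends.cudnn.deterministic = True
        torch.backends.cudnn.benchmark = False
    except ImportError:
        pass  # torch not installed; Python/NumPy seeding still applies

# Two runs with the same seed draw identical samples, e.g. the
# shuffling permutation a data loader would use:
seed_everything(0)
a = np.random.permutation(10)
seed_everything(0)
b = np.random.permutation(10)
print((a == b).all())  # True
```

Calling this once at the top of the training script (before the data loader is constructed) would at least rule out seed variance as the source of the 2% gap.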
Has anyone been able to reproduce the results on AwA1 and the other datasets? I would really appreciate your help!