DSS-pytorch
Result comparison with the paper
Have you compared the results with the paper?
Sorry, I haven't compared with the paper. (As far as I know, the author has released the dataset now, so you can train it yourself.) I am not working on this recently, so I am sorry ~
Any news on that? :)
So, has anyone compared the results of this work with the paper?
I will do it in October. (I am job hunting at the moment, so please forgive me.)
Emm, I just want to make sure this repo works well.
@holyhao I have updated the code and the results of the pre-version; I will update the results of v2 (using learnable fusion) tomorrow.
Thanks for your work. By the way, did you train the net with a larger batch size and learning rate? I see that you set the batch size to only 8, and the learning rate is even smaller.
- I think it is better to use a larger learning rate (I find the loss curve decreases too slowly at the beginning), but I did not try it (the learning rate in the paper is very small, 1e-8).
- You can try a larger batch size. You can also use VGG with batch normalization (I use a pre-trained VGG without BN layers; see the sketch after this list), and using a model pre-trained in Caffe may improve the results, since many projects find Caffe-pretrained models perform well. You are welcome to share your results, thank you ~
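For reference, a minimal sketch of swapping in the batch-norm variant of VGG-16 with torchvision (this uses the older `pretrained=True` API; newer torchvision versions use `weights=...` instead, and wiring the trunk into this repo's side outputs is not shown):

```python
import torchvision.models as models

# VGG-16 without batch norm (the backbone variant I currently use).
vgg = models.vgg16(pretrained=True)

# VGG-16 with batch norm, as suggested above; the BN variant tends to be
# more stable with larger learning rates and batch sizes.
vgg_bn = models.vgg16_bn(pretrained=True)

# Only the convolutional trunk is used as the saliency backbone;
# the ImageNet classifier head is discarded.
backbone = vgg_bn.features
```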
I tried lr=1e-4 with a ResNet-18 backbone; it works fine and converges faster. But when I try a larger batch size like 32 or 48, it converges more slowly and gets worse validation results. This confuses me: as far as I know, a larger batch size should lead to better results. Do you have any ideas about this?
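Roughly the setup I mean, as a sketch (assuming the standard torchvision ResNet-18 and a plain Adam optimizer; the decoder, loss, and training loop are omitted):

```python
import torch
import torchvision.models as models

# ResNet-18 trunk as the backbone: drop the avgpool and fc head.
resnet = models.resnet18(pretrained=True)
backbone = torch.nn.Sequential(*list(resnet.children())[:-2])

# lr=1e-4 converged noticeably faster for me than the paper's 1e-8.
optimizer = torch.optim.Adam(backbone.parameters(), lr=1e-4)
```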
I do not have a machine with large GPU memory, so I have no "engineering experience" with large batch sizes. However, there are several discussions about batch size:
- stack exchange
- *Deep Learning* by Ian Goodfellow, p. 172 (Chinese edition): a small batch size may have a regularizing effect, giving better generalization.
However, I think there must be some practical tips for training models with a larger batch size; one commonly cited heuristic is sketched below. (I am sorry I don't have better suggestions. :cry:)
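For example, the linear scaling rule with warmup from the large-minibatch SGD literature: scale the learning rate proportionally with the batch size, and ramp it up over the first few epochs. A sketch only; I have not verified it on this model, and `base_lr`, `base_batch_size`, and `warmup_epochs` here are illustrative numbers, not tuned values:

```python
base_lr = 1e-4          # lr tuned at the base batch size
base_batch_size = 8     # batch size that base_lr was tuned for
batch_size = 32         # the larger batch size being tried
warmup_epochs = 5

# Linear scaling rule: grow the lr proportionally with the batch size.
scaled_lr = base_lr * batch_size / base_batch_size

def lr_at_epoch(epoch):
    """Linear warmup from base_lr to scaled_lr, then hold constant."""
    if epoch < warmup_epochs:
        return base_lr + (scaled_lr - base_lr) * epoch / warmup_epochs
    return scaled_lr
```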
Your reply really inspires me; thank you very much.