SETR-pytorch

doubt about patch embeddings

Open zaocan666 opened this issue 4 years ago • 4 comments

Hi~ awesome repo. But I wonder whether it is necessary to apply GELU and LayerNorm after the linear layer to get the patch embedding. Neither the ViT paper nor its code applies these layers. What I'm referring to is Lines 269 and 270 in /SETR/transformer_model.py.

zaocan666 avatar Jan 24 '21 07:01 zaocan666
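For context, the difference under discussion can be sketched as follows. This is an illustrative module, not the repo's actual code; the class and attribute names (`PatchEmbedding`, `proj`, and the commented-out `act`/`norm`) are hypothetical, and the commented lines stand in for the extra GELU + LayerNorm at Lines 269-270 of /SETR/transformer_model.py that the ViT paper and code omit.

```python
import torch
import torch.nn as nn

class PatchEmbedding(nn.Module):
    """ViT-style patch embedding: a single linear projection of flattened
    patches. The two commented-out layers below are the extra steps that
    SETR-pytorch adds and that this thread asks about (names are
    illustrative, not copied from the repo)."""

    def __init__(self, patch_dim: int, embed_dim: int):
        super().__init__()
        self.proj = nn.Linear(patch_dim, embed_dim)
        # Extra layers applied in SETR-pytorch but not in the ViT paper/code:
        # self.act = nn.GELU()
        # self.norm = nn.LayerNorm(embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_patches, patch_dim) of flattened image patches
        x = self.proj(x)
        # x = self.norm(self.act(x))  # the step under discussion
        return x

# Example: 16x16 RGB patches (16*16*3 = 768 values each) projected to 768-d tokens
emb = PatchEmbedding(patch_dim=16 * 16 * 3, embed_dim=768)
tokens = emb(torch.randn(2, 196, 16 * 16 * 3))
print(tokens.shape)  # torch.Size([2, 196, 768])
```

Either way the output shape is the same; the question is only whether the nonlinearity and normalization after the projection help or hurt.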

The Transformer paper does use these two operations; you can check the model architecture in the original paper, Attention Is All You Need.

920232796 avatar Jan 24 '21 07:01 920232796

But the ViT paper and code don't seem to use them.

zaocan666 avatar Jan 24 '21 07:01 zaocan666

I'm not sure about that either. To be honest, I haven't read that paper or its code... You could try removing them; I don't think it should make much difference.

920232796 avatar Jan 24 '21 07:01 920232796

OK, thanks.

zaocan666 avatar Jan 24 '21 07:01 zaocan666