
Embedding dimension 768?

Open · Capchenxi opened this issue

Hello, quick question: I noticed that in the paper you say, "In our experiments, we set H to 768 unless stated otherwise."

  1. Why 768? Is there any reason for this number? Since 768 = 3 × 256 and the input images are 3 × 256 × 256, I wonder if that number is related to the shape of your input images or has some other special meaning.
  2. In the code you provide, there is an embedding-dimension list [32, 64, 128, 512]. What is the relationship between the numbers in this list and 768? Are these numbers the "stated otherwise"? Thanks!

— Capchenxi, Apr 14 '22

Same question. 768 is inconsistent with the embed_dim (=[32, 64, 128, 512]) in the code. What is the relationship between them? I'm trying to train on 3D medical images with this method and need to adjust these hyperparameters. Looking forward to your reply, thanks!

— dydxdt, May 07 '22

The embedding dimension is 768 while tokenizing. Please refer to this line: https://github.com/jeya-maria-jose/UNeXt-pytorch/blob/6ad0855114a35afbf81decf5dc912cd8de70476a/archs.py#L163
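
For anyone else landing here: the idea is that the tokenization step projects a convolutional feature map into a sequence of tokens of a fixed width before the tokenized-MLP blocks. Below is a minimal sketch of that kind of overlapping patch-embedding layer; the class name, argument defaults, and the example shapes are illustrative assumptions, not copied from the repo:

```python
import torch
import torch.nn as nn

class OverlapPatchEmbed(nn.Module):
    """Project a feature map into a sequence of tokens.

    A strided convolution both downsamples and changes the channel
    count to `embed_dim`; the spatial grid is then flattened so that
    each spatial location becomes one token of width `embed_dim`.
    """

    def __init__(self, patch_size=3, stride=2, in_chans=128, embed_dim=768):
        super().__init__()
        self.proj = nn.Conv2d(in_chans, embed_dim,
                              kernel_size=patch_size, stride=stride,
                              padding=patch_size // 2)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, x):
        x = self.proj(x)                  # (B, embed_dim, H', W')
        _, _, H, W = x.shape
        x = x.flatten(2).transpose(1, 2)  # (B, H'*W', embed_dim) tokens
        x = self.norm(x)
        return x, H, W

# Hypothetical example: a 3x256x256 input downsampled by earlier conv
# stages to (B, 128, 32, 32), then tokenized with embed_dim=768,
# yields 16*16 = 256 tokens of width 768.
tokens, H, W = OverlapPatchEmbed(in_chans=128, embed_dim=768)(
    torch.randn(1, 128, 32, 32))
print(tokens.shape, H, W)  # torch.Size([1, 256, 768]) 16 16
```

On this reading, a list like [32, 64, 128, 512] sets the channel widths of the individual encoder/decoder stages, while 768 is the token width at the tokenization step the author links above; treat that mapping as an interpretation of the linked line rather than a confirmed spec.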

— jeya-maria-jose, Dec 21 '22