UNeXt-pytorch
Embedding dimension 768?
Hello, quick question: I noticed that in the paper you said, "In our experiments, we set H to 768 unless stated otherwise."
- Why 768? Is there a reason behind this number? Since 3 × 256 = 768 and the inputs are 3 × 256 × 256 images, I wonder whether that number is related to the shape of your input images or has some other special meaning.
- In the code you provide, there is an embedding-dimension vector containing [32, 64, 128, 512]. What is the relationship between the numbers in this vector and 768? Are these numbers the "stated otherwise" cases? Thanks!
Same question: 768 is inconsistent with the embed_dim (= [32, 64, 128, 512]) in the code. What is the relationship between them? I'm trying to train on 3D medical images with this method and need to adjust these hyperparameters. Looking forward to your reply, thanks!
The embedding dimension is 768 during tokenization. Please refer to this line: https://github.com/jeya-maria-jose/UNeXt-pytorch/blob/6ad0855114a35afbf81decf5dc912cd8de70476a/archs.py#L163
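For anyone landing here later, this is a minimal sketch of how a conv-based tokenization (overlap patch-embedding) layer of the kind linked above typically works. It is not the repo's exact code; the class name, channel counts, and spatial sizes are illustrative assumptions. The point is that the tokenized embedding width (768 in the paper) is its own hyperparameter, separate from the per-stage channel list such as [32, 64, 128, 512]:

```python
# Hypothetical sketch, NOT the repo's exact implementation:
# a strided convolution turns a feature map into a sequence of tokens,
# each of width embed_dim (768 in the paper's default setting).
import torch
import torch.nn as nn

class OverlapPatchEmbedSketch(nn.Module):
    """Tokenize a feature map into a sequence of embedding vectors."""
    def __init__(self, patch_size=3, stride=2, in_chans=128, embed_dim=768):
        super().__init__()
        # Each spatial position of the conv output becomes one token.
        self.proj = nn.Conv2d(in_chans, embed_dim,
                              kernel_size=patch_size, stride=stride,
                              padding=patch_size // 2)
        self.norm = nn.LayerNorm(embed_dim)

    def forward(self, x):
        x = self.proj(x)                  # (B, embed_dim, H', W')
        _, _, H, W = x.shape
        x = x.flatten(2).transpose(1, 2)  # (B, H'*W', embed_dim) tokens
        x = self.norm(x)
        return x, H, W

# Usage: a 128-channel 32x32 feature map becomes 256 tokens of width 768.
tokens, H, W = OverlapPatchEmbedSketch()(torch.randn(1, 128, 32, 32))
print(tokens.shape)  # torch.Size([1, 256, 768])
```

So the [32, 64, 128, 512]-style values set the channel widths of the network stages, while 768 would be the width of the token vectors produced at the tokenization step.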