ViT-Adapter

In deformable attention, why is sampling_offsets.bias initialized as an arithmetic progression and set to have no gradient?

Open peterant330 opened this issue 2 years ago • 2 comments

Hi, this is really cool work, but I have some difficulty understanding this code:

https://github.com/czczup/ViT-Adapter/blob/968f6b008bdc4f84e2a637c986acc139b38e8083/detection/ops/modules/ms_deform_attn.py#L66-L72

I am curious about the mechanism behind how you initialize sampling_offsets.bias and why it is frozen during training.

peterant330 avatar Jul 10 '23 12:07 peterant330

sampling_offsets.bias is not frozen during training, because the no_grad here does not take effect: torch.no_grad() only disables gradient tracking for tensor operations, and the freshly assigned nn.Parameter is still created with requires_grad=True.

About the initialization, in simple terms, it places the sampling points on a circle around the query point.
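
For reference, a small standalone sketch of what that initialization computes (it mirrors the pattern in the linked ms_deform_attn.py with the same variable names, but is written here from scratch rather than copied verbatim):

```python
import math
import torch

n_heads, n_levels, n_points = 8, 4, 4

# One angle per head, evenly spaced around the full circle.
thetas = torch.arange(n_heads, dtype=torch.float32) * (2.0 * math.pi / n_heads)

# Unit direction (cos, sin) for each head, rescaled so that the larger of the
# two coordinates becomes +-1 (directions on the boundary of the unit square).
grid_init = torch.stack([thetas.cos(), thetas.sin()], -1)
grid_init = grid_init / grid_init.abs().max(-1, keepdim=True)[0]

# Broadcast each head's direction to every level and point ...
grid_init = grid_init.view(n_heads, 1, 1, 2).repeat(1, n_levels, n_points, 1)

# ... and push the i-th point (i + 1) steps along that direction; this is the
# "arithmetic progression" mentioned in the issue title.
for i in range(n_points):
    grid_init[:, :, i, :] *= i + 1

# Flattened, this matches the shape of sampling_offsets.bias:
# n_heads * n_levels * n_points * 2 = 256 values.
print(grid_init.view(-1).shape)
```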

You can watch this video for more information about deformable attention.

czczup avatar Jul 10 '23 14:07 czczup

Thanks for your explanation. I guess you want the sampling points to form a circle around the query. However, I don't understand why the length of thetas is n_heads rather than n_points, or what the for loop does. If you only have one head but multiple sampling points, then I guess you will end up with n points that form a line starting from the reference point.
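
For what it's worth, a quick standalone check under the initialization sketched above seems to confirm that reading: within a single head the initial points are collinear (scaled copies of one direction), and it is only across heads that the directions spread around the query.

```python
import math
import torch

n_heads, n_points = 8, 4

# Per-head directions, as in the initialization above.
thetas = torch.arange(n_heads, dtype=torch.float32) * (2.0 * math.pi / n_heads)
dirs = torch.stack([thetas.cos(), thetas.sin()], -1)
dirs = dirs / dirs.abs().max(-1, keepdim=True)[0]

# Initial offsets of the first two heads: each head's points lie on one ray.
for h in range(2):
    offsets = [(dirs[h] * (i + 1)).tolist() for i in range(n_points)]
    print(f"head {h}: {offsets}")
# head 0: [[1.0, 0.0], [2.0, 0.0], [3.0, 0.0], [4.0, 0.0]]   (along +x)
# head 1: ~[[1.0, 1.0], [2.0, 2.0], [3.0, 3.0], [4.0, 4.0]]  (along the diagonal)
```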

peterant330 avatar Jul 10 '23 15:07 peterant330