
Fix: add the import necessary for using attention layers as bottlenecks

rasmuspjohansson opened this issue 1 year ago • 1 comment

This adds the import `from fastai.layers import _get_norm`, which is needed for the following code to run:

```python
from fastai.layers import _get_norm

def BatchNormZero(nf, ndim=2, **kwargs):
    "BatchNorm layer with `nf` features and `ndim` initialized depending on norm_type. Weights initialized to zero."
    return _get_norm('BatchNorm', nf, ndim, zero=True, **kwargs)
```

After this fix it becomes possible to train a timm-based U-Net with `attention` or `double_attention` as the bottleneck.
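For context, here is a minimal plain-PyTorch sketch (not fastai's actual implementation) of what `BatchNormZero` produces: a BatchNorm layer whose scale (gamma) starts at zero, so a residual or attention branch ending in it initially contributes nothing and the surrounding block starts out as an identity mapping. The helper name `batch_norm_zero` below is hypothetical, used only for illustration.

```python
import torch
import torch.nn as nn

def batch_norm_zero(nf: int, ndim: int = 2) -> nn.Module:
    "Return a BatchNorm layer for `nf` features with its weight (gamma) initialized to zero."
    # Pick the BatchNorm class matching the spatial dimensionality,
    # mirroring the ndim argument of fastai's BatchNormZero.
    bn_cls = {1: nn.BatchNorm1d, 2: nn.BatchNorm2d, 3: nn.BatchNorm3d}[ndim]
    bn = bn_cls(nf)
    nn.init.zeros_(bn.weight)  # gamma = 0; beta defaults to 0 as well
    return bn

bn = batch_norm_zero(8)
x = torch.randn(4, 8, 16, 16)
# With gamma = 0 and beta = 0, the layer's output is exactly zero at init,
# so a branch ending in this layer adds nothing to a skip connection.
print(bn(x).abs().max().item())  # 0.0 at initialization
```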

rasmuspjohansson avatar Jun 20 '24 12:06 rasmuspjohansson


Thanks!

muellerzr avatar Jul 03 '24 10:07 muellerzr