rasmus johansson
Evaluating models usually goes faster with the GPU enabled. This simple change lets model_weights_as_vector() and model_weights_as_dict() also work when torch uses the GPU for computation; a sketch follows below.
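A minimal sketch of the idea, assuming these helpers flatten a torch model's state_dict into a numpy vector and back (function names like these appear in PyGAD's torchga module, but the exact code in the change may differ). The key point is calling .cpu() before .numpy(), which otherwise raises for CUDA tensors:

```python
import numpy as np
import torch

def model_weights_as_vector(model):
    # Flatten every tensor in the state_dict into one 1-D numpy vector.
    # .cpu() is the key addition: .numpy() raises on CUDA tensors otherwise.
    chunks = [p.detach().cpu().numpy().ravel() for p in model.state_dict().values()]
    return np.concatenate(chunks)

def model_weights_as_dict(model, weights_vector):
    # Rebuild a state_dict from the flat vector, restoring each tensor's shape,
    # dtype and device so it loads straight back onto a GPU-resident model.
    weights_dict, start = {}, 0
    for name, param in model.state_dict().items():
        size = param.numel()
        chunk = np.reshape(weights_vector[start:start + size], tuple(param.shape))
        weights_dict[name] = torch.as_tensor(chunk, dtype=param.dtype, device=param.device)
        start += size
    return weights_dict
```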
The function textsize() is deprecated and does not work on newer Python (3.9). The new solution is validated to work under Python 3.9; see the sketch below.
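Assuming this refers to Pillow's ImageDraw.textsize(), which recent Pillow releases deprecate and later remove, textbbox() is the usual replacement. A minimal sketch of that swap (the actual code in the change may differ):

```python
from PIL import Image, ImageDraw, ImageFont

img = Image.new("RGB", (240, 60), "white")
draw = ImageDraw.Draw(img)
font = ImageFont.load_default()
text = "validated under Python 3.9"

# Old call, no longer available in recent Pillow releases:
# w, h = draw.textsize(text, font=font)

# Replacement: textbbox() returns (left, top, right, bottom).
left, top, right, bottom = draw.textbbox((0, 0), text, font=font)
w, h = right - left, bottom - top
draw.text(((240 - w) / 2, (60 - h) / 2), text, fill="black", font=font)
```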
Adding the import `from fastai.layers import _get_norm` so that `def BatchNormZero(nf, ndim=2, **kwargs)` ("BatchNorm layer with `nf` features and ...") can run; a sketch follows below.
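For reference, a minimal sketch of how `BatchNormZero` might be written on top of `_get_norm`, assuming it mirrors fastai's `BatchNorm` helper with the layer's weights initialized to zero (the actual notebook code may differ):

```python
from fastai.layers import _get_norm

def BatchNormZero(nf, ndim=2, **kwargs):
    "BatchNorm layer with `nf` features and `ndim`, weights initialized to zero."
    # _get_norm builds nn.BatchNorm{ndim}d(nf) and, with zero=True,
    # fills the affine weight with zeros instead of ones.
    return _get_norm('BatchNorm', nf, ndim, zero=True, **kwargs)
```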