
Freezing of batch normalization layers during fine-tuning

himanshunaidu opened this issue 8 months ago · 0 comments

Greetings! First of all, I really appreciate the work you have done. It has been very useful for my experiments.

Now, to my question: I have seen that, for fine-tuning, the batch normalization layers are generally frozen. Did the authors of this repository try anything like that?

Just curious about it. I do plan to implement it myself (any tips on that would be appreciated, by the way), but I was wondering whether the authors of this repo or the paper have any insights on it. For reference, the kind of approach I had in mind is sketched below.
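
This is only a rough sketch of what I mean by "freezing" BN, not code from this repo: put every BatchNorm layer in eval mode so its running statistics stop updating, and turn off gradients for its affine parameters. The helper name `freeze_batchnorm` is just a placeholder I made up.

```python
import torch.nn as nn

BN_TYPES = (nn.BatchNorm1d, nn.BatchNorm2d, nn.BatchNorm3d, nn.SyncBatchNorm)

def freeze_batchnorm(model: nn.Module) -> nn.Module:
    """Freeze all BatchNorm layers in `model` for fine-tuning."""
    for module in model.modules():
        if isinstance(module, BN_TYPES):
            # eval() makes the layer use its stored running mean/var
            # instead of per-batch statistics.
            module.eval()
            # Also stop gradient flow to the learnable affine
            # parameters (weight and bias).
            for param in module.parameters():
                param.requires_grad = False
    return model
```

One caveat I'm aware of: calling `model.train()` at the start of each epoch puts the BN layers back into training mode, so the `eval()` part would need to be re-applied after every `model.train()` call (or `train()` overridden). The frozen parameters could also be filtered out of the optimizer via `requires_grad`.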

himanshunaidu · May 01 '25