tf_unet
How to fine-tune a U-Net pre-trained using this library?
Given that I have pre-trained a model using this code, how can I fine-tune it on a different dataset? How can I replace the softmax or freeze the conv layers?
Thanks a lot.
The package doesn't provide an out-of-the-box solution for this. You could take the list of variables and pass an adapted version of it to the minimize function.
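To make the suggestion concrete: the idea is to filter the variable list before handing it to the optimizer, e.g. `optimizer.minimize(cost, var_list=vars_to_train)` in TensorFlow 1.x, so frozen layers receive no gradient updates. The filtering itself is plain Python. A minimal sketch, where the variable names are hypothetical examples of what a U-Net checkpoint might contain (not the exact names tf_unet produces):

```python
# Sketch: choosing which variables stay trainable when fine-tuning.
# The selected names would then be mapped back to variable objects
# and passed to the optimizer via var_list.

def select_trainable(all_names, freeze_prefixes):
    """Return the names that should stay trainable, i.e. those
    not starting with any frozen prefix."""
    return [n for n in all_names
            if not any(n.startswith(p) for p in freeze_prefixes)]

# Hypothetical variable names for illustration:
names = [
    "down_conv_0/w1", "down_conv_0/b1",   # early encoder layers
    "down_conv_1/w1", "down_conv_1/b1",
    "up_conv_0/w1",   "up_conv_0/b1",     # decoder layers
    "output/weight",  "output/bias",      # final 1x1 conv before softmax
]

# Freeze the encoder, fine-tune the decoder and the output layer:
trainable = select_trainable(names, ["down_conv_"])
print(trainable)
```

With the real model you would filter `self.variables` (or `tf.trainable_variables()`) the same way and pass the surviving variable objects as `var_list`.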
Thank you very much for the reply. I have seen people do fine-tuning using the method you mentioned. But in the source code, where and how can I do it if I would like to replace the softmax? In the "init" of the "Unet" class, I can see "logits, self.variables, self.filters, self.offset = create_conv_net(self.x, self.keep_prob, channels, n_class, layers,...)". Do I need to do something with "logits"?
It would be great if you could give me a specific example for doing that.
Thanks again.
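For intuition on what "replacing the softmax" amounts to: the logits returned by create_conv_net are produced by a final 1x1 convolution, so adapting to a new number of classes means attaching a fresh 1x1 conv plus softmax on top of the pre-trained feature map. A hedged NumPy sketch of that idea (not the library's API; the function and shapes here are illustrative assumptions):

```python
import numpy as np

def new_output_layer(features, new_n_class, rng):
    """features: (H, W, C) feature map from the pre-trained trunk.
    Returns per-pixel class probabilities, shape (H, W, new_n_class)."""
    h, w, c = features.shape
    # Freshly initialized 1x1 convolution == per-pixel linear map C -> new_n_class
    weight = rng.standard_normal((c, new_n_class)) * 0.1
    bias = np.zeros(new_n_class)
    logits = features.reshape(-1, c) @ weight + bias
    # Numerically stable softmax over the class dimension
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = e / e.sum(axis=1, keepdims=True)
    return probs.reshape(h, w, new_n_class)

rng = np.random.default_rng(0)
features = rng.standard_normal((4, 4, 8))  # toy feature map
probs = new_output_layer(features, new_n_class=3, rng=rng)
print(probs.shape)                            # (4, 4, 3)
print(np.allclose(probs.sum(axis=-1), 1.0))   # True: rows form a distribution
```

In TensorFlow terms this would correspond to keeping the trunk's variables, defining a new output conv with new_n_class filters, and training only that layer (or the subset chosen above) via var_list.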
https://github.com/jakeret/tf_unet/search?q=softmax&unscoped_q=softmax
Sorry, I did not follow you. I have no idea why you sent me a search link for two source files.
It would be appreciated if you could provide a specific example of fine-tuning with your implementation.
Thanks.