
How to fine-tune a U-Net pre-trained using this library?

Open pity2003 opened this issue 6 years ago • 4 comments

Given that I have pre-trained a model using this code, how can I fine-tune it on a different dataset? How do I replace the softmax or freeze the conv layers?

Thanks a lot.

pity2003 avatar Mar 15 '19 18:03 pity2003

The package doesn't provide an out-of-the-box solution for this. You could use the list of variables and pass an adapted version of it to the minimize function.
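A minimal sketch of that idea in plain Python (no TensorFlow needed to illustrate it): tf_unet exposes the network's variables as a list (`self.variables`), and freezing layers amounts to passing only the variables you still want to train to the optimizer's `minimize(loss, var_list=...)` call. The variable names below are hypothetical, chosen only to mimic a U-Net's layer structure.

```python
# Sketch: select the subset of variables that should remain trainable.
# In TF1 this subset would then be passed as:
#   optimizer.minimize(cost, var_list=trainable_vars)

def trainable_subset(all_variable_names, frozen_prefixes):
    """Return the names that stay trainable, i.e. those whose name
    does not start with any of the frozen prefixes."""
    return [
        name for name in all_variable_names
        if not any(name.startswith(p) for p in frozen_prefixes)
    ]

# Hypothetical variable names mimicking a U-Net's contracting path,
# expanding path, and output head.
variables = [
    "down_conv_0/w1", "down_conv_0/w2",
    "down_conv_1/w1", "down_conv_1/w2",
    "up_conv_0/w1", "up_conv_0/w2",
    "output/weight", "output/bias",
]

# Freeze the contracting path; keep the expanding path and output head.
to_train = trainable_subset(variables, frozen_prefixes=("down_conv_",))
print(to_train)
# → ['up_conv_0/w1', 'up_conv_0/w2', 'output/weight', 'output/bias']
```

The same filtering works on real `tf.Variable` objects by matching on `var.name` instead of plain strings.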

jakeret avatar Mar 18 '19 13:03 jakeret

The package doesn't provide an out-of-the-box solution for this. You could use the list of variables and pass an adapted version of it to the minimize function.

Thank you very much for the reply. I have seen some people do fine-tuning using the method you mentioned. But in the source code, where and how can I do this if I want to replace the softmax? In the "init" of the "unet" class, I can see "logits, self.variables, self.filters, self.offset = create_conv_net(self.x, self.keep_prob, channels, n_class, layers,...)". Do I need to do something with "logits"?

It would be great if you could give me a specific example for doing that.

Thanks again.

pity2003 avatar Mar 18 '19 14:03 pity2003

https://github.com/jakeret/tf_unet/search?q=softmax&unscoped_q=softmax
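Conceptually, "replacing the softmax" for a new dataset means attaching a fresh output head: a new 1×1 convolution that maps the final feature maps to the new number of classes, followed by a pixel-wise softmax. A library-agnostic NumPy sketch of that head (all shapes, names, and values here are illustrative, not the tf_unet API):

```python
import numpy as np

def new_head(features, w, b):
    """A 1x1 convolution (an einsum over the channel axis) followed by
    a numerically stable pixel-wise softmax."""
    logits = np.einsum("bhwc,cn->bhwn", features, w) + b
    e = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
features = rng.standard_normal((1, 4, 4, 8))   # last feature maps [batch, h, w, c]
w = rng.standard_normal((8, 3)) * 0.1          # fresh weights for 3 new classes
b = np.zeros(3)

probs = new_head(features, w, b)
print(probs.shape)  # (1, 4, 4, 3): per-pixel class probabilities
```

In the TF1 code this corresponds to adding a new 1×1 conv on the tensor feeding the final `logits` and training that layer's (randomly initialized) weights, optionally with the earlier layers excluded from `var_list` as discussed above.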

jakeret avatar Mar 19 '19 12:03 jakeret

https://github.com/jakeret/tf_unet/search?q=softmax&unscoped_q=softmax

Sorry - I did not follow you. I have no idea why you sent me a search link pointing at two source files.

I would appreciate it if you could provide a specific example of fine-tuning with your implementation.

Thanks.

pity2003 avatar Mar 19 '19 12:03 pity2003