
Adding different loss to tf.estimator.Head

Open le-dawg opened this issue 5 years ago • 2 comments

Hi all,

I seek to optimize an ensemble for binary classification. My established baseline uses the binary_crossentropy loss provided by Keras. Using the same notation here yields an "unsupported callable" error, because it seems that base_head.Head() does not tie into the default TF implementation.

What can I do to train with binary crossentropy?
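For context, the baseline looks roughly like the following (the architecture and input shape are only placeholders, not my actual setup):

```python
# Minimal sketch of the Keras baseline; layers and input_shape are placeholders.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(
    optimizer="adam",
    loss="binary_crossentropy",  # the loss I would like the AdaNet head to use
    metrics=["accuracy"],
)
```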

le-dawg avatar Apr 14 '20 19:04 le-dawg

Have you tried tf.estimator.BinaryClassHead()? I believe it uses the same loss under the hood, specifically sigmoid_cross_entropy.
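For what it's worth, here is a quick numeric check (not part of the original exchange) that Keras's binary_crossentropy on logits agrees with TF's sigmoid cross-entropy, which is the loss BinaryClassHead is believed to use under the hood:

```python
# Sanity check: Keras binary_crossentropy (from_logits=True) vs.
# tf.nn.sigmoid_cross_entropy_with_logits; the numbers are arbitrary examples.
import numpy as np
import tensorflow as tf

logits = np.array([[-1.2], [0.3], [2.5]], dtype=np.float32)
labels = np.array([[0.0], [1.0], [1.0]], dtype=np.float32)

keras_loss = tf.keras.losses.binary_crossentropy(labels, logits, from_logits=True)
tf_loss = tf.nn.sigmoid_cross_entropy_with_logits(labels=labels, logits=logits)

print(keras_loss.numpy())              # per-example loss
print(tf_loss.numpy().mean(axis=-1))   # should match the line above
```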

cweill avatar Apr 17 '20 22:04 cweill

Well, I figured that out in the meantime. But a problem persists: BinaryClassHead uses the correct loss, but when I predict with a simple_dnn AdaNet ensemble trained for three iterations via estimator.predict(), the network always predicts class 0.

The same head works with the canned tf.estimator.LinearClassifier, which makes me doubt that an incorrect loss function is the problem. I can't troubleshoot the AdaNet estimator any deeper than this. I will take any help I can get!
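Here is a rough sketch of how I could inspect the raw probabilities rather than only the predicted class (estimator and predict_input_fn stand in for my own estimator and input function; the exact prediction dict keys may vary by TF version):

```python
# Inspect raw probabilities from the AdaNet estimator to see whether the
# outputs are truly collapsed near 0 or just sitting below the 0.5 threshold.
# `estimator` and `predict_input_fn` are assumed from the surrounding setup;
# prediction keys ("probabilities", "class_ids") may differ by TF version.
import numpy as np

preds = list(estimator.predict(input_fn=predict_input_fn))

probs = np.array([p["probabilities"][1] for p in preds])   # P(class 1)
class_ids = np.array([p["class_ids"][0] for p in preds])

print("P(class 1) min/mean/max:", probs.min(), probs.mean(), probs.max())
print("fraction predicted as class 1:", (class_ids == 1).mean())
```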

le-dawg avatar Apr 19 '20 22:04 le-dawg