fast-bert
I wrote a model for a single class, but y_pred looks like [0.4, 0.5]

I wrote a model for a single class whose labels are two classes, neg and pos, but y_pred comes back like [0.4, 0.5]. Because of that I can't get F1: fbeta raises an error. In your code:
```python
def fbeta(
    y_pred: Tensor,
    y_true: Tensor,
    thresh: float = 0.3,
    beta: float = 2,
    eps: float = 1e-9,
    sigmoid: bool = True,
):
    "Computes the f_beta between `preds` and `targets`"
    beta2 = beta ** 2
    if sigmoid:
        y_pred = y_pred.sigmoid()
    y_pred = (y_pred > thresh).float()
    y_true = y_true.float()
    TP = (y_pred * y_true).sum(dim=1)
```
Why doesn't it return a single value like [0.4]?
```
Traceback (most recent call last):
  File "/home/wac/fast-bert/single_classifier.py", line 76, in <module>
    optimizer_type="lamb")
  File "/home/wac/fast-bert/fast_bert/learner_cls.py", line 405, in fit
    results = self.validate()
  File "/home/wac/fast-bert/fast_bert/learner_cls.py", line 523, in validate
    all_logits, all_labels
  File "/home/wac/fast-bert/fast_bert/metrics.py", line 112, in F1
    return fbeta(y_pred, y_true, thresh=threshold, beta=1)
  File "/home/wac/fast-bert/fast_bert/metrics.py", line 58, in fbeta
    TP = (y_pred * y_true).sum(dim=1)
  File "/home/wac/.local/lib/python3.6/site-packages/apex/amp/wrap.py", line 53, in wrapper
    return orig_fn(*args, **kwargs)
RuntimeError: The size of tensor a (2) must match the size of tensor b (860) at non-singleton dimension 1
```
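The error can be reproduced in isolation: fbeta assumes y_true is multi-hot encoded with the same (batch, classes) shape as y_pred, but single-label training passes class indices of shape (batch,). A minimal sketch (the tensors here are invented for illustration, with a batch of 3 instead of 860):

```python
import torch

# Shapes as in the traceback: y_pred is (batch, 2) class scores,
# y_true is (batch,) integer class labels.
y_pred = torch.tensor([[0.4, 0.5], [0.7, 0.2], [0.1, 0.8]])
y_true = torch.tensor([1, 0, 1])

try:
    TP = (y_pred * y_true).sum(dim=1)  # (3, 2) vs (3,) cannot broadcast
except RuntimeError as e:
    print(e)  # "The size of tensor a (2) must match the size of tensor b (3) ..."

# One-hot encoding the labels restores the (batch, classes) shape that
# fbeta expects, so the elementwise product works again.
y_true_onehot = torch.nn.functional.one_hot(y_true, num_classes=2).float()
TP = (y_pred * y_true_onehot).sum(dim=1)
print(TP)  # one value per example
```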
I think this is because y_pred contains scores for both pos and neg. fbeta would therefore need extra logic to convert y_pred to 'pos' or 'neg' based on those scores, perhaps whichever has the highest value, and then compare that to y_true.
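One way to sketch that suggestion (an illustration, not fast-bert's actual fix; the function name and tensors are invented): take the argmax over the class dimension to collapse [neg_score, pos_score] to a single predicted label, then compute binary F1 against the integer labels.

```python
import torch

def f1_from_class_scores(y_pred: torch.Tensor, y_true: torch.Tensor) -> float:
    """Binary F1 where y_pred is (N, 2) class scores and y_true is (N,) 0/1 labels."""
    preds = y_pred.argmax(dim=1)  # whichever of neg/pos has the highest value
    tp = ((preds == 1) & (y_true == 1)).sum().item()
    fp = ((preds == 1) & (y_true == 0)).sum().item()
    fn = ((preds == 0) & (y_true == 1)).sum().item()
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

y_pred = torch.tensor([[0.4, 0.5], [0.7, 0.2], [0.1, 0.8], [0.6, 0.4]])
y_true = torch.tensor([1, 0, 1, 1])
print(f1_from_class_scores(y_pred, y_true))  # → 0.8 (precision 1.0, recall 2/3)
```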
softmax on final layer
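A sketch of what that reply suggests (example values invented): applying softmax to the final-layer scores makes the two outputs sum to 1, and argmax then picks the predicted class.

```python
import torch

logits = torch.tensor([[0.4, 0.5]])   # final-layer scores for [neg, pos]
probs = torch.softmax(logits, dim=1)  # normalized: each row sums to 1
pred_class = probs.argmax(dim=1)      # index of the winning class
print(pred_class.item())  # 1, i.e. "pos"
```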