Verifying that the model is running correctly
Is the result obtained by running the example (affinity.yaml) correct, and is it similar to yours? (affinity_affinity.json)

{
  "affinity_pred_value": 2.6687729358673096,
  "affinity_probability_binary": 0.388904333114624,
  "affinity_pred_value1": 2.8420357704162598,
  "affinity_probability_binary1": 0.3521701693534851,
  "affinity_pred_value2": 2.4955101013183594,
  "affinity_probability_binary2": 0.42563849687576294
}
These are my results:
{
  "affinity_pred_value": 2.5308895111083984,
  "affinity_probability_binary": 0.40284594893455505,
  "affinity_pred_value1": 2.6539573669433594,
  "affinity_probability_binary1": 0.3593267500400543,
  "affinity_pred_value2": 2.4078216552734375,
  "affinity_probability_binary2": 0.4463651478290558
}
As the saying goes, close enough for government work.
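If you want to check this programmatically rather than by eye, a small comparison against the reference output within a relative tolerance works. This is just a sketch: the tolerance of 10% and the dictionaries below (copied from the two outputs in this thread) are my own choices, not part of any official tooling.

```python
import math

def compare_affinity(reference, mine, rel_tol=0.10):
    """Return True if every key matches within a relative tolerance."""
    assert reference.keys() == mine.keys(), "outputs report different keys"
    return all(
        math.isclose(reference[k], mine[k], rel_tol=rel_tol)
        for k in reference
    )

# Reference values from affinity_affinity.json vs. the values from my run.
reference = {
    "affinity_pred_value": 2.6687729358673096,
    "affinity_probability_binary": 0.388904333114624,
}
mine = {
    "affinity_pred_value": 2.5308895111083984,
    "affinity_probability_binary": 0.40284594893455505,
}
print(compare_affinity(reference, mine))  # True: within 10% relative tolerance
```

Exact bit-for-bit agreement is not expected across machines (GPU nondeterminism, library versions), so a tolerance check like this is usually the right criterion.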
Very similar. Thank you.