
Add Precision@k and Recall@k metrics

Open Andron00e opened this issue 1 year ago • 0 comments

The Precision and Recall metrics currently supported by evaluate are only sklearn clones. It would be great to add top-k versions of these metrics.

For example, a simple implementation of the P@k metric is:

import pandas as pd
from sklearn import metrics

def precision_at_k(y_true, y_score, k):
    # Rank items by predicted score and treat the k highest-scoring items as positive predictions.
    df = pd.DataFrame({'true': y_true, 'score': y_score}).sort_values('score', ascending=False)
    threshold = df['score'].iloc[k - 1]
    y_pred = pd.Series([1 if s >= threshold else 0 for s in y_score])
    return metrics.precision_score(y_true, y_pred)
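
A Recall@k counterpart could follow the same pattern. Here is a minimal sketch under the same assumptions (binary labels, k counted as the number of top-ranked items, reusing the imports above); recall_at_k is a hypothetical name for illustration, not anything already in evaluate:

def recall_at_k(y_true, y_score, k):
    # Hypothetical sketch: same top-k thresholding as above, scored with recall instead of precision.
    df = pd.DataFrame({'true': y_true, 'score': y_score}).sort_values('score', ascending=False)
    threshold = df['score'].iloc[k - 1]
    y_pred = pd.Series([1 if s >= threshold else 0 for s in y_score])
    return metrics.recall_score(y_true, y_pred)

For instance, with y_true = [0, 1, 1, 0] and y_score = [0.1, 0.9, 0.8, 0.3], both precision_at_k(y_true, y_score, k=2) and recall_at_k(y_true, y_score, k=2) return 1.0, since both relevant items sit in the top two positions.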

Andron00e · Mar 02 '24 16:03