
[FEATURE] Add additional annotator agreement metrics

Open plaguss opened this issue 1 year ago • 2 comments

Is your feature request related to a problem? Please describe. I would like to see more agreement metrics for the FeedbackDataset.

Describe the solution you'd like We can initially add the metrics from nltk.metrics.agreement (the ones covered in Inter-Coder Agreement for Computational Linguistics).

Describe alternatives you've considered Don't add more metrics for the moment.

Additional context It should be added after the metrics module is merged. More information can be found here.

plaguss avatar Nov 20 '23 09:11 plaguss

I think the basic ones to add might be the following (all of which are included in nltk.metrics.agreement); a rough sketch of how they are exposed there is shown after the list:

  • Cohen's Kappa (for two annotators) (https://en.wikipedia.org/wiki/Cohen%27s_kappa)
    • Fleiss' Kappa (for multiple annotators) (https://en.wikipedia.org/wiki/Fleiss%27_kappa)
    • Weighted Kappa (for labels that are ordinal) (https://datatab.net/tutorial/weighted-cohens-kappa)
  • Bennett, Alpert and Goldstein's S (https://en.wikipedia.org/wiki/Bennett,_Alpert_and_Goldstein%27s_S)
  • Scott's Pi (https://en.wikipedia.org/wiki/Scott%27s_Pi)
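
For context, here is a minimal sketch (not Argilla code) of how these metrics can be computed with nltk.metrics.agreement.AnnotationTask; the (annotator, record, label) triples are made-up toy data, and the multi-annotator kappa shown is NLTK's multi_kappa() generalisation rather than Fleiss' kappa exactly:

```python
# Rough sketch: agreement metrics via NLTK's AnnotationTask.
# The toy (annotator, record, label) triples below are purely illustrative.
from nltk.metrics.agreement import AnnotationTask

data = [
    ("ann_1", "record_1", "positive"),
    ("ann_2", "record_1", "positive"),
    ("ann_1", "record_2", "negative"),
    ("ann_2", "record_2", "positive"),
    ("ann_1", "record_3", "neutral"),
    ("ann_2", "record_3", "neutral"),
]

task = AnnotationTask(data=data)

print("Cohen's kappa (averaged over annotator pairs):", task.kappa())
print("Multi-annotator kappa:", task.multi_kappa())
print("Bennett, Alpert and Goldstein's S:", task.S())
print("Scott's pi:", task.pi())
print("Krippendorff's alpha:", task.alpha())
# Weighted kappa for ordinal labels requires an ordinal distance function
# (e.g. nltk.metrics.distance.interval_distance) passed to AnnotationTask
# before calling task.weighted_kappa().
```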

Also, I could not get the KrippendorfAlpha metric in the module to work; it throws an error. If there is no issue for it yet, I can open one.

kursathalat avatar Dec 04 '23 12:12 kursathalat

Hi @kursathalat, thanks for reviewing this! Please talk to @davidberenstein1957 to check the status of the metrics module.

plaguss avatar Dec 04 '23 13:12 plaguss

This issue is stale because it has been open for 90 days with no activity.

github-actions[bot] avatar Mar 04 '24 01:03 github-actions[bot]

This issue was closed because it has been inactive for 30 days since being marked as stale.

github-actions[bot] avatar Apr 04 '24 01:04 github-actions[bot]