[FEATURE] Add additional annotator agreement metrics
Is your feature request related to a problem? Please describe.
I would like to see more agreement metrics for the FeedbackDataset.
Describe the solution you'd like
We can initially add the metrics from nltk.metrics.agreement (the ones covered in Inter-Coder Agreement for Computational Linguistics).
Describe alternatives you've considered
Don't add more metrics for the moment.
Additional context
It should be added after the metrics module is merged. More information can be found here.
I think the basic ones to add might be the following (all of which are included in nltk.metrics.agreement; a quick sketch of calling them follows the list):
- Cohen's Kappa (for two annotators) (https://en.wikipedia.org/wiki/Cohen%27s_kappa)
- Fleiss' Kappa (for multiple annotators) (https://en.wikipedia.org/wiki/Fleiss%27_kappa)
- Weighted Kappa (for labels that are ordinal) (https://datatab.net/tutorial/weighted-cohens-kappa)
- Bennett, Alpert and Goldstein's S (https://en.wikipedia.org/wiki/Bennett,_Alpert_and_Goldstein%27s_S)
- Scott's Pi (https://en.wikipedia.org/wiki/Scott%27s_Pi)
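As a rough sketch of what computing these directly with nltk looks like (the coder/item/label triples below are made-up toy data, and how this would be wrapped by the metrics module is not decided here):

```python
# Minimal sketch, assuming plain nltk; the annotations are hypothetical.
from nltk.metrics.agreement import AnnotationTask

# nltk's AnnotationTask consumes (coder, item, label) triples.
data = [
    ("coder_1", "item_1", "positive"),
    ("coder_1", "item_2", "negative"),
    ("coder_1", "item_3", "positive"),
    ("coder_2", "item_1", "positive"),
    ("coder_2", "item_2", "negative"),
    ("coder_2", "item_3", "negative"),
    ("coder_3", "item_1", "positive"),
    ("coder_3", "item_2", "positive"),
    ("coder_3", "item_3", "negative"),
]

task = AnnotationTask(data=data)

print("Cohen's kappa (averaged over coder pairs):", task.kappa())
print("Multi-kappa (Davies & Fleiss):", task.multi_kappa())
print("Scott's pi (multi-pi; Fleiss-style for multiple coders):", task.pi())
print("Bennett, Alpert and Goldstein's S:", task.S())
```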
Also, I could not make the KrippendorfAlpha in the module work; it throws an error. If there is no issue about it yet, I can open one.
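For what it's worth, nltk's own AnnotationTask exposes Krippendorff's alpha (and weighted kappa) through a pluggable distance function, which is also how ordinal labels would be handled. A minimal sketch with made-up ordinal ratings, assuming plain nltk rather than the metrics module:

```python
# Hypothetical ordinal ratings; interval_distance weights disagreements
# by squared difference, which is the idea behind weighted kappa and the
# distance used by Krippendorff's alpha for interval/ordinal data.
from nltk.metrics.agreement import AnnotationTask
from nltk.metrics.distance import interval_distance

ordinal_data = [
    ("coder_1", "item_1", 1),
    ("coder_1", "item_2", 3),
    ("coder_1", "item_3", 2),
    ("coder_2", "item_1", 2),
    ("coder_2", "item_2", 3),
    ("coder_2", "item_3", 2),
]

task = AnnotationTask(data=ordinal_data, distance=interval_distance)

print("Weighted kappa:", task.weighted_kappa())
print("Krippendorff's alpha:", task.alpha())
```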
Hi @kursathalat, thanks for reviewing this! Please talk to @davidberenstein1957 to check on the status of the metrics module.
This issue is stale because it has been open for 90 days with no activity.
This issue was closed because it has been inactive for 30 days since being marked as stale.