
Feat: Add normalize option to CER and WER metrics for normalized score calculation

Open skyil7 opened this pull request 9 months ago • 2 comments

This pull request introduces a normalize option to the compute() function of both the CER and WER metrics. When set to True, the metrics will calculate and return normalized scores.

This addresses the feature request raised in issue #161, which has been open since 2022. With normalization enabled, users can obtain CER and WER scores bounded between 0% and 100%, as requested in that issue.

The normalized CER is calculated as:

CER_normalized = (Insertions + Substitutions + Deletions) / (Insertions + Substitutions + Deletions + Correct Characters)

The normalized WER is calculated similarly, at the word level.
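To make the difference concrete, here is a minimal, self-contained sketch (not the PR's actual code) that counts edit operations with a Levenshtein alignment and computes both scores. Note how an insertion-heavy hypothesis pushes the standard CER above 1.0, while the normalized variant stays bounded because insertions also appear in the denominator.

```python
def edit_ops(ref, hyp):
    """Count (substitutions, deletions, insertions, correct) between two
    strings via dynamic-programming edit-distance alignment with backtrace."""
    m, n = len(ref), len(hyp)
    # dp[i][j] = minimal edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = i
    for j in range(1, n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # match / substitution
    # Backtrace to classify each operation.
    s = d = ins = c = 0
    i, j = m, n
    while i > 0 or j > 0:
        cost = 1
        if i > 0 and j > 0:
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
        if i > 0 and j > 0 and dp[i][j] == dp[i - 1][j - 1] + cost:
            if cost == 0:
                c += 1
            else:
                s += 1
            i, j = i - 1, j - 1
        elif i > 0 and dp[i][j] == dp[i - 1][j] + 1:
            d += 1
            i -= 1
        else:
            ins += 1
            j -= 1
    return s, d, ins, c

def cer(ref, hyp, normalize=False):
    """Standard CER divides by the reference length (S + D + C), so it can
    exceed 1.0; the normalized form adds insertions to the denominator."""
    s, d, ins, c = edit_ops(ref, hyp)
    if normalize:
        return (s + d + ins) / (s + d + ins + c)  # bounded in [0, 1]
    return (s + d + ins) / (s + d + c)
```

For example, `cer("a", "bcd")` gives 3.0 (300%) because three errors are divided by a one-character reference, while `cer("a", "bcd", normalize=True)` gives 1.0. The word-level version is identical with lists of words in place of character strings.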

skyil7 avatar Mar 24 '25 08:03 skyil7

Hello @lhoestq,

I hope you're doing well.

I'm writing to gently follow up on this PR. It's a small and straightforward change that introduces normalized versions of WER and CER for ASR evaluation.

The goal is to provide a metric that is more robust to outliers, which can heavily skew the standard scores. Although the implementation is minimal, I believe this addition offers significant value to researchers.

Since the change is quite small, I hope it will be quick to review. Please let me know if you have any feedback.

Thank you!

skyil7 avatar Aug 14 '25 04:08 skyil7

I'd love this to be implemented!

saattrupdan avatar Nov 27 '25 11:11 saattrupdan