LAVIS
BLIP image-text retrieval evaluation score
Hello, I appreciate the work you've done. I would like to ask how to interpret the image-text retrieval score. I received the following result:

{"txt_r1": 0.15360983102918588, "txt_r5": 0.7680491551459293, "txt_r10": 1.228878648233487, "txt_r_mean": 0.7168458781362007, "img_r1": 0.097007324052966, "img_r5": 0.5577921133045545, "img_r10": 1.261095212688558, "img_r_mean": 0.6386315500153595, "r_mean": 0.6777387140757801, "agg_metrics": 0.7168458781362007}

Could you please explain whether this is a good score, and why some of the metrics are greater than 1? Thank you very much!
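For context, this is how I understand recall@k metrics such as txt_r1 or img_r10 to be computed for image-text retrieval. The sketch below is my own simplification, not the actual LAVIS code: it assumes a hypothetical similarity matrix `sims` of shape (num_queries, num_candidates) where query i's ground-truth match is candidate i, and reports the score on a 0-100 (percent) scale.

```python
import numpy as np

def recall_at_k(sims: np.ndarray, k: int) -> float:
    """Percentage of queries whose ground-truth match appears in the top-k retrieved candidates."""
    num_queries = sims.shape[0]
    # Indices of the k highest-scoring candidates for each query (row), in descending order of similarity.
    topk = np.argsort(-sims, axis=1)[:, :k]
    # A query counts as a hit if its own index (the ground-truth pair) is among its top-k candidates.
    hits = sum(1 for i in range(num_queries) if i in topk[i])
    # Reported on a 0-100 scale, so a value between 0 and 1 would mean under 1% of queries are hits.
    return 100.0 * hits / num_queries

# Toy example with random similarities for 1000 query-candidate pairs.
rng = np.random.default_rng(0)
sims = rng.normal(size=(1000, 1000))
print({f"r{k}": recall_at_k(sims, k) for k in (1, 5, 10)})
```

If the reported numbers are indeed percentages on this 0-100 scale, is my understanding correct that values like 0.15 or 1.26 are very low?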