Weighted-Boxes-Fusion
How to set weights in wbf correctly?
Hi, @ZFTurbo ! Thank you for your amazing approach and code!
I have a question: what is the right way to choose the weights parameter? I have several models, and if I choose weights greater than 1, for example [2, 1], I get confidence values greater than 1 in the result. But if I choose normalized weights (summing to 1), the resulting values are less than 0.5.
Hi. It's actually strange. Confidences shouldn't be higher than 1.0 as long as the initial confidences are lower than 1.0: https://github.com/ZFTurbo/Weighted-Boxes-Fusion/blob/master/ensemble_boxes/ensemble_boxes_wbf.py#L193
Can you provide some example?
I use wbf function like this:
wbf_weights = [2,1]
bboxes_after, scores_after, _ = weighted_boxes_fusion(bboxes_before, scores_before, labels, weights=wbf_weights, iou_thr=0.5, skip_box_thr=0.5)
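As a side note on the call above: one way to keep the relative importance of the models while bounding the weights is to rescale them to sum to 1 before passing them in. A minimal sketch (`normalize_weights` is a hypothetical helper, not part of ensemble_boxes):

```python
# Hypothetical helper: rescale weights so they sum to 1 while preserving
# their ratio, e.g. [2, 1] -> [2/3, 1/3].
def normalize_weights(weights):
    total = sum(weights)
    return [w / total for w in weights]

wbf_weights = normalize_weights([2, 1])  # roughly [0.667, 0.333]
```

Whether this gives the fused confidences you expect depends on how the library rescales scores internally, as discussed below.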
I have two models. Here is what I got:
Before WBF scores: [array([0.9977786 , 0.9974312 , 0.99665546, 0.9961797 , 0.99595106, 0.9959025 , 0.9951055 , 0.99473274, 0.9934974 , 0.9925368 , 0.9921852 , 0.9919086 , 0.991533 , 0.99142426, 0.99049884, 0.9898454 , 0.9889338 , 0.98852545, 0.9825737 , 0.98184353, 0.9745685 , 0.97205484, 0.9484585 , 0.90423274, 0.44000337, 0.3230264 , 0.16097704, 0.15068197, 0.08912976, 0.06615119, 0.99742246, 0.9960743 , 0.9958955 , 0.9958204 , 0.995802 , 0.99573505, 0.9952885 , 0.99428236, 0.9939891 , 0.99330455, 0.9929404 , 0.9925984 , 0.99258256, 0.9911237 , 0.9907608 , 0.9903923 , 0.99019754, 0.98959094, 0.9863783 , 0.984846 , 0.9816514 , 0.9790066 , 0.9477571 , 0.9353405 , 0.4686266 , 0.24165814, 0.22806387, 0.22264646, 0.20754859, 0.997681 , 0.99736696, 0.99682957, 0.9964348 , 0.99589866, 0.9953875 , 0.9947024 , 0.99452895, 0.9935161 , 0.9927846 , 0.99199355, 0.99100304, 0.99014574, 0.9899568 , 0.9886205 , 0.9881203 , 0.9879832 , 0.98791677, 0.9871027 , 0.9866355 , 0.97719175, 0.9754893 , 0.9459295 , 0.9322661 , 0.63471746, 0.33397028, 0.15991715, 0.12448829, 0.11588743, 0.08867898, 0.9976018 , 0.9970757 , 0.99680626, 0.99643886, 0.9961377 , 0.99587256, 0.9941677 , 0.9938611 , 0.99382716, 0.9927401 , 0.9925801 , 0.9925648 , 0.99234265, 0.9920265 , 0.9910938 , 0.99099874, 0.98975676, 0.9889409 , 0.98578864, 0.984951 , 0.9795656 , 0.970707 , 0.9673084 , 0.9159516 , 0.57548994, 0.32638064, 0.19865827, 0.17283562, 0.10290644, 0.09215149], dtype=float32), array([0.996139 , 0.9956868 , 0.9952134 , 0.9937417 , 0.993212 , 0.9919069 , 0.99064064, 0.99043673, 0.9898113 , 0.9892559 , 0.98924875, 0.9874286 , 0.98657423, 0.9856828 , 0.9849792 , 0.974309 , 0.97400403, 0.97044206, 0.9690861 , 0.96349394, 0.9585932 , 0.93487936, 0.90693283, 0.852883 , 0.45325774, 0.44711384, 0.30352426, 0.2919561 , 0.19369787, 0.09917659, 0.08308521, 0.9965281 , 0.99601424, 0.99529415, 0.9949421 , 0.9942247 , 0.9918616 , 0.98984665, 0.98919535, 0.9888659 , 0.9865979 , 0.98601323, 0.9854423 , 0.98368675, 
0.9834471 , 0.9833239 , 0.98294455, 0.9740772 , 0.9740751 , 0.9716181 , 0.9626577 , 0.952945 , 0.94795144, 0.89805925, 0.8482659 , 0.43133533, 0.38223752, 0.37388977, 0.27312437, 0.25848874, 0.08012464, 0.07549525, 0.05387503, 0.9958483 , 0.9951709 , 0.9947866 , 0.99453473, 0.9940996 , 0.9923167 , 0.9914009 , 0.99068356, 0.98950726, 0.9892786 , 0.98838764, 0.9878847 , 0.9865064 , 0.98346335, 0.9831896 , 0.9795554 , 0.97210294, 0.9679366 , 0.96512824, 0.94997126, 0.9485284 , 0.94398665, 0.9196354 , 0.8935131 , 0.46899638, 0.37480652, 0.3710628 , 0.264169 , 0.2175137 , 0.09871504, 0.09574395, 0.06547204, 0.05086546, 0.9956038 , 0.9954972 , 0.9954686 , 0.9952924 , 0.9938783 , 0.9931137 , 0.993082 , 0.9928517 , 0.98994863, 0.9898476 , 0.98955846, 0.9861065 , 0.9839573 , 0.98330057, 0.98238593, 0.97939336, 0.9750014 , 0.97404695, 0.97242546, 0.95809156, 0.9512658 , 0.94955885, 0.9210341 , 0.91206634, 0.5457824 , 0.44105437, 0.32224816, 0.31169373, 0.16845997, 0.08233506, 0.06334827, 0.0596779 ], dtype=float32)]
After WBF scores: [1.4954932 1.4946989 1.4934922 1.4933244 1.4922881 1.4903606 1.490118 1.4885001 1.4868019 1.486702 1.4858279 1.485031 1.4845722 1.4841385 1.4829789 1.4826114 1.4774904 1.4751894 1.4627302 1.4621084 1.4598495 1.4508471 1.3907044 1.3776555 0.9887324]
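For what it's worth, both observations are consistent with the fused score being a weighted sum of matched confidences divided by the number of models rather than by the sum of the weights. A rough sanity check on the top scores from the two models above (illustrative only; `fuse` is a hypothetical helper, not the library's code, and the exact boxes matched per cluster are unknown):

```python
def fuse(confidences, weights, divide_by_weight_sum):
    """Illustrative weighted fusion of per-model confidences for one box cluster."""
    num = sum(w * c for w, c in zip(weights, confidences))
    denom = sum(weights) if divide_by_weight_sum else len(weights)
    return num / denom

# Top confidence from each of the two models above.
confs = [0.9977786, 0.996139]

# Dividing by the number of models gives ~1.496, close to the ~1.495
# scores reported after WBF with weights [2, 1]...
print(fuse(confs, [2, 1], divide_by_weight_sum=False))

# ...while dividing by the weight sum keeps the score in [0, 1] (~0.997).
print(fuse(confs, [2, 1], divide_by_weight_sum=True))

# With weights normalized to sum to 1, dividing by the model count gives
# a score just below 0.5, matching the second observation.
print(fuse(confs, [2 / 3, 1 / 3], divide_by_weight_sum=False))
```

If that reading is right, the fix is to divide by the sum of the weights (or, equivalently, to pass weights pre-normalized to sum to the number of models).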