
Feature distances not discriminatory

Open mishravishnu opened this issue 3 years ago • 7 comments

Hi,

I am using torchreid as a feature extractor for my custom dataset. When I compute Euclidean or cosine distances between probe and gallery images, the distances are not well separated between matching and non-matching candidates. For example, if the probe image is a person in a red dress, the right match will have a distance of 0.1, whereas someone in a yellow dress will have a distance of 0.2. That does not seem like a big enough separation considering how different the subjects look.

Is this expected behavior for out-of-distribution data? Is retraining the model on my custom dataset my only option?

I am using the OSNet model for feature extraction.
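
Roughly, the pipeline looks like this (a simplified sketch based on torchreid's FeatureExtractor; the weight path and image paths below are placeholders, not my actual files):

```python
import torch
from torchreid.utils import FeatureExtractor

# Placeholder model path: point this at downloaded OSNet re-ID weights.
extractor = FeatureExtractor(
    model_name='osnet_x1_0',
    model_path='weights/osnet_x1_0.pth.tar',
    device='cuda'
)

# Extract feature vectors for probe and gallery crops (paths are placeholders).
probe_feats = extractor(['probe/person_001.jpg'])
gallery_feats = extractor(['gallery/img_001.jpg', 'gallery/img_002.jpg'])

# L2-normalise, then cosine distance = 1 - cosine similarity.
probe_feats = torch.nn.functional.normalize(probe_feats, dim=1)
gallery_feats = torch.nn.functional.normalize(gallery_feats, dim=1)
cos_dist = 1.0 - probe_feats @ gallery_feats.t()  # shape: (num_probe, num_gallery)
print(cos_dist)
```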

mishravishnu avatar Jan 07 '22 15:01 mishravishnu

I have the same issue; the embeddings are entangled.

Mohamed209 avatar Jan 08 '22 12:01 Mohamed209

Rather than checking case by case, you should probably run an overall accuracy test. I've tried a few different re-ID models: DeepSort, ResNet50 and OSNet. With all of them, choosing a similarity threshold of 0.1 versus 0.05, for example, changes my overall accuracy a lot.
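
A rough sketch of what I mean (plain numpy; it assumes you already have L2-normalised probe/gallery feature matrices and ground-truth identity labels from your own annotations, and uses a simple pair-wise accept/reject accuracy that you may want to replace with your own metric):

```python
import numpy as np

def threshold_accuracy(probe_feats, probe_ids, gallery_feats, gallery_ids, thresholds):
    # Cosine distance matrix, shape (num_probe, num_gallery); rows assumed L2-normalised.
    dist = 1.0 - probe_feats @ gallery_feats.T
    # Ground-truth same-identity matrix for every probe/gallery pair.
    same_id = probe_ids[:, None] == gallery_ids[None, :]
    results = {}
    for t in thresholds:
        accept = dist <= t
        # A pair is counted correct when the accept decision matches the ground truth.
        results[t] = float((accept == same_id).mean())
    return results

# Example: sweep a few thresholds and compare overall accuracy.
# print(threshold_accuracy(pf, pids, gf, gids, thresholds=[0.05, 0.1, 0.2, 0.3]))
```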

corentin87 avatar Jan 12 '22 23:01 corentin87

Hey guys, did you find any solution? I am facing this problem too.

lakshaydulani avatar Apr 07 '22 12:04 lakshaydulani

This did not work out for me. I tried a DualNorm-based implementation instead, and that gave me very good results for my use case.

mishravishnu avatar Apr 08 '22 13:04 mishravishnu

@mishravishnu can you please provide a link to the repo you used?

lakshaydulani avatar Apr 08 '22 13:04 lakshaydulani

https://github.com/BJTUJia/person_reID_DualNorm

mishravishnu avatar Apr 08 '22 13:04 mishravishnu

@mishravishnu do you mind sharing your code (only the re-ID part) and what threshold you are using? Thanks.

lakshaydulani avatar Apr 09 '22 08:04 lakshaydulani