triplet-reid
Code for reproducing the results of our "In Defense of the Triplet Loss for Person Re-Identification" paper.
Please provide the 'args.json' file that was used during training in the Market-1501 TF Checkpoint Release, so that 'embed.py' can be used for generating embeddings.
So this is a branch I had lying around for a while which adds a lot of things. I'm not merging it yet, as I still want to add these...
I use the same triplet loss (with BatchHard, Euclidean distance and soft-margin) on the fine-grained categorization dataset [CUB-200-2011.](http://www.vision.caltech.edu/visipedia/CUB-200-2011.html) It aims to distinguish different species of birds (200 categories, 5994 image...
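For reference, here is a minimal NumPy sketch of what the batch-hard triplet loss with Euclidean distance and soft-margin looks like. It is not the repository's implementation; the function and variable names are illustrative, and it assumes every identity in the batch has at least two images (as PK-style sampling provides).

```python
import numpy as np

def batch_hard_soft_margin(embeddings, labels):
    """embeddings: (N, D) float array; labels: (N,) identity ids."""
    # Pairwise Euclidean distances between all embeddings in the batch.
    diff = embeddings[:, None, :] - embeddings[None, :, :]
    dists = np.sqrt(np.maximum((diff ** 2).sum(-1), 1e-12))

    same = labels[:, None] == labels[None, :]
    losses = []
    for i in range(len(labels)):
        # Hardest positive: farthest sample sharing the identity (excluding self).
        pos_mask = same[i].copy()
        pos_mask[i] = False
        hardest_pos = dists[i][pos_mask].max()
        # Hardest negative: closest sample with a different identity.
        hardest_neg = dists[i][~same[i]].min()
        # Soft-margin: softplus(d_ap - d_an) replaces the hinge [m + d_ap - d_an]_+.
        losses.append(np.logaddexp(0.0, hardest_pos - hardest_neg))
    return float(np.mean(losses))
```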
Hello, authors. I was wondering if you could provide some extra details about training on CUHK03. There is a [third-party](https://github.com/Cysu/open-reid) re-implementation of your work. This implementation shows almost the same performance...
I followed the README.md, but I can't run train.py; the error is ValueError: Unable to configure handler 'stderr': bad argument type for built-in operation. Can you tell me...
I've been doing some experiments with your batch hard triplet loss function and different architectures/datasets. On MARS I manage to reproduce the results from your paper (network seems to converge),...
Sorry, it's not the key point of your paper. After successfully getting results with your batch hard loss function, I tried to check the results of the lifted loss. When I...
Thanks for your work, first of all. I have some questions I want to ask you: (1) In Section 3.3 of your paper, you set the batch size to 72, containing 18 persons with 4 images....
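The batch construction mentioned in the question (18 persons x 4 images = 72) follows the PK-sampling scheme described in the paper. Below is a minimal, hypothetical sketch of such a sampler; the function name and the policy of only drawing identities that have at least K images are my own simplifications, not the repository's data loader.

```python
import random
from collections import defaultdict

def sample_pk_batch(image_paths, identities, p=18, k=4):
    """image_paths: list of file names; identities: parallel list of person ids."""
    by_id = defaultdict(list)
    for path, pid in zip(image_paths, identities):
        by_id[pid].append(path)
    # Simplification: only identities with at least K images are eligible,
    # and images are drawn without replacement; a real loader might resample
    # with replacement instead.
    eligible = [pid for pid, imgs in by_id.items() if len(imgs) >= k]
    batch = []
    for pid in random.sample(eligible, p):  # P distinct identities
        batch.extend(random.sample(by_id[pid], k))  # K images each
    return batch  # P * K images, e.g. 18 * 4 = 72
```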
Hi, we have been doing some experiments to reproduce your results on the Market-1501 and MARS datasets, and when using exactly the same hyperparameters and training strategy in your...
Hi, I am new to deep learning, and thus may not understand your paper fully; I hope that is all right with you. I tried to implement batch_hard using Inception_resnet_v1 and...