facenet-pytorch
128 Embeddings
Is there a way to use the old models with 128-dimensional embeddings instead of 512?
I think:
- Lower latency when computing distances between embeddings (see the quick sketch after this list).
- The extracted features are more condensed.
- Reduced storage for the embedding vectors.
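To put rough numbers on the latency and storage points, here is a minimal, illustrative sketch. The 10,000-vector gallery and 100-query batch are assumptions for illustration, not figures from facenet-pytorch; distance cost and float32 storage both scale linearly with embedding dimension, so 128-d is roughly 4x cheaper than 512-d on both counts:

```python
import time
import torch

n_vectors, n_queries = 10_000, 100  # assumed gallery/query sizes, for illustration only
for dim in (128, 512):
    gallery = torch.randn(n_vectors, dim)
    queries = torch.randn(n_queries, dim)
    t0 = time.perf_counter()
    dists = torch.cdist(queries, gallery)  # pairwise Euclidean distances
    elapsed_ms = (time.perf_counter() - t0) * 1e3
    storage_mb = gallery.numel() * gallery.element_size() / 1e6  # float32 bytes
    print(f"dim={dim}: cdist {elapsed_ms:.1f} ms, gallery storage {storage_mb:.2f} MB")
```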
Thanks for the answer. I meant to ask how to use the old models that output 128-dimensional embeddings, not about the benefits of 512-dimensional ones.
Add an extra nn.Linear(512, 128, bias=False)?
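If you just need a 128-d output rather than the original 128-d FaceNet weights, a minimal sketch along those lines could look like the following. The Facenet128 wrapper is hypothetical (not part of facenet-pytorch), and the randomly initialized projection will not reproduce the old 128-d embeddings; the head would need fine-tuning (e.g. with a triplet loss on your data) to give meaningful distances:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from facenet_pytorch import InceptionResnetV1

class Facenet128(nn.Module):
    """Hypothetical wrapper: pretrained 512-d backbone + linear projection to 128-d.

    Note: the projection is randomly initialized, so the 128-d output is NOT
    equivalent to the original FaceNet-128 models; fine-tune it before relying
    on the resulting distances.
    """
    def __init__(self, pretrained='vggface2'):
        super().__init__()
        self.backbone = InceptionResnetV1(pretrained=pretrained).eval()
        self.project = nn.Linear(512, 128, bias=False)

    def forward(self, x):
        emb = self.backbone(x)               # (N, 512), L2-normalized by the backbone
        emb = self.project(emb)              # (N, 128)
        return F.normalize(emb, p=2, dim=1)  # re-normalize after the projection

model = Facenet128()
faces = torch.randn(4, 3, 160, 160)          # batch of aligned 160x160 face crops
with torch.no_grad():
    emb128 = model(faces)
print(emb128.shape)                           # torch.Size([4, 128])
```

Leaving the bias off keeps the projection centered at the origin, which plays well with the re-normalization step; an alternative that avoids fine-tuning entirely would be a PCA projection fitted on 512-d embeddings of your own gallery, though that still changes the metric.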