
128 Embeddings

duckroll opened this issue 3 years ago • 3 comments

Is there a way to use the old models that produce 128-dimensional embeddings instead of 512?

duckroll avatar Mar 29 '21 09:03 duckroll

I think the benefits of 128-dimensional embeddings are:

  • Lower-latency distance computation (see the sketch below).
  • The extracted features are more condensed.
  • A smaller storage footprint for the embedding vectors.

docongminh avatar Mar 29 '21 09:03 docongminh
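
A rough illustration of the latency and storage points above (hypothetical numbers): both brute-force distance computation and gallery storage scale linearly with the embedding dimension, so 128-dim embeddings cost a quarter of 512-dim ones.

```python
import numpy as np

# Storage: a float32 gallery needs n_faces * dim * 4 bytes.
n_faces = 1_000_000
for dim in (128, 512):
    gb = n_faces * dim * 4 / 1e9
    print(f"{dim}-dim gallery of {n_faces:,} faces: {gb:.1f} GB")
# 128-dim: 0.5 GB, 512-dim: 2.0 GB -- a 4x difference.

# Distance: comparing a query against the gallery is O(n_faces * dim) work.
dim = 128
query = np.random.randn(dim).astype(np.float32)
gallery = np.random.randn(10, dim).astype(np.float32)
dists = np.linalg.norm(gallery - query, axis=1)  # one distance per gallery row
print(dists.shape)  # (10,)
```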

Thanks for the answer. I meant to ask how to use the old models that produce 128 embeddings, not what the benefits of using 512 embeddings are.

duckroll avatar Mar 29 '21 09:03 duckroll

Add an extra nn.Linear(512, 128, bias=False)?

lcc157 avatar Apr 28 '21 07:04 lcc157
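
A minimal sketch of the suggestion above, assuming facenet-pytorch's InceptionResnetV1 (the wrapper class ResnetWith128 is hypothetical). Note that a randomly initialized nn.Linear(512, 128) only compresses the 512-dim space; it does not reproduce the embedding space of the original 128-dim FaceNet models, so the projection would need to be trained (e.g. fine-tuned with a triplet loss) before its distances are meaningful:

```python
import torch
from torch import nn
from facenet_pytorch import InceptionResnetV1

class ResnetWith128(nn.Module):
    """Hypothetical wrapper: 512-dim facenet-pytorch embeddings -> 128 dims."""

    def __init__(self, pretrained="vggface2"):
        super().__init__()
        self.backbone = InceptionResnetV1(pretrained=pretrained).eval()
        self.proj = nn.Linear(512, 128, bias=False)  # untrained projection

    def forward(self, x):
        emb512 = self.backbone(x)   # (batch, 512) embeddings from the backbone
        emb128 = self.proj(emb512)  # (batch, 128) linear compression
        return nn.functional.normalize(emb128, p=2, dim=1)

model = ResnetWith128()
with torch.no_grad():
    faces = torch.randn(1, 3, 160, 160)  # facenet-pytorch expects 160x160 crops
    print(model(faces).shape)            # torch.Size([1, 128])
```

If the goal is to match the original 128-dim models exactly, converting one of those checkpoints to PyTorch would preserve their trained embedding space; facenet-pytorch's bundled vggface2 and casia-webface weights are 512-dim only.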