Michael Feil

Results: 125 comments of Michael Feil

Sounds useful to me. Also, instruction-tuned models often have stop sequences instead of a stop token. Just for clarification, this is on a sequence of valid tokens in the...
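
A minimal sketch of that idea, assuming stop strings are checked on the decoded text after each generated token; the helper, stop strings, and token stream below are illustrative, not any library's API:

```python
# Minimal sketch: instruction-tuned models often define stop *strings*
# (e.g. "###", "\nUser:") that can span several tokens, so the check runs
# on decoded text rather than on a single stop-token id.
def hits_stop_sequence(decoded_text: str, stop_sequences: list[str]) -> bool:
    """Return True once any stop string appears in the generated text."""
    return any(stop in decoded_text for stop in stop_sequences)

stop_sequences = ["###", "\nUser:"]          # illustrative stop strings
generated: list[str] = []
for token in ["The", " answer", " is", " 42", ".", "\n", "User", ":"]:  # fake token stream
    generated.append(token)
    if hits_stop_sequence("".join(generated), stop_sequences):
        break

print("".join(generated))
```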

@ferboz For this reason I am trying to add "CodeGen" (v1), the decoder part of codet5p-2B (https://github.com/OpenNMT/CTranslate2/pull/1230). The smaller codet5p-770M and 220M are fully supported.
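
For context, a hedged sketch of how one of the already-supported smaller checkpoints could be converted and run; the model name, output directory, prompt, and the assumption that codet5p-770M follows CTranslate2's usual T5 workflow are illustrative, not taken from the PR:

```python
# Hedged sketch: convert a smaller CodeT5+ checkpoint to the CTranslate2
# format and run it like a T5-style encoder-decoder model.
import ctranslate2
import transformers
from ctranslate2.converters import TransformersConverter

model_name = "Salesforce/codet5p-770m"   # assumed checkpoint
output_dir = "codet5p-770m-ct2"          # assumed output directory

# Convert the Hugging Face checkpoint to the CTranslate2 format.
TransformersConverter(model_name).convert(output_dir)

tokenizer = transformers.AutoTokenizer.from_pretrained(model_name)
translator = ctranslate2.Translator(output_dir)

prompt = "def fibonacci(n):"             # illustrative prompt
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))
result = translator.translate_batch([tokens], max_decoding_length=64)
output_ids = tokenizer.convert_tokens_to_ids(result[0].hypotheses[0])
print(tokenizer.decode(output_ids, skip_special_tokens=True))
```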

Maybe related to: https://github.com/LibreTranslate/LibreTranslate/issues/441

@OlivierDehaene I compared TEI on batch inference. Takeaways:
- embedding 2048 random sentences (tokenization / queuing is fast) -> *7.10x* speedup on my system -> great work.
- TEI...
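
A hedged sketch of such a benchmark, assuming a locally running TEI instance; the endpoint URL, client-side batch size, and random-sentence generator are illustrative assumptions:

```python
# Hedged sketch: time how long a local TEI server takes to embed
# 2048 random sentences via its /embed route.
import random
import time

import requests

TEI_URL = "http://127.0.0.1:8080/embed"  # assumed local TEI endpoint
sentences = [
    " ".join(random.choices(["quick", "brown", "fox", "jumps", "lazy", "dog"], k=16))
    for _ in range(2048)
]

start = time.perf_counter()
embeddings = []
batch_size = 32  # client-side batching; TEI also batches internally
for i in range(0, len(sentences), batch_size):
    response = requests.post(TEI_URL, json={"inputs": sentences[i : i + batch_size]})
    response.raise_for_status()
    embeddings.extend(response.json())
elapsed = time.perf_counter() - start

print(f"Embedded {len(embeddings)} sentences in {elapsed:.2f}s")
```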

Thanks for the details. Given this precision, I would suggest a PR that validates that e.g. the `Cosine-Distance` of `TEI` vs `sentence-transformers` is `>0.999` - compare to https://github.com/qdrant/fastembed/pull/54/files
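
A hedged sketch of what such a validation could look like, assuming a local TEI server serving the same model that sentence-transformers loads; the model name, endpoint, and the use of cosine similarity for the `>0.999` check are assumptions:

```python
# Hedged sketch: embed the same sentences with sentence-transformers and
# with a running TEI server, then check that the pairwise cosine
# similarity exceeds 0.999 (presumably what the ">0.999" threshold means).
import numpy as np
import requests
from sentence_transformers import SentenceTransformer

MODEL_NAME = "BAAI/bge-small-en-v1.5"    # assumed model served by TEI
TEI_URL = "http://127.0.0.1:8080/embed"  # assumed local TEI endpoint
sentences = ["Hello world.", "Text embeddings should match across backends."]

reference = SentenceTransformer(MODEL_NAME).encode(sentences, normalize_embeddings=True)

response = requests.post(TEI_URL, json={"inputs": sentences})
response.raise_for_status()
candidate = np.asarray(response.json(), dtype=np.float32)
candidate /= np.linalg.norm(candidate, axis=1, keepdims=True)

cosine = (reference * candidate).sum(axis=1)
assert (cosine > 0.999).all(), f"Backends diverge: {cosine}"
print("Cosine similarity per sentence:", cosine)
```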