vec2text
What's happening in the example?
I didn't find the example that clear, but I have a guess at what's happening. It might be worth spelling out that the example tries to map text to an embedding and then back to the original text, or perhaps to a smattering of points in the pre-image of the embedding. I'm not really sure, because it isn't clear from what's written.
My two cents.
@startakovsky can you be more specific? Which example, and what did you find confusing?
Looks like the example takes in text and outputs text. What's happening?
vec2text has a function `invert_strings`, which takes a list of strings and produces a list of strings.
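For concreteness, the kind of call I mean looks roughly like this (a minimal sketch; I'm assuming the `load_pretrained_corrector` helper and the `"gtr-base"` corrector name from the README, which may differ in other setups):

```python
import vec2text

# Load a pretrained corrector (assuming the "gtr-base" checkpoint name from the README).
corrector = vec2text.load_pretrained_corrector("gtr-base")

# Text in, text out: each output string should be semantically close to its input.
recovered = vec2text.invert_strings(
    ["an example sentence to round-trip"],
    corrector=corrector,
)
print(recovered)
```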
The name of the function was confusing to me.
In my mind it's a misnomer if what is actually happening under the hood is (sketched below):
- take the input list of strings,
- produce the embeddings associated with those strings,
- then run `invert_embeddings` on those embeddings.
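If that's right, calling `invert_strings` would be roughly equivalent to doing the two steps yourself, something like this sketch (assuming the `gtr-base` corrector, and assuming a sentence-transformers GTR encoder produces embeddings compatible with it, which is my guess rather than something the docs state):

```python
import vec2text
from sentence_transformers import SentenceTransformer

corrector = vec2text.load_pretrained_corrector("gtr-base")
strings = ["an example sentence to round-trip"]

# Step 1: strings -> embeddings
# (assumes this encoder matches the embedder the corrector was trained to invert)
encoder = SentenceTransformer("sentence-transformers/gtr-t5-base")
embeddings = encoder.encode(strings, convert_to_tensor=True)

# Step 2: embeddings -> strings, i.e. what invert_strings presumably does under the hood
recovered = vec2text.invert_embeddings(embeddings=embeddings, corrector=corrector)
print(recovered)
```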
Maybe this is because this whole thing seems to be about:

$\mathcal{E}(\text{strings}) = \text{embeddings}$

$\mathcal{E}^{-1}(\text{embeddings}) = \text{strings}$
And so maybe what would be helpful is thinking about it like this:

The goal of `invert_strings` is to find similar strings. The way we do that is to embed each input, then run our algorithm to find the inverse of each embedding, landing on a semantically similar list of strings.
The goal of inverting an embedding is to find strings that, when embedded, produce that embedding.
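That property is easy to sanity-check by re-embedding the output and comparing it with the original vectors. Continuing the sketch above (same assumptions about the encoder):

```python
import torch

# Re-embed the recovered strings and compare against the original embeddings.
re_embedded = encoder.encode(recovered, convert_to_tensor=True)
similarity = torch.nn.functional.cosine_similarity(embeddings, re_embedded)
print(similarity)  # close to 1.0 means the inversion found strings whose embeddings match
```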
@jxmorris12 does that help?