Looks like the example takes in text and outputs text. What's happening?
vec2text has a function invert_strings which takes a list of strings and produces a list of strings.
The name of the function was confusing to me.
In my mind it's a misnomer if what is actually happening is:
1. Take a list of strings as input
2. Produce embeddings for those strings
3. Run invert_embeddings under the hood
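The relationship described above can be sketched with toy stand-ins (this is not the real vec2text internals; `embed` and `invert_embeddings` here are hypothetical placeholders for the embedding model and the inversion algorithm):

```python
# Toy sketch: invert_strings is just invert_embeddings composed with embed.
# The embedding here is trivially invertible so the round trip is exact;
# in vec2text the inversion is approximate, landing on semantically
# similar strings rather than exact copies.

def embed(strings):
    # Stand-in for an embedding model E: maps each string to a vector.
    return [[float(ord(c)) for c in s] for s in strings]

def invert_embeddings(embeddings):
    # Stand-in for the inversion algorithm E^-1: recovers strings
    # whose embedding matches the given vectors.
    return ["".join(chr(int(x)) for x in e) for e in embeddings]

def invert_strings(strings):
    # The point of the issue: embed first, then invert under the hood.
    return invert_embeddings(embed(strings))

print(invert_strings(["hello", "world"]))  # round-trips back to the inputs
```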
Maybe this is because this whole thing seems to be about:
$\mathcal{E}(\text{strings}) = \text{embeddings}$
$\mathcal{E}^{-1}(\text{embeddings}) = \text{strings}$
And so maybe what would be helpful is thinking about this like:
The goal of invert_strings is to find similar strings. The way it does that is to embed each input, then run the inversion algorithm on each embedding, landing on a list of semantically similar strings.
The goal of invert_embeddings is to find strings that, when embedded, produce the given embeddings.
I didn't find the example that clear, but I have a guess at what's happening. It might be worth spelling out that the goal is to map text to an embedding and then right back to the original text, or perhaps to a smattering of points in the pre-image of the embedding; I'm not really sure, because it isn't clear from what's written.
My two cents.