forked from bozheng-hit/ELMo
How to get a full sentence embedding? #70
Comments
Hi all, I have the same question: does anybody know how to compute the fixed mean-pooling? Thanks!
Hi. In the TF-Hub ELMo module (https://tfhub.dev/google/elmo/2) there is an output like the one you provide:
elmo: the weighted sum of the 3 layers, where the weights are trainable. This tensor has shape [batch_size, max_length, 1024]
I believe it's the equivalent of
-1 for an average of 3 layers. (default)
I want to take the elmo output you provide and turn it into ELMo's sentence embedding:
default: a fixed mean-pooling of all contextualized word representations with shape [batch_size, 1024].
How do I do this fixed mean-pooling?
How do I get a sentence embedding from your output?
Each option produces different outputs, but none of them has the shape (2, 1024) that I want (embeddings for 2 sentences).
How can I do this mean-pooling in order to reach an output of shape (2, 1024)?
Thanks!
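A minimal sketch of the fixed mean-pooling step, assuming you already have the [batch_size, max_length, 1024] tensor from the module and know each sentence's real (unpadded) token count. The function name `masked_mean_pool` and the shapes below are illustrative, not part of the ELMo API; the key point is to divide by each sentence's true length so zero padding does not skew the mean:

```python
import numpy as np

def masked_mean_pool(elmo_out: np.ndarray, seq_lens) -> np.ndarray:
    """Mean-pool word vectors into one sentence vector per row.

    elmo_out: [batch_size, max_length, dim] contextual word vectors,
              zero-padded past each sentence's real length.
    seq_lens: [batch_size] number of real tokens per sentence.
    Returns:  [batch_size, dim] sentence embeddings.
    """
    _, max_length, _ = elmo_out.shape
    # 1.0 at real-token positions, 0.0 at padding positions
    mask = np.arange(max_length)[None, :] < np.asarray(seq_lens)[:, None]
    mask = mask.astype(elmo_out.dtype)
    summed = (elmo_out * mask[:, :, None]).sum(axis=1)
    # Divide by the real token counts, not max_length, so padded
    # positions do not drag the mean toward zero.
    return summed / mask.sum(axis=1, keepdims=True)

# Two sentences of 5 and 3 tokens, padded to max_length 5:
emb = np.random.rand(2, 5, 1024).astype(np.float32)
emb[1, 3:] = 0.0  # zero out the padding of sentence 2
sent_vecs = masked_mean_pool(emb, [5, 3])
print(sent_vecs.shape)  # (2, 1024)
```

If all your sentences have the same length (no padding), this reduces to a plain `elmo_out.mean(axis=1)`.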