
Code release & topic quality #1

Open
linkstrife opened this issue Nov 23, 2020 · 0 comments
linkstrife commented Nov 23, 2020

Hi, is there any chance the code could be released soon?

BTW, I'm curious about the classification performance of the normalized topic vectors generated by the different models — I think that would show whether the topic vectors are actually more representative and discriminative.
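To illustrate what I mean (just a sketch — the probe, the data, and the nearest-centroid choice are all my assumptions, not anything from the repo), one simple way to compare models would be to run a lightweight classifier over each model's normalized document–topic vectors and compare accuracies:

```python
import numpy as np

def classification_accuracy(topic_vecs, labels):
    """Nearest-centroid accuracy on L2-normalized topic vectors.

    topic_vecs: (D, K) document-topic vectors from one model
    labels:     (D,)  document class labels
    A higher accuracy suggests the topic vectors are more
    discriminative for downstream classification.
    """
    # normalize each document's topic vector to unit length
    X = topic_vecs / np.linalg.norm(topic_vecs, axis=1, keepdims=True)
    classes = np.unique(labels)
    # one centroid per class in the normalized topic space
    centroids = np.stack([X[labels == c].mean(axis=0) for c in classes])
    # assign each document to the class with the closest centroid
    preds = classes[np.argmax(X @ centroids.T, axis=1)]
    return float(np.mean(preds == labels))

# toy example: two well-separated "topic profiles"
vecs = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
y = np.array([0, 0, 1, 1])
print(classification_accuracy(vecs, y))
```

The same function applied to each model's inferred vectors (with a proper train/test split in practice) would give a direct, model-agnostic comparison.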

Another question: can we remove the KLD from the perplexity computation? The definition of perplexity involves only the two conditional probabilities p(z|d) and p(w|z), which do not include the KL term of the ELBO. And we are not concerned here with whether the KLD is low enough for the decoder to generate new samples from draws from the mean-field Gaussian (I haven't seen any paper discuss the effect of the KLD for NTMs) — for NTMs the generated vectors contain only log-likelihoods, which I think is less helpful.
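Concretely, the KL-free perplexity I have in mind could be computed like this (a sketch under my own assumptions — `theta` for p(z|d), `beta` for p(w|z), and bag-of-words counts are hypothetical names, not the repo's API):

```python
import numpy as np

def perplexity(theta, beta, counts):
    """Perplexity from the reconstruction likelihood only — no KL term.

    theta:  (D, K) document-topic distributions, p(z|d)
    beta:   (K, V) topic-word distributions,     p(w|z)
    counts: (D, V) bag-of-words token counts
    """
    # p(w|d) = sum_z p(z|d) p(w|z)
    p_w_given_d = theta @ beta
    # total log-likelihood of all observed tokens
    log_lik = np.sum(counts * np.log(p_w_given_d + 1e-12))
    n_tokens = counts.sum()
    # perplexity = exp(-log-likelihood per token)
    return float(np.exp(-log_lik / n_tokens))

# toy example: 2 documents, 2 topics, 3 words
theta = np.array([[0.7, 0.3], [0.2, 0.8]])
beta = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
counts = np.array([[3, 2, 1], [0, 2, 4]])
print(perplexity(theta, beta, counts))
```

As a sanity check, a model that assigns the uniform distribution over a V-word vocabulary yields perplexity exactly V, which matches the standard definition.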

Thanks for sharing your excellent work!
