Hi, is there any chance that the code could be released soon?
BTW, I'm curious about the classification performance of the normalized topic vectors generated by the different models; I think this would show whether the topic vectors are actually more representative and distinguishable.
Another question: can we remove the KLD from the perplexity computation? The definition of perplexity only involves the two conditional probabilities p(z|d) and p(w|z), so it does not include the KL term from the ELBO. Also, we are not really concerned with whether the KLD is low enough for the decoder to generate new samples from draws of the mean-field Gaussian (I haven't seen any paper discuss the effect of the KLD for NTMs); for NTMs, the generated vectors only contain log-likelihoods, which I think is less useful.
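For concreteness, here is a minimal sketch of what I mean by computing perplexity from p(z|d) and p(w|z) alone, with no KL term. The function name and array shapes are my own assumptions, not from your implementation:

```python
import numpy as np

def perplexity_recon_only(doc_bows, doc_topic, topic_word):
    """Perplexity from the reconstruction term only (no KLD).

    doc_bows:   (D, V) bag-of-words counts per document
    doc_topic:  (D, K) inferred topic proportions p(z|d), rows sum to 1
    topic_word: (K, V) topic-word distributions p(w|z), rows sum to 1
    """
    # Mixture likelihood per word: p(w|d) = sum_z p(z|d) * p(w|z)
    doc_word = doc_topic @ topic_word              # (D, V)
    log_probs = np.log(doc_word + 1e-12)           # small epsilon for stability
    total_ll = np.sum(doc_bows * log_probs)        # total token log-likelihood
    total_tokens = doc_bows.sum()
    # perplexity = exp(-(1/N) * sum_d log p(w_d|d))
    return np.exp(-total_ll / total_tokens)
```

For example, a single document with a uniform p(z|d) over two one-hot topics assigns p(w|d) = 0.5 to each word, giving a perplexity of 2, as expected.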
Thanks for sharing your excellent work!