
Recalculating D in contrastive divergence

Open alexminnaar opened this issue 9 years ago • 0 comments

In `contrastive_divergence_rsm` you are passing in `D`, and you then use `D` as the number of trials for the multinomial sample in

```python
for i in xrange(len(vis)):
    neg_vis[i] = random.multinomial(D[i], softmax_value[i], size=1)
```

Then you recompute `D` with

```python
D = sum(neg_vis, axis=1)
```

This sums the results of each multinomial draw across events, which by definition always equals the number of trials. So `D` never actually changes, and there does not appear to be any need to recompute it inside `contrastive_divergence_rsm`.
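To illustrate the claim, here is a small standalone sketch (with made-up shapes standing in for `vis`, `D`, and `softmax_value`; the variable names mirror the snippet above but are otherwise hypothetical) showing that summing multinomial counts across events recovers the trial counts exactly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 3 documents, 5 softmax (visible) units.
D = np.array([10, 7, 12])                     # word count per document
softmax_value = rng.dirichlet(np.ones(5), 3)  # row-stochastic probabilities

neg_vis = np.zeros((3, 5), dtype=int)
for i in range(len(neg_vis)):
    # Draw D[i] trials; the counts are distributed across the 5 events.
    neg_vis[i] = rng.multinomial(D[i], softmax_value[i])

# Summing counts across events gives back the number of trials per row,
# so the "recomputed" D is identical to the original D.
D_recomputed = neg_vis.sum(axis=1)
print(np.array_equal(D_recomputed, D))  # True
```

Since a multinomial sample with `n` trials always has counts summing to `n`, the recomputation is a no-op.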

alexminnaar — Oct 01 '15 17:10