
Question about large vocabulary parameter

[Open] wenhuchen opened this issue on Mar 02, 2016 · 0 comments

```python
def init_extra_parameters(model, state):  # May want to add skip_init later
    model.large_W_0_enc_approx_embdr = eval(state['weight_init_fn'])(state['large_vocab_source'], state['rank_n_approx'], -1, state['weight_scale'], model.rng)
    model.large_W_0_dec_approx_embdr = eval(state['weight_init_fn'])(state['large_vocab_target'], state['rank_n_approx'], -1, state['weight_scale'], model.rng)
    model.large_W2_dec_deep_softmax = eval(state['weight_init_fn'])(state['rank_n_approx'], state['large_vocab_target'], -1, state['weight_scale'], model.rng)
    model.large_b_dec_deep_softmax = init_bias(state['large_vocab_target'], 0., model.rng)
```

I assume these are the large-vocabulary-to-embedding mapping matrices, right? But where are they applied in the computational graph? I haven't found these parameters anywhere in the layers.
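For context, in the large-vocabulary NMT approach that LV_groundhog implements (Jean et al., "On Using Very Large Target Vocabulary for NMT"), matrices like these typically act as a full-vocabulary parameter store: they do not enter the computational graph directly, but rows for the current working vocabulary are copied into the regular-sized parameters (e.g. `W_0_dec_approx_embdr`) before training on a batch and written back afterwards. Below is a minimal NumPy sketch of that row-swapping pattern; the names, shapes, and helper functions are illustrative assumptions, not LV_groundhog's actual code.

```python
import numpy as np

rng = np.random.RandomState(0)
large_vocab, small_vocab, rank_n_approx = 500_000, 30_000, 620

# Full-vocabulary embedding store (analogous to large_W_0_dec_approx_embdr).
large_W = rng.normal(scale=0.01, size=(large_vocab, rank_n_approx)).astype('float32')

# Active matrix actually used in the graph (analogous to W_0_dec_approx_embdr),
# sized for the working vocabulary only.
active_W = np.empty((small_vocab, rank_n_approx), dtype='float32')

def load_subset(word_ids):
    """Copy rows for the current vocabulary subset into the active matrix."""
    active_W[:len(word_ids)] = large_W[word_ids]

def store_subset(word_ids):
    """Write the (possibly updated) active rows back to the full store."""
    large_W[word_ids] = active_W[:len(word_ids)]

# Usage sketch: pick this batch's target words, train on active_W, sync back.
batch_words = rng.choice(large_vocab, size=small_vocab, replace=False)
load_subset(batch_words)
# ... gradient step updates active_W in place ...
store_subset(batch_words)
```

If that reading is right, it would explain why the `large_*` parameters never appear in the layer definitions: only the small active matrices are wired into the graph.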

— wenhuchen, Mar 02 '16