LV_groundhog
Do you have any plans to implement the large vocabulary in Blocks? I would appreciate it if you could share them.
```
def init_extra_parameters(model, state):  # May want to add skip_init later
    model.large_W_0_enc_approx_embdr = eval(state['weight_init_fn'])(
        state['large_vocab_source'], state['rank_n_approx'], -1,
        state['weight_scale'], model.rng)
    model.large_W_0_dec_approx_embdr = eval(state['weight_init_fn'])(
        state['large_vocab_target'], state['rank_n_approx'], -1,
        state['weight_scale'], model.rng)
    model.large_W2_dec_deep_softmax = eval(state['weight_init_fn'])(
        state['rank_n_approx'], state['large_vocab_target'], -1, ...
```
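For anyone landing here: the `large_*` matrices above are the full-vocabulary embedding and output weights, and the core idea of the large-vocabulary approach in this repo is to score only a subset of the target vocabulary at each update instead of computing the full softmax. Below is a minimal NumPy sketch of that idea, not the repo's actual code; `partial_softmax`, `active_ids`, and `b_full` are made-up names for illustration.

```
import numpy as np

# Hypothetical sketch of an approximate (partial) softmax: score only the
# columns of the full output matrix corresponding to a chosen vocabulary
# subset (e.g. the batch's target words plus sampled candidates).
def partial_softmax(hidden, W2_full, b_full, active_ids):
    # hidden:  (batch, rank_n_approx)
    # W2_full: (rank_n_approx, large_vocab_target)
    # b_full:  (large_vocab_target,)
    W_sub = W2_full[:, active_ids]               # (rank_n_approx, |subset|)
    b_sub = b_full[active_ids]                   # (|subset|,)
    logits = hidden @ W_sub + b_sub              # scores over the subset only
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    probs = np.exp(logits)
    return probs / probs.sum(axis=1, keepdims=True)
```

The cost of one update then scales with the subset size rather than with `large_vocab_target`, which is what makes very large target vocabularies tractable.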
Hello everyone,

We're trying to use this for an NMT system and to apply preprocessing/postprocessing techniques to reduce the UNK issue. We have trained a model, but we're getting the following...
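The preview is cut off before the actual error, but on the UNK post-processing itself: a common recipe is to replace each generated UNK token with a dictionary translation of the source word it is most aligned to, falling back to copying the source word verbatim. A hypothetical sketch of that step; `replace_unks`, `alignment`, and `lexicon` are illustrative names, not part of LV_groundhog:

```
# Hypothetical UNK post-processing sketch (not from this repo): substitute
# each '<UNK>' in the output using an alignment back to the source sentence.
def replace_unks(target_tokens, source_tokens, alignment, lexicon):
    # alignment[j] = index of the source position most attended at step j
    out = []
    for j, tok in enumerate(target_tokens):
        if tok == '<UNK>':
            src = source_tokens[alignment[j]]
            out.append(lexicon.get(src, src))  # dictionary lookup, else copy
        else:
            out.append(tok)
    return out
```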