
Code for Synchronous Bidirectional Neural Machine Translation (SB-NMT)

3 sb-nmt issues

Hello, why does the start-of-sequence marker for prediction use the pad id 0? Also, the l2r and r2l markers use ids 2 and 3 — don't these overlap with the ids of some vocabulary words? Does your vocab_file contain the l2r and r2l markers? And when computing the encoder input, why is a value added to all the embeddings? I don't understand what `remove` does here. def transformer_prepare_encoder(inputs, hparams): """Prepare one shard of the model for the encoder. """ # Flatten inputs. ishape_static = inputs.shape.as_list() encoder_input = inputs encoder_padding =...
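For context on why reusing id 0 for both `<pad>` and the decoder start symbol matters: in tensor2tensor-style code the encoder's attention bias is typically derived from which positions carry the pad id, so any other symbol sharing that id would be masked out too. A minimal numpy sketch of that derivation (the function name and `pad_id=0` convention are assumptions for illustration, not the repo's exact API):

```python
import numpy as np

def padding_attention_bias(token_ids, pad_id=0):
    """Additive attention bias that hides padding positions.

    Positions whose token id equals pad_id get a large negative bias,
    so softmax assigns them ~0 attention weight. If a real symbol
    (e.g. a start-of-sequence marker) also used pad_id, it would be
    masked here as well -- the collision the issue asks about.
    """
    is_pad = (token_ids == pad_id).astype(np.float32)       # [batch, length]
    # Broadcastable to [batch, num_heads, query_len, key_len].
    return is_pad[:, np.newaxis, np.newaxis, :] * -1e9

ids = np.array([[5, 7, 0, 0]])          # length-2 sentence padded to 4
bias = padding_attention_bias(ids)
# Real tokens keep bias 0; the two pad slots get -1e9.
```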

Hello, while reading the code I noticed that in the decoding step during training, when computing multi-head self-attention, only the mask over future tokens is applied; the padding positions are not masked. I don't quite understand this — could you please explain? def transformer_prepare_decoder(targets_l2r, targets_r2l, hparams): """Prepare one shard of the model for the decoder. """ decoder_self_attention_bias = ( common_attention.attention_bias_lower_triangle(tf.shape(targets_l2r)[1])) ## [1, 1, length, length] decoder_input_l2r = common_layers.shift_left_3d(targets_l2r) decoder_input_r2l =...
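A common rationale (not confirmed by this repo's authors) is that the causal mask alone suffices: target padding sits at the end of the sequence, so no real position can attend forward into it, and the padded positions themselves are excluded from the loss. A numpy sketch of the causal bias and, for comparison, an explicit combination with a padding bias (the helper names are hypothetical):

```python
import numpy as np

def causal_bias(length):
    """Lower-triangular bias: position i may attend only to j <= i,
    mirroring attention_bias_lower_triangle. Shape [1, 1, L, L]."""
    allowed = np.tril(np.ones((length, length), dtype=np.float32))
    return (1.0 - allowed)[np.newaxis, np.newaxis, :, :] * -1e9

def combined_decoder_bias(target_ids, pad_id=0):
    """Causal bias plus an explicit padding bias (hypothetical variant).

    With right-padded targets the extra term only re-masks positions the
    causal mask already hides from earlier queries, which is why code can
    get away with the causal mask alone during training.
    """
    length = target_ids.shape[1]
    pad = (target_ids == pad_id).astype(np.float32)[:, None, None, :] * -1e9
    return causal_bias(length) + pad

ids = np.array([[4, 9, 0]])             # last position is padding
bias = combined_decoder_bias(ids)
# Query 0 sees only key 0; keys 1 and 2 are masked (future and/or pad).
```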

@wszlong Hello: Could you please share five examples from the training set before running ./datagen.sh? I don't understand the format of the following three lines well. Thanks...