bjuthjliu

Results: 2 issues of bjuthjliu

Source Code:

```python
padding_num = -2 ** 32 + 1
if type in ("k", "key", "keys"):
    key_masks = tf.to_float(key_masks)
    key_masks = tf.tile(key_masks, [tf.shape(inputs)[0] // tf.shape(key_masks)[0], 1])  # (h*N, seqlen)
    key_masks...
```
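For context, this excerpt looks like key masking for scaled dot-product attention: logits at padded key positions are pushed to a very large negative value so softmax assigns them near-zero weight. Below is a minimal, self-contained sketch of that idea, assuming `inputs` holds attention logits of shape `(h*N, T_q, T_k)` and `key_masks` is `(N, T_k)` with 1 for real tokens and 0 for padding; the `mask_keys` wrapper and the `tf.cast`/`tf.where` calls are illustrative, not the original code.

```python
import tensorflow as tf

def mask_keys(inputs, key_masks):
    """Sketch: overwrite attention logits at padded key positions (not the original repo's code)."""
    padding_num = -2 ** 32 + 1.0
    key_masks = tf.cast(key_masks, tf.float32)  # tf.to_float(...) in TF1-era code
    # Repeat the per-example mask once per attention head: (N, T_k) -> (h*N, T_k)
    key_masks = tf.tile(key_masks, [tf.shape(inputs)[0] // tf.shape(key_masks)[0], 1])
    # Expand over the query axis: (h*N, T_k) -> (h*N, T_q, T_k)
    key_masks = tf.tile(tf.expand_dims(key_masks, 1), [1, tf.shape(inputs)[1], 1])
    paddings = tf.ones_like(inputs) * padding_num
    # Keep logits for real keys, replace logits for padded keys
    return tf.where(tf.equal(key_masks, 0), paddings, inputs)
```

Because softmax normalizes each query's scores over the key axis, driving padded-key logits toward `-2**32 + 1` effectively removes those positions from the attention distribution.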

```python
def forward(self, inputs):
    # Avoid breaking if the last batch has a different size
    batch_size = inputs.size(0)
    if batch_size != self.batch_size:
        self.batch_size = batch_size
    encoded = self.encoder(inputs)
    output, hidden =...
```
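The second excerpt updates a stored batch size inside `forward` so the last, smaller batch of an epoch does not crash the model. A minimal PyTorch sketch of that pattern follows; the class name `CharRNN`, the embedding/GRU layers, and `init_hidden` are assumptions filled in for illustration, since the excerpt only shows the first few lines of `forward`.

```python
import torch
import torch.nn as nn

class CharRNN(nn.Module):
    """Hypothetical module; the excerpt only shows the start of forward()."""

    def __init__(self, vocab_size, embed_dim, hidden_dim, batch_size):
        super().__init__()
        self.batch_size = batch_size
        self.hidden_dim = hidden_dim
        self.encoder = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.hidden = self.init_hidden(batch_size)

    def init_hidden(self, batch_size):
        # (num_layers, batch, hidden_dim) zero state for a single-layer GRU
        return torch.zeros(1, batch_size, self.hidden_dim)

    def forward(self, inputs):
        # Avoid breaking if the last batch has a different size:
        # rebuild the stored hidden state whenever the batch size changes.
        batch_size = inputs.size(0)
        if batch_size != self.batch_size:
            self.batch_size = batch_size
            self.hidden = self.init_hidden(batch_size)
        encoded = self.encoder(inputs)                 # (batch, seq_len, embed_dim)
        output, self.hidden = self.rnn(encoded, self.hidden.detach())
        return output
```

With a `DataLoader` that keeps the final partial batch (`drop_last=False`, the default), this size check is what prevents a shape mismatch between the stored hidden state and the last, smaller batch.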