Tao Shen


I'm having a similar error to @phquanta's when setting "finetune_batch_size" >= 2. It seems that self.max_len is always 0 when data_type is set to 'finetune'. This causes no 'A' padding...
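To illustrate the failure mode being described, here is a hypothetical minimal sketch (the class name `ProteinDataset`, the `pad` method, and the branch on `data_type` are assumptions, not the repository's actual code): if `self.max_len` is only computed on the pretrain path, it stays 0 for 'finetune', so no 'A' padding is applied and batches of size >= 2 end up with ragged sequence lengths.

```python
# Hypothetical reconstruction of the suspected bug; names are illustrative.
class ProteinDataset:
    def __init__(self, data_type, sequences):
        self.sequences = sequences
        if data_type == 'pretrain':
            self.max_len = max(len(s) for s in sequences)
        else:
            # Suspected bug: max_len is never set on the 'finetune' path,
            # so it remains 0.
            self.max_len = 0

    def pad(self, seq):
        # With max_len == 0, (max_len - len(seq)) is negative, so no 'A'
        # padding is added and sequences in a batch have unequal lengths.
        return seq + 'A' * max(0, self.max_len - len(seq))

ds = ProteinDataset('finetune', ['MKV', 'MKVLA'])
print(ds.pad('MKV'))  # stays 'MKV' — unpadded, breaking batch_size >= 2
```

With batch_size of 1 the ragged lengths go unnoticed, which would explain why the error only appears once "finetune_batch_size" >= 2.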