InternLM-XComposer
interleav_wrap has no padding — is this a bug?
There is no padding applied when `wrap_embeds = torch.cat(wrap_embeds_list)` is called in the `interleav_wrap` function. Is this a bug?
Hi, have you solved this issue? I ran into the same problem.
The default batch size is 1, so no padding is needed.
Hi, if I want to train or inference with a larger batch size, how do I modify the code?
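For reference, one common approach for batch sizes larger than 1 is to pad each embedding sequence to the longest length in the batch before stacking, and to build a matching attention mask so padded positions are ignored. Below is a minimal sketch of that idea; the helper name `pad_and_stack` and the `pad_embed` argument (e.g. the embedding of the tokenizer's pad token) are assumptions for illustration, not functions from the InternLM-XComposer repo:

```python
import torch

def pad_and_stack(wrap_embeds_list, pad_embed, padding_side="left"):
    """Pad variable-length embedding sequences [(seq_len_i, dim), ...]
    to the same length and stack them into a (batch, max_len, dim) tensor.
    Returns the stacked embeddings and a (batch, max_len) attention mask
    where 1 marks real tokens and 0 marks padding.
    Hypothetical helper, not part of the repo."""
    max_len = max(e.shape[0] for e in wrap_embeds_list)
    padded, masks = [], []
    for e in wrap_embeds_list:
        n_pad = max_len - e.shape[0]
        pad_block = pad_embed.expand(n_pad, -1)          # repeat pad embedding
        real = torch.ones(e.shape[0], dtype=torch.long)  # mask for real tokens
        pad = torch.zeros(n_pad, dtype=torch.long)       # mask for padding
        if padding_side == "left":
            # Left padding keeps real tokens adjacent to the generated
            # continuation, which is the usual choice for decoder-only LMs.
            padded.append(torch.cat([pad_block, e], dim=0))
            masks.append(torch.cat([pad, real], dim=0))
        else:
            padded.append(torch.cat([e, pad_block], dim=0))
            masks.append(torch.cat([real, pad], dim=0))
    return torch.stack(padded), torch.stack(masks)
```

With something like this in place, `torch.cat(wrap_embeds_list)` could be replaced by a padded `torch.stack`, and the returned mask passed to the model as `attention_mask` so the padded positions do not affect attention.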
Hi, have you solved the padding problem yet? What value should the padding be?