Liu Xiaochen
I split a large English corpus into several subsets and ran multiple CoreNLP commands simultaneously, but after a while the following error always occurs: "Exception in thread..."
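A minimal sketch of one way to keep several concurrent CoreNLP runs from exhausting machine resources: launch the subsets through a bounded worker pool and give each JVM an explicit heap limit. The directory layout, annotator list, and heap size below are assumptions, not taken from the question.

    import subprocess
    from concurrent.futures import ThreadPoolExecutor
    from pathlib import Path

    SUBSET_DIR = Path("subsets")   # hypothetical: one .txt file per corpus subset
    MAX_PARALLEL = 2               # cap concurrent JVMs so total heap stays bounded
    HEAP_PER_JVM = "-Xmx4g"        # assumed heap size; tune to the machine's RAM

    def run_corenlp(subset: Path) -> int:
        # One CoreNLP pipeline process over a single subset file.
        cmd = [
            "java", HEAP_PER_JVM, "-cp", "*",
            "edu.stanford.nlp.pipeline.StanfordCoreNLP",
            "-annotators", "tokenize,ssplit,pos",   # assumed annotators
            "-file", str(subset),
            "-outputFormat", "json",
        ]
        return subprocess.run(cmd).returncode

    with ThreadPoolExecutor(max_workers=MAX_PARALLEL) as pool:
        exit_codes = list(pool.map(run_corenlp, sorted(SUBSET_DIR.glob("*.txt"))))
    print(exit_codes)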
In "Attention Is All You Need", the learning rate in the Noam decay they use is formulated as lrate = d_model^(-0.5) · min(step_num^(-0.5), step_num · warmup_steps^(-1.5)). But in your code, I found there is an...
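For reference, a short sketch of that schedule exactly as the paper states it; the d_model and warmup_steps defaults below are the paper's base-model values (512 and 4000), and the printed steps are only illustrative.

    def noam_lrate(step_num: int, d_model: int = 512, warmup_steps: int = 4000) -> float:
        # lrate = d_model^(-0.5) * min(step_num^(-0.5), step_num * warmup_steps^(-1.5))
        # step_num must be >= 1, otherwise step_num^(-0.5) is undefined.
        return d_model ** -0.5 * min(step_num ** -0.5, step_num * warmup_steps ** -1.5)

    # Linear warmup up to warmup_steps, then inverse-square-root decay:
    for step in (1, 4000, 16000):
        print(step, noam_lrate(step))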
Batch inference works fine on an A100, but the same code errors on an A10; printing the next-token logits shows that the outputs at the padding positions are NaN. I am using left padding with pad value 0. What could be causing this?

Inference code:

    messages = ['' + f"original input text" + '' for data in messages]
    res = tokenizer(messages, padding=True)
    input_ids_list = res['input_ids']
    input_ids = torch.LongTensor(input_ids_list).to(model.device)
    outputs = model.generate(input_ids=input_ids)

Error:
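A frequent cause of this symptom is calling generate without an attention_mask, so the left-padded positions are attended to; differing kernel and dtype paths on A100 vs. A10 can then surface the resulting garbage as NaN on one card but not the other. A minimal sketch of passing the mask through, assuming a Hugging Face transformers causal LM; the checkpoint name and input strings are placeholders.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    model = AutoModelForCausalLM.from_pretrained("your-model")   # placeholder checkpoint
    tokenizer = AutoTokenizer.from_pretrained("your-model")
    tokenizer.padding_side = "left"                # left padding, as in the question
    if tokenizer.pad_token is None:
        tokenizer.pad_token = tokenizer.eos_token  # reuse EOS rather than pad id 0

    messages = ["original input text 1", "original input text 2"]
    enc = tokenizer(messages, padding=True, return_tensors="pt").to(model.device)

    # attention_mask tells the model to ignore pad positions; without it they
    # participate in attention and can poison the next-token logits.
    outputs = model.generate(
        input_ids=enc["input_ids"],
        attention_mask=enc["attention_mask"],
        pad_token_id=tokenizer.pad_token_id,
    )
    print(tokenizer.batch_decode(outputs, skip_special_tokens=True))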