docker-prompt-generator
Error when generating from text
Error message:
Setting pad_token_id to eos_token_id:50256 for open-end generation.
Input length of input_ids is 88, but max_length is set to 79. This can lead to unexpected behavior. You should consider increasing max_new_tokens.
Setting pad_token_id to eos_token_id:50256 for open-end generation.
Input length of input_ids is 88, but max_length is set to 74. This can lead to unexpected behavior. You should consider increasing max_new_tokens.
Setting pad_token_id to eos_token_id:50256 for open-end generation.
Input length of input_ids is 88, but max_length is set to 88. This can lead to unexpected behavior. You should consider increasing max_new_tokens.
Setting pad_token_id to eos_token_id:50256 for open-end generation.
Input length of input_ids is 88, but max_length is set to 78. This can lead to unexpected behavior. You should consider increasing max_new_tokens.
Setting pad_token_id to eos_token_id:50256 for open-end generation.
Input length of input_ids is 88, but max_length is set to 81. This can lead to unexpected behavior. You should consider increasing max_new_tokens.
Is there a parameter I need to set to fix this?
This looks like a warning coming from transformers. Let's keep this issue on hold for now; I'll check whether pinning some of the dependencies resolves it.
If not, I'll open a separate bugfix branch to fix and optimize this part.
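In the meantime, a minimal sketch of the workaround the warning itself suggests: pass `max_new_tokens` to `generate()` instead of `max_length`. `max_length` counts the prompt plus the generated tokens, which is why an 88-token prompt overflows a `max_length` of 74–88, while `max_new_tokens` caps only the newly generated tokens. The model name and prompt below are placeholders, not the project's actual configuration.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Placeholder model; the actual prompt-extend model used by the project may differ.
model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "a photo of"  # placeholder input text
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=75,                    # budget for generated tokens only, excluding the prompt
    pad_token_id=tokenizer.eos_token_id,  # silences the "Setting pad_token_id" warning
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```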