
Proper handling of left-padded inputs

njhill opened this issue 2 years ago · 2 comments

At least for some models, including CodeGen, I'm observing very inconsistent outputs from ORTModelForCausalLM when the same inputs are given different amounts of left padding (with the correspondingly correct attention mask). In other words, this is the equivalent of the problem with vanilla transformers reported in https://github.com/huggingface/transformers/issues/21080 and fixed in https://github.com/huggingface/transformers/pull/21853 and https://github.com/huggingface/transformers/pull/22069.
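A minimal repro sketch of what I mean, assuming `Salesforce/codegen-350M-mono` as the model (the exact export argument to `from_pretrained` may differ by optimum version):

```python
from transformers import AutoTokenizer
from optimum.onnxruntime import ORTModelForCausalLM

model_id = "Salesforce/codegen-350M-mono"  # assumed model for illustration
tokenizer = AutoTokenizer.from_pretrained(model_id, padding_side="left")
tokenizer.pad_token = tokenizer.eos_token
model = ORTModelForCausalLM.from_pretrained(model_id, export=True)

prompt = "def fib(n):"

# Batching with a longer second prompt forces left padding on the first row.
padded = tokenizer([prompt, prompt + " # compute fibonacci"],
                   return_tensors="pt", padding=True)
out = model.generate(**padded, max_new_tokens=20, do_sample=False)

# With greedy decoding, row 0 should match an unpadded run of the same prompt,
# but in practice the outputs diverge once padding is introduced.
unpadded = tokenizer(prompt, return_tensors="pt")
ref = model.generate(**unpadded, max_new_tokens=20, do_sample=False)

print(tokenizer.decode(out[0], skip_special_tokens=True))
print(tokenizer.decode(ref[0], skip_special_tokens=True))
```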

This comment alludes to something regarding the handling of position_ids, which I suspect might be related.
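For reference, the fix in the transformers PRs linked above derives position_ids from the attention mask rather than assuming positions start at 0, so left padding doesn't shift them. A sketch of that computation:

```python
import torch

# Row 0 is left-padded with two pad tokens; row 1 is unpadded.
attention_mask = torch.tensor([[0, 0, 1, 1, 1],
                               [1, 1, 1, 1, 1]])

# Cumulative sum over the mask gives each real token its position relative to
# the first non-pad token; pad positions are then filled with a dummy value.
position_ids = attention_mask.long().cumsum(-1) - 1
position_ids.masked_fill_(attention_mask == 0, 1)

print(position_ids)
# tensor([[1, 1, 0, 1, 2],
#         [0, 1, 2, 3, 4]])
```

Presumably something equivalent is needed in the ONNX Runtime modeling path.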

njhill avatar Mar 10 '23 21:03 njhill

Thank you for the report, will have a look shortly!

fxmarty avatar Mar 11 '23 17:03 fxmarty

This issue has been marked as stale because it has been open for 30 days with no activity. This thread will be automatically closed in 5 days if no further activity occurs.

github-actions[bot] avatar Jun 13 '25 02:06 github-actions[bot]