torchchat
don't default max_seq_length to 128 for executorch models
🐛 Describe the bug
ExecuTorch currently has a bug that forces us to default max_seq_length to 128. Once that bug is fixed, remove the default both here and during export. https://github.com/pytorch/torchchat/pull/1184
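A minimal sketch of the workaround being tracked here, assuming a hypothetical helper that resolves the sequence length (the function name, the `is_executorch` flag, and the non-ExecuTorch fallback of 2048 are illustrative, not torchchat's actual API):

```python
# Temporary ExecuTorch workaround: cap max_seq_length at 128 by default.
# TODO: delete this default (here and in export) once the ExecuTorch bug
# referenced in the linked PR is fixed.
EXECUTORCH_MAX_SEQ_LENGTH_DEFAULT = 128

def resolve_max_seq_length(requested, is_executorch):
    """Pick the sequence length to use for generation/export.

    An explicitly requested value always wins; otherwise ExecuTorch
    models fall back to the temporary 128 default.
    """
    if requested is not None:
        return requested
    if is_executorch:
        return EXECUTORCH_MAX_SEQ_LENGTH_DEFAULT
    return 2048  # illustrative default for non-ExecuTorch backends

print(resolve_max_seq_length(None, is_executorch=True))   # 128
print(resolve_max_seq_length(None, is_executorch=False))  # 2048
```

Removing the fix would then amount to deleting the ExecuTorch branch so all backends share one default.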
Versions
N/A