stanford_alpaca
Token input limit only 500?
When I prompted Alpaca with an ~800-token input, I got this error: `Token indices sequence length is longer than the specified maximum sequence length for this model (521 > 512). Running this sequence through the model will result in indexing errors`
Is this a limit of Alpaca/LLaMA itself, a setting in my Hugging Face transformers tokenizer, or something else?
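For context, this warning text matches the check Hugging Face tokenizers perform against their `model_max_length` attribute when encoding, rather than an error raised by the model weights themselves. A minimal self-contained sketch of that check (the 512 limit and 521-token prompt are taken from the error message; the tokenizer here is a stand-in, not the real Alpaca/LLaMA tokenizer):

```python
# Sketch: reproduce the length check a Hugging Face tokenizer performs.
# MODEL_MAX_LENGTH mirrors the "512" in the warning; the real value lives
# in the tokenizer's config (tokenizer.model_max_length).
import warnings

MODEL_MAX_LENGTH = 512

def encode_with_length_check(token_ids, max_length=MODEL_MAX_LENGTH):
    """Warn (like transformers does) when the sequence exceeds the limit,
    then truncate to max_length so downstream indexing stays valid."""
    if len(token_ids) > max_length:
        warnings.warn(
            f"Token indices sequence length is longer than the specified "
            f"maximum sequence length for this model "
            f"({len(token_ids)} > {max_length})."
        )
        token_ids = token_ids[:max_length]
    return token_ids

prompt_ids = list(range(521))  # stand-in for the 521-token prompt
truncated = encode_with_length_check(prompt_ids)
print(len(truncated))
```

If the limit really is tokenizer metadata, inspecting `tokenizer.model_max_length` after `AutoTokenizer.from_pretrained(...)` would confirm where the 512 comes from.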