litgpt
NotImplementedError: max_seq_length 264 needs to be >= 857
Hi, when I run sequentially.py, I get the following error. Increasing the number of GPUs produces the same error.
raise NotImplementedError(f"max_seq_length {model.max_seq_length} needs to be >= {max_returned_tokens - 1}")
NotImplementedError: max_seq_length 264 needs to be >= 857
How can I fix this? Thanks.
Hi! I don't think this should happen. Can you share the exact command that you ran, the complete error stacktrace, and any changes you made to the repository?