Add support for greater token lengths
Currently, if you try to use the gpt-4o model with a token length >2048, the following exception occurs upon prompting:
thread '<unnamed>' panicked at ata/src/prompt.rs:81:20:
called `Option::unwrap()` on a `None` value
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
I am requesting that we add support for a greater token length, as the pricing and performance are much better.
Reproduction:
- Change the model to gpt-4o and set the token length >2048
- Try prompting
Correction: it appears the exception is occurring due to the max token selection. It would be nice to support more than 2048 tokens.
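Since the panic is an `Option::unwrap()` on a `None` in ata/src/prompt.rs, one likely fix is to validate the max-token value up front and report a readable error instead of unwrapping. The sketch below is only an illustration of that approach, not ata's actual code; the function name `parse_max_tokens` and the `MODEL_LIMIT` cap are assumptions.

```rust
// Hypothetical sketch of validating max_tokens instead of unwrapping.
// `MODEL_LIMIT` is an illustrative cap; the real limit depends on the model.
const MODEL_LIMIT: u64 = 16_384;

fn parse_max_tokens(raw: &str) -> Result<u64, String> {
    // Parse the configured value, turning a bad input into an error
    // message rather than panicking on `unwrap()`.
    let value: u64 = raw
        .trim()
        .parse()
        .map_err(|_| format!("max_tokens must be a number, got `{raw}`"))?;
    // Reject out-of-range values with a clear message.
    if value == 0 || value > MODEL_LIMIT {
        return Err(format!(
            "max_tokens must be between 1 and {MODEL_LIMIT}, got {value}"
        ));
    }
    Ok(value)
}

fn main() {
    // Values above the old 2048 cap are accepted up to the model limit.
    assert_eq!(parse_max_tokens("4096"), Ok(4096));
    // Bad input yields an error instead of a panic.
    assert!(parse_max_tokens("abc").is_err());
}
```

With something like this in place, a too-large or malformed `max_tokens` setting would produce an error message instead of the panic shown above.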
Have you tried https://github.com/simonw/llm by any chance? It has a chat mode too and is generally much better than ata if you ask me
I haven't, thank you for the recommendation!