Adjust context limit dynamically
Primarily relevant to the GPT-16k models (see also #133), but also to any model with a context limit above 4k tokens.
Limits are currently hardcoded in many places, but they shouldn't be if we want to actually take advantage of larger contexts.
For the `pubmed-annotate` command in particular, some abstracts alone may exceed the context size, and we'd like to be able to parse more text at once without chunking.
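One way this could look (a minimal sketch, not ontogpt's actual implementation; the model names, sizes, and function name here are illustrative assumptions) is a single lookup table consulted wherever the limit is currently hardcoded:

```python
# Sketch: resolve each model's context window from one table instead of
# hardcoding 4096 throughout the codebase. Sizes below are assumptions.
MODEL_CONTEXT_SIZES = {
    "gpt-3.5-turbo": 4096,
    "gpt-3.5-turbo-16k": 16384,
    "gpt-4": 8192,
}
DEFAULT_CONTEXT_SIZE = 4096  # conservative fallback for unknown models


def get_context_size(model_name: str) -> int:
    """Return the context window (in tokens) for a model, with a safe default."""
    return MODEL_CONTEXT_SIZES.get(model_name, DEFAULT_CONTEXT_SIZE)
```

Callers (e.g. the text-chunking logic) would then size their input against `get_context_size(model)` rather than a fixed constant, so a 16k model can accept a full abstract unchunked.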