
Adjust context limit dynamically

Open · caufieldjh opened this issue 8 months ago · 0 comments

Primarily of relevance to the GPT-16k models (see also #133), but also to any model with a context limit above 4k tokens. Limits are currently hardcoded in many places, but they shouldn't be if we actually want to take advantage of larger contexts. For the pubmed-annotate command in particular, some abstracts alone may exceed the context size, and we'd like to be able to parse more text at once without chunking.
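One possible shape for this, sketched below: replace the hardcoded limit with a per-model lookup plus a fit check before sending text. The model names, window sizes, function names, and the chars-per-token heuristic here are all illustrative assumptions, not OntoGPT's actual code or configuration; a real implementation would count tokens with the model's tokenizer rather than estimating.

```python
# Sketch: dynamic context limits instead of a hardcoded 4k.
# Model names and window sizes are illustrative assumptions only.

MODEL_CONTEXT_LIMITS = {
    "gpt-3.5-turbo": 4096,
    "gpt-3.5-turbo-16k": 16384,
    "gpt-4": 8192,
}
DEFAULT_CONTEXT_LIMIT = 4096  # conservative fallback for unknown models


def context_limit(model: str) -> int:
    """Return the context window for a model, falling back to a default."""
    return MODEL_CONTEXT_LIMITS.get(model, DEFAULT_CONTEXT_LIMIT)


def fits_in_context(text: str, model: str, reserved_for_output: int = 512) -> bool:
    """Rough check using the common ~4-characters-per-token heuristic;
    reserves some of the window for the model's completion."""
    estimated_tokens = len(text) // 4
    return estimated_tokens + reserved_for_output <= context_limit(model)


if __name__ == "__main__":
    long_abstract = "word " * 5000  # ~6250 estimated tokens
    print(fits_in_context(long_abstract, "gpt-3.5-turbo"))      # too big for 4k
    print(fits_in_context(long_abstract, "gpt-3.5-turbo-16k"))  # fits in 16k
```

With a lookup like this, commands such as pubmed-annotate could decide per-model whether an abstract fits whole or still needs chunking, instead of assuming a single fixed limit.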

caufieldjh · Oct 20 '23 16:10