
Auto-switch to gpt-35-turbo, gpt-4 and gpt-4-32k when number of tokens exceeded by query

[Open] NKT00 opened this issue 2 years ago • 9 comments

Duplicates

  • [X] I have searched the existing issues

Summary 💡

There are a few bug reports close to this, but would it not make sense to get rid of the error SYSTEM: Command get_text_summary returned: Error: This model's maximum context length is 4097 tokens. However, your messages resulted in 5113 tokens. Please reduce the length of the messages. by simply swapping the model, just for that query, whenever the query fits within the limit of another model?

gpt-35-turbo has a 4096-token context window, whereas the limits for gpt-4 and gpt-4-32k are 8192 and 32768 tokens respectively. This could be implemented easily.
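For illustration, a minimal sketch of the per-query fallback being proposed, using the tiktoken library to count tokens. The `pick_model` helper, the model table, and the reserved-reply budget are hypothetical, not AutoGPT's actual code:

```python
import tiktoken

# Illustrative fallback table, ordered smallest to largest (mid-2023 limits).
MODEL_LIMITS = [
    ("gpt-3.5-turbo", 4096),
    ("gpt-4", 8192),
    ("gpt-4-32k", 32768),
]

def pick_model(prompt: str, reserved_for_reply: int = 1024) -> str:
    """Return the smallest model whose context window fits the prompt."""
    enc = tiktoken.get_encoding("cl100k_base")  # encoding used by the GPT-3.5/4 families
    needed = len(enc.encode(prompt)) + reserved_for_reply
    for model, limit in MODEL_LIMITS:
        if needed <= limit:
            return model
    raise ValueError(f"Prompt needs {needed} tokens; no available model is large enough")
```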

Examples 🌈

No response

Motivation 🔦

Everything that pulls a web page fails, because the pages are generally too big. However, some are only slightly too big and could be run through a larger-context model first to downsize them.

NKT00 avatar Apr 26 '23 23:04 NKT00

I strongly support this proposal. This should be easy to implement and would definitely help make this tool actually useful.

dimitar-d-d avatar Apr 28 '23 19:04 dimitar-d-d

you could probably use some sort of preprocessor/preparation stage prior to passing such contexts to the LLM

Boostrix avatar Apr 28 '23 19:04 Boostrix

you could probably use some sort of preprocessor/preparation stage prior to passing such contexts to the LLM

Yep, I could do that, and I do. I end up splitting my text assignments into three separate runs of AutoGPT just to avoid the error... This, however, is time-consuming and impractical.

The ability of the tool to dynamically call a larger LLM when applicable, combined with better chunking, would definitely reduce the number of fatal errors.
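A rough sketch of what such chunking could look like: split the input into token-bounded chunks, summarize each, then summarize the summaries. Here `summarize` is a stand-in for whatever LLM call the tool actually makes, and the 3000-token chunk size is an assumption:

```python
import tiktoken

def chunk_text(text: str, max_tokens: int = 3000) -> list[str]:
    """Split text into chunks of at most max_tokens tokens."""
    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(text)
    return [
        enc.decode(tokens[i : i + max_tokens])
        for i in range(0, len(tokens), max_tokens)
    ]

def summarize_large(text: str, summarize) -> str:
    """Map-reduce summarization; assumes summarize() shrinks its input,
    so the recursion terminates."""
    parts = [summarize(chunk) for chunk in chunk_text(text)]
    combined = "\n".join(parts)
    return summarize_large(combined, summarize) if len(parts) > 1 else combined
```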

dimitar-d-d avatar Apr 28 '23 19:04 dimitar-d-d

Has there been any workaround for this? I thought Auto-GPT used gpt-4, which has a greater token limit than 3.5, but I'm still getting the 4097 max-token error.

Rykimaruh avatar May 04 '23 21:05 Rykimaruh

It depends on the level of OpenAI API access you've got.

Boostrix avatar May 05 '23 05:05 Boostrix

I'd like to say that there should be a switching mechanism that switches between all of the supported APIs, not just OpenAI's models/APIs.

@p-i- perhaps if and when the repository gets around to implementing the APIs as plugins, add a plugin object that reports the rate limit associated with that API, so that AutoGPT can switch whole plugins, not just models.
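To make the idea concrete, a hypothetical plugin interface along these lines; AutoGPT's real plugin API does not expose anything like this, and the names are illustrative only:

```python
from abc import ABC, abstractmethod

class LLMProviderPlugin(ABC):
    """Hypothetical provider plugin that advertises its own limits."""

    @abstractmethod
    def context_limit(self, model: str) -> int:
        """Maximum tokens the given model accepts."""

    @abstractmethod
    def rate_limit(self) -> int:
        """Requests per minute this provider currently allows."""

    @abstractmethod
    def complete(self, model: str, prompt: str) -> str:
        """Run a completion against this provider."""
```

With something like this, the scheduler could skip a whole provider whose rate limit is exhausted rather than just swapping models within it.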

anonhostpi avatar May 05 '23 05:05 anonhostpi

there should be a switching mechanism that switches between all of the supported APIs, not just OpenAI's models/APIs.

That seems to be a work in progress: #2158

maybe add a plugin object that reports the rate limit associated with that API, so that AutoGPT can completely switch plugins, not just models.

:+1: the basic idea is this: #3466

Boostrix avatar May 05 '23 05:05 Boostrix

Love ya Boostrix, which one are you on the Discord server?

anonhostpi avatar May 05 '23 05:05 anonhostpi

I'd like to say that there should be a switching mechanism that switches between all of the supported APIs, not just OpenAI's models/APIs.

that's a form of feature scaling; see #3466 and #528

but agreed: if one model fails, there should be an option to try another one, even if that's not the preferred one
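A sketch of that error-driven fallback with the pre-1.0 openai Python SDK (the version current when this thread was active); the model order and helper name are illustrative:

```python
import openai

def chat_with_fallback(messages, models=("gpt-3.5-turbo", "gpt-4", "gpt-4-32k")):
    """Try each model in order, falling through to a larger one on overflow."""
    last_error = None
    for model in models:
        try:
            resp = openai.ChatCompletion.create(model=model, messages=messages)
            return resp["choices"][0]["message"]["content"]
        except openai.error.InvalidRequestError as e:
            # Context-length overflows surface as InvalidRequestError;
            # remember it and try the next (larger) model.
            last_error = e
    raise last_error
```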

Boostrix avatar May 09 '23 07:05 Boostrix

To fix this issue, the batch summarization approach introduced in PR #4652 could also be applied to the summarize_text function in text.py.

kinance avatar Jun 13 '23 15:06 kinance

gpt-3.5-turbo-16k is here.

unitythemaker avatar Jun 14 '23 09:06 unitythemaker

This issue has automatically been marked as stale because it has not had any activity in the last 50 days. You can unstale it by commenting or removing the label. Otherwise, this issue will be closed in 10 days.

github-actions[bot] avatar Sep 06 '23 21:09 github-actions[bot]

This issue was closed automatically because it has been stale for 10 days with no activity.

github-actions[bot] avatar Sep 19 '23 01:09 github-actions[bot]
