
adding max_tokens to cli

Open ctr26 opened this issue 1 year ago • 6 comments

Adding max_tokens to the CLI, since using gpt-3.5 crashes when the context length is 10k.

ctr26 avatar Jul 02 '23 10:07 ctr26

@ctr26 This will be solved with #2 - we'll have a mapping of model -> context window (max tokens) and we'll break down files and prompts accordingly.

joshpxyne avatar Jul 03 '23 22:07 joshpxyne
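The model -> context window mapping described above could be sketched as follows. This is a hypothetical illustration, not gpt-migrate's actual implementation; the dictionary name, values, and helper function are assumptions.

```python
# Hypothetical model -> context window (max tokens) mapping, as
# described in the comment above. Values reflect the models available
# at the time of this thread.
MODEL_CONTEXT_WINDOWS = {
    "gpt-3.5-turbo": 4096,
    "gpt-3.5-turbo-16k": 16384,
    "gpt-4": 8192,
    "gpt-4-32k": 32768,
}

def max_completion_tokens(model: str, prompt_tokens: int) -> int:
    """Tokens left for the completion after the prompt is accounted for."""
    window = MODEL_CONTEXT_WINDOWS.get(model, 4096)
    return max(window - prompt_tokens, 0)
```

With such a mapping, files and prompts could be broken down so that each request stays within its model's window.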

Ahh

marina727 avatar Jul 04 '23 16:07 marina727

I don't have access to gpt-4-32k. How can I use gpt-4? I get this error even with these changes:

    openai.error.InvalidRequestError: This model's maximum context length is 8192 tokens. However, you requested 10601 tokens (601 in the messages, 10000 in the completion). Please reduce the length of the messages or completion.

        max_tokens: int = typer.Option(8192),
    ):

    ai = AI(
        model=model,
        temperature=temperature,
        max_tokens=int(max_tokens),
    )

gianpaj avatar Jul 04 '23 18:07 gianpaj

@gianpaj The output also counts toward the token total. If your model has a max context window of 8k, you're probably better off setting max_tokens to 4k or so.

joshpxyne avatar Jul 04 '23 19:07 joshpxyne

vvdd

danixv9 avatar Jul 05 '23 09:07 danixv9

Would it also be possible to add a way to change the OpenAI base URL? Then this could work with the Microsoft Azure OpenAI endpoint or with proxies.

Ran-Mewo avatar Jul 11 '23 14:07 Ran-Mewo

@Ran-Mewo Yes definitely, I'll try to get to this later - feel free to also submit a PR for this if you'd like

joshpxyne avatar Jul 14 '23 06:07 joshpxyne
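A base URL override could look roughly like this. This is a sketch under assumptions: gpt-migrate did not expose such an option at the time of this thread, and the helper below is hypothetical. The `OPENAI_API_BASE` environment variable mirrors the convention of the legacy openai-python (0.x) client, whose `openai.api_base` attribute would be set to the resolved value before making requests.

```python
import os

def openai_base_url() -> str:
    """Resolve the API base URL, falling back to the public endpoint.

    Hypothetical helper for routing requests to Azure OpenAI or a
    proxy; the returned value would be assigned to openai.api_base
    (openai-python 0.x) before calling the API.
    """
    return os.environ.get("OPENAI_API_BASE", "https://api.openai.com/v1")
```

For Azure OpenAI specifically, the 0.x client also needs `openai.api_type` and `openai.api_version` set, so a full solution would likely expose more than just the URL.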