
Max/min length call options not working

Open guidoveritone opened this issue 1 year ago • 1 comments

Model: llama3 LangchaingoVersion: v0.1.10

I was trying to use llms.WithMaxLength and llms.WithMinLength to set some output limits, but it seems the model doesn't respect these options.

callOptions = append(callOptions, llms.WithMaxLength(50), llms.WithMinLength(10))

Then I run the model as follows:

contentResponse, err = o.Llm.GenerateContent(ctx, content, callOptions...)
..
	for _, choice := range contentResponse.Choices {
		output += choice.Content
		errors += choice.StopReason
	}
..

But I get responses longer than the limit I set; I don't know whether this is a bug or whether I'm using the wrong option.

I also noticed that if I use WithMaxTokens(2) it works, but it feels like the model is just cutting off its response mid-sentence, since I asked

prompt: "Man is naturally evil or is corrupted by society?"

And the model gave me:

output: "A Classic"

The problem is that if I increase the MaxTokens value, I get:

output: "A classic debate!\n\nThe idea that man is corrupted by society, also known as the "social corruption" or "societal influence" theory, suggests that human nature is inherently good and that societal factors, such as culture, norms, and institutions"

guidoveritone avatar May 29 '24 13:05 guidoveritone


Ollama doesn't support WithMaxLength, only WithMaxTokens.
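Since the Ollama backend only honors a token cap, one possible workaround for a hard character limit is to truncate the response client-side after GenerateContent returns. This is just a sketch of a hypothetical helper (truncateToLength is not part of langchaingo), cutting at the last word boundary so the output isn't clipped mid-word:

```go
package main

import (
	"fmt"
	"strings"
)

// truncateToLength caps a model response at maxLen bytes,
// backing up to the last space so no word is clipped in half.
func truncateToLength(s string, maxLen int) string {
	if len(s) <= maxLen {
		return s
	}
	cut := s[:maxLen]
	if i := strings.LastIndex(cut, " "); i > 0 {
		cut = cut[:i]
	}
	return cut
}

func main() {
	// e.g. apply to choice.Content after GenerateContent returns
	resp := "A classic debate! The idea that man is corrupted by society suggests..."
	fmt.Println(truncateToLength(resp, 20))
}
```

Note this only bounds the length of what you keep; unlike WithMaxTokens it doesn't stop generation early, so it saves no inference time or tokens.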

devalexandre avatar Jun 26 '24 14:06 devalexandre