ruby-openai
Configure maximum length
Hey everyone 👋
In the ChatGPT UI, you can configure the maximum length / number of tokens the model (assistant) should return. Is it possible to configure this at the moment?
Fairly sure this is "max_tokens", as per the API docs, and it seems to work fine.
https://platform.openai.com/docs/api-reference/chat/create
Thanks @MyklClason
Is this configurable in this SDK?
Check the docs for "Chat." You just set it as one of the parameters, e.g. max_tokens: 500 or something like that.
response = client.chat(
  parameters: {
    model: "gpt-3.5-turbo", # Required.
    messages: [{ role: "user", content: "Hello!" }], # Required.
    temperature: 0.7,
    max_tokens: 500,
  })
puts response.dig("choices", 0, "message", "content")
# => "Hello! How may I assist you today?"
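If a reply looks cut off, the response itself tells you whether max_tokens was the cause: the choice's "finish_reason" is "length" when the model hit the token limit, and "stop" when it finished on its own. A minimal sketch, reusing the response hash from the call above:

finish_reason = response.dig("choices", 0, "finish_reason")
if finish_reason == "length"
  # The model stopped because it reached max_tokens, so the answer may be truncated.
  puts "Truncated after #{response.dig("usage", "completion_tokens")} completion tokens"
end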
P.S. That model isn't the one you want to be using, so make sure to update that.
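If you're not sure which model to switch to, the gem also exposes the models endpoint (client.models.list in recent versions); a quick sketch, using the same client as above, that prints the model IDs your API key can use:

client.models.list["data"].each do |model|
  puts model["id"] # e.g. "gpt-4", "gpt-3.5-turbo-16k", ...
end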
Thanks for the question @eslamodeh and the answer @MyklClason !