
Allow temperature to be set on the command line (without using a Modelfile)

pracplayopen opened this issue 1 year ago • 6 comments

It would be super helpful to be able to set the temperature for models via the command line, rather than having to create a separate Modelfile for every model and temperature combination.

pracplayopen · Jun 28 '24 18:06

Trying to get a handle on the use case here. Are you looking to do something like:

$ ollama run --temperature 0.7 gemma2
>>>

instead of:

$ ollama run gemma2
>>> /set parameter temperature 0.7
>>>

rick-github · Jun 28 '24 18:06

Yes, exactly.

The second example is useful for one-off tests, but since '/set parameter temperature' can't be scripted, it rules out any situation where the prompt is generated outside of ollama (e.g. when the prompt isn't created interactively).

pracplayopen · Jun 28 '24 18:06

Understood. We currently use expect for this sort of scripting, and command-line args would be generally useful.

#!/bin/bash
# Usage: ollama-run.sh [-t TEMP] [-n NUM_CTX] run MODEL [PROMPT...]
# Wraps "ollama run" in expect so parameters can be set from the shell.

temperature=0
num_ctx=2048

# Parse -t/--temperature and -n/--num_ctx; the remaining positional
# parameters are the ollama arguments ("run MODEL [PROMPT...]").
eval set -- "$(getopt --options=t:n: --longoptions=temperature:,num_ctx: --name "$0" -- "$@")"

while : ; do
  case "$1" in
    -t|--temperature)   temperature=$2
                        shift 2 ;;
    -n|--num_ctx)       num_ctx=$2
                        shift 2 ;;
    --)                 shift
                        break ;;
    *)                  exit 1 ;;
  esac
done

# The first two arguments are the subcommand and model ("run MODEL").
args=("${@:1:2}")

# No prompt given: hand the session over to the user. With a prompt:
# send it, wait for the next ">>>", then close.
[ $# -eq 2 ] && { command=interact ; } || { command='send "'"${@:3}"'\r" ; expect ">>>" close' ; }

expect -f <(cat <<EOF
spawn ollama ${args[*]}
expect ">>>"
send "/set parameter temperature $temperature\r" ; expect ">>>"
send "/set parameter num_ctx $num_ctx\r" ; expect ">>>"
$command
EOF
)

rick-github · Jun 28 '24 20:06

Yes, it can also be done with curl using the REST API:

curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "who was president of US in 2023",
  "options": {
    "temperature": 0
  }
}'

but since temperature and the prompt sort of "go together" in terms of how much they can impact the response, it would be faster to be able to specify both (vs. just the prompt, as is possible now) on the built-in command line, in one place.
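
For completeness: the same options object is exposed by the Go client that ships in the ollama repo, for anyone generating prompts from code. A minimal sketch, assuming the api package's ClientFromEnvironment and Generate signatures:

package main

import (
	"context"
	"fmt"
	"log"

	"github.com/ollama/ollama/api"
)

func main() {
	// Connects to the local server, honouring OLLAMA_HOST if set.
	client, err := api.ClientFromEnvironment()
	if err != nil {
		log.Fatal(err)
	}

	req := &api.GenerateRequest{
		Model:  "llama3",
		Prompt: "who was president of US in 2023",
		// Same shape as the "options" object in the curl payload above.
		Options: map[string]interface{}{
			"temperature": 0.0,
		},
	}

	// Generate streams the response; print each chunk as it arrives.
	err = client.Generate(context.Background(), req, func(resp api.GenerateResponse) error {
		fmt.Print(resp.Response)
		return nil
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println()
}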

pracplayopen · Jun 28 '24 21:06

Looking at this briefly: based on the above two workarounds, this feature conceptually seems like it should be easy to add. Perhaps people who know the source better can comment further.

From a quick look, the 'runOptions' structure appears to correspond to the same options listed in 'ollama run --help' in the CLI.

So ideally the goal would be to parse a new --temperature option into runOptions, e.g.:

type runOptions struct {
	Model       string
	ParentModel string
	Prompt      string
	Messages    []api.Message
	WordWrap    bool
	Format      string
	System      string
	Template    string
	Images      []api.ImageData
	Options     map[string]interface{}
	MultiModal  bool
	KeepAlive   *api.Duration
	Temperature float64 // proposed new field (a valid Go declaration)
}

Then the questions would be:

  1. Where does parsing of command-line options for the run command occur?
  2. What happens when '/set parameter temperature' is called during interactive chat?
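
If I had to guess at the wiring for question 1 (a hypothetical sketch, not the actual ollama source): the CLI is built on cobra, so a --temperature flag could be registered on the run command and, when set, copied into the existing Options map rather than a new struct field. A self-contained toy version:

package main

// Hypothetical sketch of registering a --temperature flag with cobra
// (which ollama's CLI uses) and funnelling it into the options map.
// Names here are illustrative, not the actual source.

import (
	"fmt"

	"github.com/spf13/cobra"
)

func main() {
	runCmd := &cobra.Command{
		Use:  "run MODEL [PROMPT]",
		Args: cobra.MinimumNArgs(1),
		RunE: func(cmd *cobra.Command, args []string) error {
			// In ollama this would populate runOptions.Options, which is
			// also where the REST API's "options" payload ends up.
			options := map[string]interface{}{}
			if cmd.Flags().Changed("temperature") {
				t, err := cmd.Flags().GetFloat64("temperature")
				if err != nil {
					return err
				}
				options["temperature"] = t
			}
			fmt.Printf("model=%s options=%v\n", args[0], options)
			return nil
		},
	}
	runCmd.Flags().Float64("temperature", 0.8, "sampling temperature")
	runCmd.Execute()
}

If that's right, a separate Temperature field on runOptions wouldn't even be needed, and question 2 would presumably resolve the same way: '/set parameter temperature' writing into that same Options map before the request is sent.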

pracplayopen · Jun 29 '24 08:06

My question about this on Stack Exchange has over 10k views now, so it seems that many people would appreciate this feature!

joliss · Dec 10 '24 14:12

Patch allowing that: https://github.com/ollama/ollama/pull/8340

pacien · Jan 09 '25 22:01